“From the beginning, Elder Research partnered with us to create a world-class upskilling program, teaching the fundamentals of machine learning to our organization with equal focus on how to use the techniques and on what goes wrong with them. Their decades of hands-on experience brought to life how models fail, and the combination of face-to-face interactions, weekly homework, exams, and capstone projects ensured our colleagues used the course to transform their work.”
As innovative leaders awaken to the power of analytics and how its insights can positively impact their organizations’ bottom line, many are looking for effective ways to “upskill” their analyst workforce. And, while providing valuable skills, a program of formal analytics training can also serve to align the group’s vision, goals, and terminology.
Upskilling can be less expensive and less disruptive than hiring, onboarding, and training new staff. Analysts must learn the new analytic material but already possess the hard-won subject-matter expertise used in their daily decision-making. Further, most employees value the opportunity to learn and grow within their job, so investing in their careers encourages a dedicated and stable workforce.
Designing an effective and efficient analytics training curriculum is a daunting task (though for experts it can also be exhilarating). Here, we share insights from a recent, very successful training program we developed in close partnership with a top Fortune 50 organization, and briefly summarize many of the decision points encountered throughout the process.
Training Options
The engagement described here, an intensive custom course, is just one way to deliver training. For 25 years, Elder Research has offered many alternative training formats and platforms across a spectrum of depth, cost, and time commitment. They include:
- Intensive, live interactive custom-designed programs, as described here.
- A catalog of 80+ fully online courses in statistics, data science, and operations research offered through the Institute for Statistics Education at Statistics.com, a division of Elder Research.
- Affordable certificate programs in data science, biostatistics, and social science statistics offered through the Institute, with courses approved for academic credit by the American Council on Education.
- Affordable Bachelor’s and Master’s degree programs offered through the Institute and its university partners.
- One- or two-day workshops on focused analytics topics.
- Hybrid combinations of live interactive and asynchronous training, tailored to meet the needs and budget of any organization.
- Self-paced introductory courses in statistics, forecasting, and programming that can serve as background prerequisites for more comprehensive training.
Goals and Constraints of the Semester-Long Course
As with any analytical endeavor, defining the goals and constraints is the crucial first step. Our client wanted to increase value by upskilling approximately 250 analysts to intelligently and correctly apply advanced data science techniques to their existing business analysis tasks, to identify and explore new analysis ideas, and to develop informal and formal cross-department collaborative teams within the organization.
As for constraints (besides cost), the biggest would be time and focus:
- A class (cohort) will last about a college semester in calendar time.
- Each cohort will include two weeks with everyone together at one site.
- Registrants will still have much work to keep up with, so plan for students to dedicate ~2 days/week to the course (during off-site weeks).
- Proficiency in the material will be required to advance in the organization! So, performance will be graded, and failure is possible.
- Practical application is always to be emphasized over theory, but concepts are essential. Nothing is to be “plug and play”; explore why and how a technique works or doesn’t.
- Everything will be taught through hands-on exercises as well as through lectures.
- Mentoring and help are to be available (from instructors and peers) at every stage.
Experience
The on-site weeks were believed to be crucial to kick-start the learning, as well as to build the peer community and fellowship key to the collaborative teaming. And, indeed, it worked. It is well known that teaming is enhanced by shared suffering, and the instructors provided that 😃 through a pace of material more intense than any course the students could remember! At least, that was the feeling expressed by participants early on; they quickly learned that, contrary to their past university experiences, the practical emphasis here allowed the material to be covered much more quickly than is typical. And the hands-on exercises, grounded in a domain they were familiar with, brought home the key points with great efficiency.
But that is getting ahead of the story; first, a great deal of time was spent brainstorming, debating, and testing how the precious hours were to be filled; that is, what topics, in what order, would make up the curriculum. Crucial to this is understanding, or dictating, the initial base knowledge students must have; that is, setting the prerequisites. The client determined the software tools and programming languages their environment required, and minimal prerequisites were set in those areas and in basic statistics that had to be passed before staff could register.
Over several meetings in the initial four weeks, we worked with the client to design a curriculum to meet the technical goals. The topics included:
- Modeling Approaches – random forests, neural networks, text mining, etc.
- Practical Application – model validation, bias detection, cross-validation, etc. (a brief illustration follows this list)
- Data Engineering – extract, transform, load (ETL), etc., based on their technology stack
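The actual exercises used the client’s own tool stack and proprietary data, so none are reproduced here. But as a minimal illustrative sketch of the hands-on flavor – combining a modeling approach (a random forest) with a practical-application safeguard (cross-validation) – an exercise in Python with scikit-learn might look like this, with a synthetic dataset standing in for the client’s data:

```python
# Minimal illustrative sketch (not the client's actual exercise or data):
# fit a random forest, then contrast its optimistic training accuracy
# with a more honest k-fold cross-validation estimate.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data; class exercises were grounded in the client's domain.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)

# Accuracy on the data used to fit the model is optimistically biased...
train_acc = model.fit(X, y).score(X, y)

# ...so estimate out-of-sample performance on held-out folds instead.
cv_scores = cross_val_score(model, X, y, cv=5)

print(f"Training accuracy:  {train_acc:.3f}")
print(f"5-fold CV accuracy: {cv_scores.mean():.3f} ± {cv_scores.std():.3f}")
```

Exercises like this reinforced a theme of the course: understanding why a technique works or misleads, not just how to run it.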
Course planning and material development for the first few weeks of the curriculum were completed within three months, including review and completion of slide decks and final review of the videos for the remote learning weeks. Still, feedback about the material and other details was constant throughout the delivery of the course, and adjustments were made frequently to improve the experience. For instance, the midterm exam was added after a few cohorts in order to give the students a better “early warning” about what the final exam would be like. (It is a 1-hour version of the 3-hour closed-book final.) It could also alert instructors and management about who might be in danger of failing, so additional help or time could be provided as needed. It worked well for both goals.
Homework assignment grades weren’t as good as the midterm at revealing students who might be having trouble grasping some of the deeper concepts, as homework was often a collaborative endeavor, which, on balance, was encouraged. The peer cooperation in learning and reviewing concepts together outside of class was in fact so strong and positive an experience for most participants that many cohorts voluntarily maintained their working groups past the end of the course as they resumed their normal work life. This helped accomplish one of the secondary goals of the training.
The course kicked off with an on-site week of lectures and hands-on exercises, followed by five remote weeks, each of which consisted of a one- to two-hour video lecture, optional office hours and Q&A sessions, and a graded homework assignment. This sequence was repeated once more: gathering for a week of on-site learning (which included a timed midterm exam) and then five more weeks of remote learning with video lectures and homework. The course wrapped up with a week of review and instructor support (as needed) and a timed, challenging final exam.
Practical Projects
Not mentioned so far is that enterprising students could opt to design, complete, and publicly present an original project in lieu of taking the exam. (If a student did both a project and the exam, the higher grade would be used.) Only the strongest students were expected to choose a project; it was explained that it would be much harder, that factors such as data availability or noise could be out of one’s control, that the open-ended nature could be disorienting if one is not disciplined, etc. Students needed permission from their manager as well as a senior instructor. But the rewards – for both the student and the entire program – were very strong. The return on investment (ROI) from some of the projects was alone capable of paying for the entire training program! And the students with successful projects (and their managers) were especially enthusiastic proponents of the training program, advertising the excitement they experienced integrating the new concepts they were learning into their everyday tasks to achieve new levels of productivity and insight.
A secret to how efficiently participants were able to integrate their new skills into their work was the guidance they received from an assigned mentor from Elder Research. At every stage, from project design to presentation, they could call on an experienced expert for advice. When possible, Elder Research mentors would guide students Socratically – that is, through guiding questions rather than direct answers – which helps lessons take hold. And students very frequently and fondly acknowledged the help of their mentors in their final presentations. Importantly, the presentations for one cohort were scheduled to occur during the kickoff week of a later cohort. The live audience of new students got to be inspired by the projects of the graduating students and could see what they might also be able to accomplish.
Projects are so valuable that the question might arise: “Why not make them required instead of optional?” Having done this (as a professor in graduate engineering courses), let me warn that not everyone is suited to an open-ended project. It can be very hard to help a person who is quite capable of accomplishing well-defined tasks to also thrive at inventing a suitable task from scratch. Also, data problems can strike at random, and time demands or stress can be felt quite acutely by some during the period of the course, not allowing the flexibility a “professional science fair project” demands. Does everyone on your team need to demonstrate the skill of designing a project? Allowing projects as an option (and slightly discouraging them) worked best for a required, graded course, in our opinion. And it led to a very high average quality among the projects completed.
The Stakes of Performance
There are many options for assessing student performance. Some organizations only want to know that a student attended a course and engaged with the material. If the organization needs to be certain that a set level of mastery has been achieved, then grades are needed. And what are the stakes of the training engagement for the students? Will they be fired if they don’t pass the course? Will they be promoted if they do well? Determining how the course outcome will affect the students helps guide decisions on:
- How the course will be evaluated (e.g., a percentage, a letter grade, or pass/fail)
- How much (if any) of the trainee’s normal responsibilities could be set aside
- How much time could be allocated for completion of coursework
Because the primary goal of this training engagement was to teach trainees to competently practice data science skills, each student was granted approximately half their time to devote to the course. Elder Research and the client agreed on a percentage-based grading system for the homework and exams, ending with an overall “Pass” or “Fail” outcome. Because the client also wanted to know who excelled, we summarized our thoughts at the end of each cohort on advanced and exceptional students.
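As a hypothetical sketch only (the actual component weights and passing threshold were set with the client and are not reproduced here), such a percentage-based scheme reduces to a weighted average mapped to a Pass/Fail outcome:

```python
# Hypothetical sketch: the real component weights and passing threshold
# were agreed with the client and are not shown here.
def overall_outcome(homework_pct, midterm_pct, final_pct,
                    weights=(0.4, 0.2, 0.4), pass_threshold=70.0):
    """Combine percentage grades into one score and a Pass/Fail outcome."""
    score = (weights[0] * homework_pct
             + weights[1] * midterm_pct
             + weights[2] * final_pct)
    return score, ("Pass" if score >= pass_threshold else "Fail")

score, outcome = overall_outcome(homework_pct=85, midterm_pct=72, final_pct=78)
print(f"{score:.1f}% -> {outcome}")  # 79.6% -> Pass
```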
ROI and Feedback
Elder Research has now taught ten cohorts, with more in the planning stages. The training has been a huge success. The ROI from some individual student projects recouped the cost of the training program many times over, and strong cohort teams across the globe still collaborate. We frequently receive feedback such as “This is the best training investment in me that the company has made in my 14 years at my organization” and “I’m super excited that my project board has aligned to the recommendations from this project and will be rolling them out as guidance to the field at the sales meetings scheduled for the end of the month.” And more advanced modules are currently being co-designed by Elder Research and the client team to extend the program’s wide success.
Acknowledgements
Many of my excellent colleagues at Elder Research did tremendous work to build and deliver this course and to support and mentor the students. I want to name those essential to designing the course: our Principal Scientist, Mike Thurber, who also oversees the projects and mentoring, and our Managing Director (and Senior Scientist), Dr. Jennifer Schaff, who manages the overall engagement, including the nigh-impossible task of herding the company founder.