It is widely accepted that newly graduated analytics and data science students require substantial investment from their first employer to become productive. While new graduates will always require more handholding than experienced employees, I’ve always felt there had to be a better way to prepare students for the workforce than what we do today.
Now that I’ve gotten a much closer look at how universities and their partnerships with the private sector work, I’ve come to believe that changes can be made both to university degree programs and to how companies invest in talent to make analytics and data science students more ready for the workforce. Note that this blog’s concepts should also be directly relevant to other applied fields with technical academic programs, such as computer science and engineering.
How Things Used To Be Done
When I went through college and graduate school, it was considered a good thing to take part in an internship during the summer. However, there weren’t many formal programs to support that effort. Further, the universities I attended, while very large and established, didn’t put much emphasis on getting real-world experience. Students of my generation regularly graduated without ever setting foot outside the halls of academia. That approach leads to situations where students have a lot of theoretical knowledge and book smarts but are unable to apply that knowledge effectively in a practical, real-world business setting. I discussed this concern in a prior blog; it is bad for both students and their future employers.
Where We Are Today
Today, many university programs require internships or other work experience as part of a degree program, and most of the rest at least heavily encourage it and attempt to facilitate it. Similarly, many companies have formal internship, co-op, and university partnership programs to recruit new talent while simultaneously helping to develop that talent. Universities also often offer, if not require, applied project courses that focus students on applying their knowledge to real problems.
All those programs are aimed at making students better prepared, and forward-thinking universities and companies have embraced this model alongside motivated students. However, there is more that can be done to make graduates ready for what they’ll face in their jobs and to enable employers to get more productivity, faster, from fresh graduate hires.
What Blacksmiths Did Right
Back in the day, if one wanted to be a blacksmith, it wasn’t a matter of taking some courses and then getting a job. A core part of becoming a blacksmith was a formal apprenticeship under a highly experienced blacksmith. This mentor would help the apprentice understand how everything worked and slowly move them from shoveling coal while watching the blacksmith do all the work, to helping the blacksmith do the work. Many other hands-on careers followed the same model. I recall hearing that it used to take seven years as an apprentice to become an official Japanese hibachi grill chef!
The point is that, especially for trade jobs, the thought of someone just taking classes in a classroom and then getting to work is unfathomable – and rightly so. There is a lot more to hammering out a horseshoe than simply reading about how to do it. There is a lot more to being a master carpenter than reading about the techniques a master carpenter uses. The best way to learn a trade is to watch, then mimic and practice what was seen in order to build up one’s skills.
How Data Science Can Borrow From Blacksmithing
If we really want our educational system to make students ready for the workplace, we need to consider some radical changes. Internships are fine. Co-ops are a further step in the right direction. However, it would be even better if an analytics and data science degree required substantive work experience as part of the program. In other words … an apprenticeship.
This could mean adjusting coursework requirements to make room for a year or more of focused apprenticeship. It might also mean extending a degree’s timeline. The assumption is that students will be paid during an apprenticeship so that they won’t need to worry about funding their education or running up student debt. An apprenticeship model will also require a change in how corporations make use of students. Assigning an employee to be a formal mentor to an apprentice for six months to a year necessitates modifying current approaches to working with students.
Why Should An Apprentice Model Be Adopted?
Data science is a dynamic, rapidly changing field. The courses taught at universities can be years behind the latest tools and approaches being used in the workplace. The only way to get skills up to date is to work in the real world and merge the realities of the workplace with the underlying theory being learned in school. At the same time, if students start working while they are still in school, they’ll be better able to target their coursework toward what they like best and to put the academic theory they are learning into a real-world context even as they first learn it.
Even if an apprenticeship approach is adopted, it won’t be a one-and-done endeavor. Data scientists will continuously need to learn the latest tools and techniques to stay relevant. I’ve discussed in the past that there is a difference between having outdated skills and having an outdated mindset. Top data scientists endeavor to continually learn on their own and from their peers. They’ll also be eager to give back by mentoring a young apprentice to follow in their footsteps.
Without a concerted effort from both the university and the corporate communities, however, we’ll remain trapped in the cycle of largely graduating smart, motivated students who are well versed in the theory of data science, but who have learned little about how to apply that knowledge in a way that will keep them employed. Agree? Disagree? Feel free to comment!