Industry 4.0 is a term coined by the German government in 2011, meant to drum up enthusiasm for the next wave of industrial technologies, including preparing the industrial sector for the internet of things (IoT). Call me old, but 2011 is a long time ago. In my view, there is nothing in the industry worth spending years trying to implement only to realize the work has just begun, except perhaps placing bets on major energy transitions toward regenerative industry ecosystems, or the productization of a new energy source such as fusion. Who believes Industry 4.0 is still relevant?
The Industrie 4.0 approach, as the Germans call it, is a well-documented initiative publicly funded with €200 million from the federal government. Across Germany, firms have benefited from a long-term strategy aiming to digitize, research, network, and “rapidly” advance the industrial infrastructure in that country and across the EU. The initial aim was to transfer scientific results into the development of technology. A decade later, why is the quest not finished?
Despite the hype, Industry 4.0 is a technocratic project that uses clever numerology, labeling the former industrial revolutions 1.0, 2.0, and 3.0. The 4.0 addition promises nirvana, but what does that entail? Presumably, it means a deep integration of computer systems, assuming that by the point in time it references, all-digital manufacturing is “complete.” In the United States, the term never caught on. Instead, policy folks speak of “smart manufacturing” and envy the German approach. Smart is never a good term for anything. Calling it out makes it kind of dumb by definition.
Manufacturing in the States is quite clearly lagging behind Germany in the widespread adoption of new technologies, an observation widely acknowledged in the community. Scientific American published a 2012 piece by Stefan Theil titled “The U.S. Could Learn from Germany’s High-Tech Manufacturing.” Dan Breznitz continued the conversation in a 2014 HBR article on “Why Germany Dominates the U.S. in Innovation,” and by 2015, the nonprofit Brookings Institution had released the report “Lessons from Germany.” The main culprit could be how small and medium-sized enterprises (SMEs) innovate. German SMEs are world class; US SMEs are not, implies US political author Steven Hill in the Atlantic’s 2013 piece, “President Obama Wants America to Be Like Germany—What Does That Really Mean?”. Is this true?
Germany may have gotten Industry 4.0 right, but what if Industry 4.0 itself is wrong? The problem is that workers are out of focus in Industry 4.0. This is harder to see through a European lens because the worker support systems are so good at compensating. Seemingly, we are back to exploiting labor, just with more efficient means of production.
Industry “four-point-0” is not just about robotics, by the way. Robotics is a relatively minor feature of contemporary factory work, at least according to MIT’s “The Work of the Future” study, which ran from 2018 to 2021. MIT reports that most of the worry, and some of the excitement, centers on much more mundane industrial control systems. Historically, these have been near-impossible to learn, have horrible user interfaces, and the machines refuse to talk to each other, so you need as many systems as you have machines. If a factory floor is a cacophony of spouses trying to get their points across, this machinery is the arbiter who has not even bothered to show up. The MIT study suggests investing in skills and training. Maybe so, but could that not be wasteful, too?
Consider that most of the manufacturing sector consists of SMEs, each with day-to-day concerns that far outshine corporate imperatives to invest in technology and re-train workers. Therein lies the trouble: policymakers, although well-meaning, would have us believe we have a massive re-skilling challenge on our hands.
It is worth asking whether the technology used for upskilling, or more importantly the technology we skill workers for, actually benefits the workforce. After all, why would the training need to be so difficult in the first place? Experts claim it could take thirty years to upskill Ohio’s and Michigan’s manufacturing bases, let alone those in Africa, South America, or Asia.
No company should be allowed to put overly complex technology in place. Unfortunately, there are no regulations to outlaw complexity, even when technology is tricky to use, too challenging to learn, and incapable of communicating with older generations of equipment. Operators trained in a day cannot run this machinery. At a minimum, there could be UX designers on the job to certify the tech for use in the factory. Would a moratorium on bad technology bring some shopfloors to a halt while invigorating others?
All that being said, what would the remaining curriculum look like? Still needed: core competencies in management frameworks, technologies (e.g., additive manufacturing and 3D/4D printing, AI/ML, edge computing with IoT and sensors, industrial production systems, no-code software, robotics systems), industrial operations platforms, and digitally enhanced operational practices, to name a few. Each could arguably be taught in a week instead of months or years, as exemplified by educators who award digital badges for such courses, sending workers toward lucrative careers with modest effort and investment. Right now, these ideas are limited by the poor state of factory floors, well-intentioned but stodgy educational institutions, and the lack of curated training paths.
Industrial tech has become too complex, and Industry 4.0 will soon lose relevance without delivering the value expected, even if the implementation strategy gets perfected. Perhaps, though, the workforce and the economy will be better for it. The “German envy” the US has been displaying is not helping us along. It may be as simple as ripping off the band-aid. What if the States were to develop a pragmatic, simple, next-generation Lean approach that would not take decades to implement? Take that, Europeans.
Source: https://www.forbes.com/sites/trondarneundheim/2022/03/24/is-industry-40-still-relevant/