In a previous post I outlined a cobot utopia in which collaborative robots extend workers’ abilities and compensate for some human weaknesses. From this perspective, cobots could keep aging workers on the job and help improve the image of industrial jobs, often described as Dirty, Dangerous and Difficult.
Cooperation between robots and workers could increase manpower productivity, thereby reducing the cost gap with low-cost countries.
How likely is this to happen?
Let’s put it bluntly: why should an investor spend on high-tech to compensate for human weaknesses, knowing that in a system combining robots and humans, the latter will remain the limiting factor?
Everything else being equal, why should an investor choose the precarious option of backing up an expensive workforce with cobots when a cheaper basic workforce is available in poorer, less advanced countries?
Everything else being equal, why should an investor choose to invest in a complex combination of man and machine when full automation may soon be available, if it isn’t already?
If investors face a choice between a cobot-assisted human worker and a fully automated process, I’m not sure many cobots will sell. One thing is sure: the robot makers will sell either way, robots or cobots!
Related: Cobots utopia
Promises of technologies for the future by economist Marco Annunziata on TEDTalks
“The worker of the future will be more like iron man than the Charlie Chaplin of Modern Times”
GE’s view about Industrial Internet
The rise of Big Data may be a new and serious threat. In these early months of 2014, this post explores possible scenarios for the future of Six Sigma.
Industry 4.0 and Big Data
Six Sigma was born, like many other methods, in industry. The fourth industrial revolution is about harnessing smart machines and objects, enabling them to communicate with one another and with the world via the Internet of Things. This will generate huge masses of data, and Big Data techniques will be needed to manage them.
Herein lies the threat to Six Sigma: this tremendous capacity to quickly compute and exploit incredible masses of data. Part of this ability can even be embedded directly in the machines and facilities.
Big Data techniques make it possible not only to quickly “manipulate” huge amounts of data, even heterogeneous data, but also to run analyses based on correlations and/or scenarios.
This capacity offers new perspectives for giving predictive exploration and self-learning abilities to objects such as machines.
Machines, equipment or even whole facilities will be able to self-check, learn to sense early signs of probable failure, and act accordingly: drive down to a safe state, slow down, warn, auto-correct, etc.
These “objects” will be able to work out which scenarios are most favorable, e.g. most productive, effective or efficient, and adjust themselves for optimization.
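To make this concrete, a machine sensing “early signs of probable failure” can be as simple as tracking a smoothed trend in one sensor reading and reacting well before a hard limit is hit. The following is a toy sketch under invented assumptions (the function name, thresholds and data are illustrative, not any real controller’s logic):

```python
# Toy sketch of machine self-monitoring: an exponentially weighted
# moving average (EWMA) flags drift in a sensor reading long before
# a hard failure threshold is crossed. All names, thresholds and
# data here are illustrative assumptions.

def ewma_monitor(readings, target, tolerance, alpha=0.2):
    """Yield an action for each reading: 'ok', 'warn' or 'safe_stop'."""
    ewma = target
    for x in readings:
        ewma = alpha * x + (1 - alpha) * ewma  # smoothed trend
        drift = abs(ewma - target)
        if drift > tolerance:
            yield "safe_stop"       # drive down to a safe state
        elif drift > 0.5 * tolerance:
            yield "warn"            # early sign of probable failure
        else:
            yield "ok"

# A bearing temperature slowly creeping up from its 60 °C target:
temps = [60, 60.5, 61, 62, 63.5, 65, 67, 70]
actions = list(ewma_monitor(temps, target=60, tolerance=5))
```

The point is that the machine reacts to the trend, not to a single out-of-spec value: here the last raw readings already exceed the tolerance band, but the smoothed drift only triggers warnings, buying time to slow down or auto-correct.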
Consequently, this very smart real-time or short-interval monitoring, based on a great number of parameters, raises the question of whether approaches like Six Sigma, focused on a very small number of critical parameters, are still needed.
Weaknesses of Six Sigma
In a January 2008 article on the Harvard Business Review blog (HBR Blog Network), Why Six Sigma Is on the Downslope, Tom Davenport points out five weaknesses of Six Sigma from his point of view:
- statistical mumbo-jumbo / seldom delivered on in most companies’ implementations
- didn’t incorporate information technology
- overly elitist, relying on Six Sigma experts when every employee should be a process improver
- only enabled incremental improvement, not radical breakthroughs
- wasn’t a good fit for innovation-oriented work.
From my perspective, these weaknesses are still valid, even if they deserve some moderation.
Opportunities for Six Sigma
In The Future of Six Sigma 2013 Update (date?), Thomas Pyzdek, author of The Six Sigma Handbook, recalls Six Sigma’s often-announced doom and its past evolutions, before expressing his hope that Six Sigma will embrace the Big Data revolution, which from his point of view is an opportunity, not a threat.
Big Data crunches the data warehouse contents to look for correlations. Correlations are then used for planning activities and, usually, the cause of the correlation is not pursued.
Six Sigma and the quality profession can add a dimension to Big Data by filling in the gap between correlation and causation.
Pyzdek gives examples of biases and misinterpretations in Big Data analyses based on correlations rather than planned experiments.
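The gap between correlation and causation is easy to demonstrate: a hidden common cause can make two unrelated variables correlate strongly. Here is a minimal, entirely hypothetical sketch (variable names and numbers are made up) in which machine load drives both coolant temperature and defect rate, so the two correlate without either causing the other:

```python
import random

random.seed(0)

# Hypothetical data: a hidden confounder (machine load) drives both
# coolant temperature and defect rate. Temperature and defects then
# correlate strongly, yet neither causes the other.
n = 200
load = [random.uniform(0, 1) for _ in range(n)]             # confounder
temp = [50 + 30 * l + random.gauss(0, 2) for l in load]     # driven by load
defects = [5 + 20 * l + random.gauss(0, 2) for l in load]   # also driven by load

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(temp, defects)  # strong correlation, no causal link
```

A correlation-only Big Data analysis would flag coolant temperature as a lever on defects; a planned experiment (varying temperature while holding load constant) would show it is not. That gap is exactly where Six Sigma practitioners can add value.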
His article ends with a somewhat paradoxical proposal for sheltering stranded “six sigma belts”:
Speaking of skilled professionals, the obvious preferred group for addressing Big Data issues is Statisticians. However, Statisticians are in notoriously short supply and have been for decades (if not always.) Six Sigma “belts,” quality engineers, and reliability engineers are trained in a significant subset of useful statistical techniques. This pool of skilled workers can be leveraged to greatly expand the reach of the few statisticians available in most organizations.
What I find paradoxical is the faith in Six Sigma’s future on the one hand and the escape route offered to its experts on the other.
Design For Six Sigma
Independently of revolutions to come, Six Sigma, just like Lean, could see its application gain new attention in the early design and development stages. The goal is to design right the first time, robust and reliable, for later efficiency in production: this is what is called Design For Six Sigma (DFSS) and Design For Manufacture and Assembly (DFMA).
In these fields, statistics, capability studies and similar methods and techniques retain all their relevance, (as yet?) unchallenged by Big Data.
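The capability indices mentioned above are the workhorse of this kind of work: Cp measures how wide the spec window is relative to process spread, Cpk additionally penalizes an off-center mean. A minimal sketch (the sample data and spec limits are invented for illustration):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Return (Cp, Cpk) from measured samples and spec limits.

    Cp  = (USL - LSL) / 6*sigma          -- potential capability
    Cpk = min(USL - mu, mu - LSL) / 3*sigma -- penalizes off-center mean
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative shaft diameters (mm) against a spec of 10.0 +/- 0.3:
diameters = [10.05, 10.10, 9.98, 10.12, 10.07, 10.03, 10.09, 10.06]
cp, cpk = process_capability(diameters, lsl=9.7, usl=10.3)
# Cpk < Cp here because the process mean sits above the 10.0 target.
```

Designing for Six Sigma means choosing tolerances and processes so that indices like these come out high by construction, rather than fixing capability problems after production has started.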
The question is: is the future of Six Sigma limited to this (kind of) niche?
Follow me on twitter: @HOHMANN_Chris