For years, there has been a contentious debate over the minimum wage. In the last few years, the debate has intensified due to the massive unemployment of the Great Recession, during which the total income earned by the middle class fell while the fortunes of the 1% improved. Despite this rosy outlook for the upper class, the wealthy have mounted fierce resistance to any effort to raise the minimum wage.
The minimum wage has not kept up with inflation. Study after study has shown that if the minimum wage had kept pace with inflation, it would be about $10 an hour. Yet when the mere suggestion of raising the minimum wage to $10 an hour is made in public discourse, conservatives cry foul and warn us that millions of teenagers will be put out of work. While a significant fraction of minimum wage workers are teenagers, a far larger proportion are older workers.
The Republican argument that raising the minimum wage would cost jobs fails to pass muster: the real value of the minimum wage has fallen over time, yet unemployment remains uncomfortably high except when a bubble props things up. The GOP maintains that a low minimum wage, or even *no* minimum wage, will increase employment. It seems odd, then, that while the real value of the minimum wage is lower than it was in the 1960s, unemployment is still very high. There is simply no evidence in this natural experiment to support their claim.
Whether or not to raise the minimum wage is a fair question. But it is a distraction. There is a deeper, more basic question we should all be asking. Think of the technology you use every day, at home and at work: computers, smartphones and tablets, programs and web-based applications.
I love web-based applications, for they are a wonder to behold. I have watched Google's Gmail evolve over time, and its office suite is just as impressive. Even at my workplace, we use web-based apps every day, and they have evolved noticeably in the short time I've been there, a mere three months. Suggestions, bug reports, incremental innovation: it's all happening, and at a fairly rapid pace. The web application I use at work beats the pants off any desktop application, including the Oracle interface I use from time to time.
The debate over the minimum wage is a ruse that obscures a much larger, much more basic question. Employees innovate all the time. They find new ways to do things with technology, even if it's just a mop. They write programs, learn programs, and find ways to work around programs that don't do the job right. They report bugs. They report errors. They make suggestions. Yet if one small suggestion leads to a leap in productivity, or even a modest improvement, the employee is not compensated for it.
If the minimum wage had kept up with inflation and been adjusted for gains in productivity, it would be set around $22 an hour. Who gets that compensation? Certainly not your average hourly worker. No, that's too good for them. By conservative standards, it's better to build character by withholding raises and demanding more hours of work. Why pay an employee more money when he could save that money and have time to look for a job he really likes? God forbid he should ever have time to start his own business. Having a captive workforce is very important to the upper classes. Just ask Alice Walton of Wal-Mart fame; she knows.
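The arithmetic behind the $10 and $22 figures is straightforward to sketch. The calculation below uses approximate, circa-2013 figures that are my own assumptions for illustration (the 1968 peak wage, rough CPI-U levels, and a rough doubling of labor productivity since 1968), not data from any particular study:

```python
# Back-of-the-envelope check on the inflation- and productivity-adjusted
# minimum wage. All constants below are approximations for illustration.
PEAK_WAGE_1968 = 1.60           # federal minimum wage in 1968, nominal dollars
CPI_1968 = 34.8                 # approximate CPI-U annual average, 1968
CPI_2013 = 233.0                # approximate CPI-U annual average, 2013
PRODUCTIVITY_MULTIPLIER = 2.0   # assumed rough doubling of productivity since 1968

# Inflation adjustment: scale the 1968 wage by the growth in the price level.
inflation_adjusted = PEAK_WAGE_1968 * (CPI_2013 / CPI_1968)

# Further adjust for productivity growth on top of inflation.
productivity_adjusted = inflation_adjusted * PRODUCTIVITY_MULTIPLIER

print(f"Inflation-adjusted: ${inflation_adjusted:.2f}/hr")
print(f"Inflation + productivity: ${productivity_adjusted:.2f}/hr")
```

Under these assumptions the inflation-only figure lands near $10.71 and the productivity-adjusted figure near $21.43, in line with the round numbers quoted above.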
Who wins the largest slice of the innovation economy? The manager? Sometimes. More often, though, it's the VP or the CEO. And don't forget the board of directors, which meets two or three times a year, sitting around the board table pretending to work while determining the fate of the employees. The people at the top are the ones who get the prize for innovation. Studies have indicated that they have captured over 90% of the productivity gains of the last decade and more.
While the question of the minimum wage is relevant, the question we should all be asking is this: should the productivity gains from innovation flow only to the top 1%?