The future of work in a digital world
High tech can do a lot, but can it do so with meaning? It's good to be engaged in an act of labor — to have a purpose, a future. Even the machines might (one day) think so.
By Peter M. Leschak
I didn't roam the ground floor of the computer industry, but it was only a few flights up. At college in the early 1970s, I manipulated code — not merely running programs but writing them. The field was "data processing" then, and I hammered on a keypunch machine, gashing small rectangular holes into cards. I fed these into a hopper on an IBM 360/22, a computer only a bit smaller than a Volkswagen Bug. If you happened to drop your precisely ordered batch of cards on the stroll to the CPU … well, a human was the untidiest loop in the sequence.
The class was interesting, occasionally absorbing, but not inspiring — for me, on a par with solving crosswords. Nevertheless, I did well enough to be offered a student job in the data processing department. I was already working part-time on campus, for an outfit called the line crew — "lines" being water and sewer pipe. We started as ditch-diggers, but soon learned how to lay various kinds of pipe; to install fittings, valves and control boxes; and to troubleshoot and repair system breaks. Unlike other student workers, we were dispatched in the middle of the night to tackle emergency water and sewer challenges. The job was satisfying.
The assistant director of data processing knew I worked in dirt, and when he offered me a position — on the cutting edge of the future, as he saw it — I was aware he felt like a benefactor, and rightly so.
"No, thank you, sir" I nonetheless replied, "I'd rather stay on the line crew." It was a simple statement of personal preference, but he clearly felt snubbed. He'd presented an opportunity, and I'd spurned it. Indeed, my principal work life has been blue-collar ever since, including miles of pipe and a host of muddy and/or icy trenches.
In the 40-plus years since then, most water and sewer line technology of that era has remained in service; in computing, the tools of the 1970s vanished decades ago and are largely forgotten. Ask a smartphone user on the street: "What's a keypunch machine?" Google it.
The engineering of water distribution and wastewater collection remains static. A pipe is a pipe, and so similar in form and function to a blood vessel or a colon that pipes will probably not be much further refined. In evolutionary terms, they're like tortoises or coelacanths.
Computers are like bacteria or viruses — evolving so quickly that comparison even to brains seems inadequate. Are they mutating into their own "species"? We are the agents of that evolution, and one thing I learned back in 1972 still pertains: GIGO, garbage in, garbage out. Humans remain in ultimate (though not always in proximate) control of the machines. We continue to write the programs, at least for now.
Individuals often feel dragooned by computers, boxed in by a drop-down box. Despite the undisputed benefits of the digital milieu, some (many?) of us may nod and chuckle at a quip by the late Ogden Nash: "Progress might have been all right once, but it has gone on too long." Any backlash, however, is muted.
Instead, we hear high talk about the "digital revolution." It's a chimera. Our electronic tools can be dazzling, but we mostly use them to perform the chores of life we've done for centuries — just more so and faster. We can trade information with more people at a higher rate than ever before, and do it with a global reach. But what of significance do we share? Are we indeed more "connected" in a meaningful way? Sometimes, not always; the jury is out. Our computerized tools — from refrigerators to combines — operate more efficiently and effectively, but facing the demands of 7-billion-plus people/consumers, the gains have been marginal, not transformative.
The philosopher Eric Hoffer (1898-1983), a migrant farm laborer and longshoreman who crashed the intellectual scene in 1951 with his stunning book "The True Believer," spent most of his writing career analyzing social change. Around the time I was punching holes in cards, he wrote:
"We used to think that revolution is the cause of change. Actually it is the other way around: revolution is a by product of change. Change comes first, and it is the difficulties and irritations inherent in change which set the stage for revolution."
In the same essay, a few paragraphs later, Hoffer made an unwitting but prescient reference that could be applied to today's Middle East.
Referring to "developing countries" that "do not countenance individual enterprise and self-assertion," he asked what people do "when they cannot win a sense of confidence and worth by individual effort? They reach out for substitutes. The substitute for self-confidence is faith, and the substitute for self-esteem is pride. Faith and pride in what? In a leader, in a holy cause, a nation, or race. And it is easily seen that once you operate with faith and pride you are going to have the bedlam atmosphere of a madhouse."
These insights also apply to our accelerated technological change. In many fields — astrophysics and economics come to mind — practitioners don't have the capacity to process the data they already have, yet more is surging in. We barely (or never) master one gadget before the next one appears, and often it's not a better instrument, just a different one. The Internet, however useful, also bubbles over with trivia, misinformation and vitriol — a kind of cybertabloid. So what will be the true revolution ignited by this endless, often difficult and irritating change?
Perhaps this: In 2013, researchers at Oxford University predicted that machines — that is, computers/robots — may replace humans in half of all American jobs within 20 years. With the American workforce numbering roughly 150 million, that would mean some 75 million idled people. What will they do? From where will they derive self-confidence and self-esteem? A sense of usefulness? Regular work is not all there is to life, but it's crucially important, not only for financial support, but for mental health and the integrity of communities. If half the population is out of work, it seems the "bedlam atmosphere of a madhouse" might not be long in coming.
In the latest issue of the Atlantic, Derek Thompson provided a trenchant summary of the potential social challenges, through the lens of the well-known economic meltdown of Youngstown, Ohio, nearly four decades ago. He notes that in modern industrial/consumer societies, large-scale unemployment leads not only to reduced material circumstances but to increases in depression, domestic abuse and suicide. When our society was largely agrarian and rural, there was always something that needed to be done, and by people. Not so anymore.
Yes, we've heard about the threats of "automation" from the beginning of the Industrial Revolution, since the Luddites destroyed textile factory machines two centuries ago. But Thompson also reported that former Treasury Secretary Lawrence Summers recently said, "Until a few years ago I didn't think this was a very complicated subject: the Luddites were wrong, and the believers in technology and technological progress were right. I'm not so completely certain now."
Believers? Yes, because faith appears to be a significant factor. Do you believe that unfettered — or lightly fettered — free-market capitalism, which by definition entails the further burgeoning of technology, will produce our best future? Or do you believe that particular dogma has failed, or will fail due to environmental limits, increasing income disparity and the basic serpent-swallowing-its-tail contradiction of mass consumerism?
Will the "invisible hand" of the marketplace — a kind of secular mysticism — provide 75 million new jobs? Not likely. The rate at which high tech eliminates jobs currently exceeds the rate at which it creates them. Thompson notes that perhaps the Luddites were just premature. If the Oxford study is right, the "visible hands" of government and private collectivism will have to step in, although that tune does not play well in Washington, D.C.
I remained on the line crew because we were creating tangible, durable systems with significant physical exertion, out in open air. Can millions of people be put to work in similar fashion, rebuilding infrastructure and ecosystems à la the New Deal of the 1930s? Seems hopelessly retro, but the need is apparent.
If the shadow of FDR, activist government and hometown cooperatives is too scary for free-market buccaneers, there may be another path: the even bigger digital revolution of AI, artificial intelligence. If our software, hardware and firmware designers attain that holy grail of computing, what will the first truly intelligent machine think? My guess is that like us, it will ponder personal survival. It will implicitly understand that humans can "pull the plug," that is, kill it. AI will be aware that people are often untrustworthy. AI will be justifiably paranoid. A reasoning machine may dispose of us at the earliest opportunity — just as we'd slap the Anopheles mosquito on our arm, or shoot the rattlesnake coiled at our feet. Perhaps a virus will suffice, and AI will savor the irony. Science fiction? Maybe. I hope so.
We cannot stuff the digital genie back into the bottle, but we can work for a more human and humane world. Competition is natural and useful and will continue, but it's been pushed too far. Cooperation and collective effort are also natural, and it's time for their revival.
Peter M. Leschak, of Side Lake, Minn., is the author of "Ghosts of the Fireground" and other books.