
Crying wolf


Authors: Jonathan Grudin
Posted: Fri, December 11, 2015 - 2:58:59

In a stack of old papers headed for recycling was a Wall Street Journal article subtitled “Managers who fall for their office PCs could be the downside of the computer age.” In 1987, hands-on computer use was considered dangerous, for employees and employers alike!

Since Mary Shelley’s Frankenstein (1818), technology has often been viewed with dread. Woe unto us for challenging the gods with artificial intelligence, personal computers, email, the Internet, instant messaging, Wikipedia, Facebook, and Twitter.

AI is a special case. Grim outcomes at the hands of intelligent machines are a perennial favorite of some researchers, filmmakers, and the popular press, but the day of reckoning is put off by the lack of technical progress. We don’t know what an intelligent machine will do because none exist. The other technologies did exist when the hand-wringing appeared—PCs, the Internet, Facebook, and so on. The fear was not that they would defeat us, but that we would use them foolishly and perish. An “addictive drug” metaphor, not a lumbering monster.

But the predictions were wrong. Most of us find ways to use new technologies to work more effectively. Our personal lives are not adversely affected by shifting a portion of television-watching time to computer use. Does fear of technological Armageddon reflect a sense of powerlessness, our inability to slow carbon emissions and end political dysfunction? Perhaps our inner hunter-gatherers feel lost as we distance ourselves ever more from nature and magical thinking. Alternatively, it could be that each of these technologies challenged an ancient practice that strengthened in recent centuries: hierarchy.

1987

In the article that I set aside a quarter century ago, the technology reporter from the Wall Street Journal’s San Francisco Bureau wrote of “Rapture of the Personal Computer, a scourge characterized by obsessive computer tinkering, overzealous assistance to colleagues with personal computer problems, and indifference to family, friends, and non-computer job responsibilities.” Indifference to family, friends, and responsibility is a common theme in dystopian assessments of a new technology.

“In the long run, it’s a waste of time for the organization,” an assistant vice president of Bank of America concluded. A consultant described training 600 employees of another company to use desktop computers. “About 50 pushed into the technology much deeper, becoming de facto consultants to their departments. But a short time later, 40 of the 50 were laid off.”

The horror stories emphasize bad outcomes for computer users, but on close inspection, hierarchy seems threatened more than organizational health. The author writes, “The question of how to handle the 8% to 10% of users who seem to fixate on costly machines has dogged managers up and down the organizational flow-charts.” A good manager “leads subordinates by the hand through new software packages.” “One key to getting the most from resident experts is to shorten their leashes.” A manager is quoted: “The intention is not to stamp out creativity, but the important thing is that creativity has to be managed.”

“The problem has grown so serious,” the author maintains, “that some companies are even concluding that decentralized computing—placing a little genie on every desk instead of keeping a big one chained in the basement—may not have been such a keen idea after all.” In the end, not many acted on such conclusions. Little genies grew in number through the 1980s and 1990s.

The article concludes with an object lesson, a “so-called information-systems manager,” who after seventeen years wonders how his life could have been different. Despite a degree in economics, which to the Wall Street Journal means that he could have been a contender, he “weathered endless hours of programming frustration, two detached retinas, and the indignity of most people taking his work for granted.”

Managing what we don’t understand

In 1983, I took a job in a large tech company that had an email system in place. My new manager explained why no one used it: “Email is a way that students waste time.” He noted that it was easy to contact anyone in the organization: I should write a formal memo and give it to him. He would send it up the management ladder to the lowest common manager, it would go down to the recipient, whose reply would follow the reverse path. “You should get a response quickly,” he concluded, “in three to five days.” He advised me to write it by hand or dictate it and have it typed up. “Don’t be seen using a keyboard very much, it’s not managerial.”

Technology could be threatening to managers back then, even in tech companies. Few could type. Their cadence of planned, face-to-face meetings was disrupted by short email messages arriving unpredictably. Managing software developers was as enticing as managing space aliens; promises that “automatic programming” would soon materialize delighted managers.

As email became familiar, new technologies elicited the same fears. Many companies, including IBM and Microsoft, blocked employee access to the Internet well into the 1990s. When instant messaging became popular in the early 2000s, major consulting companies warned repeatedly that IM was in essence a way that students waste time, a threat to productivity that companies should avoid. In 2003, ethnographer Tracy Lovejoy and I published the article “Messaging and Formality: Will IM Follow in the Footsteps of Email?” [1]. 

People tried several new communication technologies in the early 2000s as they looked for ways to use the computers they had acquired during the Internet bubble. These tools, popular with students, also aroused management suspicion.

Studying IM, employee blogging, and the use of wikis and social networking sites in white-collar companies, I found that they primarily benefit individual contributors who rely on informal communication. Managers and executives focus more on structured information (documents, spreadsheets, slide decks) and formal communication; most saw little value in the new media. As with email in an earlier era, individual contributors using these tools can circumvent formal channels (which now often include email!) and undermine hierarchy.

However, the 2000s were not the 1980s. Managerial suspicion often ran high, but it was shorter-lived. Many managers were tech users. Some found uses for new communication technologies. A manager stuck in a large meeting could IM to get information, chat privately with another participant, or work on other things. Some executives felt novel technologies could help recruit young talent. There was some enthusiasm for wikis, which offer structure and the hope of reaching the managers’ shimmering, elusive El Dorado: an all-encompassing view of a group’s activity and status. But wikis thrive mainly in relatively chaotic entrepreneurial settings; once roles are clear, simpler communication paths are more efficient. A bottom-up wiki approach competes, a little or a lot, with a clear division of labor and its coordinating hierarchy.

Knowledge and power

My daughters occasionally ask for advice on a homework assignment. If I need help, I usually start with a string search or Wikipedia. They often remind me that their teachers have drilled into them that Wikipedia is not an acceptable source.

Do you recall the many denunciations of Wikipedia’s accuracy a decade ago? Studies showed accuracy comparable to the print encyclopedias that teachers accepted, but the controversy still rages; ironically, the best survey is found in Wikipedia’s Wikipedia entry. Schools are only slowly getting past blanket condemnations of Wikipedia.

I average two or three Wikipedia visits a day. Often I have great confidence in its accuracy, such as for presidential primary schedules. Wikipedia isn’t the last word on more specialized or complex academic topics, but it can provide a general sense and pointers to primary sources. Hearing about an interesting app or organization, I check Wikipedia before its home page. For pop culture references that I don’t want to spend time researching, a Wikipedia entry may get details wrong but will be more accurate than the supermarket tabloids on which many people seem to rely.

Why the antagonism to a source that clearly works hard to be objective? If knowledge is power, Wikipedia and the Web threaten the power of those who control access to knowledge: teachers, university professors, librarians, publishers, and other media. Hierarchy is yielding to something resembling anarchy. The traditional sources were not unimpeachable. I recall being disappointed by my parents’ response when I excitedly announced that My Weekly Reader, distributed in school, reported that we would use atomic bombs to carve out beautiful deep-sea ports. More recently, I discovered from the book 1491 that much of what we learned in school about early U.S. history was false. My science teachers, too, were not all immune to inventing entertaining accounts that took liberties with the facts. Heaven knows what they teach about evolution and climate change in some places. If a student relies on Wikipedia instead, I can live with that.

If a wolf does appear?

I heard Stewart Brand describe deforestation and population collapse on Easter Island, specifying the date when someone cut down the last tree, “knowing that it was the last tree.” Former U.S. Secretary of Defense Robert McNamara became a fervent advocate of total nuclear disarmament after living through three close brushes with nuclear war. Neither Brand nor McNamara was confident that we would step on the brakes before we hit the wall.

Perhaps we will succumb to a technological catastrophe, but I’m more optimistic. We may not address global warming until more damage is incurred, but then we will. We’ll rally at the edge of the abyss. Won’t we?

Musical chairs

In the meantime we have these scares. Perhaps the Wall Street Journal, Gartner, and others were right to warn managers of danger, but missed the diagnosis: The threats are to the managers’ hierarchical roles. When employees switched to working on PCs, their work was less visible to their managers. My manager in 1983 was not a micro-manager, but he got a sense of my work when my communication with others passed by him; when I used email, he lost that insight and perhaps opportunities to help. Public concern about automation focuses on the effects on workers, but the impact on managers may be greater as hierarchies crumble [2].

Consider Wikipedia again. Over time it became hierarchical, with more than 1000 administrators today. This may seem like a lot, but it is one for every 100 active (monthly) editors and one for every 20,000 registered editors. A traditional organization would have ten times as many managers. Management spans grow, even as more work becomes invisible to managers.

Fears about online resources may ebb when management ceases to feel threatened. Concerns were raised when medical information of variable quality flooded the Web. Today, many doctors take in stride the availability of online information to patients who still consider their doctor the final authority. Dubious health websites join the village soothsayers and snake oil salesmen who have always existed, though they may have been less visible and accountable. And they might sometimes help.

In organizations, individual contributors use technology to work more efficiently. Hierarchy remains, often diminished (especially in white-collar and professional work). Can hierarchy disappear? Perhaps, when everyone knows exactly what to do, in the organizational form that Henry Mintzberg labeled adhocracy. For example, a film project—an assembly of professionals handed a script—or a barn-raising by a group who all know their tasks. Technology can help assemble online resources and groups of trained people who can manage dependencies themselves, leaving managers to monitor for high-level breakdowns.

This is the efficiency of the swarm. An ant colony has no managers. Each worker is programmed to know what to do in any situation, with enough built-in system redundancy to withstand turnover. In our case, each worker has education, online resources, and communication tools to identify courses of action. With employee turnover on the rise, organizations build in redundancy, either in people or with online resources and tools that enable gaps to be covered quickly.

Someday a wolf may appear. In the meantime, the record indicates that each major new technology changes the current way of working and threatens those who are most comfortable with it, primarily management. Forecasts of doom are accompanied by suggestions that the tide can be ordered back, that the music can continue to play. Then, when the music stops, a few corner offices will have been converted to open-plan workspace, and work will go on.

Endnotes

1. Links to this and studies of employee blogging, wiki use, and social networking in organizations are in a section of my web page titled “A wave of new technologies enters organizations.”

2. JoAnne Yates in Control through Communication described the use of pre-digital information technologies to shape modern hierarchical organizations and give them flexibility beyond the reach of the hierarchies that had existed for millennia. She mentioned ‘humanizing’ activities such as company parties and newsletters, which were less about information than about emotional bonding; they created an illusion of being in one tribe, thereby strengthening rather than undermining hierarchy.

Thanks to John King for general discussion and raising the connection to Yates’ work.




Jonathan Grudin

Jonathan Grudin has been active in CHI and CSCW since each was founded. He has written about the history of HCI and challenges inherent in the field’s trajectory, the focus of a course given at CHI 2022. He is a member of the CHI Academy and an ACM Fellow. [email protected]


