This year’s Labor Day finds millions of Americans — those who labor in offices — almost bubbling about the prospects for an epic transformation of their workspaces. Within Corporate America, working remotely may soon become a permanent standard operating practice.
Imagine that. No more horrific daily commutes. No more stressing in cramped cubicles. And, maybe most of all, no more hours spent trapped in corporate meeting rooms that deaden the soul.
American workers simply hate meetings. In November 2019, a few months before Covid hit, one reputable survey found 67 percent of employed Americans insisting that meetings were keeping them “from getting their best work done.” Just 11 percent saw their meetings as “productive.”
Some 60 percent of workers, adds a Harris poll, say they spend more time prepping for meetings than they spend at the meetings themselves, with 8 percent saying they’d rather undergo a root canal than meet. Endless rounds of meetings, other research shows, are exacting a heavy price. One Fortune 500 company has estimated that unproductive meetings cost the firm $75 million a year. Meeting “pain,” sums up a Harvard Business School analysis, has “real consequences.” Time wasted in meetings “eats into time” for the solo work “essential for creativity and efficiency” and disrupts our capacity “to focus without distraction” on cognitively demanding tasks.
The academics behind that Harvard analysis have a laundry list of suggestions for corporate meeting planners, everything from collecting “data from each person” to regularly debriefing “as a group.” Other consultants have their own checklists for conducting effective meetings. None of these checklists have made a significant impact.
The obvious question: Why? Does our contemporary meeting glut reflect the cluelessness of the managers who run corporate meetings or something more fundamental? Why are executives in Corporate America now averaging 23 hours a week in formal meetings, up from fewer than 10 hours in the 1960s?
A little corporate history can help here. Back in those 1960s, we were still living in a mass-production Industrial Age. But by the 1980s a new “Information Age” was emerging, and analysts were preaching that Information Age success would come to companies that started customizing products to what customers wanted. And who knows best what customers want and how to deliver on that desire? The employees who interact directly with consumers and the production process. Effective enterprises, the new Information Age mantra continued, value all these employees.
How could corporations maximize that value? In the late 20th century, consultants rushed forward with strategies that would go by a host of different labels. Quality circles. Total quality management. Experts would talk endlessly about “reengineering” corporations into “high-performance organizations.” Employee involvement, the experts stressed, would give corporations smart enough to try it the “ultimate competitive advantage.”
This empowering ethos swept through Corporate America in the 1980s and 1990s. Managers and staffs sat together, in meeting after meeting, drafting, discussing, and debating “mission” and “values” statements. But precious little actually changed. Only a small fraction of the companies that claimed to be empowering workers, a 1998 Journal of Business study would reveal, were actually engaging in any serious empowerment work. Executives, concluded an Administrative Science Quarterly study published in 2000, were essentially just going through the motions.
The explanation? Observers typically blamed corporate bureaucracy, everything from turf wars between managers to the endless delays while ideas go up and down the decision-making ladder.
Bureaucracies, sociologists tell us, grow naturally — and inevitably — in enterprises organized along hierarchical lines. In the classic corporate hierarchy, workers sit at the base of a pyramid. Above them rest layers of management. The more layers, the steeper the pyramid, the greater the distance between actual workers and ultimate corporate decision-making authority. To succeed in the Information Age, analysts contended, corporations needed to “flatten” these towering pyramids. In the early decades of the Information Age, corporations failed to do that. American companies weren’t becoming leaner, economist David Gordon documented, just meaner. Corporate America’s demand for managers, the Wall Street Journal reported in 1996, was “booming.”
None of this, careful critics pointed out, should have surprised anyone. Information Age theorists tend to see corporate hierarchies as anachronistic, easily disposable hangovers from bygone days of Industrial Age command-and-control. But corporate hierarchies still serve a real purpose in our new Information Age. They amount, in essence, to income-maintenance programs for top executives.
Peter Drucker, the father of modern management theory, understood this income-maintenance dynamic years before anyone else. In any hierarchy, Drucker noted in 1982, every level of bureaucracy must be compensated at a higher rate than the level below. The more levels, the higher the pay at the top. Hierarchies would remain appealing to executives, he argued, so long as they prop up and push up executive pay. His solution? To make hierarchies less appealing to executives, Drucker suggested, limit executive pay. No corporate executive, Drucker wrote, should be allowed to make more than 20 times what the company’s workers make.
In 2005, upon Drucker’s death at age 95, obituaries hailed his enormous contribution to modern management science. His ideas, one analyst told the Financial Times, “have become part and parcel of today’s commonsense understanding of business.” But corporations before and since have studiously ignored Drucker’s wisdom on limiting the gap between chief executive and worker pay.
Indeed, America’s CEO-worker pay gap — just 21 to 1 in 1965 and still only 40 to 1 in 1982 — is now running at 351 to 1, the Economic Policy Institute noted last month.
Propping up today’s enormous levels of CEO compensation: layer upon layer of middle managers who all need something to do. Meetings provide that something. In today’s corporate workplaces, meetings have become little more than make-work for middle managers, one key reason why so many office workers hate to meet. Meetings now rank, The Surprising Science of Meetings author Steven Rogelberg told the Washington Post last month, as “the largest single cost that goes unevaluated and undiscussed” on organizational balance sheets.
True, but beside the point. Our contemporary meeting-happy corporate culture meets real corporate needs. Middle managers need meetings, and top executives need middle managers.
Back in the 1960s, a time of far fewer meetings in Corporate America, we didn’t have a limit on the CEO-worker pay gap along the lines of what Peter Drucker proposed in 1982. But we did have a steeply progressive federal income tax. Individual incomes over $200,000 faced a 91 percent tax rate until the mid-1960s. That left top execs with little incentive to inflate their paychecks by any means necessary. Why bother? Uncle Sam would merely tax away their gains.
Our current top federal income tax rate: just 37 percent. That needs to be higher. We also need a pay-gap limit along the lines that Peter Drucker proposed four decades ago, and progressive lawmakers in Congress are pushing in that direction. The pending Tax Excessive CEO Pay Act would increase federal taxes on corporations whose gap between CEO and median-worker pay runs greater than 50 to 1.
Meanwhile, office workers of America continue to meet and meet and meet. Amazon CEO Jeff Bezos, for his part, has a solution, what he calls his “two-pizza rule.” Bezos says he won’t go to any meeting if two pizzas wouldn’t feed the entire group.
How about a “two-mansion rule” instead? The rest of us could simply stop attending meetings inside corporations whose top execs own more than two mansions. Bezos has a half dozen.