Note: Clay Reynolds (retired professor of literature and writing, Univ. of Texas at Dallas, one-time dean, significant novelist, essayist, and book reviewer extraordinaire) is a four-decade veteran/survivor of American higher education. What he summarized in an email, reacting to my praise of Professor Gayle Greene’s superb essay, “Toxic Clichés,” enhanced my outdated knowledge (as a one-time academic, now online essayist). Clay’s compact history merits wider distribution, so I turned his email into an interview format, thus my questions and his cogent, articulate, and witty responses. Read Greene’s essay first. RBecker
RB: What happened to American higher education across your career?
CR: Gayle Greene’s fine essay aptly describes the deterioration of learning across the board, in institutions of every kind. We start with declining entrance requirements, academic rigor, and scholarly standards, with the worst outcomes caused by lower-level schools offering “open enrollment” (shredding essential entrance requirements beyond any high school diploma). Thus junior and community colleges became the functional equivalent of simply extending high school into “thirteenth and fourteenth grades.”
RB: Does a single image capture your career and sense of the problem infecting higher education?
CR: My academic career reminds me of the ancient Danish/English King Cnut’s chosen duty, depicted as wading into the surf to confront the incoming tide. With his throne installed facing the seashore, Cnut addresses the sea: “You are subject to me, as the land on which I am sitting is mine, and no one has resisted my overlordship with impunity. I command you, therefore, not to rise on to my land, nor to presume to wet the clothing or limbs of your master.” When the tide persisted, drenching the royal feet, the king cried, “Let all the world know that the power of kings is empty and worthless,” excepting “Him by whose will heaven, earth and the sea obey eternal laws.” If one substitutes “ignorance” for “tide” and “knowledge” for “overlordship,” the story speaks to the futility of educating ill-prepared students with insufficient resources.
RB: Are we talking a sudden decline or over two generations?
CR: I have witnessed the erosion of learning standards over nearly four decades, from the late 1970s, when a massive expansion in the size and number of institutions of higher education rippled forth after the Johnson presidency. Many community colleges “grew up” into instant universities, but without a parallel expansion of standards for professional faculty requirements; on the contrary, the exploitation of newly minted PhDs was rampant, producing the “scholar gypsies” phenomenon wherein gifted instructors had to accept temporary contracts, not tenure tracks (anticipating the corrosive “adjunct” status common today). Budget cuts reduced tenure slots, and thus the permanence of quality staffing, even as others (sometimes called “dead wood”) hung on. This shrinkage strained state-funded schools and, worse still, private ones. The dearth of openings contrasted with a surfeit of applicants: in 1978 there were 400 full-time jobs nationwide for over 1,100 newly minted PhDs.
Along with reduced entrance requirements, superior high school students took college-like classes that allowed them to “place out” of a more demanding college experience; even “honors” or “advanced placement” classes did not provide, for example, the equivalent of a good English composition course taught by a more qualified teacher. The high school instructors did little more than “teach to the test,” a notably inadequate preparation in skills or knowledge for higher college-level courses. Many large, flagship state universities had dropped foundational “General Education Requirements,” pressuring freshman applicants to satisfy such basic standards at junior or community colleges. Overall, the first year no longer functioned as a necessary “trial by fire,” designed to separate those especially geared to college from those much better off establishing career skills elsewhere (higher academic training is not for everyone; there are other, better venues for finding skills and life careers).
RB: How did these cumulative, systemic shifts impact the management of higher education?
CR: These downsides were solidly in place when the Reagan Administration torpedoed higher education by decreasing federal funding to state and even private campuses. Closing the spigot forced universities to hire top administrators without keen understanding of academics but good at crunching numbers, fundraising from private donors or corporate entities, and dealing with legislators. “Managing education” shifted to hobnobbing with wealthy donors or corporate sponsors, often enhancing athletic programs, particularly football and basketball, so that better-heeled alums could relive sophomore fantasies from private boxes.
This conversion changed universities from centers for learning, theoretical research, and inquisitive investigation into “education businesses” going hat in hand to donors for sophisticated research in technology, new stadiums or buildings, even corporate jets. Emphasis shifted to graduate schools and away from educating undergraduates, many of whom struggled simply to keep up; graduate students had second lives as research assistants, teaching assistants, and go-fers for high-dollar faculty now conducting research and development aided and abetted by shrewd corporate entities.
RB: So much then for the idea of general education?
CR: Less general operational money through the early 2000s further axed basic requirements (and skills). Rigorous English and history requirements were reduced to bare bones: for example, twelve hours of composition, rhetoric, and literature cut to nine, with only three in literature (and technical or “business writing” substituted); nine hours of history reduced to three. Philosophy, foreign language, and economics were exiled to the “electives” list while minimum requirements for math and science continued. In sum, STEM was born as a meaningful acronym, as universities converted from centers for higher learning into centers for advanced vocational training and extension labs for Dow Chemical, Procter & Gamble, and all kinds of Big Pharma and industrial tech companies.
At the same time, deals were struck between major hospital corporations and universities, combining resources and staffing as older city, county, and regional hospitals closed or were absorbed into corporate entities. Rural and small-town hospitals faced their own fatal pressures. Medical research, which had once been almost entirely the province of independent hospitals, was now corporate-funded and associated with universities. Gone was the idea that a specialized graduate needed apprenticeship, internship, or on-the-job training; instead, their new diploma won them full-time professional work, some on day one as corporate executives.
Many vocational areas – nursing, accounting, dental assistants and hygienists, X-ray techs, and the like – once the province of independent “schools,” were absorbed wholesale into the university and junior college system, increasing enrollment, cutting the “time to degree,” and maximizing the production of job-ready graduates. So much for foundational knowledge necessary for adulthood, if not citizenship. What mattered was that a graduate was qualified or certified (though untested as to performance) and how quickly that certificate could be earned.
RB: And then came the predatory revolution in student loans?
CR: Reagan-era changes transformed how educational loans worked, with downsides. Under Eisenhower, then LBJ, publicly administered low-interest loans, with extended, somewhat open-ended terms, served less affluent students. Reagan privatized the whole program, turning the loans over to profit-oriented banks that could sell them as commodities, so students no longer dealt with a single, government-connected entity. Suddenly, student loans had different rules, lorded over by new corporations with draconian methods of collection. Reagan also eliminated options to acquit some liability through teaching, military service, or serving in the Peace Corps, and George H. W. Bush allowed new penalties on late payments and defaults, restricting the restructuring of repayments when post-graduate income fell short. Thus originated the steady rise in defaults that, forty years later, has snowballed into one of the largest areas of consumer debt in the country.
During the same period, Pell Grants and financial aid for impoverished students came with less obvious strings attached: failing to maintain certain grade averages or hit certain benchmarks of progress could result in cancellation, even a demand for repayment. As with the broader loan program, there was absolutely no governance over how the money was spent by the recipient, either. Students who often had never had more than a few hundred dollars at their disposal, if that much, were suddenly in possession of thousands, even tens of thousands, of dollars to manage wisely. It was not surprising that money designated for tuition and fees, room and board, books and supplies was squandered on booze and drugs, even new cars and clothes. Campus merchants looked forward each semester to a steady income flow; big problems followed when money ran out for tuition and the like. Plus, handing out credit cards to any student with a checking account, regardless of assets, at steadily rising interest rates, was a formula for credit ruination.
RB: So all these changes must have directly impacted classroom performance by all parties?
CR: In the meantime, faculty were cautioned, unofficially, that failing too many students would have a deleterious effect on the university’s ability to keep the lights on. This crunch especially affected smaller, open-admission schools, which began to monitor pass/fail rates, chart how minorities fared, and suggest, sometimes order, that lowering standards served the institution. Minimum pass quotas were even put in effect, and woe to any instructor not meeting these minimums two semesters in a row. The new mantra became, “The average student is below average.” Indifferent students brought pillows to class, sitting way back to facilitate napping. I responded appropriately, never pressured about pass/fail quotas.
Lost in the process was attention to the Fine Arts, or just Arts, but there were far more practical replacements: what had been nascent programs in radio/TV/film grew into full degree programs; classes developed in the visual arts and performing arts as well as in art history (an outgrowth of “art appreciation”) and film studies; newly coined “computer-centric arts design and development” surfaced at USC, Chapman, Ringling, Purdue, plus Columbia and NYU. Focus on theater gained specialized categories (acting, directing, technical and production skills, etc.), though the faculty rarely boasted professional training. Similar developments in music, painting, sculpture, dance, etc. were commonplace. The motivation behind this was to produce more people who could teach these things, particularly in the public schools, where they had solidly shifted into marginal extra-curricular activities.
In the meantime, as state and federal funding withered under conservative government policy, tuition exploded: in 1980 a four-year degree with room and board from a good, if not great, state institution cost around $15,000. By 2010, that figure hit over $100,000, and specialized skills required a graduate degree, adding $20-30K per year, room and board not included. A recent estimate put the full cost of a four-year degree at upwards of $120,000.
As the nineties dawned, I began to see erosion in standards and requirements, as mandatory essays were suddenly reduced to five-hundred-word summaries and exercises. Faculty saddled with large survey courses (200 students) turned lecture and grading duties over to TAs, spot-checking the results, or so they claimed. They would make an appearance in the classroom, then turn it over to a graduate student to carry on. That was particularly true in history courses. I was scandalized, but my notes about it were ignored by both my own dean and the university’s undergraduate dean. “It’s just the way it is,” the latter told me when I complained in person. “I’m retiring in two years, and it can’t happen fast enough.”
RB: What about your experience as an associate dean and the futility of accountability?
CR: Enter SACS and its regional sisters around the US. The Southern Association of Colleges and Schools (SACS) is one of just six regional accreditation authorities in the United States recognized by the Council for Higher Education Accreditation (CHEA) and the United States Department of Education (USDE). SACS provides accreditation for over 13,000 public and private preschools, elementary, junior, and high schools as well as hundreds of colleges and universities in the South plus the Caribbean, Mexico, and South and Central America. SACS was a more or less toothless, rubber-stamp outfit staffed by faculty from minor schools glad to skip teaching for routine paper pushing and free dinners and lunches once or twice a month. It required periodic reporting that took the form of stacks of paperwork no one ever read but that satisfied the requirement for reporting on all aspects of an institution’s function.
Under George W. Bush’s gubernatorial administration, though, and in coordination with the mostly Republican governors of the affected states, SACS and its sister branches grew some teeth. Now there was a demand for “accountability” and “governance,” and terms like “quantification of learning outcomes” and “assessment” became instantly familiar. No one understood what these were or what they actually meant, so my fellow associate deans and I were given the task of evolving some kind of standardized method of assessment. We met way into the night for several weeks. What we came up with was so innocuous and meaningless as to be laughable to any educated person. Our local SACS officer was delighted, though. “I knew you could do it,” she said. All were duly submitted to the university SACS office; I am 100% confident that not one was ever read. I stepped down as associate dean and returned to the classroom in 2006. This standardized submission was used by all faculty in my school (as we didn’t have departments) from 2006 through the year of my retirement in 2019; so far as I know, it’s still being used as a template.
RB: So what does all this change add up to? What are the larger lessons?
CR: On top of these “internal” dynamics, the times produced a growing intolerance for free expression and for assignments that challenge contemporary values and standards of behavior. But it is all of a piece. What happened was a general redefinition of what a higher education actually means, what its value is, and how it fits in with modern life. Far from being a privilege, as it was commonly regarded in my high school days, something to be earned and demanding struggle, college is now a rite of passage, much as high school became in the post-World War II decades. College reset the line of adulthood forward from eighteen to about twenty-one or twenty-two, when individuals presumably had to shoulder adult responsibility and the burdens of citizenship. As the two logical steps that followed secondary school at mid-century (being drafted/getting a job, and getting married and starting a family) faded into the future, college replaced them. Not everyone made it, but then, not everyone made it in the military or marriage, either. College wasn’t for everyone; neither was parenthood nor full-time employment. But with no military draft threatening male drop-outs, keeping them out of the bars and in the library at night, and with women beginning to think of careers beyond a future of housewifery, college fostered another season of adolescence.
About that same time, expectations of what a college graduate should have learned, should know, dropped and narrowed. No longer was a college graduate especially well read, well informed, well versed in social sciences and math, capable of speaking and reading a foreign language, an appreciator of fine art, music, and theater, knowledgeable of basic biology, chemistry, and physics. Now all was specialized. It’s impossible to measure the disastrous effects of the loss of the humanities as an emphasis in higher education. This culture exhibits all the vulnerabilities that follow when the best-educated people of a generation have not been challenged by abstraction, not been tempered by the strictures of rhetorical argument, have not been taught how to investigate and conduct independent research, cannot articulate an argument, and don’t know the difference between a topic and a thesis. When people who have tasted, even succeeded in, higher education cannot tell a lie from the truth based on the presented facts, then society is in deeper trouble than it can imagine. It explains why seemingly well-educated people buy entirely into the absurd spewings of a man like Donald Trump.
Ignorance is not measured by a piece of paper hanging on a wall. Neither is knowledge. The key to education is that it teaches you how much you do not know, but then gives you the tools to find out. Somewhere the American academy lost that point; the real question is, now that it’s devouring its own children, if it, like Saturn, will also become nothing more than a myth.