LEFT BEHIND
The Politics of Education Reform and Rise of the Creative Class
In 2002, Carnegie Mellon University professor Richard Florida published the best-selling book The Rise of the Creative Class. Florida explicitly built on the work of Bell, Drucker, and Reich, but The Rise of the Creative Class went even further, arguing that, by the early 2000s, the economy was driven not just by knowledge labor but by the growing number of Americans who were engaged in valuable “creative” work, in his words, the “decisive source of competitive advantage.” These Americans flocked to cities and valued leisure activities, flexible work hours, and diversity in their occupational and geographic choices. If cities wanted to facilitate economic prosperity, Florida deduced, they should cultivate an attractive creative-class atmosphere, including a “cutting-edge music scene or vibrant artistic community.”1
For Florida, what distinguished the creative class in the early twenty-first century was that its members were increasingly unmoored from the more hierarchical employment relationships that had existed when the professional class emerged in the 1970s and 1980s. Indeed, though they didn’t yet see themselves as a class, he believed they were connected. “Rather than live in one town for decades,” he argued, including himself in the creative class, “we seek places where we can make friends and acquaintances easily and live quasi-autonomous lives.” And for Florida, this experience was generalized to “modern life,” which was “increasingly defined by contingent commitments. . . . Where people once found themselves bound together by social institutions and formed their identities in groups, a fundamental characteristic of life today is that we strive to create our own identities. It is this creation and re-creation of the self, often in ways that reflect our creativity, that is a key feature of the creative ethos.”2
This group, however, as had been the case with Reich’s “symbolic analysts,” represented only about 30 percent of the labor force. Though The Rise of the Creative Class was not really a book about education, Florida clearly defined the group as one that was overwhelmingly college educated. Indeed, his “talent index”—which policymakers were encouraged to consult—represented “a simple human-capital measure of the percentage of the population with a bachelor’s degree or above.” The prosperity of cities, therefore, essentially correlated with the percentage of the population holding college degrees. By this logic, the way to create broader prosperity was for cities to develop a “people climate” that supported creativity and would attract talented workers.3 For the rest of Americans, who lacked college degrees, rootlessness and disruption brought insecurity and despondence—not opportunity and creativity—as with the working-class Kenoshans after Chrysler closed the AMC plant. A reasonable response to this problem might have been to figure out how to ensure those without college degrees could also have good lives.
Florida did note the growing tension around this inequality. And, to his credit, he also pointed out that in his diversity indices—including immigration and the “gay index”—nonwhites were clearly not benefiting from the new creative-class cities, a development he called “particularly disturbing.” In response, however, Florida could only make the case for “tapping the creativity of these people” or at least helping them “share in some of the rewards.” While Florida noted the creative class had doubled since 1980, he also pointed out the continued trajectory of what Bell had shown in the 1970s: the service sector now made up about 45 percent of the workforce, while the manufacturing sector had been reduced to about 25 percent of workers. Florida noted that many service-sector jobs, which paid insufficient wages and were becoming less sustainable over time, grew out of necessity, as the growing creative class paid these workers to do the “chores” they needed done in restaurants, home care, janitorial work, and lawn services.4
Nothing could be done, however, to directly create good jobs, provide noncreative-class workers more rights, or “recultivate and rebuild” older forms of community. All of these efforts, according to Florida, were “fruitless, since they fly in the face of today’s economic realities.” He implored members of the creative class to push their political leaders to invest in human creativity, which included R&D, education, and the arts.5 Indeed, Florida did little more than reprise the human-capital-as-destiny arguments Clinton, Reich, and Riley had made in the 1990s.
The Rise of the Creative Class, widely read and cited in the 2000s, highlights just how deeply the education myth permeated American culture and politics.6 There simply were no other serious mainstream possibilities for cultivating either economic security or social esteem for those who lacked the right degree. Democrats, by the end of the 1990s, had emerged as the preeminent champions of the education myth: in a globalizing world, the right education was all it took to give Americans the opportunity for secure lives, and that opportunity represented the common starting point for workers in a supposed meritocracy. Many Democrats still supported some legacy social democratic interventions such as Social Security and Medicare or the very limited supports provided by a reformed welfare system, while many Republicans, of course, wanted to go further in shredding union rights and the social safety net. In the 1990s, however, more and more Republicans, like President George H.W. Bush, Secretary of Education Lamar Alexander, and Wisconsin governor Tommy Thompson, also began to embrace human capital as the centerpiece of a conservative argument about how to help workers in this leaner world.
As this chapter makes clear, President George W. Bush (2001–9) did more than any other Republican to elevate the education myth with his work on bipartisan education reform—No Child Left Behind—which assumed education was essential to any shot at a decent livelihood. Even when Republicans and Democrats disagreed over tax cuts or, eventually, the war in Iraq, the notion that investment in education represented the panacea for equal opportunity in a competitive job market continued to grow from 2001 on. Celebrations of the power of the educated class, such as The Rise of the Creative Class, spun deeper and deeper fantasies about the importance of investment in human capital, even arguing that urban economic development required it. All the while, Democrats further embraced market-based education reforms that implicitly viewed education as the commodity every young person must have to succeed. Experts and politicians across the political spectrum now also pushed for “accountability” in higher education, just as they did for K-12, asserting that the responsibility for economic opportunity lay with the individual or the school, not the overarching social structure.
The consequences of this path were made tragically evident in 2008, when the economy collapsed after decades of financial deregulation and mounting inequality. In the pivotal election that year, Illinois senator Barack Obama raised expectations that he would lead an effort to reinvigorate a robust American social democracy, running in the Democratic primary against disastrous free trade deals, for union rights, and for holding corporations accountable for their actions in the unfair economic conditions under which Americans worked and accessed social goods like housing. After a resounding victory, however, Obama and congressional Democrats did little to change the course of American politics, instead bolstering the myth in both rhetoric and policy. By 2010, the first signs of a political revolt against the education myth—from a growing populism harnessed by right-wing reactionaries—were apparent.
No Child Left Behind: The Mountaintop of the Education Myth
By 1996, Bill Clinton had led Democrats to make education reform a key part of their political brand. Kansas senator Bob Dole, responding to the rightward shift in the GOP after 1994, took an ultraconservative perspective on human capital, pushing vouchers and slamming teacher unions as impediments to the reform necessary to maximize economic opportunity for future American workers. Dole’s resounding defeat by Clinton convinced future Republicans running for office to accept the principle of federal support for school reform, and this group prominently included Texas governor George W. Bush. In Congress, this political shift brought bipartisan increases in education funding—$7 billion in the 1997 budget agreement between the Republican-controlled Congress and Clinton, for example—and in 1998 and 1999, calls by both Democrats and Republicans for greater accountability along with this new money.7
In states like Wisconsin, Republicans such as Tommy Thompson also further embraced the conservative iteration of the education myth. In Thompson’s 1997 State of the State address, for instance, the governor connected the “end” of welfare with investment in education and job training, arguing, “Not just in Wisconsin, but in America, if it were not for the bold, visionary and courageous work done here, thousands of families across this country would be sentenced to life in a failed welfare system instead of capitalizing on the freedom to pursue their dreams.” Citing the reduction in welfare rolls during his administration, Thompson argued that investing in education would ensure all those forced to work could find good jobs, thus also reducing the state’s growing number of incarcerations: “Our goal must be no less ambitious than creating the highest skilled, most productive workforce in the world, with businesses flocking to Wisconsin for our greatest asset—our people. It is my goal as we march toward the year 2000 to make education so compelling and relevant to the lives of our youth that someday we put the Department of Corrections out of business.” Bringing together a number of education themes in the 1990s, Thompson argued, “Parents must be empowered with more choices; Education must be relevant to the work-place; Schools must be held accountable for their performance; and Technology must pervade every facet of education.” Here Thompson sought to preserve the market-based Milwaukee School Choice program (then under challenge in the courts) and, working with Milwaukee activist Howard Fuller, to make it easier to charter new schools in the city.8
The presidential election of 2000 pitted former Tennessee senator, DLC Democrat, and vice president Al Gore against Texas governor Bush. Both candidates, propelled by the education myth, campaigned on investing in education and increasing school accountability. After almost twenty years of politicians arguing it was the key to economic opportunity in a global economy, in fact, education had emerged as the number one campaign issue in 2000. For Bush, education reform represented the centerpiece of the slogan “compassionate conservatism,” and his campaign, building on a record of supporting accountability, school choice, and increases in education spending as governor, outlined a plan to hold states accountable in exchange for increased federal funding. Bush employed the narrative of the so-called Texas miracle, in which test scores had appeared to increase dramatically and dropout rates supposedly decreased during Bush’s time as governor, even though the story was misleading at best.9
Bush’s campaign, indeed, sought to eliminate Democrats’ high ground on education, even co-opting the Progressive Policy Institute’s plan for performance-based education funding. And Bush, characterizing education policies that did not include accountability as racist, famously cited the “soft bigotry of low expectations” as the reason Latinos and African Americans scored disproportionately lower than whites on standardized tests. In a speech in New Hampshire in November 1999, for instance, the Texas governor linked entrepreneurialism and economic prosperity with public education for all:
I believe that the next century will be a time of incredible prosperity—if we can create an environment where entrepreneurs like you can dream and flourish. A prosperity sustained by low taxes, unleashed by lighter regulation, energized by new technologies, expanded by free trade. . . . But this hope, in the long-run, depends directly on the education of our children—on young men and women with the skills and character to succeed. So, for the past few months, I have focused on the problems and promise of our public schools. The diminished hopes of our current system are sad and serious—the soft bigotry of low expectations. Schools that do not teach and will not change must have some final point of accountability.10
Gore, who earned the endorsement of both the AFT and the NEA, called for even bigger spending increases (more likely since the federal budget deficit had been wiped out under Clinton) to augment teacher salaries and reduce class sizes. Gore also proposed states turn around schools with poor results by shutting them down and reconstituting them with new leadership or as charters (these would be key features of NCLB). Third-party candidates Pat Buchanan and Ralph Nader, in very different ways, attacked the growing neoliberal consensus in the United States. Nader went after the increase of corporate power while Buchanan continued to attack free trade and criticized Bush for proposing to expand the federal role in public education.11
In one of the closest elections in American history, Gore won the popular vote, but the Supreme Court awarded Bush the presidency.12 Bush immediately made good on his election promises, as his very first legislative proposal was for a bill called No Child Left Behind, which would be the latest reauthorization of ESEA. The proposal built on many of the shifts in federal education policy pushed by the DLC and enacted in the 1990s under Clinton.13 But Bush widened the net for reform, working with Democrats, most surprisingly the liberal Ted Kennedy, who had long opposed accountability measures.
The negotiations between Republicans and Democrats in Congress and the White House featured a rare level of comity in American politics during that era, since by that time most officeholders in both parties had accepted the notion that education was fundamental to economic opportunity. As Connecticut senator and education reformer Joe Lieberman pointed out, both Congress and the White House shared a “common purpose”: “We all want to deliver on the promise of equality and opportunity for every child. We all want to increase the supply of highly skilled workers, which we all know is critical to our future economic competitiveness and the long-term prosperity and security of this Nation.”14
The Business Roundtable and other corporate lobbying groups helped to firm up support for provisions such as mandatory testing some conservatives found unpalatable. The justification reflected two decades of corporate jeremiads about the necessity of building human capital. As Keith Bailey from the Business Coalition for Excellence in Education (BCEE) argued, “The reforms we are seeking cannot be delayed—the world is changing rapidly and we need to ensure that our educational systems can equip our children with the knowledge and skills to meet the challenges they will face.”15
After passing with huge margins in the House (384–45) and the Senate (91–8) in May and June, respectively, the bills went to conference to reconcile discrepancies in funding proposals (the Senate bill proposed almost twice as much money) and the final accountability measures. Negotiations were contentious, but September 11 helped motivate both sides toward agreement, and the final bill passed by overwhelming margins in both houses.16
Congressional debate highlighted the bipartisan nature of the education myth. As the bill came out of committee on May 1, Republican senator Bill Frist (Tennessee) argued, “Those basic skills that we know and that everyone—liberals, conservatives, Democrats, and Republicans—recognizes you have to be equipped with if you are going to live a fulfilling life are increasingly competitive, not just in local towns, communities, States, or regions in this Nation but across this great world in which we live, such as in mathematics.”17 Dale Kildee (D-MI) spoke about the House bill as it left the Committee on Education and the Workforce. “In a time when we are in an increasingly competitive world,” he argued, “we can no longer tolerate low-performing schools that place the education of our children at risk.”18 Charles Rangel (D-NY) also asserted the imperative for education reform. “We are going to have to invest in our young people to make certain that we can keep up with foreign technology,” he argued. “We hear pleas every day from the medical industry, from the State Department, how important it is for us to train people for these important jobs, and yet we find that if they are not ready to get a decent public school education, how in God’s name are they going to be ready for higher education and high tech?”19
But even those who critiqued the bill did not challenge the notion that education was essential for a chance at economic security. Kennedy, for example, who argued much more funding was needed for the policy to work, pointed out, “We know that as we move into a global society and economy, that only about 20 percent of the new entrants into the job market have the skills which 60 percent of them need at the present time.” The aging liberal highlighted just how much the party had been remade by the DLC: “We are lagging in education and in investing in people and training. The Republicans act as if [Bush’s proposed] tax cut is an economic program—it is not. . . . We have to invest in terms of the training, and we have to ask this Nation what its priorities are.”20
There were few truly dissident voices; Minnesota’s Democratic senator Paul Wellstone was one. Wellstone pointed to the problem of enhancing accountability without equalizing the conditions under which students learned. “I find it stunning so many Republican colleagues, much less Democratic colleagues, will vote for this,” he complained. The bill would mandate that every student be tested every year, but “at the same time, we are quite unwilling to pass a Federal mandate that there will be equality of opportunity for every child to have a good education and to do well and to succeed.” Wellstone, one of only five Democrats to vote against the final bill, came the closest in the Senate to pointing out how connecting education and economic opportunity would reinforce inequalities that already existed, arguing that any new mandates should come with full funding for Title I. As Wellstone summed it up, “The White House bill will test the poor against the rich and then announce that the poor are failing. Federally required tests without federal required equity amounts to clubbing these children over the head after systematically cheating them.”21
The No Child Left Behind Act, signed into law by Bush on January 8, 2002, exchanged increases in federal funding for deliverables from states. States had to adopt academic standards and agree to test all students in math and reading from grades three to eight and at least once in high school. Results had to be publicized and disaggregated by demographic groups and income level. The law also mandated every classroom have a “highly-qualified teacher” for every core subject. Finally, NCLB implemented “accountability” by progressively punishing schools that failed to meet goals: forcing them to pay for the transportation of students who exited their schools; providing students the right to use the school’s Title I funds to pay for private tutoring; and requiring schools that did not meet these criteria to hire a different staff or even to transform into a different structure, such as a charter school. (To facilitate the last part, the bill also provided $300 million for states to develop charters.)22
No Child Left Behind, however, did not come close to doing what it intended to do. Allowing states to develop their own tests motivated them to calibrate standards and to teach to the tests so they could be seen as making progress. Some principals and teachers, under tremendous pressure to show results, engaged in scandalous behavior such as doctoring students’ test answers. In other cases, NCLB metrics caused successful schools to be labeled “failing.” The charter industry, however, exploded in the 2000s: from 2001 to 2010, the number of charter schools in the United States more than doubled—from around 2,300 to 5,000.23
Opposition to NCLB grew quickly and dramatically. The NEA, for example, did not initially oppose NCLB. Just a year later, however, that changed, and the union argued the law needed serious revision since its “testing provisions, school labels and teacher standards [would] cause chaos.”24 NEA president Reg Weaver argued, rightly, that the law would “force many teachers to do nothing more than teach to the tests.”25 By 2005, the union challenged NCLB in court, arguing it was an “unfunded federal mandate.”26
At root, the very premise of the bill—that punishing schools for the scores of their students would improve those schools’ performance—was simply flawed, particularly when school districts did not have the ability to raise students out of poverty or alleviate the trauma of racism. As had been the case with virtually every federal education policy since 1965, NCLB ignored the broader economic structures that might lead a student to succeed or fail in school, as well as the relationship between where a student got an education and what jobs would actually be available to them.27 As political scientist Neil Kraus aptly sums it up, “No Child Left Behind, while ostensibly about extending opportunity to disadvantaged students, was yet another step in exclusively linking economic opportunity solely to the schools one attended.”28
The most lasting legacy of No Child Left Behind, however, was that it represented the mountaintop of the education myth. Bush’s State of the Union address in January 2002, for instance, made that clear. The first half of the speech, unsurprisingly, reviewed the US war in Afghanistan and legislative proposals to strengthen “homeland security.” But when the president discussed the domestic agenda, education and jobs were inextricably linked, and Bush showed that this common sense crossed the political aisle. After proposing extending unemployment benefits and tax credits for healthcare costs, Bush pointed out that “American workers want more than unemployment checks—they want a steady paycheck. When America works, America prospers, so my economic security plan can be summed up in one word: jobs. Good jobs begin with good schools, and here we’ve made a fine start. Republicans and Democrats worked together to achieve historic education reform so that no child is left behind.”29 Bush would go on to outline the necessity of energy policy, free trade, and tax reduction, but human capital had become the foundation of any economic opportunity. That notion, indeed, was so commonsensical that almost no one in the mainstream of either party could even imagine challenging it.
The Higher Education Myth
As the number of college graduates continued to grow after the 1970s, American policymakers also began to focus more on postsecondary education and its connection to human capital and economic opportunity. In 1975, almost 22 percent of Americans had completed at least a four-year college degree. After a spike to 24 percent in 1977, the number hovered around 22–23 percent over the course of the 1980s. By the end of the 1990s, however, the percentage of graduates had exploded, reaching 29.1 percent by 2000.30
By the 1990s, politicians argued that higher education would not only train new graduates for jobs but also facilitate innovation, both through research and development and by graduating students who would become innovators themselves. In Wisconsin, Governor Thompson, before joining Bush’s cabinet as secretary of health and human services in 2001, emerged as a staunch advocate of this approach. In July 1996, for example, Thompson created the Blue-Ribbon Commission on 21st Century Jobs, tasking the group with explaining “the role the University of Wisconsin System, the Wisconsin Technical College System and Wisconsin’s 21 private colleges and universities can play in job development, and examine methods of insuring that job growth is successful in all geographic areas of the state.”
In calling for the commission, the governor highlighted national themes we have seen connected to K-12 education. Thompson pointed out both that “technical progress and the imperatives of a global economy have mandated a change in the skill profile of the labor force” and that “Wisconsin must be aggressive and competitive in attracting and keeping high tech jobs.”31 The commission, chaired by the president of the UW System, Katharine Lyall, put together a report that argued there was in Wisconsin an “‘underdeveloped’ workforce yet to realize its potential” consisting of “chronically un- or under-employed workers who generally lack modern skills and work attitudes.” The commission implied these groups were concentrated in “inner-city and rural areas of Wisconsin.” And since “higher levels of employee proficiency are required as businesses fight to remain competitive in a global economy,” the commission concluded that “worker training and education must be at the heart of Wisconsin’s job strategy for the 21st Century.”
How would Wisconsin do that? Resurrecting the notion that the university should serve the broad needs of the state, the commission called for a “New Wisconsin Idea for the 21st Century” built on “innovative learning opportunities” for all Wisconsinites, wherever they lived. This education investment was to include “work-based learning” at universities and technical colleges and career planning in high school, in addition to “recogniz[ing] and support[ing] the state’s technical colleges and public and private colleges and universities as primary engines of high-quality economic growth.” The rising inequality in the state (what the report called the “Two Wisconsins”) and lack of jobs for those hit hard by capital flight would thus be met with a holistic plan to tailor the state’s education system, particularly in higher education, to the supposed needs of employers and to invest in technology by connecting the resources of the UW System to private industry.32 This was a much narrower vision of the Wisconsin Idea, one that did not seek to inform broader social democratic possibilities at all, but, instead, sought to ready workers for the supposed realities of a zero-sum global economy.33
According to the report, a purported “shortage of skilled labor” represented the biggest impediment to job growth in the state. But the data on which this argument rested was dubious. Indeed, the commission cited the Forward Wisconsin Survey, which examined why companies moved to Wisconsin from 1992 to 1995. The survey, however, found that 96 percent of companies were either “Very Satisfied” or “Satisfied” with their move. Though employers surveyed believed job training was important and cared deeply about ready access to labor (what employer doesn’t?), neither the FWS nor either of the other two surveys on which the commission based its work provided any actual evidence that employers were unable to fill skilled jobs. Had that been the case, wages would likely have been driven up dramatically. While the report pointed out that a few employers were offering higher wages, the state economic outlook tracked with what Reich had argued was happening nationally at that time: for most workers, wage increases lagged behind the rate of inflation. More likely, other policies in Wisconsin—like employer resistance to unions in new manufacturing or the fact that most job growth occurred in the nonunion service sector, which the report projected would grow to 75 percent of all new jobs by 2005—drove down wages.34
Further, while the Governor’s Commission acknowledged the lack of jobs in certain parts of the state—particularly in Milwaukee, which had lost thirty thousand manufacturing jobs since 1986—it blamed the culture and lack of education of those in “inner city areas” instead of capital flight. Indeed, the primary culprit was the “alarming numbers” of teenage pregnancies out of wedlock and high school dropout rates. There were few concrete recommendations for bringing jobs to the city that had been decimated by systematic disinvestment.35 Instead, the report assumed that, as if by magic, better education for Milwaukee residents would create good jobs.
Therefore, the commission sought more education as the solution, and the report’s recommendations highlighted just how much the social democratic vision had narrowed to little more than subsidizing the needs of employers in Wisconsin as the education myth choked off other possibilities. As the Blue-Ribbon Commission argued, expanding “opportunities at universities and technical colleges for work-based learning, [and] integrating classroom studies with productive work experience in fields related to the student’s career goals” would “provide employers with screened and semi-experienced employees” and improve worker retention. In turn, Wisconsin could become more competitive in the race “to attract and retain firms offering skilled employment.”36
Thompson, building from the commission’s findings, made investment in higher education a major part of his agenda in the 1990s, proudly touting the New Wisconsin Idea and especially the notion that investing in the UW System would yield new high-technology, professional jobs as the state moved beyond traditional manufacturing. Indeed, Governor Thompson did bring billions of dollars in new funding to the UW System, even though his vision represented a far cry from that of John Bascom and Robert La Follette.37
By the early 2000s, the importance of higher education—through training college graduates to gain professional skills, become innovators, or even join the creative class à la Richard Florida—was entrenched in the education myth. Even if bringing more Americans into higher education was the right approach to the growing class divide, however, broad, affordable access to colleges and universities would have been essential. And yet, in the 1990s and 2000s, access to universities instead became even more unequal. Economic recessions led states to cut funding for higher education, rationing access to the best colleges and universities in a zero-sum competition supposedly based on merit (very much unlike the 1940s and 1950s). Nationally, state appropriations for colleges and universities fell from about $8.30 per $1,000 of personal income in 1986 to $6.60 in 1998, and between 2001 and 2004 spending per student fell by 15 percent. Student tuition made up the difference, tripling from 1980 to 2010 at public colleges and universities. Federal policies like Pell grants, designed to democratize higher education, failed to keep up, and students relied on a greater percentage of loans to cover the shortfall.38 These policies, particularly in California, where the university system was forced to significantly weaken affirmative action in the 1990s, led to an even broader racial gap in the construction of the future creative class.39
Across the nation, Gary Becker’s more conservative version of financing human capital through student loans was gaining ground, even if these loans were not disbursed through strictly private credit markets and instead were guaranteed by the federal government.40 The result was a growing chasm between wealthy families’ ability to send their kids to college (and thus access to the professional class) and that of working families.41 And from the 1970s on, the gap between students who could access elite universities and those who could not widened considerably. As just one example of this trend, despite critiques of affirmative action programs that helped minorities access a rationed set of slots at elite institutions, the most lasting affirmative action program in American higher education was legacy admissions to elite universities, which reinforced economic and racial inequalities.42
Given the push for more higher education combined with growing inequality in access, it is unsurprising that a major Education Department commission in 2006 served as higher education’s version of A Nation at Risk, arguing both for the necessity of human capital and that the system should better condition its students for the global reality of the job market. The commission was convened by Education Secretary Margaret Spellings, an education reformer who had served as a senior adviser to Bush when he was governor of Texas. As a domestic policy adviser to the president, Spellings played a major role in negotiations between the administration and Congress over NCLB.43 After Roderick Paige’s departure as secretary in 2004, Bush appointed Spellings to fill the position.
The Spellings Commission worked to elevate a conservative version of the education myth. Chaired by private investor Charles Miller, the commission highlighted how much Bush’s Education Department saw the function of higher education as serving the direct employment needs of corporations. Of the other seventeen members, four were corporate leaders (from IBM, Microsoft, Kaplan, and Boeing) and another was an executive of the US Chamber of Commerce. Others included education reformers (the director of Education Trust, the chairperson of EduCap Inc., and former North Carolina governor James Hunt), university administrators, and two engineering professors. Arturo Madrid was the only faculty member from the humanities, and, of course, there were no workers. The report also thanked the Lumina Foundation and Microsoft for their assistance.44
Like ANAR, the report, entitled A Test of Leadership: Charting the Future of U.S. Higher Education, began with anxiety around the competitiveness of American education, only this time with regard to universities: “We may still have more than our share of the world’s best universities. But a lot of other countries have followed our lead, and they are now educating more of their citizens to more advanced levels than we are. Worse, they are passing us by at a time when education is more important to our collective prosperity than ever” (emphasis in original). For the Spellings Commission, it was obvious why this was a problem: “We have seen ample evidence that some form of postsecondary instruction is increasingly vital to an individual’s economic security.” The commission pointed to the lack of preparation in high school for postsecondary work, rising costs, a “confusing” financial aid system, and “unacceptable numbers of college graduates [who] enter the workforce without the skills employers say they need in an economy where, as the truism holds correctly, knowledge matters more than ever.” In particular, A Test of Leadership worried that “the consequences of these problems are most severe for students from low-income families and for racial and ethnic minorities.”45 Clearly the report helped to support the notion that, once again, the failure of working people, especially minorities, to accrue the right human capital was responsible for growing economic inequality.
The Spellings Commission did highlight some of the challenges many students faced in degree attainment. The report rightly pointed out that while more students were going to college, the percentage of students graduating was not increasing at the same rate. Further, a disproportionate number of these students were minorities and/or lacked economic resources. Still, though the report spoke to some of these inequalities, the recommendations were limited, pushing for better high school standards to “create a seamless pathway between high school and college”; for admitted students to “take [more] responsibility for academic success”; and for “cost-cutting and productivity improvements in U.S. postsecondary institutions” while supporting an overhaul of the byzantine financial aid system and a “significant increase in need-based financial aid.” The overarching argument assumed the US system of higher education should become more accountable, just as had been proposed with NCLB, in exchange for any additional funding.46
Though A Test of Leadership explained how crucial a college education was for economic opportunity, the report’s conservative perspective meant it simply accepted as given that state support would be limited. The Spellings Commission produced staggering numbers: between 1995 and 2005, average tuition and fees at public universities had risen 51 percent, including 30 percent at community colleges, as “state funding fell to its lowest level in over two decades.” A logical conclusion might have been to call on Congress to provide a massive infusion of funding to support state governments’ higher education budgets. The commission did not make this argument, however. “The bottom line,” concluded A Test of Leadership, “is that state funding for higher education will not grow enough to support enrollment demand without higher education addressing issues of efficiency, productivity, transparency, and accountability clearly and successfully. However, based on our commission’s review of the education needs of our nation, we encourage states to continue their historic and necessary commitment to the support of public higher education.”47 Accountability for higher education, just as was the case for K-12, was intended to show that responsibility for economic opportunity lay with the individual or the school, not with changing the overarching social structure.
By the end of the Bush administration, prominent scholars were pointing to the inequities that existed in American higher education and arguing that rectifying them would make the economy more equal. The economists Claudia Goldin and Lawrence Katz, for instance, in their widely cited book The Race between Education and Technology, showed the inverse correlation between state support for education and economic inequality over the course of the twentieth century, citing a “race” between the demands of employers to keep up with technological change and the American education system’s ability to improve educational attainment. Though some scholars have pointed out that this correlation did not necessarily equate to causation (indeed, it seems equally plausible that greater investment in education stems from widespread economic security), Goldin and Katz at the very least laid out the case for democratizing access to higher education, arguing that while more Americans were responding to the incentive of a higher wage premium by attending college, they were not completing degrees at correspondingly higher rates. The economists called for greater access to preschool for families with limited resources; reform of K-12 education so more kids would be ready for college (also recommended by the Spellings Commission); and finally, “financial aid sufficiently generous and transparent so that those who are college ready can complete a four-year college degree or gain marketable skills at a community college.”48 Still, just as had been the case with Spellings, the economists did not make the obvious leap: if access to higher education was so crucial to economic success, shouldn’t students receive that education as a right, as is the case with K-12 education?
Furthermore, like Florida and the Spellings Commission before them, Goldin and Katz never really addressed the question of what would happen to those American workers who did not go to college, even if all 61 percent of those who attended could graduate. Empirical evidence from the Bureau of Labor Statistics shows that the majority of American jobs have never required college degrees. In 2010, for example, almost 70 percent of jobs in the United States still required only a high school degree or less education. Expanding access to education, then, would seem only to flood the market with more creative-class workers who would be overqualified and underemployed.49
But even if we assume that, magically, more educated workers would somehow facilitate such broad prosperity that there would suddenly be enough professional jobs for all college graduates, what would their jobs be like? Florida had fantasized that the creative class had begun to “control the means of production . . . because it is inside their heads; they are the means of production.”50 This may have been true for some workers, but the reality for most was that their employers—from rich tech companies like Apple, Microsoft, or Facebook that hired elite graduates all the way down to the regional insurance company that hired graduates from public comprehensive universities—continued to extract profits from the labor of these workers, and they had their own motivations to drive down labor costs.
Indeed, beginning in the 1990s, workers in many sectors of the knowledge economy found their relative economic position declining as the wealth of the richest Americans exploded. By that time, many employers had begun using business principles such as “knowledge management,” which sought to maximize returns to shareholders by determining which knowledge work created the most value and reducing the costs of lower-value work as much as possible. In some sectors, employers sought cheaper labor by outsourcing or automating many of the lower-end creative-class jobs and by importing the work of cheaper labor elsewhere.51 The tech industry, in particular, sought to lower costs by pushing Congress to liberalize immigration of workers with training in STEM fields.52 For those who kept their jobs in the white-collar sector (outside of the work that was high-value “proprietary labor”), the strategy was more surveillance and discipline, and it was no coincidence that this management strategy was similar to the accountability measures put into place for teachers and schools under NCLB and, later, Obama’s “Race to the Top.” In sum, by the end of the 1990s, the incomes of most college graduates, while still carrying a premium over those of most blue-collar workers, were starting to stagnate, and many white-collar workers were losing the very discretion that would have made their work “creative.”
Even the higher education labor force—a group of workers at the upper echelons of education attainment in the United States and paragons of what Florida called the creative class—was in the midst of a long trend of growing insecurity and deprofessionalization. From 1970 to 2019, as the number of college students in the nation more than doubled, so did the number of college and university instructors. But much of this latter increase came from part-time, nontenure-track faculty, which increased by a factor of seven during the same period, from 105,000 to 755,000. The period in which the creative class supposedly began to dominate the American ethos, from the late 1990s to the precipice of the Obama administration, was characterized by a dramatic shift in this direction: the percentage of full-time tenure-track faculty declined from 33.1 percent of the total higher education teaching force (about one-third) in 1997 to 27.3 percent (just over one-fourth) in 2007, while part-time faculty increased from 34.1 percent to 36.9 percent. Tenure-track status brought higher pay, better benefits, and enhanced job security; just as important, not being eligible for tenure deprived these workers of the kind of academic freedom and professional discretion that supposedly marked what Drucker had called “knowledge work” or what Florida called creativity.53
Even if Goldin and Katz, then, had been right about the importance of more education, their argument neglected to chart the political decisions needed to re-empower the growing knowledge core of the professional class, not to mention the noncollege-educated workers left behind entirely. Nevertheless, the college premium, coupled with politicians from both parties who turned their backs on the livelihoods of those without college degrees, still left going to college as the commonsense solution to economic inequality.54
Hope and Change
On February 13, 2008, Illinois senator and presidential candidate Barack Obama spoke in the southern Wisconsin city of Janesville. By that point, the economy was already feeling the impact of a wave of housing foreclosures, and the jobs of the UAW workers at the auto plant there were under threat (major cuts would come later that year, and by 2010, the plant would be completely closed). A candidate for the Democratic nomination, Obama characterized himself as a reformer who would take on the inequality that had grown under twenty years of the DLC at the helm of the nation’s oldest political party.
In Janesville, Obama spoke to the promise of secure, blue-collar jobs that many Democrats no longer believed were worth fighting for. “It was nearly a century ago that the first tractor rolled off the assembly line at this plant,” he began. “The achievement . . . led to a shared prosperity enjoyed by all of Janesville. Homes and businesses began to sprout along Milwaukee and Main Streets. Jobs were plentiful, with wages that could raise a family and benefits you could count on.” Obama contrasted himself with his eventual predecessor George Bush and his two opponents that year, Democratic senator Hillary Clinton and Republican senator John McCain, both of whom he implicitly argued had abetted the recession and the housing crisis in a “failure of leadership and imagination in Washington—the culmination of decades of decisions that were made or put off without regard to the realities of a global economy and the growing inequality it’s produced.”55
In a sharp rebuke to the Clintons, Obama argued this Washington was one “where decades of trade deals like NAFTA and China have been signed with plenty of protections for corporations and their profits, but none for our environment or our workers who’ve seen factories shut their doors and millions of jobs disappear; workers whose right to organize and unionize has been under assault for the last eight years.” The senator from Illinois knew Americans wanted change: “I realize that politicians come before you every election saying that they’ll change all this. They lay out big plans and hold events with workers just like this one, because it’s easy to make promises in the heat of the campaign. But how many times have you been disappointed when everyone goes back to Washington and nothing changes?” Obama, however, would be different: “When I talk about real change that will make a real difference in the lives of working families . . . it’s not just the poll-tested rhetoric of a political campaign. It’s the cause of my life. And you can be sure that it will be the cause of my presidency from the very first day I take office.”56
Though Obama could not extricate himself from the notion that we must “train our workforce for a knowledge economy” by offering up tax credits to college-goers in exchange for community service (an idea the DLC had been pushing since the 1990s) and simplifying the financial aid process (as the Spellings Commission had called for), his vision for those without college degrees went much further than Democrats had in decades. He pledged to raise the minimum wage substantially, create millions of new green jobs, and provide universal healthcare. In fact, he even criticized Clinton’s plan for “requir[ing] the government to force you to buy health insurance.” As important, he went after Clinton on trade, attacking her for “go[ing] around talking about how great [NAFTA] was and how many benefits it would bring” and pledging he would “not sign another trade agreement unless it has protections for our environment and protections for American workers.” Here Obama spoke to a major political pressure point in Wisconsin, as empirical studies showed that by 2010, NAFTA had cost about 700,000 American manufacturing jobs, and China’s admittance to the World Trade Organization in 2001, a process of trade normalization begun under Bill Clinton, had cost another two to three million jobs.57 Though Obama did not talk specifically about labor rights in this speech, elsewhere he pledged to join picket lines and to make labor reform a priority when he took office.
His promises would end up mostly in the same dustbin as those of the politicians he caricatured in Janesville, however. Obama won a massive victory over McCain as the economy melted down, retaking states like Ohio that had begun to tilt Republican, and Democrats in Congress won a filibuster-proof majority.58 But the major social democratic reform Obama promised was not forthcoming. For starters, Obama’s administration immediately accommodated professionals, particularly in the banking industry. As Thomas Frank has argued, Obama’s political instincts were deeply rooted in the notion that America was a meritocracy in which the talented would rise to the top, particularly since his own biography was marked by a rise from obscurity to fame through an Ivy League education. Thus, his administration was run exclusively by elites who had graduated from prestigious universities, especially Wall Street insiders like Timothy Geithner (secretary of the treasury) and neoliberal economist Larry Summers (director of the National Economic Council).59
Obama’s legislative agenda in 2009 rightly prioritized preventing greater economic harm and putting demand back into the economy, as he worked with Congress to quickly pass the $800 billion American Recovery and Reinvestment Act (ARRA). Still, given the scale of the crisis, the stimulus package was likely not large enough, as unemployment continued to dog the American economy for years; nor did it prioritize direct job creation, instead spending billions on tax cuts to placate Republicans.60 Even so, pumping more money into state and local governments did stanch public-sector job losses, particularly for teachers, as about $95 billion of ARRA was used for this purpose.
Obama and his administration’s belief in the magic of the education myth, however, limited their commitment to helping those outside the reach of the good-paying jobs of the professional class.61 After passing ARRA, Obama focused on healthcare, and while the Affordable Care Act (ACA) in 2010 brought some real change—eliminating discrimination based on preexisting conditions, for instance—the law did exactly what Obama had criticized Clinton’s proposal for doing: it forced people to purchase insurance in private markets instead of offering a single-payer system or even a “public option” that would have made insurers compete with the federal government. The ACA never actually covered all Americans, and for those it did, it wasn’t always “affordable,” either. Obama also put very little effort into badly needed labor law reform, following the legacy of both Carter and Clinton in this regard.62 Given the work of the AFL-CIO and its member unions to turn out the vote for Obama in 2008, this was a major disappointment for unionized workers.63
Whatever moderate advances in social democracy Obama delivered elsewhere, however, in terms of public education his belief in the magic of human capital continued to cement the nation’s commitment to market-based education reform. Obama’s community organizing work, much as Lyndon Johnson’s teaching experience as a young man in Texas had for Johnson, seems to have instilled in him a commitment to helping kids from limited means—disproportionately African American and Latino—access better education.64
The intentions of the president’s education agenda became immediately apparent when he appointed Arne Duncan instead of Linda Darling-Hammond as secretary of education in 2009. Darling-Hammond, a Stanford professor who many assumed had the inside track on the position as Obama’s chief education policy adviser during the campaign, had criticized NCLB for its insufficient funding and narrowing of the curriculum, and she had pointed to the broader inequalities in American education that needed to be rectified instead of blaming teachers and schools.65 Obama selected Duncan, however, who had been the CEO of Chicago schools from 2001 to 2009. There, Duncan had fought unions for greater flexibility, pushed for more authority for principals, and dramatically expanded the number of charter schools while closing neighborhood public schools. In the words of one scholar, Chicago represented “Arne Duncan’s prototype on his national road show to promote school closings and education markets after he was appointed U.S. Secretary of Education in 2008.”66
Indeed, in Chicago, where hundreds of thousands of blue-collar manufacturing jobs had been lost (including in the steel industry on the south side, where Obama had begun his political career as a community organizer), by the 1990s and 2000s, education reformers like Duncan sought to remake the schools using market fundamentalist ideology. The underlying logic was that students needed schools that could properly equip them with the right human capital to seek economic opportunities in the neoliberal city.67 By that time, Duncan was hardly alone, as the hedge fund–backed Democrats for Education Reform (DFER) had organized a core of procharter, human capital–focused officeholders like Newark mayor Cory Booker.68 Booker, for instance, with $100 million in funding from tech billionaire Mark Zuckerberg, would soon hand over Newark’s schools to corporate consultants in one of the greatest boondoggles in American history.69
With Duncan in the fold, the Obama administration got right to work reforming schools. Obama and Duncan used $4.35 billion earmarked from ARRA as leverage to force states to change their education policies. Entitled Race to the Top (RTT), the administration’s plan provided funding to a small group of states that won competitive grants in a contest to create more testing, to use that testing to evaluate teachers’ supposed added value, and to create more charter schools. While only ten states initially won grants, twenty-three states changed their laws in order to make their applications more competitive, including thirteen that raised caps on charters and eleven that tied teacher evaluations to test scores.70
Obama’s first two years in office, indeed, evidenced a deep belief in the magic of human capital. In his 2010 State of the Union address, for example, the president assessed his administration’s first year of efforts to alleviate the economic crisis, pointing to the successful bank bailout, expanded unemployment benefits, and the tax cuts and jobs saved by ARRA. He called on Congress to fund community banks so they could lend money to small businesses, and he also asked for an additional infrastructure bill to create jobs. But the rest of the speech decidedly promised no guarantees. Instead, Obama argued that “the only way to move to full employment is to lay a new foundation for long-term economic growth, and finally address the problems that America’s families have confronted for years.”
And what were those problems? First and foremost, in the international competition for good jobs, Obama argued that to compete with China, India, and Germany, the United States needed to put “more emphasis on math and science.” He concluded by doubling down on the DLC-derived argument about free trade and human capital. Indeed, in a mere two paragraphs, Obama brought together Democratic assumptions—from the Johnson era to the Clinton era—about the connection between reducing poverty and enhancing education, accountability-based school reform, and the creative-class notion that only college degrees could guarantee a decent job:
Instead of funding the status quo, we only invest in reform—reform that raises student achievement; inspires students to excel in math and science; and turns around failing schools that steal the future of too many young Americans, from rural communities to the inner city. In the 21st century, the best anti-poverty program around is a world-class education. . . .
When we renew the Elementary and Secondary Education Act, we will work with Congress to expand these reforms to all 50 states. Still, in this economy, a high school diploma no longer guarantees a good job. That’s why I urge the Senate to follow the House and pass a bill that will revitalize our community colleges, which are a career pathway to the children of so many working families.71
Storm on the Horizon
Obama’s presidency, which largely failed to expand social democracy and instead represented yet another Democratic administration that advanced the education myth, would bring a backlash. That backlash was multifaceted, but in response, Republican politicians tapped some of the same forces mobilized by Nixon, Reagan, Perot, and Buchanan, exploiting a growing populist resentment on the right, particularly against the supposed meritocracy from which those outside the professional class were increasingly excluded. But the revolt occurred on the left as well, as disappointment with the high promises set by Obama also cracked open the assumptions at the center of the Democratic Party.