
This is part six of my annual review of the year in ed-tech.

One of education technology’s greatest luminaries passed away this year. Seymour Papert died at his home in Blue Hill, Maine in August (yes, that’s the site of the Blue Hill Fair, where Charlotte the spider saved Wilbur the pig). Seymour was 88; or, as he was born on February 29 – a Leap Year baby – he was just 22. Either way, he’s gone too soon; we’ve lost too much, too many in 2016.

Seymour, as my friend Gary Stager has described him, was the “inventor of everything (good) in education.” He developed “constructionism,” a theory of learning based on Jean Piaget’s “constructivism”; he co-invented the programming language LOGO; he was the inspiration for Lego’s Mindstorms; he co-founded the MIT Media Lab; he’s been called the father of the “maker” movement; he authored two books that everyone in education should read: Mindstorms and The Children’s Machine (he authored more than that, but you really must read these books, particularly if you work in ed-tech); and he was a mentor and friend to many.


There were many remembrances penned for Seymour this year. I’d hardly know where to begin in writing one, but I want to open this particular article – one that focuses, in part, on the whole “everyone should learn to code” craze – by recognizing his great contribution to educational computing as well as his loss. It’s all of our loss, really, as too many in education technology happily reduce the potential of computer programming as an epistemological endeavor to a market for new products.

I worry that we’ll see a lot of fights over Seymour’s legacy in the coming years – attempts to claim affinity and accordance with his work where there really is none. (Never forget: Bill Gates once called constructionism “bullshit.”)

I first read an article by Seymour Papert in a Women’s Studies class in the mid-1990s – “Epistemological Pluralism,” which he co-wrote with Sherry Turkle. I’m pleased to say that my first introduction to Papert’s work was through “feminist epistemology” and not through “tech” or “ed-tech.” But of course, that’s not quite true: my first introduction to Papert’s work was actually a decade earlier, when I sat at an Apple II and taught a Turtle how to move about the screen. My work as a thinker and writer about tech and ed-tech – and one explicitly committed to social justice – has been profoundly affected by Seymour. I am so grateful for his work, grateful for the little time I got to spend with him.

I am committed to fighting for a world in which technologies – educational and otherwise – are not built for control and compliance. I do so in Seymour’s memory. I do so because thinking and building a more just and equitable future demands it.

“You can’t think about thinking without thinking about thinking about something” – Seymour Papert

Seymour’s arguments for why children should use powerful “thinking machines” to develop their own powerful thinking in turn stand in stark contrast to many of the arguments (and the accompanying commercial products) for computer science education – those that Globaloria’s Idit Harel has described as “pop computing.”

Note the significant difference in language in this headline from The Verge, for example – “Harvard’s Root robot teaches kids how to code” – and the way in which Seymour would describe the LOGO Turtle – that students would use programming to teach the robot.
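That inversion is easy to see in code. Here’s a minimal sketch using Python’s built-in turtle module – itself a descendant of LOGO’s Turtle – standing in for LOGO; the procedure is my own illustration, not Papert’s, but the idea is his: the child teaches the machine what a square is by defining it.

    import turtle

    # The child "teaches" the turtle a new word - square - by defining
    # a procedure: four equal sides, four right-angle turns.
    def square(t, side):
        for _ in range(4):
            t.forward(side)
            t.right(90)

    t = turtle.Turtle()
    square(t, 100)  # the turtle now "knows" square; the child supplied the idea
    turtle.done()

In LOGO proper this would be TO SQUARE … END, but the pedagogy is the same: the program is the student’s own theory of the square, made executable.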

Perhaps, as developmental psychologist Edith Ackermann argued this year, the “maker” movement will serve as “Papert’s Perestroika,” thawing the school system’s reluctance to embrace constructionist practices. Perhaps.

Or perhaps, as Seymour himself once described the school system’s reaction to LOGO,

Thus, little by little the subversive features of the computer were eroded away: Instead of cutting across and so challenging the very idea of subject boundaries, the computer now defined a new subject; instead of changing the emphasis from impersonal curriculum to excited live exploration by students, the computer was now used to reinforce School’s ways. What had started as a subversive instrument of change was neutralized by the system and converted into an instrument of consolidation.

One might recognize that very trajectory in Minecraft, once heralded as a sandbox for playful, open-ended exploration and programming. Now, Minecraft, as educational games scholar Dean Groom pointed out this year, is often used to “advance unproven and sensationalist discourses about imagination and creativity – with no regard for the value of games as a broader phenomenon.”

Microsoft acquired Minecraft in 2014 – which probably speaks volumes right there about its progressive potential – and in January of this year, Microsoft bought MinecraftEdu, a version of the game developed by TeacherGaming and aimed at classroom use. In turn, Microsoft released its own educational version of the popular block-building game this fall.

“Microsoft Weaponizes Minecraft in the War Over Classrooms” was how the technology publication Backchannel described the move – that is to say, this isn’t a signal of “Papert’s Perestroika” as much as it is a push by Microsoft for market- and mindshare. The educational version requires an Office 365 account, for starters, making Minecraft part of vendor lock-in much more than any imaginative openness or pedagogical transformation.

Funding data about “learn to code” startups can be found at funding.hackeducation.com.

Computer Science For All


Although there is some lip service paid to learning computer programming in order to deepen students’ thinking and expand their creativity, much of the conversation about computer science is framed in terms of developing students who are “job ready” – the rationale for teaching computer science President Obama gave in his final State of the Union address in January.

Obama proposed a $4 billion “Computer Science for All” initiative this year, invoking the “skills gap” in order to justify the funding.

More schools are teaching computer science – although many still see an “Hour of Code” as a sufficient commitment to that end. Some cities and states are considering making computer science compulsory – in Chicago, Virginia, and Queensland, for example. For their part, Florida legislators debated in February whether or not computer science classes might fulfill a foreign language requirement – a terribly wrong-headed idea.

Clearly, we don’t all agree on what “computer science” entails or what a class in it should require. And what does that oft-repeated phrase “learn to code” even mean?

To that end, the Computer Science Teachers Association proposed CS standards for K–12 this year – curricular guidelines written by a “coalition of industry and education organizations,” as Edsurge, always happy to plug those “public-private partnerships,” put it: “Code.org, Cyber Innovation Center, National Math and Science Initiative, the Association for Computing Machinery and Computer Science Teachers Association. The work is also supported by companies including Apple, Google and Expedia, as well as education organizations including the CollegeBoard, Teach For America and STEMx.”

Industry, not the child’s imagination as Seymour would have it, largely dictates the shape and direction of the CS trend.

With or without these CS standards, questions remain about “who’s going to teach America’s kids to code,” as TakePart’s Liz Dwyer wondered in April. Another key question, this one from Kathi Fisler, Shriram Krishnamurthi, and Emmanuel Schanzer: “exactly who all” is Computer Science for All? I’ll turn to this issue of equity – what are the demographics of the computer science industry, who benefits from how we frame the “skills gap” – in the next section in this article. After all, we know from the history of “career and technical education” that these “opportunities” tend to be limited to certain types of students and carry with them a legacy of discriminatory tracking practices.

Nevertheless, computer science education, along with STEM (science, technology, engineering, and math), continued to be promoted this year, often framed as something students should pursue instead of “liberal arts.” This argument is, of course, ridiculously daft, as “the liberal arts” have historically included math and science.

“Hardly Anyone Wants to Take a Liberal Arts MOOC,” Edsurge informed its readers in February. Only “1.86 unique users have enrolled 4.1 million times in edX’s liberal arts course.” So hardly anyone at all, I guess. Just one person enrolling over and over and over and over again. (Maybe the word “million” was left out. Who knows. “Hardly anyone,” either way.)

There seems to be a real distaste for the “liberal arts” among many in Silicon Valley – funny, since that’s what many tech execs studied in college. Several of them now prominently advocate computer science as utterly necessary while dismissing subjects like ethics or aesthetics or history as a waste of time, both intellectually and professionally.

Despite all the investment and marketing surrounding CS and STEM education, the results – enrollments and graduation rates – have been mixed at best. The demand for CS classes is up, for example, at some universities, and it’s down significantly at others. The perceived importance of CS, however, has led many universities to partner with coding bootcamps and to continue to outsource technical competencies to (largely) for-profit companies. Prevailing narratives about the importance of computer science and STEM education have also prompted some politicians to call for cutbacks in funding for liberal arts education.

Despite the drumbeat that their focus should be “coding,” students do continue to sign up for courses in the humanities – which, again, are different from the “liberal arts” – even in the face of predictions about “troubled job markets” for academics in the field and the long-running “myth of the English major barista.” Yes, the number of bachelor’s degrees granted in the humanities has fallen, but that’s partly due to STEM majors becoming more appealing to (although not necessarily more welcoming of) women.

“Everyone should learn to code.” Once upon a time, the chant was “everyone should go to law school.” That hasn’t really worked out that well, nor has the mantra “everyone learn to buy and sell real estate” – although that one was a hell of a powerful lure for Trump University, wasn’t it.

And then there’s the advice from Pearson. “Which major is the best?” the education giant asked in June. “Try marketing,” it suggested. Someone’s got to write all these education technology press releases, after all.

The “Skills Gap”


The “skills gap” is repeatedly invoked as a rallying cry for more computer science education. But much like the phrase “learn to code,” the meaning of “skills gap” is unclear.

As education reporter Mikhail Zinshteyn wrote in The Atlantic earlier this year,

a loud chorus of researchers in education and labor markets question the notion that workers are unqualified for the growing sophistication of tech jobs. For several years some academics have pushed back against concern the U.S. labor market has a dearth of employees for Science, Technology, Engineering, and Math (STEM), citing data that shows positions in those fields aren’t experiencing spikes in wages – something economists say would need to happen in a labor shortage because it shows employers are willing to pay more to attract the talent they need.


Michael Teitelbaum, a scholar on the history of STEM, said in 2014 to an audience of education reporters that the post-war U.S. period is dotted with “repeated cycles of alarm, boom, and bust.” He went on to say that “many of the people who were attracted in during the boom phase into majors and graduate degrees in these fields … end up graduating and finding there’s no attractive career path.”


Ron Hira, a scholar at Howard University who studies labor and technology, has been one of the most vocal skeptics about shortages in the information technology fields. He argues that employers want to saturate the labor market with foreign employees – who are here on work visas and typically earn less than their American counterparts – with the goal of driving down wages in the IT sector. Others have argued the recent rash of layoffs at tech companies belies concerns there’s a worker shortage in that sector.

If there is a gap, it’s perhaps the one that persists in STEM education and in the technology industry more broadly – that is, these remain overwhelmingly white and overwhelmingly male. Women and people of color are underrepresented in CS classes, STEM degree programs, and technical and scientific careers.

A couple of notable exceptions: Harvey Mudd College in California, where 55% of its CS grads are women. (The rate nationally: 16%.) As Quartz reported in August, Harvey Mudd “has done it by removing obstacles that have typically barred women – including at the faculty level. The school emphasizes teaching over research, hiring and rewarding professors on the basis of their classroom performance, says Maria Klawe, Harvey Mudd’s president since 2006. And it places women in leadership positions throughout the school. Next year, six of the school’s seven department chairs, and 38% of its professors school-wide, will be women.” And HBCUs “are more than pulling their weight in preparing African-American students for STEM careers,” The Chronicle of Higher Education argued in February in response to a study by Georgetown University’s Center on Education and the Workforce that found that Black students are “underrepresented in the college majors that tend to lead to higher-income occupations and overrepresented in majors that tend to lead to lower salaries.”

For its part, the technology industry has been under fire for several years now for its woeful lack of diversity – a problem that, despite these companies’ highly publicized diversity initiatives, seems to have left their employee demographics largely unchanged. Or, in some cases, even less diverse.

It’s commonplace to hear Silicon Valley executives blame the problem on “the pipeline,” but it’s certainly more complicated than that. It’s a problem of pay and promotion and culture. A study released in January found, for example, that “sexism is rampant in the tech industry, with almost two-thirds of women reporting sexual harassment and nearly 90 percent reporting demeaning comments from male colleagues.”

This problem extends to startups and investors too: “In the Bay Area, 16 companies that received A funding in 2015 were led by a female CEO, or 8% of the total. This represents a 30% year-over-year decrease in the number of female-led companies that raised an A in the Bay Area compared with 2014,” as Female Founders Fund’s Claire Burke wrote in January. And it extends to education and education technology as well.

In light of all this, it’s fair to ask, as Melinda D. Anderson did in The Atlantic in February, “Will the Push for Coding Lead to ‘Technical Ghettos’?” What can we learn from the history of discriminatory practices in vocational education at the K–12 level, in for-profit higher education, in recruitment, in hiring, in pay, and in retention? How can we address a “skills gap” as an equity issue, not simply as a “skills” issue? (And can we even do so without some understanding of the humanities, not just of “coding”?)

(In fairness: education also suffers from a lack of diversity at both the K–12 and college level. The Department of Education pointed out in a report on racial diversity (or the lack thereof) in the teaching profession that recruiting and retaining teachers of color is crucial because these educators are “more likely to (1) have higher expectations of students of color (as measured by higher numbers of referrals to gifted programs); (2) confront issues of racism; (3) serve as advocates and cultural brokers; and (4) develop more trusting relationships with students, particularly those with whom they share a cultural background.”)

The New Economy and the History of the Future of Work


Donald Trump did not campaign on a platform that “everyone should learn to code.” (Hillary Clinton did.) But among Trump’s many campaign promises: bringing back jobs in manufacturing and mining. It was an obvious appeal to nostalgia for a prosperous post-war America and the comforts it provided – real or imagined – the middle class, to a time when factory jobs paid well, when a college degree wasn’t necessary in order to earn a good salary.

There have been a lot of opinion pieces written since the election about whether or not Trump’s support reflected voters’ economic anxieties and about whether or not economic recovery has been experienced by all Americans.

According to the latest jobs report, unemployment in the US is at 4.6%, the lowest since 2007. But that figure obscures the fact that some 95 million workers are no longer counted as part of the labor force. There are several reasons why, but as former Secretary of Labor Robert Reich observed, “The American economy isn’t providing nearly as many good jobs as are needed.” “Trump’s neo industrial policies won’t create these jobs,” he added.

Nor, I’d argue, will a fixation on simply “learning to code.”

There have long been efforts to “retrain” displaced workers – with varying success. (Although job training is something that companies have been unwilling to pay for – or at least that they pay for quite unevenly.) So, will “learning to code” in particular provide much needed job opportunity and job security?

Probably not. For as The Wall Street Journal cautioned in October, “America’s Dazzling Tech Boom Has a Downside: Not Enough Jobs.”

The technology revolution has delivered Google searches, Facebook friends, iPhone apps, Twitter rants and shopping for almost anything on Amazon, all in the past decade and a half.


What it hasn’t delivered are many jobs. Google’s Alphabet Inc. and Facebook Inc. had at the end of last year a total of 74,505 employees, about one-third fewer than Microsoft Corp. even though their combined stock-market value is twice as big. Photo-sharing service Instagram had 13 employees when it was acquired for $1 billion by Facebook in 2012.


Hiring in the computer and chip sectors dove after companies shifted hardware production outside the U.S., and the newest tech giants needed relatively few workers. The number of technology startups fizzled. Growth in productivity and wages slowed, and income inequality rose as machines replaced routine, low- and middle-income, human-powered work.

There are several employment trends here that might suggest that simply “learning to code” is an insufficient response to today’s economic realities: the technology industry’s own practices of outsourcing, offshoring, subcontracting, anti-unionism, use of volunteer labor, and automation, for starters.

Machines are coming for our jobs, we’ve been told. (Indeed, many economists agree that it’s automation and not the target of Trump’s ire – trade – that has prompted the elimination of US manufacturing jobs in recent years.) The threat of impending automation, so the story goes, now extends beyond the factory floor. “Nearly half of young people fear jobs will be automated in 10 years,” The Guardian reported in January.

Perhaps the more immediate concern regarding what’s coming for our jobs should not be robots, but rather an ideology and accompanying economy that praise “precarity.” Silicon Valley has actively pushed the notion of a “gig economy” – exemplified by companies like Airbnb and Uber – where, instead of full-time employment, workers must piece together various freelance “gigs.” Reflecting, perhaps, its Silicon Valley roots, the gig economy has been found to be rife with gender and racial bias and to benefit those with the greatest social and financial capital. (The Awl summarizes a Pew Research study, noting that “The Sharing Economy Is Only For People Who Can Afford To Not Share.”) As Data & Society wrote in November,

Pay attention to online gig work because it is dramatically reshaping our society. Labor economists Lawrence Katz and Al Krueger estimate that conventional temp and alternative contract-driven work rose from 10 to 16%, accounting for all net employment growth in the US economy in the past decade. Assuming Pew’s trends continue at the current rate, by the year 2027, nearly 1 in 3 American adults will transition to online platforms to support themselves with on-demand gig work. This is only bad news if we do nothing to change the outdated laws and structures in place to support working people. Ignoring corporate and consumer dependency on an on-demand gig workforce is not a sustainable strategy.

Education, for what it’s worth, is one of the top-five industries demanding freelance laborers.

How are workers supposed to respond to the new economy? “Adapt, or Else” – that’s what AT&T told its employees. Stay in “perpetual beta,” as workplace analyst Harold Jarche puts it. Be a “lifelong learner.” Train and retrain.

And I’d add too: be quite suspicious of Silicon Valley’s support for a universal basic income. After all, Silicon Valley ideology rests upon neoliberalism and libertarianism; it furthers the dismantling of public institutions, moving responsibility and risk onto the individual.

All this – a gig economy, an emphasis on “learning to code,” a demand for job training – will be a boon for education technology. (Or that’s what entrepreneurs and investors hope, at least.)

Education Technology and the Business of Job Training


As I argued in the previous article in this series, it is impossible to separate for-profit higher education from the narrative that insists on a need for specialized technical and business training – a narrative that has, for over a century now, contended that traditional colleges and universities fail to provide this sort of education. “Universities can’t solve our skills gap problem,” as Degreed’s Jonathan Munk argued in a TechCrunch op-ed in May, “because they caused it.”

Education technology companies like Degreed have identified a problem that they claim they alone can fix. “Are Bootcamps the Answer to the ‘Skills Standoff?’” asked a venture capitalist in an op-ed published by Pearson. “Could Computer Coding Academies Ease the Student Loan Crisis?” asked the stock market news site The Street.

Coding bootcamps are just one manifestation of a resurgent for-profit higher education industry. These unaccredited schools argue they’re best positioned to “close the skills gap.” MOOC startups like Udacity and Coursera have also rebranded to target this particular post-secondary technical training market. Both Udacity and Coursera now work closely with corporations, providing curriculum designed by companies and industries, aimed at professional development for employees and potential employees – and freelance workers, of course.

No surprise, some of these bootcamps also replicate the discriminatory employment practices of Silicon Valley. 42, for example, is a new teacher-less bootcamp that only admits students under the age of 30.

These bootcamps echo the narrative that Silicon Valley pushes about “the new economy.” Udacity, for example, which has touted itself in the past as “Uber for education,” offers piecework in lieu of full-time employment for its graduates. And yet, this freelance work “counts” towards Udacity’s money-back guarantee for job placement. As The New York Times describes it,

The program, called Blitz, provides what is essentially a brief contract assignment, much like an internship. Employers tell Udacity the skills they need, and Udacity suggests a single candidate or a few. For the contract assignment, which usually lasts about three months, Udacity takes a fee worth 10 to 20 percent of the worker’s salary. If the person is then hired, Udacity does not collect any other fees, such as a finder’s fee.

Edsurge, always there to hype Silicon Valley’s “disruptive innovations,” suggests that there’s much that universities can learn from the Udacity offering – ignoring, of course, the pervasive issues of racial, gender, and economic bias in the “gig economy.”

“Who’s Playing Matchmaker Between Students and Employers?” Edsurge asked this summer. Why, startups of course. Job placement services are poised to be a growth area for startups, particularly as counseling services get slashed at public schools and as workers are perpetually in search of a new “gig.”

Funding data about job placement startups and corporate training startups can be found at funding.hackeducation.com.

But all these startups are going to have to battle another technology giant in the hunt for job-hunters: Microsoft. It announced in June that it would make its largest acquisition ever, buying LinkedIn for $26.2 billion. LinkedIn, of course, had acquired online job training company Lynda.com last year for $1.5 billion.

That acquisition prompted Mindwires Consulting’s Michael Feldstein to call LinkedIn “the most interesting – and possibly the most consequential – company in ed tech,” in no small part because of its massive dataset about people’s professional and educational histories and the skills they attach to their profiles.

LinkedIn/Microsoft launched several education-related services this year that signal its ongoing interest in education and professional development – “Learning Paths” in March and “LinkedIn Learning” in September. The price tag for this particular, productized version of “lifelong learning” is $29.99 per month. Thirty bucks will get you recommendations for courses and connections that other professionals have followed in order to achieve their position.

Writing this fall, Feldstein seemed quite a bit more skeptical of LinkedIn’s radical ed-tech potential:

LinkedIn Learning has the look of an effort to be another Netflix of education. If we can just create consumable content chunks and then apply data science to deliver the right chunk at the right time to the right person, the thinking goes, then we’ll achieve nirvana.


We’ve seen this before in the MOOC world. We also see it in many instances of adaptive-learning software. That software, which attempts to respond to the needs of individual students, often winds up turning the platform into yet another recommendation engine for content and playlists. LinkedIn Learning, despite the best of intentions, has fallen prey to the Netflix concept without questioning the underlying assumptions – without first trying to understand how learning occurs and how best to support it.

Feldstein’s work aside, little of the analysis about LinkedIn’s ed-tech interests seems to take into account its position now as a Microsoft subsidiary. (We come full circle here to how I started this article: the subversive potential of Minecraft deadened by the hype and by its acquisition.) But let’s note what Randall Stross wrote in an op-ed in The New York Times: “Why LinkedIn Will Make You Hate Microsoft Word” – an invocation of Clippy and social graphs and, once again, this belief that the problems of employment and education are simply a matter of faulty code or insufficient engineering.

Education Technology and Education Labor


Perhaps one of the most important questions that educators need to ask themselves – particularly those who see ed-tech as a benign or even progressive force: how is education technology changing “the work” of teaching and learning? How is it changing work for educators? How is it changing it for administrators? How is it changing it for students?

Because, of course, the values that I’ve chronicled in this article – an embrace of “learning to code” at the expense of liberal arts education, the promotion of a “gig economy,” the pressures for “lifelong learning,” the elimination of services like career counseling at schools and their productization in turn – will all have profound implications for educational institutions.

Schools have already demonstrated that they’d rather outsource many technical functions than build capacity in-house. See: the LMS, the MOOC. That seems to be what’s happening with computer science education too, as many universities announced this year they were partnering with coding bootcamps: University of California, Berkeley Extension partnered with Trilogy Education Services. CUNY partnered with Revature. Davidson College partnered with Revature. Bellevue College partnered with Coding Dojo. Sierra College partnered with Hacker Lab. Rather than develop their own faculty’s ability to teach programming, the teaching of CS gets outsourced. “What Happens When Universities and Bootcamps Join Forces?” Edsurge asked in May. Why, nothing but amazingness, as if there’s no history of exploitation in technical training and career education.

Funding data for coding bootcamps can be found at funding.hackeducation.com.

Educators are also experiencing their own fair share of precarity, despite the stereotype of teaching as a “job for life.” The majority of college instructors are “off the tenure track” and more than half are adjuncts – the latter, Phillip Magness suggested this year, a result of the expansion of for-profit higher education.

At the K–12 level, pay for teachers has stagnated. According to the Economic Policy Institute, “In 1994, teachers earned on average 1.8 percent less than other comparable workers; by 2015, they earned 17 percent less, adjusted for inflation. Factoring in total compensation, including health benefits and pensions, teachers earned the same as other workers with college degrees in 1994 but 11 percent less by 2015, the report found.” Many teachers cannot afford to live in the cities in which they teach – teachers in communities experiencing a “tech boom” are particularly hard hit. And as the “gig economy” spreads, “Teachers Are Working for Uber Just to Keep a Foothold in the Middle Class,” The Nation reported in September.

Low pay and challenging working conditions are contributing to a teacher shortage – a staffing shortage that could reach more than 100,000 teachers annually by 2025, according to a report by the Learning Policy Institute, issued this fall. States have taken different approaches to this problem – some increasing compensation and benefits, some using the H-1B visa program to hire teachers from elsewhere, some increasing recruitment efforts at the college level, and some reducing standards for who can be hired as a classroom teacher. Utah, for example, started hiring teachers with no teaching experience or education degree, as did Georgia and New York. (I’ll look at new teacher certification rules in the next article in this series.)

There’s a mistaken belief that teachers, once hired, are almost impossible to fire. But job protections for teachers have been systemically dismantled in recent years. There were a few victories for teachers’ unions this year: A California appeals court ruled that the state’s job protection rules did not deprive poor and minority students of a quality education or violate their civil rights, overturning a lower court’s decision that would have altered California’s tenure rules. The North Carolina Supreme Court ruled that a state law that had phased out teacher tenure was in fact unconstitutional. The Supreme Court in Kansas is now weighing a similar case. The US Supreme Court refused to re-open the Friedrichs v. California Teachers Association case, which involved public sector union dues, after deadlocking on it earlier this year. But education reform organizations continue to file similar sorts of lawsuits and legislators continue to introduce tenure-changing legislation around the country, challenging teachers’ job protections – in Minnesota and in California (again), for example. At the higher education level, Wisconsin continued to steadily dismantle tenure protections for its public university faculty.

There were wins (partial wins, more accurately) for graduate student unions – the NLRB ruled in August that Columbia University grad student instructors were employees, not merely students as the school contended – and wins (partial wins, more accurately) for college athletes. The NLRB ruled in October that “Northwestern University must eliminate ‘unlawful’ rules governing football players and allow them greater freedom to express themselves,” as ESPN reported. “The ruling, which referred to players as employees, found that they must be freely allowed to post on social media, discuss issues of their health and safety, and speak with the media.” For its part, the US Supreme Court refused to hear an appeal of O’Bannon v NCAA, leaving questions surrounding college athletes, their employment status, and possible pay unresolved.

A unionized workforce – one with job protections like tenure that demands academic freedom in its teaching and in its research – is, no surprise, a fine target for automation. Robots are coming for education jobs, the Brookings Institution insisted in January. Once again, Edsurge serves as a bellwether for this narrative, publishing three stories in one week alone in April touting robo-essay writers, robo-essay graders, and their various marketing claims.

The defeat by Google’s artificial intelligence system of the best human Go player in the world, South Korea’s Lee Sedol, prompted plenty of speculation this year that we were on the cusp of automating education. “Will this new ‘socially assistive robot’ from MIT Media Lab (or its progeny) replace teachers?” the Kurzweil AI Network’s newsletter asked in March. “Imagine Discovering That Your Teaching Assistant Really Is a Robot,” The Wall Street Journal said in May, in a story about “Jill Watson” (of course it’s a female name), an automated teaching assistant at Georgia Tech. (It doesn’t look as though students knew they were being experimented upon, but who cares about ethics. This is ed-tech!)

Arguments for automation in education often insist that it’s preferable for machines to handle boring, repetitive tasks – say, robots replacing the low-wage workers who have traditionally been hired by assessment companies to score standardized tests. But it’s worth asking how much of teaching – correctly or not – is viewed as “boring” and “repetitive”? Why do these practices continue? That is, why automate them? Why not eliminate “boring” and “repetitive” work altogether? How much of educational labor is already devalued, with or without technological intervention? How, with or without innovations in artificial intelligence, does our current political and economic climate support an elimination of one of the last bastions of unionized labor in the US, a dismantling of protections for teachers’ jobs as well as of their academic freedom?

During his Senate campaign, Ron Johnson (R-WI) suggested that high quality documentaries like those made by Ken Burns could replace many teachers. “We’ve got the internet,” Johnson said in an appearance this summer, “you have so much information available. Why do you have to keep paying different lecturers to teach the same course? You get one solid lecturer and put it up online and have everybody available to that knowledge for a whole lot cheaper? But that doesn’t play very well to tenured professors in the higher education cartel. So again, we need disruptive technology for our higher education system.”

The disruptions of education technology always have a political bent to them, and rarely is that bent a progressive one. What shape does education technology, in its current manifestation, want the “new economy” to take? What products and services and stories is it selling to that end?

Financial data on the major corporations and investors involved in this and all the trends I cover in this series can be found on funding.hackeducation.com. Icon credits: The Noun Project

Audrey Watters

