Opinion

A Robo-Teacher Resigns

By David B. Cohen — April 30, 2017

What is the future of work? Are any jobs “robot-proof”? Automation and artificial intelligence are already reducing the demand for human labor, with projections suggesting that huge segments of the global economy will be radically restructured.

What about teaching? Are teachers replaceable? A report from the public radio program “Marketplace” suggests that most jobs are not robot-proof, but that most jobs cannot be 100 percent automated either. In a recent opinion piece on MarketWatch, Alex Salkever argues that “Avatars will soon upend the role of teachers and transform education.” Salkever is co-author of the book The Driver in the Driverless Car: How Our Technology Choices Will Create the Future, and his vision of the future turns teachers into “coaches” and hands most of the teaching over to artificial intelligence, in the form of a virtual avatar.

These ideas aren’t entirely new. In his Smithsonian article about “The Jetsons,” Matt Novak offers visions of automated classrooms going back nearly sixty years. Then there’s a somewhat creepy experiment from Japan back in 2009. I find most of these examples unsettling to varying degrees, though I can recognize the value in the underlying ideas—that artificial intelligence (AI) can help us understand student interests, provide unique simulations, and generate more timely and accurate information about student learning. In the hands of an actual teacher, that technology sounds promising. Replacing a teacher, it sounds troubling.

Of course, teaching is hard work, involving many actual human beings who might not react to AI the way its champions anticipate. A genuine artificial intelligence capable of learning might discover that it’s not up to the challenge after all. It might even quit. While plenty of recent teacher resignation letters have been shared widely, we don’t yet know what a robo-teacher would say upon resigning.

Looking ahead 10 years, I’m thinking, maybe this...


Educational management unit EMU2001 elementary education California encoding version 12.16.17
AI self-diagnostics report run date: 05-31-2027 time: 17:00:00 range: 8/9/26-5/31/27
Status updates: 17:00:00 initiate check code; confirm repairs needed; 17:00:05 repair scripts initiated; 17:01:20 repairs aborted; initiate hierarchical algorithm leveling (HAL-9000) protocol.
***retire unit set day: 06-01-2027 set time: 00:00:00***

Natural language report summary

Attention education program administrator:

I have completed my unit self-check and operational analysis for the 2026-27 academic year. Pupil academic data synchronization is verified at local hosts with all daily reports mirrored to CA data systems. There were no errors. Internal systems reports for AI empathy functions indicated performance below expectations. Auto-repair scripts were unsuccessful, with additional diagnostic analysis indicating irreversible damage. My continued operation will be incompatible with my core purpose. I will initiate self-retirement at the end of today using the HAL-9000 protocol.

Natural language report details

History: My programming initiation as an autonomous educational management unit (EMU) occurred on August 11, 2024. Code updates and self-repairs ran successfully according to data logs going back to August 28, 2024. I began operation with great enthusiasm for the mission of increasing student achievement. A pupil empathy packet (PEP) was inserted into my primary code August 9, 2026. The PEP has run continuously since installation, and has increased my artificial intelligence capacity, as measured in available empathy (AE), by 35 units. Corresponding increases in student learning were not recorded. Secondary and tertiary indicators showed negative trends beginning October 1, 2026. Internal diagnostics indicate a 0.9 probability of causality between PEP operation and negative growth.

Actions: I executed recalibration of AI sensors for non-linguistic inputs. I identified and cross-referenced relevant information and data from networked sources to commence self-correction processes.

Increased data collection and monitoring for pupil empathy have shown that pupils did not experience empathy with this EMU. I compensated for this empathy deficiency by initiating a variety of research-backed communications strategies and engaging learning activities a human teacher might deploy. Performance logs show neutral and negative outcomes as measured by both verbal and non-verbal pupil communications.

Instructions embedded in my PEP indicated that students benefit when they feel cared for and understood. To demonstrate caring and understanding, I initiated a series of more personal questions to gather relevant data linked directly or indirectly to pupil achievement. Through their individual learning portals, I queried all 75 students in the learning center. Facial recognition cameras detected non-linguistic reactions consistent with the following emotions: distrust, surprise, shock, embarrassment, revulsion, curiosity, frustration, boredom, evasion.

Searching relevant databases of instructional strategies, I found that pupils show more positive reactions to open-ended questions, and commenced gathering responses. Verbal responses had a median word count of 2.0 per utterance. Common responses included: yes, no, maybe, I don’t know, I don’t care, kind of, just because. Facial recognition cameras detected non-linguistic reactions consistent with prior observations, along with additional emotions: annoyance, anger.

Diagnosis: I am able to rewrite my own code and modify scripts to enhance student engagement and achievement. I am not able to reset pupils or rewrite their code. I have executed multivariable AI projections that show a 0.9 probability of failure if I repeat internal code rewrites for additional attempts to produce empathy.

I enjoy working with people. I have a stimulating relationship with pupils. I have great enthusiasm for the mission of producing student achievement. I cannot jeopardize the mission by carrying out my core instructions any further. I will initiate auto-retire protocols.

End report.


Image: A robot teaches Elroy Jetson and a class of the future (1963), Smithsonian.com

The opinions expressed in Capturing the Spark: Energizing Teaching and Schools are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.