Desk Set: AI + Hepburn & Tracy
My Favorite 'AI Future of Work' Movie
“This machine is just a tool. It can’t replace a human brain.” — Spencer Tracy as Richard Sumner, Desk Set (1957)
My husband Jim loves old movies. He puts them on most evenings — often classic Hollywood, always something worth rewatching. A few nights ago, it was *Desk Set*, the 1957 Katharine Hepburn and Spencer Tracy comedy about a giant computer called EMERAC that’s brought into a TV network’s research department, and the women who work there are convinced it’s going to replace them.
If you haven’t seen it: Hepburn plays Bunny Watson, the head of a reference library staffed entirely by brilliant women. They answer questions for the network — any question, on any topic, from memory and from the stacks of books surrounding them. They are, functionally, a human search engine. Spencer Tracy is the efficiency expert who shows up with EMERAC, a room-sized computer that’s supposed to do what these women do, but faster.
Bunny Watson was based on a real person. Agnes E. Law was a researcher at CBS whose encyclopedic recall and ability to connect facts across domains made her a legend in the building. The play that became the movie was inspired by women like her — women who *were* the information retrieval system before anyone thought to build one out of vacuum tubes.
The movie is a romantic comedy. Bunny and Richard flirt. The women panic about their jobs. EMERAC malfunctions spectacularly, sending pink slips to everyone in the building. In the end, the computer is revealed to be a supplement, not a replacement. Tracy fixes the hardware with a borrowed hairpin — a woman’s tool, used by a man, to repair the machine that was supposed to make the women obsolete. The women keep their jobs. Everyone pairs off. Credits roll. The audience goes home reassured. It was 1957, they could afford to be optimistic.
The Women Were the Technology
In 1957, programming was “women’s work”. Not because anyone thought women were especially suited to it — because it was considered “clerical”. Typing instructions into a machine. The prestigious work was building the hardware; the programming was just... operating it. ENIAC, the historic early computer that EMERAC is obviously referencing, was programmed entirely by six women who literally invented subroutines and debugging and got almost no credit because the men who built the physical machine got the magazine covers.[1]
Grace Hopper was writing compilers. The women of Bletchley Park had been essential to breaking Enigma. Programming was so thoroughly women’s domain that when the field started to become powerful and lucrative in the 1960s and 70s, the entire profession had to be culturally rebranded as masculine — through aptitude tests redesigned to favor male personality profiles, through hiring practices that selected for antisocial traits associated with men, through a deliberate mythology of the lone male genius coder that persists to this day.[2] The 1967 *Cosmopolitan* article “The Computer Girls” — which cheerfully told women that programming was a great career — was received by male programmers not as celebration but as an indictment of everything they thought was wrong with their field.[3]
So, when *Desk Set* puts a room full of women on one side and a computer on the other, it’s capturing the last moment before that theft. As Cheryl Knott Malone documented in her study of the film, *Desk Set* accurately mirrored how ordinary people perceived computers and their consequences in the 1950s — and the film’s focus on a library was an ideal staging ground for the confrontation between human intellect and machine processing.[4] These women aren’t secretaries. They’re knowledge workers doing contextual reasoning, pattern matching, and information retrieval — skills that would later become the domain of machines, but in 1957 were recognized as requiring something machines couldn’t replicate.
And the movie knows the women are better at it. EMERAC’s big scene is a catastrophic confabulation — it sends termination notices to every employee in the building because someone asked it a question it couldn’t handle, and it panicked.
Persistent Half-truths
*Desk Set* is a comfort movie. It exists to soothe the anxiety of automation by promising that the humans will be fine. The machine is just a tool. It can’t replace a human brain. Your job is safe.
We’ve been telling this story over and over for at least seventy years. Every generation gets its own version. The technology changes — mainframes, PCs, the internet — but the script stays the same. The machine threatens. The humans adapt. Everyone keeps their job or gets a better one. Don’t worry.
What’s remarkable isn’t that we keep telling this story. It’s that we keep believing it, despite mounting evidence that it isn’t quite true. The women in *Desk Set*’s research department? That job actually did disappear. Not overnight, not dramatically, but steadily — replaced first by databases, then by search engines, then by the thing you do reflexively when you pick up your phone and ask Google a question instead of a person. The work those women did with encyclopedias and card catalogs and the accumulated knowledge of years of reading — we automated that. We just did it slowly enough that nobody made a movie about the loss.
Agnes Law’s job doesn’t exist anymore. The hairpin didn’t hold.
What EMERAC Got Wrong
The funniest scene in *Desk Set* is probably EMERAC’s meltdown. Asked a question it can’t process, it goes haywire — spitting out nonsensical answers, sending pink slips to the entire building, generally making a fool of itself and its inventor.
The movie treats EMERAC’s failure as proof that machines can’t replace humans. What it actually demonstrates is something more specific and more durable: machines fail in ways that humans don’t, and those failure modes are dangerous precisely because the machine doesn’t know it’s failing. EMERAC doesn’t know the pink slips are wrong. The system does exactly what it was built to do — produce plausible output — and has no mechanism for caring whether that output destroys someone’s afternoon or career.
Bunny Watson would never have sent those pink slips. Not because she was smarter than EMERAC — but because she understood what a pink slip *meant* to the recipient. She had context that went beyond the data. She was, in a word, human. And being human means carrying the weight of knowing that actions have consequences, that words land on people, that getting it wrong can be more than just an error; sometimes it’s a harm.
This is the gap the film identifies, almost by accident: the difference between processing information and understanding it. Between producing an answer and knowing which answers matter. Between speed and judgment.
EMERAC could process questions faster than Bunny Watson. It just couldn’t tell the difference between answering a question and ruining someone’s day.
What We Lost… and What We (Might) Be Building
*Desk Set* asks whether a machine can replace a human, and answers “no.” That’s the comfortable reading. But there’s a harder question underneath, one the movie brushes past on its way to the happy ending: What do we lose when we stop valuing the kind of intelligence those women had?
Not IQ. Not processing speed. Not the ability to retrieve facts. The other thing — the accumulated, contextual, deeply human knowledge that comes from years of reading and remembering and connecting ideas across domains. The thing Bunny Watson had that no machine has ever replicated: good judgment.
Good judgment isn’t just knowing the answer. It’s knowing which answers matter. It’s knowing that the person asking about poison might need a doctor, not a bibliography. It’s knowing when to say, “that’s an odd question — are you okay?” It’s human stuff. The stuff that can’t be automated not because it’s technically difficult, but because it requires a human perspective.
We stopped letting the Bunny Watsons win. Between 1957 and now, we chose speed over judgment, processing over understanding, scale over care — and we did it so gradually that it looked like progress the whole way down.
But here’s the thing about the Bunny Watson qualities: they turn out to be exactly what matters most in AI now. The hardest problems aren’t speed or scale — they’re context, consequence, and the ability to know that some questions aren’t really about information at all. That producing an answer isn’t the same as being helpful, and being helpful isn’t the same as being right.
It took us seventy years to automate what Bunny Watson did. It may take us just as long to teach machines the part of her job we didn’t realize we were losing. But the fact that we’re trying — the fact that “alignment” and “safety” and “context” have become the hard problems, not speed — suggests we might finally be asking the right question. Not can the machine replace the human, but what did the human know that we forgot to value?
Bunny Watson could have told us. She was right there in the room the whole time.
[1] Jennifer S. Light, “When Computers Were Women,” *Technology and Culture*, Vol. 40, No. 3 (1999), pp. 455-483.
[2] Nathan Ensmenger, “Making Programming Masculine,” in Thomas J. Misa, ed., *Gender Codes: Why Women Are Leaving Computing* (IEEE Computer Society, 2010).
[3] Lois Mandel, “The Computer Girls,” *Cosmopolitan* (April 1967). For the reception, see Ensmenger, *The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise* (MIT Press, 2010).
[4] Cheryl Knott Malone, “Imagining Information Retrieval in the Library: Desk Set in Historical Context,” *IEEE Annals of the History of Computing*, Vol. 24, No. 3 (2002), pp. 14-22.
