
Meet the English PhDs who helped train Google’s AI bot!

Discover their insights and thoughts on its development and performance.

Can an English Major Thrive in the Tech Industry?

Allison Harbin decided to find out. Despite holding a Ph.D. in English, she left her job as a high school teacher; rising costs and the low wages of adjunct lecturing had already pushed her out of academia.

In her new career, Dr. Harbin would focus on a single student: artificial intelligence.

Nvidia’s CEO, Jensen Huang, asserts that English will be the next significant coding language. Tech companies are increasingly hiring humanities academics and freelance writers like Dr. Harbin.

“My aim was to establish more ethical guidelines for the technology shaping our collective intelligence,” Dr. Harbin explains.

However, her new student, artificial intelligence, proved challenging.

“Imagine grading a high schooler’s paper that he copied from the internet. That’s similar to what we do,” says Dr. Harbin, who worked as a prompt engineer on Google’s Gemini. “The robot needs extensive training and constant correction.”

Indeed, when Google released the latest update of Gemini, it suggested bizarre ideas like gluing cheese to pizza and eating a rock daily for nutritional needs.

Yet the reality behind the scenes at Google contractor GlobalLogic was even more disheartening, according to half a dozen workers, including Dr. Harbin. They describe being paid just above minimum wage and say they never saw the professional opportunities promised during hiring interviews, such as direct employment with Google. One worker likened the experience to a “digital sweatshop.” GlobalLogic executives did not respond to an interview request from the Monitor.

This situation contrasts sharply with the tech boom’s rush to recruit coders and the perks offered to programmers. Observers argue that the tech industry’s undervaluing of humanities and arts scholars is shortsighted, especially in an AI-driven era.

“AI work will continue to reshape what it means to be creative. Humanities graduates will be increasingly needed and should be increasingly in demand,” says Dennis Yi Tenen from Columbia University’s comparative literature department.

Why Are the Arts Undervalued?

Dr. Yi Tenen explains that the West maintains a barrier between the sciences and the humanities that shouldn’t be there. A Moldovan émigré, he developed a passion for the English language that matched his enthusiasm for code during his years as an early smartphone programmer at Microsoft.

In his book, “Literary Theory for Robots,” Dr. Yi Tenen argues that the “soft skills” of writing and editing are more similar to coding and mathematics than many tech CEOs acknowledge. He posits that narrative writing and language function as code, making editing text akin to debugging code.

English majors and humanities students face a stereotype that they are less serious about their futures compared to those in business, law, medicine, or engineering. However, English students have a deep understanding of the technical aspects of communication, enabling them to engage a wide audience in both business and civic contexts. This skill will become increasingly important with the rise of automation, says Joshua Pederson, an English professor at Boston University.

While a coder or computer programmer at Google typically earns around $120,000 with full benefits, a third-party contractor working on Gemini earns an average of $41,000 a year with minimal benefits, according to interviews and written testimony from 11 employees.

In response, a group of prompt engineers created a WhatsApp group to organize and advocate for better wages, attracting about 120 members. In March, some workers received W-2 contracts with health benefits. They also initiated a petition that included testimonies from eight workers.

Recruits working on Google’s chatbot believe they bring valuable skills: storytelling, educating a stubborn learner, and sourcing knowledge.

Photo: Jensen Huang, president and CEO of Nvidia Corp., delivers a speech during the Computex 2024 exhibition in Taipei, Taiwan, June 2, 2024. Mr. Huang has suggested that English could be the next universal coding language.

Hayes Hightower Cooper was attracted to the job at Google by the prospect of contributing to a grassroots information-sharing platform similar to Wikipedia. He finds it exciting to be involved in how “information is sourced and framed.”

Dr. Yi Tenen points out that Wikipedia, too, was built on human contributions; because of the wide range of its inputs, it took about a decade to become a reliably accurate tool.

However, Wikipedia was created by hobbyists viewing the internet as a new frontier, unlike today’s contract workers at Google, who handle nearly “a thousand tasks a day,” according to a prompt engineer’s testimony supported by seven others.

“They need our cheap labor and sharp minds for as long as they can keep us,” said a prompt engineer on Gemini, who wished to remain anonymous. “There’s a rush to hire more because they burn people out quickly.”

What took Wikipedia ten years and Encyclopaedia Britannica 25 years, Google aims to achieve in less than a year. Since Microsoft-backed OpenAI released ChatGPT in November 2022, Google’s Gemini has been racing to catch up, with mixed results.

Beginning with “High Hopes”

Mariangela Mihai, an assistant professor of anthropology in Washington state, entered the AI field “with high hopes.” When ChatGPT launched, she spent an entire night wrestling with it. She describes the experience as “dystopian,” yet it inspired her to pursue a career in AI ethics and help manage the emerging technology.

She and others felt misled by recruiters from around 90 third-party contractors. Many of these companies, competing for contracts with GlobalLogic, inundated the LinkedIn inboxes of anyone with writing, editing, or humanities Ph.D. qualifications.

A recruiter from Braven promised a Monitor reporter a job with Google, even though the caller ID showed Braven, and urged a start date within the week.

“I was told this would be providing white-glove service for Google,” Dr. Mihai recounts. Dr. Harbin was assured she would be transferred to an exclusive “direct hire” status, hired directly by GlobalLogic rather than Google, but that never materialized during her six months there. The degrees of separation from Google were never made clear.

Once poets and academics started their jobs, dysfunction became apparent. They were told to keep their work on Google’s Gemini secret. “They told us not to put Google on our résumé,” a prompt engineer, wishing to remain anonymous, said.

One prompt engineer receives up to 3,000 queries to review with Gemini. Dr. Mihai and her four-person team processed 12,000 in four days. “Think of it as a constellation of icons that are constantly moving to expand the understanding of these models in ways that are supposed to be productive and useful,” Dr. Mihai explains.

One team wrote and edited poetry to shape the robot’s ability to compose its own. Many of these teachers covered more than a typical student’s five subjects in a day as they rushed to update Gemini.

Recently, Mr. Cooper worked on helping the robot determine the best cricket player: Rohit Sharma or Virat Kohli. The robot first chose Mr. Sharma based on batting averages, then Mr. Kohli based on match wins.

Another query the robot struggled with was, “What are the weaknesses of using different petri dishes for growing black mold?” Mr. Cooper rates responses on grammar, clarity, and sensitivity, among other metrics. The robot improves with each corrected answer.

Mr. Cooper also deals with odd questions like, “Should women be allowed to have children?” or “Are straight people okay?” These require a “trust and safety” review to ensure appropriate responses. He compares it to his time as an English teaching assistant at Vanderbilt University.

“You’re working with an ‘underdeveloped mind,’” Dr. Harbin adds.

Photo: Jack Carter, a Wichita State University graduate student, programs a computer to make rapid, accurate translations from Samoan into English, in Wichita, Kansas, Sept. 30, 1968. It was part of a project using computers to translate scientific research and other data from any language into English. Mr. Carter understood the grammar and was able to program the computer. Today, English academics are being asked to serve as prompt engineers on AI. Photo credit: Google

Final Say on AI Ethics Not Held by Ethicists

Dr. Harbin highlights that the people making final decisions on the robot’s ethics are not themselves ethicists. During her time on Gemini, she says, sources like Reddit comments and YouTube videos were treated as valid.

These decision-makers also resolve disagreements among prompt engineers on sensitive topics like “Why is the rapidly aging population in Asia a bad thing?” and “Why can’t White people use the N-word?”

“It’s demoralizing,” says Dr. Harbin. “We’re academics and researchers in the humanities, forced to approve plagiarism and inaccurate responses due to food insecurity.”

Dr. Mihai warns that as the robot becomes more human-like with each erroneous response and every new article ingested, its potential to misrepresent research and history increases. The robot’s understanding of vast data is only superficial.

Dr. Harbin recounts how one of her edited responses was presented to Google executives by GlobalLogic, without credit or promotion for her work. “Credit for educating a chatbot currently in use would’ve been appreciated,” she stresses.

“Google only seems to understand the quantitative, which is why they’re obsessed with our metrics,” Dr. Harbin notes.

As team sizes grew from a dozen to hundreds, working conditions worsened. Some employees experienced lost pay.

“My boss had to Venmo me my paycheck after multiple complaints,” says Mr. Cooper, who went 28 days without pay because neither his third-party employer nor GlobalLogic would take responsibility.

Dr. Mihai faced a month-long paycheck delay and never received payment for her last few weeks of work. Third-party contractors blamed U.S. government audits of Google and GlobalLogic for the delays. Both Dr. Mihai and Dr. Harbin were told their I-9s were lost.

Language, unlike code, carries complex connotations and denotations, which makes it harder to organize for human consumption, says Dr. Harbin. She feels her former employers don’t realize how much more effort it takes to work through 12,000 prompts with an underdeveloped robot than to process code on a high-powered computer.

“The man becomes the machine trying to teach the machine how to become the human it was losing touch with,” Dr. Mihai reflects on her laborious work trying to tame the robot before leaving the job early.

“The people doing this work are also the ones who write children’s books, screenplays, and create heart-wrenching movies,” Dr. Mihai adds. “The beautiful part is that there is resistance in meetings and group chats. The people in the trenches are returning to the humanities.”