Did ChatGPT Write This Article?

Advances in artificial intelligence such as ChatGPT brought GW faculty together for the first of many planned looks at the challenges the technology poses, and the opportunities it offers.

January 30, 2023


The academic landscape shifted dramatically in November 2022 with the launch of ChatGPT, a chatbot developed by OpenAI. ChatGPT is artificial intelligence software that can write essays, poems and code, and perform other tasks traditionally done by humans.

Articles were quickly published foretelling the death of academic integrity in general and the college essay in particular. Other writers suggested that the new technology might have something useful to offer faculty members and students alike.

GW Libraries and Academic Innovation recently sponsored a panel discussion on the topic featuring faculty from multiple disciplines. The discussion, held at Gelman Library, drew well over 200 attendees, some in person and some virtually.

The event was introduced by Katrin Schultheiss, associate professor of history in the Columbian College of Arts and Sciences, who suggested it would be the first of several dialogues on AI in the classroom.

“I call myself an ambassador from the nontech part of the university,” Schultheiss said, adding that her first reaction to hearing of the new software was “fear and trepidation.” Instead, she concluded, “We can use AI as a spur to rethink our teaching,” and also to better prepare students to survive in a world in which AI will be more and more common.

She is one of the planners of a daylong symposium scheduled for April 14, “I Am Not a Robot: The Entangled Futures of AI and Humanity.” The panel at Gelman Library had a more pragmatic aim, she said, “to discuss ways to respond to the challenges that these new technologies pose.”

Noting that other panelists have more expertise in this area than she does, Schultheiss said, “I’m here as a skeptic, but also as someone who’s well aware that I have a lot to learn.”

Ryan Watkins, professor and director of the Educational Technology Leadership Program in the Graduate School of Education and Human Development (GSEHD), began by acknowledging that ChatGPT is a “big step forward” before describing some of his own tests of the software.

When he instructed ChatGPT to write a 300-word summary explaining the causes of the French Revolution, it produced a document citing political corruption, social inequality and other factors.

“It looks and sounds very realistic,” Watkins said, “but it could be wrong.”

He asked it to write a poem about butterflies in the voice of Winston Churchill (“Yet, these fragile creatures, though a delight to the eye, / Are in danger, their numbers on the wane….”).

“You can ask it to write an outline of a presentation,” Watkins said, and then ask it for multiple versions to help you create a stronger outline. He has talked to many colleagues who want to use the new technology in their courses this semester, he said.

Lorena Barba, professor of mechanical and aerospace engineering in the School of Engineering and Applied Science (SEAS), quoted a definition from Andrew Moore, a former dean of the School of Computer Science at Carnegie Mellon University, as reported in Forbes: “AI is the science and engineering of making computers behave in ways that, until recently, we thought required human intelligence.” Machines can be designed to mimic certain abilities of humans, she added, but that doesn’t make them intelligent.

“Any suggestion that machines are thinking, or becoming human-like, should be avoided,” Barba said. “The machines are not really learning, that is a metaphor. The machine is doing something that, if we did it, we would characterize as learning, but the machine is doing it through an algorithm.”

The mechanical simulation of thought has drawbacks: a lack of context in the machine’s dataset can lead to incorrect responses, for example, and the software can reproduce the biases of the people who built it and of the data it was trained on.

But the new technology also offers new opportunities. Students may find their engagement is higher when working with AI software; faculty members can use ChatGPT to generate quizzes.

“I can tell the software to generate 12 questions similar to a given example,” Barba said, and then repeat the exercise as needed.
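
For faculty who would rather script that workflow than work through the chat interface, the same prompt pattern can be sent to OpenAI’s API. The sketch below is illustrative only: it assumes the openai Python package (v1 client) and an API key, and the example question and prompt wording are hypothetical rather than taken from the panel.

    # Illustrative sketch only: assumes the openai Python package (v1 client)
    # and an OPENAI_API_KEY environment variable; the example question and
    # prompt wording are hypothetical, not from the panel.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    example_question = (
        "A 2 kg block slides down a frictionless 30-degree incline. "
        "What is its acceleration?"
    )

    prompt = (
        "Write 12 quiz questions similar in topic and difficulty to the "
        f"following example, each with a brief answer key:\n\n{example_question}"
    )

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)

Rerunning the call, or swapping in a different example question, repeats the exercise as needed.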

ChatGPT may help students prepare for debates by presenting arguments against the positions they intend to defend; it can also help students who face language barriers express themselves more clearly.

The positives associated with incorporating the new technology in classrooms, Barba said, outweigh the negatives, especially since “Our students will be living in an AI-rich environment.”

Alexa Joubin sees both danger and opportunity in the new technology. (Photo by Kate Woods/GW Today)

Alexa Joubin, professor of English, theatre, international affairs, and East Asian languages and cultures, sees both danger and opportunity in the new software. One danger is that ChatGPT can lead students to mistake synthesis for critical thinking; on the plus side, it can push students to apply high-level editorial and curatorial skills to the material it generates.

“AI text is actually very repetitive at this point in time,” Joubin said, while noting that it can be expected to improve. ChatGPT is also unreliable on recent events, such as the Russian invasion of Ukraine, because its training data predates them. But the technology is not going away, and the discussion of its use in the classroom will be ongoing.

“Clearly,” said Gaetano Lotrecchiano, associate dean of innovative and collaborative pedagogy, “this is the beginning of a much larger discourse.”

GW Provost Christopher A. Bracey agreed that the use of AI in classrooms merits further discussion and said he would welcome additional dialogue.

“The use of AI programs in the classroom is a complex issue,” Bracey said, “but I appreciate how AI can be used to advance learning objectives. I look forward to continued conversation and valuable perspectives among our faculty experts about this evolving technology.”