Top George Washington University research and cybersecurity leaders hosted Lt. Gen. Michael Vetter, chief information officer of the German Ministry of Defense, on Wednesday for a roundtable discussion on the impact of advancements in artificial intelligence (AI) and the global implications of the new technology.
Pamela Norris, the vice provost for research at GW, began the discussion at Duques Hall by highlighting the importance of the topic given the recent, rapid developments in artificial intelligence. “Cyber security is an area of great interest at GW for our students, our faculty and our staff,” Norris said. She added that research and policy in this area require collaboration between disciplines and institutions and that GW is ready to lend its expertise to those endeavors.
The discussion was moderated by Costis Toregas, the director of the Cybersecurity and Privacy Research Institute at the School of Engineering and Applied Science.
Vetter spoke about current geopolitical challenges including the impact of the Russian invasion of Ukraine and the role of technology in modern warfare.
“Concerning cyber and IT, we have to reckon that this turning point in history takes place at a time we are all experiencing a technological tsunami,” Vetter said. “Its most consequential areas include biotech, quantum technology, microelectronics, semiconductors, robotics and artificial intelligence. Digital has become one of the engines that is driving change in almost every area of society.”
Vetter said the military is already seeing significant consequences of new technology as it fights an increasingly digitized battle.
“In addition to classical cyber-attacks, with the huge amount of disinformation, the public domain plays a very important role in this conflict. It is used by the Russian Federation in a very aggressive way,” Vetter said.
He said that while hybrid warfare is not new, the technology being used is. As an example, Vetter noted that deepfake AI is now so advanced that in some instances people cannot distinguish a fake image or video from a real one.
He said disinformation campaigns have now become common practice in modern warfare.
Brian Ensor, the associate vice president of cybersecurity, infrastructure and research services for GW Information Technology, asked Vetter how to help others better understand this new technology.
“A point you have made around generative AI and deepfakes is how it’s hard to distinguish what is real if you don’t know what to look for. So, as we look at our strategies for informing and developing this skill base within our organization and across the university, we must think of how to lay it out,” Ensor said. “What are the models of getting technology and information into the hands of people so that it is usable to them, and they can be part of the greater mission that we’re trying to build?”
Vetter said the German military has invested significant resources in educating its workforce on digital literacy.
“We have specialist training and depending on the role of the individual, they get additional training and additional courses in cybersecurity,” Vetter said. “We have a center for cybersecurity as well, which includes interactive possibilities for computer forensic capabilities, a computer emergency response team, and we also use a lot of offensive cyber capabilities to attack our system, which is good for training.”
He underscored the importance of ensuring more people understand how to use this technology, which he said is here to stay, and of securing a strong workforce in the cyber domain.
Geneva Henry, dean of libraries and academic innovation and vice provost for information technology at GW, said even at the university there have been challenges caused by the rapid advancements in AI with software like ChatGPT.
“I've been around AI for a long time but what we've been seeing on campus, as of this last fall, the emergence of this technology via the use of ChatGPT in the classroom is sort of unsettling, disrupting the way things normally operate,” Henry said.
Henry asked Vetter to discuss some of the challenges and opportunities he’s seeing with the rapid and unchecked deployment of generative AI and ChatGPT. “How does cybersecurity stand to be impacted and/or assisted, and are there currently any contemplated uses of the technology by the German army?” Henry asked.
Vetter said there are many beneficial opportunities—and risks—for using AI. Regulations are needed, he said, and ideally consumers of this new technology should be aware of the risk while still being able to enjoy the benefits.
“Apart from our military operations, there are opportunities to use AI in how we assess, develop and analyze our workforce. How do we assess the health of the military? It can help us do preventive maintenance repairs. It can help us become smarter in getting our company systems operationally ready,” Vetter said. “And this technology will also be there for environmental protection, for sustainable use of resources. So there are many, many applications, and we have to advocate for those opportunities.”
Sibin Mohan, an associate professor in the Department of Computer Science, raised the importance of interdisciplinary research, one of GW’s strengths. Mohan pointed to advances in drones and autonomous vehicle systems powered with AI as examples of strong research that is having an impact in the field. Vetter agreed and stated that these could be potential areas of cooperation between GW and German universities.