Steganography, the art of concealing secret messages within other text or media, has been practiced for centuries. Its history dates back to Ancient Greece, but the first recorded use of the term was in 1499 in Steganographia, a treatise on hidden writing disguised as a book about magic. The technique remains relevant today: a sender can hide a message in an image by subtly altering its pixel values, and a computer can then compare the altered image with the original to decipher the hidden message for the recipient.
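
To make the traditional approach concrete, here is a minimal sketch of least-significant-bit (LSB) embedding in Python. It is illustrative only and is not Bauer's method; the "image" is just a flat list of 8-bit pixel values, and the helper names and example message are hypothetical.

```python
# Minimal sketch of classic LSB steganography (illustrative only, not Bauer's method).
# The "image" is simply a flat list of 8-bit pixel values.

def embed(cover_pixels, message):
    """Hide one message bit per pixel: flip the lowest bit for a 1, leave it for a 0."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(8)]
    if len(bits) > len(cover_pixels):
        raise ValueError("cover image is too small for this message")
    stego = list(cover_pixels)
    for i, bit in enumerate(bits):
        if bit:
            stego[i] ^= 1  # flipping the least-significant bit barely changes the pixel
    return stego

def extract(cover_pixels, stego_pixels, length):
    """Recover the message by comparing the stego image against the original cover."""
    bits = [c ^ s for c, s in zip(cover_pixels, stego_pixels)][: length * 8]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
    ).decode()

cover = list(range(256)) * 4                       # stand-in for a 1,024-pixel grayscale image
stego = embed(cover, "meet at dawn")
print(extract(cover, stego, len("meet at dawn")))  # -> meet at dawn
# Note: anyone who obtains both `cover` and `stego` can run the same comparison,
# which is exactly the weakness discussed next.
```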

However, traditional image steganography has a significant flaw: an adversary can find the original image and compare it with the image containing the hidden message. This comparison allows them to decode the message and potentially compromise the sender’s security. Luke Bauer, a Ph.D. student at the University of Florida conducting research at the Florida Institute for Cybersecurity Research (FICS), is working to eliminate this vulnerability by developing an innovative approach that leverages artificial intelligence (AI) image generation.

Bauer’s research focuses on using AI to generate images that already contain hidden messages, eliminating the need to compare an original image with an encoded one. This means a sender can post the generated image to public platforms without arousing suspicion, making it difficult for an adversary to identify which post carries the encoded message. Meanwhile, the receiver can decipher the hidden message using the same AI model, without requiring specialized software.
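
The toy sketch below illustrates the general principle behind such generative steganography: sender and receiver share the same deterministic model, the message bits steer what the model outputs, and decoding simply reruns the model rather than comparing against an original. It is a deliberate simplification, not Bauer's actual technique; the hash-based "model", the shared key, and the function names are hypothetical stand-ins for a real AI image generator.

```python
import hashlib

# Toy illustration of generative ("coverless") steganography: no original image
# exists to compare against, because the message itself steers generation.
# A deliberate simplification, NOT Bauer's technique.

def candidates(key, position):
    """Two candidate pixel values the shared model could emit at this position.
    A real generative model would rank many plausible outputs; two keeps it simple."""
    h = hashlib.sha256(key + position.to_bytes(4, "big")).digest()
    first = h[0]
    second = (first + 1 + h[1] % 255) % 256  # guaranteed to differ from `first`
    return first, second

def generate_stego_image(key, message):
    """Generate pixels directly from the message: each bit picks one candidate."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(8)]
    return [candidates(key, pos)[bit] for pos, bit in enumerate(bits)]

def decode(key, pixels):
    """The receiver reruns the same model and sees which candidate was chosen."""
    bits = [candidates(key, pos).index(p) for pos, p in enumerate(pixels)]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(len(bits) // 8)
    ).decode()

key = b"shared-secret"
image = generate_stego_image(key, "meet at dawn")
print(decode(key, image))  # -> meet at dawn, recovered with no original to compare
```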

Bauer is driven by the real-world applications of his research, particularly in situations where people face danger or live under oppressive regimes with strict censorship. He hopes that his work can provide a valuable tool for those seeking to communicate secretly. Bauer’s initial research, which focused on text-based steganography using large language models, was conducted as part of the DARPA Resilient Anonymous Communication for Everyone (RACE) project, in partnership with Galois, a company that leverages research to deliver solutions and tools that increase security, reliability and operational efficiency. Galois has continued to develop the project and plans to release an app, making it accessible to a broader audience. Applying the knowledge gained from that research to images was a natural next step in advancing steganographic techniques through AI.

 “Generative models were just beginning to take off when I began my Ph.D. Although there were obvious uses for them as personal or commercial tools, I wished to examine how they could be used to help people,” Bauer said. “Through my research here at FICS, I was able to discover and improve ways that these models could be used to protect people’s privacy, freedom of expression, and safety.” 

Bauer’s fascination with cybersecurity began during his undergraduate studies at Duke University, where he took a class on hardening systems against attacks. This interest led him to work on steganography with his advisor, Vincent Bindschaedler, Ph.D., an assistant professor in the UF Department of Computer & Information Science & Engineering.  

“Despite numerous challenges in his Ph.D. journey, Luke has demonstrated his unwavering commitment to research. In fact, his unending efforts and tenacity were instrumental for the project’s success,” Bindschaedler said. 

Building on his earlier work hiding messages in text generated by large language models, Bauer plans to continue this line of research by incorporating steganographic AI generation of audio and video, further expanding the possibilities of secure communication.

“In my research, I have strived for usability and theory that holds up in real-world use cases,” Bauer said. “I hope that one day my research will be used by people around the world.”