

MSU Museum panel teaches about AI, politics and misinformation

March 20, 2026

Speakers enlighten the crowd during the "AI, Elections, and the Fight for Facts" event at the MSU Museum in East Lansing, Michigan, on Thursday, March 19, 2026.

The MSU Museum's "Blurred Realities" exhibition, which opened in January, gives visitors the chance to interact with AI-created content and think critically about what they are consuming. On March 19, the museum hosted the panel "AI, Elections, and the Fight for Facts," put on in collaboration with the Michigan State University Department of Political Science.

One of the exhibits within the "Blurred Realities" series, entitled "Generative Persuasion," was curated by Dr. Jennifer Gradecki and Dr. Derek Curry, associate professors of Art + Design at Northeastern University. Gradecki and Curry both participated as panelists in the event.

The "Generative Persuasion" exhibit showcases how generative AI can misinform citizens and create convincing, false arguments that sway people toward radical views. Curry and Gradecki were inspired to take on the project by real-world events.

“Our artistic research project, Generative Persuasion, is based on research into current and historical approaches to influence operations, including the use of microtargeting and AI in disinformation and persuasion campaigns. To give two examples: One of the references for our project is Cambridge Analytica, which became well known in 2018 for the so-called Facebook scandal, where data from over 50 million profiles was harvested, without consent, to microtarget voters to persuade them to vote for Brexit and Trump in 2016. Exactly how much of an impact Cambridge Analytica had on those elections has been debated; both of the campaigns Cambridge Analytica was hired to support were successful," Gradecki said. "Another source of information for the project is OpenAI’s threat reports, which reveal the attempts by state and non-state actors alike to create disinformation campaigns using ChatGPT. These reports reveal how influence operations are integrating generative AI into their workflows. We can see that these campaigns use not only paid services, like ChatGPT, which collect all of the data in your chats, but also local open source LLMs, which helps to provide secrecy for influence campaigns." 

Gradecki and Curry incorporated these real-world examples into a fictional exhibition.

 "As an artwork, Generative Persuasion is fictional, but it is not speculative: we know that influence campaigns use microtargeting and generative AI to quickly create convincing disinformation. It’s important to note here that disinformation can include not only false information, but also truths, half-truths and value-laden judgments. The key thing to keep in mind is that disinformation is used to influence and manipulate; it may cause people to feel strong feelings, like pride, hatred, or outrage, or it may be used to distract or prevent dissent,” Gradecki said. 

Curry said that the goal of the exhibit is to "help viewers develop media, data and AI literacies and to encourage them to be generally skeptical of online content." 

Ashlee Smith, senior director of content and education for WKAR, joined Curry and Gradecki, serving as the panel's moderator.

“I will be guiding conversation with our fantastic panel of experts. We have so much we could cover, it could be a multi-day series, so it’s my job to ensure we’re getting to the most meaningful and impactful topics. It won’t be a hard job because listening to this group is a compelling experience. They have incredible insights from their research that are sure to draw the audience in," Smith said. 

Smith spoke about how having conversations like this during an election year is "extremely poignant," as AI content is on the rise and makes it difficult for people to "distinguish between truth and fiction." Smith has firsthand experience with AI from a media perspective, navigating it daily in her work at WKAR.

"AI is evolving at such a fast pace that things can change from day to day. We will continue monitoring the expanding world of AI, but what is most important in my work in public media is that we can stand behind the fact that our offerings are human, factual, and editorially sound. We endeavor to remain a trusted source of information in a sea of synthetic media," she said. 

Smith hopes that, despite the rapid increase of AI, students are able to recognize the greater implications. 

“AI has become a subject that immediately turns people off, but it is so important to learn about the impact it’s having on society. One of the most important things we can do as citizens and consumers is to be media literate," Smith said. "To understand the effects and implications of AI so that we can seek out truth instead of just accepting what we’re given. I hope that students will engage with this conversation and the panelists, be inspired to learn more, and share information with their friends.” 

International relations and comparative cultures and politics sophomore Claire Urban said that being a student in James Madison College has helped her "think critically" about what she sees and hears in the news. Urban said she has seen AI become increasingly prevalent in politics.

“I think that AI will have a huge impact on elections. I recently read an article in The Washington Post that said about 20 candidates in certain states' primaries have been bought out or influenced by AI companies. On top of that, you hear in the news that OpenAI has partnered with the Department of Defense (DOD), and immediately after that, a campaign was launched against Anthropic, calling it a 'supply chain risk.' As we move into the future, figuring out what AI is and what is not is getting harder as it improves," Urban said. 

Despite AI's increasing use, Urban believes "AI should not have a hand in our elections," and spoke about aspects of politics that AI cannot replace. 

"The one thing I learned most is that AI can never replace diplomacy, nor the face-to-face connections that make up our relationships. Diplomacy is our first line of defense, before weapons and before you send people to fight a war. When we rely on AI for our elections or to inform us about what is going on in the world, we are getting a watered-down version that does not push us to think critically or contain empathy," Urban said. "I think it is important to actually try to understand the real words that politicians are saying, to understand the means by which they are pushing legislation. AI just cannot do that. I wish people knew that we are in a new age of information that is constantly being created and forged, and that you have to fact-check what you see and hear. If you don’t, you can misunderstand conflicts, as well as the ideas and cultures around you.” 

