The Minnesota Daily

Published April 19, 2024

Opinion: The ethical implications of misinformation and deepfakes

More should be done to address the impacts of widespread misinformation.
Image by Sarah Mai

The scale of misinformation became evident during the 2016 and 2020 U.S. presidential elections, when it fueled polarization that affected global economic and political environments. One-fifth of users at the far ideological extremes shared nearly half of the fake news. Because of the internet, we are flooded with events and news in real time as they break. Celebrities and politicians disseminate opinions on social media without fact-checking, and those opinions are aggregated and spread regardless of their accuracy. Consumers are drawn to headlines and quick summaries that provoke reactions and clicks. Information no longer undergoes the validation process used for newspapers and books. Similarly, we used to believe video told a story rooted in truth; deepfake technology now allows sophisticated manipulation of digital imagery.

Stakeholders include news/media companies, consumers, political/celebrity influencers and governments. News/media companies print reactionary headlines to support their business models, yet most adults (76%) say news companies have an obligation to identify misinformation. Consumers who do not have time to fact-check are susceptible to misinformation. Those at the left and right ends of the spectrum, who comprise 20% of the public, have a substantial impact on the political process: they are the most likely to vote, donate to campaigns and participate in politics. Political/celebrity influencers have been called out for spreading misinformation and are also targets of deepfakes that can cause irreparable damage. For example, deepfake pornography websites use influencers’ likenesses to attract customers. Governments have a difficult role because deepfake technology is in its infancy, and legislation needs to address misinformation and deepfakes to protect those impacted.

News/media companies have a right to sell truthful news. From a justice perspective, they should aim for restorative justice for those deceived or radicalized by misinformation. From a utilitarian perspective, they should not publish misinformation. From a duty perspective, they have a duty to report the truth. From a virtue perspective, news/media companies need to be held accountable for publishing clickbait with dishonest titles or misinformation.

Consumers have the right to make their own choices about what to believe. From a justice perspective, consumers should treat everyone fairly, regardless of differences in opinion. From a utilitarian perspective, consumers should examine their own views and determine whether they are creating a positive net change in society. From a duty perspective, consumers have a duty not to lie or perpetuate lies from other sources. From a virtue perspective, consumers must always act honestly, even when the truth does not support their own agenda.

Politicians/celebrity influencers have the right to tweet or talk about whatever they want. For deepfakes, influencers deserve the right to not have their likenesses used without their consent. From a justice perspective, influencers should be treated the same way as all citizens, including facing consequences for perpetuating misinformation; for deepfakes, they deserve equal treatment to help counter the negative impacts. From a utilitarian perspective, influencers should examine their own views and determine whether they are creating a positive net change in society. From a duty perspective, influencers have a duty to be good role models and not exploit their power and influence. From a virtue perspective, influencers should act compassionately and honestly toward their fans and followers.

The U.S. government must respect citizens’ rights to pursue their own choices unless those choices significantly harm others. Deepfake sex videos are a new form of sexual privacy invasion. Sexual privacy serves an invaluable function in society: it facilitates identity development, intimacy and equality. The violation of these ethical and legal values makes a compelling case for addressing deepfake pornography with legislation. From a justice perspective, governments need fairly elected, unbiased legislatures that create corrective, retributive and restorative justice for those impacted by deepfakes. From a utilitarian perspective, governments need to consider the actions they take and how those actions impact society. From a duty perspective, governments have a duty to be truthful to their constituents and to protect them from misinformation and deepfakes. From a virtue perspective, governments must be honest with their constituents to be virtuous and good.

Misinformation and deepfakes contribute to political divisiveness by fostering mistrust in news sources. Deepfakes have profound negative consequences for democracies: deepfaked news reports target political opponents, portray fake events and sway elections. They erode trust in political institutions and deepen divisions among social groups. While some misinformation (such as parody) is not harmful, the government must take action to stop deepfake videos. The full impact of misinformation and deepfakes is not yet known, which is why it is important to remember the stakeholders most impacted; doing so allows us to look at the issue from different ethical perspectives and find the best solution.


Chase Fingerson is a senior at the University of Minnesota and an aspiring writer. Please reach out to connect on LinkedIn.
