The Third-Person Effect (TPE), introduced by W. Phillips Davison in 1983, is a communication theory describing the tendency for people to believe that media messages influence others more strongly than themselves. Individuals tend to assume that while they are relatively resistant to media manipulation, other people—particularly those who are socially distant or less informed—are more vulnerable. This perception creates a psychological gap between “self” and “others,” shaping attitudes and behaviors in significant ways.

In the context of elections, the Third-Person Effect becomes particularly relevant. Modern campaigns are no longer limited to policy debates and manifesto-driven outreach; they are increasingly shaped by digital media, targeted advertising, and the circulation of misinformation. Voters often acknowledge the presence of misleading or false information during campaigns, but usually assume that “other voters” are more likely to be misled than themselves. This assumption not only influences how individuals process political messages but also affects their trust in democratic processes and institutions.

Misinformation plays a critical role in this dynamic. When combined with the Third-Person Effect, misinformation creates a dual challenge: while individuals dismiss the idea that they are personally influenced, they simultaneously fear that mass audiences are being manipulated. This perception leads to growing anxiety about the integrity of elections, deepening polarization, and support for stronger regulation of political communication.

The central question, therefore, is not just whether misinformation influences voting decisions, but how the belief that others are more susceptible to it shapes political behavior and democratic trust. Understanding this perception gap is crucial for addressing electoral integrity in the age of misinformation, where both the direct impact of false content and the indirect impact of perceived vulnerability can alter the democratic landscape.

Understanding the Third-Person Effect

The Third-Person Effect (TPE) is a communication theory that suggests people believe media messages affect others more than themselves. In election campaigns, this perception gap creates a powerful psychological dynamic. Voters often assume they are rational and resistant to misinformation, while viewing others as more easily swayed. This belief influences not only personal attitudes toward political messaging but also broader concerns about the fairness of elections, trust in institutions, and calls for stricter regulation of campaign communication. Understanding this effect is key to examining how misinformation reshapes voter perceptions and democratic outcomes.

Origin of the Theory

The Third-Person Effect was first proposed by W. Phillips Davison in 1983. He observed that people generally assume media messages have a more substantial influence on others than on themselves. This perception reflects a psychological bias where individuals see themselves as more rational or less impressionable, while considering others—especially those outside their social group—as more vulnerable to persuasion. In the context of election campaigns, Davison’s idea provides the foundation for understanding why voters often underestimate their own exposure to misinformation while overestimating its impact on the broader electorate.

Davison’s Contribution (1983)

Davison argued that people consistently perceive media as influencing others more strongly than themselves. This assumption reflects a psychological bias: individuals regard their own judgment as less susceptible to persuasion while assuming that others, especially those outside their immediate social circles, are more easily misled.

Core Idea

At the heart of the theory lies the self–other perception gap. Individuals believe they are capable of critically evaluating messages, but they underestimate their own exposure to subtle persuasion. Conversely, they overestimate the effect of media content on distant or less-informed groups. This belief shapes not only personal confidence in media resistance but also attitudes toward censorship, regulation, and political communication.
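Researchers typically quantify this self–other gap as a simple difference score: perceived media influence on others minus perceived influence on oneself, with a positive value indicating a third-person perception. The sketch below illustrates the calculation with fabricated survey ratings; the respondents, the 1–7 scale, and the `perceptual_gap` helper are all invented for illustration, not taken from any actual study.

```python
# Hypothetical survey: each respondent rates perceived media influence on
# themselves and on "other voters" (1 = no influence, 7 = very strong).
# The third-person perceptual gap is the difference: others minus self.

def perceptual_gap(self_rating: float, others_rating: float) -> float:
    """Difference score: presumed influence on others minus on self."""
    return others_rating - self_rating

# Fabricated responses: (influence on self, influence on others)
responses = [(2, 6), (3, 5), (1, 7), (4, 4), (2, 5)]

gaps = [perceptual_gap(s, o) for s, o in responses]
mean_gap = sum(gaps) / len(gaps)

print(f"individual gaps: {gaps}")   # [4, 2, 6, 0, 3]
print(f"mean gap: {mean_gap:.2f}")  # a positive mean signals a third-person perception
```

A mean gap near zero would suggest respondents see themselves and others as equally affected; the larger the positive mean, the stronger the third-person perception in the sample.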

Relevance to Election Campaigns

In the context of election campaigns, Davison’s theory is especially significant. Voters frequently encounter political advertisements, news coverage, and misinformation on digital platforms. While many insist these messages do not alter their own decisions, they often assume that “other voters” are strongly influenced. This perception leads to anxiety about the fairness of elections, distrust of media outlets, and support for restrictions on campaign messaging.

Continuing Influence

Davison’s original idea continues to guide research on media effects, particularly in the study of misinformation, propaganda, and digital campaigning. By identifying the psychological divide between self and others, the Third-Person Effect provides a framework for understanding how voter perceptions are shaped not only by information itself but also by assumptions about its impact on society.

Mechanics of the Effect

The Third-Person Effect operates through a perception gap between how individuals view themselves and how they view others in relation to media influence. People generally believe they are rational and less susceptible to persuasion, while assuming that others are more easily swayed. In election campaigns, this self–other gap shapes political behavior by reinforcing the idea that misinformation will mislead the broader electorate even if one personally resists it. This perception fuels distrust in the democratic process and can influence support for censorship, regulation, and stricter controls on political communication.

The Self–Other Perception Gap

The core of the Third-Person Effect lies in the difference between how people perceive themselves and how they perceive others in relation to media influence. Individuals often assume they are rational and capable of resisting persuasion. At the same time, they believe others, particularly those who appear less informed or socially distant, are more vulnerable to manipulation.

Psychological Function

This perception gap operates as a form of cognitive bias. It provides individuals with a sense of control and self-assurance, reinforcing the idea that they can critically evaluate information. However, this bias overlooks the subtle ways in which media, including misinformation, can still shape their attitudes and decisions.

Implications in Elections

During election campaigns, the mechanics of this effect become highly visible. Voters frequently dismiss the idea that campaign advertisements or false information influence their own choices. Yet they often assume that the broader electorate is strongly affected. This belief shapes how people view the fairness of elections and may increase demands for stricter regulation of political communication.

Broader Consequences

The perception gap not only affects personal confidence but also contributes to political polarization. When individuals believe that “other voters” are misled, they develop suspicion toward rival groups and distrust in democratic processes. As misinformation spreads, the mechanics of the Third-Person Effect amplify concerns about manipulation, even as people fail to acknowledge their own susceptibility.

Psychological Basis

The Third-Person Effect is grounded in several psychological tendencies. At its core is cognitive bias, which leads people to misjudge their own vulnerability to persuasion. This creates an illusion of personal immunity, where individuals believe they can resist misinformation more effectively than others. The effect is further reinforced by social distance, as people assume those outside their immediate group are more gullible. In election campaigns, these psychological patterns explain why voters often dismiss their own exposure to misleading content while exaggerating its influence on the broader electorate.

Cognitive Bias

The Third-Person Effect is rooted in cognitive bias, where individuals consistently misjudge how media influences them compared to others. This bias leads people to overestimate the susceptibility of the general public while underestimating their own. Research in communication studies has repeatedly shown that people interpret messages through filters shaped by their prior beliefs and self-perceptions, reinforcing the idea that others are more easily persuaded.

Illusion of Personal Immunity

Another key factor is the illusion of personal immunity. Many voters believe they are capable of identifying and resisting misinformation, even when subtle messages influence their thinking. This sense of immunity provides reassurance but blinds individuals to the unconscious ways that political messaging affects attitudes, candidate preferences, and even turnout decisions. By underestimating personal vulnerability, voters reduce their awareness of how misinformation shapes their own political behavior.

Social Distance

The perception gap grows wider when social distance is introduced. Individuals assume that those outside their immediate group—different communities, opposing political supporters, or less-educated audiences—are more gullible. This tendency reinforces stereotypes about who is vulnerable to propaganda and misinformation. In elections, such assumptions deepen political polarization by framing rival groups as easily manipulated while casting one’s own group as rational and informed.

Implications for Election Campaigns

These psychological patterns explain why misinformation often generates more anxiety about its impact on others than about its effect on oneself. The combined influence of cognitive bias, personal immunity, and social distance strengthens public calls for stricter regulation of campaign communication while simultaneously undermining trust in the electorate’s ability to make independent decisions.

Misinformation in Election Campaigns

Misinformation has become a defining feature of modern election campaigns, spreading through social media platforms, messaging apps, and targeted digital advertising. False or misleading content often takes the form of manipulated news stories, deepfakes, distorted statistics, and viral memes designed to trigger emotional reactions. Within the framework of the Third-Person Effect, voters typically believe they can identify and resist such content, yet they assume that other voters are more vulnerable. This perception fuels distrust in the electoral process, increases political polarization, and shapes public demand for stronger regulation of campaign communication.

Forms of Misinformation

Misinformation in election campaigns appears in multiple formats, each designed to distort perceptions and influence voter behavior. Fake news spreads fabricated stories disguised as credible reporting, while deepfakes use AI-generated audio or video to depict candidates falsely. Manipulated statistics present selective or misleading data to strengthen partisan claims. Additionally, meme warfare simplifies complex political issues into shareable visuals that reinforce stereotypes and spread rapidly across digital platforms. Together, these forms of misinformation amplify the Third-Person Effect by making voters believe others are more easily deceived, even as they underestimate their own exposure.

Fake News

Fake news refers to fabricated or deliberately misleading stories presented as legitimate journalism. In election campaigns, these stories often spread rapidly across social media platforms and messaging apps, shaping narratives about candidates, parties, or policies. Fake news thrives because it appeals to emotions such as fear, anger, or loyalty, which make voters more likely to share it without verifying its authenticity.

Deepfakes

Deepfakes use artificial intelligence to generate audio or video content that convincingly imitates real individuals. In political campaigns, deepfakes can make a candidate appear to say or do something that never happened. This type of misinformation is particularly dangerous because it exploits the credibility of visual evidence, eroding public trust in authentic recordings and raising doubts even about verified information.

Manipulated Statistics

Statistics carry an impression of authority, which makes them powerful tools for persuasion. Campaigns and partisan groups sometimes manipulate data by presenting selective figures, removing context, or exaggerating trends. For example, unemployment rates or crime figures may be framed in a misleading way to promote a candidate’s agenda or discredit an opponent. Such misuse of numbers can give misinformation a veneer of legitimacy, making it harder for voters to detect bias.

Meme Warfare

Memes distill complex issues into simplified images or slogans designed for rapid online sharing. While they appear lighthearted or humorous, political memes often embed stereotypes, reinforce partisan divides, and spread misinformation. Their viral nature allows them to reach broad audiences quickly, bypassing fact-checking mechanisms. Meme warfare reinforces echo chambers in which voters repeatedly encounter content that confirms their existing beliefs.

Implications for the Third-Person Effect

Each of these forms of misinformation strengthens the perception gap central to the Third-Person Effect. Voters often claim that they can identify fake news, deepfakes, or misleading data, yet assume that others cannot. This belief fosters anxiety about the electorate’s vulnerability, reinforcing distrust in rival groups and deepening polarization in democratic systems.

Digital Platforms and Algorithms

Social media platforms rely on algorithms that prioritize content generating strong engagement, often favoring sensational or emotionally charged material over verified information. This amplification allows false stories, deepfakes, and political memes to spread faster than fact-checked corrections. Within the framework of the Third-Person Effect, voters may believe they are less affected by such content but assume that others are heavily influenced, reinforcing concerns about electoral manipulation and undermining trust in democratic processes.

Algorithmic Amplification

Recommendation systems prioritize posts that generate high engagement, such as likes, shares, and comments. Because misinformation often uses emotional triggers like fear, outrage, or humor, algorithms tend to promote it over balanced or fact-based reporting. As a result, false narratives spread faster and reach broader audiences than corrections or verified news.
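A highly simplified sketch can make engagement-weighted ranking concrete. The weights, posts, and scoring function below are invented for illustration and do not reflect any platform's actual algorithm; the point is only that content optimized for strong reactions outranks sober reporting under engagement-based scoring.

```python
# Toy engagement-weighted ranking (invented weights, not a real platform's).
# Shares are weighted most heavily because they drive further distribution.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0}

def engagement_score(post: dict) -> float:
    """Weighted sum of a post's engagement signals."""
    return sum(WEIGHTS[k] * post[k] for k in WEIGHTS)

# Fabricated posts: calm analysis vs. emotionally charged rumor.
posts = [
    {"title": "Fact-checked policy analysis", "likes": 120, "comments": 10, "shares": 5},
    {"title": "Outrage-bait rumor",           "likes": 300, "comments": 90, "shares": 150},
]

ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in ranked])  # the rumor ranks first
```

Because the score rewards raw engagement rather than accuracy, the emotionally charged post wins the feed position even though the analytical post may be far more reliable.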

Virality and Speed

Misinformation benefits from the speed of digital sharing. Once a piece of content begins to gain traction, algorithms push it into more feeds, creating a cycle of visibility and reinforcement. By the time fact-checking organizations respond, the misinformation has often already reached millions, leaving corrections with limited impact.
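A toy model illustrates this head start. Assuming, purely for illustration, that a false story doubles its audience each sharing cycle while a correction starts four cycles later and grows more slowly, the correction never closes the gap; all growth factors and counts here are invented, not empirical estimates.

```python
# Toy diffusion sketch: multiplicative growth per sharing cycle.
# Numbers are illustrative only.

def reach(initial: int, growth: float, steps: int) -> int:
    """Audience reached after `steps` sharing cycles of multiplicative growth."""
    total = initial
    for _ in range(steps):
        total = int(total * growth)
    return total

story_reach = reach(initial=100, growth=2.0, steps=8)       # false story: 8 cycles
correction_reach = reach(initial=100, growth=1.5, steps=4)  # fact-check: starts 4 cycles late

print(f"story reach: {story_reach}, correction reach: {correction_reach}")
```

Even with generous assumptions for the correction, the false story's compounding head start leaves the fact-check reaching only a small fraction of the audience, matching the pattern described above.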

Personalization and Echo Chambers

Algorithms also personalize content by analyzing user preferences and behaviors. While personalization keeps users engaged, it reinforces echo chambers where individuals see repeated messages that confirm their existing beliefs. In elections, this cycle strengthens partisan divisions and makes voters more confident that their perspective is correct while assuming rival groups are misinformed.

Connection to the Third-Person Effect

The way platforms amplify misinformation intensifies the perception gap central to the Third-Person Effect. Many voters believe they can recognize manipulative content in their feeds, yet they assume that the broader electorate is far more susceptible. This belief fosters distrust in democratic outcomes and increases support for regulatory measures aimed at controlling political communication on social media.

Case Studies

Misinformation in election campaigns has taken different forms across countries, offering clear examples of how the Third-Person Effect operates. In India, WhatsApp forwards spread rumors and manipulated narratives during national and state elections, shaping perceptions of parties and candidates. In the United States, the 2016 and 2020 elections saw extensive misinformation campaigns, including foreign interference and domestic false narratives amplified on social media. In Georgia, post-Soviet disinformation strategies targeted public trust by framing elections as manipulated or externally controlled. Across these contexts, voters often believed they could resist such content but assumed others were highly influenced, reinforcing the self–other perception gap.

India: WhatsApp Forwards and Electoral Influence

In India, WhatsApp has become a central tool in election communication. During national and state campaigns, false narratives spread through forwards that included doctored videos, fabricated quotes, and misleading statistics. These messages often exploited cultural and religious sensitivities, framing political debates in polarizing ways. While many voters claimed they could identify misinformation, they frequently expressed concern that others were being misled. This perception illustrates the Third-Person Effect, where individuals believe their peers and rural voters are more vulnerable to manipulation than themselves.

United States: 2016 and 2020 Elections

The U.S. elections in 2016 and 2020 highlighted the scale of misinformation in advanced democracies. In 2016, investigations revealed foreign influence campaigns that used targeted social media ads, bots, and fake accounts to amplify divisive content. In 2020, domestic misinformation flourished, including conspiracy theories about mail-in voting and claims of election fraud. Many Americans dismissed the idea that these campaigns swayed their own votes but insisted that the broader electorate was manipulated. This reinforced political polarization and reduced trust in electoral outcomes.

Georgia: Post-Soviet Disinformation Campaigns

In Georgia, a post-Soviet state, disinformation has been used repeatedly to undermine democratic processes. Both domestic and foreign actors spread claims that elections were externally controlled or fraudulent. Disinformation campaigns often portrayed Western influence as a threat, aiming to weaken support for democratic reforms. Citizens exposed to these narratives tended to believe that while they personally were not influenced, others in society were vulnerable, fueling skepticism about democratic institutions.

Comparative Insight

These cases show how misinformation takes different forms depending on political and technological contexts, yet the Third-Person Effect remains constant. Across India, the United States, and Georgia, voters underestimated their own exposure to misinformation while exaggerating its impact on others. This self–other perception gap deepens distrust, intensifies polarization, and shapes calls for stronger controls on election communication.

The Role of Political Actors

Political actors play a direct role in spreading and amplifying misinformation during election campaigns. Parties use targeted narratives to discredit opponents or mobilize support. Influencers and partisan voices amplify these narratives on social media, giving them greater reach and credibility among followers. Automated bots are often deployed to artificially boost visibility, making misinformation appear more widespread than it actually is. In addition, paid campaigns strategically promote misleading ads or selective data to shape voter perceptions. Within the framework of the Third-Person Effect, voters often believe they can resist such tactics but assume others are highly influenced, reinforcing distrust in the fairness of elections.

Parties

Political parties often design and circulate misinformation as part of their campaign strategy. They may spread distorted narratives to discredit opponents or present selective achievements to strengthen their image. By shaping the flow of information, parties seek to control voter perceptions and frame the election agenda in their favor.

Influencers

Influencers, including popular social media personalities and partisan commentators, amplify political messages and misinformation. Their credibility with followers gives weight to content, even when it lacks verification. Influencers help misinformation spread faster by combining personal storytelling with political messaging, making it appear more authentic and relatable.

Bots

Automated accounts, or bots, play a significant role in amplifying misleading narratives. Bots can artificially inflate engagement metrics such as likes, shares, and retweets, creating the illusion of widespread support for certain viewpoints. This manufactured visibility pressures users into believing that misinformation reflects the majority opinion.

Paid Campaigns

Paid campaigns use targeted advertisements to promote misinformation to specific audiences. By exploiting user data, political actors can deliver highly tailored messages that reinforce existing biases. These campaigns often include misleading statistics, emotional appeals, or fabricated claims designed to influence undecided voters.

Connection to the Third-Person Effect

The involvement of parties, influencers, bots, and paid campaigns demonstrates how organized efforts shape the spread of misinformation. Within the framework of the Third-Person Effect, voters frequently assume they can resist such strategies but view others as highly susceptible. This perception widens the self–other gap, fueling distrust in election fairness and deepening polarization.

How the Third-Person Effect Shapes Voter Perceptions

The Third-Person Effect influences voter perceptions by creating a belief that others are more vulnerable to misinformation than oneself. This perception gap shapes political attitudes in several ways. It reduces trust in the electorate’s ability to make informed choices, fuels suspicion toward rival groups, and increases support for censorship or stricter regulation of campaign communication. Voters may dismiss the idea that misinformation changes their own views, yet they often fear it misleads the broader public, which can heighten polarization and weaken confidence in democratic processes.

Self–Other Gap in Elections

In elections, the Third-Person Effect is most visible in the belief that others are more susceptible to misinformation than oneself. Voters often claim they can recognize false or misleading content, yet they assume that the broader electorate—especially those from rival groups or less-informed backgrounds—is easily influenced. This self–other gap shapes perceptions of electoral fairness, increases political distrust, and reinforces polarization by framing opponents as manipulated while portraying oneself as rational and resistant.

Perceived Immunity

The self–other gap reflects a common perception that one’s own judgment is less affected by misinformation compared to that of the general public. Voters often view themselves as critical thinkers capable of evaluating campaign messages, while assuming that others lack the same level of awareness. This belief creates a sense of personal immunity and reinforces confidence in one’s ability to resist manipulation.

Assumptions About Others

While individuals dismiss their own vulnerability, they frequently assume that large segments of the electorate, particularly rival supporters or socially distant groups, are easily deceived. For example, a voter may claim that fake news does not affect their own decision but believe that rural voters or those on the opposing side are swayed by it. These assumptions exaggerate the influence of misinformation on others, deepening suspicion and hostility toward competing political groups.

Effects on Electoral Trust

The self–other gap has direct consequences for how people view elections. When voters believe others are more gullible, they question the legitimacy of outcomes and express distrust in the democratic process. This perception leads to concerns that misinformation, rather than informed choice, drives electoral results. As a result, many support stricter controls on campaign communication or censorship, even if they deny being influenced themselves.

Reinforcing Polarization

The belief that others are more susceptible to misinformation also contributes to polarization. Voters perceive their own side as rational and informed, while painting opponents as manipulated. This reinforces group identity and widens divisions, making constructive political dialogue more difficult.

Impact on Trust

The Third-Person Effect contributes to declining trust in core democratic structures. When voters believe others are easily misled by misinformation, they lose confidence in the media’s credibility, question the neutrality of election commissions, and doubt the fairness of electoral outcomes. This erosion of trust extends to democratic institutions as a whole, as citizens begin to suspect that decisions are shaped less by informed choice and more by manipulated narratives. Such perceptions weaken public faith in democracy and fuel calls for stricter oversight of political communication.

Declining Confidence in Media

The spread of misinformation and the belief that others are easily deceived erode confidence in traditional and digital media. Voters who see misinformation circulating assume that news outlets either fail to prevent its spread or actively contribute to it. This perception weakens the role of media as a credible source of information during elections, making voters more skeptical of both reporting and fact-checking efforts.

Questioning Election Commissions

Election commissions are designed to safeguard fairness, but the Third-Person Effect shapes public attitudes toward their credibility. When voters assume false narratives mislead others, they often suspect that election authorities cannot effectively counter manipulation. Allegations of bias or incompetence emerge more easily in such contexts, reducing trust in electoral oversight.

Erosion of Democratic Institutions

Misinformation, combined with the perception that others are more susceptible to it, extends distrust to broader democratic institutions. Courts, legislatures, and regulatory bodies are viewed with suspicion when outcomes do not align with a voter’s expectations. Instead of attributing results to informed decision-making, individuals may blame manipulation or institutional failure.

Broader Consequences

This decline in trust carries profound implications. As voters lose faith in media, election commissions, and democratic institutions, they become more receptive to calls for censorship, regulation, or populist narratives that challenge established systems. The perception that democracy is shaped by misinformation rather than informed choice undermines legitimacy and fuels political polarization.

Behavioral Outcomes

The Third-Person Effect shapes voter behavior in significant ways. Believing that others are more vulnerable to misinformation often leads individuals to develop political cynicism, assuming elections are influenced by manipulation rather than informed choice. This perception can also increase support for censorship or stricter regulations on campaign communication, as voters seek to protect the electorate from perceived threats. In some cases, it influences voting decisions directly, as individuals justify their choices based on the belief that misinformation has already swayed others, reinforcing polarization and distrust in democratic outcomes.

Increased Political Cynicism

When voters assume misinformation influences others more than themselves, they often develop political cynicism. This outlook leads them to believe elections are shaped by manipulation rather than informed decision-making. As cynicism grows, public engagement may decline, with citizens doubting whether their participation has any real impact on democratic outcomes.

Support for Censorship or Stricter Regulations

The perception that others are easily misled also drives support for censorship and stronger regulation of political communication. Voters may call for restrictions on social media, tighter controls on campaign advertising, or direct intervention by election commissions. While such measures are intended to protect democracy, they also raise concerns about free speech and the concentration of power in regulatory bodies.

Voting Decisions Influenced by Defensive Partisanship

The Third-Person Effect can also shape voting behavior directly. Believing that misinformation has already swayed rival groups, voters may respond with defensive partisanship. Instead of evaluating candidates or policies on merit, they may cast their ballots to counteract what they perceive as the manipulated choices of others. This reaction intensifies polarization and shifts elections away from issue-based debates toward identity-driven contests.

Example: Perceived Immunity vs. Vulnerability of Others

A clear example of the Third-Person Effect in elections is when voters claim, “I won’t be fooled, but others will vote wrongly because of fake news.” This statement reflects the belief that one’s own judgment is immune to manipulation while the broader electorate is vulnerable. Such perceptions reinforce distrust in electoral outcomes, as individuals assume misinformation distorts the choices of others, even if they deny being influenced themselves. This mindset strengthens polarization and fuels demands for tighter control over political communication.

Expression of the Third-Person Effect

A common expression of the Third-Person Effect in elections is when individuals assert, “I won’t be fooled, but others will vote wrongly because of fake news.” This statement illustrates the belief that one’s own reasoning is resistant to manipulation, while others are seen as more susceptible. Such declarations reflect the self–other perception gap that lies at the heart of the theory.

Impact on Electoral Confidence

When voters adopt this perspective, they begin to doubt the ability of the broader electorate to make informed decisions. Even if they feel confident in their own choices, they worry that misinformation will distort the behavior of others, which in turn undermines confidence in the legitimacy of election results.

Consequences for Political Behavior

This perception often shapes political behavior. Believing others are misled, individuals may support censorship, stricter regulations, or aggressive fact-checking policies. Some may also justify voting defensively, not only to support their preferred candidate but to counterbalance what they perceive as the manipulated decisions of others.

Reinforcement of Polarization

Finally, this mindset reinforces polarization by dividing voters into two categories: the “rational self” and the “misled others.” This division fuels suspicion toward rival groups, making it harder to sustain trust in democratic debate and electoral processes.

The Third-Person Effect and Political Polarization

The Third-Person Effect intensifies political polarization by reinforcing divisions between groups. Voters often see themselves and their allies as rational while perceiving rival supporters as misled by misinformation. This perception deepens confirmation bias and strengthens echo chambers, where individuals repeatedly encounter content that validates their own views. As a result, the belief that “others are gullible” fosters distrust, hostility, and a sharper divide between political groups, making consensus and constructive dialogue increasingly difficult in democratic systems.

Confirmation Bias and Echo Chambers

When voters assume others are more vulnerable to misinformation, they become more confident in their own positions and less willing to question partisan narratives. Social media algorithms reinforce this effect by repeatedly exposing users to similar content, creating echo chambers. Within these closed environments, misinformation circulates unchecked, and voters perceive their own group as well-informed while seeing others as misled.

Group Identities

Political identity plays a central role in how the Third-Person Effect operates. Voters often frame their own group as rational and resistant to manipulation while portraying rival groups as gullible or misinformed. This division reinforces in-group loyalty and out-group hostility, making compromise less likely. The perception that one’s political allies are informed and opponents are deceived deepens social divides and fosters mistrust across party lines.

Consequences for Democracy

The widening self–other gap intensifies polarization with several consequences for democratic life. First, it encourages tribalism, where voters prioritize group loyalty over issue-based reasoning. Second, it fosters hostility, as rival groups accuse each other of being manipulated rather than engaging in substantive debate. Third, it contributes to the erosion of common facts, since opposing sides no longer agree on what constitutes reliable information. These dynamics undermine shared democratic norms, weaken the possibility of consensus, and leave societies more vulnerable to misinformation campaigns.

Case Studies and Global Perspectives

The influence of the Third-Person Effect can be seen across different democracies, where misinformation interacts with cultural and political contexts in unique ways. In India, WhatsApp forwards have spread rumors and manipulated narratives that shaped voter perceptions during elections. In the United States, misinformation in the 2016 and 2020 elections, from foreign interference to domestic conspiracy theories, fueled polarization and distrust. In Georgia, post-Soviet disinformation campaigns have undermined confidence in democratic reforms by portraying elections as externally controlled or fraudulent. These cases highlight how the self–other perception gap operates globally, reinforcing polarization and weakening trust in democratic processes.

United States: Russian Disinformation and Voter Manipulation

In the United States, the 2016 election exposed how foreign disinformation campaigns targeted voters through social media. Russian-linked networks spread false stories, divisive memes, and misleading ads designed to exploit partisan divides. The 2020 election saw a continuation of these tactics, with added domestic misinformation such as conspiracy theories about mail-in voting and election fraud. Many Americans denied being personally influenced but believed that other voters were misled, illustrating the Third-Person Effect. This perception deepened polarization and eroded trust in the legitimacy of electoral outcomes.

The 2016 Election

The 2016 U.S. presidential election revealed the scope of foreign interference in democratic processes. Russian-backed groups created fake social media accounts, bought targeted political ads, and circulated fabricated news stories. Their strategy relied on exploiting existing divisions within American society, particularly around race, immigration, and political identity. False content often appeared in the form of memes or misleading articles designed to inflame emotions and encourage hostility between groups. Research later confirmed that these campaigns reached millions of Americans, raising questions about the integrity of the electoral process.

The 2020 Election

In 2020, misinformation came not only from foreign actors but also from domestic sources. Conspiracy theories about mail-in voting, unfounded claims of widespread fraud, and manipulated statistics spread widely across platforms such as Facebook, Twitter, and YouTube. These narratives fueled distrust in the electoral system and undermined confidence in the legitimacy of the results, even after courts and election officials confirmed that the count was accurate.

Connection to the Third-Person Effect

Both election cycles highlight the operation of the Third-Person Effect. Many voters insisted they could recognize and dismiss false content, yet they believed other Americans were highly susceptible to it. This assumption deepened polarization, as individuals framed rival supporters as misled by propaganda. The perception that misinformation had a decisive influence on “others” eroded public trust not only in electoral outcomes but also in democratic institutions as a whole.

Broader Implications

The U.S. case shows how misinformation, amplified by both foreign and domestic actors, interacts with the self–other perception gap to intensify political divides. When voters assume misinformation primarily affects others, they grow more cynical about the fairness of elections and more receptive to narratives that challenge democratic legitimacy.

India: WhatsApp Misinformation in 2019 and 2024 Elections

In India, WhatsApp became a powerful channel for spreading misinformation during the 2019 and 2024 elections. False narratives, doctored videos, and communal rumors circulated widely through forwarded messages, often targeting voter identity and group loyalties. These campaigns shaped perceptions of parties and candidates, sometimes escalating social tensions. While many voters claimed they could recognize misinformation, they believed others—mainly rural communities and less media-literate groups—were misled. This illustrates the Third-Person Effect, where individuals underestimate their own vulnerability but assume misinformation decisively influences the broader electorate, fueling distrust in electoral fairness.

The 2019 Election

During the 2019 general election, WhatsApp emerged as one of the most influential tools for political communication. With millions of active users, the platform enabled the rapid spread of misinformation through forwarded messages. False narratives included doctored videos, fabricated quotes, and exaggerated claims about parties and candidates. Many of these messages exploited religious and cultural sentiments, heightening social divisions and reinforcing partisan loyalties. Voters often admitted receiving misleading messages but insisted they could distinguish fact from fiction, while suggesting that rural or less-educated communities were more easily deceived.

The 2024 Election

By 2024, misinformation campaigns on WhatsApp became even more sophisticated. Political actors used large group networks to circulate communal rumors, targeted propaganda, and manipulated statistics. The content often blended factual elements with misleading claims, making it harder for recipients to verify. Despite public awareness campaigns and fact-checking initiatives, misinformation spread widely, shaping debates around economic performance, national security, and minority rights. Again, many voters claimed they resisted such messages, yet they feared others were swayed, reinforcing the self–other gap central to the Third-Person Effect.

Connection to the Third-Person Effect

Both election cycles demonstrate how misinformation thrives on trust within private messaging spaces. Because WhatsApp forwards often come from friends, family, or community members, recipients perceive them as credible. However, the Third-Person Effect leads individuals to downplay their own exposure while exaggerating the vulnerability of others. This perception contributes to a decline in trust in electoral outcomes and intensifies polarization, as groups accuse each other of being manipulated by propaganda.

Broader Implications

The Indian case shows how closed communication networks amplify misinformation while making it difficult to track or regulate. The combination of high digital penetration, cultural diversity, and political mobilization creates fertile ground for such campaigns. When combined with the Third-Person Effect, this environment not only spreads misinformation but also reinforces suspicion, hostility, and division within the electorate.

Georgia/Eastern Europe: Foreign Influence Operations Shaping Voter Trust

Foreign influence operations, often linked to Russia, spread narratives portraying elections as fraudulent, externally controlled, or manipulated by Western powers. These campaigns target public trust in reforms and democratic institutions, framing them as weak or compromised. Within the framework of the Third-Person Effect, citizens often claim they can resist such narratives but assume that others are easily influenced. This perception fuels skepticism toward election outcomes and reinforces political polarization across the region.

Foreign Disinformation Campaigns

In Georgia and across Eastern Europe, foreign actors, particularly Russia, have used disinformation to weaken democratic systems. These operations spread narratives suggesting elections are illegitimate, externally controlled, or manipulated by Western governments. The aim is to reduce faith in democratic reforms and encourage skepticism toward pro-Western political movements.

Strategies of Influence

Tactics include the use of state-backed media, fake online accounts, and coordinated social media campaigns. These efforts amplify divisive issues such as national identity, territorial disputes, and corruption. By framing elections as compromised, disinformation seeks to undermine the credibility of governments and shift public opinion in favor of foreign influence.

The Third-Person Effect in Action

Citizens exposed to these narratives often insist they are not influenced, yet they assume others are more vulnerable. This perception reflects the Third-Person Effect, where individuals downplay their own susceptibility but exaggerate the impact on the broader electorate. As a result, they lose confidence in fair competition and suspect that misinformation has already swayed outcomes.

Consequences for Democratic Trust

The long-term effect of these campaigns is a decline in trust toward democratic institutions, political parties, and electoral commissions. By reinforcing the belief that others are manipulated, disinformation not only polarizes societies but also creates persistent doubts about the legitimacy of elections. This erosion of trust leaves states vulnerable to continued external interference and weakens the foundations of democratic governance.

Comparative Insight: Democracies and the Third-Person Effect

Different democracies respond to misinformation and the Third-Person Effect in varied ways. In the United States, concerns about foreign interference and domestic conspiracy theories have fueled debates on regulation and free speech. In India, encrypted platforms like WhatsApp make misinformation harder to track, prompting fact-checking initiatives and public awareness campaigns. In Georgia and Eastern Europe, foreign disinformation has pushed governments to strengthen counter-propaganda measures while citizens remain divided over trust in reforms. Across these contexts, the self–other perception gap remains constant, as voters consistently believe they are less affected than others, reinforcing polarization and weakening confidence in democratic outcomes.

United States

In the United States, misinformation during the 2016 and 2020 elections sparked debates over the balance between free speech and regulation. Policymakers and platforms introduced measures such as labeling false content, removing coordinated disinformation networks, and improving ad transparency. Despite these efforts, polarization persisted because voters continued to believe others were more easily manipulated, reinforcing the Third-Person Effect.

India

India faces unique challenges due to the popularity of encrypted messaging apps like WhatsApp. During the 2019 and 2024 elections, false narratives spread widely through forwards that often blended social, cultural, and political themes. The government and civil society responded with fact-checking initiatives and public awareness campaigns. Yet, the private nature of these networks made monitoring difficult, and voters often maintained the perception that while they resisted misinformation, others—particularly rural and less digitally literate populations—were influenced.

Georgia and Eastern Europe

In Georgia and other Eastern European states, foreign disinformation operations, particularly from Russia, have shaped public trust in elections. Governments attempted to counter this influence through media literacy programs, monitoring of foreign-backed outlets, and strategic communication campaigns. However, citizens frequently viewed others as the primary targets of propaganda, which reinforced skepticism toward electoral legitimacy and prolonged political polarization.

Shared Patterns

Across these democracies, responses differ in approach but share a common challenge: the persistence of the Third-Person Effect. Voters consistently downplay their own vulnerability to misinformation while exaggerating its impact on others. This perception gap fuels distrust, reduces confidence in democratic outcomes, and complicates regulatory responses.
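In survey research, this perception gap is typically quantified as the difference between a respondent's rating of media influence on others and on themselves; a positive average difference indicates a third-person effect. A minimal sketch with hypothetical ratings on a 1–7 scale:

```python
from statistics import mean

# Hypothetical survey responses (1 = no influence, 7 = strong influence):
# "How much does election misinformation influence you / other voters?"
responses = [
    {"self": 2, "others": 6},
    {"self": 3, "others": 5},
    {"self": 1, "others": 6},
    {"self": 2, "others": 4},
]

def third_person_gap(rows):
    """Mean (others - self) rating; a positive value indicates a TPE."""
    return mean(r["others"] - r["self"] for r in rows)

print(f"Perceived effect on self:   {mean(r['self'] for r in responses):.2f}")
print(f"Perceived effect on others: {mean(r['others'] for r in responses):.2f}")
print(f"Third-person gap:           {third_person_gap(responses):.2f}")
```

The data here are invented for illustration; in published TPE studies the same self/others item pair is administered to large samples and the gap is tested for statistical significance.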

The Role of Media Literacy and Education

Media literacy and education are essential in countering the influence of misinformation and reducing the Third-Person Effect. By teaching citizens how to critically evaluate news, recognize manipulative content, and verify sources, media literacy programs help narrow the gap between perceived self-resilience and assumed vulnerability of others. Public awareness campaigns, fact-checking initiatives, and school-based digital literacy training equip voters with tools to identify misinformation before it spreads. Strengthening these efforts not only improves individual resilience but also builds collective trust in democratic processes by reducing the belief that others are easily misled.

Critical Thinking Programs

Critical thinking education is central to reducing the influence of misinformation. Training programs that emphasize analytical skills, source evaluation, and logical reasoning help citizens recognize manipulative content more effectively. By teaching individuals to question the credibility of information and verify facts before sharing, such programs narrow the gap created by the Third-Person Effect. When people understand their own susceptibility, they become less likely to assume that misinformation only affects others.

Digital Literacy Campaigns

Digital literacy initiatives target the fast-changing information environment where misinformation thrives. Fact-checking platforms, WhatsApp tip lines, and independent media watchdogs provide resources for verifying political claims. Campaigns also raise public awareness of standard techniques used in fake news, such as misleading headlines or selective statistics. These initiatives strengthen individual resilience against false content and encourage responsible online behavior, reducing the unchecked spread of misinformation during elections.

Civic Responsibility

While education and literacy programs enhance detection skills, they must also foster civic responsibility. Encouraging citizens to remain vigilant without creating paranoia is essential. If individuals constantly fear manipulation, trust in democratic processes may erode further. Balanced media literacy emphasizes both awareness and constructive participation, promoting an informed electorate that remains engaged rather than disengaged.

Connection to the Third-Person Effect

Together, these efforts address the self–other gap by making voters more aware of their own vulnerabilities. When people acknowledge that misinformation can influence them as well as others, they become less cynical about the electorate and more confident in collective decision-making. Media literacy and education, therefore, are key to countering misinformation while restoring trust in elections.

Policy and Regulatory Implications

The Third-Person Effect shapes public demand for stronger regulation of election communication, as voters often believe others are more vulnerable to misinformation. This perception leads to support for stricter rules on political advertising, greater accountability for social media platforms, and more vigorous oversight by election authorities. While such measures aim to protect democratic integrity, they also raise questions about censorship and the balance between free expression and regulation. Addressing this challenge requires transparent policies, independent monitoring, and safeguards that limit misinformation without undermining democratic freedoms.

Debate on Censorship vs. Free Speech

Governments face the challenge of addressing misinformation while protecting free expression. On one hand, stricter controls can reduce the spread of false narratives and protect voters from manipulation. On the other, censorship risks silencing legitimate debate and concentrating power in the hands of authorities. The Third-Person Effect intensifies this debate, as voters often call for tighter restrictions not because they feel personally influenced, but because they assume others are more vulnerable. Finding a balance between regulation and liberty requires transparent processes, judicial oversight, and clear safeguards against abuse.

Platform Responsibility

Major platform companies such as Meta, X, and Google play a central role in regulating election-related content. Each has adopted policies to detect and remove false claims, flag manipulated media, and restrict paid political ads that spread misinformation. However, enforcement remains inconsistent, and critics argue that platforms prioritize engagement over accuracy. The Third-Person Effect fuels public demand for stronger accountability, as voters perceive that the unchecked amplification of false content misleads others. Platforms, therefore, face pressure to strengthen their role as gatekeepers of information while avoiding charges of political bias.

Possible Reforms

Several reforms have been proposed to address misinformation in election campaigns:

  • Stricter political ad disclosures to ensure transparency about funding sources and targeting criteria.
  • Real-time fact-checking mechanisms to correct false claims before they reach mass audiences.
  • Independent election watchdogs to monitor campaign content and hold parties and platforms accountable.

These measures aim to safeguard democratic integrity without undermining open debate. Yet they also highlight the tension between protecting voters from manipulation and preserving the freedoms that define democratic societies.

Future of Election Campaigns in the Age of AI

Artificial intelligence is transforming election campaigns, creating both new risks and opportunities. Tools such as generative AI enable the production of deepfakes, synthetic news, and micro-targeted propaganda that can spread misinformation at unprecedented speed. At the same time, AI-driven fact-checking and content moderation offer potential solutions to detect and counter false information in real time. Within the framework of the Third-Person Effect, voters may continue to believe they are resistant to AI-generated misinformation while assuming others are more easily deceived, reinforcing distrust and polarization in democratic processes.

Generative AI and Deepfakes: The Next Frontier of Misinformation

Generative AI has introduced powerful tools for creating deepfakes—highly realistic but fabricated audio, video, and images. In election campaigns, these technologies can falsely depict candidates making statements or engaging in actions that never occurred, making misinformation more convincing and more challenging to detect. While many voters believe they can identify such manipulations, they often assume others cannot, reinforcing the Third-Person Effect. This perception heightens anxiety about electoral integrity and deepens polarization, as political groups accuse rivals of being swayed by fabricated content.

Rise of Synthetic Media

Generative AI technologies can now create highly realistic audio, video, and images at low cost. These tools enable the fabrication of events that never occurred, such as candidates delivering false speeches or being placed in misleading scenarios. Because deepfakes appear authentic, they challenge voters’ ability to distinguish genuine information from manipulated content.

Application in Election Campaigns

During elections, deepfakes can be weaponized to discredit opponents, spread fabricated scandals, or reinforce existing stereotypes. Unlike traditional misinformation, which often relies on text or images, AI-generated media carries a higher level of credibility because it mimics real-world evidence. This makes it more persuasive and harder for fact-checkers to counter in time.

Interaction with the Third-Person Effect

The Third-Person Effect intensifies the dangers of deepfakes. Many voters believe they can identify manipulated media but assume others are unable to do so. This belief fuels distrust in the electorate, creating anxiety about election outcomes even when exposure to deepfakes is limited. As a result, people often call for stricter regulation of digital platforms and stronger verification systems.

Democratic Consequences

The proliferation of deepfakes risks eroding trust in authentic media as well. When voters question the legitimacy of any video or recording, it creates an environment where genuine evidence can be dismissed as fake. This phenomenon, often called the “liar’s dividend,” weakens accountability and gives political actors greater freedom to deny verified wrongdoing. Combined with the Third-Person Effect, the spread of deepfakes not only misleads but also amplifies suspicion, polarization, and the perception that democracy itself is vulnerable to manipulation.

AI Fact-Checking: Automated Detection of Manipulated Content

AI-powered fact-checking systems are increasingly used to detect misinformation, including deepfakes and synthetic media, in real time. These tools analyze text, images, and videos to flag manipulation and alert platforms or users before false content spreads widely. While such systems offer a promising safeguard, their effectiveness depends on transparency, accuracy, and timely response. Within the framework of the Third-Person Effect, voters may trust their ability to resist misinformation but assume others rely heavily on fact-checking to avoid being deceived. This perception underscores the need for reliable AI-driven verification to protect electoral integrity.

How AI Fact-Checking Works

AI fact-checking systems use machine learning to analyze text, images, audio, and video for signs of manipulation. These tools can identify inconsistencies in speech patterns, detect alterations in visuals, and cross-check claims against verified data sources. When integrated with social media platforms, AI can flag suspicious content before it reaches large audiences, slowing the spread of misinformation.
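The cross-checking step can be illustrated with a toy example. The sketch below matches an incoming claim against a small hypothetical store of claims already debunked by human fact-checkers, using simple word overlap; production systems use learned sentence embeddings and forensic models for audio and video, but the flag-before-it-spreads flow is the same.

```python
def tokens(text: str) -> set[str]:
    return set(text.lower().split())

# Hypothetical store of claims already verified as false by human fact-checkers.
VERIFIED_FALSE = [
    "mail-in ballots were counted twice in several states",
    "voting machines switched votes between candidates",
]

def flag_claim(claim: str, threshold: float = 0.4) -> bool:
    """Flag a claim if it closely matches a known-false claim.

    Jaccard word similarity stands in for the semantic matching a
    real system would do with sentence embeddings.
    """
    c = tokens(claim)
    for known in VERIFIED_FALSE:
        k = tokens(known)
        if len(c & k) / len(c | k) >= threshold:
            return True
    return False

print(flag_claim("officials say mail-in ballots were counted twice"))  # flagged
print(flag_claim("turnout reached a record high this year"))           # not flagged
```

The threshold and claim store here are illustrative; a real deployment would tune matching against labeled data and route borderline cases to human reviewers rather than flagging automatically.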

Benefits for Elections

During election campaigns, AI-driven detection helps address the speed at which false narratives spread. Automated systems provide real-time analysis, giving voters and platforms timely warnings about misleading material. This immediacy makes them more effective than traditional fact-checking, which often lags behind viral misinformation.

Challenges and Limitations

Despite its potential, AI fact-checking faces limitations. Algorithms can produce false positives (mislabeling legitimate content as manipulated) or miss sophisticated deepfakes. Transparency is another concern, as voters and regulators often lack clarity on how platforms train and apply these systems. Overreliance on AI detection without human oversight may also reduce accountability, as errors in labeling could shape public opinion unfairly.

Connection to the Third-Person Effect

AI fact-checking interacts with the Third-Person Effect in meaningful ways. Many voters believe they can spot misinformation independently, but they assume others rely heavily on fact-checking tools to avoid deception. This perception reinforces the idea that the electorate is vulnerable, even if individuals deny personal influence. Reliable and transparent AI detection can help narrow this perception gap by improving collective confidence in electoral communication.

Will the Third-Person Effect Intensify?

As misinformation tools grow more sophisticated with generative AI, deepfakes, and micro-targeted propaganda, the Third-Person Effect may become stronger. Voters are likely to continue believing that they can resist manipulation while assuming others are more vulnerable to it. This widening perception gap could deepen distrust in electoral outcomes, increase support for stricter controls on political communication, and intensify polarization, as each group views the other as more easily deceived by false information.

Growing Sophistication of Misinformation Tools

Advances in generative AI and deepfake technology are making misinformation more convincing and more challenging to detect. Campaigns can now deploy synthetic videos, hyper-targeted advertisements, and automated propaganda at scale. These tools blur the line between authentic and fabricated content, challenging voters’ ability to assess credibility.

Expanding the Perception Gap

As misinformation becomes more sophisticated, the perception gap at the center of the Third-Person Effect is likely to grow. Individuals may continue to believe they are capable of recognizing false content while assuming others cannot. This belief reinforces the notion that elections are determined not by informed choice but by the manipulation of a misled electorate.

Effects on Democratic Trust

An intensifying Third-Person Effect could deepen voter cynicism and undermine faith in democratic outcomes. When people assume others are constantly deceived, they begin to question the fairness of elections and the legitimacy of institutions overseeing them. This mindset may also increase support for censorship, stricter regulations, or state intervention in digital communication.

Polarization and Defensive Behavior

The widening perception gap can fuel polarization by framing political opponents as gullible and manipulated. Voters may respond with defensive partisanship, casting their ballots not only in support of their preferred candidate but also to counteract what they see as the misinformed decisions of others. This behavior shifts elections further away from policy-driven debates toward identity-driven competition.

Conclusion

The Third-Person Effect provides a critical lens for understanding how misinformation shapes voter perceptions and influences democratic processes. While false or misleading content directly affects decision-making, its indirect effects are equally significant. Voters often perceive themselves as resistant to manipulation yet assume that others are easily swayed. This perception gap not only undermines trust in the electorate but also reshapes political attitudes, deepens polarization, and fuels skepticism about the legitimacy of elections.

This dual reality is central to the challenge of misinformation. On one hand, misinformation changes political behavior by altering beliefs, preferences, and voter choices. On the other, the widespread belief that "others are misled" creates cynicism, justifies defensive partisanship, and strengthens demands for censorship or stricter regulations. Together, these dynamics erode confidence in democratic institutions, reduce the quality of public debate, and make consensus-building more difficult.

Media literacy and education can help citizens recognize their own vulnerabilities while equipping them to identify and challenge misinformation. Institutional trust must be rebuilt through transparency, accountability, and consistent enforcement of electoral rules. Finally, regulatory frameworks need to balance free expression with safeguards that prevent manipulation, ensuring that platforms, political actors, and governments share responsibility for protecting democratic integrity.

The Third-Person Effect reminds us that misinformation is not just about what people believe, but also about what they assume others believe. Recognizing and addressing this perception gap is essential for strengthening electoral integrity and sustaining trust in democracy.

The Third-Person Effect in Election Campaigns: How Misinformation Shapes Voter Perceptions – FAQs

What Is the Third-Person Effect in Communication Theory?

The Third-Person Effect is the idea that people believe media messages affect others more than themselves, creating a perception gap between “self” and “others.”

Who Introduced the Concept of the Third-Person Effect?

The theory was first proposed by W. Phillips Davison in 1983.

How Does the Third-Person Effect Apply to Election Campaigns?

In elections, voters often assume they can resist misinformation but believe others are easily misled, which influences their political attitudes and trust in outcomes.

What Psychological Biases Explain the Third-Person Effect?

It is linked to cognitive bias, the illusion of personal immunity, and the perception that socially distant groups are more vulnerable.

What Forms of Misinformation Are Most Common in Election Campaigns?

Key forms include fake news, deepfakes, manipulated statistics, and meme-based propaganda.

How Do Digital Platforms Amplify Misinformation?

Algorithms prioritize sensational and emotionally charged content, which spreads false information faster than fact-checked corrections.

What Role Do Political Actors Play in Spreading Misinformation?

Parties, influencers, bots, and paid campaigns deliberately create and amplify false narratives to shape voter perceptions.

How Does the Third-Person Effect Shape Voter Trust?

It reduces trust in media, election commissions, and democratic institutions by fostering the belief that others are constantly misled.

What Behavioral Outcomes Result From the Third-Person Effect?

It contributes to political cynicism, support for censorship, and voting decisions influenced by defensive partisanship.

Can You Give an Example of the Third-Person Effect in Elections?

A typical example is when voters say, “I won’t be fooled, but others will vote wrongly because of fake news.”

How Does the Third-Person Effect Contribute to Polarization?

It reinforces echo chambers, strengthens group identities, and promotes tribalism by framing one’s own group as rational and others as gullible.

How Did Misinformation Influence the 2016 and 2020 U.S. Elections?

Foreign and domestic campaigns spread disinformation through social media, leading voters to suspect others were manipulated, which intensified polarization.

What Role Did WhatsApp Play in India’s 2019 and 2024 Elections?

False narratives, communal rumors, and doctored videos circulated widely on WhatsApp, shaping voter perceptions while reinforcing the belief that rural voters were more vulnerable.

How Have Foreign Influence Operations Affected Elections in Georgia and Eastern Europe?

Disinformation campaigns, often linked to Russia, portrayed elections as fraudulent or externally controlled, eroding public trust in democratic reforms.

Do Different Democracies Respond Differently to Misinformation?

Yes. The U.S. focuses on free speech debates, India emphasizes fact-checking and awareness campaigns, and Eastern Europe invests in counter-propaganda and media literacy programs.

How Can Media Literacy Reduce the Third-Person Effect?

By training citizens to recognize their own susceptibility and critically evaluate information, media literacy narrows the self–other perception gap.

What Role Do Digital Literacy Campaigns Play in Elections?

They provide fact-checking tools, public awareness initiatives, and watchdog oversight to help voters verify information and resist manipulation.

What Policy Reforms Could Help Reduce Misinformation in Elections?

Possible reforms include stricter political ad disclosures, real-time fact-checking, and independent watchdogs monitoring campaign content.

How Does Artificial Intelligence Affect the Future of Misinformation?

Generative AI enables deepfakes and synthetic propaganda, but it also powers AI-driven fact-checking tools to detect and counter false content.

Will the Third-Person Effect Intensify With AI-Driven Misinformation?

Yes. As misinformation becomes harder to detect, voters may continue to believe they can resist it while assuming others are more vulnerable, widening the perception gap.

Published On: August 24th, 2025 / Categories: Political Marketing /
