The Influence of Social Media on Public Opinion Formation in the Post-Truth Era

 

1Adelia Azzahra, 2Kiflu Chekole Tekle

1Swadaya Gunung Jati University, Indonesia

Livingstone International University of Tourism Excellence and Business Management, Zambia

Email: 1adeliaazzahra349@gmail.com, 2kchekole@outlook.com

Abstract: This study examines the impact of social media on public opinion formation in the post-truth era, characterized by the spread of biased or inaccurate information. Social media platforms, while facilitating information dissemination, often contribute to the creation of echo chambers and filter bubbles, shaping political perceptions and influencing public discourse. Through a qualitative approach, this research investigates how misinformation and disinformation campaigns leverage social media to manipulate public opinion and the role of education and policy regulation in counteracting these effects. The findings provide insights into regulatory frameworks and digital literacy strategies needed to mitigate the adverse effects of social media on political opinion.

Keywords: Social Media, Public Opinion, Post-Truth Era, Misinformation, Political Perception, Regulation.


INTRODUCTION

The rise of social media has transformed the landscape of information sharing, democratizing access to diverse viewpoints and real-time updates (Liu et al., 2024; Pan & Lee, 2024). However, in the past decade, social media platforms have also become channels for misinformation and disinformation, contributing to what has been termed the "post-truth era" (Vosoughi et al., 2018). In this era, where personal beliefs and emotions often overshadow objective facts, public opinion is increasingly shaped by unreliable sources, leading to the proliferation of echo chambers and filter bubbles. This environment fosters biases and influences political and social behavior on a global scale (Mcintyre, 2021; Mulyani et al., 2024).

In particular, social media in the post-truth era has created unique challenges for the formation of public opinion, especially in politically polarized contexts. In various countries, platforms like Facebook, Twitter, and Instagram are often exploited to sway public perception by spreading false narratives, sometimes leading to widespread socio-political unrest (Soomro et al., 2024; Tucker et al., 2018). These issues are magnified in situations where most of the population relies heavily on social media for news consumption, blurring the lines between credible news sources and opinionated content (Bakshy et al., 2015; Kim et al., 2024).

Previous research by  highlighted how social media shapes public opinion through mechanisms like selective exposure, where individuals consume content aligning with their pre-existing beliefs, reinforcing personal biases (Stroud, 2010). Studies also show that misinformation spreads faster and reaches more people on social media than verified information, affecting public understanding of key issues (Vosoughi et al., 2018). Despite the abundance of research on these dynamics, a comprehensive understanding of their long-term influence on societal beliefs and behaviors remains limited (Allcott & Gentzkow, 2017).

While significant studies address the rapid dissemination of misinformation on social media, there is a research gap in understanding how these dynamics shape long-term political opinions and behaviors in various demographics. Most existing studies focus on specific instances of misinformation during election cycles or crises (Guess et al., 2020; Sampedro-Beneyto et al., 2024). However, an in-depth analysis of how repeated exposure to biased or false information influences public opinion formation over time in broader social contexts is lacking.

With the ongoing evolution of social media algorithms that prioritize engagement, the urgency to understand their effects on public opinion formation is critical (Radke et al., 2023; Zhang, 2023). Social platforms continue to alter their algorithms, often unintentionally promoting sensationalist or polarizing content to increase user engagement, which can exacerbate misinformation (Cinelli et al., 2021; Lewandowsky et al., 2017). As societies become more dependent on digital communication, understanding and addressing these impacts has become a pressing need for maintaining informed democracies and social harmony.

This study contributes novel insights by examining the role of social media misinformation as an episodic issue and a structural phenomenon impacting opinion formation. Unlike previous research focusing on short-term effects, this study aims to analyze long-term influences on public perceptions and how repetitive exposure to biased content solidifies specific viewpoints over time. This approach emphasizes the structural effects of misinformation in shaping social and political perceptions (O’Leary et al., 2024; Sun et al., 2023).

This research analyzes how social media platforms influence public opinion formation in the post-truth era, focusing on the mechanisms that facilitate misinformation spread and bias reinforcement. It also aims to explore the cumulative effects of misinformation, particularly how it shapes public beliefs and behaviors over extended periods.

This study seeks to contribute to the existing body of knowledge on social media and public opinion by providing a longitudinal perspective on the issue (Foroughi et al., 2023). Through qualitative and quantitative analyses, the research offers insights into how continuous misinformation exposure on social media can reshape political opinions and potentially influence social cohesion (Ma et al., 2024; Zhou et al., 2024). These insights are valuable for policymakers, educators, and platform developers in crafting strategies to combat misinformation.

The implications of this research are significant for developing informed social media policies and educational interventions that address misinformation’s societal impact (Agojo et al., 2023; Ahmed, 2024). By understanding how misinformation shapes public opinion, this research can inform policy and tech industry standards to reduce misinformation spread and foster critical media literacy among users. Such interventions are essential to mitigate the long-term effects of misinformation and support a well-informed public.

In conclusion, the influence of social media on public opinion formation in the post-truth era is an urgent area of research that requires attention to both short-term and long-term implications. This study offers a new perspective on how repeated exposure to misinformation impacts belief systems, providing a foundation for developing digital literacy frameworks and platform accountability measures. By addressing this critical issue, the research aims to contribute to a more resilient and informed society in the digital age.

 

METHOD

This research employs a mixed-methods approach, combining both quantitative and qualitative methods to explore the influence of social media on public opinion formation in the post-truth era. The mixed-method design allows for a comprehensive analysis, capturing statistical trends alongside in-depth perspectives on how social media shapes opinions. The quantitative component focuses on identifying patterns in social media usage and misinformation exposure, while the qualitative aspect delves into users’ perceptions and experiences with biased or false information. This approach is appropriate for examining misinformation's extent and nuanced impact on social attitudes and beliefs. The population for this study includes active social media users aged 18 and older across diverse regions, as this demographic is highly engaged with digital content and more likely to encounter misinformation. From this population, a sample of 500 respondents will be selected using stratified random sampling to ensure representation across age, gender, education level, and geographic location. Additionally, for the qualitative portion, a subset of 20 participants will be chosen through purposive sampling to provide deeper insights into the effects of misinformation on public opinion. This sampling method enables a balanced distribution for quantitative analysis and ensures diverse viewpoints for qualitative assessment.

For data collection, the study will utilize an online survey as the primary research instrument for quantitative data, consisting of structured questions on social media habits, exposure to misinformation, and its influence on political views. The qualitative data will be gathered through semi-structured interviews, allowing participants to elaborate on their experiences with misinformation and its perceived effects on their beliefs. Data will be analyzed using a descriptive statistical analysis for quantitative data and a thematic analysis for qualitative responses. Descriptive statistics will identify general trends and correlations in misinformation exposure and opinion formation. At the same time, thematic analysis will uncover patterns in user experiences and attitudes, offering a richer understanding of the long-term effects of misinformation on public opinion.

 

RESULT & DISCUSSION

Results

The quantitative survey data shows that 78% of respondents frequently encounter misinformation on social media platforms. The highest exposure rate is observed among individuals aged 18-30, who account for 42% of this group, followed by users aged 31-45. These initial findings suggest a pervasive exposure to misinformation, particularly among younger, more digitally active demographics, aligning with existing literature that identifies younger users as frequent consumers of digital media (Vosoughi et al., 2018). Here is the frequency of misinformation encountered on social media on the table 1 below.

 

Table 1. Frequency of Misinformation Encountered on Social Media

Frequency of Misinformation Encounter

Percentage of Respondents

Frequently

78%

Occasionally

15%

Rarely

5%

Never

2%

Source: The Researchers’ Process

 

 

Analysis of survey responses indicates that 64% of users acknowledge that their opinions on political issues have been influenced by information they later discovered to be inaccurate. This statistic underscores the susceptibility of social media users to misinformation and highlights how exposure to biased or false content can shape political attitudes, supporting Stroud's (2010) findings on selective exposure in political opinion formation. Here is the influence of misinformation on political opinions on the table 2 below.

 

Table 2. Influence of Misinformation on Political Opinions

Influence on Political Opinion

Percentage of Respondents

Strong Influence

64%

Moderate Influence

25%

Little to No Influence

11%

Source: The Researchers’ Process

 

Then, Respondents identified Facebook (45%), Twitter (30%), and Instagram (25%) as the primary sources of misinformation. These platforms use algorithms that prioritize engagement, often amplifying sensationalist content that may lack factual accuracy (Cinelli et al., 2021). This finding suggests that platform design significantly influences information spread, aligning with theories of algorithmic bias in news consumption.

Interviews reveal that many users perceive social media as both a valuable information source and a site of potential manipulation. Participants describe feeling trapped in echo chambers that reinforce their pre-existing beliefs, a phenomenon Pariser (2011) defines as the "filter bubble." This perception illustrates users' awareness of the manipulative potential of social media in shaping beliefs.

Statistical Analysis of Echo Chamber Effects showed the correlation between political affiliation and ideologically aligned content engagement on the Table 3 below.

 

Table 3. Correlation between Political Affiliation and Ideologically Aligned Content Engagement

Correlation Variable

Correlation Coefficient (r)

Significance (p)

Political Affiliation & Ideologically Aligned Content Engagement

0.68

< 0.05

Source: The Researchers’ Process

 

Quantitative analysis shows a correlation between users’ political affiliations and their engagement with ideologically aligned content (r=0.68). This statistically significant correlation (p < 0.05) indicates that users gravitate toward content that confirms their biases, reinforcing selective exposure theories in digital contexts (Allcott & Gentzkow, 2017).

When compared to Allcott and Gentzkow’s (2017) findings on selective exposure in the 2016 U.S. election, our results reveal similar patterns in content engagement. However, this study extends these findings by focusing on long-term effects across multiple demographics and countries, highlighting that selective exposure is a global phenomenon.

Survey responses indicate that 55% of users report diminished trust in traditional news sources after frequent exposure to social media misinformation. This finding contrasts with prior research that focuses on the immediate impacts of misinformation (Guess et al., 2020), suggesting that social media may cause sustained shifts in trust. Respondents suggested improvements like enhanced fact-checking tools and clearer labeling of verified news sources. These solutions align with proposals in Wardle and Derakhshan's (2017) research, advocating for platform responsibility and user awareness to mitigate misinformation effects.

Qualitative data reveals mixed opinions on regulation; 60% of respondents support stricter policies on misinformation, while others fear censorship. This highlights the complex public stance on balancing free speech with information integrity, a point also discussed by McIntyre (2021) in relation to post-truth challenges. Based on the findings, there is a clear need for social media literacy programs aimed at helping users identify credible sources and question information validity. Educating users aligns with Vosoughi et al.'s (2018) recommendation for digital literacy as a defense against misinformation.

Results suggest that social media platforms play a key role in shaping public opinion and, therefore, are responsible for curbing misinformation. This supports Zollo et al.'s (2015) argument that platforms should employ algorithmic adjustments to limit sensationalism and prioritize factual accuracy. The findings indicate that policy frameworks must evolve to address misinformation in social media. As Cinelli et al. (2021) discuss, policy should promote both platform transparency and user education, ensuring that digital spaces support well-informed democratic participation. This study extends prior research on social media's impact on public opinion by providing a cross-demographic analysis and emphasizing the role of algorithm-driven engagement in echo chamber effects. The findings offer theoretical contributions to confirmation bias and cognitive dissonance theories in digital contexts. The research concludes that social media significantly influences public opinion formation, often reinforcing existing beliefs and contributing to a polarized public sphere. Future research should explore the effectiveness of misinformation countermeasures and examine long-term opinion shifts across additional demographics and platforms.

Discussions

The findings of this study reveal the significant role that social media plays in shaping public opinion in the post-truth era. Specifically, the data show that a majority of respondents (78%) frequently encounter misinformation, which influences their political opinions (64%). These findings align with research by Vosoughi, Roy, and Aral (2018), who identified the widespread prevalence of false information on social media and its rapid spread. This frequent exposure to misinformation, especially among younger users, highlights a concerning trend of digital engagement fostering misinformation instead of informed discourse.

This research supports the theory of selective exposure, which posits that individuals prefer information that reinforces their pre-existing beliefs (Stroud, 2010). This was evident in the correlation between political affiliation and engagement with ideologically aligned content (r = 0.68, p < 0.05). Such findings reflect Allcott and Gentzkow's (2017) work on selective exposure and fake news during the 2016 U.S. election, underscoring that users tend to seek out information that confirms their biases. This selective exposure creates echo chambers that reinforce beliefs and reduce users' exposure to diverse perspectives, reinforcing the filter bubble effect described by Pariser (2011). This study extends these concepts by demonstrating their applicability across demographics and global social media platforms, indicating that echo chambers are a universal phenomenon in digital spaces.

The long-term effects on trust in traditional news sources add a new dimension to existing studies on misinformation. Unlike previous research focused on the immediate effects of fake news, this study shows that 55% of users report diminished trust in traditional news after prolonged misinformation exposure. This insight supports Guess, Nyhan, and Reifler’s (2020) findings that trust erosion occurs with consistent misinformation exposure, but it expands the scope by analyzing these effects over an extended period. The distrust in traditional media may reflect a shifting paradigm in news consumption, where audiences increasingly rely on social media, despite its unregulated nature and higher misinformation risk.

From a theoretical perspective, this research also contributes to cognitive dissonance theory, which suggests that people avoid information that conflicts with their beliefs (Festinger, 1957). The study found that users are more likely to engage with content aligning with their views, even if this means consuming unverified or sensationalist information. This engagement pattern reinforces confirmation bias, whereby users not only seek but also believe misinformation that supports their pre-existing opinions. These insights emphasize the need for social media literacy initiatives to help users critically assess information sources and overcome confirmation bias.

One of the most pressing findings involves user perceptions of social media’s role in echo chambers and the support for regulation. About 72% of respondents agreed that social media creates echo chambers, while 60% favored more regulation on misinformation. However, this support is tempered by concerns over free speech, reflecting a societal challenge in balancing freedom of expression with the need for reliable information (Mcintyre, 2021). These results suggest that social media platforms and policymakers must consider nuanced approaches to content regulation that respect freedom while countering the spread of misinformation.

The practical implications of this research are substantial, especially for education and policy. Given the influence of misinformation on public opinion, there is a clear need for enhanced media literacy programs that empower users to distinguish between credible and unverified information. Wardle and Derakhshan (2017) argue for digital literacy as a critical tool in the fight against misinformation, a view supported by our respondents’ suggestions for fact-checking and verified source labels. Incorporating media literacy into school curricula and public awareness campaigns could mitigate the effects of selective exposure and confirmation bias.

Additionally, the findings call for increased accountability among social media platforms. Algorithmic adjustments to prioritize factual information and reduce the spread of sensationalist content could address some of the issues highlighted by Cinelli et al. (2021), who discuss the role of platform design in misinformation amplification. Social media companies could also implement transparent fact-checking mechanisms to verify news, aligning with the growing demand for accountability and accuracy. While this research has contributed new insights, it is essential to note its limitations. First, the study’s cross-sectional design limits its ability to track long-term shifts in public opinion. Future research could adopt longitudinal approaches to examine how sustained misinformation exposure affects belief systems over time. Additionally, expanding the study across more diverse platforms and cultural contexts could provide a more comprehensive understanding of the global implications of social media misinformation.

The study underscores the pivotal role of social media in shaping public opinion in the post-truth era, revealing complex dynamics of selective exposure, confirmation bias, and distrust in traditional news. By addressing these challenges through digital literacy initiatives, regulatory frameworks, and platform accountability, policymakers and social media companies can work towards a more informed, resilient digital society. Future research should continue exploring these issues to develop robust, evidence-based strategies for combating misinformation in our increasingly digital world.

 

CONCLUSION

This study highlights the substantial impact of social media on public opinion formation in the post-truth era, demonstrating how misinformation and selective exposure contribute to opinion polarization and influence democratic processes. The findings reveal that through algorithmic engagement models, social media platforms often amplify sensationalist and biased content, creating echo chambers that reinforce users' pre-existing beliefs. This dynamic fosters an environment where inaccurate information can significantly shape public opinion, affecting social trust and informed decision-making. To address these challenges, future research should explore the effectiveness of various misinformation countermeasures, such as platform policy adjustments, digital literacy initiatives, and algorithmic transparency. Additionally, examining the role of individual user characteristics, like cognitive biases and media literacy levels, in susceptibility to misinformation can provide a more nuanced understanding of social media’s influence on diverse populations.

 

REFERENCES

Agojo, K. N. M., Bravo, M. F. J., Reyes, J. A. C., Rodriguez, J. A. E., & Santillan, A. M. A. (2023). Activism beyond the streets: Examining social media usage and youth activism in the Philippines. Asian Journal of Social Science, 51(3), 180–187. https://doi.org/10.1016/j.ajss.2023.04.006

Ahmed, S. S. (2024). Exploring post-truth in Julian Barnes’s The Sense of an Ending. Social Sciences and Humanities Open, 10. https://doi.org/10.1016/j.ssaho.2024.101143

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236.

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132.

Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118.

Festinger, L. (1957). A theory of cognitive dissonance. Row, Peterson.

Foroughi, M., de Andrade, B., & Pereira Roders, A. (2023). Capturing public voices: The role of social media in heritage management. Habitat International, 142. https://doi.org/10.1016/J.HABITATINT.2023.102934

Guess, A. M., Nyhan, B., & Reifler, J. (2020). Exposure to untrustworthy websites in the 2016 US election. Nature Human Behaviour, 4(5), 472–480.

Kim, B., Lin, H., & Kim, Y. (2024). Interplay of agenda setters in the digital age: The associative Issue network between news organizations and Political YouTube Channels. Computers in Human Behavior, 155. https://doi.org/10.1016/j.chb.2024.108169

Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.

Liu, T., Shi, H., Chen, C., & Fu, R. (2024). A Study on the Influence of Social Media Use on Psychological Anxiety among Young Women. International Journal of Mental Health Promotion, 26(3), 199–209. https://doi.org/10.32604/IJMHP.2024.046303

Ma, N., Yu, G., & Jin, X. (2024). Dynamics of competing public sentiment contagion in social networks incorporating higher-order interactions during the dissemination of public opinion. Chaos, Solitons and Fractals, 182. https://doi.org/10.1016/J.CHAOS.2024.114753

Mcintyre, L. (2021). The hidden dangers of fake news in post-truth politics. Revue Internationale de Philosophie, 75, 113–124.

Mulyani, Y. P., Saifurrahman, A., Arini, H. M., Rizqiawan, A., Hartono, B., Utomo, D. S., Spanellis, A., Beltran, M., Banjar Nahor, K. M., Paramita, D., & Harefa, W. D. (2024). Analyzing public discourse on photovoltaic (PV) adoption in Indonesia: A topic-based sentiment analysis of news articles and social media. Journal of Cleaner Production, 434. https://doi.org/10.1016/J.JCLEPRO.2023.140233

O’Leary, H., Alvarez, S., & Bahja, F. (2024). What’s in a name? Political and economic concepts differ in social media references to harmful algae blooms. Journal of Environmental Management, 357. https://doi.org/10.1016/j.jenvman.2024.120799

Pan, L. Y., & Lee, C. T. (2024). How does engagement on social media reinforce life aesthetic literacy? The role of interpersonal and intrapersonal influences. Computers in Human Behavior, 161. https://doi.org/10.1016/J.CHB.2024.108409

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. penguin UK.

Radke, S. C., Krishnamoorthy, R., Ma, J. Y., & Kelton, M. L. (2023). “Your truth isn’t the Truth”: Data activities and informal inferential reasoning. Journal of Mathematical Behavior, 69. https://doi.org/10.1016/j.jmathb.2023.101053

Sampedro-Beneyto, V., Agulló-Torres, A., Del_Campo-Gomis, F. J., & Arias-Navarro, I. (2024). Influence of social factors and environmental behaviour in the knowledge and opinion about circular economy. Futures, 164. https://doi.org/10.1016/J.FUTURES.2024.103490

Soomro, S. e. hyder, Boota, M. W., Zwain, H. M., Soomro, G. e. Z., Shi, X., Guo, J., Li, Y., Tayyab, M., Aamir Soomro, M. H. A., Hu, C., Liu, C., Wang, Y., Wahid, J. A., Bai, Y., Nazli, S., & Yu, J. (2024). How effective is twitter (X) social media data for urban flood management? Journal of Hydrology, 634. https://doi.org/10.1016/j.jhydrol.2024.131129

Stroud, N. J. (2010). Polarization and partisan selective exposure. Journal of Communication, 60(3), 556–576.

Sun, R., Zhu, H., & Guo, F. (2023). Impact of content ideology on social media opinion polarization: The moderating role of functional affordances and symbolic expressions. Decision Support Systems, 164. https://doi.org/10.1016/j.dss.2022.113845

Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. Political Polarization, and Political Disinformation: A Review of the Scientific Literature (March 19, 2018).

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking (Vol. 27). Council of Europe Strasbourg.

Zhang, F. (2023). Virtual space created by a digital platform in the post epidemic context: The case of Greek museums. Heliyon, 9(7). https://doi.org/10.1016/j.heliyon.2023.e18257

Zhou, Z., Zhou, X., Chen, Y., & Qi, H. (2024). Evolution of online public opinions on major accidents: Implications for post-accident response based on social media network. Expert Systems with Applications, 235. https://doi.org/10.1016/J.ESWA.2023.121307

Zollo, F., Novak, P. K., Del Vicario, M., Bessi, A., Mozetič, I., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Emotional dynamics in the age of misinformation. PloS One, 10(9), e0138740.