Friday, April 18, 2025

How Healthy is Civic Literacy in the U.S.?


By Lilian H. Hill

 

Do you remember:

  • How many senators serve in the U.S. Senate?
  • How many members are in the U.S. House of Representatives?
  • What is the difference between the House and the Senate?
  • How many justices sit on the U.S. Supreme Court?
      o What are their responsibilities?
  • How many branches of government are there?
      o What are their responsibilities?
  • What is the Constitution?
      o Why is it important?

 

You would have learned the answers to these questions if you took Civics in school or studied to pass the test to become a naturalized citizen of the United States. The answers to these questions are all part of the necessary knowledge of civic literacy for American citizens. Other countries have their own required knowledge for civic participation.

 

Definition and Importance

Civic literacy refers to the knowledge, skills, and attitudes that enable individuals to participate actively and responsibly in civic and democratic life. It encompasses knowledge and understanding of government structures, laws, rights, and responsibilities and the ability to analyze social and political issues critically. Civic literacy is not just about knowing how government works; it also includes skills such as:

 

  • Media literacy: evaluating sources of information and recognizing misinformation.
  • Critical thinking: assessing policies, political discourse, and legal frameworks.
  • Civic engagement skills: advocacy, voting, and participating in community initiatives.

 

A civically literate population is essential for a healthy democracy, social progress, and an empowered citizenry. Unfortunately, time dedicated to civic education in American public schools began to decline in the 1960s. For many decades, neither the federal nor state governments have prioritized civics. Additionally, instructional time for civics has decreased as an unintended consequence of shifting educational priorities, such as the emphasis on STEM subjects and policies like No Child Left Behind (Sandra Day O’Connor Institute, 2024).

 

General Population Findings

Recent studies indicate a concerning deficiency in civic literacy among Americans, reflecting a widespread lack of understanding of fundamental governmental structures and processes.

 

For example, a 2024 study from the U.S. Chamber of Commerce Foundation highlights a troubling reality as America nears its 250th anniversary: the nation's civic knowledge is significantly lacking. The national survey, which gathered responses from 2,000 registered voters, reveals that over 70% of respondents failed a basic civics quiz covering topics such as the three branches of government, the number of Supreme Court justices, and fundamental democratic processes (these are all part of the U.S. Citizenship test). Only half could correctly identify which branch of government is responsible for turning bills into laws. Interestingly, while two-thirds reported taking civics in high school, only 25% felt “very confident” in explaining how the U.S. government functions.

 

The American Bar Association has conducted its annual Survey of Civic Literacy for five years. The 2024 survey highlighted that while 37% of respondents believe the general public should safeguard democracy, nearly two-thirds felt the public is “not very informed” or “not at all informed” about how democracy functions (Smith, 2024).

 

A survey administered to 3,026 undergraduate students by the American Council of Trustees and Alumni (ACTA) found that only 31% of college students could correctly identify James Madison as the Father of the Constitution. Additionally, 60% were unaware of the term lengths for U.S. House and Senate members, and just 27% knew that the Vice President serves as the President of the Senate. Further, ACTA findings indicated that approximately one-third of students could not identify the current Speaker of the House, and many incorrectly believed that the Supreme Court is mandated to have nine justices. ACTA President Michael Poliakoff said,

 

The dismal results of our survey show that current students and recent college graduates have little idea of the American past or its core principles and values, no guide to take them through the roiling controversies facing us today or to enable them to defend and protect the free institutions that are the glory of our nation and an inspiration to the world. They cannot uphold what they do not comprehend. There is so much to be proud of as we near the 250th anniversary of our independence and the birth of our democratic republic. But being the world’s oldest democracy is no guarantee for the future of our democratic republic (para. 4).

 

The Importance of Civic Literacy

Civic literacy offers five distinct advantages:

 

1.    Empowers Informed Decision-Making

A key aspect of civic literacy is equipping individuals with the knowledge to make informed decisions about governance and societal issues. It helps citizens understand political candidates’ platforms, government policies, and legislative changes. Civic literacy enables people to critically evaluate sources of news and distinguish between facts, opinions, and propaganda. It fosters awareness of economic, environmental, and social issues, allowing individuals to make responsible decisions in both personal and public life. Without civic literacy, individuals may be more susceptible to misinformation, manipulation, and political rhetoric that does not align with their best interests.

 

2.    Strengthens Democracy

A functioning democracy relies on the active participation of its citizens. Civic literacy helps to encourage voter participation and engagement in elections; promote accountability by ensuring people understand their rights to petition, protest, and hold leaders responsible; and support the rule of law by ensuring citizens are aware of legal rights, civic duties, and due process. When citizens are uninformed or disengaged, democratic institutions weaken, and the risk of authoritarianism, corruption, and political apathy increases.

 

3.    Promotes Social Responsibility and Community Engagement

Civic literacy fosters a sense of shared responsibility for the well-being of society. This includes encouraging volunteerism, community service, and grassroots activism; understanding and advocating for marginalized or underrepresented communities; and taking part in local governance, such as attending town hall meetings, joining advisory boards, or contributing to civic initiatives. By recognizing how personal actions impact the community, individuals become proactive in solving societal challenges, such as poverty, climate change, and human rights violations.

 

4.    Enhances Critical Thinking and Civil Discourse

In an era of social media and rapid information dissemination, the ability to critically analyze information is crucial. Civic literacy helps individuals engage in respectful, fact-based debates on complex social and political issues; encourages open-mindedness and respect for diverse perspectives; and reduces polarization by promoting evidence-based discussions rather than emotional or partisan reactions. This contributes to a more informed and respectful public dialogue, which is essential for social cohesion and policymaking.

 

5.    Encourages Advocacy and Active Civic Engagement

Civic literacy empowers individuals to advocate for meaningful change. It provides knowledge of the legislative process, helping citizens influence policies and laws; skills to organize and mobilize communities around critical issues; and human rights and social justice awareness, encouraging activism to address inequality and discrimination. Civically literate individuals play a crucial role in shaping policies that affect their lives and communities through petitions, protests, and public discussions.

 

Civic literacy is foundational to a thriving, equitable, and resilient society. It empowers individuals to make informed choices, strengthens democratic institutions, fosters community involvement, and cultivates the skills necessary for respectful dialogue and effective advocacy. As our world's challenges grow more complex, the need for an engaged, informed citizenry becomes ever more urgent. Investing in civic education prepares students and adults for lifelong participation in a democratic society where their voices and actions matter.

 

References

Smith, M. (2024, May 1). The link between civics literacy and our threatened democracy. American Bar Association. https://www.americanbar.org/news/abanews/aba-news-archives/2024/05/link-between-civics-and-democracy/

Nietzel, M. T. (2024, July 17). New survey reveals low level of civics literacy among college students. American Council of Trustees and Alumni. https://www.goacta.org/2024/07/new-survey-reveals-low-level-of-civics-literacy-among-college-students/

Sandra Day O’Connor Institute for American Democracy (2024, September). When and why did America stop teaching civics? https://oconnorinstitute.org/wp-content/uploads/When-and-Why-Did-America-Stop-Teaching-Civics_.pdf

U.S. Chamber of Commerce Foundation (2024, February 12). New study finds alarming lack of civic literacy among Americans. https://www.uschamberfoundation.org/civics/new-study-finds-alarming-lack-of-civic-literacy-among-americans

Friday, March 14, 2025

Can Social Media Platforms Be Trusted to Regulate Misinformation Themselves?


 

By Lilian H. Hill

Social media platforms wield immense influence over public discourse, acting as primary sources of news, political debate, and social movements. While they once advertised policies intended to combat misinformation, hate speech, and harmful content, their willingness and ability to enforce these policies effectively are questionable. The fundamental challenge is that these companies operate as profit-driven businesses, meaning their primary incentives do not always align with the public good. Myers and Grant (2023) observed that many platforms are investing fewer resources in combating misinformation. For example, Meta, which operates Facebook, Instagram, and Threads, recently announced that it has ended its fact-checking program and will instead rely on crowdsourcing to monitor misinformation (Chow, 2025). Likewise, X, formerly known as Twitter, slashed its trust and safety staff in 2022. Experts worry that dismantling safeguards once implemented to combat misinformation and disinformation will decrease trust online (Myers & Grant, 2023).

 

Key Challenges in Self-Regulation

There are four key challenges to social media platforms’ self-regulation: 


1.    Financial Incentives and Engagement-Driven Algorithms

Social media platforms generate revenue primarily through advertising, which depends on user engagement. Unfortunately, research has shown that sensationalized, misleading, or divisive content often drives higher engagement than factual, nuanced discussions. This creates a conflict of interest: aggressively moderating misinformation and harmful content could reduce engagement, ultimately affecting their bottom line (Minow & Minow, 2023).

 

For example, Facebook’s own internal research (revealed in the Facebook Papers) found that its algorithms promoted divisive and emotionally charged content because it kept users on the platform longer. YouTube has been criticized for its recommendation algorithm, which has in the past directed users toward conspiracy theories and extremist content to maximize watch time. Because of these financial incentives, social media companies often take a reactive rather than proactive approach to content moderation, making changes only when public pressure or regulatory threats force them to act.


2.    Inconsistent and Arbitrary Enforcement

Even when platforms enforce their policies, they often do so inconsistently. Factors like political pressure, public relations concerns, and high-profile users can influence moderation decisions. Some influential figures or accounts with large followings receive more leniency than average users. For instance, politicians and celebrities have been allowed to spread misinformation with little consequence, while smaller accounts posting similar content face immediate bans. Enforcement of community guidelines can vary across different regions and languages, with content in English often being moderated more effectively than in less widely spoken languages. This leaves many vulnerable communities exposed to harmful misinformation and hate speech (Minow & Minow, 2023).


3.    Reduction of Trust and Safety Teams

In recent years, many social media companies have cut back on their Trust and Safety teams, reducing their ability to effectively moderate content. These teams are responsible for identifying harmful material, enforcing policies, and preventing the spread of misinformation. With fewer human moderators and fact-checkers, harmful content is more likely to spread unchecked, especially as AI-driven moderation systems still struggle with nuance, context, and misinformation detection (Minow & Minow, 2023).


4.    Lack of Transparency and Accountability

Social media companies rarely provide full transparency about how they moderate content, making it difficult for researchers, policymakers, and the public to hold them accountable. Platforms often do not disclose how their algorithms work, meaning users don’t know why they see certain content or how misinformation spreads. When harmful content spreads widely, companies often deflect responsibility, blaming bad actors rather than acknowledging the role of their own recommendation systems. Even when they do act, platforms tend not to share details about why specific moderation decisions were made, leading to accusations of bias or unfair enforcement (Minow & Minow, 2023).


What Can Individuals Do?

Disinformation and “fake news” pose a serious threat to democratic systems by shaping public opinion and influencing electoral discourse. You can protect yourself from disinformation by:

 

1.     Engaging with diverse perspectives. Relying on a limited number of like-minded news sources restricts your exposure to varied viewpoints and increases the risk of falling for hoaxes or false narratives. While not foolproof, broadening your sources improves your chances of accessing well-balanced information (National Center for State Courts, 2025).

 

2.     Approaching news with skepticism. Many online outlets prioritize clicks over accuracy, using misleading or sensationalized headlines to grab attention. Understanding that not everything you read is true, and that some sites specialize in spreading falsehoods, is crucial in today’s digital landscape. Learning to assess news credibility helps protect against misinformation (National Center for State Courts, 2025).

 

3.     Fact-checking before sharing. Before passing along information, verify the credibility of the source. Cross-check stories with reliable, unbiased sources known for high factual accuracy to determine what, and whom, you can trust (National Center for State Courts, 2025).

 

4.     Challenging false information. If you come across a misleading or false post, speak up. Addressing misinformation signals that spreading falsehoods is unacceptable. By staying silent, you allow misinformation to persist and gain traction (National Center for State Courts, 2025).

 

What Can Be Done Societally?

As a society, we all share the responsibility of preventing the spread of false information. Since self-regulation by social media platforms has proven unreliable, a multi-pronged approach is needed to ensure responsible content moderation and combat misinformation effectively. This approach includes:

 

1. Government Regulation and Policy Reform

Governments and regulatory bodies can play a role in setting clear guidelines for social media companies by implementing stronger content moderation laws that can require companies to take action against misinformation, hate speech, and harmful content. Transparency requirements can force platforms to disclose how their algorithms function and how moderation decisions are made. Financial penalties for failure to remove harmful content could incentivize more responsible practices. However, regulation must be balanced to avoid excessive government control over speech. It should focus on ensuring transparency, fairness, and accountability rather than dictating specific narratives (Balkin, 2021).

 

2. Public Pressure and Advocacy

Users and advocacy groups can push social media companies to do better by demanding more robust moderation policies that are fairly enforced across all users and regions, and by calling for independent oversight bodies to audit content moderation practices and hold platforms accountable. A recent poll conducted by Boston University’s College of Communication found that 72% of Americans believe it is acceptable for social media platforms to remove inaccurate information, and more than half of Americans distrust the efficacy of crowdsourced monitoring of social media (Amazeen, 2025). Improved fact-checking partnerships are also needed to counter misinformation more effectively.

 

3. Media Literacy and User Responsibility

Since social media platforms alone cannot be relied upon to stop misinformation, individuals must take steps to protect themselves. They can verify information before sharing it by checking multiple sources and relying on reputable fact-checking organizations. Other actions include diversifying news sources to avoid reliance on a single platform or outlet, reporting misinformation by flagging false or dangerous content, and educating others by encouraging media literacy in their communities (Suciu, 2024).

 

Conclusion

Social media companies cannot be fully trusted to police themselves, as their financial interests often clash with the need for responsible moderation. While they have taken some steps to curb misinformation, enforcement remains inconsistent, and recent cuts to moderation teams have worsened the problem. The solution lies in a combination of regulation, public accountability, and increased media literacy to create a more reliable and trustworthy information ecosystem.

 

References

Amazeen, M. (2025). Americans expect social media content moderation. The Brink: Pioneering Research from Boston University. https://www.bu.edu/articles/2025/americans-expect-social-media-content-moderation/

Balkin, J. M. (2021). How to regulate (and not regulate) social media. Journal of Free Speech Law, 1(71), 73-96. https://www.journaloffreespeechlaw.org/balkin.pdf

Chow, A. R. (2025, January 7). Why Meta’s fact-checking change could lead to more misinformation on Facebook and Instagram. Time. https://time.com/7205332/meta-fact-checking-community-notes/

Minow & Minow (2023). Social media companies should pursue serious self-supervision — soon: Response to Professors Douek and Kadri. Harvard Law Review, 136(8). https://harvardlawreview.org/forum/vol-136/social-media-companies-should-pursue-serious-self-supervision-soon-response-to-professors-douek-and-kadri/

Myers, S. L., & Grant, N. (2023, February 14). Combating disinformation wanes at social media giants. The New York Times. https://www.nytimes.com/2023/02/14/technology/disinformation-moderation-social-media.html

Suciu, P. (2024, January 2). How media literacy can help stop misinformation from spreading. Forbes. https://www.forbes.com/sites/petersuciu/2024/01/02/how-media-literacy-can-help-stop-misinformation-from-spreading/


Friday, March 7, 2025

How to Report Misleading and Inaccurate Content on Social Media


By Lilian H. Hill

 

Misinformation and disinformation, often called "fake news," spread rapidly on social media, especially during conflicts, wars, and emergencies. “Fake news” and disinformation campaigns injure the health of democratic systems because they can influence public opinion and electoral decision-making (National Center for State Courts, n.d.). With the overwhelming amount of content shared on these platforms, distinguishing truth from falsehood has become challenging. This issue has worsened as some social media companies have downsized their Trust and Safety teams, neglecting proper content moderation (Center for Countering Digital Hate, 2023).

 

Users can play a role in curbing the spread of false information. The first step is to verify information before sharing it and to be mindful of what we amplify and engage with. Equally important is reporting misinformation when we come across it. Social media platforms allow users to flag posts that promote falsehoods, conspiracies, or misleading claims, each enforcing its own Community Standards to regulate content (Center for Countering Digital Hate, 2023).

 

Reporting misleading content on social media platforms is essential in reducing the spread of misinformation. Unfortunately, some platforms fail to act on reported content (Center for Countering Digital Hate, 2023). Nonetheless, users should still report when misinformation and disinformation flood their timelines.

 

Here’s how to report misleading content on some of the most widely used platforms:

1. Facebook

  • Click on the three dots (•••) in the top-right corner of the post.
  • Select "Find support or report post."
  • Choose "False Information" or another relevant category.
  • Follow the on-screen instructions to complete the report.

 

2. Instagram

  • Tap the three dots (•••) in the top-right corner of the post.
  • Select "Report."
  • Choose "False Information" and follow the steps to submit your report.

 

3. X (formerly known as Twitter)

  • Click on the three dots (•••) on the tweet you want to report.
  • Select "Report Tweet."
  • Choose "It’s misleading" and specify whether it relates to politics, health, or other misinformation.
  • Follow the prompts to complete the report.

 

4. TikTok

  • Tap and hold the video or click on the share arrow.
  • Select "Report."
  • Choose "Misleading Information" and provide details if necessary.

 

5. YouTube

  • Click on the three dots (•••) below the video.
  • Select "Report."
  • Choose "Misinformation" and provide any additional details required.

 

6. Reddit

  • Click on the three dots (•••) or the "Report" button below the post or comment.
  • Select "Misinformation" if available or choose a related category.
  • Follow the instructions to submit your report.

 

7. LinkedIn

  • Click on the three dots (•••) in the top-right corner of the post.
  • Select "Report this post."
  • Choose "False or misleading information."

 

8. Threads

  • Click the three dots (•••) next to the post.
  • Select "Report" and follow the on-screen instructions.

 

After reporting, the platform will review the content and take action if it violates its misinformation policies. Users can also enhance these efforts by sharing fact-checked sources in the comments or encouraging others to report the same misleading content.

 

References

Center for Countering Digital Hate (2023, October 24). How to report misinformation on social media. https://counterhate.com/blog/how-to-report-misinformation-on-social-media/

National Center for State Courts (n.d.). Disinformation and the public. https://www.ncsc.org/consulting-and-research/areas-of-expertise/communications,-civics-and-disinformation/disinformation/for-the-public


Friday, February 28, 2025

Who Consumes News on Social Media and Why?

 


By Lilian H. Hill


Social media has become a key source of news for Americans, with half of U.S. adults reporting that they at least sometimes get news there, according to a 2023 Pew Research Center survey (Pew Research Center, 2024). A significant majority of U.S. adults (86%) report getting news from a smartphone, computer, or tablet at least occasionally, with 57% saying they do so frequently.

 

People who consume news on social media cite several benefits, including its convenience, rapid updates, and ability to engage with others through discussions and shared content (Pew Research Center, 2024). However, many also express concerns about news accuracy, quality, and political bias on these platforms. Notably, the percentage of users considering misinformation the most significant drawback has risen from 31% to 40% over the past five years.

 

Benefits and Constraints of Social Media News

Getting news through social media offers both advantages and drawbacks. One of its most significant benefits is convenience and accessibility, as it provides instant access to breaking news from anywhere, keeping users informed in real time. Additionally, social media exposes individuals to diverse perspectives, allowing them to access news from independent journalists, global outlets, and citizen reporters. The ability to receive real-time updates ensures that users stay informed as events unfold. Social media also fosters engagement and interactivity, enabling people to comment, share, and discuss news with others, thereby promoting public discourse. Personalization is another advantage, as algorithms curate news based on user preferences, making content more relevant to individual interests. Moreover, social media platforms offer cost-free access to news, bypassing paywalls common on many traditional news websites.

 

However, there are significant downsides to relying on social media for news. One primary concern is the prevalence of misinformation and fake news, as these platforms often host misleading information, deepfakes, and propaganda. Bias and echo chambers also pose a risk, as algorithms reinforce users' beliefs by prioritizing content that aligns with their views, limiting exposure to diverse perspectives. Unlike traditional journalism, many social media sources lack rigorous fact-checking, increasing the risk of spreading inaccurate information. Sensationalism and clickbait are also typical, as platforms prioritize engagement, often amplifying emotionally charged or exaggerated content over factual reporting. Privacy and data concerns are another issue, with social media companies collecting vast amounts of personal data that can be used for targeted advertising or political manipulation. Additionally, the short-form nature of social media news consumption can lead to shallow understanding, as users are less likely to analyze complex issues deeply.

 

Thorson and Battocchio (2023) explored how young adults in the U.S. shape and manage their personal media environments across digital platforms and how these practices affect their news consumption. Based on 50 in-depth interviews with individuals aged 18-34, along with an analysis of their most-used social media platforms, the study highlights the effort young users invest in constructing and curating their online presence across both “public” and “private” spaces, with a particular focus on the architectural strategies that minimize their exposure to news content.

 

Generational Use of Social Media for News

Different generations consume news from various sources, reflecting technological shifts, media consumption habits, and trust in traditional versus digital platforms. Recent studies by the American Press Institute indicate that while Gen Z and Millennials still engage with local and national news from traditional sources, they are more likely to frequently access news and information through social media (Media Insight Group, 2022). Gen Z consumes news daily on social platforms at a higher rate than older Millennials, with 74% doing so compared to 68% of older Millennials. According to the Pew Research Center (2024), the percentage of Americans who regularly get news from television has remained steady at 33%, while reliance on radio and print publications continues to decline. In 2024, only 26% of U.S. adults reported often or sometimes getting their news in print.

 

However, this does not mean these groups rely exclusively on social media for complete or accurate news coverage (Castle Group, 2025; Pew Research Center, 2024). Many consumers follow news outlets and journalists on social platforms, clicking through to full articles when they appear in their feeds. Some people use a free monthly article allowance or continue researching a story beyond the app where they first encountered it. To maintain audience engagement, news organizations have adapted their approach to social media, moving beyond simple headline previews or article snippets to offer more dynamic and interactive content.

 

Here’s a breakdown of where different age groups typically obtain their news (Pew Research Center, 2024):

 

Baby Boomers, born between 1946 and 1964, primarily rely on television for news, favoring broadcast and cable networks such as CNN, Fox News, and NBC. While they still engage with print newspapers, this habit is declining. They also turn to radio sources like NPR and talk radio for updates and are gradually accessing digital news websites, though at lower rates than younger generations.

 

Generation X, born between 1965 and 1980, splits its news consumption between television and online sources, including news websites and apps. While they engage with social media for news, they tend to be more skeptical than younger generations. Many continue to listen to radio news, especially during commutes, and some still read print newspapers, though digital consumption is on the rise.

 

Millennials, born between 1981 and 1996, prefer online news sources, including digital newspapers, news apps, and streaming news content. They are heavy users of social media platforms such as Facebook, Twitter (X), Instagram, and Reddit for news updates. Increasingly, they rely on podcasts and YouTube for in-depth analysis and alternative viewpoints. Compared to older generations, they are less likely to watch traditional television news or read print newspapers.

 

Generation Z, born between 1997 and 2012, primarily consumes news via social media platforms such as TikTok, Instagram, X (formerly known as Twitter), and Snapchat. They favor short-form video content from influencers, independent journalists, and content creators. Many engage with news aggregators like Apple News and Google News, while traditional television news and print newspapers play a minimal role in their media consumption. Instead, they prefer digital and interactive content that aligns with their fast-paced and visually engaging media habits.

 

Each generation's news consumption habits reflect broader shifts in media technology and trust in different sources. While traditional news outlets still hold influence, digital and social media platforms continue to attract younger audiences. It is too soon to predict social media behavior of Generation Alpha, born between 2010 and 2024, and Generation Beta, born after 2025.

 

Mitigating Problems of Social Media News Consumption

Yaraghi (2019) commented that it is naive to view social media platforms as purely neutral content-sharing services without any responsibility, but argued that it is unreasonable to hold them to the same editorial standards as traditional news media. Mitigating the problems associated with social media news content requires a multi-pronged approach involving media literacy, platform accountability, and user responsibility. Improving media literacy is essential, as people need to develop critical thinking skills to evaluate sources, detect bias, and distinguish between credible journalism and misinformation. Encouraging a fact-checking culture by verifying information through reliable sources like Snopes, PolitiFact, or Reuters Fact Check can help reduce the spread of false narratives. Additionally, users should be aware of manipulative tactics such as deepfakes, clickbait headlines, and out-of-context images that contribute to misinformation.

 

Social media platforms must also take responsibility by ensuring greater algorithm transparency, disclosing how they prioritize news content, and implementing measures to reduce the spread of misinformation. Stronger content moderation, powered by both AI and human reviewers, is necessary to flag and remove misleading content while still protecting free speech. Yaraghi (2019) noted that while social media companies can moderate or restrict content on their platforms, they cannot fully control how ideas are shared online or disseminated offline. Clear labeling and warnings for unverified or misleading content, as when X and Facebook append context notes to viral posts, can further help users make informed decisions.

 

Encouraging responsible journalism is another crucial step. Supporting trusted news outlets and prioritizing fact-based reporting over sensationalized headlines can help counteract misinformation. Journalists should also uphold ethical reporting standards by rigorously verifying sources and avoiding the spread of misleading information.

 

Users themselves play a vital role in combating misinformation. Taking a moment to verify news before sharing, especially if it provokes a strong emotional reaction, can prevent the spread of false content. Diversifying news sources rather than relying on a single perspective helps reduce the risk of being trapped in an echo chamber. Additionally, users should actively report misleading content to social media platforms to ensure that misinformation does not gain traction.

 

By combining education, regulation, and individual responsibility, we can foster a more informed and resilient digital society that mitigates the negative impact of social media news content.

 

 

References

 

Castle Group (2025, January 31). How social media, Gen Z, and millennials are changing the news media landscape. https://www.thecastlegrp.com/how-social-media-gen-z-and-millennials-are-changing-the-news-media-landscape/

Media Insight Project (2022, August 22). The news consumption habits of 16- to 40-year-olds. American Press Institute. https://americanpressinstitute.org/the-news-consumption-habits-of-16-to-40-year-olds/

Pew Research Center (2024, September 17). News Platform Fact Sheet. https://www.pewresearch.org/journalism/fact-sheet/news-platform-fact-sheet/

Thorson, K., & Battocchio, A. F. (2023). "I use social media as an escape from all that": Personal platform architecture and the labor of avoiding news. Digital Journalism, 12(5), 613–636. https://doi.org/10.1080/21670811.2023.2244993

Yaraghi, N. (2019, April 9). How should social media platforms combat misinformation and hate speech? Brookings Institution. https://www.brookings.edu/articles/how-should-social-media-platforms-combat-misinformation-and-hate-speech/


Thursday, February 13, 2025

Digital Architecture of Disinformation

 

By Lilian H. Hill

 

Fake news and disinformation are not new, but their rapid spread is unprecedented. Many individuals struggle to distinguish between real and fake news online, leading to widespread confusion (Hetler, 2025). Disinformation architecture refers to the systematic and strategic methods used to create, spread, and amplify false or misleading information. It involves a combination of technology, human effort, and coordinated tactics to manipulate public opinion, sow discord, or achieve specific political or social goals. This architecture leverages technology, social networks, and psychological manipulation to shape public perception, influence behavior, or achieve specific objectives, such as political, financial, or ideological gains.

 

Gal (2024) observed that, over the last few decades, social media platforms have transformed from basic networking sites into influential entities that shape public opinion, sway elections, affect public health, and influence social cohesion. For example, during the recent U.S. presidential election, platforms like X played a key role in disseminating both accurate information and misinformation, mobilizing voters, and affecting turnout. Likewise, during the COVID-19 pandemic, social media was instrumental in sharing public health guidelines but also became a hotspot for the spread of misinformation regarding vaccines and treatments.

 

Bossetta (2024) stated that a platform's digital architecture, meaning the technical framework that facilitates, constrains, and shapes user behavior online, influences political communication on social media. In practice, this refers to what platforms enable, prevent, and structure in online communication through features such as likes, comments, retweets, and sharing. Ong and CabaƱes (2018) commented that the basic blueprint of political disinformation campaigns strongly resembles corporate branding strategy. However, political disinformation requires its purveyors to make moral compromises, including distributing revisionist history, silencing political opponents, and hijacking news media attention.

 

The primary goals of disinformation campaigns are political manipulation, social division, economic gains, and the erosion of trust in institutions such as the media, science, and democracy. Their impacts are far-reaching, leading to increased polarization, manipulation of democratic processes, reputational damage, and harm to individuals' mental well-being (Bossetta, 2018).

 

Influence of Disinformation Architecture

Disinformation has far-reaching consequences, including the erosion of trust in key institutions such as journalism, science, and governance. By spreading misleading narratives, it undermines public confidence in credible sources of information. Additionally, disinformation fuels polarization by deepening societal divisions and promoting extreme or one-sided perspectives, making constructive dialogue more difficult. It also plays a significant role in manipulating democracies, influencing elections and policy debates through deceptive tactics that mislead voters and policymakers. Beyond its societal impacts, disinformation can cause direct harm to individuals by targeting their reputations, personal safety, and mental well-being, often leading to harassment, misinformation-driven fear, and public distrust.

 

Components of Disinformation Architecture

Disinformation architecture consists of several key components that manipulate public perception. It begins with reconnaissance, where the target audience and environment are analyzed to tailor the disinformation campaign effectively. Once this understanding is established, the necessary infrastructure is built, including creating believable personas, social media accounts, and groups to disseminate false information. Content creation follows, ensuring a continuous flow of misleading materials such as posts, memes, videos, and articles that support the disinformation narrative.

 

The core aspects of disinformation architecture include content creation, amplification channels, psychological tactics, targeting and segmentation, infrastructure support, and feedback loops. Content creation involves fabricating fake news, manipulating media, and employing deepfake technology to mislead audiences. Amplification is achieved through social media platforms, bot networks, and echo chambers that reinforce biased narratives. Psychological tactics exploit emotions, cognitive biases, and perceived authority to gain trust and engagement. Targeting and segmentation enable microtargeting strategies, exploiting demographic vulnerabilities to maximize influence. Infrastructure support includes data harvesting, dark web resources, and monetization channels that sustain disinformation campaigns. Feedback loops ensure that engagement algorithms prioritize viral and sensationalist content, keeping misinformation in circulation.

 

Amplification is crucial in spreading this content widely, utilizing bots, algorithms, and social-engineering techniques to maximize reach. Engagement is then sustained through interactions that deepen the impact of disinformation, often through trolling or disruptive tactics. Eventually, mobilization occurs, where unwitting users are encouraged to take action, leading to real-world consequences.

 

Mitigation of Disinformation Architecture

To mitigate disinformation, several strategies must be implemented. Regulation and policy measures should enforce platform transparency rules and penalize the deliberate spread of harmful content. According to Gal (2024), because social media platforms play an increasingly central role in information dissemination, ensuring the integrity of that information has become more urgent than ever, making discussions about regulation essential. Given their profound influence on nearly every aspect of society, these platforms should be treated as critical infrastructure—like energy grids and water supply systems—and subject to the same level of scrutiny and regulation to safeguard information integrity. Just as a power grid failure can cause widespread disruption, large-scale social media manipulation can erode democratic processes, hinder public health initiatives, and weaken social trust.

 

Technological solutions like AI-driven detection systems and verification tools can help identify and flag false information. Public awareness efforts should promote media literacy, encouraging individuals to critically evaluate information and question sensationalist narratives (Hetler, 2025). Finally, platform responsibility must be strengthened by modifying algorithms to prioritize credible sources and enhancing content moderation to limit the spread of disinformation. Understanding these mechanisms is essential to developing effective countermeasures against the growing threat of disinformation in the digital age.

 

References

Bossetta, M. (2018). The digital architectures of social media: Comparing political campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. election. Journalism and Mass Communication Quarterly, 95(2), 471–496. https://doi.org/10.1177/1077699018763307

Bossetta, M. (2024, October 16). Digital architecture, social engineering, and networked disinformation on social media. EU Disinfo Lab. https://www.disinfo.eu/outreach/our-webinars/webinar-digital-architectures-social-engineering-and-networked-disinformation-with-michael-bossetta/

Gal, U. (2024, November 17). Want to combat online misinformation? Regulate the architecture of social media platforms, not their content. ABC. https://www.abc.net.au/religion/uri-gal-online-misinformation-democracy-social-media-algorithms/104591278

Hetler, A. (2025, January 7). 11 ways to spot disinformation on social media. TechTarget. https://www.techtarget.com/whatis/feature/10-ways-to-spot-disinformation-on-social-media

Ong, J. C., & CabaƱes, J. V. A. (2018). The architecture of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines. The Newton Tech4Dev Network. https://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf

