Showing posts with label Disinformation. Show all posts

Friday, March 7, 2025

How to Report Misleading and Inaccurate Content on Social Media

 



By Lilian H. Hill

 

Misinformation and disinformation, often called "fake news," spread rapidly on social media, especially during conflicts, wars, and emergencies. “Fake news” and disinformation campaigns injure the health of democratic systems because they can influence public opinion and electoral decision-making (National Center for State Courts, n.d.). Given the overwhelming volume of content shared on these platforms, distinguishing truth from falsehood has become challenging. This issue has worsened as some social media companies have downsized their Trust and Safety teams, neglecting proper content moderation (Center for Countering Digital Hate, 2023).

 

Users can play a role in curbing the spread of false information. The first step is to verify content before sharing it, being mindful of what we amplify and engage with. Equally important is reporting misinformation when we encounter it. Social media platforms allow users to flag posts that promote falsehoods, conspiracies, or misleading claims, and each enforces its own Community Standards to regulate content (Center for Countering Digital Hate, 2023).

 

Reporting misleading content on social media platforms is essential in reducing the spread of misinformation. Unfortunately, some platforms fail to act on reported content (Center for Countering Digital Hate, 2023). Nonetheless, users should still report when misinformation and disinformation flood their timelines.

 

Here’s how to report misleading content on some of the most widely used platforms:

1. Facebook

  • Click on the three dots (•••) in the top-right corner of the post.
  • Select "Find support or report post."
  • Choose "False Information" or another relevant category.
  • Follow the on-screen instructions to complete the report.

 

2. Instagram

  • Tap the three dots (•••) in the top-right corner of the post.
  • Select "Report."
  • Choose "False Information" and follow the steps to submit your report.

 

3. X (formerly known as Twitter)

  • Click on the three dots (•••) on the tweet you want to report.
  • Select "Report Tweet."
  • Choose "It’s misleading" and specify whether it relates to politics, health, or other misinformation.
  • Follow the prompts to complete the report.

 

4. TikTok

  • Tap and hold the video or click on the share arrow.
  • Select "Report."
  • Choose "Misleading Information" and provide details if necessary.

 

5. YouTube

  • Click on the three dots (•••) below the video.
  • Select "Report."
  • Choose "Misinformation" and provide any additional details required.

 

6. Reddit

  • Click on the three dots (•••) or the "Report" button below the post or comment.
  • Select "Misinformation" if available or choose a related category.
  • Follow the instructions to submit your report.

 

7. LinkedIn

  • Click on the three dots (•••) in the top-right corner of the post.
  • Select "Report this post."
  • Choose "False or misleading information."

 

8. Threads

  • Click More (•••) next to the post.
  • Select "Report" and follow the on-screen instructions.

 

After reporting, the platform will review the content and take action if the content violates its misinformation policies. Users can also support these efforts by sharing fact-checked sources in the comments or encouraging others to report the same misleading content.

 

References

Center for Countering Digital Hate (2023, October 24). How to report misinformation on social media. https://counterhate.com/blog/how-to-report-misinformation-on-social-media/

National Center for State Courts (n.d.). Disinformation and the public. https://www.ncsc.org/consulting-and-research/areas-of-expertise/communications,-civics-and-disinformation/disinformation/for-the-public


Thursday, February 13, 2025

Digital Architecture of Disinformation

 

By Lilian H. Hill

 

Fake news and disinformation are not new, but their rapid spread is unprecedented. Many individuals struggle to distinguish between real and fake news online, leading to widespread confusion (Hetler, 2025). Disinformation architecture refers to the systematic and strategic methods used to create, spread, and amplify false or misleading information. It involves a combination of technology, human effort, and coordinated tactics to manipulate public opinion, sow discord, or achieve specific political or social goals. This architecture leverages technology, social networks, and psychological manipulation to shape public perception, influence behavior, or achieve specific objectives, such as political, financial, or ideological gains.

 

Gal (2024) observed that, over the last few decades, social media platforms have transformed from basic networking sites into influential entities that shape public opinion, sway elections, impact public health, and influence social cohesion. For example, during the recent U.S. presidential election, platforms like X played a key role in disseminating both accurate information and misinformation, mobilizing voters, and affecting turnout. Likewise, during the COVID-19 pandemic, social media was instrumental in sharing public health guidelines but also became a hotspot for misinformation about vaccines and treatments.

 

Bossetta (2024) stated that a platform's digital architecture, meaning the technical frameworks that facilitate, constrain, and shape user behavior online, influences political communication on social media. In practice, this refers to what platforms enable, prevent, and structure in online communication, for instance through likes, comments, retweets, and sharing. Ong and Cabañes (2018) commented that the basic blueprint of political disinformation campaigns strongly resembles corporate branding strategy. However, political disinformation requires its purveyors to make moral compromises, including distributing revisionist history, silencing political opponents, and hijacking news media attention.

 

The primary goals of disinformation campaigns are political manipulation, social division, economic gains, and the erosion of trust in institutions such as the media, science, and democracy. Their impacts are far-reaching, leading to increased polarization, manipulation of democratic processes, reputational damage, and harm to individuals' mental well-being (Bossetta, 2018).

 

Influence of Disinformation Architecture

Disinformation has far-reaching consequences, including the erosion of trust in key institutions such as journalism, science, and governance. By spreading misleading narratives, it undermines public confidence in credible sources of information. Additionally, disinformation fuels polarization by deepening societal divisions and promoting extreme or one-sided perspectives, making constructive dialogue more difficult. It also plays a significant role in manipulating democracies, influencing elections and policy debates through deceptive tactics that mislead voters and policymakers. Beyond its societal impacts, disinformation can cause direct harm to individuals by targeting their reputations, personal safety, and mental well-being, often leading to harassment, misinformation-driven fear, and public distrust.

 

Components of Disinformation Architecture

Disinformation architecture consists of several key components that manipulate public perception. It begins with reconnaissance, where the target audience and environment are analyzed to tailor the disinformation campaign effectively. Once this understanding is established, the necessary infrastructure is built, including creating believable personas, social media accounts, and groups to disseminate false information. Content creation follows, ensuring a continuous flow of misleading materials such as posts, memes, videos, and articles that support the disinformation narrative.

 

The core aspects of disinformation architecture include content creation, amplification channels, psychological tactics, targeting and segmentation, infrastructure support, and feedback loops. Content creation involves fabricating fake news, manipulating media, and employing deepfake technology to mislead audiences. Amplification is achieved through social media platforms, bot networks, and echo chambers that reinforce biased narratives. Psychological tactics exploit emotions, cognitive biases, and perceived authority to gain trust and engagement. Targeting and segmentation enable microtargeting strategies, exploiting demographic vulnerabilities to maximize influence. Infrastructure support includes data harvesting, dark web resources, and monetization channels that sustain disinformation campaigns. Feedback loops ensure that engagement algorithms prioritize viral and sensationalist content, keeping misinformation in circulation.
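The feedback loop described above can be illustrated with a toy simulation (all posts, view counts, and rates here are hypothetical, not any real platform's algorithm): when ranking rewards raw engagement and sensational content converts attention into engagement at a higher rate, the sensational post climbs to the top slot and then keeps collecting the attention that keeps it there.

```python
# Toy simulation of an engagement-driven ranking feedback loop.
# All posts, view counts, and rates below are hypothetical
# illustrations, not any real platform's algorithm.

posts = [
    {"title": "Calm, sourced explainer", "sensationalism": 0.2, "engagement": 10.0},
    {"title": "Outrage-bait rumor", "sensationalism": 0.9, "engagement": 10.0},
]

def rank_by_engagement(feed):
    """Order the feed purely by accumulated engagement."""
    return sorted(feed, key=lambda p: p["engagement"], reverse=True)

for _ in range(5):  # five ranking/feedback cycles
    feed = rank_by_engagement(posts)
    for position, post in enumerate(feed):
        # The top slot gets more views, and sensational content
        # converts views into clicks and shares at a higher rate.
        views = 100 if position == 0 else 40
        engagement_rate = 0.02 + post["sensationalism"] * 0.15
        post["engagement"] += views * engagement_rate

print(rank_by_engagement(posts)[0]["title"])
```

After a few cycles the rumor dominates the feed even though both posts started with identical engagement; that self-reinforcement, with no credibility signal anywhere in the loop, is the feedback mechanism the paragraph above describes.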

 

Amplification is crucial in spreading this content widely, utilizing bots, algorithms, and social-engineering techniques to maximize reach. Engagement is then sustained through interactions that deepen the impact of disinformation, often through trolling or disruptive tactics. Eventually, mobilization occurs, where unwitting users are encouraged to take action, leading to real-world consequences.

 

Mitigation of Disinformation Architecture

To mitigate disinformation, several strategies must be implemented. Regulation and policy measures should enforce platform transparency rules and penalize the deliberate spread of harmful content. According to Gal (2024), because social media platforms play an increasingly central role in information dissemination, ensuring the integrity of that information has become more urgent than ever, making discussions about regulation essential. Given their profound influence on nearly every aspect of society, these platforms should be treated as critical infrastructure—like energy grids and water supply systems—and subject to the same level of scrutiny and regulation to safeguard information integrity. Just as a power grid failure can cause widespread disruption, large-scale social media manipulation can erode democratic processes, hinder public health initiatives, and weaken social trust.

 

Technological solutions like AI-driven detection systems and verification tools can help identify and flag false information. Public awareness efforts should promote media literacy, encouraging individuals to critically evaluate information and question sensationalist narratives (Hetler, 2025). Finally, platform responsibility must be strengthened by modifying algorithms to prioritize credible sources and enhancing content moderation to limit the spread of disinformation. Understanding these mechanisms is essential to developing effective countermeasures against the growing threat of disinformation in the digital age.

 

References

Bossetta, M. (2018). The digital architectures of social media: Comparing political campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. election. Journalism and Mass Communication Quarterly, 95(2), 471–496. https://doi.org/10.1177/1077699018763307

Bossetta, M. (2024, October 16). Digital architecture, social engineering, and networked disinformation on social media. EU Disinfo Lab. Retrieved https://www.disinfo.eu/outreach/our-webinars/webinar-digital-architectures-social-engineering-and-networked-disinformation-with-michael-bossetta/

Gal, U. (2024, November 17). Want to combat online misinformation? Regulate the architecture of social media platforms, not their content. ABC. Retrieved https://www.abc.net.au/religion/uri-gal-online-misinformation-democracy-social-media-algorithms/104591278

Hetler, A. (2025, January 7). 11 ways to spot disinformation on social media. TechTarget. Retrieved https://www.techtarget.com/whatis/feature/10-ways-to-spot-disinformation-on-social-media

Ong, J. C., & Cabañes, J. V. A. (2018). The architecture of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines. The Newton Tech4Dev Network. Retrieved https://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf


Friday, January 24, 2025

Information Pollution: Determining When Information is Accurate and Meaningful


 

By Lilian H. Hill


Information pollution is the spread of misleading, irrelevant, or excessive information that disrupts people's ability to find accurate and meaningful knowledge. The United Nations Development Programme (2024) defines information pollution as the “spread of false, misleading, manipulated and otherwise harmful information” and further states that it is “threatening our ability to make informed decisions, participate in democratic processes, and contribute to the building of inclusive, peaceful and just societies” (para. 1).

In an earlier blog, we described the information ecosystem: the complex network of processes, technologies, individuals, and institutions involved in creating, distributing, consuming, and regulating information. Just as environmental pollution contaminates the physical world, information pollution clutters digital and cognitive spaces, making it difficult to distinguish useful content from noise. When so much information is false and deceptive, people begin to distrust almost everything in the news.

 

Evolution of the News

The shift of news to social media accelerated changes that were already reshaping journalism. In the 1950s and 1960s, TV news was treated as a public service, and news anchors were considered authoritative. By the 1980s, however, the entertainment conglomerates that purchased news stations prioritized profits, leading to the 24-hour news cycle and a focus on attention-grabbing stories. Pundits, offering opinions rather than facts, became prominent, altering the industry and public expectations of news (U.S. PIRG Education Fund, 2023). The PIRG Education Fund states that “misinformation that seems real - but isn’t - rapidly circulates through social media” (para. 1). When anyone with a camera and a computer can produce content, the supply of news information becomes virtually limitless, fueling social media feeds with countless 24-hour cycles. Unlike traditional opinion sections or dedicated pundit programs, social feeds blend opinions and facts indiscriminately, and the most sensational stories tend to thrive (U.S. PIRG Education Fund, 2023).

 

Types of Information Pollution

  • Misinformation: Inaccurate or false information shared unintentionally.

Example: Sharing outdated or incorrect medical advice without malicious intent.

  • Disinformation: False information deliberately spread to deceive.

Example: Fake news campaigns or propaganda.

  • Malinformation: Information that is based on reality but is deliberately shared with the intent to cause harm, manipulate, or deceive.

Example: Leaking private messages or emails that are factually accurate but shared publicly to harm someone's reputation or cause embarrassment intentionally.

  • Irrelevant Information: Content that distracts from meaningful or necessary knowledge.

Example: Clickbait articles that prioritize attention over substance.

  • Noise: Poorly organized, redundant, or low-quality data that hampers clarity.

Example: Forums with repetitive threads or unmoderated social media discussions.

 

Consequences of Information Pollution

Misinformation, disinformation, and malinformation, along with the rise of hate speech and propaganda, are fueling social divisions and eroding trust in public institutions. Consequences include cognitive overload, which strains mental resources, leading to stress and poor decision-making. Information pollution breeds mistrust as people struggle to verify the accuracy of available information. They may waste time and energy by trying to sift through low-quality content. Information pollution also increases susceptibility to emotional or ideological manipulation.

 

More consequences include:

  • Erosion of Trust in Institutions. The spread of false or manipulated information undermines public confidence in governments, media outlets, and other institutions. Misinformation can mislead voters, distort public debates, and interfere with fair elections.
  • Polarization and Social Divisions. Polarizing narratives deepen ideological divides, fueling hostility and hindering collaboration between groups. Hate speech and propaganda can push individuals toward extremist ideologies or actions.
  • Public Health Crises. False claims about medical treatments or vaccines can result in public health risks, such as reduced vaccination rates or harmful self-medication practices. Inaccurate information can lead to slow or ineffective responses during pandemics or natural disasters.
  • Economic Impacts. Companies may face reputational harm from false accusations or smear campaigns. Misinformation about investments or markets can lead to significant financial losses.
  • Undermining Knowledge and Education. The prevalence of false information blurs the lines between credible and unreliable sources, making it harder for people to discern the truth. Exposure to misinformation, particularly among younger audiences, can disrupt educational efforts and critical thinking.
  • Psychological and Emotional Toll. Exposure to alarming or false information can heighten public fear and anxiety. Persistent negativity and misinformation can make individuals feel alienated or distrustful of their communities.
  • Threats to National Security. States or organizations can exploit information pollution to destabilize societies or manipulate populations for political or strategic gains. Targeted campaigns can sow confusion during emergencies, hindering coordinated responses.

Mitigating Information Pollution

Addressing these consequences requires robust efforts, including promoting media literacy, enhancing regulation of online platforms, and fostering critical thinking skills to create a more informed and resilient society. Reducing information pollution in specific contexts like education and social media requires targeted strategies that promote clarity, trust, and meaningful engagement.

Strategies for combating information pollution include:

  1. Teach Media Literacy: Integrate critical thinking and fact-checking skills into educational curricula. Encourage students to evaluate sources based on credibility, bias, and evidence.
  2. Simplify and Organize Content: Present information in structured, digestible formats (e.g., summaries, infographics). Avoid overloading students with redundant materials.
  3. Use Curated Resources: Recommend vetted textbooks, articles, and tools. Leverage reputable platforms like Google Scholar or PubMed for research.
  4. Promote Inquiry-Based Learning: Encourage students to ask questions and seek evidence-based answers. Use the Socratic method to stimulate deeper understanding and engagement.
  5. Digital Hygiene Education: Teach students to manage their digital consumption (e.g., limiting screen time, avoiding multitasking). Encourage mindful engagement with technology.

 

References

United Nations Development Programme (2024, February 5). Combating the crisis of information pollution: Recognizing and preventing the spread of harmful information. Retrieved https://www.undp.org/egypt/blog/combating-crisis-information-pollution-recognizing-and-preventing-spread-harmful-information

U.S. PIRG (Public Interest Research Group) Education Fund (2023, August 14). How misinformation on social media has changed news. Retrieved https://pirg.org/edfund/articles/misinformation-on-social-media/


Friday, June 21, 2024

Infodemics: How Misinformation and Disinformation Spread Disease


 

 

By Lilian H. Hill

 

An infodemic refers to an overabundance of information, both accurate and false, that spreads rapidly during an epidemic or crisis, making it difficult for people to find trustworthy sources and reliable guidance. The term is a blend of "information" and "epidemic". It highlights how the proliferation of information can parallel the spread of disease, creating additional challenges in managing the primary crisis. The term rose to prominence in 2020 during the COVID-19 pandemic. During epidemics, accurate information is even more critical than in normal times because people need it to adjust their behavior to protect themselves, their families, and their communities from infection (World Health Organization, 2020).

 

Contradictory messages and conflicting advice can create confusion and mistrust among the public (Borges et al., 2022). An infodemic can intensify or lengthen outbreaks when people are unsure about what they need to do to protect their health and the health of people around them. The situation is so dire that the World Health Organization (2020) published guidance to help individuals, community leaders, governments, and the private sector understand some key actions they can take to manage the COVID-19 infodemic.

 

Characteristics of Infodemics

Infodemics result in more information than most people can process effectively, especially those with low health literacy. With growing digitization, information spreads more rapidly. Alongside accurate information, a significant amount of misinformation (false or misleading information shared without harmful intent) and disinformation (false information deliberately spread to deceive) is disseminated. Information spreads quickly, particularly through interconnected social media and digital platforms, reaching global audiences instantaneously. Infodemics often feature highly emotional, sensational, or alarming content that captures attention but may not be accurate or helpful.

 

Examples of Infodemics

Three global epidemics have occurred in recent memory, each accompanied by infodemics:

 

  1. COVID-19 Pandemic: During the COVID-19 pandemic, an infodemic emerged with vast amounts of information about the virus, treatments, vaccines, and public health measures. This included a significant spread of misinformation and conspiracy theories.

 

  2. Ebola Outbreaks: Past Ebola outbreaks have seen infodemics where misinformation about the disease’s transmission and treatments spread rapidly, complicating response efforts.

 

  3. Zika Virus: The Zika virus outbreak was accompanied by an infodemic, with rumors and false information about the virus’s effects and prevention measures.

 

Understanding and addressing infodemics is crucial for effective crisis management and public health response, ensuring that accurate information prevails and supports informed decision-making by individuals and communities. With human encroachment on natural areas, the likelihood of future epidemics is high (Shafaati et al., 2023).

 

Consequences of Infodemics

The flood of conflicting information can cause confusion, anxiety, and stress, making it hard for individuals to know how to respond appropriately to the crisis. Trust in authorities, experts, and media can be eroded when people encounter inconsistent messages or feel they are being misled. Misinformation can lead to harmful behaviors, such as using unproven treatments, ignoring public health advice, or spreading conspiracy theories. The spread of false information can hamper public health responses and crisis management efforts, as resources may be diverted to combat misinformation instead of focusing solely on the crisis. The plethora of unreliable health information delays care provision and increases the occurrence of hateful and divisive rhetoric (Borges et al., 2022). Infodemics can exacerbate social divisions, as different groups may cling to varying sets of information and beliefs, leading to polarized views and conflicts.

 

Managing Infodemics

Another new term is “infodemiology,” a blend of “information” and “epidemiology.” Epidemiology, the study of the distribution of health and disease patterns within populations and the use of that information to address health issues, is a fundamental aspect of public health. It aims to minimize the risk of adverse health outcomes through community education, research, and health policy development (World Health Organization, 2024). Infodemiology, by analogy, is the study of the flood of information and how to manage it for public health. Infodemic management involves systematically applying risk- and evidence-based analyses and strategies to control the spread of misinformation and mitigate its effects on health behaviors during health crises.

 

For example, in their systematic review of publications about health infodemics and misinformation, Borges et al. (2022) commented that “social media has been increasingly propagating poor-quality, health-related information during pandemics, humanitarian crises and health emergencies. Such spreading of unreliable evidence on health topics amplifies vaccine hesitancy and promotes unproven treatments” (p. 556). However, they noted that social media has also been successfully employed for crisis communication and management during emerging infectious disease pandemics and significantly improved knowledge awareness and compliance with health recommendations. For governments, health authorities, researchers, and clinicians, promoting and disseminating reliable health information is essential to counteract false or misleading health information spread on social media.

Image Credit: Anna Shvets, Pexels

 

Strategies for Combating Infodemics

For government officials, public health professionals, and educators, preparation is essential to prevent the next pandemic disaster (Shafaati et al., 2023). Strengthening public health services and investing in research and development for new medications and vaccines are crucial steps. Expanding access to education and resources in vulnerable communities is also necessary to enhance understanding and encourage preventive actions. Additionally, investing in international cooperation is vital to support countries at risk of outbreaks and provide economic assistance to those affected by pandemics.

 

  1. Promoting Accurate Information: Authorities and experts must provide clear, accurate, and timely information. This includes regular updates from trusted sources like public health organizations.

 

  2. Media Literacy: Enhancing public media literacy can help individuals critically evaluate the information they encounter, recognize reliable sources, and avoid sharing unverified claims.

 

  3. Fact-Checking and Verification: Fact-checking organizations and platforms are crucial in verifying information and debunking false claims. Prominent placement of fact-checked information can help correct misconceptions.

 

  4. Algorithmic Adjustments: Social media platforms and search engines can adjust their algorithms to prioritize credible sources and reduce the visibility of misleading content.

 

  5. Collaboration and Coordination: Effective communication and coordination among governments, health organizations, media, and tech companies are essential to manage the flow of information and combat misinformation.

 

  6. Public Engagement: Engaging with communities and addressing their concerns directly can build trust and ensure accurate information reaches diverse audiences. This may include town hall meetings, Q&A sessions, and community-specific communications.
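The algorithmic-adjustment strategy listed above can be sketched as a toy scoring rule (a hypothetical illustration, not any platform's actual ranking system): blending engagement with a source-credibility score keeps virality alone from deciding what users see.

```python
# Toy sketch of credibility-aware ranking: engagement is blended with a
# source-credibility score. Sources, scores, and weights are hypothetical.

items = [
    {"source": "established outlet", "credibility": 0.9, "engagement": 120},
    {"source": "anonymous rumor page", "credibility": 0.1, "engagement": 400},
]

def score(item, credibility_weight=0.7):
    # Normalize engagement against the most-engaged item, then mix in
    # the credibility score with the chosen weight.
    max_engagement = max(i["engagement"] for i in items)
    normalized = item["engagement"] / max_engagement
    return credibility_weight * item["credibility"] + (1 - credibility_weight) * normalized

ranked = sorted(items, key=score, reverse=True)
print([item["source"] for item in ranked])
```

Lowering credibility_weight toward zero recovers pure engagement ranking, which puts the rumor page back on top; that weight is the policy lever such an adjustment would tune.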

 

References

Borges do Nascimento, I. J., Pizarro, A. B., Almeida, J. M., Azzopardi-Muscat, N., Gonçalves, M. A., Björklund, M., & Novillo-Ortiz, D. (2022). Infodemics and health misinformation: A systematic review of reviews. Bulletin of the World Health Organization, 100(9), 544–561. https://doi.org/10.2471/BLT.21.287654

Shafaati, M., Chopra, H., Priyanka, Khandia, R., Choudhary, O. P., & Rodriguez-Morales, A. J. (2023). The next pandemic catastrophe: Can we avert the inevitable? New Microbes and New Infections, 52, 101110. https://doi.org/10.1016/j.nmni.2023.101110

World Health Organization (2020). Managing the COVID-19 infodemic: A call for action. Author. https://iris.who.int/bitstream/handle/10665/334287/9789240010314-eng.pdf?sequence=1

World Health Organization (2024). Let’s flatten the infodemic curve. https://www.who.int/news-room/spotlight/let-s-flatten-the-infodemic-curve

 



Wednesday, January 24, 2024

News Literacy and Its Components

 

Image Credit: Evangeline Shaw, Unsplash

As part of our continuing discussion of different types of literacy, this blog post addresses News Literacy in honor of the 5th Annual National News Literacy Week, January 22 – 26. 

 

News literacy is the practice of critically analyzing and evaluating news sources, stories, and information. It involves developing the skills and knowledge necessary to be an informed news consumer. News literacy goes beyond simply being able to access information; it emphasizes the capacity to assess the credibility, reliability, and relevance of news content. Ashley (2022) provides the following definition:

 

News literacy is the critical evaluation of information content as well as the contexts where it is produced and consumed. We can think of news literacy as the set of knowledge, skills, and attitudes that a person brings to their personal consumption of information and to their understanding of the structure of the news media landscape.

 

 

Ashley’s book News Literacy and Democracy (2020) also links news literacy with democracy. He writes, “Democracy is ultimately about citizen participation in the organization of society. We are governed by elected representatives, and because representative government requires an informed citizenry, we need news that gives us an accurate picture of our environment. But the morass of information out in the world today poses a real threat to our ability to govern our societies” (p. 4). Ashley explains that we each have the power to be selective about the information we expose ourselves to, and this ability can shape our perceptions of reality, which in turn influences our behaviors and attitudes. Some people choose to tune out altogether. Indeed, the Digital News Report by the Reuters Institute indicates that social networks have become a primary news source for 18-24-year-olds (Eddy, 2022). The report further claims that only 26% of Americans trust news generally.

 

We have traveled far from the days of trusted news anchors such as Walter Cronkite, a CBS news anchor from 1962 to 1981, who was known as the most trusted man in America in the 1960s and 1970s. Instead of a few trusted sources of information, digital media have saturated daily life, making it difficult to distinguish legitimate information from biased, fake, and falsified news. Hornick (2024), writing for the News Literacy Center at Stony Brook University, indicates that: “New technologies to create and share information make it easy to create content that only appears authoritative and then to spread it virally. The conflict between speed and accuracy has been exacerbated by Digital Age demands for delivering information as fast as possible, but accelerating that process increases the chance it will be wrong” (para. 4). While nearly everyone can create and publish media with a laptop or smartphone, the responsibility to be accurate, truthful, and unbiased is not shared. The News Literacy Center provides news literacy lessons for college/university students, community groups, and K-12 students. The website references 18 other organizations concerned about news sites' quality and trustworthiness, including the Media Literacy Clearinghouse and the American Press Institute. 

 

PBS Learning Media for Teachers houses several collections of lessons on news and media literacy. The lessons include videos, blog articles, student handouts, lesson plans, and tip sheets to help students identify, analyze, and investigate the news and information they get from online sources. These lessons are aimed at K-12 students. PBS Learning Media for Teachers and the News Literacy Center link news literacy and democracy.

 

Components of News Literacy

 

1. Critical Thinking

News literacy encourages individuals to approach information critically. This includes questioning the source, understanding the context, and evaluating the evidence from news stories.

 

2. Source Evaluation

Understanding where news comes from is crucial. News literacy involves assessing the credibility and reliability of news sources. Differentiating between reputable journalistic sources and unreliable sources is a fundamental skill.

 

3. Fact-Checking

Fact-checking is an integral part of news literacy. Individuals are encouraged to verify the accuracy of claims and information presented in news stories before accepting them as accurate.

 

4. Media Bias Awareness

Recognizing and understanding media bias is essential. News literacy helps individuals identify potential biases in news reporting and how they might influence the presentation of information.

 

5. Contextual Awareness

News stories often need to be understood within their broader context. News literacy involves considering the historical, cultural, and social context in which events are reported.

 

6. Digital Literacy

With the rise of digital media and online information, news literacy includes digital literacy skills. This involves understanding how information spreads on social media, recognizing online misinformation, and being aware of the potential for manipulation.

 

7. Diversity of Sources

News literacy emphasizes the importance of seeking information from diverse sources. Exposure to various perspectives helps individuals develop a more comprehensive understanding of issues.

 

8. Ethical Considerations

News literacy includes an awareness of journalistic ethics. This involves understanding the responsibilities of journalists, respecting the rights of individuals featured in news stories, and recognizing the importance of unbiased reporting.

 

9. Engagement and Participation

News literacy encourages active engagement with news and current events. This can include participating in discussions, sharing responsibly sourced information, and being an informed citizen.

 

Promoting news literacy is essential in a world where misinformation and disinformation can spread rapidly. By fostering these skills, individuals are better equipped to navigate the complex media landscape and make informed decisions about the information they encounter.

 

References

Ashley, S. (2020). News Literacy and Democracy. Retrieved https://library.oapen.org/bitstream/id/2ed9be72-d915-4e0f-bc8e-7977d9ae4d56/9780429863073.pdf

Eddy, K. (2022, June 15). The changing news habits and attitudes of younger audiences. Reuters Institute Digital News Report. Retrieved https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2022/young-audiences-news-media

Hornick, R. (2024). Why news literacy matters: A new literacy for civil society in the 21st century. Retrieved https://digitalresource.center/why-news-literacy-matters

 
