Friday, May 30, 2025

Data Rights and Digital Hegemony

 


By Lilian H. Hill

The internet was once imagined as a democratic digital commons; however, that vision has since proven idealistic (Shurety, 2021). Today, mainstream internet use is predominantly governed by a handful of powerful corporations, signaling that cyberspace has undergone significant privatization. Digital hegemony refers to the dominance exercised by a small group of powerful technology companies and states over the digital infrastructure, norms, and data flows that shape global information ecosystems. This form of control extends beyond simple market power; it encompasses the ability to set standards, influence public discourse, and dictate the rules of engagement in cyberspace. Much like cultural or economic hegemony shapes societal values and resource distribution, digital hegemony influences how knowledge is produced, circulated, and monetized, often with limited transparency and accountability.

Digital hegemony intersects directly with data rights, meaning individuals’ and communities’ control over how their personal and collective data are collected, stored, used, and shared. Although AI tools may appear to generate content out of thin air, generative AI systems are built and trained on vast datasets, drawing from extensive collections of images, text, and audio. These systems rely on billions of parameters shaped by complex algorithms that analyze and learn from massive archives of digital information.

In a digitally hegemonic landscape, data are frequently extracted without meaningful consent and commodified by dominant actors, reinforcing asymmetries of power. Citizens often have little recourse or understanding of how their data shape algorithmic decisions, advertising profiles, or political targeting. As scholars like Shoshana Zuboff (2019) argue in The Age of Surveillance Capitalism, this unchecked exploitation of data amounts to a new form of dispossession. Advocating for data rights—including the right to access, delete, and control one's data—is therefore essential to challenging digital hegemony and restoring individuals’ democratic agency.

Monetizing Data

In the digital age, data have become a central asset. Personal information is collected, analyzed, and monetized by both corporations and governments. The commodification of personal data has given rise to growing concerns about privacy, surveillance, and individual autonomy. The concept of data rights has emerged as a response to these concerns, advocating for individuals’ control over their personal information. Verhulst (2022) emphasizes the need for digital self-determination, where individuals have the agency to decide how their data are used and shared. Likewise, Huang and Siddarth (2023) discuss the importance of protecting the digital commons, suggesting that generative AI models trained on public data should contribute back to the communities from which they draw.

The digital realm is also susceptible to more insidious forms of power consolidation. The term digital coup has been used to describe situations where digital platforms or technologies are leveraged to undermine democratic processes. A notable example is Meta's (formerly Facebook) response to Canada's Bill C-18, which aimed to ensure fair compensation for news content shared on digital platforms. In retaliation, Meta restricted access to news content for Canadian users, effectively using its platform's dominance to challenge governmental authority (MacArthur, 2023). Such actions highlight the immense power wielded by tech giants and the potential threats they pose to democratic institutions.

In more extreme cases, digital tools have been employed to facilitate governmental overthrows or suppress dissent. The 2021 military coup in Myanmar saw the junta implementing internet shutdowns, surveillance, and censorship to control the narrative and stifle opposition (Coppel & Chang, 2024). These tactics exemplify how digital technologies can be weaponized to consolidate power and suppress democratic movements. The international community must recognize and address these challenges to safeguard democratic values in the digital era.

Preserving Data Rights

Preserving data rights involves ensuring individuals have meaningful control over how their personal information is collected, used, and shared in digital environments. Legal frameworks play a foundational role in this effort. For example, the European Union’s General Data Protection Regulation (GDPR) provides comprehensive protection, including the rights to access, correct, delete, and restrict the processing of personal data (European Commission, 2016). Similarly, the California Consumer Privacy Act (CCPA) empowers consumers to know what personal information is being collected and to opt out of its sale (California Civil Code § 1798.100, 2018).

Beyond legislation, preserving data rights requires implementing technical and organizational strategies such as privacy by design, where data protection measures are integrated into the development of systems and technologies from the outset (Cavoukian, 2009; Solove, 2021). Another critical principle is data minimization, which means collecting only the data necessary for a specific purpose, thereby reducing the risks of misuse or unauthorized access. Additionally, increasing public awareness and digital literacy helps individuals make informed choices and assert their rights more effectively. Together, legal, technical, and educational approaches form a multi-layered strategy for upholding data rights in the digital age.
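The data-minimization principle can be made concrete in code. The sketch below is a hypothetical illustration (the purposes, field names, and `minimize` helper are invented for this example, not drawn from any cited framework): each processing purpose declares the minimal set of fields it needs, and everything else is stripped before storage.

```python
# Illustrative sketch of data minimization: declare, per processing purpose,
# the minimal fields a system may retain, and strip everything else.
# All names here are hypothetical.

ALLOWED_FIELDS = {
    "newsletter_signup": {"email"},
    "order_fulfillment": {"email", "name", "shipping_address"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return only the fields required for the stated purpose."""
    allowed = ALLOWED_FIELDS.get(purpose)
    if allowed is None:
        raise ValueError(f"Undeclared processing purpose: {purpose}")
    return {key: value for key, value in record.items() if key in allowed}

submitted = {
    "email": "ada@example.org",
    "name": "Ada",
    "birthdate": "1990-01-01",  # unnecessary for a newsletter signup
    "phone": "555-0100",        # unnecessary for a newsletter signup
}

stored = minimize(submitted, "newsletter_signup")
# stored == {"email": "ada@example.org"}
```

Declaring allowed fields per purpose up front, rather than filtering after the fact, is one way to build privacy by design into a system from the outset.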

Counteracting Digital Hegemony

Counteracting digital hegemony involves resisting the concentrated power that dominant technology corporations and states hold over digital infrastructures, platforms, and user data. Digital hegemony allows a few powerful actors—often multinational tech companies like Google, Meta, and Amazon—to control the flow of information, shape public discourse, and exploit user data for economic and political gain (Couldry & Mejias, 2019). This monopolization raises concerns about surveillance, censorship, and the erosion of democratic processes. To counteract these trends, various strategies have emerged. These include promoting open-source technologies and decentralized networks that reduce dependency on corporate-owned platforms (Zuboff, 2019), enforcing antitrust regulations and data protection laws (Birhane, 2021), and enhancing digital literacy to empower users to navigate and critically engage with online systems (Hintz et al., 2018). Furthermore, advocating for digital sovereignty—where communities and nations assert control over their digital infrastructure and data—is a critical step toward reducing reliance on foreign or corporate technologies (Tomasello, 2023). Ultimately, counteracting digital hegemony involves redistributing digital power, protecting civil liberties, and promoting a more inclusive and equitable digital ecosystem.

 

References

Birhane, A. (2021). Algorithmic injustice: A relational ethics approach. Patterns, 2(2), 100205. https://doi.org/10.1016/j.patter.2021.100205

Brush, H. (2003). Electronic civil disobedience. In Encyclopedia of new media (pp. 167–168). SAGE Publications. https://doi.org/10.4135/9781412950657.n86

California Civil Code § 1798.100. (2018). California Consumer Privacy Act of 2018. https://codes.findlaw.com/ca/civil-code/civ-sect-1798-100/

Coppel, N., & Chang, L. Y. C. (2024). Coup #4: February 2021 and after. In Myanmar’s digital coup (pp. 23–45). Palgrave Macmillan. https://doi.org/10.1007/978-3-031-58645-3_2

Cavoukian, A. (2009). Privacy by Design: The 7 foundational principles. Information and Privacy Commissioner of Ontario. https://www.ipc.on.ca/sites/default/files/legacy/2018/01/pbd-1.pdf

Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.

European Commission. (2016). General Data Protection Regulation (GDPR). https://eur-lex.europa.eu/eli/reg/2016/679/oj

Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2018). Digital citizenship in a datafied society. Polity Press.

Huang, S., & Siddarth, D. (2023). Generative AI and the digital commons. arXiv. https://arxiv.org/abs/2303.11074

MacArthur, J. R. (2023, October 1). A digital coup. Harper’s Magazine. https://harpers.org/2023/10/a-digital-coup/

Shurety, E. (2021, June 17). What happened to electronic civil disobedience? Hyperallergic. https://hyperallergic.com/654595/what-happened-to-electronic-civil-disobedience/

Solove, D. J. (2021). Understanding privacy. Harvard University Press.

Tomasello, F. (2023). Digital civics and algorithmic citizenship in a global scenario. Philosophy & Technology, 36, 39. https://doi.org/10.1007/s13347-023-00638-3

Verhulst, S. G. (2022). Operationalizing digital self-determination. arXiv. https://arxiv.org/abs/2211.08539

Wikipedia contributors. (2025). Electronic civil disobedience. Wikipedia. https://en.wikipedia.org/wiki/Electronic_civil_disobedience

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.


Friday, May 23, 2025

Polarization of News Consumption and Narrative Warfare

Image Credit: Jhefferson Santos at Pexels

By Lilian H. Hill

 

The concepts of polarized news consumption, narrative warfare, and information literacy are interconnected in today’s complex media and geopolitical environment. News consumption in modern democratic societies is increasingly polarized, with individuals gravitating toward sources that affirm their existing political beliefs. This behavior fosters ideological echo chambers where alternative viewpoints are seldom encountered, reinforcing confirmation bias and intensifying societal divisions. The proliferation of digital media has amplified this trend. Algorithms on platforms such as Facebook, YouTube, and Google promote engagement by recommending content that mirrors users' prior behavior, creating what Pariser (2011) describes as “filter bubbles.” Within these insulated media environments, divergent perspectives are not only underrepresented but often distorted or dismissed.
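The filter-bubble dynamic can be illustrated with a toy simulation (the scores, the update rule, and all names below are invented for illustration and do not model any real platform): articles carry a political "leaning" score, the feed surfaces the items closest to the user's profile, and the profile drifts toward whatever is shown.

```python
# Toy filter-bubble simulation (hypothetical values, not a real recommender).
# Articles have a "leaning" in [-1, 1]; the feed recommends the items nearest
# the user's profile, and the profile drifts toward what the user sees.

articles = [round(i / 10, 1) for i in range(-10, 11)]  # leanings -1.0 .. 1.0

def recommend(profile, k=3):
    """Return the k articles whose leaning is nearest the user's profile."""
    return sorted(articles, key=lambda a: abs(a - profile))[:k]

profile = 0.2      # a mildly partisan starting point
ever_shown = set()
for _ in range(50):
    shown = recommend(profile)
    ever_shown.update(shown)
    profile = 0.9 * profile + 0.1 * shown[0]  # drift toward the closest match

# Across 50 rounds the user only ever sees a narrow band of leanings
# (here 0.1 to 0.3); strongly opposing articles (e.g., -1.0) never surface.
```

Even in this crude model, the feedback loop between what is recommended and what the profile becomes keeps the user inside a narrow ideological band, which is the mechanism behind Pariser's "filter bubble."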

 

This fragmentation undermines a shared public reality—a critical foundation for democratic discourse. Lewandowsky et al. (2017) argue that such environments contribute to a "post-truth" era, in which emotional resonance and identity-aligned narratives prevail over factual accuracy. Misinformation thrives under these conditions, particularly when it reinforces group identities or vilifies out-groups. In these polarized spaces, falsehoods are often not only believed but actively defended, while fact-checking is dismissed as partisan, hindering consensus on crucial issues such as climate change, public health, and election integrity.

 

Partisan media often rely on emotionally and morally charged framing that casts societal issues in stark, binary terms. This framing appeals to deeply held values and group affiliations, prompting individuals to process information through the lens of loyalty rather than reasoned evaluation. As Fricker (2007) explains, this dynamic can lead to epistemic injustice, where individuals are denied fair access to knowledge or their viewpoints are discredited due to identity-based bias. Such emotionally charged representation, however, fails to capture the complexity of human life. As a result, public trust in media institutions erodes, and journalism is increasingly perceived not as a truth-seeking endeavor but as a tool for ideological influence.

 

Narrative warfare involves the strategic deployment of stories, symbols, and messages by both state and non-state actors to sway public opinion, legitimize authority, and manipulate public perception. These narratives are disseminated across various platforms, including news outlets, social media, entertainment, and even memes, and often capitalize on existing cultural tensions and ideological rifts (Woolley & Howard, 2019; Miskimmon et al., 2013). In this context, information literacy is a vital civic defense, enabling individuals to assess sources critically, recognize propaganda, and understand the motivations behind messaging campaigns. It also fosters resilience against disinformation, ideologically loaded narratives, and emotionally manipulative content (Mihailidis & Viotty, 2017).

 

Without adequate information literacy, individuals are more vulnerable to misleading stories that provoke fear, anger, or resentment. For instance, during political campaigns or armed conflicts, strategic narratives may be used to legitimize aggression, suppress opposition, or delegitimize dissenting voices (Rid, 2020). These efforts are further magnified by digital algorithms that prioritize sensational content to drive engagement. Consequently, information literacy must extend to digital environments, encompassing an understanding of how platforms function, how algorithms shape content visibility, and how personal data is used for targeted messaging (Gorwa, 2019).

 

Relationship Between Polarization and Narrative Warfare

Polarization and narrative warfare have a reciprocal relationship. In polarized contexts, where institutional trust is low, audiences are more likely to accept narratives that reinforce their worldview and portray others in a negative light. Simultaneously, polarization fosters information disorder, as individuals actively seek out confirmatory content and dismiss contradictory information. Narrative warfare exploits societal divides, using emotionally charged, ideologically targeted messaging to deepen mistrust and entrench ideological silos (Miskimmon et al., 2013; Wardle & Derakhshan, 2017). Social media amplifies this cycle, enabling rapid dissemination of emotionally engaging narratives that further fracture public discourse (Tucker et al., 2018; Rid, 2020).

 

Counteracting Polarization and Narrative Warfare

To address the rising threat of disinformation and digital manipulation, interventions must span education, platform accountability, and public policy. Media and information literacy (MIL) is a frontline defense; studies confirm that MIL helps users identify misinformation, understand algorithmic bias, and develop civic agency (Siegel-Stechler et al., 2025). Educators and policymakers are increasingly advocating for the integration of MIL in curricula to build long-term societal resilience. In parallel, platforms must take responsibility for transparency in content moderation and algorithmic recommendation systems, a point echoed in regulatory efforts like the EU Digital Services Act (DSA), which mandates transparency reporting for very large platforms (European Commission, 2024).

 

Narrative warfare—the strategic deployment of emotionally resonant stories to distort public understanding—requires not only fact-checking but counter-narratives that engage audiences meaningfully. Bateman and Jackson (2024) argue that effective counter-disinformation strategies combine fact-based messaging with emotionally grounded storytelling tailored to community values. Researchers also warn that algorithmic amplification contributes to political polarization and narrative fragmentation, and advocate for friction-based design (e.g., content warnings, speed bumps) to slow the viral spread of falsehoods. At the policy level, multi-stakeholder approaches, combining regulation, civil society initiatives, and platform cooperation, are essential for defending democratic discourse and reducing the systemic incentives that sustain disinformation ecosystems.

 

Conclusion
In an era marked by deep political divides and the weaponization of information, building robust information literacy is not just an individual skill but a democratic imperative. The interplay between polarization and narrative warfare highlights the urgent need to cultivate critical thinking, media literacy, and intercultural dialogue. Critical thinking enables meaningful participation in public discourse, equips people to resist manipulative narratives, and supports a healthier, more informed democratic culture. Empowering individuals to navigate complex information ecosystems, recognize manipulative storytelling, and engage constructively with diverse perspectives is crucial for preserving democratic values, fostering social cohesion, and maintaining an informed public sphere.

 

References

Bateman, J., & Jackson, D. (2024, January 31). Countering disinformation effectively: An evidence-based policy guide. Carnegie Endowment for International Peace. https://carnegieendowment.org/research/2024/01/countering-disinformation-effectively-an-evidence-based-policy-guide?lang=en

Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.

European Commission. (2024, July 25). The Digital Services Act package. https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package

Gorwa, R. (2019). The platform governance triangle: Conceptualizing the informal regulation of online content. Internet Policy Review, 8(2).

Hobbs, R. (2021). Mind over media: Propaganda education for a digital age. W. W. Norton.

Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the "post-truth" era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.

Mihailidis, P., & Viotty, S. (2017). Spreadable spectacle in digital culture: Civic expression, fake news, and the role of media literacies in "post-fact" society. American Behavioral Scientist, 61(4), 441–454.

Miskimmon, A., O’Loughlin, B., & Roselle, L. (2013). Strategic narratives: Communication power and the new world order. Routledge.

Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin Press.

Rid, T. (2020). Active measures: The secret history of disinformation and political warfare. Farrar, Straus and Giroux.

Siegel-Stechler, K., Hilton, K., & Medina, A. (2025, May 12). Youth rely on digital platforms and need media literacy. Center for Information and Research on Civic Learning and Engagement (CIRCLE), Tufts University. https://circle.tufts.edu/latest-research/youth-rely-digital-platforms-need-media-literacy-access-political-information

Tucker, J. A., Guess, A., Barbera, P., Vaccari, C., Siegel, A., Sanovich, S., ... & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. Hewlett Foundation.

 Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe.

 

Friday, May 16, 2025

Epistemic Rights and Injustice: Critical Skills for Navigating Media Messages

Image Credit: Abenomics on Pexels

By Lilian H. Hill

 

Epistemology is concerned with the theory of knowledge, including its nature, sources, scope, and validity. It addresses fundamental questions, such as what knowledge is, how it is acquired, and how we can distinguish between justified beliefs and unfounded opinions (Audi, 2011). Epistemic rights are based on the idea that society should guarantee that all its citizens can access truthful and accurate information and have the competence to use knowledge for their own benefit and that of society. These rights are part of a broader discourse on epistemic justice (Fricker, 2007). Epistemic inequality refers to the growing divide in access to information, knowledge, and understanding between societal elites and the broader population. This divide has given rise to two contrasting regimes of truth and knowledge: one controlled and interpreted by elites, and the other, often dismissed as disinformation, fake news, or alternative truths, consumed by marginalized and disillusioned segments of society (Nieminen, 2024).

 

Epistemic rights are related to information and media literacy, which equip individuals with critical skills to evaluate sources, identify bias, and navigate complex media, thereby enabling them to exercise their epistemic and democratic rights more effectively. When people lack these literacies, they are more vulnerable to disinformation, manipulation, and epistemic injustice, such as exclusion from meaningful public discourse. Fostering strong media literacy helps promote epistemic justice by empowering individuals to participate in knowledge-sharing processes, challenge dominant narratives, and ensure diverse voices are respected in democratic societies.

 

Epistemic Injustice and Democracy

In the digital age, political media consumption is influenced not only by individual preferences but also by Artificial Intelligence (AI) algorithms that tailor content based on user behavior, often reinforcing ideological echo chambers (Fricker, 2007). When audiences are repeatedly exposed to uniform perspectives, marginalized voices are excluded or misrepresented (Noble, 2018). Consistent exposure to media rooted in a single ideology can lead to distorted beliefs and perceptions of others. In the absence of strong information and media literacy skills, individuals are vulnerable to misinformation and epistemic harm (Hobbs, 2021). These patterns compromise individual understanding and pose a threat to a democratic society by allowing the polarization of knowledge and weakening the informed diversity essential to its functioning.

 

Nieminen (2024) comments that democracy is experiencing a downturn across the globe. Even in nations with a longstanding tradition of democratic governance, neo-authoritarian trends are on the rise. Similar patterns have emerged in countries where governments have tightened their grip on the media, limited citizens' freedom of movement, and undermined the independence of the judiciary. While these actions are typical of autocratic regimes found on every continent, political movements with comparable agendas are also gaining traction in countries that are not officially under authoritarian rule.

 

Key Aspects of Epistemic Rights

Epistemic rights refer to the entitlements individuals have to access, seek, use, and share knowledge, and to be respected as knowers. These rights include:

 

·      Right to be heard and believed. Individuals have the right to be respected as credible sources of knowledge within their communities and social contexts.

 

·      Right to access knowledge. This includes access to information, education, and other resources that enable individuals to become effective knowers.

 

·      Right to contribute knowledge. People should have the opportunity to participate in knowledge production and dissemination, particularly in cultural or institutional contexts that have historically excluded them.

 

·      Right to epistemic agency. Epistemic agency refers to the capacity to ask questions, offer interpretations, and assert knowledge claims. Denial of this right can occur through silencing or dismissing individuals' perspectives.

 

Examples of Epistemic Injustice

Fricker (2007) identified two primary forms of epistemic injustice. Testimonial injustice occurs when a person’s word is discredited due to their race, gender, accent, or social status. Hermeneutical injustice occurs when individuals lack the language, concepts, or interpretive frameworks necessary to make sense of their own experiences. Both forms of epistemic injustice contribute to the unfair exclusion of individuals or groups from full participation in shared knowledge and understanding. The table below provides examples from the healthcare, education, and media domains that demonstrate the real-world consequences of epistemic injustice.


Confronting Epistemic Injustice

Confronting epistemic injustice requires actively challenging the systems, practices, and assumptions that silence, discredit, or marginalize individuals as knowers. This work can occur at both individual and structural levels.

 

On the individual level, it begins with acknowledging and reflecting on personal biases about who is credible or knowledgeable (Fricker, 2007). Individuals are encouraged to examine who is typically respected in their environments, including workplaces, classrooms, or social settings, and who is overlooked or doubted. Practicing epistemic humility is crucial, which involves recognizing that others may possess valuable knowledge rooted in personal or cultural experiences that differ from one's own (Dotson, 2011).

 

Actively listening to and validating the voices of marginalized individuals is another crucial step. This involves creating space for underrepresented or historically marginalized groups to share their voices, taking their accounts seriously, and seeking to understand rather than challenge their perspectives (Medina, 2013). Clarifying questions should deepen understanding rather than serve as interrogation. At the institutional and cultural levels, diversifying knowledge sources is essential. This involves integrating a range of perspectives into curricula, media, and policy, and valuing non-traditional knowledge systems, such as Indigenous, oral, or embodied knowledge, alongside academic or institutional expertise (Kovach, 2009).

 

Another important strategy is to expand language and conceptual frameworks to promote hermeneutical justice. This includes supporting the development of terminology that helps individuals articulate their experiences, such as trauma-informed or inclusive gender language, and broadening public understanding through education, advocacy, and media (Carel & Kidd, 2014). Creating inclusive decision-making spaces is also vital. Marginalized communities must be included in decisions that affect them, and these processes should be transparent so that power dynamics and the value placed on different types of knowledge can be critically assessed and adjusted.

 

Ultimately, holding institutions accountable is crucial for achieving systemic change. This involves challenging policies and practices that perpetuate testimonial injustice, such as dismissing patient complaints or discriminatory hiring practices (Crenshaw, 1991). Structural changes redistribute epistemic authority, such as implementing community advisory boards, reforming peer review processes, or adopting more inclusive research funding criteria (Harding, 1991). Together, these actions contribute to a more just and equitable distribution of knowledge and voice in society.

 

References

Audi, R. (2011). Epistemology: A contemporary introduction to the theory of knowledge (3rd ed.). Routledge.

Carel, H., & Kidd, I. J. (2014). Epistemic injustice in healthcare: A philosophical analysis. Medicine, Health Care and Philosophy, 17(4), 529–540. https://doi.org/10.1007/s11019-014-9560-2

Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford Law Review, 43(6), 1241–1299.

Dotson, K. (2011). Tracking epistemic violence, tracking practices of silencing. Hypatia, 26(2), 236–257. https://doi.org/10.1111/j.1527-2001.2011.01177.x

Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.

Harding, S. (1991). Whose science? Whose knowledge? Thinking from women's lives. Cornell University Press.

Hobbs, R. (2021). Mind over media: Propaganda education for a digital age. W. W. Norton.

Kovach, M. (2009). Indigenous methodologies: Characteristics, conversations, and contexts. University of Toronto Press.

Medina, J. (2013). The epistemology of resistance: Gender and racial oppression, epistemic injustice, and resistant imaginations. Oxford University Press.

Nieminen, H. (2024). Why we need epistemic rights. In M. Aslama Horowitz, H. Nieminen, K. Lehtisaari, & A. D'Arma. (Eds.), Epistemic rights in the era of digital disruption. Global transformations in media and communication research (pp. 11-28). Palgrave Macmillan. https://doi.org/10.1007/978-3-031-45976-4_2

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Friday, May 2, 2025

Euphemism Use vs. Saying What You Really Mean

 


By Lilian H. Hill

 

Euphemisms are mild or indirect expressions that soften the harshness or bluntness of reality and often reflect cultural sensitivities and societal norms. While they can serve a compassionate role by helping to protect feelings or maintain social decorum, they are also used to conceal uncomfortable truths, obscure responsibility, or sanitize morally questionable actions. We often use this kind of language to be tactful, polite, and to reduce confrontation or negativity—cooperative strategies that are generally positive for communication. Avoiding a direct linguistic approach makes the language seem more neutral and objective, creating a sense of distance from personal involvement (Luu, 2020).

 

Unfortunately, euphemisms can be obfuscation tools in political, corporate, and everyday conversation. Governments might refer to civilian casualties as “collateral damage,” corporations may describe mass layoffs as “rightsizing,” and healthcare providers might call death a “negative health outcome.” These substitutions can distance the speaker and the audience from the emotional or ethical weight of what is being described, reducing the potential for public outrage, guilt, or resistance (Davis, 2025). One method of creating euphemisms is to use the passive voice so that the actor is concealed: constructions such as “she was found dead from the gunfire” or “mistakes were made” obscure who is responsible, making actions seem to occur without human agency. Other linguistic strategies, like existential constructions (“there was a shooting”) or transforming active verbs into impersonal nominalizations (such as “incarceration”), similarly deflect attention from the actors involved. Euphemisms may also be nebulous or long-winded, or may rely on non-specific comparisons. These techniques are common in technical jargon, often minimizing the perceived impact of the actions themselves (Luu, 2020).

 

Euphemisms can obscure racist intentions or actions, making them appear less overtly racist and, therefore, more palatable to a wider audience. Code words and euphemisms allow individuals to express racist ideas without explicitly using racist language, providing them with a degree of “plausible deniability.” The repeated use of euphemisms can normalize racist concepts, subtly reinforcing prejudiced attitudes and beliefs over time (Wexler, 2020). 

 

Euphemisms exist on a continuum: at one end, they are acts of empathy; at the other, acts of deception. Euphemisms can desensitize people or help authorities evade accountability by masking the real nature of events. The ethical tension lies in whether they are used to protect the vulnerable or to shield the powerful. Language can reshape our emotional, ethical, or political responses to serious issues. When euphemisms conceal, they don't simply reframe reality; they can fundamentally distort it. They create a linguistic buffer between action and consequence, potentially delaying necessary confrontation with injustice, failure, or harm. They can shape mentalities, societal values, and worldviews (Csathó, 2024). Over time, habitual euphemistic language can erode trust, making communication seem insincere or manipulative.

 

Influence of Euphemisms and Plain Language Compared

The following table compares the influence of euphemisms with plain language based on several aspects:


Words are powerful, and their impact has only grown as technological advancements make communication faster and more widespread.

 

Euphemism Use vs. Plain Language

Euphemisms are often used when speakers want to protect themselves from legal liability, political fallout, or public anger, or to minimize emotional disturbance for listeners. The cost is that they can erode public trust, obscure facts, and delay justice or informed decision-making. The Plain Language Program, formally approved by the U.S. government in the Plain Writing Act of 2010, aims to strip away bureaucratic jargon and misleading terms to make communication accessible and truthful. The mandate emphasizes writing that is clear, concise, and well-organized, and it directs agencies to avoid jargon, overly complex sentences, and bureaucratic terminology so that the intended audience can comprehend and act confidently on the information. Underlying this initiative is a commitment to government transparency, accountability, and accessibility because democracy depends on informed participation. Government agencies are now required to train staff, maintain compliance, and regularly review communications to meet plain language standards.

 

Confronting Euphemistic Speech

Confronting euphemistic language involves recognizing when words are being used to obscure meaning and actively working to uncover and name realities more directly. Euphemisms often arise when the truth is uncomfortable (e.g., war, injustice, racial discrimination, corporate failures, or public health crises). While they can soften emotional blows, they frequently serve to minimize accountability, urgency, or harm (Luu, 2020). The Associated Press updated its guidance to promote a stronger approach by encouraging reporters to directly identify racism and provide context, helping readers understand why a statement or system is considered racist. Relying on euphemisms weakens the message and can be especially damaging when used to soften the portrayal of racist remarks made by those in power.

 

Confronting euphemistic speech effectively requires the following actions:

  • Educate ourselves and others about the history and impact of euphemisms and coded language. 
  • Identify the euphemism. Listen for vague or softened expressions that seem to sidestep who is responsible or what really happened.
  • Ask clarifying questions. Push for concrete details: Who? What exactly? How much? When?
  • Restate plainly. Translate euphemistic phrases into clear, direct language to expose the core reality.
  • Name the stakes. Highlight why clarity matters for justice, public safety, informed decision-making, or ethical action.
  • Directly name and call out racism when it occurs, without resorting to euphemisms.
  • Challenge and disrupt the normalization of racist ideas and language.

 

Challenging euphemistic language is not just about semantics; it’s about reclaiming honest communication, promoting accountability, and ensuring serious issues are neither diluted nor ignored.

 

References

Csathó, Z. L. (2024, September 26). Euphemisms in everyday language: A linguistic perspective on their role in shaping thought, society, and therapeutic reframing. Medium. https://zitalucacsatho.medium.com/euphemisms-in-everyday-language-a-linguistic-perspective-on-their-role-in-shaping-thought-9f0d1b28653e

Davis, B. (2025, February 13). How to blur the lines: Euphemism and erosion. Democratic Erosion Consortium. https://democratic-erosion.org/2025/02/13/euphemism-and-erosion/

Luu, C. (2020, September 30). The ethical life of euphemisms. JSTOR Daily. https://daily.jstor.org/the-ethical-life-of-euphemisms/

Wexler, C. (2020, September 23). Mainstream media need to stop using euphemisms to describe Trump’s racism. Media Matters. https://www.mediamatters.org/new-york-times/mainstream-media-need-stop-using-euphemisms-describe-trumps-racism


 

 

 

