
Thursday, February 13, 2025

Digital Architecture of Disinformation

 

By Lilian H. Hill

 

Fake news and disinformation are not new, but their rapid spread is unprecedented. Many individuals struggle to distinguish between real and fake news online, leading to widespread confusion (Hetler, 2025). Disinformation architecture refers to the systematic and strategic methods used to create, spread, and amplify false or misleading information. It involves a combination of technology, human effort, and coordinated tactics to manipulate public opinion, sow discord, or achieve specific political or social goals. This architecture leverages technology, social networks, and psychological manipulation to shape public perception, influence behavior, or achieve specific objectives, such as political, financial, or ideological gains.

 

Gal (2024) stated that over the last few decades, social media platforms have transformed from basic networking sites into influential entities that shape public opinion, sway elections, affect public health, and influence social cohesion. For example, during the recent U.S. presidential election, platforms like X played a key role in disseminating both accurate information and misinformation, mobilizing voters, and affecting turnout. Likewise, during the COVID-19 pandemic, social media was instrumental in sharing public health guidelines but also became a hotspot for misinformation about vaccines and treatments.

 

Bossetta (2024) stated that a platform's digital architecture, meaning the technical framework that facilitates, restricts, and shapes user behavior online, influences political communication on social media. In general, this refers to what platforms enable, prevent, and structure in online communication, for example through likes, comments, retweets, and sharing. Ong and Cabañes (2018) commented that the basic blueprint of political disinformation campaigns strongly resembles corporate branding strategy. However, political disinformation requires its purveyors to make moral compromises, including distributing revisionist history, silencing political opponents, and hijacking news media attention.

 

The primary goals of disinformation campaigns are political manipulation, social division, economic gains, and the erosion of trust in institutions such as the media, science, and democracy. Their impacts are far-reaching, leading to increased polarization, manipulation of democratic processes, reputational damage, and harm to individuals' mental well-being (Bossetta, 2018).

 

Influence of Disinformation Architecture

Disinformation has far-reaching consequences, including the erosion of trust in key institutions such as journalism, science, and governance. By spreading misleading narratives, it undermines public confidence in credible sources of information. Additionally, disinformation fuels polarization by deepening societal divisions and promoting extreme or one-sided perspectives, making constructive dialogue more difficult. It also plays a significant role in manipulating democracies, influencing elections and policy debates through deceptive tactics that mislead voters and policymakers. Beyond its societal impacts, disinformation can cause direct harm to individuals by targeting their reputations, personal safety, and mental well-being, often leading to harassment, misinformation-driven fear, and public distrust.

 

Components of Disinformation Architecture

Disinformation architecture consists of several key components that manipulate public perception. It begins with reconnaissance, where the target audience and environment are analyzed to tailor the disinformation campaign effectively. Once this understanding is established, the necessary infrastructure is built, including creating believable personas, social media accounts, and groups to disseminate false information. Content creation follows, ensuring a continuous flow of misleading materials such as posts, memes, videos, and articles that support the disinformation narrative.

 

The core aspects of disinformation architecture include content creation, amplification channels, psychological tactics, targeting and segmentation, infrastructure support, and feedback loops. Content creation involves fabricating fake news, manipulating media, and employing deepfake technology to mislead audiences. Amplification is achieved through social media platforms, bot networks, and echo chambers that reinforce biased narratives. Psychological tactics exploit emotions, cognitive biases, and perceived authority to gain trust and engagement. Targeting and segmentation enable microtargeting strategies, exploiting demographic vulnerabilities to maximize influence. Infrastructure support includes data harvesting, dark web resources, and monetization channels that sustain disinformation campaigns. Feedback loops ensure that engagement algorithms prioritize viral and sensationalist content, keeping misinformation in circulation.
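The feedback-loop component described above can be pictured with a small simulation. This is a deliberately simplified sketch, not any platform's actual algorithm: the post fields, the scoring formula, and the `exposure_boost` parameter are all invented for illustration.

```python
# Toy model of an engagement feedback loop: posts that attract more
# engagement rank higher, higher rank brings more exposure, and more
# exposure generates still more engagement.
def rank_feed(posts):
    """Order posts by raw engagement (likes + shares + comments)."""
    return sorted(
        posts,
        key=lambda p: p["likes"] + p["shares"] + p["comments"],
        reverse=True,
    )

def simulate_round(posts, exposure_boost=0.1):
    """Each round, higher-ranked posts gain proportionally more new likes."""
    ranked = rank_feed(posts)
    for position, post in enumerate(ranked):
        # The feedback loop: rank position determines new engagement.
        post["likes"] += int(post["likes"] * exposure_boost * (len(ranked) - position))

posts = [
    {"id": "sober-report", "likes": 100, "shares": 10, "comments": 5},
    {"id": "sensational-claim", "likes": 120, "shares": 80, "comments": 40},
]
for _ in range(3):
    simulate_round(posts)
# After a few rounds, the initially more-engaging post pulls further ahead.
```

Even with a small initial edge, the sensational post's lead compounds every round, which is the dynamic that keeps viral misinformation in circulation.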

 

Amplification is crucial in spreading this content widely, utilizing bots, algorithms, and social-engineering techniques to maximize reach. Engagement is then sustained through interactions that deepen the impact of disinformation, often through trolling or disruptive tactics. Eventually, mobilization occurs, where unwitting users are encouraged to take action, leading to real-world consequences.

 

Mitigation of Disinformation Architecture

To mitigate disinformation, several strategies must be implemented. Regulation and policy measures should enforce platform transparency rules and penalize the deliberate spread of harmful content. According to Gal (2024), because social media platforms play an increasingly central role in information dissemination, ensuring the integrity of that information has become more urgent than ever, making discussions about regulation essential. Given their profound influence on nearly every aspect of society, these platforms should be treated as critical infrastructure—like energy grids and water supply systems—and subject to the same level of scrutiny and regulation to safeguard information integrity. Just as a power grid failure can cause widespread disruption, large-scale social media manipulation can erode democratic processes, hinder public health initiatives, and weaken social trust.

 

Technological solutions like AI-driven detection systems and verification tools can help identify and flag false information. Public awareness efforts should promote media literacy, encouraging individuals to critically evaluate information and question sensationalist narratives (Hetler, 2025). Finally, platform responsibility must be strengthened by modifying algorithms to prioritize credible sources and enhancing content moderation to limit the spread of disinformation. Understanding these mechanisms is essential to developing effective countermeasures against the growing threat of disinformation in the digital age.

 

References

Bossetta, M. (2018). The digital architectures of social media: Comparing political campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. election. Journalism and Mass Communication Quarterly, 95(2), 471–496. https://doi.org/10.1177/1077699018763307

Bossetta, M. (2024, October 16). Digital architecture, social engineering, and networked disinformation on social media. EU Disinfo Lab. Retrieved from https://www.disinfo.eu/outreach/our-webinars/webinar-digital-architectures-social-engineering-and-networked-disinformation-with-michael-bossetta/

Gal, U. (2024, November 17). Want to combat online misinformation? Regulate the architecture of social media platforms, not their content. ABC. Retrieved from https://www.abc.net.au/religion/uri-gal-online-misinformation-democracy-social-media-algorithms/104591278

Hetler, A. (2025, January 7). 11 ways to spot disinformation on social media. TechTarget. Retrieved from https://www.techtarget.com/whatis/feature/10-ways-to-spot-disinformation-on-social-media

Ong, J. C., & Cabañes, J. V. A. (2018). The architecture of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines. The Newton Tech4Dev Network. Retrieved from https://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf


Thursday, February 6, 2025

Digital Architecture of Social Media Platforms


 

By: Lilian H. Hill

 

The architecture of an environment is known to influence human behavior. The relationship between structure and agency extends beyond physical spaces and encompasses how individuals engage with and navigate online environments (Bossetta, 2018). How social media platforms are designed and mediated varies, and these differences influence people’s online activities. For example, some social media platforms favor visual communication, while others favor textual communication.

Bossetta (2018) divided the digital architecture of social media platforms into four key categories:

 

1. Network Structure refers to the way connections between accounts are established and sustained. Social media enables users to connect with peers (“Friends” on Facebook, “Followers” on X [formerly known as Twitter]), as well as with public figures, brands, or organizations, which often operate specialized accounts with advanced tools (e.g., Facebook Pages, Instagram Business Profiles).

 

This structure influences three key aspects:

  1. Searchability – How users discover and follow new accounts.
  2. Connectivity – The process of forming connections. For example, Facebook’s mutual Friend model mirrors offline networks, while X’s one-way following system fosters networks with weaker real-life ties.
  3. Privacy – Users' control over search visibility and connection interactions. Snapchat prioritizes private ties, while platforms like Instagram and X default to open networks but allow customizable privacy settings.

 

These elements shape the platform’s network dynamics, user relationships, and the content generated (Bossetta, 2018).

 

2. Functionality defines how content is mediated, accessed, and distributed on social media platforms. It encompasses five key components:

  1. Hardware Access – Platforms are accessed via devices such as mobile phones, tablets, desktops, and wearables, which influence user behavior. For instance, tweets sent from desktops tend to show more civility than those sent from mobile devices.
  2. Graphical User Interface (GUI) – The visual interface shapes navigation, homepage design, and interaction tools like social buttons (e.g., X Retweets, Facebook Shares), simplifying content sharing.
  3. Broadcast Feed – Aggregates and displays content, varying in centralization (e.g., Facebook's News Feed) and interaction methods (e.g., scrolling vs. click-to-open).
  4. Supported Media – Includes supported formats (text, images, videos, GIFs), size limits (character counts, video length), and hyperlinking rules.
  5. Cross-Platform Integration – Enables sharing of the same content across multiple platforms.

 

These elements shape content creation, network behavior, and platform norms, influencing user expectations and interactions. Political actors, for example, must align with platform-specific norms to avoid appearing out-of-touch or inauthentic, which could harm their credibility and electability.

 

3. Algorithmic Filtering determines how platforms prioritize the selection, sequence, and visibility of posts. This involves three key concepts:

  1. Reach – How far a post spreads across feeds or networks, which algorithms can enhance or restrict.
  2. Override – Pay-to-promote services, like Facebook's "boosting," allow users to bypass algorithms and extend a post's reach.
  3. Policy – Platform policies on fact-checking are subject to change, which can permit the spread of fake news.

 

These factors are most relevant on platforms with one-to-many broadcast feeds (e.g., Facebook, X, Instagram). Platforms focused on one-to-one messaging (e.g., Snapchat, WhatsApp) are less affected by algorithmic filtering. However, when algorithms dictate content visibility, they influence users' perceptions of culture, news, and politics.
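The Reach and Override concepts above can be sketched in a few lines. The scoring formula, the weights, and the `boosted` flag are assumptions invented for illustration; real platform ranking systems are proprietary and far more complex.

```python
def visibility_score(post, boost_multiplier=5.0):
    """Toy ranking: engagement determines organic reach, while a paid
    'boost' overrides the organic ranking with a multiplier."""
    organic = post["likes"] + 2 * post["shares"]  # shares weighted higher
    if post.get("boosted"):
        return organic * boost_multiplier  # pay-to-promote override
    return organic

feed = [
    {"id": "organic-hit", "likes": 500, "shares": 100, "boosted": False},
    {"id": "paid-promo", "likes": 50, "shares": 20, "boosted": True},
]
feed.sort(key=visibility_score, reverse=True)
# organic-hit scores 700; paid-promo scores 90 * 5 = 450. The organic
# post still wins here, but a larger multiplier or budget flips the order.
```

The point of the sketch is that the override sits outside the organic signal entirely: a sufficiently funded campaign can buy visibility that engagement alone would never earn.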

 

4. Datafication is how user interactions are transformed into data points for modeling. Every social media interaction leaves digital traces that can be used for advertising, market research, or improving platform algorithms. Maintaining a social media presence in political campaigns is less about direct interaction with voters and more about leveraging user data. Campaigns can analyze digital traces to inform persuasion and mobilization strategies.
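Datafication can be pictured as aggregating interaction traces into a profile used for targeting. This is a minimal sketch; the trace fields and topic labels are hypothetical, not drawn from any real platform's data model.

```python
from collections import Counter

# Every interaction leaves a digital trace: (user, action, topic).
traces = [
    ("u1", "like", "climate"), ("u1", "share", "climate"),
    ("u1", "like", "economy"), ("u2", "like", "sports"),
]

def build_profile(user, traces):
    """Aggregate one user's traces into topic counts: the raw material
    for ad targeting or mobilization modeling."""
    return Counter(topic for u, _action, topic in traces if u == user)

profile = build_profile("u1", traces)
# profile.most_common(1) identifies the topic most likely to engage u1.
```

A campaign never needs to interact with "u1" directly; the aggregated profile alone is enough to decide which persuasive message to serve.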

 

Kent and Taylor (2021) commented that the design of many social media platforms limits meaningful discussions on complex issues. Deep, deliberative debates on complex problems like climate change or economic inequality are difficult on platforms optimized for advertising and data monetization.


References

Bossetta, M. (2018). The digital architectures of social media: Comparing political campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. election. Journalism and Mass Communication Quarterly, 95(2), 471–496. https://doi.org/10.1177/1077699018763307

Kent, M. L., & Taylor, M. (2021). Fostering dialogic engagement: Toward an architecture of social media for social change. Social Media + Society, 7(1). https://orcid.org/0000-0001-5370-1896

