Political Influence & Misinformation

Beyond Personal Addiction

Social media isn’t just affecting individual mental health - it’s reshaping how society thinks, debates, and makes decisions. Understanding these mechanisms is essential for raising informed digital citizens.


How Algorithms Shape Reality

The Filter Bubble

When you engage with content (like, share, watch longer), the algorithm responds by showing you more content like it. Over time:

  1. You see content that confirms your existing views
  2. You rarely see opposing perspectives
  3. Your worldview becomes reinforced without challenge
  4. Alternative viewpoints seem increasingly foreign

The result: People with different political views literally see different “facts” and “realities.”
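
This feedback loop is easy to demonstrate in a toy model. The sketch below is hypothetical Python - not any platform's actual code - simulating a recommender that weights each round's feed by past engagement. Even a mild initial preference collapses the feed toward a single viewpoint.

```python
import random

VIEWPOINTS = ["A", "B", "C", "D"]  # four hypothetical political perspectives

def simulate_filter_bubble(rounds=50, feed_size=10, seed=1):
    random.seed(seed)
    # Engagement history starts balanced: one "click" per viewpoint.
    history = {v: 1 for v in VIEWPOINTS}
    # The simulated user only mildly prefers viewpoint "A".
    preference = {"A": 0.8, "B": 0.4, "C": 0.3, "D": 0.2}

    for _ in range(rounds):
        total = sum(history.values())
        # Recommender: build the feed in proportion to past engagement.
        feed = random.choices(
            VIEWPOINTS,
            weights=[history[v] / total for v in VIEWPOINTS],
            k=feed_size,
        )
        for item in feed:
            # Each engagement feeds back into the next round's weights.
            if random.random() < preference[item]:
                history[item] += 1
    return history

print(simulate_filter_bubble())
# Typical result: "A" accumulates far more engagement than B, C, and D
# combined, so the feed ends up dominated by the viewpoint the user
# already held - a mild preference becomes a near-exclusive diet.
```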

The Engagement Trap

Algorithms don’t optimize for truth or balance - they optimize for engagement. What gets engagement?

  • Emotional content (especially anger and fear)
  • Controversial opinions
  • “Us vs. them” framing
  • Simple explanations for complex problems
  • Outrage

Truth is often nuanced and boring. Lies can be exciting.
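
To make this concrete, here is a hypothetical ranking function of the kind an engagement-optimized feed might use (the weights and signal names are invented for illustration). Notice that no term rewards accuracy, so a false but inflammatory post outscores a true but nuanced one.

```python
def engagement_score(post):
    """Hypothetical ranking score: every term predicts engagement;
    none measures whether the post is true."""
    return (2.0 * post["predicted_anger"]     # anger drives comments/shares
            + 1.5 * post["predicted_fear"]    # fear drives clicks
            + 1.2 * post["controversy"]       # disagreement drives replies
            + 1.0 * post["novelty"]           # surprising claims get shared
            + 0.8 * post["simplicity"])       # easy takes spread faster

nuanced_truth = {"predicted_anger": 0.1, "predicted_fear": 0.1,
                 "controversy": 0.2, "novelty": 0.2, "simplicity": 0.3}
outrage_bait  = {"predicted_anger": 0.9, "predicted_fear": 0.7,
                 "controversy": 0.9, "novelty": 0.8, "simplicity": 0.9}

print(engagement_score(nuanced_truth))  # ~1.03
print(engagement_score(outrage_bait))   # ~5.45 - ranked far higher
```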


Misinformation Mechanics

Why Lies Spread Faster

MIT research on Twitter (Vosoughi, Roy & Aral, 2018) found:

  • False news reached 1,500 people about six times faster than true news
  • False news was 70% more likely to be retweeted than true news
  • The difference isn’t due to bots, which spread true and false news at the same rate - humans share misinformation faster

Why?

  • Novel information surprises us (even if false)
  • Emotional content compels sharing
  • Confirmation bias: we share what we already believe
  • Critical evaluation takes time we don’t spend

Types of Misinformation

Misinformation: False but shared without intent to deceive

  • “I heard that…” rumors
  • Misunderstood statistics
  • Outdated information

Disinformation: Deliberately false, created to deceive

  • Propaganda
  • Fake news sites
  • Manipulated images/videos

Malinformation: True but shared to cause harm

  • Private information leaked
  • Out-of-context quotes
  • Old events presented as current

Impact on Children and Teens

Identity Formation Period

Adolescence is when political and social views take shape. Exposure to algorithmically curated content during this period can:

  • Shape worldviews before critical thinking develops
  • Create strong partisan identities early
  • Make openness to other views feel like betrayal
  • Associate identity with online “team”

Emotional Vulnerability

Teen brains are:

  • More responsive to emotional content
  • Less able to critically evaluate sources
  • More influenced by peer (online) approval
  • Quicker to share without verifying

The Radicalization Pipeline

Studies have documented paths from mainstream content to extremism:

  1. Algorithm serves mainstream political content
  2. More engaging (extreme) content gets recommended
  3. User engagement increases with more extreme content
  4. Algorithm serves increasingly extreme content
  5. User ends up in radical echo chambers

This can happen in weeks or months.
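
The pipeline above can be sketched as a simple drift process. The toy model below (hypothetical Python; the drift rates are assumptions, not measurements) shows how small weekly nudges toward edgier content compound quickly - consistent with the weeks-to-months timeline.

```python
def weeks_to_extreme(step=0.05, threshold=0.9):
    """Toy model: content extremity runs 0.0 (mainstream) to 1.0 (extreme).
    Assumption: content slightly more extreme than the user's current
    position reliably earns a little more engagement, so the recommender
    nudges the feed outward and the user habituates each week."""
    position, weeks = 0.0, 0
    while position < threshold:
        position = min(position + step, 1.0)  # feed drifts; user follows
        weeks += 1
    return weeks

print(weeks_to_extreme())      # 18 weeks at a 5% weekly drift
print(weeks_to_extreme(0.1))   # 9 weeks at a 10% weekly drift
```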


Critical Thinking Skills

Question 1: Who Created This?

Teach children to ask:

  • Who made this content?
  • What’s their background?
  • Do they have expertise?
  • What might be their motivation?

Question 2: What’s the Evidence?

  • Are sources cited?
  • Can claims be verified elsewhere?
  • Is this one person’s opinion or documented fact?
  • What evidence would change this view?

Question 3: What’s Missing?

  • What’s the other side of this argument?
  • Who disagrees and why?
  • What context might be relevant?
  • Are there alternative explanations?

Question 4: Why Am I Seeing This?

  • Did I search for this or was it served to me?
  • What might the algorithm think about me?
  • Is this designed to make me emotional?
  • Would sharing this help spread truth or just outrage?

Practical Family Strategies

Regular “Algorithm Audits”

Periodically sit with your child and:

  • Look at their recommended content
  • Discuss why the algorithm might suggest it
  • Notice patterns in what gets recommended
  • Intentionally diversify what they engage with

“Consider the Source” Habit

Before sharing anything, ask:

  • Is this from a reliable source?
  • Have I seen this elsewhere?
  • Am I sharing because it’s true or because I want it to be true?

Expose to Multiple Perspectives

  • Discuss news from different sources together
  • Point out how the same event is covered differently
  • Model uncertainty: “I think X, but I could be wrong”
  • Praise changing one’s mind based on evidence

Create Thinking Time

Instant sharing prevents critical thinking:

  • Rule: Wait 24 hours before sharing anything political
  • Ask: “What would someone who disagrees say?”
  • Practice: “I need to think about this more”

Warning Signs

Mild Concern

  • Strong opinions with weak reasoning
  • Uses “everyone knows” or “they say” without sources
  • Gets news primarily from social media

Moderate Concern

  • Dismisses mainstream sources as biased without specifics
  • Only follows/engages with one political viewpoint
  • Becomes upset when views are questioned

Serious Concern

  • Believes in conspiracy theories
  • Sees “enemies” in groups of people
  • Unwilling to consider any opposing evidence
  • Social media is primary source of worldview

Age-Appropriate Conversations

Ages 8-10

  • “Not everything online is true”
  • Practice checking if fun facts are real
  • Simple discussion of “tricks” in advertising

Ages 11-13

  • How social media makes money (attention)
  • Why exciting lies spread faster than boring truths
  • Basic fact-checking habits

Ages 14-16

  • Algorithm mechanics and filter bubbles
  • How political opinions form
  • Recognizing manipulation techniques
  • The value of understanding opposing views

Ages 17+

  • Responsibility as information citizens
  • Influence of social media on democracy
  • Complex discussions about truth and media

Family Media Literacy Practices

Weekly “True or False” Game

  • Find viral stories together
  • Research if they’re true
  • Discuss how to tell
  • Make it fun, not preachy

“Newspaper Front Page” Test

Before sharing anything online, ask: “Would I be embarrassed if this appeared on a newspaper front page with my name attached?”

Model Good Behavior

  • Say “I was wrong about X” out loud
  • Share how you fact-checked something
  • Discuss your own filter bubble
  • Admit when something tricked you

Summary

Mechanism                 | Effect
--------------------------|--------------------------------------
Filter bubbles            | Creates separate realities
Engagement optimization   | Promotes emotional/extreme content
Speed of sharing          | Lies spread faster than truth
Adolescent vulnerability  | Forms views during a critical period
Radicalization pathways   | Can lead to extremism

Key insight: The goal isn’t to tell children what to think - it’s to teach them how to think critically about what they see online. This skill protects against manipulation from any direction.

Tip: Watch the video first, review the slides, then take the quiz to test your knowledge.