Polarized by Design
How technology's global influence is dividing us politically, and why the fight over data privacy could decide democracy's future.
I woke up yesterday to the news of a U.S. military operation in Venezuela and the reported exfiltration of Nicolás Maduro and his wife to face trial in New York. Almost instantly, the political script played out online. Right-wing influencers flooded timelines with memes and celebrations. Left-wing voices erupted in outrage over the violation of national sovereignty and rushed to defend Maduro. Centrist commentators took a more ambivalent stance, rejecting the dictatorship while also refusing to endorse one sovereign nation intervening militarily in another.
At first glance, this has nothing to do with privacy or data protection. But watching this reaction unfold in real time made one thing impossible to ignore: polarization has become the default operating system of our digital public sphere. And that polarization doesn't stay confined to geopolitics or ideology. It shapes how technology is built, how platforms moderate speech, how data is collected and weaponized, and ultimately how privacy itself is interpreted, defended, or dismissed depending on which side of the divide you stand on.
Well, it's easy to check. Open your social media feed and you'll step into a political echo chamber of your own making. The news you see and the ads you get are algorithmically tailored to your clicks and likes, reinforcing what you already believe. From the United States to Europe and beyond, technology has turbocharged political polarization.
Partisan cable news outlets (Fox News on the right, MSNBC on the left) were early drivers of this divide, but social networks have taken it to a new level. By design, platforms like Facebook, TikTok, and YouTube maximize engagement, serving up content that entertains, shocks, and outrages each user. The result is a vicious cycle of confirmation bias: we're each living in our own personalized digital tribe, rarely encountering the other side except as caricatures.
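To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python. The topic labels and the scoring function are invented, and this is not any platform's actual ranking code; it just shows what "maximize engagement" tends to mean in practice: score posts by how much they resemble what you already reacted to, and show the highest scorers first.

```python
# Hypothetical sketch of an engagement-ranked feed. Not any platform's real
# algorithm; just an illustration of why optimizing for predicted engagement
# narrows what you see.
from collections import Counter

def predicted_engagement(post_topics, user_history):
    """Score a post by how often the user engaged with its topics before."""
    counts = Counter(topic for topics in user_history for topic in topics)
    return sum(counts[t] for t in post_topics)

def rank_feed(posts, user_history, top_n=3):
    """Return the posts most likely to be clicked, liked, or argued with."""
    return sorted(
        posts,
        key=lambda p: predicted_engagement(p["topics"], user_history),
        reverse=True,
    )[:top_n]

if __name__ == "__main__":
    # A user who mostly reacted to outrage-adjacent political posts.
    history = [["immigration", "outrage"], ["immigration"], ["elections", "outrage"]]
    posts = [
        {"id": 1, "topics": ["immigration", "outrage"]},
        {"id": 2, "topics": ["local_news"]},
        {"id": 3, "topics": ["science"]},
        {"id": 4, "topics": ["elections", "outrage"]},
    ]
    # The feed keeps serving more of what already provoked a reaction.
    print([p["id"] for p in rank_feed(posts, history)])
```

Every click on those ranked results gets appended to the history, so the next ranking skews even harder toward the same topics. That is the whole loop.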
I think X might be the exception here, maybe because it leans into the conflict rather than softening it?
People are losing trust in government and the media because each side assumes the other is lying. Online shouting matches on apps like X and WhatsApp have replaced calm discussion. Because of this, politicians find it hard to work together, and even routine governance is breaking down. Sometimes this online anger spills into real violence: we saw people using social media to organize riots in Washington, Brazil, and India. Social media platforms help spread extreme ideas and fake stories because those are what get the clicks. As one expert said:
"These companies make money from our anger, and our democracy is the one paying for it."
I said "one expert" because I can't remember who said it and I couldn't find the source, so maybe I never actually heard it, but I think I did. :D
A big part of this problem is our personal data.
Political groups study our online lives to send us very specific messages. In the famous Cambridge Analytica scandal, a company harvested data from millions of Facebook users without their permission to influence elections. This showed how easily our information can be used to manipulate what we think. We also know that Russian groups bought Facebook ads to stoke conflict during the 2016 U.S. election. By splitting voters into tiny groups, this technology breaks our shared reality. Even if the news isn't fake, it pushes politicians to speak only to their fans instead of trying to build consensus.
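As a toy illustration (the profile fields, segments, and messages below are all invented, and real operations work from far richer behavioral data), micro-targeting boils down to this: put each voter in a narrow bucket and show each bucket a different "top priority" from the same campaign.

```python
# Toy illustration of political micro-targeting with made-up profile fields
# and messages. The mechanic shown: one electorate, many tailored realities.
from collections import defaultdict

VOTERS = [
    {"name": "A", "age": 22, "interests": {"climate"}, "region": "urban"},
    {"name": "B", "age": 67, "interests": {"pensions"}, "region": "rural"},
    {"name": "C", "age": 45, "interests": {"crime"}, "region": "suburban"},
]

def segment(voter):
    """Assign each voter to a narrow bucket based on profile attributes."""
    if "climate" in voter["interests"] and voter["age"] < 30:
        return "young_green"
    if voter["region"] == "rural" and voter["age"] > 60:
        return "rural_senior"
    return "suburban_security"

MESSAGES = {
    "young_green": "Only we will tax the polluters.",
    "rural_senior": "Only we will protect your pension.",
    "suburban_security": "Only we will hire more police.",
}

def campaign(voters):
    """Group voters by segment and attach the message each group will see."""
    groups = defaultdict(list)
    for voter in voters:
        groups[segment(voter)].append(voter["name"])
    return {seg: (names, MESSAGES[seg]) for seg, names in groups.items()}

if __name__ == "__main__":
    # Each segment sees a different "top priority" from the same campaign.
    for seg, (names, message) in campaign(VOTERS).items():
        print(seg, names, "->", message)
```

Nobody in this example is shown an outright lie, but nobody sees the same campaign either, and that is exactly the fragmentation of shared reality I mean.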
Well, isn't it obvious? The system rewards conflict because it is easier to make people angry than to find common ground.
How can we stop technology from dividing us?
One way is through better privacy laws. (Thaaaat's what I was trying to get to.)
Europe is leading the way. In 2018, the European Union's General Data Protection Regulation (GDPR) came into force, treating data protection as a fundamental right. Among other things, it limits how companies can collect and reuse our data for targeted ads and political profiling.
(Yeah I know you know about it, but intro needed)
European leaders hope that if companies can't quietly hoard our data, it will be harder for them to manipulate us. The United States, however, has been slow to pass comparable rules. Many Americans prioritize free speech and worry more about government overreach than about what tech companies do.
This creates a "privacy gap" in which U.S. tech companies can do largely whatever they want with our data, for profit and for politics. Meanwhile, countries like China use the same technology to surveil people and suppress dissent.
I'm from Brazil, and we do have a federal privacy law, the LGPD. Is it perfect? Absolutely not yet, but it's a step closer to what's needed.
In the end, technology is both good and bad for democracy. It connects the world, but it also makes the loudest and angriest voices the most powerful. I feel like I should put that sentence on my wall.
Our public conversation is now shaped by opaque algorithms written in Silicon Valley. As a programmer and privacy professional, I used to be happy about how the internet brought people together; now I worry about how it pulls us apart.
I believe that this division isn't a mistake; it is exactly how the system was built to work.
And please, I'm not one of those people who sees a conspiracy in everything, but this algorithmic division seems so obvious to me that it bothers me that the mainstream media doesn't discuss it. Is there a reason for that?
I can think of four main reasons:
1 - Technical Complexity: Explaining exactly how machine-learning recommendation algorithms work is hard to fit into a quick news segment.
2 - Conflict of Interest: Many major media companies depend on platforms like Facebook, X, and Google for traffic and ad revenue.
3 - The Pace of Legislation: The debate keeps getting stuck on legal fights over 'free speech' versus 'censorship,' so the algorithms themselves rarely get examined.
4 - The Journalist 'Bubble': Journalists are stuck in these bubbles too. Often, what looks like 'public opinion' to them is just the algorithm showing them exactly what they expect to see.
Until we change the system or create strong privacy laws, we will stay stuck in this cycle of anger. The future of our democracy depends on taking back our data and making sure technology helps society instead of hurting it.
Let's finish this post with Linkin Park (RIP Chester):
I tried so hard and got so far
But in the end, it doesn't even matter.




The Cambridge Analytica example really underscores how data weaponization works at scale. That observation about the system rewarding conflict because anger is easier than common ground is uncomfortable but accurate. I worked in tech for a while, and the "engagement metrics über alles" mindset was everywhere; nobody wanted to acknowledge the downstream effects. The point about journalists being stuck in their own algorithmic bubbles, as an explanation for why mainstream media doesn't cover this more, is really insightful.