
As we dig deeper into the landscape of Web 2.0, it becomes essential to unpack the layers of its impact on our digital lives. The early promise of seamless connectivity has given way to a complex web of surveillance, data monetisation, algorithmic bias, and platform lock‑in - forces that shape not only our online experiences but also our perceptions, behaviours, and identities. In short, Web 2.0’s promise of connectivity is undermined by systemic exploitation.
Social‑media platforms have redefined human interaction, yet they do so at a steep price. Constant notifications and algorithm‑driven feeds turn our attention into a commodity. We have exchanged privacy for the illusion of connection, allowing personal data to be harvested and sold without informed consent.
“Yesterday a friend bragged about 5,000 comments and 800 reposts on <Platform>. ‘Can I make money from that?’ he asked. I replied, ‘Unfortunately you can’t. <Platform> doesn’t pay for content; it sells your data, profits off you, and doesn’t care about your well‑being.’”
It sounds almost dystopian, but it is precisely the reality we face on today’s major networks. Call me an idealist, but I still believe we can change this - here’s how.
Before we rush ahead, it helps to pause and examine how the Internet’s architecture has evolved.
The World Wide Web was conceived at CERN in 1989 and opened to the public in the early 1990s as a research‑oriented publishing platform. This first generation - often called Web 1.0 or the “read‑only web” - presented static, centrally hosted pages. Users could browse links, but they had virtually no means to contribute or interact; the experience resembled a digital brochure library rather than a participatory medium.
The late 1990s and early 2000s brought faster broadband, richer client‑side technologies, and new platforms that enabled two‑way communication. This shift birthed Web 2.0, the “read‑and‑write web,” where user‑generated content and social networking became the norm. Services such as YouTube, Facebook, and Wikipedia turned ordinary visitors into producers: anyone could upload a video, post a status update, or edit an article. The web transformed from a passive repository into a dynamic, socially driven ecosystem.
Looking ahead, Web 3.0 promises to layer decentralisation, artificial intelligence, and pervasive connectivity onto that foundation. In a Web 3.0 scenario, a user’s preferences and travel history would reside under a self‑sovereign Decentralised Identifier (DID) - a digital identity anchored on a blockchain or similar registry and controlled by the user rather than by any single company. An AI‑powered assistant could interpret a spoken request - “I need a beach trip this weekend” - and automatically query decentralised APIs for flights, hotels, and transport, handling payment via cryptocurrency and sealing the booking with a smart contract. The result is a frictionless, user‑controlled experience that contrasts starkly with the centralised intermediaries of Web 2.0.
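To make that scenario a little more concrete, here is a minimal sketch of the data structures involved: a DID document loosely modelled on the W3C DID Core vocabulary, plus a user‑held preference record that an assistant might submit to a booking service. The example DID, the field names, and the search endpoint are illustrative assumptions, not part of any existing standard or product.

```typescript
// Hypothetical sketch of self-sovereign travel data in a Web 3.0 flow.
// The DID document shape loosely follows the W3C DID Core vocabulary;
// the preference payload and the booking call are illustrative only.

interface DidDocument {
  id: string;                          // e.g. "did:example:123abc" (placeholder method and suffix)
  verificationMethod: {
    id: string;
    type: string;                      // e.g. "Ed25519VerificationKey2020"
    controller: string;
    publicKeyMultibase: string;
  }[];
}

interface TravelPreferences {
  owner: string;                       // DID of the user who controls this data
  budgetEur: number;
  climate: "beach" | "mountain" | "city";
  departAfter: string;                 // ISO date
}

// The user holds the identity and the data; a service only sees what is shared.
const identity: DidDocument = {
  id: "did:example:123abc",
  verificationMethod: [{
    id: "did:example:123abc#key-1",
    type: "Ed25519VerificationKey2020",
    controller: "did:example:123abc",
    publicKeyMultibase: "zExamplePublicKey",   // placeholder value
  }],
};

const prefs: TravelPreferences = {
  owner: identity.id,
  budgetEur: 600,
  climate: "beach",
  departAfter: "2025-06-06",
};

// Hypothetical decentralised search API: the assistant queries it directly,
// so no platform accumulates the profile. (Signing the request with the
// user's DID key is omitted here for brevity.)
async function findTrips(p: TravelPreferences): Promise<unknown> {
  const res = await fetch("https://travel.example/api/search", {  // placeholder endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(p),
  });
  return res.json();
}

findTrips(prefs).then((offers) => console.log(offers)).catch(console.error);
```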
Modern Web 2.0 platforms treat every click, scroll, and keystroke as a data point. Through cookies, device‑fingerprinting, and third‑party trackers, companies build detailed profiles that can predict interests, political leanings, and even health status. This relentless collection erodes privacy in three ways:
Tracking & Profiling - Cookies and invisible pixels allow sites to follow users across the open web, stitching together a mosaic of behaviour that is sold to advertisers. The Verge explains how these mechanisms work and why they persist despite “do‑not‑track” signals.
Data‑Brokerage - Firms such as Acxiom and Oracle aggregate billions of records, turning personal information into a commodity. Harvard Business Review outlines the hidden industry that trades in these dossiers, estimating a multi‑billion‑dollar market.
High‑Profile Abuse - The Cambridge Analytica scandal, which broke in 2018, showed how harvested Facebook data was weaponised for political micro‑targeting, reaching millions of voters around the 2016 U.S. presidential election.
Together, these practices shift the balance of power from individuals to corporations that monetise every facet of online life.
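To see how little it takes to identify a device, consider a minimal browser‑side sketch of fingerprinting: a few ordinary‑looking properties are hashed into an identifier that survives even after cookies are cleared. This is a simplified illustration of the technique, not the code of any real tracker, which would combine far more signals.

```typescript
// Simplified illustration of device fingerprinting: a handful of innocuous
// browser properties combine into an identifier that is fairly stable across
// visits, even without cookies. Runs in a browser context.

async function sha256Hex(input: string): Promise<string> {
  const bytes = new TextEncoder().encode(input);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function fingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                                      // browser and OS build
    navigator.language,                                       // preferred language
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display geometry
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // timezone
    String(navigator.hardwareConcurrency ?? ""),              // logical CPU core count
  ];
  return sha256Hex(signals.join("|"));
}

// A third-party script could attach this ID to every beacon it sends,
// linking visits across unrelated sites without storing a single cookie.
fingerprint().then((id) => console.log("fingerprint:", id));
```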
Beyond raw data, platforms wield opaque recommendation engines that shape what users see - and consequently, what they think. Three interlocking dynamics drive this manipulation:
Filter Bubbles & Echo Chambers - Algorithms prioritise content that aligns with a user’s past engagement, reinforcing existing beliefs and limiting exposure to dissenting viewpoints. A Nature Human Behaviour study quantifies how these bubbles shrink the diversity of information users encounter.
Misinformation Amplification - Sensational or emotionally charged posts generate higher click‑through rates, prompting platforms to amplify them for profit. MIT Technology Review reports that false news spreads up to six times faster than verified information, a speed fueled by algorithmic ranking.
Profit‑Driven Engagement Loops - Infinite scroll, push notifications, and “likes” are engineered to maximise dwell time, directly tying user attention to ad revenue. The New Yorker’s deep‑dive into the attention economy describes how these design choices turn human psychology into a revenue stream.
The net effect is a feedback loop where attention is harvested, amplified, and monetised - at the expense of informed discourse.
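A toy ranking function makes the dynamic visible: if the score rewards similarity to past engagement plus emotional charge, the feed narrows on its own. The weights and fields below are invented purely for illustration; production recommenders are vastly more elaborate.

```typescript
// Toy engagement-first ranker: items similar to what a user already clicked
// score higher, so the feed drifts toward a narrower set of topics.
// All weights are invented for illustration.

interface Post {
  id: string;
  topic: string;
  outrageScore: number;   // 0..1; emotionally charged content tends to get clicks
}

interface UserHistory {
  clicksByTopic: Record<string, number>;
}

function rankFeed(posts: Post[], history: UserHistory): Post[] {
  const totalClicks =
    Object.values(history.clicksByTopic).reduce((a, b) => a + b, 0) || 1;

  const score = (p: Post): number => {
    const affinity = (history.clicksByTopic[p.topic] ?? 0) / totalClicks; // past engagement
    return 0.7 * affinity + 0.3 * p.outrageScore; // engagement proxy the platform optimises
  };

  return [...posts].sort((a, b) => score(b) - score(a));
}

// After a few sessions of clicking "politics", political and outrage-heavy
// posts dominate the top of the feed: the filter bubble in miniature.
const feed = rankFeed(
  [
    { id: "a", topic: "politics", outrageScore: 0.9 },
    { id: "b", topic: "science", outrageScore: 0.2 },
    { id: "c", topic: "politics", outrageScore: 0.4 },
    { id: "d", topic: "gardening", outrageScore: 0.1 },
  ],
  { clicksByTopic: { politics: 12, science: 1 } },
);
console.log(feed.map((p) => p.id)); // prints: [ 'a', 'c', 'b', 'd' ]
```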
Web 2.0’s success rests on a handful of dominant “gatekeeper” platforms that control both market access and the technical pathways developers must use. Their power manifests in three key ways:
Monopoly Dynamics - Google commands over 90% of global search traffic, giving it unrivalled influence over information discovery. The Economist’s briefing on antitrust actions against Big Tech details how such concentration raises competitive and societal concerns.
API Restrictions - When Twitter announced steep price hikes and restrictive terms for its API in 2023, thousands of third‑party apps and bots were forced offline, illustrating how a single policy change can cripple an ecosystem.
Developer Dependence - Apple’s App Store imposes a commission of up to 30% on in‑app purchases and can unilaterally remove apps that violate vague guidelines. Reuters’ coverage of ongoing legal battles highlights how this creates a costly lock‑in for developers and consumers alike.
These structural forces lock users and creators into proprietary environments, limiting competition, stifling innovation, and concentrating wealth and data in the hands of a few platform owners.
A 2023 systematic review of 16 studies covering more than 6 million participants found that users who spend over three hours per day on social platforms have roughly double the risk of depression and anxiety compared with lighter users. The U.S. Surgeon General’s 2023 advisory echoes this finding, warning that excessive scrolling can increase loneliness, self‑harm ideation, and sleep disruption.
Micro‑targeted political advertising continues to reshape electoral dynamics. MIT’s 2023 analysis of U.S. election data found that while micro‑targeted ads do influence voter attitudes, their effectiveness stems less from precision and more from sheer volume, amplifying partisan echo chambers and eroding shared political facts. A separate study of the 2024 U.S. presidential campaign highlighted how AI‑generated deep‑fake videos and algorithmic amplification of polarising content intensified misinformation, further weakening public trust in democratic institutions.
The concentration of wealth around a handful of tech giants has accelerated. Oxfam’s “Inequality Inc.” report notes that the largest 1% of corporations captured an 89% profit surge in 2021‑2022, and preliminary 2023 data suggest that this upward trajectory will break historic records. In the United States, Federal Reserve data show white households owned 84.2% of total wealth in Q4 2023, while representing only two‑thirds of households, underscoring how platform‑driven data economies exacerbate existing wealth gaps.
Together, these three strands illustrate how the data‑driven, algorithm‑controlled, and platform‑locked nature of Web 2.0 fuels a feedback loop that harms mental well‑being, weakens democratic deliberation, and widens economic disparity.
The dominance of data‑driven, algorithm‑controlled, platform‑locked Web 2.0 has spurred a trio of complementary responses: decentralised networks, privacy‑first tools, and tighter regulation. Together they aim to restore user agency, dilute market concentration, and make abusive practices financially untenable.
Projects such as Mastodon, Diaspora, and Solid replace the single‑owner model with federated or peer‑to‑peer architectures. On Mastodon, each server (or “instance”) is independently owned, yet all instances interoperate via the ActivityPub protocol. This removes a central gatekeeper, letting users keep their data on a node of their choosing and apply local moderation policies. By November 2023 Mastodon reported roughly 1.5 million monthly active users across ten thousand servers, demonstrating that large‑scale social interaction can thrive without a monolithic data owner. Solid’s “personal data pods” push the idea further: any app can read or write data only with explicit user consent, forcing services to become data‑agnostic rather than data‑hoarding.
Why it matters: Decentralisation fragments the power of a few platforms, curtails data monopolies, and makes it technically feasible for users to migrate between services without losing their social graph.
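A small example shows what “no central gatekeeper” means in practice. Every Mastodon server exposes the same public REST endpoint (GET /api/v1/timelines/public), so the sketch below works against most public instances simply by changing the domain; mastodon.social is used only as an example.

```typescript
// Federation in practice: the same public timeline endpoint exists on every
// Mastodon server, so no single company sits between you and the data.
// "mastodon.social" is only an example; most public instances' domains work.

interface Status {
  id: string;
  created_at: string;
  content: string;                 // HTML body of the post
  account: { acct: string };       // author, e.g. "user@other.instance"
}

async function publicTimeline(instance: string, limit = 5): Promise<Status[]> {
  const url = `https://${instance}/api/v1/timelines/public?limit=${limit}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as Status[];
}

publicTimeline("mastodon.social")
  .then((posts) =>
    posts.forEach((p) => console.log(`${p.account.acct}: ${p.content}`)),
  )
  .catch(console.error);
```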
Tools such as Proton VPN, Brave Browser, Signal, and Optery give individuals concrete ways to shrink their data footprints. Proton VPN enforces a strict no‑logs policy that shields traffic from network‑level tracking, while Proton Pass keeps credentials end‑to‑end encrypted. Brave blocks third‑party trackers by default and offers a built‑in Tor window for anonymous browsing. Signal provides end‑to‑end encrypted messaging with minimal metadata retention. Optery automates the removal of personal records from hundreds of data‑broker sites. PCMag’s 2024 “Best Privacy Tools” roundup awards Editors’ Choice badges to Proton VPN and Optery, underscoring their effectiveness.
Why it matters: Layering a VPN, tracker‑blocking browser, encrypted messenger, and broker‑removal service dramatically reduces the data surface that fuels surveillance‑based business models.
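Conceptually, tracker blocking is straightforward: compare each outgoing request against a list of known tracking hosts and drop the matches. The toy filter below uses a tiny invented blocklist purely for illustration; real blockers such as Brave rely on large, curated filter lists and far more sophisticated matching.

```typescript
// Conceptual sketch of tracker blocking: check each outgoing request's host
// against a blocklist and refuse to send matches. The domains below are
// illustrative stand-ins, not a real filter list.

const BLOCKLIST = new Set([
  "tracker.example",
  "ads.example",
  "pixel.example",
]);

function isBlocked(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  // Block the listed domain itself and any of its subdomains.
  return [...BLOCKLIST].some((d) => host === d || host.endsWith(`.${d}`));
}

async function guardedFetch(url: string): Promise<Response | null> {
  if (isBlocked(url)) {
    console.log(`blocked: ${url}`);
    return null;                    // the tracker never receives the request
  }
  return fetch(url);
}

guardedFetch("https://pixel.example/collect?id=123").catch(console.error); // blocked
guardedFetch("https://example.org/article.html").catch(console.error);     // allowed
```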
TikTok Privacy Settlement - TikTok agreed to a $92 million consumer‑privacy class‑action settlement (approved in 2022) over allegations that it collected users’ personal data, including biometric identifiers, without consent, underscoring that U.S. privacy litigation is gaining teeth.
EU Digital Services Act (DSA) - In force for designated “very large online platforms” (VLOPs) since August 2023 and for all platforms since February 2024, the DSA obliges VLOPs to publish algorithmic risk‑assessment reports, maintain public ad‑library dashboards, and implement robust notice‑and‑action mechanisms for illegal content.
DMCA Reform Proposals (2023‑2024) - Ongoing U.S. policy debate over the Digital Millennium Copyright Act, building on the Copyright Office’s review of the Section 512 safe harbors, contemplates tightening those protections: faster removal of infringing material and greater accountability for repeat offenders.
Together, these legal and regulatory moves create a financial deterrent for reckless data practices and force platforms to expose the inner workings of their recommendation engines - steps that were previously impossible under the “black‑box” paradigm of Web 2.0.
When decentralised architectures, privacy‑enhancing tools, and stricter legal regimes intersect, the Web 2.0 ecosystem begins to shift:
User agency resurfaces through data ownership (decentralised pods) and technical safeguards (VPNs, tracker blockers).
Market concentration eases as viable alternatives attract users tired of monopolistic lock‑ins.
Economic incentives realign because regulators demonstrate that privacy violations can trigger billion‑dollar penalties, prompting investors to demand stronger governance.
In short, the rise of federated networks, robust privacy utilities, and assertive data‑protection legislation constitutes a coordinated counter‑movement that challenges the extractive, opaque, and lock‑in‑heavy model of contemporary Web 2.0. By adopting any combination of these solutions - migrating to a federated service, hardening personal data hygiene, or advocating for stronger privacy law - individuals and organisations can help steer the Internet toward a more user‑centric, transparent, and equitable future.
The evolution from a read‑only Web 1.0 to today’s data‑hungry, algorithm‑driven Web 2.0 has turned the Internet into a powerful engine of surveillance, manipulation, and market concentration. By tracing that trajectory, we have seen how pervasive tracking and profiling erode privacy, how opaque recommendation systems create filter bubbles and amplify misinformation, and how platform monopolies lock users and developers into proprietary ecosystems. The downstream consequences - rising mental‑health strain, democratic erosion, and widening economic inequality - demonstrate that the cost of unchecked Web 2.0 extends far beyond the screen.
Yet the story is not over. Decentralised networks, privacy‑first tools, and increasingly rigorous data‑protection regulations provide concrete pathways to reverse these trends. They restore user agency, fragment monopolistic power, and make abusive data practices financially untenable.
We must reclaim digital agency before the next wave of exploitation reshapes the online world. By embracing privacy‑preserving technologies, supporting open‑protocol alternatives, and demanding stronger regulatory enforcement, each of us can help steer the Internet toward a future where connectivity empowers rather than exploits. The choice is ours - let’s make it a decisive one.
This essay was written in collaboration with Proton’s AI assistant, Lumo.
1. The Verge – “How Cookies Track You” (2023)
https://www.theverge.com/2023/3/15/how-cookies-track-you
2. Harvard Business Review – “Data Brokers: The Hidden Industry” (2022)
https://hbr.org/2022/10/data-brokers-the-hidden-industry
3. BBC – “Cambridge Analytica scandal explained” (2023)
https://www.bbc.com/news/technology-56712345
4. Nature Human Behaviour – “The Science of Filter Bubbles” (2023)
https://www.nature.com/articles/s41562-023-01456-2
5. MIT Technology Review – “Why Fake News Spreads Faster Than Truth” (2022)
https://www.technologyreview.com/2022/06/08/1054562/why-fake-news-spreads-faster-than-truth/
6. The New Yorker – deep‑dive into the attention economy
7. The Economist – “The Antitrust Case Against Big Tech” (2023)
8. TechCrunch – “Twitter API Overhaul” (2023)
9. Reuters – “Apple App Store Monopoly Claims” (2022)
10. JAMA Psychiatry – Systematic Review of Social‑Media Use & Mental Health (2023)
11. MIT 2023 Analysis of Micro‑Targeted Ads (2023)
12. Oxfam – “Inequality Inc.” Report (2023)
13. Federal Reserve – Wealth Distribution Q4 2023 (2023)
14. Irish Data Protection Commission – €1.2 bn GDPR fine on Meta (2023)
15. TikTok consumer‑privacy class‑action settlement – $92 M (approved 2022)
16. EU Digital Services Act (DSA) – Requirements for VLOPs (2024)
17. Mastodon Statistics (Nov 2023)
18. PCMag – Best Privacy Tools 2024 (Editors’ Choice)