We are looking back at the past web iterations, reflecting on the present decentralized network, and exploring new possibilities of Web 3.0.


Today, we often marvel at how much Web 2.0 has changed the way we interact with others and the world. In the Web 1.0 stage, we consumed content passively, limited to one-way information dissemination. In the current Web 2.0 era, we have acquired a new identity, "prosumers": we are empowered to produce and publish user-generated content and to contribute our ideas to the global information base. The high level of participation and interactivity Web 2.0 delivers seems to create the illusion that we are free and unconstrained in this virtual world. That illusion, however, is eventually dashed once we understand that the power structure has only been masked, not dissolved. Our thinking patterns and behaviors are shaped, even manipulated, in ways that are difficult to perceive.
(VIDEO) Beware online “filter bubbles” | Eli Pariser
One notable phenomenon is the "filter bubble". As Eli Pariser defines it, your filter bubble is your "personal, unique universe of information", generated by the "personalization" function of algorithms. Imagine that while using Instagram, you constantly find yourself recommended "users you might want to follow" or "posts and photos you might like". These recommendations always satisfy your needs and grab your interest, so you keep clicking on them. Before long, you have unconsciously created a filter bubble on Instagram, where you see only the users you want to see and the information you are most likely to agree with.
Of course, this is not magic: the seemingly perfect matching and personalized recommendations are produced by algorithms. These algorithms sort, filter, and select content based on your previous clicks and behaviors, predict what you would like to see and agree with, and then recommend it to you. We all know that social media platforms and search engines rely on such algorithms, and as users we seem familiar enough with the process. The problem is that we do not know what gets into the bubble and what gets edited out. It is also hard to perceive that our thoughts are being steered by these algorithms, or more precisely, by the small group of companies that deploy them.
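To see how such a feedback loop can arise, consider a deliberately simplified sketch in Python. This is a toy model, not any platform's actual system: the feed is re-ranked purely by how often the user has clicked each topic before, so whatever you clicked yesterday crowds the top of today's feed.

```python
from collections import Counter

def rank_feed(posts, click_history):
    # Score each post by how many past clicks share its topic tag,
    # then show the highest-scoring posts first. Content from topics
    # the user never clicks sinks out of sight -- a filter bubble.
    clicks_per_topic = Counter(p["topic"] for p in click_history)
    return sorted(posts, key=lambda p: clicks_per_topic[p["topic"]], reverse=True)

history = [{"topic": "cats"}] * 5 + [{"topic": "politics"}]
feed = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "science"},
]
print([p["id"] for p in rank_feed(feed, history)])  # [2, 1, 3]
```

Real recommender systems are vastly more complex, but the self-reinforcing loop is the same: each click narrows what is shown next.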

This invisible control manifests in two ways: isolation and bias.
(VIDEO) How Filter Bubbles Isolate You
Isolation occurs when, inside a filter bubble, we are more likely to see desirable information and less likely to see undesirable information. This mechanism pushes us toward an extreme of subjectivity, since for long stretches we have no choice but to hear a single voice. As a result, we become increasingly unwilling to accept information and ideas that contradict our points of view, and we end up isolated in our own cultural and ideological coteries. But since anyone can become a content creator and information disseminator in the Web 2.0 era, can we be sure that the one voice we hear is more good than bad? What if that voice was deliberately crafted by those with greater power to shape our thoughts and subordinate us to their purposes?
This is where bias comes from.

Algorithms do not exist in nature; they are written and modified by the companies behind them. We therefore need to acknowledge that the selecting and filtering algorithms perform is not always transparent. Worse, the "personalization" function may become a tool for those with greater power to sway or even manipulate our viewpoints, particularly in politics. Presidential elections are a notable example. Voters now spend more time searching online for information about candidates, which gives search engines a chance to bias their judgment. Once a voter clicks on an entry supporting Candidate A, they will constantly be recommended similar content in favor of Candidate A, and will meanwhile see less opposing information, such as criticism of Candidate A or praise of Candidate B. Before long, this undecided voter has become a supporter of Candidate A. By adjusting search rankings and recommending selected content, search engines can invisibly favor particular candidates and shape public perception of an election. Worse still, as Epstein and Robertson found, "such manipulations are difficult to detect, and most people are relatively powerless when trying to resist sources of influence they cannot see."
As stated in the previous post, Web 3.0 is envisioned as a Web iteration without intermediaries. We are therefore curious whether, absent an intermediary with the power to manipulate and distort people's thoughts, Web 3.0 can reduce the negative effects of filter bubbles and empower individuals to view the world more comprehensively and without bias.

Another form of manipulation in the digital age is surveillance, most visibly in the form of censorship, and the advancement of Web 2.0 has in fact facilitated this political influence. Before the emergence of new media, censorship was enforced largely by the government directly. The Nazi Party, for instance, banned jazz as "degenerate music" and produced literature and films to propagate its ideology.
(VIDEO) Censorship in Nazi Germany
But now, Big Brother is not only watching us but also ready to silence us at any moment. An evident example is Weibo, the Chinese social media platform. Weibo is characterized by user-generated content, ease of use, and high interactivity, and thus fits the definition of Web 2.0. Though these features have created a participatory space, users' expression is still limited and manipulated. Each post is examined by Weibo's censorship apparatus, and sensitive content, such as politics, LGBTQ+ topics, or expressions of depression, is usually barred in various ways: some posts are held for approval, while others are published at first but later deleted or switched to private view. Influential users who initiate sensitive discussions may be prevented from posting anything at all, or have their accounts deactivated. Moreover, users' current location and their comments on political news are automatically displayed on their account page, which severely invades their privacy and indirectly warns them to speak cautiously online.
Unfortunately, people often internalize this power structure. Weibo users tend to substitute homophones for censored keywords to evade keyword-matching algorithms, for example using pinyin, English, or mirrored images of text in place of the original Chinese characters. Though it looks like an act of rebellion, this phenomenon in fact reflects how users discipline themselves under the pressure of data surveillance. Self-censorship is precisely one of the adverse consequences of centralized Web 2.0.
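This cat-and-mouse dynamic is easy to illustrate with a toy sketch of naive exact keyword matching. The blocked-word list below is invented for illustration, not drawn from any real system; a pinyin homophone passes the filter precisely because the match is literal.

```python
BLOCKED_KEYWORDS = {"敏感词"}  # "sensitive word" -- an invented placeholder, not a real list

def passes_review(post: str) -> bool:
    # Naive automated review: reject a post if any blocked keyword
    # appears verbatim. Real censorship systems are more sophisticated,
    # but exact matching is the baseline users learned to route around.
    return not any(word in post for word in BLOCKED_KEYWORDS)

print(passes_review("这篇帖子包含敏感词"))       # False: the literal keyword is caught
print(passes_review("这篇帖子包含 min gan ci"))  # True: the pinyin substitute slips through
```

That the evasion works at all shows where the burden falls: the platform's filter stays simple, and it is the users who carry the cost of rewording themselves.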

It is therefore reasonable to conclude that the seemingly open Web 2.0 harbors an invisible power structure: it renders users passive receivers and grants them only limited freedom to speak and disseminate. Essentially, Web 2.0 remains a machine that instills a singular ideology in people the better to control them.
Though people cannot entirely break away from the political system or from new media technologies, we need a more innovative network that restores subjectivity to its users. It is crucial to think about how Web 3.0 could serve as a tool to reduce the centralizing pressures exerted by governments. Then, rather than living as "Cyber Refugees," netizens could build their own realms and roam freely with their belongings.
Works Cited
Epstein, Robert, and Ronald E. Robertson. "The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections." Proceedings of the National Academy of Sciences, 4 Aug. 2015, https://www.pnas.org/doi/epdf/10.1073/pnas.1419828112
MacKinnon, Rebecca. “China’s Censorship 2.0: How Companies Censor Bloggers.” First Monday, 25 Jan. 2009, https://firstmonday.org/article/view/2378/2089
Pariser, Eli. “Beware online ‘filter bubbles’.” YouTube, uploaded by TED, 2 May 2011, https://youtu.be/B8ofWFx525s
“Censorship in Nazi Germany.” YouTube, https://www.youtube.com/watch?v=7Lm6U4vxEhI
“How Filter Bubbles Isolate You.” YouTube, uploaded by GCFLearnFree.org, 29 Nov. 2018, https://youtu.be/pT-k1kDIRnw