Subscribe to everin.eth
In the past decade, we have seen the rise of a new type of economy, one based on data and algorithms. This "data economy" is largely dominated by a handful of giant tech companies, such as Google, Amazon, Facebook and Apple. These companies have amassed vast troves of data on billions of users, and use this data to fuel their algorithms.
These algorithms have a profound impact on our lives, dictating what we see on social media, what products we buy on Amazon, and even which jobs we are offered. Yet, we have very little say in how they work. They are opaque, black-boxed systems, designed by a small group of people at the companies who own them.
This centralization of power is problematic for several reasons. First, it gives these tech companies an immense amount of control over our lives. Second, it concentrates power in the hands of a small number of people, who may not have our best interests at heart. And third, it leads to a lot of harmful decision-making, as these companies often prioritise their own profits over our well-being.
We need a way to take back control over these algorithms, and decentralise power away from these giant tech companies.
Algorithms play a fundamental role in modern life. They are the driving force behind search engines and social media platforms, shaping what we see and how we interact with the world. In a world of abundant information, algorithms have become essential for sifting through vast amounts of data to find the needle in the haystack.
The two main types of algorithms that we interact with today can be classified as:
Information Mediating: Algorithms that help us find the information we are looking for, such as search and curation. Examples are Google and Facebook.
Consumption Mediating: Algorithms that help us make decisions about physical and virtual consumption. Examples are Amazon and Netflix.
These algorithms are employed by the likes of Google, YouTube and Amazon.
Without algorithms, none of their businesses would work. Google's value proposition is to provide the best search results. YouTube's value proposition is to suggest content you'll find entertaining. Amazon's value proposition is to be a storefront that sells anything.
These are all things that the majority of users wouldn't want to give up.
Why would you go back to libraries if you can just type what you're looking for into a search engine and get results in 0.32 seconds?
Why would you watch TV if you can just browse through a never-ending list of content tailored to your specific taste?
Why would anyone buy from a brick-and-mortar store when Amazon can sell you anything you want with just a few clicks?
In short, algorithms have become an indispensable part of our lives. They make it possible for us to find the information and content that we need in an increasingly complex and noisy world.
Sadly, there are some severe downsides to this dependence on algorithms.
Algorithms, to the end-user, are a total black box. We don't know how they work. We don't know what criteria they use to rank results or show us content. All we see are the results that we are served.
We have no way of directly influencing the parameters that algorithms are optimizing for or the dataset they are trained on.
This lack of transparency & control can be problematic for a number of reasons.
First, it can lead to a filter bubble effect, where we only see content that confirms our existing beliefs and worldview. This can make it difficult for us to empathize with others or understand different points of view.
Second, algorithms can amplify the reach of fake news and conspiracy theories. The stories that propagate best are those that are viral or better at hooking our engagement instead of being true.
Third, algorithms can be biased. They can reflect the biases of their creators or the data that they are trained on.
Fourth, algorithms can be weaponized. They can be used to manipulate public opinion or interfere in elections. This is a major concern for democracy in the age of social media and fake news.
Finally, algorithms can have unintended consequences. They are often designed to achieve a specific goal, but they can end up having other effects that are not intended or anticipated. For example, Facebook's News Feed algorithm is designed to show users the content that maximizes time on site. But as a side effect, it also leads to increased feelings of envy and loneliness.
The fundamental problem with modern-day algorithms is very simple: they are optimizing for the goals of their owners, not their users.
They are built by businesses to fulfill their goals & objectives, not those of their customers.
Many of the 21st century's global problems can be linked to the influence machines have on human behaviour.
Attention spans are shrinking because algorithms hijack our nervous systems to keep us hooked with ultra-addictive content. A study by Microsoft claims that in just a few years our attention span shrank by 8 seconds, a drop of 25%!
Polarisation is driving us dangerously far apart, into territory where cooperation becomes almost impossible; some countries are even on the verge of civil war.

Algorithms are also great at manipulating us into buying piles of bullshit we don't need. A/B-optimised advertisements are highly effective at hijacking our brains, and recommendation engines learn exactly what to show us at just the right moment, so that saying "no" takes the maximum amount of willpower.
Algorithms align our actions with the objectives of the businesses that employ them, which is more often than not quite the opposite of what we actually want.
This is the fundamental problem with modern-day algorithms: they are not designed to help us, they are designed to control us.
To take back control of our algorithms, we need the ability to influence the goal that algorithms are optimising for.
Only then can we hope to use algorithms for our own benefit instead of being used by them.
On social media, I could choose to escape my filter bubble and see what people who are different from me are seeing. I could tell the algorithm to reduce stress instead of increasing anxiety, or configure it to optimise for global collective intelligence instead of maximum personal addiction.
When it comes to anything that influences my consumption behaviour, like advertisements or product search engines, a user-controlled algorithm could help me align my consumption decisions with my goals of saving money, improving my fitness or reducing my environmental footprint.
On a high level it is rather simple, but when it comes to building this new paradigm of user-controlled algorithms, the devil is in the details. Building the algorithm itself is not trivial, especially if the goal is to understand the different goals, objectives and values of diverse users. The infrastructure, in terms of data and the user interfaces for interacting with the algorithm, will also be challenging to build. And it is questionable whether most users would even want to engage with algorithms this closely, since most users today have only a limited awareness and understanding of them.
One plausible workaround could be a marketplace or exchange for algorithms and their configurations. This would allow a few power users to optimise for different outcomes and share those configurations with a broader audience.
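To make the marketplace idea concrete, here is a minimal sketch of what a shareable algorithm configuration could look like: a named preset of objective weights that a power user publishes and others can reuse. Every field name and value here is an illustrative assumption, not an existing standard.

```python
# A hypothetical, shareable algorithm configuration: a power user bundles
# objective weights into a named preset that others can adopt as-is.
from dataclasses import dataclass, field


@dataclass
class AlgorithmPreset:
    name: str
    author: str
    # Objective weights: how strongly the algorithm should favour each goal.
    weights: dict = field(default_factory=dict)

    def normalized_weights(self) -> dict:
        """Scale weights so they sum to 1, making presets comparable."""
        total = sum(self.weights.values())
        if total == 0:
            return {}
        return {goal: w / total for goal, w in self.weights.items()}


# A power user publishes a "sustainable shopper" preset that values
# sustainability twice as much as price or health.
preset = AlgorithmPreset(
    name="sustainable-shopper",
    author="alice",
    weights={"price": 1.0, "sustainability": 2.0, "health": 1.0},
)

print(preset.normalized_weights())
# {'price': 0.25, 'sustainability': 0.5, 'health': 0.25}
```

Treating a configuration as plain data, rather than code, is what would make it easy to audit, compare and exchange on such a marketplace.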
When we intend to buy anything, we may have quite a large array of known and unknown requirements that we'd like to consider. Let's take food, for example. We want to be healthy, which means we want to avoid certain industrial ingredients, but we also want the food we buy to reflect our values, like sustainability and fair trade. We may also value not being overcharged and getting the best price for a product. And in the end, the food also needs to taste good. That's a whole lot of variables to keep track of. Making decisions in accordance with our true intentions is therefore extremely difficult.
But what if there was a way to make this process easier? What if we could create an algorithm that would help us make better food choices that reflect our values and meet our requirements?
Well-designed algorithms are perfectly suited to help consumers make decisions that match their intentions. We can use algorithms not only to make our lives easier but also to help us make better choices for the environment, society and ourselves.
In a world with user-controlled algorithms, we would be able to design algorithms that help us find the best food according to our requirements. For example, we could create an algorithm that takes into account our budget, location, values and health goals to find the best food options for us.
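A user-controlled food algorithm like this could, in its simplest form, be a weighted scoring function over product metadata. The sketch below is a made-up illustration: the products, attribute names and weights are assumptions, and a real system would depend on verified metadata rather than hand-written dictionaries.

```python
# A minimal sketch of a user-controlled ranking algorithm: the user, not the
# platform, supplies the objective weights that decide what "best" means.

def score(product: dict, weights: dict) -> float:
    """Weighted sum of product attributes; higher means better aligned."""
    return sum(weights.get(attr, 0.0) * value
               for attr, value in product["attributes"].items())


def rank(products: list, weights: dict) -> list:
    """Order products by how well they match the user's stated goals."""
    return sorted(products, key=lambda p: score(p, weights), reverse=True)


# Attributes are normalised to 0..1 (e.g. 1.0 = cheapest / most sustainable).
products = [
    {"name": "imported berries",
     "attributes": {"price": 0.9, "sustainability": 0.2, "health": 0.8}},
    {"name": "local apples",
     "attributes": {"price": 0.7, "sustainability": 0.9, "health": 0.7}},
]

# A user who cares most about sustainability raises that weight.
weights = {"price": 1.0, "sustainability": 3.0, "health": 1.0}
print(rank(products, weights)[0]["name"])  # local apples
```

The point of the sketch is that changing a single weight changes the ranking: the same catalogue serves a bargain hunter and a sustainability-focused shopper differently, under the user's control.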
As users control the algorithms and decide which goals to optimize for, their consumption will effortlessly be directed to the most aligned products, based on the most trustworthy metadata. As product metadata that aligns with user goals becomes increasingly important for consumption decisions, the products and supply chains that best match users' specific needs, goals and values will outcompete those that don't. Investments in branding, advertising and superficial greenwashing campaigns will dry up, and capital will flow to the activities that produce the metadata a product needs to rank well in the ocean of user-controlled algorithms.
This is how bottom-up regulation emerges. Companies will need to make better products that are, not only on a surface level but deeply and verifiably, more in line with their end-users' goals and values, or lose market share and go out of business. The expression of our goals and values will have a direct impact on the global economy if a sufficient number of users decide to make their consumption decisions via these new interfaces.
Algorithms are powerful. Like any other major technology, they can be used for good or for evil.
Business-controlled algorithms are destroying our brains, polarising our society and arguably even accelerating the destruction of our ecosystems and planet.
By switching to a paradigm of user-controlled algorithms, we can take back control and start using those powerful engines for our own benefits instead.
We can optimise our feeds for truth rather than outrage, and our consumption for personal success instead of corporate greed.
Bottom-up regulation is a logical consequence of user-controlled algorithms. When users can decide which objectives their consumption decisions optimise for, the result is the creation of goods and supply chains that meet customers' requirements, goals, and values.
Rather than corporations investing in sophisticated manipulation strategies like advertising and branding to manufacture demand, they'll need to invest in improving their goods and services so that these are more aligned with their clients' objectives and values, or risk losing market share to firms that do.