MEV (Miner Extractable Value) can be understood as the total value miners can extract at the expense of users by arbitrarily reordering, including, or excluding transactions within a block. Simply put, miners determine the order in which transactions are processed on the blockchain and can exploit that power to their advantage. "Miner extractable value" was first defined by Phil Daian, a smart-contracts researcher. That framing has since proven too narrow, and the term is now usually expanded as Maximal Extractable Value (MEV). Sunny Aggarwal, the founder of Osmosis, has further reframed it as Proposer Extractable Value (PEV).
In this economic game we see several kinds of participants: miners, searchers, solvers, users, and application and protocol developers. MEV cannot be eliminated outright, but participants can negotiate its terms to reduce the value miners can extract, under the different assumptions of different protocols and ecosystems. The miners mentioned above are generally the block producers; in other ecosystems they are known as validators (as in ETH 2.0), whose main responsibilities are producing blocks, adding or removing transactions, ordering transactions, and so on. Users are the individuals or organizations that generate economic value on the network: trading on a decentralized exchange, adding liquidity to a pool, posting collateral to a lending position, minting an NFT on a marketplace, and so on. These activities create a sustained supply of extractable value. Searchers are parties that express transaction-ordering preferences, or take part in encryption and decryption, in order to capture some of that value. Dapps such as wallets create some of these MEV games through their design: the way they work and the incentives they set up for their users and for searchers, for example by sending transactions directly to miners. Protocol developers are an interesting group, because they set the base rules of these games; they create the structure and the abilities that let parties such as block producers take certain actions, which ultimately gives birth to MEV.
Before diving deeper into MEV, we know miners can produce blocks, include or exclude transactions, and order transactions. What else can they do? Miners can also manipulate block timestamps, through "randomness" manipulation, timejacking, and other techniques. Take the Bitcoin network as an example: Bitcoin's difficulty adjustment was initially based on timestamps alone. An attacker could lower the difficulty by forking the chain and forging timestamps, allowing them to produce blocks more quickly; since blocks were selected under the longest-chain rule, all rewards would go to the attacker. The community therefore changed the rule so that the chain with the most computing power, i.e. the most cumulative difficulty, is the main chain, avoiding the problem. Block proposers can also censor consensus votes and manipulate voting power, through techniques such as selfish mining or BFT vote censorship. In PoW, a miner who has privately mined a chain longer than the public one can withhold it and keep mining on their own chain; the analogue in BFT consensus is vote censorship. Beyond that, relay providers like Flashbots let block proposers read transactions sent outside the public mempool. And of course, proposers still control the addition and removal of transactions and their ordering, as mentioned above.
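The timestamp attack on difficulty can be made concrete with a small sketch. This is a simplified model (Bitcoin's real retargeting also uses median-time-past and other rules), showing how forged timestamps that overstate a retarget window's duration drag difficulty down for the forger's fork:

```python
# Sketch of Bitcoin-style difficulty retargeting (simplified; real clients
# also apply median-time-past rules that this toy model omits).
TARGET_SPAN = 2016 * 600  # 2016 blocks at 10 minutes each, in seconds

def retarget(old_difficulty: float, first_ts: int, last_ts: int) -> float:
    """New difficulty after one 2016-block window, from its boundary timestamps."""
    actual_span = max(last_ts - first_ts, 1)
    # Clamp as Bitcoin does: at most 4x easier or harder per window.
    actual_span = min(max(actual_span, TARGET_SPAN // 4), TARGET_SPAN * 4)
    return old_difficulty * TARGET_SPAN / actual_span

# Honest timestamps: the window took exactly two weeks -> difficulty unchanged.
honest = retarget(100.0, 0, TARGET_SPAN)

# Forged timestamps claiming the window took 8x longer -> difficulty drops to
# the 4x clamp, letting the forger mine blocks far faster on their fork.
forged = retarget(100.0, 0, TARGET_SPAN * 8)
```

Under the old longest-chain rule, the low-difficulty fork could outpace the honest chain in block count; counting cumulative work instead closes this hole.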

With regard to transaction-based MEV, we can categorize it into censorship manipulation and ordering-based manipulation, and transaction ordering itself comes in two forms: absolute and relative. Absolute ordering means your transaction does not depend on another transaction in the block; it needs an absolute position in the block. For example, when a liquidation becomes available, someone has to trigger the liquidation mechanism and profits by doing so, and block proposers can guarantee they are always first to do it and capture that extractable value. Osmosis has proposed solutions to restrict this, but it is not the current focus, because such transactions are net positive: they provide a service to the network and profit without extracting value from everyone else. Osmosis prefers to address the latter category, relative ordering. When you base your transaction on someone else's, we call it frontrunning (which also includes backrunning). The sandwich attack, for example, combines both: the attacker spots a pending transaction buying an asset and purchases that asset before the trader's order lands. Although the trader submitted first, the attacker's transaction is prioritized and broadcast ahead of it, which pushes up the asset's price, and the attacker then sells at the higher price into the trader's order. Such transactions sacrifice other participants' interests for the attacker's own benefit, and can cause serious problems: loss of user funds, instability of the consensus mechanism, soaring gas fees, and centralization of block production.
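The sandwich mechanics above can be worked through on a toy constant-product AMM (the pool sizes and trade amounts here are arbitrary illustrations, not data from any real pool):

```python
# Toy constant-product AMM (x*y = k, no fee) showing why a sandwich attack
# pays: the attacker buys before the victim and sells after, so the victim's
# trade executes at a worse price and funds the attacker's profit.
def swap(reserves, i, dx):
    """Sell dx of asset i into the pool; return the amount of the other asset."""
    j = 1 - i
    k = reserves[i] * reserves[j]
    reserves[i] += dx
    out = reserves[j] - k / reserves[i]
    reserves[j] -= out
    return out

pool = [1_000_000.0, 1_000_000.0]        # reserves of X and Y

front = swap(pool, 0, 10_000)            # attacker front-runs: buys Y with X
victim_out = swap(pool, 0, 50_000)       # victim's buy now gets fewer Y per X
back = swap(pool, 1, front)              # attacker sells Y back at inflated price

profit = back - 10_000                   # attacker's gain in X
```

Comparing `victim_out` against the same trade on an untouched pool shows exactly how much the victim lost to the attacker's profit (minus rounding).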
In terms of the right to read transactions, we can encrypt them as they enter the mempool. The mempool is a cache for pending transactions; it performs an initial validity check on incoming transactions and filters out invalid ones. There are several ways to encrypt at the mempool layer. One is SGX (trusted hardware). A second is timelock encryption: transactions are encrypted such that anyone can trustlessly decrypt them after running a sequential computation for a few minutes. The third is threshold decryption: a weighted set of participants collectively holds the decryption capability, and any 2/3 of them together can decrypt a transaction. The trust assumption here is quite similar to Tendermint's PoS assumption: if we trust 2/3 of the nodes to act collectively honestly, the network is not attacked (and misbehavior can be slashed); under that same assumption, we can trust them with decryption. For PoS and Tendermint, then, the "community" matters a great deal. Timelock encryption, by contrast, does not rely on a community, but you must ensure that everyone participates and uses it, that transactions stay locked for the required time, and that they are not decrypted while still in the mempool; if the latency is too long, user experience suffers. Threshold cryptography, which is the MEV technique Osmosis uses and which we discuss later, lets validators choose whether or not to participate, giving them high flexibility.
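The threshold idea can be sketched with plain Shamir secret sharing. This is an illustration of the 2/3 principle only, with a 2-of-3 toy standing in for the validator-weighted 2/3 threshold, and a fixed number in place of a random coefficient; it is not the scheme Osmosis actually runs:

```python
# Minimal Shamir secret-sharing sketch: any 2 of 3 shares recover the
# decryption key; a single share alone reveals nothing about it.
P = 2**61 - 1  # Mersenne prime used as the field modulus

def make_shares(secret, coeffs, n):
    """Split `secret` among n parties; any len(coeffs)+1 shares recover it."""
    poly = [secret] + list(coeffs)
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

key = 123456789
shares = make_shares(key, coeffs=[42], n=3)  # 42 stands in for a random coefficient
```

Any two of the three shares reconstruct `key`; one share alone yields an unrelated value, which is exactly the property the 2/3 trust assumption relies on.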
As for timelock encryption, I think it may be best applied to non-trade activities such as adding or removing liquidity. Many bots, for example, have started a different kind of sandwich: they jump in front of a trade and add a large amount of liquidity, let the trade execute, and then immediately remove their liquidity from the pair. It is liquidity provided just in time and withdrawn just as quickly. The user actually gets better execution than they otherwise would, but the bot captures the fees that user's trade generates. Here the MEV comes at the expense of liquidity providers rather than of users, which is a super interesting twist on the sandwich game.
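The fee dilution in this just-in-time game is easy to quantify under a simple pro-rata fee model (the numbers and the 0.3% fee rate below are illustrative assumptions, not figures from any specific pool):

```python
# Rough sketch of just-in-time (JIT) liquidity: a bot deposits a large amount
# of liquidity right before a big trade, captures most of the swap fee
# pro rata, and withdraws immediately, leaving passive LPs with a sliver.
def fee_split(passive_liquidity, jit_liquidity, trade_size, fee_rate=0.003):
    total_fee = trade_size * fee_rate
    total_liq = passive_liquidity + jit_liquidity
    return (total_fee * passive_liquidity / total_liq,  # passive LPs' cut
            total_fee * jit_liquidity / total_liq)      # JIT bot's cut

passive_cut, jit_cut = fee_split(1_000_000, 9_000_000, trade_size=500_000)
# The bot supplies 90% of in-range liquidity for a single block, so it takes
# 90% of the fee: trader execution improves, passive LPs are diluted.
```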
Osmosis has instead adopted threshold cryptography. Different public chains take different architectural approaches to mempool encryption. Solana, for example, has no public mempool; its private mempools are visible only to validators. Relays like Flashbots are not entirely private either: a bundle bypasses the public mempool and goes straight to the miner, but the miner, Flashbots itself, Flashbots API users, and participating nodes can all see the bundle and the transactions in it. So it is not truly private, but it is direct-to-miner, and it is private from sandwich bots and the other bots watching the public mempool. It also remains a relatively centralized solution, although Flashbots is considering alternatives to help decentralize the network. For this reason, many long-tail strategies do not pass through Flashbots; instead they are packaged into bundles through protocols such as ArcherDAO or BloxRoute, or sent directly to miners.
Flashbots is a relay system that facilitates communication between searchers and miners in the ecosystem. Through a private channel, searchers group transactions into bundles and send them to miners via Flashbots. A bundle is a group of transactions that a searcher submits to a miner; those transactions can be the searcher's own, or someone else's found in the pending mempool or another source, and a bundle must be executed in exactly the order provided. Flashbots also runs private auctions between the counterparties: a searcher can either bid up the fee or pay the miner/validator directly, with no upfront transaction fees.
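The shape of such a bundle submission can be sketched as JSON-RPC. The field names follow Flashbots' public `eth_sendBundle` format, but the signed transactions and block number here are placeholder values, and a real call additionally requires signing the request body (the `X-Flashbots-Signature` header), which this sketch omits:

```python
import json

# Hedged sketch of a Flashbots eth_sendBundle request body. The raw signed
# transactions and target block below are placeholders, not real data.
def make_bundle_request(signed_txs, target_block):
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_sendBundle",
        "params": [{
            "txs": signed_txs,                 # raw signed txs, in execution order
            "blockNumber": hex(target_block),  # the only block this bundle targets
        }],
    })

body = make_bundle_request(["0xf86b...", "0xf86c..."], 17_000_000)
```

The `txs` ordering is what gives a bundle its atomic-sequence property: the miner either includes the whole list in that order or not at all.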
Flashbots is a rather centralized organization with only a few nodes, so you must trust those nodes in order to route a transaction through it. Before Flashbots forwards bundles to miners, it prioritizes senders (searchers) using a reputation system that splits the relay into a high-priority and a low-priority queue. If Flashbots has an on-chain record, within its relay, of you regularly submitting high-quality bundles, you are placed in the high-priority queue; a new searcher with no record ends up in the low-priority queue. This dramatically reduces the impact of DDoS attacks (only the low-priority queue can be flooded), so higher-quality transaction bundles still reach miners.
According to Flashbots' latest roadmap, it will use SGX to achieve a higher degree of privacy and decentralization, while continuing to iterate on the existing private mempools. Miners can run their own SGX mempools that any user can use; SGX can be understood here as a way to create private mempools. SGX does impose a relatively high security budget on attackers: breaking it might require, say, ten top engineers and a $10 million budget. But that is nothing next to the size of the MEV market, and an attacker only needs to break one SGX enclave and extract a key to break the mempool.
Osmosis seems to be realizing our MEV endgame. Will the future competition for MEV split block production outright, with the game miners play between attestation and the sequencing of transactions in a block becoming fully outsourced? There would then be a division of labor between the two: miners focus entirely on attestation, and a marketplace lets them choose the best block contents. It has always felt to me that the way it is done right now with bundles is basically a stopgap on the way to the real solution, which is an actual block-template market.
Osmosis tries to achieve this through threshold encryption. Transactions are encrypted as they enter the mempool (before being included in a block), which hides the transaction contents and prevents validators from ordering or censoring them. The private key needed for decryption is secret-shared among all validators, and any 2/3 of them together can decrypt. When the block is confirmed, its transactions are decrypted nearly simultaneously.
This MEV protection is specific to Osmosis because it is a sovereign application chain built on the Cosmos SDK and running Tendermint Core (BFT consensus). Tendermint has no concurrent block proposals: a new block cannot be proposed before the current proposal is finalized by a 2/3 vote of the validators. In PoW, by contrast, two miners can win the probabilistic lottery at the same time and, because of network propagation delays, neither may know the other has won. Block contents are decrypted once they are broadcast across the network, and since all block producers take part in decryption at the same block, the protocol ensures that no producer gains an unfair MEV advantage over the others. This is much harder to achieve in a PoW network, where blocks are proposed concurrently and finality only emerges over time; after decryption, the network would still be exposed to MEV through possible reorgs (reorg attacks, time-bandit attacks, or double-spend attacks).
Moreover, threshold encryption addresses the latency problem, because blocks are not executed until they are finalized; we have seen this high-latency concern raised for Osmosis. A block proposer normally produces the block and adds and sequences transactions, while validators can be understood as the voters in Tendermint. Osmosis is building something called vote extensions, which extend Tendermint's functionality: each validator can contribute additional data along with its consensus vote, so encrypted material can be shared. Under the concept of joint proposals, each validator can include certain transactions in its vote, and the next proposer is required to include those transactions in its proposal. There will of course be overlapping transactions, and the proposer may add transactions of its own, but it must at least include the ones the validators added. This prevents a monopoly in which one proposer alone controls inclusion, and lets every validator contribute to inclusion a little. Here we separate the block producer from the validators, and separate inclusion from ordering: the set of transactions included and their order can be voted on separately. We hope to achieve a higher degree of duty segregation in PoS. A smaller degree of this division of labor already exists, with miners separated into mining pools (PoW), or into pool operators (ETH 2.0) and the workers who add transactions into bundles via Flashbots. We hope to see more such roles in the future.
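The joint-proposal inclusion rule can be sketched as a simple set union. This is an illustration of the idea only, not the actual Osmosis or Tendermint implementation; the function name and transaction labels are hypothetical:

```python
# Illustrative sketch of "joint proposals": each validator attaches a set of
# transactions to its vote, and the next proposer must include at least the
# union of those sets, so no single proposer controls inclusion.
def joint_proposal(vote_extensions, proposer_extra=()):
    """Union of every validator's mandated txs plus the proposer's own,
    de-duplicated while keeping first-seen order."""
    seen, block = set(), []
    for txs in list(vote_extensions) + [list(proposer_extra)]:
        for tx in txs:
            if tx not in seen:
                seen.add(tx)
                block.append(tx)
    return block

block = joint_proposal([["tx_a", "tx_b"], ["tx_b", "tx_c"]],
                       proposer_extra=["tx_d"])
# Overlaps collapse, every validator's transactions make it in, and the
# proposer can only add transactions, not censor them.
```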
Osmosis creates bundle atomicity and prevents miners from seeing transaction contents. Block producers are required to focus entirely on attestation, which significantly improves privacy and ensures bundles cannot be unbundled. In the long run this yields a more efficient marketplace for transactions: bundle atomicity is guaranteed at the protocol layer, and transactions can be sequenced at the protocol layer as well.
Through joint proposals, we can select the best individual transactions and place them in a mempool, the prototype of a future block market where the best block contents are selected. Flashbots is a market for bundles: transaction bundles for miners to include. Will Osmosis be able to create a market for block-template bundles? A block-template bundle, to coin a term, is the combination of several transactions with a block template: effectively one block's worth of transactions. Someone could then propose several block templates in a row to replace blocks forming on the Ethereum chain today. We strongly believe the current bundle-of-transactions design is just a stepping stone, and that a larger marketplace for block-template bundles may emerge. As Sunny Aggarwal put it: Flashbots tries to reduce the side effects of MEV, such as high fees and consensus instability, while Osmosis tries to solve the problem of MEV itself.
We have also seen other mitigations. Uniswap V3 lets liquidity providers concentrate their liquidity within chosen price ranges, which is far more capital-efficient and gives traders much deeper liquidity inside those ranges. Users can therefore set tighter slippage and suffer less price impact, leaving sandwich bots less room to front-run them, push them to their slippage limits, and extract value. Secondly, the large amount of capital required raises the barrier to entry for bots: they risk losing it, being unbundled, or being stuck in a large position when they want to liquidate it. Thirdly, the data is harder to reason about and work with. Another example is CowSwap, which gives users the best execution price via off-chain logic run by solvers. In extreme cases, however, say a $1 billion trade, it is difficult to prevent a solver from interfering with the trade and capturing incentives beyond what it earns on median trades. MEV will always exist; the question is how to reduce its occurrence, and in particular the frequency of relative-ordering attacks such as sandwiches that seriously harm users and network participants. Permissionless MEV from arbitrage and liquidation transactions, by contrast, we consider a stable business model. We have also seen that many top MEV extractors are top miners, and the degree of centralization is very high: Ethermine holds about 30% of the network's hashing power, with F2Pool and Sparkpool also among the top players. Beyond the application-specific layer, we are also looking for solutions in the consensus layer and the P2P layer. Osmosis' threshold encryption is a consensus-layer solution, for example, and fair transaction ordering can also be enforced at the consensus layer.
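The "tighter slippage" defense comes down to a minimum-output check of the kind deeper in-range liquidity makes practical. A minimal sketch, with hypothetical numbers, of why a tight tolerance caps how far a sandwich can push the price before the victim's trade reverts:

```python
# Tiny sketch of a slippage guard: the swap reverts unless the realized
# output meets a user-set minimum, bounding what a sandwich can extract.
def check_min_out(amount_out, quoted_out, slippage_tolerance):
    min_out = quoted_out * (1 - slippage_tolerance)
    if amount_out < min_out:
        raise ValueError("slippage exceeded: trade would revert")
    return amount_out

# Within a 0.5% tolerance the trade goes through; a sandwich that degrades
# the output past the tolerance causes a revert instead of a loss.
check_min_out(99.6, quoted_out=100.0, slippage_tolerance=0.005)
```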
At the P2P layer, we can build direct communication among users, searchers, and miners via dapps such as wallets (MiningDAO, for instance, was built to connect searchers directly to users and help bring the user the best execution price). Lastly, segregating mempools is another good P2P-layer solution: we can spread strategies across different relays (e.g. ArcherDAO, BloxRoute, direct-to-miner) for better average coverage of the blocks in which our bundles are included.

As mentioned above, in a PoW system miners participate in a probabilistic lottery, and two miners can win it at the same time. Due to network propagation delays, neither knows the other has won, and the network resolves the race by comparing the total computational work behind each fork: the one with more work wins. So if one of them quickly mines two blocks in a row, that miner eventually wins. But here is the problem: it means an attacker can go back a few blocks, redo the computational work faster than the actual canonical chain, and then show the result to the network. Because the rule is that the chain with the most cumulative work becomes the main chain, such a miner can do a great deal of damage. Technically this is hard to pull off, but if someone could do it, they could go back in time and take liquidations that others were about to execute, because as the miner they can prioritize their own transactions. They could perform the liquidations themselves and capture all the MEV, or capture MEV by re-sequencing other people's transactions, or by any other available means. This is called a time-bandit attack (also known as a reorg attack or double-spend attack).
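The cumulative-work fork-choice rule that makes this reorg possible can be sketched directly (block counts and per-block work values below are arbitrary illustrations):

```python
# Sketch of heaviest-chain fork choice: the canonical chain is the one with
# the most total work, which is exactly what a "time bandit" exploits by
# re-mining past blocks with more work than the honest tail.
def total_work(chain):
    return sum(block["work"] for block in chain)

def fork_choice(chains):
    return max(chains, key=total_work)

honest = [{"height": h, "work": 10} for h in range(100)]

# Attacker forks 3 blocks back and out-works the honest tail: the shared
# prefix is 97 blocks, followed by 4 heavier attacker blocks.
attacker = honest[:97] + [{"height": 97 + i, "work": 12} for i in range(4)]

winner = fork_choice([honest, attacker])  # the attacker fork wins
```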
What are the potential negative effects? Take oracles as an example: today, most oracles, including Chainlink's and Maker's, do not actually tie their updates to a specific block height. This is a serious problem, because most DeFi projects rely on Chainlink. Suppose a user closes a loan, avoiding liquidation, in the current block. Five blocks later, Compound (which uses Uniswap v2 plus Chainlink, not Chainlink alone), Maker, or Chainlink posts a price update under which the position would be liquidatable. A miner can then take that oracle update, reorg five blocks back in time, and liquidate the user with the future oracle update before the user closes the position, even though it was not actually liquidatable at that point. This would be very disruptive to the system. The encryption techniques discussed above, such as threshold cryptography, also cannot be implemented on a PoW network: blocks are proposed concurrently and finality only emerges over time, so even after decryption the network remains exposed to MEV through potential reorgs (reorg attacks, also known as time-bandit or double-spend attacks). Severe as this would be, it is good to see some top miners, such as Ethermine, clearly state that they will not behave in such disruptive ways. The architecture of ETH 2.0 and the introduction of the attestation game also help address the problem, which we will cover in further research.
Reference
MEV (Miner Extractable Value) can be understood as the total value the miners can extract at the expense of users by arbitrarily reordering, including, or excluding transactions within a block. Simply put, miners can determine the order of when transactions are processed on the blockchain and exploit the power to their advantage. The MEV ("miner" extractable value) is firstly defined by Phil Daian a Smart Contracts researcher. The definition is actually inaccurate and is now normally referred as Maximal Extractable Value (MEV). Also, Osmosis founder of Osmosis, Sunny Aggarwal, redefined it as Proposer Extractable Value (PEV).
In this game of economy, we have seen different participants who are miners, searchers, solvers, users, application and protocol developers etc. It is impossible to inherently eliminate MEV while it’s possible for the participants to negotiate the terms of MEV in order to reduce the extractable value of miners based on different assumptions within different protocols and ecosystems. The miners mentioned above are normally considered as block producers. In other ecosystems, they are also known as validators (such as in ETH 2.0)where their main responsibilities are producing blocks, adding or removing transactions, ordering transactions and so on. Users are individuals or organizations who can generate economic values on the networks, such as having a transaction on the decentralized exchange, adding liquidity to a liquidity pool, providing collateral to the position on the lending protocol, minting a NFT on the marketplace etc. The above economic activities create sustainable amount of extractable value.There are searchers who are parties that express transaction ordering preferences or encryption and decryption process in order to capture some value. Dapps such as wallets, create some of these MEV games by their design, the way that they work and the incentives that they try to create for their users and for searchers. through direct transaction to the miners. Protocol developers are an interesting one, because they create sort of the base rules of these games, they create the structure and the abilities for parties like block producers to take certain action, which ultimately give birth to the MEV.
Before providing a deep dive on the MEV, we know the mines can produce blocks, include/exclude transactions, and order transactions. What else can miners do? Miners can change their block timestamps, manipulate timestamps through "randomness” manipulation, time jacking and other ways. Taking the Bitcoin network as an example, Bitcoin's difficulty adjustment was initially based on timestamps. Attackers would reduce the difficulties by forking the chain and forge the timestamps in order to produce blocks more quickly. The blocks are produced based on the Longest Chain Rule so all rewards would go to the attacker. Therefore, the community modified the rules to define the chain with the most computing power and the most cumulative difficulty as the main chain, thus avoiding the problem. The block proposers also have the right to censor the consensus vote and manipulate the voting power through ways such as Selfish Mining or BFT vote censorship. In the POW mechanism, when the miners themselves get a longer chain than others, they will not immediately release their own chain, while continue mining on their own chain. We call it BFT vote censorship in the BFT mechanism. In addition to this, relayer providers like Flashbot allows block proposers to read transactions from the mempool.Of course, they can also control over the addition/removal of transactions and the sequence of transactions as we mentioned above.

With regard to Tx-Based transactions, , we can categorize them into censorship manipulation and ordering-based manipulation. There are two categories of transaction orders: absolute and relative ordering. Absolute Ordering means that your transaction is not based on another transaction in the block and it has an absolute position in the block. For example, if a liquidation process is taking place, someone must have to trigger the liquidation mechanism and profit by doing so. Block proposers can guarantee that they are always the first one to do so and obtain the extractable value. Osmosis has proposed several solutions to restrict this. However, it is not the focus for Osmosis at the moment because such transactions are positive and provide the services to the network. They make profits by not extracting value from everyone. Osmosis prefers to address the latter which is called theRelative Ordering. When you try to base your transaction on someone else's, we call it Normal Frontrunning (and also includes Back running). For example, the sandwich attack (a type of backrunning), an attacker finds a transaction on the network who is buying an asset, and the attacker purchases that asset before the trader places an order. Although the trader submits the transaction first, the attacker can prioritize its owntransaction first, broadcast it to the network,which push up the price of the asset, and eventually sell it by higher price to the trader through the commodity trade. Such transactions essentially sacrifice the benefits of other participants for attacker’s own benefit, and can cause serious problems such as loss of users’ funds, instability of the consensus mechanism, soaring gas fees, centralization of block production and other issues.
In terms of the right to read transactions, we can encrypt them when they are added to the mempool . Mempool is a caching function for transactions. The mempool performs an initial legitimacy check on accepted transactions and filters out some illegal ones. We can have multiple solutions on how do encrypt the mempool layer, one is SGX, one is Timelock crypto, I am gonna make crypto transactions anyone can trustlessly decrypted it, after running some sequential process in a few minutes, the third one is threshold decryption, where you have some communities, some weights, they collectively can decrypt some transactions, where ⅔ the participants can decrypt it. The assumption is quite similar to the Tendermint POS assumption, if ⅔ nodes we trust, them collectively to act honestly, we don’t attacked, or slash them, if we make trust assumption. Similarly, we can make the decryption, as we trust that. So for the POS and Tendermint mechanism, the “community” is very important. On the other hand, Timelook doesn't emphasize the concept of community, but you need to make sure that everybody participates and uses it. Also, you need to make sure that it's locked for a certain amount of time and that it's not decrypted in the mempool. If its latency is too long, it will affect user experience. The threshold cryptography, which is also the MEV technique used by Osmosis that we will talk about later, allows you to choose whether to participate or not, providing validators with high flexibility.
For the Timelock, I think that we may try to apply for the non- transaction based activities such as adding or removing liquidity. For example, many bots have started to sandwich by getting in front of trades and adding a ton of liquidity, then the trade will be executed and then they’ll immediately remove their liquidity from a pair. So it’s like providing liquidity just in time, and then immediately removing it as well. So the user actually gets better execution than they would otherwise. But it is this bot that is able to capture the fees that that user is creating. And it’s at the expense of liquidity providers that this bot is making MEV as opposed to the users themselves, which is like a super interesting twist on the sandwich game.
Osmosis has instead adopted threshold cryptography. Different public chains take different architectural approaches to mempool encryption. Solana, for example, does not have a public mempool, and its private mempools are visible only to validators. Relayers like Flashbots are not entirely private either: bundles bypass the public mempool and go directly to the miner, but the miner, Flashbots itself, Flashbots API users, and their nodes can all see the bundle and the transactions in it. So it is not truly private; it is simply direct to the miner and hidden from sandwich bots and other bots in the public mempool. It is also still a relatively centralized solution, although Flashbots is considering alternatives to help decentralize the network. As a result, many long-tail strategies do not pass through Flashbots; instead they are packaged into bundles through protocols such as ArcherDAO or BloxRoute, or sent directly to miners.
Flashbots is a relay system that facilitates communication between searchers and miners in the ecosystem. Through its private pool, searchers place transactions into bundles and send them to miners via the Flashbots relay. A bundle is a group of transactions that a searcher submits to a miner; those transactions can be the searcher's own, or ones found in the pending mempool or some other source, and a bundle has to be executed in exactly the order provided. Flashbots also runs private auctions between the counterparties: a searcher can win inclusion either by bidding up the fee or by paying the miners/validators directly, with no upfront transaction fees.
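The shape of such a bundle submission can be sketched against the Flashbots `eth_sendBundle` JSON-RPC method. This is a simplified sketch of the request body only: the real relay also requires the payload to be signed and sent with an `X-Flashbots-Signature` header, and the transaction hex strings below are hypothetical placeholders. No network call is made here.

```python
# Sketch (simplified, assumed fields) of a Flashbots eth_sendBundle
# request body: an ordered, atomic list of signed transactions targeted
# at one specific block number.
import json

def make_bundle_request(signed_txs: list, target_block: int) -> str:
    """Build the JSON-RPC body for submitting a bundle to the relay."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_sendBundle",
        "params": [{
            "txs": signed_txs,                  # executed atomically, in order
            "blockNumber": hex(target_block),   # only valid for this block
        }],
    })

# hypothetical raw transactions, targeting a hypothetical block height
body = make_bundle_request(["0xf86b...", "0xf86c..."], 17_000_000)
assert json.loads(body)["method"] == "eth_sendBundle"
```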
As a rather centralized organization with only a few nodes, Flashbots requires you to trust those nodes in order to route transactions through it. Before forwarding bundles to miners, Flashbots prioritizes senders (searchers) using a reputation system that splits the relay into a high-priority and a low-priority queue. If Flashbots has an on-chain record, and a record in its relay, of you regularly submitting high-quality bundles, you are placed in the high-priority queue; a new searcher with no record ends up in the low-priority queue. This dramatically reduces the impact of DDoS attacks (since only low-priority traffic would be affected) and lets Flashbots forward higher-quality transaction bundles to miners.
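The two-tier queue described above can be sketched as follows. The threshold value and scoring scheme are assumptions; the point is only the mechanism: known high-reputation searchers are drained first, and unknown senders are confined to a low-priority queue that absorbs spam.

```python
# Minimal sketch (assumed threshold and scoring) of a reputation-split
# relay: high-priority bundles are always served before low-priority ones.
from collections import deque

class RelayQueue:
    def __init__(self, reputation_threshold: float = 0.8):
        self.threshold = reputation_threshold
        self.high, self.low = deque(), deque()
        self.reputation = {}                 # searcher -> score in [0, 1]

    def submit(self, searcher: str, bundle: dict) -> None:
        score = self.reputation.get(searcher, 0.0)   # unknown searchers: 0
        (self.high if score >= self.threshold else self.low).append(bundle)

    def next_bundle(self):
        """Drain the high-priority queue before touching the low one."""
        if self.high:
            return self.high.popleft()
        return self.low.popleft() if self.low else None

relay = RelayQueue()
relay.reputation["veteran"] = 0.95
relay.submit("newcomer", {"id": 1})
relay.submit("veteran", {"id": 2})
assert relay.next_bundle() == {"id": 2}   # veteran's bundle is served first
```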
According to Flashbots' latest roadmap, they will use SGX to achieve a higher degree of privacy and decentralization, and will continue to iterate on the existing private mempools. Miners can run their own SGX mempool that every user can use; we can understand SGX as a way to create private mempools. Breaking SGX requires a relatively high security budget, for example hiring ten top engineers with a $10 million budget. But that is nothing compared to the size of the MEV market: an attacker only needs to break one SGX enclave and extract a key to compromise the mempool.
Osmosis seems to be realizing our MEV endgame. Will future competition for MEV split block production outright, so that the game miners play between attestation on one side and the sequencing of transactions within a block on the other becomes fully outsourced? With that division of labor, miners would focus entirely on attestation, and a marketplace would let them choose the best block contents. It has always felt to me that the current bundle mechanism is basically a stopgap on the way to the real solution: an actual block template market.
Osmosis tries to achieve this through threshold encryption. Transactions are encrypted as soon as they enter the mempool (before being included in a block), which hides their contents and prevents validators from reordering or censoring them. The private key used to decrypt a transaction is secret-shared among all validators, and any ⅔ of them can decrypt it. The block is then decrypted almost simultaneously with its confirmation.
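The key-sharing step can be illustrated with a toy Shamir secret-sharing scheme over a prime field. This is a sketch of the general t-of-n technique, not Osmosis code: any quorum of shares reconstructs the decryption key by Lagrange interpolation at x = 0, while fewer shares reveal nothing about it.

```python
# Toy t-of-n threshold secret sharing (Shamir) over a prime field.
# A 2/3 quorum of validators can jointly recover the block decryption key.
P = 2**127 - 1                  # a Mersenne prime field (toy choice)

def make_shares(secret: int, t: int, n: int, coeffs):
    """Split `secret` into n shares; any t of them reconstruct it."""
    poly = [secret] + list(coeffs[:t - 1])          # degree t-1 polynomial
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

key = 123456789
shares = make_shares(key, t=3, n=4, coeffs=[11, 22])  # 3-of-4 ≈ 2/3 quorum
assert reconstruct(shares[:3]) == key    # any 3 shares recover the key
assert reconstruct(shares[1:]) == key    # a different 3 shares also work
```

A production scheme would use verifiable or distributed key generation so that no single party ever holds the full key; the interpolation logic, however, is the same.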
This MEV protection is unique to Osmosis because it is a sovereign application chain built on the Cosmos SDK and implemented on Tendermint Core (BFT consensus). Tendermint consensus has no concurrent block proposals: a new block cannot be proposed until the current proposal is finalized by ⅔ of the validators' voting power. In PoW, by contrast, two miners might win the probabilistic lottery at the same time, and due to network propagation delays one miner might not know the other has won. In Tendermint, all block contents can be decrypted once the block is broadcast across the whole network, and because all block producers take part in decryption at the same block, the protocol ensures no producer gets an unfair advantage over the others in terms of MEV extraction. This is much harder to achieve in a network like PoW, where blocks are proposed simultaneously and finality only emerges over time; even after decryption, the network remains exposed to MEV vulnerabilities through possible reorgs (Reorg attacks, Time Bandit attacks, or double-spend attacks).
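The ⅔ finality rule can be stated precisely in a few lines. This is a sketch of the quorum arithmetic only (validator names and powers are assumed): a proposal is finalized once precommits from validators holding strictly more than ⅔ of the total voting power are collected, so no competing block can exist at the same height.

```python
# Sketch of the Tendermint-style 2/3 finality check: a block is final
# once precommitting validators hold strictly more than 2/3 of the
# total voting power.

def is_finalized(voting_power: dict, precommits: set) -> bool:
    total = sum(voting_power.values())
    committed = sum(voting_power[v] for v in precommits if v in voting_power)
    return committed * 3 > total * 2      # strictly more than 2/3

powers = {"val-a": 40, "val-b": 30, "val-c": 20, "val-d": 10}
assert not is_finalized(powers, {"val-a", "val-c"})        # 60% <= 2/3
assert is_finalized(powers, {"val-a", "val-b", "val-d"})   # 80% >  2/3
```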
Moreover, threshold encryption addresses the latency problem, since blocks are not executed until they are finalized; we have seen this high-latency problem in Osmosis. Normally the block proposer produces blocks and adds and sequences transactions, while validators can be understood as voters in Tendermint. Osmosis is building something called vote extensions, which extend Tendermint's functionality: each validator can contribute additional data along with its consensus vote, which is how the decryption shares can be distributed. Under the concept of Joint Proposals, each validator can include certain transactions in its vote, and the next proposer is required to include those transactions in the proposal. There will of course be some overlapping transactions, and proposers can still add their own, as long as they at least include the ones the validators added. This prevents a monopoly by a single proposer, the only party that would otherwise control inclusion, and allows every validator to contribute to inclusion a little. Here we separate the block producer from the validator, and separate inclusion from ordering: do you vote separately on which transactions are included and on how they are ordered? We hope to achieve a higher degree of duty segregation in PoS. A division of labor already exists today, just to a smaller degree: miners are separated into mining pools (PoW), pool operators (ETH 2.0), and the workers who assemble transactions into bundles via Flashbots. We hope to see more roles emerge in the future.
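The Joint Proposals merging step can be sketched as a deterministic union of the validators' transaction sets. Names and structure here are assumptions for illustration, not Osmosis code: each validator's vote extension carries a list of transactions, and the next proposer must include all of them, with overlaps deduplicated.

```python
# Sketch (assumed structure) of merging vote-extension tx sets into one
# joint proposal: every validator's transactions are included exactly once,
# in a deterministic order, so no single proposer controls inclusion.

def build_joint_proposal(vote_extensions: dict) -> list:
    """Merge each validator's tx list into one deduplicated proposal."""
    included, proposal = set(), []
    for validator in sorted(vote_extensions):      # deterministic iteration
        for tx in vote_extensions[validator]:
            if tx not in included:                 # overlaps appear once
                included.add(tx)
                proposal.append(tx)
    return proposal

extensions = {
    "val-a": ["tx1", "tx2"],
    "val-b": ["tx2", "tx3"],     # tx2 overlaps with val-a's set
    "val-c": ["tx4"],
}
assert build_joint_proposal(extensions) == ["tx1", "tx2", "tx3", "tx4"]
```

Note that this sketch fixes inclusion only; the separate question raised above, whether ordering should also be voted on, is left to the proposer here.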
Osmosis creates bundle atomicity and prevents miners from seeing the contents of transactions. Block producers are required to focus entirely on attestation, which significantly improves privacy and ensures bundles cannot be unbundled. In the long run this brings a more efficient marketplace of transactions: we can have true bundle atomicity at the protocol layer, and should be able to sequence transactions at the protocol layer as well.
Through Joint Proposals, we can select the best individual transactions and place them in a mempool, which is the prototype of a future block market where the best block contents are selected. Flashbots is a market for bundles: transaction bundles for miners to include. Will Osmosis be able to create a market for block template bundles? A block template bundle, a term we coin here, is the combination of several transactions and a block template, essentially one block full of transactions. Someone could then propose several block templates in a row to replace blocks that currently exist on the Ethereum chain. We strongly believe that the current design of bundles (full of transactions) is just a stepping stone, and that a larger marketplace for block template bundles may emerge in the future. As Sunny Aggarwal said: Flashbots tries to reduce the side effects of MEV, such as high fees and the instability of consensus, while Osmosis tries to solve the problem of MEV itself.
We have seen other solutions as well. Uniswap V3 allows liquidity providers to concentrate their liquidity within chosen price ranges, which is far more capital efficient and gives traders much deeper liquidity inside those ranges. Users can therefore set tighter slippage tolerances and suffer less price impact on trades, leaving sandwich bots less room to front-run them, push them to their slippage limits, and extract value. Secondly, the large amount of capital required raises the barrier to entry for bots, since they do not want to risk being unbundled or being stuck with a large position when they want to liquidate it. Thirdly, the position data is more difficult to reason about and work with. Another example is CowSwap, which provides users the best execution price through off-chain logic run by Solvers. In extreme cases, however, such as a $1 billion trade, it is difficult to prevent a Solver from manipulating the trade for its own gain the way it handles median-sized trades. So MEV always exists; the question is how to reduce its occurrence, and in particular the frequency of relative-ordering attacks such as sandwich attacks that significantly harm users and network participants. Permissionless MEV from arbitrage and liquidation transactions, by contrast, we consider a stable business model. We have also seen that many top MEV extractors are top miners and that the degree of centralization is high: Ethermine holds about 30% of the network's hashing power, and F2Pool and Sparkpool are also top players. Beyond the application-specific layer, we are also looking for solutions at the consensus layer and the P2P layer. Osmosis' threshold encryption is a consensus-layer solution, and fair transaction ordering can also be achieved at the consensus layer.
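The deeper-liquidity argument can be made concrete with a toy constant-product swap. The reserves below are assumptions, and a fee-less x·y=k pool stands in for Uniswap V3's range-concentrated math: the same trade against 100x the effective depth moves the price far less, so tighter slippage limits suffice and sandwich bots have less room to work with.

```python
# Toy fee-less constant-product (x*y = k) swap: the same trade causes far
# less price impact against a deeper pool, which is the effect V3's
# concentrated liquidity produces within the active price range.

def swap_out(reserve_in: float, reserve_out: float, amount_in: float) -> float:
    """Output amount of a fee-less x*y=k swap."""
    k = reserve_in * reserve_out
    return reserve_out - k / (reserve_in + amount_in)

def price_impact(reserve_in: float, reserve_out: float, amount_in: float) -> float:
    """Fractional shortfall of execution price versus the spot price."""
    spot = reserve_out / reserve_in
    execution = swap_out(reserve_in, reserve_out, amount_in) / amount_in
    return 1 - execution / spot

shallow = price_impact(1_000, 1_000, 10)      # trade of 10 vs 1k reserves
deep = price_impact(100_000, 100_000, 10)     # same trade, 100x the depth
assert deep < shallow                         # deeper pool, less slippage
```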
At the P2P layer, we can build direct communication among users, searchers, and miners via dapps such as wallets (MiningDAO, for instance, connects searchers directly to users and helps deliver the best execution price). Finally, segregating mempools is also a good P2P-layer solution: we can spread strategies across different relayers (e.g. ArcherDAO, BloxRoute, direct-to-miner) for better average coverage of the blocks our bundles might be included in.

As mentioned above, in a PoW system miners participate in a probabilistic lottery, and two miners can win it at the same time. Due to network propagation delays, neither miner knows that the other has won the race; the network resolves it by comparing the total computational work that has gone into each chain and following the one with more. So if one of the miners continuously and quickly mines two blocks, that miner eventually wins. But here is the problem: this means someone can go back a few blocks, perform more computational work more quickly than the actual canonical chain, and then present the result to the network. Because the network follows the rule that the chain with the most accumulated work becomes the main chain, such a miner can do a lot of damage. Technically this is hard to achieve, but an attacker who managed it could go back in time and liquidate positions that others were ready to liquidate, prioritizing their own transactions as the miner. They could perform the liquidation themselves and capture all of its MEV, or extract MEV by resequencing other people's transactions, or by any other means. This is called a Time Bandit attack (also known as a Double Spend or Reorg attack).
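The fork-choice rule driving this attack can be sketched in a few lines. This is a simplified model (per-block "work" stands in for difficulty-weighted proof of work): nodes follow whichever chain has the greater cumulative work, which is exactly what lets an attacker who out-mines the tip rewrite recent history.

```python
# Sketch of the PoW heaviest-chain fork-choice rule: nodes adopt the
# chain with the most total accumulated work, making deep reorgs
# (Time Bandit attacks) possible for whoever can out-mine the tip.

def total_work(chain: list) -> int:
    """Sum per-block work (a stand-in for difficulty) over a chain."""
    return sum(block["work"] for block in chain)

def fork_choice(chain_a: list, chain_b: list) -> list:
    """Follow the heavier chain (ties favor the incumbent, chain_a)."""
    return chain_a if total_work(chain_a) >= total_work(chain_b) else chain_b

canonical = [{"work": 10}, {"work": 10}, {"work": 10}]
# the attacker re-mines from two blocks back with more work per block
attacker = [{"work": 10}, {"work": 11}, {"work": 11}]
assert fork_choice(canonical, attacker) is attacker   # the reorg wins
```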
What are the potential negative effects? Take oracles as an example. Today, the updates of most oracles, including Chainlink and Maker's, are not actually tied to a specific block height. This is a very serious problem, since most DeFi projects rely on Chainlink. For instance, in the current block a user might be closing out a loan to avoid liquidation; five blocks later, Compound (which uses Uniswap v2 plus Chainlink rather than depending solely on Chainlink), Maker, or Chainlink posts a price update under which the user would be liquidatable. A miner can then take that oracle update, reorg back five blocks, and liquidate the user with the future oracle price before the user can actually close the position, even though it was not liquidatable at the time. This would be very disruptive to the system. Moreover, an encryption technique such as the threshold cryptography discussed above cannot be implemented in a PoW network: blocks are proposed simultaneously and finality emerges over time, so even after decryption the network would remain exposed to MEV vulnerabilities through potential reorgs (Reorg attacks, also known as Time Bandit or Double Spend attacks). Although the problem would be severe if it occurred, it is good to see some of the top miners, such as Ethermine, clearly state that they will not behave in such disruptive ways. The architecture of ETH 2.0 and the introduction of the attestation game also help solve the problem, which we will cover in further research.