<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>YQ</title>
        <link>https://paragraph.com/@ohotties</link>
        <description>restake/acc https://AltLayer.io. </description>
        <lastBuildDate>Tue, 07 Apr 2026 04:28:22 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en</language>
        <image>
            <title>YQ</title>
            <url>https://storage.googleapis.com/papyrus_images/1c2dcc6b8e73866d53db52b20360c30a130aa11efcf026a74080406a97c093be.jpg</url>
            <link>https://paragraph.com/@ohotties</link>
        </image>
        <copyright>All rights reserved</copyright>
        <item>
            <title><![CDATA[Liquid Staking post Blast]]></title>
            <link>https://paragraph.com/@ohotties/liquid-staking-post-blast</link>
            <guid>1qRgwsmtJdH0Qv7q2wUB</guid>
            <pubDate>Mon, 27 Nov 2023 15:17:54 GMT</pubDate>
            <description><![CDATA[Liquid staking has emerged as an innovative mechanism on Ethereum that gives users a way to participate in securing the network through staking while retaining liquidity of their assets. Traditionally, staking Ethereum requires users to lock up their ETH tokens in specialized validator contracts to serve the purpose of block production and verification. In return, stakers receive block rewards and fees. However, the locked tokens lose all liquidity during this process, meaning they cannot be ...]]></description>
<content:encoded><![CDATA[<p>Liquid staking has emerged as an innovative mechanism on Ethereum that gives users a way to participate in securing the network through staking while retaining liquidity of their assets. Traditionally, staking Ethereum requires users to lock up their ETH tokens in specialized validator contracts to serve the purpose of block production and verification. In return, stakers receive block rewards and fees. However, the locked tokens lose all liquidity during this process, meaning they cannot be transferred, traded, or utilized elsewhere. Liquid staking protocols solve this by issuing derivative tokens that represent staked ETH and allow stakers to regain liquidity.</p><p>Liquid staking protocols accept deposits of ETH from users and issue tokens that track the value of each user’s share of the staked ETH plus accrued block rewards over time. The key innovation is that these tokens are designed as freely transferable ERC20 tokens that can be traded on exchanges, lent or borrowed in DeFi applications, or provided as liquidity to AMMs. This unlocks the liquidity of staked ETH while still allowing stakers to earn from securing Ethereum through their share of validator rewards represented by the derivative token. The recently launched Blast protocol offers a 4% yield on ETH deposits bridged into the network via liquid staking, and has attracted over 569 million USD of ETH (as of 27 Nov 2023) locked in its contracts, drawing both applause and criticism. 
This has made liquid staking a hot topic again for the first time since the PoS Merge in 2022.</p><p>In this post, we take an in-depth look at how liquid staking works technically, analyze the risks and benefits liquid staking introduces for both Ethereum and users, and explore the protocol-level implications it has on factors like network security, decentralization, and systemic risks arising from wider adoption of staking derivatives.</p><h2 id="h-what-is-liquid-staking" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">What is Liquid Staking</h2><p>Liquid staking protocols enable ETH holders to stake their funds and participate in securing Ethereum without losing flexibility or access to their assets. These protocols accept deposits of ETH from regular users who may not have the 32 ETH minimum or the ability to set up and maintain the 24x7 validator infrastructure required for normal staking.</p><p>In return for users’ deposits, liquid staking protocols issue derivative tokens that represent each depositor’s fractional share of the rewards-earning staked ETH pool held by that protocol. These tokens retain liquidity for the users, allowing them to transfer, trade or utilize them for other DeFi activities while still earning pro rata staking yields on their share of ETH.</p><p>Popular liquid staking protocols on Ethereum include Lido, Rocket Pool and Coinbase amongst others. For example, when users deposit ETH tokens into the Lido protocol, they receive stETH tokens in return. The stETH tokens track the value of the staked ETH deposits along with accrued block rewards over time. 
Users can hold stETH, trade it on exchanges or utilize these derivative tokens in other DeFi protocols to earn additional yields.</p><h2 id="h-mechanics-of-liquid-staking-protocols" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Mechanics of Liquid Staking Protocols</h2><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/c639296dde99d183d9c61d1ca65cc75bbc54db55d6eb466c3db5ef386ab644e7.png" alt="Liquid Staking of Ethereum" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Liquid Staking of Ethereum</figcaption></figure><p>Under the hood, liquid staking protocols aggregate ETH deposits from various individual users into pools large enough to meet the 32 ETH threshold required for operating validator nodes on Ethereum.</p><p>The pooled ETH deposits are then leveraged by the protocols to set up and maintain validator nodes that run the infrastructure for tasks critical to staking on Ethereum - like participating in PoS consensus, block production, reward distribution and governance of the staked deposits.</p><p>These validator nodes are typically operated by professional node operators contracted by the protocols rather than the end users themselves. Users who deposit ETH into these protocols do not need to have the technical expertise or ability required for tasks like infrastructure management, key generation or security of validator nodes associated with staking.</p><p>In return for each unit of ETH deposited by a user, the liquid staking protocols mint and distribute ERC20 derivative tokens that represent fractional ownership of the rewards-earning staked ETH pool. 
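</p><p>To make the mint/burn arithmetic concrete, the share accounting common to these pooled designs can be sketched in a few lines of Python. This is a simplified model under stated assumptions (a single pool, no fees or slashing), not any specific protocol’s implementation:</p>

```python
class StakingPool:
    """Toy model of a liquid staking pool's share accounting."""

    def __init__(self):
        self.total_eth = 0.0     # staked ETH plus accrued rewards
        self.total_shares = 0.0  # derivative tokens in circulation

    def deposit(self, eth):
        # Mint shares at the current exchange rate (1:1 for the first deposit).
        if self.total_shares == 0:
            shares = eth
        else:
            shares = eth * self.total_shares / self.total_eth
        self.total_eth += eth
        self.total_shares += shares
        return shares

    def accrue_rewards(self, eth):
        # Validator rewards raise the ETH backing of every share.
        self.total_eth += eth

    def withdraw(self, shares):
        # Burn shares for the underlying ETH plus the holder's share of rewards.
        eth = shares * self.total_eth / self.total_shares
        self.total_eth -= eth
        self.total_shares -= shares
        return eth


pool = StakingPool()
shares = pool.deposit(1.0)    # deposit 1 ETH, receive 1.0 shares
pool.accrue_rewards(0.04)     # a year of ~4% staking yield accrues to the pool
print(pool.withdraw(shares))  # burn the shares: 1 ETH back plus rewards
```

<p>Because rewards accrue to the pool as a whole, every outstanding share appreciates against ETH over time, which is exactly why the derivative token can trade freely while still earning yield.</p><p>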
For example, 1 stETH token issued by Lido would represent 1 unit of ETH deposited by a user into the Lido staking pool plus accrued block rewards over time.</p><p>When users want to retrieve their funds later, they return (burn) their share of derivative tokens to the smart contract in exchange for the underlying share of the staked ETH deposit plus any rewards earned while they were deposited.</p><h2 id="h-benefits-of-liquid-staking-to-users" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Benefits of Liquid Staking to Users</h2><p>Liquid staking offers several advantages to users compared to regular Ethereum staking in terms of accessibility, liquidity, delegation and capital efficiency:</p><ol><li><p>Reduced barriers with no minimum ETH amount: Liquid staking protocols allow token holders to participate in staking even if they hold less than 32 ETH. Smaller token holders can pool together funds while still earning pro rata block rewards and fees.</p></li><li><p>Liquidity of staked assets: Users can seamlessly transfer, trade or utilize liquid staking derivative tokens for other DeFi activities while still earning staking yields rather than having tokens locked. 
The derivatives provide better capital efficiency.</p></li><li><p>Delegation of staking responsibilities: Protocols appoint professional node operators to handle technical complexities around infrastructure, security and key management, rather than leaving these to regular token holders.</p></li><li><p>Capital efficiency via simultaneous rewards: Users can put their ETH to productive use earning staking yields even while using their liquid derivative tokens elsewhere like in AMMs to further increase yields.</p></li></ol><h2 id="h-risks-associated-with-liquid-staking" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Risks Associated with Liquid Staking</h2><p>While liquid staking opens up new opportunities, it also introduces risks from aspects like smart contract vulnerabilities, excessive centralization among operators, and market volatility:</p><ol><li><p>Smart contract bugs and vulnerabilities: The smart contracts powering the staking pools and derivative token issuance in liquid staking protocols can have vulnerabilities that may be exploited to drain funds. Such incidents can lead to loss of pooled staked ETH given the value they hold. Preventing incidents requires high auditing standards and rigorous testing. Protocols like Lido have undertaken several audit rounds but risks cannot be fully eliminated.</p></li><li><p>Centralization tendency: As liquid staking via a protocol like Lido gets very popular, it raises centralization risks for Ethereum. Network effects arise as growing usage of a dominant staking derivative concentrates stake under it. For example, Lido presently holds ~30% of the total staked ETH supply. Penetration beyond 33% raises the possibility of transaction censorship, security failures and regulators exploiting central points of control. 
Maintaining an equilibrium of stakeholders is critical to prevent systemic imbalance.</p></li><li><p>Governance token domination: Many liquid staking protocols have issued governance tokens that allow holders to direct protocol actions. For example, Lido has the LDO token which allows voting on updates. High concentration of tokens among specific entities gives them overriding influence over decisions affecting staked ETH holders, such as commission rates and risk policies. Preventing plutocratic control requires governance innovations and decentralization.</p></li><li><p>Opacity and principal-agent risks: In liquid staking systems, users deposit ETH but delegate actual validator operations to node operators contracted by protocols. This separation can pose risks like incentive misalignment, fund misuse and opaque reward sharing. Protocols need transparent reward distribution, insurance and delegator control to prevent exacerbating risks. In many liquid staking designs, holders of governance tokens have overriding powers to control membership of the node operators running infrastructure. This creates a principal-agent problem where node operators’ interests may not fully align with those providing the economic stake, i.e. LST holders. For example, both may collude to extract MEV or partake in censorship if incentivized sufficiently while bearing little economic risk - creating a centralized point of control. Appropriate checks and balances are needed to address this asymmetry.</p></li><li><p>Technical risks: Liquid staking relies on running validator nodes to earn staking rewards, which is still based on experimental Ethereum technology that needs to stand the test of time. 
Any vulnerabilities in key management, consensus participation or blockchain history management can expose node operators to slashing.</p></li><li><p>Adoption risks: The long-term value proposition of liquid staking derivative tokens relies heavily on continuing adoption growth, network security and expected yields from Ethereum staking through validator nodes. Any technical failures or weakness in expected staking reward rates would undermine both adoption and market value of tokens.</p></li><li><p>Slashing risks: Validators in Ethereum PoS face penalties if they fail to follow protocol or suffer security lapses that can result in slashing of staked ETH. Though protocols like Lido distribute this risk through professional node operators, the possibility of incidents persists, along with the need for insurance mechanisms to limit impacts on users.</p></li><li><p>Derivative token volatility risks: Liquid staking derivative tokens can suffer from price divergence or instability during periods of speculation, poor market sentiment or mass unstaking pressures due to ETH price declines. These tokens are inherently volatile owing to complex factors driving risks and returns. Managing stability requires mature governance, monetary policy tools and capital controls.</p></li><li><p>Higher leverage increases attack risks: The high liquidity and composability of liquid staking derivatives allows them to be used in DeFi protocols for activities like collateralized borrowing, leverage and margin trading. For example, lending protocols allow depositing stETH as collateral to borrow ETH, which can then be redeposited into Lido for compounding leverage. Though such looping strategies are capital efficient for users, at a systemic level they allow pooling potentially dangerous amounts of economic stake in validators while backed by little actual capital. 
This expands the attack surface.</p></li></ol><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/1f93d0f4730714c7b03064036e711cc2670304f1639e3abfb2b742f221173025.png" alt="Excessive Leverage in Liquid Staking" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Excessive Leverage in Liquid Staking</figcaption></figure><h2 id="h-case-study-i-risks-from-excessive-lending-and-leverage" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Case Study I: Risks from Excessive Lending and Leverage</h2><p>To effectively secure a proof-of-stake network like Ethereum over the long run, a majority of the base assets need to be durably locked and staked to validate transactions. However, excessive lending and leverage provided on liquid staking tokens and their derivatives can undermine the collateral guarantees backing the security of validated chains.</p><p>For instance, a user can deposit 1 ETH on Lido to receive 1 stETH, use this stETH as collateral to borrow 0.8 ETH on a lending platform, and restake this 0.8 ETH for more stETH, which in turn serves as collateral to borrow 0.64 ETH, and so on. Eventually, the total staked assets providing security rest on very little real collateral, with borrowed amounts towering over the actual durable capital staked and locked. At a systemic level, this financialization poses risks of sudden deleveraging events severely impacting staked token values. 
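</p><p>The looping strategy described above compounds geometrically: staking 1 ETH, borrowing at an 80% loan-to-value ratio, restaking, and repeating builds total staked exposure approaching 1/(1 - 0.8) = 5 ETH per 1 ETH of real capital. A quick sketch (the flat 80% LTV is an illustrative assumption):</p>

```python
def looped_exposure(initial_eth, ltv, rounds):
    """Total stake built by repeatedly staking, borrowing against the
    resulting LST at a fixed loan-to-value ratio, and staking again."""
    staked, tranche = 0.0, initial_eth
    for _ in range(rounds):
        staked += tranche  # stake this tranche, receive LST collateral
        tranche *= ltv     # borrow against the LST to fund the next round
    return staked

# 1 + 0.8 + 0.64 + ... converges to 1 / (1 - 0.8) = 5 ETH of stake
print(round(looped_exposure(1.0, 0.8, 100), 4))  # -> 5.0
```

<p>Five units of apparent economic stake backed by one unit of durable capital is precisely the hidden leverage the passage warns about.</p><p>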
Through such pathways, an attacker with far less capital may gain enough leverage over stETH or other LSTs to control Ethereum’s staking power, censor transactions, or even override the protocol’s consensus to force a hard fork.</p><p>Hence, managing risks requires keeping the debt-based derivatives of liquid staking tokens within prudent collateral limits to maintain durable security capital and prevent excessive hidden leverage. Actions may involve governance measures to curb lending risks, maintaining diversified lending sources, monitoring stable redemption expectations on liquid tokens and preventing ecosystem contagion from deleveraging cascades.</p><h2 id="h-case-study-ii-blast-yield-relies-on-bridge-deposits-raising-liquidity-risks" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Case Study II: Blast Yield Relies on Bridge Deposits, Raising Liquidity Risks</h2><p>Liquid staking and yield generation protocols at Layer 2 are an innovative way to offer high returns on crypto assets like Ether. However, over-dependence on deposit bridging in these sorts of protocols can concentrate liquidity risks.</p><p>For example, the recently launched Blast protocol offers a 4% yield on Ether deposits bridged in from Layer 1 (as Lido ETH liquid staking derivatives), along with 5% yields on bridged stablecoins from MakerDAO - effectively a yield generation engine at L2.</p><p>This bridged ETH liquidity forms a key baseline for enabling yields as the tokens supply collateral value and establish staking positions. 
However, infusing this liquidity exclusively through a bridge deposit pathway means that withdrawal events on L1, sudden losses of market confidence or ambiguity in withdrawal rights can create an acute liquidity crunch on such L2s.</p><p>If doubts emerge on easy redemption into native assets or market crashes trigger deleveraging, the incentive to bridge in more external liquidity rapidly diminishes, creating a run risk. With everyone headed to the withdrawal exit in a short period of time, smooth processing may get compromised and capital losses could result in extreme scenarios.</p><p>Reliable liquidity access and risk mitigation on such protocols require diversified liquidity channels, multiple bridges, alternate yield sources and prudent leverage caps to prevent a sudden evaporation of deposits or collateral value. Enabling direct fiat on- and off-ramps can add more durability. In essence, dependency solely on bridges and staking derivatives requires protocols to dynamically account for inherent stability risks.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>In conclusion, liquid staking introduces a novel form of derivative instrument that unlocks new opportunities in Ethereum around capital efficiency for users, accessibility of staking yields, and building liquid markets for ETH locked in validators. However, the implications span beyond users, creating a complex web of outcomes impacting factors like network decentralization, security, correlations and risk interlinkages that must be carefully measured and governed to fully unlock advantages while minimizing systemic risks.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/7ebc4a3fb40a1f6c93f713bf859cb01bf9335b4211ec1076d5ad05f571ad0f2a.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Data Availability in Blockchains]]></title>
            <link>https://paragraph.com/@ohotties/data-availability-in-blockchains</link>
            <guid>215Oydeuw13bqGoPrZGv</guid>
            <pubDate>Tue, 07 Nov 2023 06:22:55 GMT</pubDate>
            <description><![CDATA[Data availability refers to the guarantee that the full set of transaction data included in a block is available to all participants in a blockchain network. This concept is critical for maintaining security, especially as blockchain systems scale to higher transaction volumes. New approaches like sharding, rollups, and light clients distribute transaction processing across shards or rollup chains, rather than having every node process everything. This spreads the work out to allow higher thr...]]></description>
<content:encoded><![CDATA[<p>Data availability refers to the guarantee that the full set of transaction data included in a block is available to all participants in a blockchain network. This concept is critical for maintaining security, especially as blockchain systems scale to higher transaction volumes.</p><p>New approaches like sharding, rollups, and light clients distribute transaction processing across shards or rollup chains, rather than having every node process everything. This spreads the work out to allow higher throughput. But a consequence is that no single node sees all data anymore. This means individual nodes can no longer fully verify every transaction, nor can they generate fraud/validity proofs if some transaction data is missing or withheld. Light clients are especially vulnerable if data availability is not guaranteed.</p><p>Thus, guaranteeing accessibility of necessary data has become a key challenge in blockchain scaling. A variety of techniques are emerging to provide this assurance without excessive redundancy overhead.</p><h2 id="h-data-availability-problem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Data Availability Problem</h2><p>In traditional proof-of-work blockchains like Bitcoin and Ethereum (pre-PoS), each block contains a header with metadata and a list of transactions. Full nodes in these networks download and validate every single transaction in each block by independently executing the transactions and checking that they are valid according to the blockchain&apos;s protocol rules. This independent execution of transactions allows full nodes to compute the current state that is required to verify and process the next block. 
Because they perform this transaction execution and verification, full nodes enforce critical transaction validity rules and prevent miners or block producers from including invalid transactions in blocks.</p><p>Lightweight clients, also known as SPV (Simplified Payment Verification) clients, take a different approach from full nodes in order to conserve bandwidth and storage. SPV clients only download and verify block headers. They do not execute or validate any transactions. Instead, SPV clients rely on an assumption that the chain favored by the blockchain&apos;s consensus algorithm, i.e. the longest chain in Bitcoin, contains only valid blocks that properly follow protocol rules. This allows SPV clients to outsource the actual transaction execution and verification to the blockchain&apos;s consensus mechanism itself.</p><p>The security model for SPV clients fundamentally depends on having an honest majority of consensus participants, for example miners in proof-of-work blockchains, that correctly apply transaction validity rules and reject any invalid blocks proposed by the minority. If a dishonest majority of miners or block producers colludes, they could coordinate to create blocks with illegal state transitions that create tokens out of thin air, violate conservation of assets, or enable other forms of theft or exploitation. SPV nodes would not be able to detect this malicious behavior on their own because they do not actually validate transactions. In contrast, full nodes enforce all protocol rules regardless of the consensus mechanism, so they would immediately reject such invalid blocks created by a dishonest majority.</p><p>To improve the security assumptions for SPV clients, an alerting mechanism called fraud/validity proofs can allow full nodes to generate cryptographic proofs that show light clients that a given block definitively contains an invalid state transition. 
After receiving a valid fraud/validity proof, light clients can then reject the invalid block even if the consensus mechanism incorrectly accepted it.</p><p>However, fraud/validity proofs fundamentally require full nodes that create them to have access to the full set of transaction data referenced in a block in order to re-execute the transactions and identify any invalid state changes. If block producers selectively release only the block headers and withhold the full transaction dataset for a given block, full nodes will not have the information they need to construct fraud/validity proofs. This situation where transaction data is unavailable to the network is known as the &quot;data availability problem&quot;.</p><p>Without guaranteed data availability, light clients are once again forced to simply trust that block producers are honestly behaving correctly. This complete reliance on trust defeats the purpose of fraud/validity proofs and undermines the security benefits of light client models. For this reason, data availability is absolutely critical for maintaining the expected security and effectiveness of fraud/validity proofs in blockchain networks, especially as they scale to higher transaction volumes.</p><h2 id="h-the-need-for-data-availability-in-scaling-solutions" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">The Need for Data Availability in Scaling Solutions</h2><p>In addition to the need for data availability in existing networks, data availability becomes even more important in the context of new scaling solutions like sharding and rollups that aim to increase transaction throughput. 
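</p><p>The problem just described can be made concrete with a toy example: a block header commits to its transactions through a hash root, so published data can be checked against the header, but a withheld body leaves verifiers with nothing to check. A minimal sketch (a flat hash stands in for a real Merkle root):</p>

```python
import hashlib

def tx_root(transactions):
    """Stand-in for a Merkle root: one hash committing to all transactions."""
    h = hashlib.sha256()
    for tx in transactions:
        h.update(hashlib.sha256(tx).digest())
    return h.hexdigest()

block_txs = [b"alice->bob:5", b"bob->carol:3"]
header_root = tx_root(block_txs)  # published in the block header

# With the data published, anyone can recompute and check the commitment.
assert tx_root(block_txs) == header_root

# With the body withheld, the header alone reveals nothing about what it
# commits to, so no fraud proof about its contents can be constructed.
published_body = None
verifiable = published_body is not None and tx_root(published_body) == header_root
print(verifiable)  # -> False
```

<p>Only when the full body is published can nodes re-execute transactions and raise alarms, which is what the solutions surveyed below aim to guarantee.</p><p>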
Initiatives such as proto-danksharding (EIP-4844), Celestia, EigenDA and Avail have made significant progress toward providing efficient and affordable DA for rollups.</p><p>In a sharded blockchain architecture, the singular network of validators is split into smaller groups or &quot;shards&quot; that each process and validate only a subset of transactions. Since shards do not process or validate transactions originating from other shards, the individual shard nodes only ever have access to transaction data for their own specific shard.</p><p>In rollups, transaction execution occurs off-chain in an optimized environment that allows for greatly increased transaction throughput. Only compressed and summarized transaction data is periodically posted to the layer 1 main chain by the rollup operator. This approach reduces fees and congestion on layer 1 compared to executing all transactions directly on layer 1.</p><p>In both sharding and rollups, no single node validates or even observes the full set of transactions across the entire system anymore. The previous data availability assumptions that held for traditional monolithic blockchains are broken. If a sequencer operator withholds the full transaction dataset for a rollup block, or a malicious group of colluding validators produces an invalid block in a shard, the full nodes in other shards or on layer 1 will not have access to the missing data. Without this data, they cannot generate fraud/validity proofs to signal invalid state transitions because the data required to identify the issue is unavailable.</p><p>Unless new robust methods are introduced to guarantee data availability, bad actors could exploit these new scaling models to selectively hide invalid transactions while maintaining enough visible block validity to avoid detection. 
Users are forced to simply trust that shard nodes and rollup operators will act honestly at all times, but trusting a large distributed set of actors to be consistently honest is risky and precisely what blockchains aim to avoid through incentive mechanisms, decentralization and cryptography.</p><p>Maintaining the expected security benefits of light client models and effective fraud/validity proofs in the context of cross-shard transactions and layer 2 solutions requires much stronger assurances that the full set of transaction data remains available somewhere in the network upon request. The data itself does not need to be downloaded by all nodes across all shards, but it must at least be readily accessible if participants wish to verify blocks and generate fraud/validity proofs about potential issues.</p><h2 id="h-data-availability-solutions" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Data Availability Solutions</h2><p>A number of approaches have been proposed and explored that help provide &quot;data availability&quot; without requiring all nodes in a sharded or layer 2 network to redundantly download and store the full transaction dataset:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a40a67eb53592192df4be8e68d3557f3dabffd9907bc186b1c410bc31cf70e22.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-data-availability-sampling" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Data Availability Sampling</h3><p>Data availability sampling refers to a class of techniques that allow light clients to probabilistically check if transaction data is available by only downloading random fragments of the overall transaction dataset. 
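</p><p>The detection power of random sampling grows exponentially with the number of samples. In the common setup where data is erasure coded with a 2x extension, an adversary must withhold at least half the fragments to make a block unrecoverable, so each uniform sample has at least a 50% chance of exposing the withholding. A back-of-the-envelope sketch under that assumption:</p>

```python
def detection_probability(samples, withheld_fraction=0.5):
    """Chance that at least one of `samples` uniform random draws (with
    replacement) lands on a withheld fragment."""
    return 1 - (1 - withheld_fraction) ** samples

for k in (5, 10, 20, 30):
    print(k, f"{detection_probability(k):.10f}")
# by ~30 samples a light client is all but certain to catch withholding,
# so if every requested fragment comes back, the data is almost surely available
```

<p>This is why a client can gain high confidence in the availability of a very large block while downloading only a small, fixed number of fragments.</p><p>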
Initiatives like proto-danksharding, Celestia, EigenDA and Avail have tried various new techniques like KZG commitments and ZK proofs to achieve better sampling.</p><p>Typically, data availability sampling schemes rely on erasure coding, a method that takes the full transaction dataset and mathematically transforms it into a longer coded dataset by adding calculated redundancy. As long as a sufficient subset of the encoded fragments are available, the original data can be reconstructed from the encoded data by inverting the mathematical transform.</p><p>Light clients fetch and verify random small pieces of the erasure coded data. If any of the sampled fragments are missing or unavailable, this suggests that the full erasure coded dataset is likely unavailable to the network as a whole. The more samples a client can collect from random parts of the dataset, the higher the likelihood the client has of detecting any missing data. Erasure coding parameters can be tuned so that only a very small percentage of total fragments, on the order of 1%, need to be randomly sampled by a light client in order to verify availability of the complete dataset with extremely high statistical confidence.</p><p>This general approach allows light clients to very efficiently check the availability of even very large transaction datasets without needing to actually download the entire dataset. The samples are also shared with full nodes on the network to help reconstruct any missing pieces of data and recover unavailable blocks when necessary.</p><h3 id="h-data-availability-committees" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Data Availability Committees</h3><p>Committee-based data availability schemes assign the responsibility for transaction data availability verification to a relatively small group of trusted nodes called a Data Availability Committee (DAC). 
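</p><p>The basic committee flow can be sketched as a threshold attestation. This is a toy model: HMACs stand in for real public-key signatures, and the committee size and threshold are illustrative:</p>

```python
import hashlib
import hmac

MEMBER_KEYS = {f"member-{i}": f"secret-{i}".encode() for i in range(7)}
THRESHOLD = 5  # attestations required before data is deemed available

def attest(member, data_root):
    # A member signs the data root only after verifying it holds the full data.
    return hmac.new(MEMBER_KEYS[member], data_root, hashlib.sha256).hexdigest()

def light_client_accepts(data_root, attestations):
    """Count valid committee attestations over the root; the light client
    never touches the underlying data itself."""
    valid = sum(
        1 for member, sig in attestations.items()
        if member in MEMBER_KEYS and hmac.compare_digest(sig, attest(member, data_root))
    )
    return valid >= THRESHOLD

root = hashlib.sha256(b"block-1234-transactions").digest()
sigs = {m: attest(m, root) for m in list(MEMBER_KEYS)[:5]}
print(light_client_accepts(root, sigs))  # -> True: 5 of 7 members attested
```

<p>The trust assumption is visible in the code: the client accepts on signatures alone, so a colluding threshold of members could attest to data that was never actually published.</p><p>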
The committee nodes store full copies of transaction data from blocks and signal that the data is indeed fully available by posting cryptographic signatures on the main chain. Light clients can then cheaply verify these signatures to gain confidence that the data is available to the committee nodes without actually processing or storing the data themselves.</p><p>The fundamental tradeoff with Data Availability Committees is that light clients must ultimately trust the committee nodes to correctly signal data availability. Relying on a centralized and permissioned committee introduces some degree of centralization risks and single points of failure into the network. However, techniques like using a DAC consisting of Proof-of-Stake validators with slashing penalties for misbehavior can reduce, but not completely eliminate, trust requirements for light clients.</p><h3 id="h-data-sharding" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Data Sharding</h3><p>In data sharding schemes, transaction data is split into multiple shards and light clients probabilistically sample data from all shards in order to verify data availability across the entire system as a whole. However, implementing cross-shard sampling typically adds considerable complexity to data availability protocols and may require complex networking topology to prevent single points of failure.</p><h3 id="h-succinct-proofs" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Succinct Proofs</h3><p>Emerging cryptographic proofs like zero-knowledge proofs and zk-SNARKs can potentially be used to prove the validity of state transitions in a block without revealing any of the underlying transaction data. 
For example, validity proofs can prove that a rollup block transition is fully valid without exposing any of the private transaction data used in the rollup itself.</p><p>However, data still fundamentally needs to be available somewhere for full nodes to properly update their local states. If the underlying transaction data for a block is completely withheld by the block producer, full nodes cannot accurately track the latest state balances and integrity. Succinct proofs guarantee validity of state changes, but not the availability of the underlying data driving those changes.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Data availability is a critical challenge that must be addressed as blockchains scale transaction volumes and transition to advanced architectures like shards and rollups. Nevertheless, it is encouraging that multiple viable pathways exist to prevent data availability from becoming a barrier that permanently restricts the scalability and censorship resistance of decentralized blockchain networks as they grow.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/12ef6fabfab5e8d1376367dc8e6f2dd203ed3219cea9d76e3145c03782160474.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Stateless Clients: A Path to Decentralization in Ethereum]]></title>
            <link>https://paragraph.com/@ohotties/stateless-clients-a-path-to-decentralization-in-ethereum</link>
            <guid>J8Xp1rJ7hEVsDR16vMak</guid>
            <pubDate>Fri, 06 Oct 2023 09:32:08 GMT</pubDate>
            <description><![CDATA[As Ethereum usage increases, running a full node becomes more resource intensive and bandwidth intensive. This results in fewer people being able to run full nodes, reducing the decentralization of the network. Additionally, Ethereum struggles to scale as transaction demand increases, leading to network congestion and high gas fees. Stateless clients proposed by Vitalik in 2017 offer a potential solution to both the scalability and decentralization challenges facing Ethereum. The key idea behind stateless cl...]]></description>
            <content:encoded><![CDATA[<p>As Ethereum usage increases, running a full node becomes more resource- and bandwidth-intensive. This results in fewer people being able to run full nodes, reducing the decentralization of the network. Additionally, Ethereum struggles to scale as transaction demand increases, leading to network congestion and high gas fees.</p><p>Stateless clients, proposed by Vitalik in 2017, offer a potential solution to both the scalability and decentralization challenges facing Ethereum. The key idea behind stateless clients is to reduce the storage and bandwidth requirements for running a full node, making it feasible for more people to participate and decentralize the network. This essay will provide an in-depth look at how stateless clients work and their potential benefits and drawbacks.</p><h2 id="h-what-is-the-ethereum-state" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">What is the Ethereum State?</h2><p>To understand stateless clients, we first need to understand the concept of &quot;state&quot; in Ethereum. The Ethereum state refers to the current status of all accounts, contracts, balances, nonces, and storage in the Ethereum world. It can be thought of as a database that stores all relevant information about the Ethereum network at a given point in time.</p><p>The state is persisted in a Merkle Patricia trie, which is essentially a modified Merkle tree that stores key-value pairs. The root hash of this trie summarizes the entire state. After each new block, the state updates based on the transactions in that block. The new state root hash is included in the block header.</p><p>As more accounts, contracts, and transactions are added over time, the Ethereum state grows larger and larger. Today, the state size is over 1TB and increases by tens of gigabytes per year.
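</p><p>To make the role of the state root concrete, the sketch below commits a toy key-value state to a single 32-byte root. It is a deliberate simplification: a plain binary Merkle tree over sorted key-value pairs using SHA-256, not Ethereum&apos;s actual hexary Merkle Patricia trie with Keccak-256 and RLP encoding, and the account names and balances are invented:</p>

```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256 as a stand-in for Ethereum's Keccak-256."""
    return hashlib.sha256(data).digest()

def state_root(state: dict[str, int]) -> bytes:
    """Commit a key-value state to one 32-byte root (simplified binary Merkle tree)."""
    # Hash each (key, value) pair as a leaf, in canonical sorted order.
    leaves = [h(f"{k}:{v}".encode()) for k, v in sorted(state.items())]
    if not leaves:
        return h(b"")
    # Combine pairwise, level by level, until a single root remains.
    while len(leaves) > 1:
        if len(leaves) % 2:
            leaves.append(leaves[-1])  # duplicate the last node on odd-sized levels
        leaves = [h(leaves[i] + leaves[i + 1]) for i in range(0, len(leaves), 2)]
    return leaves[0]

state = {"alice": 100, "bob": 50}
root_before = state_root(state)
state["bob"] = 60                    # a transaction changes one balance...
root_after = state_root(state)
assert root_before != root_after     # ...and the 32-byte commitment changes with it
```

<p>Because any change to any account produces a different root, a block header that carries this root commits to the entire state without containing it. That is also why the root alone is not enough: a node still needs the underlying key-value data, or proofs about it, to act on the state.</p><p>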
This growing state underlies the issues with decentralization.</p><h2 id="h-why-state-growth-causes-problems" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Why State Growth Causes Problems</h2><p>The increasing Ethereum state size causes several key problems:</p><ul><li><p><strong>Longer sync times for new nodes</strong> - It takes an extremely long time for a new node to sync up by processing all historic state changes. This hinders decentralization by making it harder to run new full nodes. Syncing up a new node from genesis currently takes multiple days, up to weeks, on consumer hardware. This represents a major barrier to efficiently spinning up new nodes and allowing more participants to join the network.</p></li><li><p><strong>Increased hardware requirements</strong> - Larger state requires more storage, memory, and processing power to store, access, and update. This blocks less well-resourced users from running nodes. At a minimum, running a fully synced Ethereum node now requires an SSD with 1-2TB of capacity. This is out of reach for many potential node operators.</p></li><li><p><strong>More bandwidth usage</strong> - Broadcasts of new blocks must also include the updated state, requiring more bandwidth. This increases costs for node operators. Currently the state dominates most block broadcasts, so block sizes continue growing. More bandwidth translates to higher costs for node operators.</p></li><li><p><strong>Slower block verification</strong> - Reading and updating a larger state makes block verification slower, limiting transaction throughput. Each transaction requires multiple storage reads and writes to update balances, nonces, contract state, etc. A larger state means more reads/writes per block, reducing how many transactions can be processed per second.</p></li><li><p><strong>Permanent storage costs</strong> - Once data is added to the state, it must be stored forever. This creates unbounded state growth. 
There is currently no mechanism to actively delete old and unused state data, so state retention costs increase indefinitely as long as Ethereum continues operating.</p></li></ul><h2 id="h-stateless-clients-explained" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Stateless Clients Explained</h2><p>Stateless clients provide a way to verify new blocks without needing access to the full Ethereum state. They utilize cryptographic proofs called &quot;witnesses&quot; that prove the validity of state changes in a block, without having the underlying state data.</p><p>Here&apos;s how stateless clients work at a high level:</p><ol><li><p>The client stores only block headers and state roots, not full state data. Block headers contain metadata like the root hash of the state trie after that block is processed.</p></li><li><p>When verifying a new block, the client receives a &quot;witness&quot; along with the block. This witness is a set of Merkle proofs that demonstrate specific state updates from transactions are valid.</p></li><li><p>The witness contains Merkle proofs of specific state values needed to process transactions. For example, the account balances or contract storage slots that were updated.</p></li><li><p>The client uses the witness to ensure the transactions are valid against the last known state root. The proofs authenticate that the state changes match the previous root.</p></li><li><p>If valid, the client updates to the new state root provided in the block header. This new state root will be used to verify the next block.</p></li></ol><p>By using witnesses to verify state instead of storing the full state locally, stateless clients gain several advantages:</p><ul><li><p><strong>Very fast sync time</strong> - no need to replay historic state changes. A stateless client can sync almost instantly with just the block headers.</p></li><li><p><strong>Low storage requirements</strong> - state roots are only 32 bytes.
Instead of hundreds of GB of state, only block headers are needed.</p></li><li><p><strong>Less bandwidth</strong> - only block headers and witnesses transferred, not full state. Bandwidth usage is minimized.</p></li><li><p><strong>Quick verification</strong> - witnesses contain only small relevant state subsets. Only the updated accounts/storage touched are proved.</p></li><li><p><strong>Easy light client support</strong> - light clients can easily verify proofs. The light client model is very compatible with stateless verification.</p></li></ul><h2 id="h-challenges-with-stateless-clients" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Challenges with Stateless Clients</h2><p>While stateless clients enable some major benefits, there are also significant technical challenges to overcome:</p><ul><li><p><strong>Witness size</strong> - witnesses could be too large to transmit efficiently. If full Merkle proofs are used, they may exceed block size limits.</p></li><li><p><strong>Witness creation</strong> - generating optimal witnesses is complex for block proposers. Proposers must assemble the right proof fragments to verify each transaction.</p></li><li><p><strong>No witness incentives</strong> - providing witnesses earns no direct rewards. Unlike mining, there is no built-in incentive structure for witness creation.</p></li><li><p><strong>Temporary data</strong> - witnesses prove state at one point in time, requiring regeneration. Witnesses cannot be reused as the state progresses.</p></li><li><p><strong>State storage</strong> - someone still needs to maintain the full state to produce witnesses. Stateless verification relies on stateful witness generation.</p></li><li><p><strong>Complex applications</strong> - some contracts may rely on large state subsets, bloating witnesses. 
For example, contracts that update many storage slots per transaction.</p></li></ul><h2 id="h-possible-solutions" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Possible Solutions</h2><p>Researchers have proposed various solutions to address these challenges:</p><ul><li><p><strong>Verkle trees</strong> - special data structures to reduce witness sizes. Verkle trees use succinct cryptographic commitments to minimize proof size.</p></li><li><p><strong>Witness caches</strong> - proposers could maintain recent witnesses to reuse. Caching witnesses that are likely to be relevant again amortizes creation costs.</p></li><li><p><strong>Protocol incentives</strong> - reward mechanisms for providing useful witnesses. New incentive structures could compensate witness creation.</p></li><li><p><strong>Intermediate state roots</strong> - track roots over time to avoid regenerating proofs. Maintaining partial roots could allow witness fragments to be reused.</p></li><li><p><strong>State rent</strong> - require payments to maintain state long term, pruning unused state. Rent forces cleanup of stale storage to limit proof size.</p></li><li><p><strong>Partitioned witness model</strong> - split state handling between proposers and verifiers. Have some dedicated proposer nodes generate witnesses.</p></li></ul><p>There are tradeoffs between these approaches, and further research is needed to discover optimal implementations. Fortunately, the rapid innovation happening in zero-knowledge cryptography could open up new possibilities for efficient stateless clients.</p><h2 id="h-potential-impact" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Potential Impact</h2><p>If the technical obstacles can be overcome, stateless clients could significantly advance Ethereum:</p><ul><li><p>Faster syncs and verification to support higher transaction throughput.
Stateless validation will drastically speed up block processing.</p></li><li><p>Reduced resource requirements to run nodes, improving decentralization. Laptops and hobbyists could realistically run full nodes.</p></li><li><p>Better support for light clients like mobile wallets. State proofs are highly compatible with the light client model.</p></li><li><p>Smoother introduction of sharding, with stateless verification between shards. Cross-shard transactions can utilize efficient state proofs.</p></li><li><p>Ability to delete and prune old state data that is no longer useful. State growth can be actively managed instead of unbounded.</p></li><li><p>More flexibility for node operators to customize state based on needs. Nodes could tailor state retention policies to use cases.</p></li><li><p>Transition to a model where computation and bandwidth matter more than storage. Architecture shifts towards a more cloud-friendly model.</p></li></ul><p>There are also some potential risks, like increased vulnerability to DDoS attacks and blockchain history only being reliably stored by a few node operators. However, cryptographic proofs could reduce these risks. Overall, stateless clients are one of the most promising approaches to overcoming Ethereum&apos;s current limitations.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Ethereum&apos;s growing state size poses challenges for decentralization as adoption increases. Stateless clients present a way out by enabling nodes to verify transactions without the full blockchain state. This could eventually allow mobile phones to run Ethereum nodes, greatly increasing decentralization.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/a0148b0b57593754634fc515e3ec2cddcd9b02704202eddd51ba434b9d85b823.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Autonomous Worlds: A Technical Deep Dive]]></title>
            <link>https://paragraph.com/@ohotties/autonomous-worlds-a-technical-deep-dive</link>
            <guid>pOZ7wG06CpmaLWqVsIh4</guid>
            <pubDate>Mon, 25 Sep 2023 08:26:32 GMT</pubDate>
            <description><![CDATA[In recent years, a new paradigm of blockchain-based gaming has emerged that promises to revolutionize the way we build and experience virtual worlds. This paradigm is known as autonomous worlds – persistent, decentralized virtual environments where community ownership and on-chain logic enable new forms of emergent gameplay, economic incentives, and user-generated content. In this extensive technical article, we will explore what exactly autonomous worlds are, why they represent such a radica...]]></description>
            <content:encoded><![CDATA[<p>In recent years, a new paradigm of blockchain-based gaming has emerged that promises to revolutionize the way we build and experience virtual worlds. This paradigm is known as autonomous worlds – persistent, decentralized virtual environments where community ownership and on-chain logic enable new forms of emergent gameplay, economic incentives, and user-generated content.</p><p>In this extensive technical article, we will explore what exactly autonomous worlds are, why they represent such a radical shift from traditional gaming models, how they are built, the infrastructure that supports them, and the future possibilities they unlock. Strap in for an in-depth look at the frontier of crypto-native gaming.</p><h2 id="h-what-are-autonomous-worldsaw" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>What are Autonomous Worlds (AW)?</strong></h2><p>At the highest level, an autonomous world is a virtual environment powered by a decentralized blockchain network, where the rules, assets, and state of the world exist on-chain. This means that rather than being controlled by a centralized game developer, autonomous worlds are collectively owned and operated by their community of users. Features of autonomous worlds include:</p><p><strong>All game logic and state encoded on-chain via smart contracts.</strong> Having all game logic and state on-chain means that every core part of the game, from character attributes to item properties, is defined and executed by smart contracts on the blockchain. This provides complete transparency into how the game works. Players can inspect the actual code behind game mechanics and verify that things like the scarcity and properties of NFT items are genuine. Storing state on-chain also enables features like provable randomness and transparency into factors like dynamic pricing or drop rates.
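</p><p>As a loose illustration of what &quot;rules enforced by code&quot; means, the following Python sketch models a contract whose storage holds every item and whose methods are the only way to change it. This is a stand-in, not real smart-contract code, and all names (GameWorld, mint, transfer, the supply cap) are invented for the example:</p>

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """An in-game asset: its attributes and owner live in (simulated) contract storage."""
    item_id: int
    power: int
    owner: str

@dataclass
class GameWorld:
    """Toy stand-in for an on-chain game contract. Because this logic would be
    public on-chain code, players could audit scarcity and transfer rules directly."""
    max_supply: int = 100
    items: dict[int, Item] = field(default_factory=dict)

    def mint(self, owner: str, power: int) -> Item:
        if len(self.items) >= self.max_supply:
            raise ValueError("supply cap reached")      # scarcity enforced by code
        item = Item(item_id=len(self.items), power=power, owner=owner)
        self.items[item.item_id] = item
        return item

    def transfer(self, item_id: int, sender: str, recipient: str) -> None:
        item = self.items[item_id]
        if item.owner != sender:                        # ownership check, not policy
            raise PermissionError("only the owner can transfer")
        item.owner = recipient

world = GameWorld(max_supply=2)
sword = world.mint("alice", power=7)
world.transfer(sword.item_id, "alice", "bob")
assert world.items[sword.item_id].owner == "bob"
```

<p>On a real chain, the same guarantees come from the contract&apos;s bytecode and storage being public and its state transitions being validated by every node, rather than from a private server&apos;s goodwill.</p><p>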
Overall, on-chain logic and state mean that the entire core &quot;backend&quot; of the game runs transparently and autonomously, rather than on private centralized servers.</p><p><strong>On-chain assets represented as non-fungible tokens (NFTs).</strong> NFTs are used to represent unique in-game assets like characters, cosmetics, power-ups, etc. Each token contains metadata that assigns it to a specific asset, gives it attributes, and tracks ownership on-chain. This provides players with verifiable, scarce digital ownership of in-game items. NFTs can be freely traded on markets without interference from developers. And unlike virtual items on centralized platforms, NFT-based assets persist forever on the blockchain, independently of any game. Players can also use NFTs across different games that recognize the same token standards.</p><p><strong>Ability for anyone to permissionlessly build extensions or modifications to the core game contracts.</strong> This allows for emergent gameplay as the community iteratively expands on the initial game. One of the most powerful aspects of autonomous worlds is that the core smart contracts that govern gameplay are open source. This allows any developer to build addons, game modes, levels, and other extensions that build on the foundations of existing games. For example, someone could create a new tournament format that connects to the core combat logic. If popular with players, such extensions can be adopted into the main game. This permissionless innovation leads to faster iteration and community-driven evolution.</p><p><strong>Decentralized governance mechanisms enabled through tokens or DAOs.</strong> Governance gives the community control over key parameters and the future evolution of the world&apos;s rules and incentives, often in the form of vote-weighted tokens or DAOs.
This gives the community power to change things like token economic parameters, gameplay rules, or even the roadmap and feature scope. Proposals can be made and voted on transparently on-chain without top-down control by any one party. Power is decentralized between regular players, core contributors, third-party developers, token holders, and other community members. This gives the community a sense of true ownership in the evolution of the world. However, decentralized governance at scale is still an unsolved challenge.</p><p><strong>Permissionless economies where users can freely exchange value, enabled by direct integration with cryptocurrencies and DeFi protocols.</strong> Since autonomous worlds integrate with general purpose cryptocurrencies and DeFi apps, their economies are permissionless. Game assets represented as NFTs can be freely traded in markets or used as collateral in DeFi protocols. Tokens earned in-game can be converted and taken out into the larger crypto-economy. This allows for true play-to-earn functionality and gives players economic upside from the effort they put into the world. Permissionless economies enable greater diversity of business models and incentives aligned with open participation.</p><p>Unlike traditional centralized games, no single entity has top-down control in an autonomous world. The underlying blockchain technology gives users sovereignty over the assets they own and actions they take within these environments. 
While the initial game designers may set the &quot;physics&quot; of the world by encoding foundational rules into immutable smart contracts, governance power ultimately rests with the community.</p><h2 id="h-why-autonomous-worlds-represent-a-radical-shift" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Why Autonomous Worlds Represent a Radical Shift</strong></h2><p>Autonomous worlds introduce a radically new way of building and experiencing virtual environments enabled by blockchain technology. Here are some of the key differences from traditional gaming:</p><p><strong>True digital ownership.</strong> Rather than virtual items existing on isolated centralized game servers, blockchain-based NFTs let users verifiably own assets and take them anywhere. NFTs have metadata that proves ownership of a unique digital asset, secured by the blockchain. This gives users full control over their in-game items, allowing them to freely trade or move them between different games and marketplaces. Users don&apos;t have to worry about losing access to their items if a particular game goes offline. And game developers don&apos;t have centralized control over the in-game economy.</p><p><strong>Permissionless extendability.</strong> Any developer can build on top of an autonomous world without needing approval from the core team. This enables faster iteration and community-driven evolution. The open and transparent nature of smart contracts means anyone can view and build on top of the core logic that powers an autonomous world. Developers can create new experiences like mini-games, unique levels, quests or even entirely new gameplay modes. If these extensions prove popular, they can be adopted by the broader community. 
This permissionless innovation leads to greater creativity as developers build on each others&apos; work to push the boundaries of what&apos;s possible.</p><p><strong>Persistence and censorship-resistance.</strong> Autonomous worlds continue running indefinitely on blockchain networks and are resilient against censorship or takedowns. An often overlooked aspect of fully on-chain games is their ability to function largely by themselves once they&apos;re deployed on-chain. Given the resilient nature of blockchains (they can stay online for as long as there are validators), on-chain games have a digital permanence: they exist as code for as long as the blockchain keeps running. Theoretically, if the underlying blockchain were still operating 300 years from now, the game and its logic would still be stored there, and players could still play.</p><p><strong>Play-to-earn economies.</strong> On-chain assets and currencies integrate with DeFi and real-world markets, enabling closed gameplay loops to give way to open economies that let users monetize their time and effort. Users can earn real value and fungible cryptocurrency tokens by performing tasks and playing the game. These tokens can be transferred to a user&apos;s wallet and traded on markets outside the game. Conversely, tokens earned outside the game can be brought in and used. This blurs the line between playing just for fun and playing as a way to earn real income. New play-to-earn strategies and business models emerge from these open economies.</p><p><strong>Decentralized governance.</strong> Rather than developers holding unilateral control, blockchain-based coordination mechanisms like DAOs enable users to collectively govern autonomous worlds. Any changes to game parameters or logic can be proposed and voted on in a transparent manner. Power is decentralized between regular players, core contributors, third-party developers, token holders, and other community members.
This gives the community a sense of true ownership in the evolution of the world.</p><p><strong>Composability between worlds.</strong> Assets and systems built for one world can more easily interoperate with other on-chain environments. As an example, a particular cryptocurrency token earned in one game, can be used in another game that recognizes it. Or users can bring their customizable avatars and cosmetic NFTs between different worlds and blockchain-based metaverse environments. This composability breeds rich interconnected ecosystems of experiences rather than closed-off worlds.</p><p>These represent fundamental expansions of user ownership, control, and economic rights within game environments, aligned with the ethos of decentralization. While not without downsides, the autonomy enabled by crypto-native architecture offers radical new creative possibilities.</p><h2 id="h-how-autonomous-worlds-are-built" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>How Autonomous Worlds Are Built</strong></h2><p>Engineering autonomous worlds requires expertise across game design, blockchain development, mechanism design, and systems thinking. Here are key technical components involved:</p><ul><li><p><strong>Smart contracts that encode the &quot;physics&quot; and core rulesets.</strong> These define things like token/NFT standards, base character stats/abilities, minting logic, etc.</p></li><li><p><strong>Frontend implementation.</strong> This interprets the on-chain state to render graphics, gameplay UI, etc. for users. Can be implemented via traditional engines like Unity or specialized blockchain tools.</p></li><li><p><strong>Cryptoeconomic design.</strong> Effective incentive structures that drive desired user behaviors and prevent exploits. 
Influences factors ranging from gameplay rewards to governance.</p></li><li><p><strong>Technical architecture.</strong> Low-latency syncing of chain data to clients, modular/upgradeable contracts, and composable interfaces between contracts are all important architectural considerations.</p></li><li><p><strong>Infrastructure requirements.</strong> Scalability via layer 2s, oracles, chain interoperability bridges, and other solutions to support an optimal user experience.</p></li></ul><p>While autonomous worlds can leverage some patterns from traditional game development like combat mechanics and 3D environments, the blockchain aspect poses its own unique constraints and opportunities. Teams need expertise across both domains to build out a robust on-chain gaming stack.</p><h2 id="h-supporting-infrastructure-for-autonomous-worlds" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Supporting Infrastructure for Autonomous Worlds</strong></h2><p>In addition to core game design and blockchain engineering expertise, there are also several supporting technologies and infrastructure elements that are crucial to enabling rich user experiences for autonomous worlds:</p><ul><li><p>Layer 2 scaling solutions like rollups are critical for reducing transaction fees for high-frequency actions and enabling higher throughput.</p></li><li><p>Data layers like The Graph allow efficient querying of on-chain data by frontends.</p></li><li><p>Identity and account abstraction solutions improve UX by masking wallet addresses from users.</p></li><li><p>Oracles allow validating off-chain data to influence on-chain state if needed by the game logic.
</p></li><li><p>Interoperability bridges let users seamlessly move assets between different blockchain environments.</p></li><li><p>Cloud services like backend databases, compute/rendering farms, and messaging queues can complement on-chain logic to enable complex in-game behaviors.</p></li><li><p>Specialized tools like the Lattice framework help developers implement common game patterns like character stats and inventories using reusable libraries.</p></li></ul><p>As blockchain technology matures, the availability and sophistication of solutions like these will vastly expand what&apos;s possible to build as autonomous worlds.</p><h2 id="h-the-future-possibilities" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Future Possibilities</strong></h2><p>If the limitations can be effectively addressed, autonomous worlds enabled by blockchains and crypto-economics could open the door to entirely new possibilities:</p><ul><li><p>Interconnected metaverses - worlds that span narratives, mechanics, and economics.</p></li><li><p>Truly decentralized virtual economies with their own localized cultures and behaviors.</p></li><li><p>New community coordination tools that replace centralized governance.</p></li><li><p>User-generated subnets - worlds created by compositing tools and systems from other worlds.</p></li><li><p>Reputation, identity, and social capital systems tied to chain activity records.</p></li><li><p>Culturally vital digital spaces that capture the energy and buzz of physical scenes like Renaissance Florence, ancient Athens, or early Internet message boards.</p></li></ul><p>By technically encoding community ownership, permissionless composability, and economic alignment, autonomous worlds could facilitate new forms of social organization, creativity, and production.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Conclusion</strong></h2><p>Autonomous worlds represent an ambitious vision of
decentralized gaming environments—underpinned by blockchains and cryptography—that offer users greater creative freedom and ownership. By encoding rules, state, and incentives on immutable and community-controlled networks, they promise to enable new models of user governance, economics, and generative culture.</p><p>The autonomous worlds concept is still in its infancy, but it outlines a compelling north star for the evolution of virtual environments backed by blockchain technology and cryptographic economics. Their emergence would represent another step towards decentralizing digital economies and improving users&apos; rights in virtual worlds. We are only beginning to glimpse the possibilities of community-owned digital worlds.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/6d6b2340db7e53da2cb0807958bee58850c6477a38f0f0c2db7effd3c9347b27.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Demystifying Intent-Driven Transactions]]></title>
            <link>https://paragraph.com/@ohotties/demystifying-intent-driven-transactions</link>
            <guid>M5fP44qWL2Cgv9UiR4RF</guid>
            <pubDate>Wed, 16 Aug 2023 02:57:05 GMT</pubDate>
            <description><![CDATA[Limitations of Today&apos;s Transaction ModelTransactions are the fundamental means for users to interact with blockchains like Ethereum today. However, the transaction model as it exists surfaces several core limitations:Opacity - When submitting transactions, users have restricted visibility into how they will actually execute. Outcomes are heavily dependent on factors like network congestion, miner/validator behavior, overall blockchain state, and more at the specific time of execution. Th...]]></description>
            <content:encoded><![CDATA[<h2 id="h-limitations-of-todays-transaction-model" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Limitations of Today&apos;s Transaction Model</strong></h2><p>Transactions are the fundamental means for users to interact with blockchains like Ethereum today. However, the transaction model as it exists exhibits several core limitations:</p><ul><li><p><strong>Opacity</strong> - When submitting transactions, users have restricted visibility into how they will actually execute. Outcomes are heavily dependent on factors like network congestion, miner/validator behavior, overall blockchain state, and more at the specific time of execution. This opacity leaves users vulnerable to exploits like front-running, back-running, and other &quot;maximal extractable value&quot; (MEV) techniques.</p></li><li><p><strong>Lack of guarantees</strong> - Transactions offer no inherent guarantees that user goals will be achieved as intended. Frequently, desired outcomes require coordinating across multiple domains, protocols, and decentralized applications in an atomic fashion. However, executing transactions atomically across decentralized environments with finality remains highly challenging today.</p></li><li><p><strong>Relinquished control</strong> - By signing raw transactions, users cede substantial control and authority to the intricacies of smart contract code and backend infrastructure. Transactions enable arbitrary computation dependent on implementation details. Users relinquish too much authority to decentralized applications and their creators when transacting through today&apos;s paradigm.</p></li><li><p><strong>Poor legibility</strong> - The transaction model forces users to reason about low-level details like nonces, gas fees, and other blockchain arcana. Transactions provide limited ability for users to express intents in plain terms that map to mental models.
Lack of legibility impedes mainstream adoption.</p></li><li><p><strong>Inflexibility</strong> - Transactions offer minimal built-in support for cross-domain composability, privacy, and other progressive capabilities. Applications must implement complex logic and conventions to coordinate across environments and protect users.</p></li><li><p><strong>Centralization risks</strong> - The high degrees of freedom transactions grant to miners, validators, and relayers allow them to readily extract value through reordering, censorship, and other techniques. Lack of visibility into execution exacerbates user vulnerability to MEV exploitation.</p></li></ul><h2 id="h-what-are-intents" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>What Are Intents?</strong></h2><p>Intent-driven transaction frameworks aim to address the limitations above by inverting today&apos;s transaction model. Instead of dictating specific execution steps, users simply declare desired outcomes.</p><p>For example, a user could sign an intent stating &quot;I want to pay Bob 10 ETH&quot; without worrying about the underlying transaction details like nonces and gas fees. Intents encapsulate goals, not means.</p><p>Intents align more closely with how users think about transactions in plain terms – expressing objectives rather than execution paths. Specialized network participants called &quot;solvers&quot; then attempt to fulfill user intents optimally and atomically across applications while minimizing rent extraction.</p><p>In an ideal intent-based system, users could seamlessly perform complex actions across domains by signing a single high-level intent. 
Meanwhile, solvers coordinate to discover and satisfy user goals in a decentralized manner.</p><h2 id="h-how-do-intents-work" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>How Do Intents Work?</strong></h2><p>There are several key steps involved in the lifecycle of an intent-based transaction:</p><ul><li><p>Intent Creation – The user crafts and signs the intent indicating their desired outcome via a client like a wallet.</p></li><li><p>Intent Dissemination – The intent propagates to an &quot;intent pool&quot; allowing discovery by solvers. This could use public gossip protocols or more permissioned dissemination.</p></li><li><p>Intent Matching – Solvers monitor the pool for compatible intents to aggregate or route, forming fully specified state transitions.</p></li><li><p>Intent Execution – Solvers submit optimized transactions enacting bundled intents to the blockchain for execution and settlement.</p></li><li><p>Validation – Oracles and verification mechanisms ensure user intents were fully satisfied before releasing payments.</p></li></ul><h2 id="h-benefits-of-intent-driven-transactions" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Benefits of Intent-Driven Transactions</strong></h2><p>Intent architectures offer several advantages over today&apos;s transaction models:</p><ul><li><p>Increased user control – Users set constraints rather than relinquishing full authority.</p></li><li><p>Customization – Users decide personalized parameters like privacy, atomicity, counterparties, and fees.</p></li><li><p>Cross-domain composability – Intents seamlessly specify outcomes across applications, protocols, and blockchains.</p></li><li><p>Mitigated MEV – Encryption and programmability hinder rent extraction by solvers.</p></li><li><p>Enhanced outcomes – Specialized solvers compete to optimize intent fulfillment.</p></li><li><p>Legibility – Intents align closer with user mental models of transacting.</p></li></ul><h2 
id="h-core-principles-of-intent-driven-interactions" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Core Principles of Intent-Driven Interactions</strong></h2><p>Several core principles underpin intent-driven interactions:</p><ul><li><p>Declarative model - Users specify desired outcomes rather than low-level execution steps. Intents encapsulate goals, not means.</p></li><li><p>Conditional authority - User funds are only released upon evidence that their intent was fully satisfied. Intents grant limited rather than absolute authority.</p></li><li><p>Competitive solvers - Anyone can attempt to fulfill user intents through optimized transactions. Permissionless competition promotes efficiency and transparency.</p></li><li><p>Enhanced customization - Users decide personalized parameters like privacy, atomicity, counterparties, and fee structures. Intents are highly customizable to user needs.</p></li><li><p>Native composability - Intents seamlessly specify outcomes across applications, protocols, and blockchains. Cross-domain coordination is built-in, not bolted-on.</p></li><li><p>Legibility - Intents align with user mental models of goals and outcomes. Execution complexity is abstracted away.</p></li></ul><p>Under this paradigm, users relinquish far less control to the applications and protocols they interact with compared to signing raw transactions. 
Specialized network participants called &quot;solvers&quot; listen for user intents and attempt to fulfill them optimally while minimizing rent extraction.</p><h2 id="h-challenges-in-realizing-intent-infrastructures" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Challenges in Realizing Intent Infrastructures</strong></h2><p>However, significant obstacles remain in architecting performant and decentralized intent platforms. Several emerging projects have proposed initial architectures highlighting key complexities:</p><ul><li><p>Scalability – On-chain intent matching struggles with transaction volumes. Off-chain dissemination risks liveness failures and censorship. Hybrid approaches attempt to balance these tradeoffs.</p></li><li><p>Censorship resistance – Preventing malicious actors from selectively ignoring or censoring user intents is critical yet introduces challenges. Solutions range from global gossip protocols to consensus mechanisms for intent ordering and inclusion.</p></li><li><p>Cross-domain coordination – Enabling seamless composability across applications requires optimizing synchronization across solvers working on separate domains. Innovations like shared sequencers may help.</p></li><li><p>Collusion resistance – Mechanisms are needed to hinder malicious actors from manipulating auctions, intent matching, and other solver-based processes. Approaches like negative starting fees show promise.</p></li><li><p>User experience – For mainstream viability, users require simple interfaces abstracting away the complexity of crafting and disseminating intents. 
Solutions like wallets owning intent abstraction introduce their own risks.</p></li><li><p>Computation – The sophisticated predicates required to evaluate intents likely necessitate off-chain computation enabled by cryptographic proofs like ZK-SNARKS, at least until on-chain scaling substantially matures.</p></li></ul><h2 id="h-early-intent-applications-and-use-cases" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Early Intent Applications and Use Cases</strong></h2><p>Despite existing limitations, intents may already provide value in narrower applications where tradeoffs are clearer. Some emerging use cases include:</p><ul><li><p>Cross-domain trading - Users express abstract intents to go long or short particular assets using capital across multiple protocols. Solvers coordinate borrowing, swaps, transfers, and collateral management across chains and rollups.</p></li><li><p>Algorithmic gaming strategies - Rather than dictating each transaction, players issue broad intents like &quot;avoid combat encounters&quot; or &quot;maximize yield generation&quot;. Solvers translate these into optimized bot strategies.</p></li><li><p>Private order dissemination - Traders propagate encrypted intents representing trading interests to solvers, only revealed when certain predefined on-chain conditions are triggered.</p></li><li><p>Decentralized limit order books - Users submit limit prices for assets, executed by solvers through decentralized batch auction clearing algorithms.</p></li><li><p>Collusion resistant auctions - Bidders declare maximum willingness to pay, fulfilled by solvers using mechanisms like secret blind bids and negative starting prices.</p></li></ul>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/739e37d13c430b477d5d8f0d5963729395dfb17f2d070f1dfcdd89421b703f89.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Decentralize Rollup I: Decentralized Sequencing]]></title>
            <link>https://paragraph.com/@ohotties/decentralize-rollup-i-decentralized-sequencing</link>
            <guid>x1cZkkiy9yU1GYhEg2Ju</guid>
            <pubDate>Fri, 11 Aug 2023 12:07:22 GMT</pubDate>
            <description><![CDATA[Rollups have emerged as a leading technique for scaling Ethereum and blockchain networks more broadly. By offloading transaction data availability and execution off-chain, rollups alleviate the throughput constraints of Layer 1 chains while still inheriting their security guarantees. However, most rollups today rely on centralized sequencer nodes operated by the rollup development teams. Centralized sequencers provide excellent performance in terms of throughput and latency but come with down...]]></description>
            <content:encoded><![CDATA[<p>Rollups have emerged as a leading technique for scaling Ethereum and blockchain networks more broadly. By offloading transaction data availability and execution off-chain, rollups alleviate the throughput constraints of Layer 1 chains while still inheriting their security guarantees. However, most rollups today rely on centralized sequencer nodes operated by the rollup development teams. Centralized sequencers provide excellent performance in terms of throughput and latency but come with downsides:</p><ul><li><p><strong>Liveness</strong> - Centralized sequencers are a single point of failure. If the sole sequencer goes offline, the entire rollup halts. This jeopardizes the guarantee that the system will remain available and process transactions.</p></li><li><p><strong>Censorship Resistance</strong> - Centralized sequencers can arbitrarily censor transactions or users. There is no recourse if the operator chooses to block certain activity.</p></li><li><p><strong>Rent Extraction</strong> - Centralized sequencers can impose monopoly pricing for transactions given their total control over sequencing and inclusion. This manifests in direct fees as well as indirect value extraction via front-running, sandwich attacks, and other forms of MEV.</p></li><li><p><strong>Interoperability</strong> - With each rollup running its own siloed sequencer, cross-rollup composability and bridging requires complex custom integrations. Valuable features like cross-chain atomic transactions are difficult to implement across decentralized domains.</p></li></ul><p>These limitations motivate the need to decentralize sequencers. But this is technically non-trivial, as sequencers are performance-critical components where decentralization can easily bottleneck throughput if not done carefully. 
Let&apos;s explore prominent decentralization approaches.</p><p><strong>Permissioned Sequencers with Fair Ordering</strong></p><p>One approach, explicitly focused on MEV mitigation, is to utilize permissioned sequencer sets with imposed fair ordering. In this model, a small set of authorized sequencer nodes collaborate to determine transaction order and block contents. A typical approach is &quot;first-in, first-out&quot; ordering where transactions are included in the order that sequencers observe them in their local mempools.</p><p>Consensus protocols like Raft or Tendermint enable sequencers to agree on a canonical ordering that matches wall clock timing as closely as possible. Front-running is deterred since the first received transaction gets priority. Other MEV vectors like sandwich attacks are eliminated given the lack of control over order beyond delivery timestamps. Cryptographic techniques like threshold encryption of transactions prior to ordering further hamper manipulation by keeping contents hidden.</p><p>The main downside is the permissioned nature of the sequencer set. An external authority determines who is included as a sequencer and must actively monitor for misbehavior. Admittance criteria and governance processes are crucial for maintaining a responsible sequencer set. MEV is not fully eliminated, as timing manipulation and latency advantages still enable some profit. But it meaningfully curtails the most abusive practices.</p><p><strong>Permissionless Proof-of-Stake Sequencing</strong></p><p>A conceptually simple technique is to make the sequencer set permissionless by implementing proof-of-stake based participation. The rollup defines a native staking token. Any entity can join the sequencer set by staking the requisite amount of tokens, subjecting themselves to slashing penalties for malicious actions. 
An on-chain smart contract manages the stakes and coordinates leader election.</p><p>A typical approach is &quot;round robin&quot; selection where sequencers take turns in a fixed order proposing blocks. Alternatively, each epoch can randomly select a stake-weighted subset of sequencers to be eligible leaders. In both cases, participation rates are proportional to stake ownership. This enforces Sybil resistance - attackers must accumulate a substantial stake to mount an attack or compromise liveness.</p><p>Permissionless PoS sequencers provide open access and avoid centralized control. Liveness is strengthened compared to a single sequencer through redundancy. However, it is not as robust as BFT consensus protocols that require explicit confirmation from multiple validators. Censorship resistance depends on having a sufficiently decentralized stake distribution. MEV concerns remain unaddressed, as any elected leader can extract rents during their epoch. Overall, permissionless PoS represents a baseline improvement but lacks mechanisms to handle more subtle risks around manipulation.</p><p><strong>MEV Auctions</strong></p><p>MEV auctions take the permissionless approach further by auctioning off sequencing rights to the highest bidder. During each epoch, participants bid based on the expected profit from transaction fees and MEV they can extract as the sequencer. The winner pays their bid amount to the rollup treasury and earns the right to sequence transactions and capture all profits during the epoch.</p><p>Properly designed, MEV auctions redistribute value from rent extraction to fund public goods. They provide permissionless participation and leverage financial incentives to dynamically allocate sequencing rights. However, auctions favor centralized actors with sufficient capital to consistently win the bids. Less prominent participants struggle to ever win a meaningful share of epochs. 
This leads to concentration of sequencing power and MEV profits over time. The emergence of a sequencing monopoly recreates many of the problems associated with centralized sequencers.</p><p><strong>Hybrid Approaches</strong></p><p>Tradeoffs exist in both permissionless and permissioned paradigms. This has motivated hybrid solutions that blend elements of both. For instance, a root permissioned set of sequencers could be expanded with a staking mechanic for permissionless participation. Thresholds guarantee minimum representation from authorized nodes while allowing open entry. Alternatively, MEV auctions could be restricted to registered candidates only.</p><p>Cryptographic techniques offer another approach. Multi-party computation can secure ordering and leader election processes without requiring a fully permissioned system. Secure enclaves like Intel SGX provide trusted execution environments that isolate sensitive computation like transaction ordering without centralized intermediaries. Zero-knowledge proofs enable transparent verification of correct sequencing without exposing raw transaction data.</p><p><strong>Shared Sequencing vs. Decentralized Sequencing</strong></p><p>There are two leading paradigms for decentralizing rollup sequencers - shared sequencing and direct decentralized sequencing.</p><p>In shared sequencing, a network of nodes provides sequencing-as-a-service to multiple rollups. Rollups plug into the shared sequencer to handle their transaction ordering and block production needs. The shared sequencer pool is decentralized, with nodes participating in a consensus protocol to agree on transactions and propose blocks. Individual rollups do not run their own sequencer nodes.</p><p>In contrast, direct decentralized sequencing refers to each rollup deploying its own decentralized network of sequencer nodes. The rollup operates a custom consensus protocol between its sequencer pool to handle sequencing duties. 
Sequencers are dedicated to a single rollup.</p><p>Shared sequencing provides economies of scale and network effects for decentralization. A single robust sequencer network can service numerous rollups. Individual rollups avoid the overhead of bootstrapping and maintaining their own decentralized sequencer pools. Shared sequencing also enables seamless interoperability and composability between connected rollups.</p><p>However, direct decentralized sequencing allows each rollup to tailor protocols and incentives to suit its specific needs. Hybrid coordination between sequencing and execution nodes is simpler within a single rollup domain. Distributing value generation across many decentralized sequencer pools reduces systemic risk compared to a dominant shared network.</p><p>Both approaches have merits and can co-exist in a heterogeneous ecosystem. Lightweight “lazy” rollups may opt for shared sequencing while feature-rich rollups may warrant custom decentralized sequencers. Crypto-economic mechanisms around shared security and interchangeable work tokens help align incentives between shared and direct paradigms.</p><p><strong>Decentralized Validation</strong></p><p>As discussed earlier, decentralized sequencing mitigates issues around liveness, censorship resistance, rent extraction, and composability faced by centralized sequencer designs. Alongside sequencing, the validation process which checks state transitions and block integrity must also be decentralized for proper rollup functionality.</p><p>In validity rollups, block producers generate zero-knowledge proofs that attest to the validity of state transitions. A decentralized network of verifier nodes checks these proofs against the previous state root to confirm blocks are valid. In fraud-proof rollups, a decentralized network of watcher nodes actively monitors block data and state changes to identify invalid transitions. 
Watchers can post fraud proofs to slash block producers that include invalid transactions.</p><p>Decentralized validation enables trustless and transparent verification of rollup state. Without it, users would need to simply trust claims from the rollup provider that blocks are valid. Distributing proof generation and verification keeps the system efficient while eliminating centralized points of control.</p><p><strong>The Path Ahead</strong></p><p>While decentralized sequencing is crucial for realizing rollups&apos; full potential, it is just one piece of a complex puzzle. Issues around scalable data availability, cross-rollup interoperability, developer experience, and more must all co-evolve. Crypto-economic mechanisms need to provide sustainable security as bridges dissolve the boundary between Layer 1 and Layer 2. Meanwhile, sequencers sit at the core of rollup architectures, and a growing number of projects are acknowledging the need to decentralize them.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/c306bd594d00c655bda96941dc1bb013b0ed081a5ae266935b47ac6cd51281e3.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Exploring Protocol-Enforced Proposer Commitments(PEPC)]]></title>
            <link>https://paragraph.com/@ohotties/exploring-protocol-enforced-proposer-commitments-pepc</link>
            <guid>Aox9w4f94ed3c2lfTB0i</guid>
            <pubDate>Sat, 05 Aug 2023 17:01:28 GMT</pubDate>
            <description><![CDATA[In Ethereum&apos;s evolution towards scalability via rollups and layer 2 solutions, block creation duties are increasingly outsourced from proposers (validators) to external block "builders." Most designs for formalizing this proposer-builder separation (PBS) focus on enshrining a specific mechanism, such as a full block auction, to facilitate proposer-builder exchanges. However, restricting the protocol to enabling only one type of exchange may not serve proposers optimally across the divers...]]></description>
            <content:encoded><![CDATA[<p>In Ethereum&apos;s evolution towards scalability via rollups and layer 2 solutions, block creation duties are increasingly outsourced from proposers (validators) to external block &quot;builders.&quot; Most designs for formalizing this proposer-builder separation (PBS) focus on enshrining a specific mechanism, such as a full block auction, to facilitate proposer-builder exchanges. However, restricting the protocol to enabling only one type of exchange may not serve proposers optimally across the diverse spectrum of potential duties they could offload.</p><p>Protocol-enforced proposer commitments (PEPC) proposed by <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/barnabemonnot">Barnabé Monnot</a> in 2022 offers a more flexible approach by providing generalized infrastructure for proposers to make credible commitments to any outsourced block building task. We will explore the potential benefits, open questions, roadblocks, and additional context around PEPC as an alternative path to unfetter proposers and allow permissionless innovation in outsourcing mechanisms.</p><h2 id="h-the-case-for-flexible-proposer-commitments" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Case for Flexible Proposer Commitments</strong></h2><p>Let us first consider the rationale behind PEPC&apos;s more open-ended design.</p><p>In the vision for Ethereum&apos;s &quot;endgame&quot; with widespread data sharding and rollup adoption, proposers will likely need to outsource an expanding array of duties essential for block validity. 
These may encompass outsourcing tasks like:</p><ul><li><p>Full block construction to builders</p></li><li><p>Transaction ordering to mitigate MEV concerns</p></li><li><p>Specific valid transaction inclusion through inclusion lists</p></li><li><p>Validity proofs for rollup blocks or ZK execution environments</p></li><li><p>Retrieval of data availability fragments</p></li><li><p>Any number of other critical functions in future protocol upgrades</p></li></ul><p>Enshrining specific exchange mechanisms for each proposer duty is daunting. The &quot;optimal&quot; market structure is challenging to pin down, let alone encode immutably in the protocol. Meanwhile, proposers require flexible infrastructure to make credible commitments to the outsourcing of their duties.</p><p>PEPC provides this flexibility by allowing proposers to register arbitrary commitments, expressed via EVM execution, that external parties can rely upon. The key principles are:</p><ul><li><p>Proposers can signal commitments to use any external service essential for block validity.</p></li><li><p>The protocol enforces satisfaction of commitments before considering blocks valid.</p></li><li><p>Anyone can provide services to proposers as long as they satisfy registered commitments.</p></li><li><p>There is no single imposed mechanism all proposers must adopt.</p></li></ul><p>In other words, PEPC focuses on providing <em>infrastructure</em> for commitment credibility, rather than mandating a <em>strategy</em> for outsourcing. 
Proposers choose how to leverage this infrastructure based on their needs and preferences.</p><h2 id="h-augmenting-validator-balance-tracking" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Augmenting Validator Balance Tracking</strong></h2><p>As background, there already exist some out-of-protocol mechanisms like<a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://eigenlayer.xyz/"> Eigenlayer</a> that allow validators to make commitments via external contracts. However, the Ethereum protocol is unaware of these commitments.</p><p>For instance, validators may &quot;restake&quot; their ETH deposits to Eigenlayer as collateral for providing services. Failure to deliver results in balances being slashed on Eigenlayer but not in the core protocol. This leads to discrepancies between actual validator incentives and the protocol&apos;s perception.</p><p>One simple augmentation, which we&apos;ll call &quot;in-protocol Eigenlayer&quot; (IP-Eigenlayer), is allowing messages to reduce validator balances in the core protocol based on their performance of out-of-band duties. This synchronizes their real balance with the protocol&apos;s view.</p><p>However, IP-Eigenlayer provides only limited guarantees. Malicious proposers can still deviate from commitments if the reward exceeds the slashed amount. 
We need the protocol to enforce commitments, not just record outcomes.</p><h2 id="h-exploring-optimistic-and-pessimistic-enforcement" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Exploring Optimistic and Pessimistic Enforcement</strong></h2><p>To move from recording outcomes to enforcing commitments, the protocol must differentiate between:</p><ol><li><p>Proposer made a commitment and external party delivered → Process payout</p></li><li><p>Proposer made a commitment and external party did <em>not</em> deliver → No payout</p></li><li><p>Proposer made <em>no</em> commitment → No payout expected</p></li><li><p>Proposer made a commitment but deviated → <em>Invalid state transition</em></p></li></ol><p>Outcomes #1 and #2 are valid, #3 represents no commitment made, but #4 is an invalid state transition that should not be possible.</p><p>One approach is optimistic enforcement: let attesters slash proposers if they make invalid blocks violating commitments. But this retains misaligned incentives.</p><p>A stronger guarantee is pessimistic enforcement where invalid transitions simply cannot occur because blocks violating commitments are inherently invalid.</p><p>We can achieve this by allowing proposers to specify custom validity conditions via EVM execution. Commitments become prerequisites for block validity. Attesters merely check that blocks satisfy registered commitments. No invalid transitions can be finalized.</p><h2 id="h-accounting-for-data-availability" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Accounting for Data Availability</strong></h2><p>Enforcing commitments requires appropriate handling of data availability for builder contributions. We must prevent griefing between proposers and builders.</p><p>There are two broad categories of outsourcing contracts:</p><p>Paying Contracts: Proposer pays an external party for delivery of a service. 
For example, proposer requests a zk proof of block validity from a builder.</p><p>Procurement Contracts: External party pays the proposer to take some action. For instance, a builder pays for rights to construct a block.</p><p>Paying contracts have natural data availability guarantees. Builders only get paid by satisfying registered commitments.</p><p>However, procurement contracts allow griefing. Attackers could repeatedly bid without delivering, preventing honest builders from winning. Unfortunately, there are challenges with fully eliminating this vector:</p><ul><li><p>Bidding deposits slash malicious actors but disadvantage less capitalized builders.</p></li><li><p>Complex zero knowledge proofs place a burden on honest builders.</p></li><li><p>Encrypted bids must be decrypted by a committee upon reveal.</p></li></ul><p>There are no perfect solutions currently. But PEPC infrastructure enables continued innovation to tackle these concerns.</p><h2 id="h-revisiting-proposer-sophistication" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Revisiting Proposer Sophistication</strong></h2><p>Many PBS designs aim to promote &quot;proposer dumbness&quot; by having passive proposers accept the highest bid for their full block space. But this over-constrains proposers.</p><p>With PEPC infrastructure, a proposer need not rely on any specific outsourcing mechanism. Sophisticated proposers can leverage commitments creatively. Less sophisticated ones aren&apos;t forced into a single model but can adopt preset commitments. 
The key is flexibility.</p><h2 id="h-pepc-use-cases" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>PEPC Use Cases</strong></h2><p>To make this more concrete, here are some examples of commitments proposers could make:</p><ul><li><p>Full block auction to a single builder</p></li><li><p>Partial block auctions to multiple builders</p></li><li><p>Transaction inclusion lists</p></li><li><p>Future block production rights</p></li><li><p>Validity proofs for blocks or execution environments</p></li><li><p>Any number of customizable outsourcing mechanisms</p></li></ul><p>Proposers choose which commitments suit their needs rather than having the protocol dictate a specific model.</p><h2 id="h-augmenting-eigenlayer-suave" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Augmenting Eigenlayer, SUAVE</strong></h2><p>As described earlier, PEPC builds upon the Eigenlayer approach of enabling validators to make external commitments. However, it aims to create stronger guarantees by making the protocol enforce satisfaction of commitments instead of just recording outcomes.</p><p>Specifically, Eigenlayer supports three primary use cases according to founder Sreeram Kannan:</p><ol><li><p>Economic use cases - Users care about amount staked as collateral</p></li><li><p>Decentralization use cases - Users care about independent participant diversity</p></li><li><p>Block production use cases - Validators make commitments about block contents</p></li></ol><p>PEPC is most relevant for the third use case of block production commitments. By incorporating commitments into block validity conditions, PEPC moves this from an optimistic model (validators can violate but get slashed) to a pessimistic model (violating commitments inherently invalidates blocks).</p><p>However, for economic and decentralization use cases, it remains unclear whether PEPC can improve upon Eigenlayer&apos;s approach. 
This is an open area for further exploration.</p><p>Overall, PEPC generalizes some of the block production commitments enabled by Eigenlayer, and offers validators stronger credibility for certain outsourced duties essential for block creation. But it may not enhance other Eigenlayer use cases related to collateralization or distributed participation.</p><p>SUAVE allows validators to make credible commitments about their blocks by developing blocks based on declared user preferences. However, the SUAVE process operates on faster timescales, enabling &quot;fast games&quot; compared to the slower pace of Ethereum block production.</p><p>PEPC complements SUAVE by offering validators a way to make credible commitments even earlier in the process before SUAVE&apos;s block construction begins. PEPC commitments occur at the consensus layer, whereas SUAVE focuses on the execution layer.</p><p>Using PEPC, validators could commit to satisfying certain constraints or outsourcing specific duties while still leveraging SUAVE for flexible block building based on user preferences. For instance, validators might use PEPC for commitments about transaction ordering, and integrate SUAVE for maximizing user Extractable Value subject to those constraints.</p><p>Essentially, PEPC and SUAVE provide tools for validators to make credible commitments at different stages of the block production process. Used together, they offer a powerful framework for validators to align incentives and signal intentions to user communities.</p><h2 id="h-remaining-challenges" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Remaining Challenges</strong></h2><p>Major open questions surround PEPC&apos;s feasibility and desirability:</p><ul><li><p>Implementation complexity: PEPC may require similar research as PBS for consensus safety. Can optimistic aggregation handle multiple parallel builder commitments? 
What metering applies for proposer conditions?</p></li><li><p>Flexibility vs predictability: There is tension between supporting open-ended commitments and having predictable, well-understood exchange mechanisms. Would preset commitments be used most often?</p></li><li><p>Proposer security: Reliable access to outsourced services is critical for proposers. Can procurement griefing be mitigated? Is pessimistic enforcement viable?</p></li><li><p>Philosophical alignment: Does PEPC align with the protocol&apos;s role? Or does it overextend by enabling unconstrained commitments?</p></li></ul>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/670be89dea8fa1c276baf28318bb98fc5aecf3d33fb6c83402fcdbda8bb0555f.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Account Abstraction (AA): Linking Web3 and Web2]]></title>
            <link>https://paragraph.com/@ohotties/account-abstraction-aa-link-web3-and-web2</link>
            <guid>hZt1mWy2sPu50Y4G7md4</guid>
            <pubDate>Wed, 02 Aug 2023 19:02:09 GMT</pubDate>
            <description><![CDATA[In the dynamic and evolving world of cryptocurrency, enhancing the user experience in interacting with blockchain applications and making web3 more intuitive and competitive with conventional web2 applications is key. The paradigm shift from "will crypto survive?" to "how can we bring the next billion users into the web3 ecosystem?" is taking center stage. One proposition that has sparked considerable interest is "account abstraction". This post delves into the intricacies of account abstract...]]></description>
            <content:encoded><![CDATA[<p>In the dynamic and evolving world of cryptocurrency, enhancing the user experience in interacting with blockchain applications and making web3 more intuitive and competitive with conventional web2 applications is key. The paradigm shift from &quot;will crypto survive?&quot; to &quot;how can we bring the next billion users into the web3 ecosystem?&quot; is taking center stage. One proposition that has sparked considerable interest is &quot;account abstraction&quot;. This post delves into the intricacies of account abstraction, tracing its evolution, present state, and future trajectory, while addressing key questions revolving around this topic.</p><h2 id="h-decoding-account-abstractionaa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Decoding Account Abstraction (AA)</strong></h2><p>Account Abstraction (AA), a concept that might initially seem complex in the realm of web3, is a proposal aiming to augment the flexibility in the management and behavior of Ethereum accounts. It achieves this by introducing account contracts—special-purpose smart contracts that define and manage a user&apos;s Ethereum account, now termed a smart account.</p><p>In the current setup, users interact with Ethereum using Externally Owned Accounts (EOAs), which are the only way to start a transaction or execute a smart contract. This method limits how users can interact with Ethereum. For instance, it makes it challenging to perform batches of transactions and requires users always to keep an ETH balance to cover gas. 
Account abstraction, as a solution, allows users to flexibly program more security and better user experiences into their accounts, thereby solving these issues.</p><h2 id="h-functionality-and-advantages-of-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Functionality and Advantages of AA</strong></h2><p>From a network-level perspective, &quot;account abstraction&quot; implies that the specifics of account types are concealed from the Ethereum protocol. Every account, including self-custodial accounts, is simply a smart contract, with users having the freedom to determine how individual accounts are managed and operated.</p><p>From a user-level perspective, &quot;account abstraction&quot; suggests that certain technical specifics about interacting with Ethereum accounts are veiled behind higher-level interfaces. This enhancement can significantly reduce the complexity of using web3 applications and improve wallet designs. Account abstraction does not necessarily remove accounts from the users&apos; purview, even if they are abstracted from the protocol. Users still maintain a wallet address to receive funds and a signing key to ensure that only they can spend those funds. From the user&apos;s perspective, account abstraction is akin to utilizing a smart account that abstracts some details about interacting with the blockchain. For instance, account abstraction can eliminate the need for storing seed phrases/private keys, paying gas for transactions, or even setting up an on-chain account independently.</p><p>Account abstraction mitigates most of the friction associated with using web3 wallets and interacting with dapps, thereby moving web3 closer to the user-friendly ideal of web2 where all users—both novice and experienced—can benefit from the same degree of flexibility, security, and ease of use. Notably, account abstraction carries significant implications for the future of self-custody. 
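</p><p>The idea that every account becomes a contract whose owner chooses its own validation rule can be sketched in a few lines of Python. This is a toy model, not any EIP&apos;s actual interface; SmartAccount and the rule helpers are invented names:</p>

```python
# Toy model: a smart account carries programmable validation logic, whereas an
# EOA is hard-wired to "one ECDSA key controls everything". Names are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class SmartAccount:
    address: str
    # Programmable rule: given (operation, signer set), decide validity.
    validate: Callable[[dict, set], bool]

def single_owner_rule(owner: str) -> Callable[[dict, set], bool]:
    """EOA-like behaviour, expressed as just one possible rule among many."""
    return lambda op, sigs: owner in sigs

def two_of_three_rule(keys: list) -> Callable[[dict, set], bool]:
    """A different rule the same account type can adopt: 2-of-3 multisig."""
    return lambda op, sigs: len(sigs & set(keys)) >= 2

alice = SmartAccount("0xA1", single_owner_rule("alice-key"))
treasury = SmartAccount("0xD0", two_of_three_rule(["k1", "k2", "k3"]))

op = {"to": "0xB2", "value": 10}
print(alice.validate(op, {"alice-key"}))     # True: owner signed
print(treasury.validate(op, {"k1"}))         # False: only 1 of 3 keys
print(treasury.validate(op, {"k1", "k3"}))   # True: quorum met
```

<p>The point of the sketch is that the protocol only ever asks &quot;does this account consider the operation valid?&quot;; what that means is up to the account&apos;s own code.</p><p>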
With the features provided by account contracts, using a web3 wallet will emulate the experience of using a bank account or application without the need to trust the bank.</p><h2 id="h-methodologies-for-implementing-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Methodologies for Implementing AA</strong></h2><p>There are two general methodologies for achieving account abstraction: enabling EOAs to execute EVM code and allowing smart contracts to initiate transactions. Many account abstraction proposals either want EOAs to behave as smart contracts or contract accounts to act as EOAs. The former approach supercharges EOAs and transforms them into smart accounts, setting the stage for native account abstraction. The latter approach provides another route to achieving account abstraction by introducing &quot;supercharged contracts&quot; that can act as EOAs. This resolves a pressing issue in Ethereum: the lack of support for contract wallets at the protocol level.</p><h2 id="h-potential-benefits-of-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Potential Benefits of AA</strong></h2><p>The potential benefits of account abstraction, particularly with the advent of smart contract wallets, are significant. Users can define their own flexible security rules, recover their account if they lose the keys, share their account security across trusted devices or individuals, pay someone else&apos;s gas, or have someone else pay theirs. Moreover, it enables batch transactions and opens up new avenues for dapp and wallet developers to innovate on user experiences.</p><p>The current paradigm is such that only externally-owned accounts (EOAs) can start transactions. EOAs are simply public-private key pairs that grant absolute control to the holder of the private key within the rules of the Ethereum Virtual Machine (EVM). 
If the private key is lost, it can&apos;t be recovered, and stolen keys give thieves instant access to all the funds in an account.</p><p>Smart contract wallets provide a solution to these problems, but today they are challenging to program because any logic they implement must be translated into a set of EOA transactions before they can be processed by Ethereum. Account abstraction enables smart contracts to initiate transactions themselves, allowing any user-desired logic to be coded into the smart contract wallet itself and executed on Ethereum. Ultimately, account abstraction improves support for smart contract wallets, making them easier to build and safer to use. Users can enjoy all the benefits of Ethereum without needing to fully understand the underlying technology.</p><h2 id="h-security-enhancements-with-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Security Enhancements with AA</strong></h2><p>Today&apos;s accounts are secured using private keys calculated from seed phrases. Anyone with access to a seed phrase can discover the private key protecting an account and gain access to all the assets it protects. If a private key and seed phrase are lost, they can never be recovered, and the assets they control are frozen forever. Securing these seed phrases is awkward, even for expert users, and seed phrase phishing is one of the most common ways users get scammed.</p><p>Account abstraction solves this problem by using a smart contract to hold assets and authorize transactions. These smart contracts can then be decorated with custom logic to make them as secure and tailored to the user as possible. Backup keys can be added to a wallet so that if the main key is lost or accidentally exposed, it can be replaced with a new, secure one with permission from the backup keys. This makes it much harder for a thief to gain full control over your funds. 
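</p><p>The backup-key mechanism just described can be illustrated with a short, hedged Python sketch; RecoverableWallet, its guardian set, and the threshold are made-up names and numbers, not a real wallet&apos;s API:</p>

```python
# Toy social-recovery wallet: the main key can be rotated only with enough
# approvals from pre-registered backup ("guardian") keys. Purely illustrative.
class RecoverableWallet:
    def __init__(self, main_key: str, guardians: set, threshold: int):
        self.main_key = main_key
        self.guardians = guardians
        self.threshold = threshold          # guardian approvals required

    def recover(self, new_key: str, approvals: set) -> bool:
        """Replace a lost or compromised main key if a guardian quorum agrees."""
        if len(approvals & self.guardians) >= self.threshold:
            self.main_key = new_key
            return True
        return False

wallet = RecoverableWallet("old-key", guardians={"g1", "g2", "g3"}, threshold=2)
print(wallet.recover("new-key", {"g1"}))        # False: one approval is not enough
print(wallet.recover("new-key", {"g1", "g3"}))  # True: quorum met, key rotated
print(wallet.main_key)                          # new-key
```

<p>A thief who steals the main key still cannot lock the owner out, because the guardians can rotate it away; conversely, no single guardian can hijack the account on their own.</p><p>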
Similarly, you can add rules to the wallet to reduce the impact if your main key gets compromised. For example, low-value transactions could be verified by a single signature, whereas higher-value transactions require approval from multiple authenticated signers.</p><p>Account abstraction also allows for the creation of whitelists, which block every transaction unless it is to a trusted address or verified by several pre-approved keys. It offers other security enhancements, such as multisig authorization, account freezing, account recovery, transaction limits, and more. The possibilities are almost endless, and the freedom to design these custom security measures is one of the most significant benefits of account abstraction.</p><h2 id="h-enhanced-user-experience-with-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Enhanced User Experience with AA</strong></h2><p>Account abstraction not only offers improved security but also a superior overall user experience, as it adds support for smart contract wallets at the protocol level. The enhanced freedom for developers of smart contracts, wallets, and applications allows them to innovate the user experience in ways we may not yet be able to anticipate.</p><p>Account abstraction enables transaction bundling for speed and efficiency, removes the need for users to maintain an ETH balance to fund transactions, and offers the potential for trusted sessions, which could be transformative for applications like gaming where large numbers of small transactions might need approval in a short time. 
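</p><p>The tiered-signature and whitelist policies described above amount to a small decision procedure. A hedged Python sketch (the rule order, limits, and names are invented for illustration):</p>

```python
# Toy authorization policy combining the ideas above: whitelisted destinations
# and small transfers need one owner signature; large transfers need two.
def authorize(tx: dict, sigs: set, owners: set,
              whitelist: set, low_value_limit: int) -> bool:
    approvals = len(sigs & owners)
    if tx["to"] in whitelist:
        return approvals >= 1       # trusted destination: single signature
    if tx["value"] <= low_value_limit:
        return approvals >= 1       # low-value transfer: single signature
    return approvals >= 2           # high-value transfer: multiple signers

owners = {"k1", "k2", "k3"}
trusted = {"0xDEX"}
print(authorize({"to": "0xNew", "value": 5}, {"k1"}, owners, trusted, 100))          # True
print(authorize({"to": "0xNew", "value": 500}, {"k1"}, owners, trusted, 100))        # False
print(authorize({"to": "0xNew", "value": 500}, {"k1", "k2"}, owners, trusted, 100))  # True
print(authorize({"to": "0xDEX", "value": 9999}, {"k1"}, owners, trusted, 100))       # True
```

<p>Because the policy lives in the account&apos;s own code, each user can tune the limits and quorums to their own risk tolerance.</p><p>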
It also paves the way for a more familiar online shopping experience where a user could fill a &quot;basket&quot; with items and click once to purchase all at once, with all the necessary logic handled by the contract, not the user.</p><h2 id="h-the-path-forward-implementing-aa" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Path Forward: Implementing AA</strong></h2><p>While smart contract wallets exist today, they rely on wrapping relatively complex code around standard Ethereum transactions because the EVM does not natively support them. Ethereum can change this by allowing smart contracts to initiate transactions, handling the necessary logic in Ethereum smart contracts instead of off-chain. Putting logic into smart contracts also increases Ethereum&apos;s decentralization since it removes the need for &quot;relayers&quot; run by wallet developers to translate messages signed by the user to regular Ethereum transactions.</p><p>The way forward in implementing account abstraction is currently under intense discussion. Several proposals aim to change the Ethereum protocol to accommodate account abstraction or to upgrade EOAs so they can be controlled by smart contracts. However, many of these proposals are not active due to the community&apos;s current preference for proposals such as EIP-4337, which implement account abstraction without requiring large-scale changes to the Ethereum protocol.</p><p>The future of account abstraction is promising. 
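</p><p>The one-click &quot;basket&quot; purchase mentioned earlier is, underneath, an atomic batch: either every call in the bundle is applied or none is. A toy Python sketch of that all-or-nothing behaviour (the ledger model is invented for illustration):</p>

```python
# Toy atomic batch execution: apply every transfer, or roll back to the
# pre-batch state if any step would fail. Illustrative only.
def execute_batch(balances: dict, sender: str, calls: list) -> bool:
    snapshot = dict(balances)                 # state before the batch
    for call in calls:
        if balances[sender] < call["value"]:
            balances.clear()
            balances.update(snapshot)         # revert all earlier steps
            return False
        balances[sender] -= call["value"]
        balances[call["to"]] = balances.get(call["to"], 0) + call["value"]
    return True

ledger = {"alice": 100}
basket = [{"to": "shop", "value": 60}, {"to": "shipper", "value": 30}]
print(execute_batch(ledger, "alice", basket))   # True: both transfers applied
print(ledger["alice"])                          # 10
print(execute_batch(ledger, "alice", basket))   # False: reverted, nothing applied
print(ledger["alice"])                          # 10
```

<p>With only EOAs, each item in the basket would be a separate transaction that could fail independently; a smart account can approve the whole batch with one signature.</p><p>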
The advent of account abstraction will undoubtedly play a pivotal role in shaping the future of the Ethereum ecosystem.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/e3f791fd717b8ee7ec52de2dd1366f33a36edf38ddfb9ebb119c1528904a1311.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Restaking and EigenLayer: Aggregate Security]]></title>
            <link>https://paragraph.com/@ohotties/retaking-and-eigenlayer-aggregate-security</link>
            <guid>7OPMG717f8BrXva2U2D5</guid>
            <pubDate>Fri, 30 Jun 2023 18:46:10 GMT</pubDate>
            <description><![CDATA[Restaking is a process that allows users to stake the same Ethereum (ETH) on both Ethereum and other protocols, securing all these networks simultaneously. EigenLayer, proposed by Sreeram Kannan, on the other hand, is a set of smart contracts on Ethereum that allows consensus layer Ether (ETH) stakers to opt in to validating new software modules built on top of the Ethereum ecosystem. Stakers opt in by granting the EigenLayer smart contracts the ability to impose additional slashing condition...]]></description>
            <content:encoded><![CDATA[<p>Restaking is a process that allows users to stake the same Ethereum (ETH) on both Ethereum and other protocols, securing all these networks simultaneously. EigenLayer, proposed by <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/sreeramkannan">Sreeram Kannan</a>, on the other hand, is a set of smart contracts on Ethereum that allows consensus layer Ether (ETH) stakers to opt in to validating new software modules built on top of the Ethereum ecosystem. Stakers opt in by granting the EigenLayer smart contracts the ability to impose additional slashing conditions on their staked ETH, allowing an extension of cryptoeconomic security. </p><p><strong>Problem: Fractured Trust Networks</strong></p><p>The current blockchain ecosystem faces several challenges. Protocols built on Ethereum are required to bootstrap their own set of validators, which is both time-consuming and costly. Additionally, these protocols are bound by the fundamental set of rules incorporated into Ethereum, limiting the extent of innovation they can achieve. Lastly, a protocol&apos;s security is only as good as the security of the weakest component it depends on, posing a significant security risk.</p><p><strong>The Solution: Restaking</strong></p><p>Restaking offers promising solutions to these challenges. By allowing ETH stakers to restake their ETH to secure these protocols, EigenLayer enables protocols to tap into Ethereum&apos;s security layer, reducing the need for protocols to establish their own validator set. This not only increases protocol security but also provides a high degree of flexibility for protocols to customize their architecture. 
Furthermore, it increases capital efficiency by allowing stakers to earn rewards from multiple protocols with the same capital.</p><p>EigenLayer provides an avenue for restakers to delegate their ETH or LSTs to other entities who are running EigenLayer operator nodes. EigenLayer operators who have stake delegated to them can deposit the delegated stake to spin up new Ethereum validator nodes, and subject the delegated stake to slashing from the modules the operator is participating in. These operators receive fees from both the Ethereum beacon chain and the modules they are participating in via EigenLayer. They keep a fraction of those fees and send through the remainder to the delegators.</p><p><strong>Usage and Benefits</strong></p><p>EigenLayer introduces two novel ideas, pooled security via restaking and free-market governance, which serve to extend the security of Ethereum to any system and to eliminate the inefficiencies of existing rigid governance structures. By combining these ideas, EigenLayer serves as an open marketplace where actively validated services (AVSs) can rent pooled security provided by Ethereum validators. </p><p>The benefits of restaking and EigenLayer include:</p><ul><li><p>Enhanced Security: EigenLayer allows protocols to leverage Ethereum&apos;s robust security infrastructure by enabling ETH stakers to restake their ETH for securing these protocols. This eliminates the need for protocols to establish their own set of validators, which is a resource-intensive process.</p></li><li><p>Customizability: EigenLayer provides protocols with the flexibility to tailor their architecture according to their specific needs. This allows protocols to innovate beyond the foundational rules of Ethereum, fostering a more diverse and innovative ecosystem.</p></li><li><p>Capital Optimization: EigenLayer enhances capital efficiency by enabling stakers to earn rewards from multiple protocols using the same staked capital. 
This allows stakers to optimize their returns without additional capital investment.</p></li><li><p>Security Aggregation: EigenLayer introduces the concept of pooled security through restaking, extending Ethereum&apos;s security to any system built on it. This allows even smaller protocols to benefit from the security provided by the larger Ethereum network.</p></li><li><p>Market-driven Governance: EigenLayer introduces a free-market governance model, eliminating the inefficiencies of traditional rigid governance structures. This allows for a more dynamic and adaptable governance system.</p></li></ul><p><strong>Potential Risks</strong></p><ul><li><p>Centralization Risk: If a large share of restaked ETH concentrates on a single application or a small set of operators, slashing events become correlated and power concentrates, which could have negative implications for Ethereum&apos;s decentralization ethos.</p></li><li><p>Unsustainable Growth: There&apos;s a risk of a race to the top among protocols offering increasingly higher yields to attract capital. This could lead to unsustainable growth and potential market crashes if the yields are not backed by real value.</p></li><li><p>Security Compromise: Protocols might lower their slashing conditions to attract more capital, compromising their own security. This could make these protocols more vulnerable to attacks and potentially lead to loss of funds.</p></li></ul><p><strong>Use Cases</strong></p><ul><li><p>Oracles Creation: EigenLayer can be used to construct price feeds wherever majority trust in ETH restaked with EigenLayer is sufficient. This could offer a more secure and efficient way to provide price feeds, which are crucial for many DeFi applications.</p></li><li><p>Hyperscale Data Availability Layer Construction: EigenLayer can be utilized to build a hyperscale data availability layer by capitalizing on restaking and advanced data availability concepts from the Ethereum community. 
This layer can provide high data availability rates at a lower cost.</p></li><li><p>Rollup Decentralized Sequencers: EigenLayer can be leveraged to develop decentralized sequencers for rollups, which are crucial for handling Maximal Extractable Value (MEV) and ensuring resistance to censorship. A quorum of ETH stakers on EigenLayer can significantly improve the security and efficiency of rollups.</p></li><li><p>Management of MEV through Opt-In: EigenLayer provides a platform for deploying a range of opt-in MEV management techniques, such as Proposer-Builder Separation, MEV smoothing, and threshold encryption for transaction inclusion. For instance, a group of restakers can implement MEV smoothing by agreeing to evenly distribute MEV among themselves. Any restaker who deviates from this agreed-upon MEV smoothing behavior can be penalized. This approach is inherently scalable as only block proposers need to perform specific actions when they are triggered.</p></li><li><p>Light-Node Bridges Development: EigenLayer can facilitate the creation of light-node bridges to Ethereum. Restakers can verify off-chain whether bridge inputs are correct. If a strong cryptoeconomic quorum approves a bridge input, it is considered accepted, enhancing the efficiency and usability of bridges between different blockchains.</p></li><li><p>Fast-Mode Bridges for Rollups Acceleration: EigenLayer can speed up the process of ZK rollups. A quorum of operators on EigenLayer with a large amount of restaked ETH can participate in off-chain ZK proof verification and certify the correctness of proofs on-chain. This can significantly enhance the user experience for those interacting with ZK rollups.</p></li></ul><p><strong>Conclusion</strong></p><p>EigenLayer and restaking represent significant advancements in the field of cryptoeconomic security. By allowing users to restake their ETH, they extend the security of the Ethereum network to other applications, creating a more robust and efficient system. 
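</p><p>The operator fee split and capital-efficiency claims above reduce to simple arithmetic. A toy calculation (the service names, APRs, and 10% operator commission are invented numbers, not EigenLayer parameters):</p>

```python
# Toy reward model: the same 32 ETH earns from every opted-in service, and the
# operator keeps a commission before passing the remainder to delegators.
def restaking_rewards(stake_eth: float, apr_by_service: dict,
                      operator_commission: float) -> tuple:
    gross = sum(stake_eth * apr for apr in apr_by_service.values())
    operator_cut = gross * operator_commission
    to_delegators = gross - operator_cut
    return round(gross, 4), round(operator_cut, 4), round(to_delegators, 4)

gross, op_cut, delegators = restaking_rewards(
    32.0,
    {"beacon_chain": 0.04, "oracle_avs": 0.02, "da_avs": 0.01},  # hypothetical APRs
    operator_commission=0.10,
)
print(gross, op_cut, delegators)   # 2.24 0.224 2.016 (ETH per year)
```

<p>The same capital that would otherwise earn only the beacon-chain reward now accrues the sum of every opted-in service&apos;s rewards, at the cost of being slashable by each of them.</p><p>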
However, it is important to note that while restaking offers many benefits, it also comes with risks and challenges that must be carefully managed. As the field of blockchain technology continues to evolve, it will be interesting to see how the concept of restaking is further developed and applied.</p><hr>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/2d1b7f0ba29404c745f4eebcf1b29fefc29304a40b58546e9828c2647873e1bc.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Proposer-Builder Separation (PBS) and Enshrined/in-protocol PBS in Ethereum]]></title>
            <link>https://paragraph.com/@ohotties/proposer-builder-separation-pbs-and-enshrined-in-protocol-pbs-in-ethereum</link>
            <guid>fkYttQsEnpnN5fWCXayb</guid>
            <pubDate>Sat, 24 Jun 2023 11:12:23 GMT</pubDate>
            <description><![CDATA[*Acknowledgement: This post is inspired by all the insightful work and thoughts from Vitalik Buterin, Barnabé Monnot, Julian Ma, Mike Neuder, Justin Drake, Davide Crapis and Dankrad Feist. *Introduction The Ethereum blockchain, since its inception, has been a hotbed for innovation, fostering the development of decentralized applications and smart contracts. In September 2022, Ethereum underwent a significant transition from a Proof-of-Work (PoW) consensus mechanism to a Proof-of-Stake (PoS) m...]]></description>
            <content:encoded><![CDATA[<blockquote><p>Acknowledgement: This post is inspired by all the insightful work and thoughts from <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/VitalikButerin">Vitalik Buterin</a>, <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/barnabemonnot">Barnabé Monnot</a>, <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/_julianma">Julian Ma</a>, <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/mikeneuder">Mike Neuder</a>, <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/drakefjustin">Justin Drake</a>, <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/DavideCrapis">Davide Crapis</a> and <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/dankrad">Dankrad Feist</a>.</p></blockquote><p><strong>Introduction</strong></p><p>The Ethereum blockchain, since its inception, has been a hotbed for innovation, fostering the development of decentralized applications and smart contracts. In September 2022, Ethereum underwent a significant transition from a Proof-of-Work (PoW) consensus mechanism to a Proof-of-Stake (PoS) model. This transition was not only a shift in the way Ethereum secures its network but also introduced a new paradigm known as the Proposer-Builder Separation (PBS) scheme.</p><p><strong>Understanding Proposer-Builder Separation (PBS)</strong></p><p>The Proposer-Builder Separation (PBS) scheme was introduced to decouple the roles of selecting and ordering transactions in a block (the builder) from those validating its contents and proposing the block to the network (the proposer). 
In this system, proposers validate and secure the network, relying on specialized block builders to create blocks with the most value. Relays also play a crucial role, acting as mediators between builders and proposers, transmitting the most lucrative blocks from the builders to the proposers.</p><p>The PBS ecosystem is made up of several key players, including searchers, block builders, relays, and validators. Searchers are actors who hunt for MEV opportunities and submit bundles of transactions to builders through private channels rather than the public mempool. Block builders receive bundles from searchers in the PBS system and attempt to create the most profitable block possible. Relays are responsible for holding blocks from builders in escrow for validators, while validators are still responsible for proposing blocks to the Ethereum network.</p><p>The primary benefits of PBS lie in its potential to decentralize block validation and maximize block profitability. By separating the roles of block building and block proposing, PBS provides all validators, regardless of size, access to competitive blocks. This prevents hobbyist validators from being outcompeted by institutional players who can optimize block profitability better.</p><p><strong>Enshrined Proposer-Builder Separation (ePBS) or in-protocol PBS</strong></p><p>Enshrined Proposer-Builder Separation (ePBS) or in-protocol PBS is a proposition to incorporate PBS directly into Ethereum&apos;s consensus layer. This concept emerged in response to potential relay failures and the need to eliminate single points of failure in the system. This approach, introduced by <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/VitalikButerin">Vitalik Buterin</a>, allows the Ethereum protocol itself to provide assurances to both builders and proposers. At the time of the Ethereum merge, there was no in-protocol solution available for PBS. 
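</p><p>The proposer&apos;s side of this market can be pictured as a simple sealed-bid auction: it sees only (header, bid) pairs relayed from builders and commits to the highest bid without seeing block contents. A hedged Python sketch (the structures are simplified stand-ins, not the real builder-spec types):</p>

```python
# Toy mev-boost-style selection: the proposer signs the header carrying the
# highest bid; the full block body stays hidden until after the commitment.
def select_header(bids: list) -> dict:
    """Pick the highest-paying sealed header offered by any relay."""
    return max(bids, key=lambda b: b["bid_wei"])

bids = [
    {"relay": "relay-a", "builder": "b1", "header": "h1", "bid_wei": 31_000_000},
    {"relay": "relay-b", "builder": "b2", "header": "h2", "bid_wei": 45_000_000},
    {"relay": "relay-a", "builder": "b3", "header": "h3", "bid_wei": 12_000_000},
]
best = select_header(bids)
print(best["builder"], best["bid_wei"])   # b2 45000000
```

<p>Everything else in the ecosystem exists to make this blind commitment safe: the relay escrows the block body so the proposer cannot steal its contents, and vouches that the winning bid will actually be paid.</p><p>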
As a result, Flashbots developed an out-of-protocol solution known as mev-boost. This solution has been widely adopted, accounting for approximately 90% of the blocks produced on the Ethereum network.</p><p>For builders, the protocol ensures that a commitment made by the proposer can only be reverted due to a consensus failure or a safety fault with single-slot finality. For proposers, the protocol guarantees that the builder&apos;s promise of payment will be honored, regardless of the builder&apos;s subsequent actions.</p><p>The concept of PBS can be viewed in a modular fashion, considering it as a market structure/legal system and an allocation mechanism/business logic. As a market structure, the protocol defines the conditions under which proposers may engage with third parties during block construction. As an allocation mechanism, the protocol defines the space of contracts that proposers and third parties may enter into.</p><p>Under the mev-boost model, the protocol does not intervene in the principal-agent problem. It does not recognize the role of builders as an entity interfacing with proposers. This lack of intervention implies that there are no trust-minimized, in-protocol ways for proposers to be compensated when the other side of the market fails to honor their agreements.</p><p>The debate on whether this should be the responsibility of the protocol is ongoing. On one hand, proposers seeking maximum profit may engage in risky behavior, which the protocol may not want to backstop. On the other hand, the asymmetry between block construction and block verification opens up design possibilities that cannot be realized without appealing to external entities.</p><p>If the protocol intends to fully lean into the PBS design philosophy, it may require powerful builders to produce complex blocks, imposing this requirement on proposers. This could be a rough bargain for some proposers, particularly those in resource-constrained environments. 
However, the risk to the protocol is mitigated by the fact that builders are profit-making entities and care for their blocks to make it into the protocol.</p><p><strong>Reasons to Enshrine PBS(ePBS)</strong></p><ol><li><p>Upholding Ethereum&apos;s Core Values: The current reliance on relays, which are centralized entities, contradicts Ethereum&apos;s foundational principles of decentralization. A small group of relay operators currently handle 99% of mev-boost blocks, which gives them a disproportionate influence in the ecosystem. Moreover, the centralized nature of relays makes them susceptible to regulatory pressures and potential censorship of transactions. Validators and builders also have to trust relays to provide valid block headers and not to misappropriate MEV. This trust-based system is fundamentally at odds with Ethereum&apos;s trustless philosophy.</p></li><li><p>Addressing Vulnerabilities in Out-of-Protocol Software: The existing out-of-protocol solution, mev-boost, has proven susceptible to attacks, such as the &quot;Low-Carb Crusader&quot; unbundling that exploited a relay vulnerability, leading to a loss of over 20 million USD. This incident underscores the fact that relays are attractive targets outside of the protocol. Additionally, the relay response to the unbundling attack destabilized consensus, resulting in a fivefold increase in reorged blocks.</p></li><li><p>Eliminating Inefficiencies of Side-Car Software: Maintaining compatibility between beacon clients and relays incurs significant coordination costs. Each hard-fork necessitates substantial work from the relay and core developers to ensure mev-boost continues functioning. This involves designing the builder spec, maintaining/improving the relay spec, and making software changes on the beacon clients, mev-boost, and the mev-boost relays. 
Because mev-boost is out-of-protocol, this coordination is in addition to the standard ACD (All Core Devs) pipeline and usually occurs later in the development cycle.</p></li></ol><p><strong>Reasons Not to Enshrine PBS</strong></p><ol><li><p>Current Stability of mev-boost: mev-boost has been effective to date and has seen widespread adoption, accounting for approximately 90% of Ethereum blocks produced. As the implementation continues to mature, confidence in its security properties can be built and the specification can be further developed. If a neutral way to fund a set of relays can be found, then it might be feasible to continue relying on them.</p></li><li><p>Potential for Other Tools to Mitigate MEV: There is ongoing work to protect users from MEV at the application/transaction level. Solutions like SUAVE, CoW swap, and MEVBlocker are gaining traction. If a significant portion of MEV can be safeguarded, enshrining PBS might be an unnecessary step on an already ambitious roadmap.</p></li><li><p>Resource Allocation: The Ethereum roadmap has numerous goals beyond ePBS. If we decide to proceed with ePBS, it raises the question of where this can fit on the roadmap and what upgrades will be postponed as a result. For instance, ePBS depends on Single-Slot Finality (SSF) for security and complexity reasons, and a validator set consolidation is a prerequisite for any SSF progress.</p></li></ol><p><strong>Protocol-Enforced Proposer Commitments (PEPC)</strong></p><p>PEPC, or Protocol-Enforced Proposer Commitments, proposed by <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/barnabemonnot">Barnabé Monnot</a>, is a concept that aims to enhance the market structure of Proposer-Builder Separation (PBS) in Ethereum. It provides a flexible framework for proposers to make their own commitments, which are then secured by the protocol to prevent any reneging after the builder has delivered the block. 
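</p><p>Mechanically, a PEPC-style commitment can be thought of as a predicate attached to the block: the block only becomes valid if every predicate the proposer committed to holds. A hedged Python sketch (the predicates are invented examples, not a specification):</p>

```python
# Toy PEPC check: a proposer attaches validity conditions; the block counts
# only if all of them are satisfied. The conditions here are illustrative.
def block_satisfies(block: dict, commitments: list) -> bool:
    return all(condition(block) for condition in commitments)

commitments = [
    lambda b: b["payload_signer"] == "builder-x",      # payload from chosen builder
    lambda b: b["witness_signer"] == "state-prov-1",   # witness from state provider
]
good = {"payload_signer": "builder-x", "witness_signer": "state-prov-1"}
bad = {"payload_signer": "builder-y", "witness_signer": "state-prov-1"}
print(block_satisfies(good, commitments))   # True: every commitment honoured
print(block_satisfies(bad, commitments))    # False: wrong builder signed
```

<p>The open design question is how expressive these predicates should be, and how the protocol would meter their evaluation.</p><p>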
Here&apos;s a summary:</p><ol><li><p>PEPC&apos;s Purpose: PEPC aims to generalize the market structure to include more commitments than just the execution payload. It anticipates future needs where builders might be required to produce blocks for Danksharding, or where state providers might need to support block construction and witness computation for stateless proposers. PEPC could also accommodate the need for validity proofs based on a canonical zkEVM to be attached to an execution payload.</p></li><li><p>Flexibility: PEPC could potentially allow the proposer to control precisely how their block is being made. This could include summoning parallel execution-builders, auctioning off block building rights ahead of time, and more.</p></li><li><p>How PEPC Works: PEPC allows a proposer to include additional validity conditions to their block. These conditions could specify requirements such as the execution payload must be signed by a particular builder, or the block witness must be signed by a specific state provider. The hope is to use the EVM to provide a language to express such commitments.</p></li><li><p>Unanswered Questions: There are several open questions about how PEPC would work in practice. 
These include how builder promises of payment should be represented, how complex the commitments can be, and how PEPC would mesh with protocol-captured value.</p></li></ol><p>In essence, PEPC is a proposal for a more flexible and adaptable market structure within the PBS framework, allowing for a wide range of commitments and conditions to be specified by proposers and secured by the protocol.</p><p><strong>Optimistic Relaying</strong></p><p>Optimistic Relaying is an iterative approach to Proposer-Builder Separation (PBS) in Ethereum proposed by <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/mikeneuder">Mike Neuder</a> <em>and</em> <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/drakefjustin">Justin Drake</a>. It offers a &quot;bottom-up&quot; approach to PBS, contrasting the traditional &quot;top-down&quot; approach where a design is fleshed out, a specification is written, and then implemented by client teams. Here&apos;s a summary:</p><ol><li><p>Concept: Optimistic Relaying leverages the existence of mev-boost and mev-boost relays to gradually shift towards a full ePBS implementation. By modifying the relay, it&apos;s possible to progress towards ePBS without needing to alter the specification or make changes to the consensus node software. This approach allows for testing and mitigating risks associated with full ePBS features while maintaining agility.</p></li><li><p>Relay Responsibilities: The main theme of the optimistic roadmap is to reduce relay responsibilities, which improves the operational efficiency of running a relay. This is crucial as relay operation is expensive and currently performed as a public good. 
By lowering the entry barrier for relay operators, a more sustainable future for mev-boost is enabled as ePBS details are fleshed out.</p></li><li><p>Block Submission in mev-boost: In the mev-boost relay, processing builder bids is the main function of the relay, incurring the highest latency and compute costs. The relay must handle all builder submissions, simulate the blocks on the Execution Layer (EL) clients, and serve as a data availability layer for the execution payloads. The validator also relies on the relay to publish the block in a timely manner once they sign the header.</p></li><li><p>Optimistic Relaying v1: The first version of optimistic relaying removes the block validation step from the block submission pipeline. This means that once the builder block is received by the relay, it is immediately eligible to win the auction and be signed by the proposer. The risk here is that an invalid block may unknowingly be signed by the validator, resulting in a missed slot. To mitigate this risk, the relay holds builder collateral and uses it to refund the proposer if a bad builder block results in a missed slot.</p></li><li><p>Optimistic Relaying Endgame: The final iteration of optimistic relaying behaves more like the Two-Block HeadLock (TBHL) design of ePBS. Instead of the attesting committee enforcing the rules, the relay serves as a centralized &quot;oracle&quot; for the timeliness of events in the bidpool. Builders now directly submit bids to the peer-to-peer (p2p) layer, and proposers observe these bids and sign the corresponding header of the winning bid. The relay observes the bidpool and checks for timeliness of the proposer&apos;s signed header and the builder&apos;s block publication. The relay still holds builder collateral to refund a proposer if they sign a header on-time, but the builder doesn’t produce a valid block.</p></li></ol><p>In summary, Optimistic Relaying is an innovative approach to gradually implementing ePBS in Ethereum. 
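To make the collateral mechanism of Optimistic Relaying v1 concrete, the refund logic can be sketched as a toy model. All class names, method names, and amounts below are illustrative assumptions, not taken from any actual relay implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Builder:
    collateral: float  # ETH the builder has posted with the relay

@dataclass
class OptimisticRelay:
    builders: dict = field(default_factory=dict)

    def register(self, builder_id: str, collateral: float) -> None:
        self.builders[builder_id] = Builder(collateral)

    def accept_bid(self, builder_id: str, bid_value: float) -> bool:
        # v1: no block simulation -- a bid is eligible to win the auction
        # immediately, provided the builder's collateral covers the bid.
        b = self.builders.get(builder_id)
        return b is not None and b.collateral >= bid_value

    def settle_slot(self, builder_id: str, bid_value: float,
                    block_was_valid: bool) -> float:
        # If the builder's block turns out invalid and the slot is missed,
        # the proposer is made whole from the builder's collateral.
        if block_was_valid:
            return 0.0
        b = self.builders[builder_id]
        refund = min(bid_value, b.collateral)
        b.collateral -= refund
        return refund

relay = OptimisticRelay()
relay.register("builder-a", collateral=32.0)
assert relay.accept_bid("builder-a", bid_value=1.5)
```

The point of the sketch is the ordering: the relay skips block validation when accepting a bid, and settles against builder collateral only after the fact, if a missed slot reveals a bad block.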
It reduces relay responsibilities, improves operational efficiency, and allows for a more sustainable future for mev-boost.</p><p><strong>Conclusion</strong></p><p>The Proposer-Builder Separation (PBS) scheme and its enshrined or in-protocol version (ePBS) represent significant advancements in Ethereum&apos;s evolution. These concepts aim to decentralize block validation, maximize block profitability, and eliminate single points of failure in the system. The introduction of Protocol-Enforced Proposer Commitments (PEPC) and Optimistic Relaying further enhances the flexibility and adaptability of the PBS framework.</p><p>However, these advancements are not without challenges. The debate on whether the protocol should intervene in the principal-agent problem continues, and the resource allocation for implementing ePBS needs careful consideration. Moreover, the reliance on relays in the current PBS system raises concerns about centralization and potential vulnerabilities.</p><p>Despite these challenges, the ongoing innovations in Ethereum&apos;s PBS, such as PEPC and Optimistic Relaying, show promise in addressing these issues. They offer potential solutions for creating a more flexible, efficient, and secure system for block construction and validation.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/552b9cb47f5a54a80222c034a5b02a71b34d3bdf225bd0cc2ca30d4c5981cb4b.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Classic Rollups (Optimistic/ZK Rollups) vs Enshrined and Sovereign Rollups]]></title>
            <link>https://paragraph.com/@ohotties/classic-rollups-optimistic-zk-rollups-vs-enshrined-and-sovereign-rollups</link>
            <guid>1X4i8pTNF5FahLqaiXHz</guid>
            <pubDate>Mon, 19 Jun 2023 09:39:17 GMT</pubDate>
            <description><![CDATA[Blockchain scalability is a critical issue that has been a subject of extensive research and development. One of the promising solutions to this problem is the concept of rollups, a Layer 2 scaling solution that executes transactions off-chain and posts the data on-chain. This post provides a comprehensive analysis of classic rollups, including Optimistic Rollups and ZK Rollups, Enshrined Rollups, and Sovereign Rollups, focusing on their detailed information, advantages, disadvantages, and a ...]]></description>
            <content:encoded><![CDATA[<p>Blockchain scalability is a critical issue that has been a subject of extensive research and development. One of the promising solutions to this problem is the concept of rollups, a Layer 2 scaling solution that executes transactions off-chain and posts the data on-chain. This post provides a comprehensive analysis of classic rollups, including Optimistic Rollups and ZK Rollups, Enshrined Rollups, and Sovereign Rollups, focusing on their detailed information, advantages, disadvantages, and a comparative study among them.</p><p><strong>Classic Rollups: An Overview</strong></p><p>Classic rollups, including Optimistic Rollups and ZK Rollups, are Layer 2 solutions that aim to increase the transaction throughput of a blockchain by executing transactions off-chain and posting the data on-chain. They rely on the security of the underlying Layer 1 blockchain (like Ethereum) for data availability and order of transactions, while the computation is performed off-chain. This approach allows for a significant increase in the number of transactions processed per second, while maintaining the security and decentralization properties of the underlying blockchain.</p><p><strong>Optimistic Rollups</strong></p><p>Optimistic Rollups are a type of Layer 2 construction on Ethereum that allows for the execution of smart contracts off-chain while still maintaining a high level of security provided by the Ethereum mainnet. They operate under the assumption that all transactions are correct (hence the term &apos;optimistic&apos;) and only run full computation in the event of a dispute.</p><p>In an Optimistic Rollup, transactions are executed off-chain in a replicated virtual machine, which maintains a similar state to the Ethereum mainnet. The state transitions are recorded and submitted to the Ethereum mainnet as &apos;rollup blocks&apos;. 
These blocks are not immediately accepted as valid; instead, they are subject to a challenge period during which any observer can &apos;challenge&apos; the validity of a block by providing a proof that the block includes an invalid transaction.</p><p>Pros:</p><ol><li><p>Scalability: Optimistic Rollups can handle around 100-200 transactions per second (TPS), a significant improvement over Ethereum&apos;s 15 TPS.</p></li><li><p>Compatibility: They are fully compatible with Ethereum, allowing for the execution of complex smart contracts.</p></li><li><p>Security: They inherit the security of the underlying Layer 1 blockchain.</p></li></ol><p>Cons:</p><ol><li><p>Data latency: In the event of a dispute, it can take up to one week for the challenge period to resolve, leading to potential delays.</p></li><li><p>Complexity: Implementing Optimistic Rollups can be complex due to the need for fraud proofs.</p></li></ol><p><strong>ZK Rollups</strong></p><p>ZK Rollups are another type of Layer 2 solution that bundle multiple inputs and outputs into a single transaction, using zero-knowledge proofs to verify the validity of transactions. Unlike Optimistic Rollups, ZK Rollups do not require a challenge period as they provide instant finality.</p><p>In a ZK Rollup, transactions are grouped together and a single &apos;proof&apos; is generated and verified by the Ethereum mainnet. This proof attests to the validity of all transactions in the group, ensuring that only valid state transitions are included in the Ethereum mainnet. 
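As a rough illustration of this batch-and-verify flow, the following toy sketch substitutes a simple hash commitment for the actual zero-knowledge proof so the control flow is visible. All function names and the &quot;proof&quot; itself are illustrative stand-ins, not a real SNARK:

```python
import hashlib
import json

def state_root(balances: dict) -> str:
    # Deterministic commitment to the rollup state (stand-in for a Merkle root).
    return hashlib.sha256(json.dumps(balances, sort_keys=True).encode()).hexdigest()

def execute_batch(balances: dict, txs: list) -> dict:
    # Off-chain execution: apply each (sender, recipient, amount) transfer.
    new = dict(balances)
    for sender, recipient, amount in txs:
        if new.get(sender, 0) >= amount:  # invalid transfers are dropped
            new[sender] -= amount
            new[recipient] = new.get(recipient, 0) + amount
    return new

def prove(old_root: str, new_root: str, txs: list) -> str:
    # Stand-in for SNARK generation: commit to (old state, txs, new state).
    blob = json.dumps([old_root, txs, new_root], sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def verify_on_chain(old_root: str, new_root: str, txs: list, proof: str) -> bool:
    # The L1 contract checks the proof rather than re-executing the batch.
    return proof == prove(old_root, new_root, txs)

balances = {"alice": 10, "bob": 0}
txs = [("alice", "bob", 3), ("bob", "alice", 1)]
new_balances = execute_batch(balances, txs)
proof = prove(state_root(balances), state_root(new_balances), txs)
assert verify_on_chain(state_root(balances), state_root(new_balances), txs, proof)
```

Note one deliberate simplification: a real ZK Rollup verifier checks a succinct proof without re-reading the transactions; the stand-in here needs them only because a plain hash commitment is not succinctly verifiable.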
The use of zero-knowledge proofs allows for the compression of multiple transactions into a single proof, significantly reducing the amount of data that needs to be stored on-chain.</p><p>Pros:</p><ol><li><p>Scalability: ZK Rollups can process thousands of transactions per second, offering even greater scalability than Optimistic Rollups.</p></li><li><p>Instant finality: Transactions are instantly finalized, avoiding the delay associated with the challenge period in Optimistic Rollups.</p></li><li><p>Security: Like Optimistic Rollups, they also inherit the security of the underlying Layer 1 blockchain.</p></li></ol><p>Cons: </p><ol><li><p>Limited functionality: ZK Rollups currently support a limited set of operations and are not fully compatible with Ethereum&apos;s smart contracts.</p></li><li><p>Complexity: The technology behind ZK Rollups, particularly zero-knowledge proofs, is complex and requires specialized knowledge to implement.</p></li></ol><p><strong>Comparison between Optimistic Rollups and ZK Rollups</strong></p><p>While both Optimistic Rollups and ZK Rollups aim to increase the scalability of Ethereum, they do so in different ways and have distinct trade-offs.</p><ol><li><p>Scalability: ZK Rollups offer higher scalability than Optimistic Rollups due to the use of zero-knowledge proofs, which allow for the compression of multiple transactions into a single proof.</p></li><li><p>Finality: ZK Rollups provide instant finality, while Optimistic Rollups have a delay due to the challenge period.</p></li><li><p>Compatibility: Optimistic Rollups are fully compatible with Ethereum&apos;s smart contracts, while ZK Rollups currently support a limited set of operations.</p></li><li><p>Complexity: Both solutions are complex, but ZK Rollups are particularly so due to the use of zero-knowledge proofs.</p></li></ol><p><strong>Enshrined Rollups</strong></p><p>Enshrined Rollups are a type of rollup that enjoys some sort of consensus integration at Layer 1 (L1). 
They contrast with smart contract rollups, which live fully at Layer 2 (L2), outside of consensus. Consensus integration can endow enshrined rollups with superpowers at the cost of significant tradeoffs.</p><p>The term &quot;enshrined&quot; refers to the fact that the rollup&apos;s logic is embedded (or &quot;enshrined&quot;) into the protocol of the Layer 1 blockchain itself. This means that the rollup&apos;s operation is governed by the consensus rules of the Layer 1 blockchain, rather than being determined by a separate set of rules or a smart contract.</p><p>Pros:</p><p>Scalability: By performing computations off-chain and only posting the results to the blockchain, enshrined rollups can significantly increase the number of transactions that a blockchain can process per second.</p><p>Security: Because the rollup&apos;s operation is governed by the Layer 1 blockchain&apos;s consensus rules, enshrined rollups inherit the security properties of the Layer 1 blockchain. This means that as long as the Layer 1 blockchain is secure, the enshrined rollup is also secure.</p><p>Cons:</p><p>Complexity: Enshrined rollups add a layer of complexity to the blockchain system. This can make them more difficult to understand and implement, potentially slowing down their adoption.</p><p><strong>Sovereign Rollups</strong></p><p>Sovereign Rollups are a newer concept that combines elements of Layer 1 blockchains and rollups. Like classic rollups, they execute transactions off-chain and post the data on-chain. However, unlike classic rollups, they handle their own settlement and do not rely on the security of the underlying Layer 1 blockchain for transaction verification.</p><p>In a Sovereign Rollup, transactions are published to another blockchain (typically for ordering and data availability), but the nodes of the Sovereign Rollup are responsible for verifying the transactions. 
This approach allows Sovereign Rollups to have a degree of independence (or &apos;sovereignty&apos;) from the underlying Layer 1 blockchain.</p><p>Pros:</p><ol><li><p>Independence: Sovereign Rollups have their own consensus and can operate independently of the underlying Layer 1 blockchain.</p></li><li><p>Flexibility: They can define their own rules for transaction verification and settlement.</p></li></ol><p>Cons:</p><ol><li><p>Security: Since Sovereign Rollups handle their own settlement, they do not inherit the security of the underlying Layer 1 blockchain.</p></li><li><p>Complexity: Implementing a Sovereign Rollup can be complex due to the need for a separate consensus mechanism.</p></li></ol><p><strong>Comparison between Classic Rollups, Enshrined Rollups, and Sovereign Rollups</strong></p><p>The main difference between classic rollups, enshrined rollups, and Sovereign Rollups lies in where transactions are verified and settled.</p><p><strong>Verification:</strong> In classic rollups, transactions are verified by the underlying Layer 1 blockchain, while in enshrined rollups, the rollup&apos;s operation is governed by the consensus rules of the Layer 1 blockchain. 
In Sovereign Rollups, transactions are verified by the nodes of the Sovereign Rollup.</p><p><strong>Settlement:</strong> Classic rollups and enshrined rollups rely on the underlying Layer 1 blockchain for settlement, while Sovereign Rollups handle their own settlement.</p><p><strong>Security:</strong> Classic rollups and enshrined rollups inherit the security of the underlying Layer 1 blockchain, while Sovereign Rollups do not.</p><p><strong>Independence:</strong> Sovereign Rollups can operate independently of the underlying Layer 1 blockchain, while classic rollups and enshrined rollups cannot.</p><p><strong>Complexity:</strong> All solutions are complex, but enshrined rollups add an extra layer of complexity due to their integration with the Layer 1 blockchain&apos;s consensus rules.</p><p><strong>Conclusion</strong></p><p>Rollups, whether classic, enshrined, or sovereign, present promising solutions to the scalability issues faced by Layer 1 blockchains. While classic rollups, including Optimistic Rollups and ZK Rollups, offer significant improvements in transaction throughput and security, they are not without their trade-offs, including complexity and, in the case of Optimistic Rollups, potential delays due to the challenge period. Enshrined rollups, while offering similar benefits, add an extra layer of complexity due to their integration with the Layer 1 blockchain&apos;s consensus rules. On the other hand, Sovereign Rollups offer a degree of independence and flexibility not seen in classic or enshrined rollups, but they do not inherit the security of the underlying Layer 1 blockchain. As research and development in this area continue, it will be interesting to see how these technologies evolve and how they will shape the future of blockchain scalability.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/40af888323495ddadd577a82bb1224204767736e68bc593edc6571863274e590.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Enshrined Rollups: A Comprehensive Overview]]></title>
            <link>https://paragraph.com/@ohotties/enshrined-rollups-a-comprehensive-overview</link>
            <guid>Tf00KbJksPC9vX9XnMVU</guid>
            <pubDate>Mon, 12 Jun 2023 14:07:15 GMT</pubDate>
            <description><![CDATA[In the world of blockchain technology, scalability has always been a significant concern. As the number of users and transactions on a blockchain increases, the system&apos;s ability to process transactions quickly and efficiently becomes increasingly important. One of the solutions proposed to address this issue is the concept of rollups, and among the different types of rollups, enshrined rollups have recently gained considerable attention. Enshrined rollups are a type of Layer ...]]></description>
            <content:encoded><![CDATA[<h2 id="h-introduction" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Introduction</h2><p>In the world of blockchain technology, scalability has always been a significant concern. As the number of users and transactions on a blockchain increases, the system&apos;s ability to process transactions quickly and efficiently becomes increasingly important. One of the solutions proposed to address this issue is the concept of rollups, and among the different types of rollups, enshrined rollups have recently gained considerable attention.</p><p>Enshrined rollups are a type of Layer 2 scaling solution that aims to increase the transaction throughput of a blockchain by performing computation off-chain and posting the results to the blockchain. This post will delve into the concept of enshrined rollups, how they work, why they are needed, their pros and cons, and how they compare to other types of rollups such as classic rollups and sovereign rollups.</p><h2 id="h-what-are-enshrined-rollups" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">What are Enshrined Rollups?</h2><p>Enshrined rollups are a type of rollup that enjoys some sort of consensus integration at Layer 1 (L1). They contrast with smart contract rollups, which live fully at Layer 2 (L2), outside of consensus. Consensus integration can endow enshrined rollups with superpowers at the cost of significant tradeoffs.</p><p>The term &quot;enshrined&quot; refers to the fact that the rollup&apos;s logic is embedded (or &quot;enshrined&quot;) into the protocol of the Layer 1 blockchain itself.
This means that the rollup&apos;s operation is governed by the consensus rules of the Layer 1 blockchain, rather than being determined by a separate set of rules or a smart contract.</p><h2 id="h-how-do-enshrined-rollups-work" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">How do Enshrined Rollups Work?</h2><p>Enshrined rollups work by performing most of the computational work off-chain and then posting the results of these computations to the Layer 1 blockchain. The off-chain computations are performed by a set of validators who are responsible for processing transactions and producing proofs of the computation&apos;s correctness.</p><p>These proofs are then posted to the Layer 1 blockchain, where they are checked and verified by the blockchain&apos;s consensus mechanism. If the proofs are valid, the results of the computations are accepted and added to the blockchain&apos;s state.</p><p>The key advantage of this approach is that it allows for a significant increase in transaction throughput, as the Layer 1 blockchain does not need to perform the computations itself. Instead, it only needs to check the validity of the proofs, which is a much less resource-intensive task.</p><h2 id="h-why-do-we-need-enshrined-rollups" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Why do we Need Enshrined Rollups?</h2><p>Enshrined rollups offer several advantages that make them a promising solution for scaling blockchain systems:</p><ol><li><p>Increased Transaction Throughput: By performing computations off-chain and only posting the results to the blockchain, enshrined rollups can significantly increase the number of transactions that a blockchain can process per second.</p></li><li><p>Security: Because the rollup&apos;s operation is governed by the Layer 1 blockchain&apos;s consensus rules, enshrined rollups inherit the security properties of the Layer 1 blockchain. 
This means that as long as the Layer 1 blockchain is secure, the enshrined rollup is also secure.</p></li><li><p>Decentralization: Enshrined rollups can help to maintain the decentralization of a blockchain system. Because the computations are performed off-chain by a set of validators, there is no need for every node in the network to process every transaction. This can help to reduce the resource requirements for running a node, making it easier for more participants to join the network and contribute to its security.</p></li><li><p>Interoperability: Enshrined rollups can be designed to be compatible with existing Layer 1 blockchains, making it easier for them to interact with other systems and applications built on the same blockchain.</p></li></ol><h2 id="h-pros-and-cons-of-enshrined-rollups" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pros and Cons of Enshrined Rollups</h2><p>Like any technology, enshrined rollups come with their own set of advantages and disadvantages.</p><h3 id="h-pros" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Pros:</h3><ol><li><p>Scalability: Enshrined rollups significantly increase the transaction throughput of a blockchain by performing most of the computational work off-chain and only posting the results to the blockchain.</p></li><li><p>Security: Enshrined rollups inherit the security properties of the Layer 1 blockchain, making them highly secure.</p></li><li><p>Decentralization: By performing computations off-chain, enshrined rollups help to maintain the decentralization of a blockchain system by reducing the resource requirements for running a node.</p></li><li><p>Interoperability: Enshrined rollups can be designed to be compatible with existing Layer 1 blockchains, facilitating interaction with other systems and applications built on the same blockchain.</p></li></ol><h3 id="h-cons" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Cons:</h3><ol><li><p>Complexity: Enshrined
rollups add a layer of complexity to the blockchain system. This can make them more difficult to understand and implement, potentially slowing down their adoption.</p></li><li><p>Dependency on Layer 1: The operation and security of enshrined rollups are dependent on the Layer 1 blockchain. If the Layer 1 blockchain is compromised, the enshrined rollup could also be affected.</p></li><li><p>Upgradeability: Enshrined rollups may be less flexible and harder to upgrade than other types of rollups because their logic is embedded in the Layer 1 protocol.</p></li></ol><h2 id="h-comparison-to-classic-rollups-and-sovereign-rollups" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Comparison to Classic Rollups and Sovereign Rollups</h2><h3 id="h-classic-rollups" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Classic Rollups:</h3><p>Classic rollups, also known as smart contract rollups, are chains where instead of a consensus, security is coordinated through smart contracts on a highly secure and decentralized chain, e.g., Ethereum. They inherit the security of Ethereum but are upgradeable through the rollup’s governance.</p><p>Compared to enshrined rollups, classic rollups offer more flexibility in terms of upgradeability because their logic is not embedded in the Layer 1 protocol. However, they may not offer the same level of scalability as enshrined rollups because they still require smart contracts to handle upgrade and proof verification logic.</p><h3 id="h-sovereign-rollups" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Sovereign Rollups:</h3><p>Sovereign rollups are rollups that settle themselves, instead of a more secure settlement layer. They use the data availability layer purely for data and ordering. 
These do not have trust-minimized bridges to a settlement layer, which some argue forgoes a defining feature of rollups.</p><p>Compared to enshrined rollups, sovereign rollups offer more independence as they do not rely on a Layer 1 blockchain for security. However, this also means that they do not inherit the security properties of a Layer 1 blockchain, potentially making them less secure.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Enshrined rollups represent a promising approach to scaling blockchain systems, offering increased transaction throughput and high security while maintaining decentralization. However, they also come with their own set of challenges, including increased complexity and dependency on the Layer 1 blockchain. As with any technology, it will be important to carefully consider these trade-offs when deciding whether to adopt enshrined rollups.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/31bc80a221e623d66c759a9fca4cd216da105d3c91bc2a36ae83ab1fbcde79bd.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Sovereign Rollup]]></title>
            <link>https://paragraph.com/@ohotties/sovereign-rollup</link>
            <guid>ZMrtyxoEYZcAzobSiTO1</guid>
            <pubDate>Fri, 09 Jun 2023 17:50:35 GMT</pubDate>
            <description><![CDATA[Blockchain scalability remains a major topic of interest within the distributed ledger technology (DLT) landscape. A key development in this field has been the advent of rollups, a scaling solution that increases transaction throughput by bundling or "rolling up" multiple transactions into a single on-chain transaction. The most common variants of this technology are Optimistic Rollups and Zero-Knowledge (ZK) Rollups, herein referred to as classic or traditional rollups. These variants differ...]]></description>
            <content:encoded><![CDATA[<p>Blockchain scalability remains a major topic of interest within the distributed ledger technology (DLT) landscape. A key development in this field has been the advent of <strong>rollups</strong>, a scaling solution that increases transaction throughput by bundling or &quot;rolling up&quot; multiple transactions into a single on-chain transaction. The most common variants of this technology are <strong>Optimistic Rollups</strong> and Zero-Knowledge <strong>(ZK) Rollups</strong>, herein referred to as classic or traditional rollups. These variants differ primarily in their approach to transaction validation. Optimistic Rollups use a fraud-proof system, assuming transactions are valid unless proven otherwise, while ZK Rollups employ zero-knowledge proofs to ensure transaction validity without divulging transaction details. These classic rollups have offered promising solutions to the scalability issues plaguing Ethereum and other blockchains.</p><p>Recent discourse has seen the emergence of a new term in the rollup landscape: <strong>Sovereign Rollups</strong>. This nascent concept aims to offer a new way of achieving scalability while promising more autonomy and governance capabilities. This blog explores the concept of Sovereign Rollups, comparing and contrasting it with classic rollups, and elucidating its potential benefits and challenges.</p><p><strong>Classic Rollups: An Overview</strong></p><p>Classic rollups work within a specific environment or settlement layer, such as the Ethereum Virtual Machine (EVM). The rollup process begins with transactions being submitted to a rollup smart contract on the main chain, then they are &quot;rolled up&quot; into a single batch, and the new state root is posted to the chain. 
This dramatically increases the number of transactions processed per block.</p><p>Optimistic and ZK rollups, categorized as classic rollups, address the issue of blockchain scalability by rolling multiple transactions into a single on-chain transaction. This aggregation significantly boosts transaction throughput without necessitating larger block sizes. However, their methods of transaction validation differentiate them.</p><p><strong>Optimistic Rollups</strong> operate on an assumption of honesty. Each state transition is deemed correct unless challenged with a fraud proof. In an Optimistic Rollup, transaction data is posted on-chain, but computation is executed off-chain. If an incorrect state transition gets posted, anyone monitoring the network can submit a fraud proof, challenging the invalid transaction. If successfully challenged, the dishonest proposer loses their deposit, creating a strong disincentive for fraudulent behavior.</p><p>On the other hand, <strong>ZK Rollups</strong> leverage zero-knowledge proofs for transaction validation. In this model, all computation happens off-chain, and the resulting state is submitted on-chain alongside a zero-knowledge proof. The network only accepts the new state if the proof is valid. This deterministic approach removes the need for fraud proofs, providing an extra layer of security.</p><p>Classic rollups derive their security from the Layer 1 blockchain (commonly Ethereum), which enforces their rules and validates their transactions. This level of oversight, while potentially restrictive, significantly mitigates malicious activity risk. It also ensures compatibility with a large ecosystem of DeFi applications and services.</p><p><strong>The Emergence of Sovereign Rollups</strong></p><p>Sovereign Rollups extend the basic rollup architecture, introducing independence and autonomy. These rollups retain authority over their canonical chain, rather than deferring to a settlement layer. 
The enhanced control gives rollup-associated developers and communities more sway over governance, operations, and future directions.</p><p>Being sovereign, these rollups can set their rules and processes, adjust their system without requiring consensus from an external entity, and promptly respond to evolving needs or circumstances. In a rapidly changing technology landscape, this agility could accelerate innovation and adaptation. Sovereignty does not come without its challenges and considerations.</p><p><strong>The Trade-offs</strong></p><p>As with any technological advancement, Sovereign Rollups also present their own challenges and trade-offs. One prominent concern for Sovereign Rollups is the associated security risk, stemming from reduced external oversight. Without the stringent security provisions of an underlying Layer 1 blockchain, Sovereign Rollups must integrate robust internal security mechanisms, potentially increasing complexity and risk.</p><p>Additionally, the introduction of Sovereign Rollups could fragment the blockchain ecosystem. Each Sovereign Rollup operating its own governance model could lead to an ecosystem with a multitude of different systems, each with its own rules and mechanisms. This fragmentation might make interoperability more challenging and could increase the learning curve and barrier to entry for new participants.</p><p><strong>Conclusion</strong></p><p>Both classic and Sovereign Rollups present unique technical advantages and challenges. Classic rollups provide a secure and interoperable solution for scaling, thanks to their adherence to the settlement layer&apos;s rules. Meanwhile, Sovereign Rollups introduce a layer of autonomy and flexibility, albeit with a potential rise in complexity and risk.</p><p>Choosing between the two models involves a delicate balancing act between control, security, interoperability, and speed of innovation. 
As the blockchain landscape continues to evolve, understanding these different rollup models can aid in informed decision-making, leading to more scalable, secure, and efficient blockchain networks.</p>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/40fb1a68a211f8aee5e9fcff8bcad76be1ce6a089e00a29b472acd0799eda8ce.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Ethereum Scaling]]></title>
            <link>https://paragraph.com/@ohotties/ethereum-scaling</link>
            <guid>VlHkfyJdqLblVfjImBjR</guid>
            <pubDate>Sun, 04 Jun 2023 16:45:33 GMT</pubDate>
            <description><![CDATA[Ethereum scaling denotes an array of strategies and techniques aimed at enhancing the transaction processing capacity of the Ethereum blockchain. Ethereum, the leading programmable blockchain, is experiencing a surge in demand from burgeoning sectors such as decentralized applications (dApps) and decentralized finance (DeFi). Effective scaling solutions are, therefore, critical for Ethereum&apos;s sustainability and evolution. As of now, Ethereum supports around 15 transactions per second (tp...]]></description>
<content:encoded><![CDATA[<p>Ethereum scaling denotes an array of strategies and techniques aimed at enhancing the transaction processing capacity of the Ethereum blockchain. Ethereum, the leading programmable blockchain, is experiencing a surge in demand from burgeoning sectors such as decentralized applications (dApps) and decentralized finance (DeFi). Effective scaling solutions are, therefore, critical for Ethereum&apos;s sustainability and evolution.</p><p>As of now, Ethereum supports around 15 transactions per second (tps), while ETH 2.0 targets 100,000 tps. That throughput often falls short of growing demand, leading to network congestion and higher transaction fees. To address these issues, Ethereum&apos;s development roadmap encompasses scaling solutions that fall broadly into two categories: on-chain (Layer 1) and off-chain (Layer 2).</p><p><strong>On-chain Scaling (Layer 1)</strong></p><p>On-chain scaling involves modifications to the Ethereum protocol itself to handle a larger volume of transactions. The most notable on-chain upgrade in progress is Ethereum 2.0 (ETH 2.0, or Serenity). ETH 2.0 signifies a fundamental change in Ethereum&apos;s consensus mechanism, shifting from the existing Proof-of-Work (PoW) system to a more energy-efficient and scalable Proof-of-Stake (PoS) system. This transition aims to expedite transaction validation, thereby enhancing throughput.</p><p>A vital component of the Ethereum 2.0 upgrade is the introduction of sharding. Sharding is a scalability technique that partitions the Ethereum network into smaller units called &quot;shards,&quot; each capable of processing transactions and smart contracts independently. 
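</p><p>The partitioning idea behind sharding can be sketched in a few lines: assign each account to a shard deterministically, then let each shard&apos;s transactions be validated independently and in parallel. The hash-mod rule and every name below are illustrative, not Ethereum&apos;s actual sharding design.</p>

```python
# Toy sharding sketch: deterministically map accounts to shards so that
# each shard's transaction group can be processed independently.
# The assignment rule is illustrative only.
import hashlib

NUM_SHARDS = 4


def shard_of(account: str) -> int:
    """Map an account id to a shard via a hash-mod rule (illustrative)."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS


def partition(txs):
    """Group transactions by the sender's shard; each group could then
    be validated concurrently by a separate committee."""
    shards = {i: [] for i in range(NUM_SHARDS)}
    for tx in txs:
        shards[shard_of(tx["from"])].append(tx)
    return shards


txs = [{"from": f"0x{i:040x}", "to": "0xdead", "value": 1} for i in range(10)]
by_shard = partition(txs)
```

<p>The point of the deterministic rule is that every node agrees on which shard owns which account without any coordination.</p><p>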
By allowing parallel transaction processing across different shards instead of the existing sequential processing, sharding is expected to significantly improve Ethereum&apos;s scalability.</p><p>Proto-Danksharding and Danksharding are novel developments in sharding technology. Proto-Danksharding, also known as EIP-4844, is an intermediate solution that enables rollups to add data to blocks more cost-effectively. Danksharding, the full realization of Proto-Danksharding, will expand data blobs, allowing Ethereum to support hundreds of individual rollups and millions of transactions per second.</p><p><strong>Off-chain Scaling (Layer 2)</strong></p><p>Off-chain scaling solutions aim to reduce the load on the Ethereum network by executing transactions outside the main Ethereum chain (off-chain) while preserving the security guarantees of the blockchain. These solutions include state channels, sidechains, plasma chains, and rollups.</p><p>Rollups have emerged as a promising off-chain solution. They execute transactions off-chain and post transaction data on-chain, bundling many transactions into one via smart contract functionality. This process increases the number of transactions Ethereum can handle. There are two primary types of rollups: zk-Rollups and Optimistic Rollups. Zk-Rollups use zk-SNARKs, a form of cryptographic proof, to validate transactions, whereas Optimistic Rollups rely on a game-theoretic fraud-proof mechanism: state transitions are presumed valid unless successfully challenged during a dispute window.</p><p>The transition towards Ethereum 2.0 and the adoption of Layer 2 solutions are pivotal milestones in Ethereum&apos;s development. These on-chain and off-chain scaling solutions, particularly sharding and rollups, promise substantial improvements in transaction speed, network capacity, and gas fee efficiency. They pave the way for Ethereum&apos;s scalability, laying a solid foundation for the continued expansion of the Ethereum ecosystem.</p><p>However, scalability is not the sole consideration in Ethereum&apos;s development. 
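</p><p>The bundling step that rollups perform can be illustrated with a toy commitment scheme: execute many transactions off-chain and post a single digest on-chain in place of the full batch. The hashing scheme below is purely illustrative and is not how any production rollup commits its data.</p>

```python
# Toy rollup batching: compress many off-chain transactions into one
# on-chain commitment. Hashing scheme and names are illustrative only.
import hashlib
import json


def tx_hash(tx: dict) -> str:
    """Hash one transaction over a canonical JSON encoding."""
    return hashlib.sha256(json.dumps(tx, sort_keys=True).encode()).hexdigest()


def batch_commitment(txs) -> str:
    """Fold individual tx hashes into a single digest for the batch."""
    acc = hashlib.sha256()
    for tx in txs:
        acc.update(tx_hash(tx).encode())
    return acc.hexdigest()


# 1,000 off-chain transfers compress to one 32-byte on-chain posting.
batch = [{"from": "a", "to": "b", "nonce": i} for i in range(1000)]
commitment = batch_commitment(batch)
```

<p>The economics follow directly: the cost of one on-chain posting is amortized over every transaction in the batch, which is where rollup fee savings come from.</p><p>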
Balancing scalability, security, and decentralization is a complex task requiring careful deliberation. As Ethereum continues to evolve, maintaining the underlying principles of blockchain technology – security and decentralization – remains paramount.</p><p>Sources:</p><ul><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://ethereum.org/en/eth2/">Ethereum 2.0 Overview</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://ethereum.org/en/eth2/shard-chains/">Understanding Ethereum Sharding</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://ethereum.org/en/developers/docs/layer-2-scaling/#rollups">Layer 2 Rollups</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://eips.ethereum.org/EIPS/eip-4844">EIP-4844: Proto-Danksharding</a></p></li></ul>]]></content:encoded>
            <author>ohotties@newsletter.paragraph.com (YQ)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/d3bcca0d0913452c27369b8eec17355d0c0aa272d6af0adc996c545db36ff385.png" length="0" type="image/png"/>
        </item>
    </channel>
</rss>