<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>nikos</title>
        <link>https://paragraph.com/@0xniko0x</link>
        <description></description>
        <lastBuildDate>Fri, 17 Apr 2026 17:29:05 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en</language>
        <image>
            <title>nikos</title>
            <url>https://storage.googleapis.com/papyrus_images/3c12741152c12866d26d45746fda5299.jpg</url>
            <link>https://paragraph.com/@0xniko0x</link>
        </image>
        <copyright>All rights reserved</copyright>
        <item>
            <title><![CDATA[Reevaluating Consensus Pt.1]]></title>
            <link>https://paragraph.com/@0xniko0x/reevaluating-consensus-pt1</link>
            <guid>XSvYtlP2QL3B21b0S3AM</guid>
            <pubDate>Thu, 19 Dec 2024 07:59:56 GMT</pubDate>
            <description><![CDATA[Since the emergence of blockchains, their core ideology has centered on being decentralized systems for asset transfers and non-custodial asset ledgers. This post explores a controversial alternative to traditional systems, designed to achieve the same capabilities while scaling existing networks. ]]></description>
            <content:encoded><![CDATA[<p><strong>Status quo</strong><br><br>Since the emergence of blockchains, their core ideology has centered on being decentralized systems for asset transfers and non-custodial asset ledgers. A primary design focus has been addressing the double-spending problem, which is still widely believed to require a strong consensus mechanism among all validators and full nodes of a protocol. This assumption has shaped protocols like Bitcoin, Ethereum, Solana, and others, which rely on Byzantine Fault Tolerant (BFT) consensus or its scaled derivatives, such as Solana's Tower BFT and MonadBFT.<br><br>While crucial for decentralized networks, censorship resistance with fast inclusion guarantees is often missing in many existing chain architectures and widely used consensus protocols. In leader-based protocols like Ethereum, the leader (or block builder) has monopoly power over which transactions are included in a new block. This control enables timing games and maximal extractable value (MEV), and incentivizes vertical integration among transaction supply chain operators, introducing points of centralization and censorship.</p><p><strong>Introduction to consensusless networks</strong></p><p>While independent asset transfers do not need to be processed in a specific order, dependent operations, such as asset swaps, require total transaction ordering and generally rely on consensus. In recent years, multiple research efforts have recognized this distinction, leading individuals and companies to develop systems that combine consistent, reliable broadcast methods with consensusless mechanisms. Notable examples include Sui's Lutris, FastPay, and AT2 (Asynchronous Trustworthy Transfers). 
These approaches have emerged from the R&amp;D of "consensusless" systems: decentralized, censorship-resistant alternatives to traditional networks that do not require global consensus to transfer assets. These networks bypass the high costs of traditional BFT mechanisms while achieving low-latency performance akin to Web2 systems, without compromising on decentralization or censorship resistance. Ethereum and its ecosystem of rollups, however, still rely on slow and costly consensus mechanisms for processing data.</p><p><strong>Expensive consensus overhead is a well-known bottleneck in blockchain systems</strong>. Various solutions have been proposed to mitigate this issue, such as Tendermint, HotShot, and scaled derivatives of existing BFT consensus protocols. These solutions compete on speed, cost, scalability, security, and distribution, but still rely on global consensus. As a result, most of them trade off properties like decentralization, finality speed, or cost. Consensusless architectures address costly settlement and finality by bypassing consensus entirely, delivering a highly efficient, scalable foundation for high-performance, consensusless applications.<br><br><strong>Consensus isn't required for all on-chain transactions. </strong>On average, over 50% of transactions on Ethereum (~600k daily) are transfers, accounting for approximately 30% of fees generated, yet they don't actually require full consensus to be processed. This creates a bottleneck for scaling them and for accruing more protocol revenue - yes, for the L1. And although bypassing traditional consensus can enable low-latency transaction confirmation and finality, all while maintaining network decentralization and censorship resistance, Ethereum and its rollups still process transfers/payment transactions through slow consensus mechanisms. 
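<br><br>The reason simple transfers escape the ordering requirement is that transfers touching disjoint accounts commute. A toy ledger makes this concrete (an illustrative sketch only, not any real protocol's data model):

```python
# Toy ledger: transfers between disjoint accounts commute, so no global
# ordering is needed to agree on the resulting state. Illustrative only.

def apply_transfer(balances, sender, recipient, amount):
    """Apply one transfer and return a new balance map (rejects overdrafts)."""
    assert balances[sender] >= amount, "insufficient funds"
    new = dict(balances)
    new[sender] -= amount
    new[recipient] += amount
    return new

start = {"alice": 10, "bob": 0, "charlie": 10, "dave": 0}

# Order 1: Alice pays Bob first, then Charlie pays Dave.
order1 = apply_transfer(apply_transfer(start, "alice", "bob", 5), "charlie", "dave", 3)
# Order 2: Charlie pays Dave first, then Alice pays Bob.
order2 = apply_transfer(apply_transfer(start, "charlie", "dave", 3), "alice", "bob", 5)

assert order1 == order2  # identical final state: the transfers commute
```

Dependent operations such as swaps over shared state do not commute, which is why they still need ordering.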
<br><br><strong>Must every transaction be processed in a total order?&nbsp;</strong><br><em>Consider two payment transactions:</em><br><br>1. Alice pays Bob<br>2. Charlie pays Dave<br><br>Since these payments are independent, they can be executed <strong>in any order</strong>. They are commutative: the order of execution does not affect the resulting state, which is the same regardless of which transaction is processed first.<br><br></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/05ad74bfa0f38096a91847598eee6d06.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAICAIAAAAX52r4AAAACXBIWXMAAAsTAAALEwEAmpwYAAACUklEQVR4nF2SX0hTURzHDz0EBRGEvVTkQwTRCkrmiGam2UMmuTCJ3qykMDCCyRCC0pTon1Ko0TLIBx+MjbHaWLpGPawpQdFD9eTaHtzG1rLrupdzOueec88v7i6tsR/n4Zwf38Pvz/eDGKeMU8ooJphznXFKqEYotvKsfAwpJMjqjARZyRCKCdVq9ABg3ZH1Puw81NrWWlIVnRNCKeOccaJhtaQqANB35TJCqKQqQuqEYkOKQNAfjoQAJKEaZYRywTnVsKqsrVb0ytqqkLpZgHF6sqO90WH/+u0LAJw9s+f6YDcASCEYYwDgGfQghHL5rCEFoRjAsDvsTc1NUI7Z2clm57pMLmN2XdZ3uk4hhArFvCEF0rAKAF7v4/UbNvnmHu3YjGx7kfua6zcmt++MDnjcADAffV2ZwNLb9tku9p4HgN31qKFh48H9qFDMhSOhngs9qXQyFH5Vt7XOkMJcUbkjeBNbONHe8e6t//ixnbu2o3PdB4SUSx8SgYAPAIIvA6e7XNasVoGWlqMjo7cAoKvTdsS5rX4LSqWTmezKC99cSVVmZp43OuyiqoCMxhb6r/b/KOYBYGSod/qp+VkKwTkHgHAkNOBxF4p5IXUh9eXvyzeHb0xOTaTSSQCIx+f7LrUpJdMtIUx7A0H/8MhQLp81PbBMi8YWpp95U+nkH0pEOSqEGFIkFt/fu3/356+ikLohxeJSYurJxNj4g0+fPzITClJhhlAspB5PxMcfjmWyK/9N5pxbK7OWUIMd53o1psKk0MT0H9ZYw6UafbkkY5z+Bc7rZOWcHcEkAAAAAElFTkSuQmCC" nextheight="143" nextwidth="546" class="image-node embed"><figcaption htmlattributes="[object Object]" class="">Source: <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://pod.network/blog/wait-why-do-we-need-consensus-again">https://pod.network/blog/wait-why-do-we-need-consensus-again</a></figcaption></figure><p><strong>However, to maintain the first principles of blockchain 
</strong>and prevent fraudulent actions like double-spending, validators sign valid transactions independently, based only on the data they receive and observe, rather than agreeing on a global order. A transaction is finalized once it has gathered independent signatures (a certificate) from more than two-thirds of the validators - assuming less than one-third are malicious. Because each validator signs only the first of any conflicting transactions it receives, at most one of them can ever gather a certificate. Only a certificate-holding transaction counts as valid, so in the event of a double-spending attempt, only the transaction holding the certificate will be accepted.<br><br><strong>Conclusion</strong></p><p><strong>Addressing scalability challenges could unlock a transformative wave of innovation</strong>, attracting EVM developers and encouraging users to embrace more efficient, scalable blockchain applications. By eliminating the bottlenecks that currently limit mass adoption, scalable infrastructure can drive exponential growth across blockchain ecosystems. Consensusless protocols offer the foundation needed to capture the majority of such transactions, providing unparalleled scalability for developers while preserving the core principles of blockchain technology: decentralization, security, and censorship resistance. They empower developers to build applications capable of serving millions without compromise, and can propel blockchain toward mainstream adoption across industries.<br></p>]]></content:encoded>
            <author>0xniko0x@newsletter.paragraph.com (0x4e)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/52ce518beda69c7e120511ecf8f2e114.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[The Multichain Endgame ]]></title>
            <link>https://paragraph.com/@0xniko0x/the-multichain-endgame</link>
            <guid>6VxrnKpTATd7eC9vqItH</guid>
            <pubDate>Sat, 09 Nov 2024 11:48:18 GMT</pubDate>
            <description><![CDATA[What the Multichain Landscape looks like today “When in doubt, zoom out.” ]]></description>
            <content:encoded><![CDATA[<div class="relative header-and-anchor"><h3 id="h-what-the-multichain-landscape-looks-like-today"><strong>What the Multichain Landscape looks like today&nbsp;</strong></h3></div><p>“When in doubt, zoom out.” - Reggie Watts. Web 3.0 is a diverse space that presents many opportunities. As a developer, perhaps you’re cooking up a killer decentralised application (dApp) in the latest, most trending narrative in hopes of attaining product-market fit. As a trader, perhaps you’re drawing setups in hopes of catching every move, up or down. As an on-chain memelord, perhaps you’re scrolling through your hundredth Discord channel in hopes of finding the next 100x low-cap gem.&nbsp;</p><p><strong>These are just a few of the many options available to you in the space.</strong> It’s easy to get lost in the mix and lose sight of the bigger picture. One of the reasons Web 3.0 was conceived was to revolutionise traditional financial systems. Although Web 3.0 has already been around for almost a decade, it is still a highly nascent sector. The industry has to continue innovating in order for Web 3.0 to see long-term success. While nobody can foresee the future and what’s to come, in 2022, Vitalik shared his vision for a multichain Web 3.0 ecosystem as the way forward.</p><p>A multichain ecosystem refers to deploying dApps across multiple blockchains, resulting in multiple instances or versions of the same application in various blockchain ecosystems. With a multichain ecosystem come challenges such as inconsistent user experiences across chains due to different tech stacks and capabilities. Another challenge is fragmented liquidity, which leads to liquidity discrepancies across chains. More such challenges, as well as their proposed solutions, will be covered later on.</p><p>We have not truly achieved a multichain ecosystem just yet. However, with 311 different chains currently live, the stage is set. 
In this article, we will dive into the various multichain technologies available today, cover several multichain case studies, and look into the future and what it holds.</p><p><strong>Composability, the key to opening up a new dimension</strong>&nbsp;</p><div class="relative header-and-anchor"><h4 id="h-a-primer-on-composability">A primer on composability&nbsp;</h4></div><p>At a high level, composability refers to the ability of dApps to build on top of each other, allowing for seamless integration and the formation of complex and innovative solutions. Composability is one of the most distinctive features of Web 3.0 and DeFi, given their open-source and permissionless nature. This raises the question: is composability for everyone?&nbsp;</p><p>When thinking of composability, the first thing that comes to mind is the Curve-Convex integration. Curve is a Decentralised Exchange (DEX) and Automated Market Maker (AMM) which incentivises Liquidity Providers (LPs) with $CRV rewards. Curve pioneered the vote-escrowed governance model, which gives users the option to lock $CRV to receive $veCRV based on the lock duration and the amount locked. $veCRV then allows LPs to boost their $CRV rewards and collect a portion of the fees from swaps and loans that occur on Curve.&nbsp;</p><p>Convex was developed with the goal of optimising yield for Curve users by aggregating $CRV. Instead of having to lock up $CRV tokens for extra rewards, Curve LPs can deposit their LP tokens into Convex and earn the same boosted yield. Convex also allows $CRV holders to convert their tokens to $cvxCRV, which is then staked on Convex to earn token rewards. Users can also opt to lock $CVX (receiving $vlCVX) to earn a share of platform fees and voting power over Curve emissions weights. 
Via the Votium bribe platform, $vlCVX holders can direct these emissions votes to specific liquidity pools in exchange for external incentives.</p><div class="relative header-and-anchor"><h4 id="h-interoperability-as-the-key-for-the-evolution-of-composability">Interoperability as the key for the evolution of composability</h4></div><p>By leveraging composability, Curve and Convex shot to fame. Convex was dubbed the “Kingmaker of DeFi”, and rightfully so. Today, Curve is deployed on more than 10 different chains, embracing interoperability and its benefits. However, Convex, its partner in crime, has chosen to reside solely on Ethereum mainnet, presumably due to the concentration of liquidity and demand there. The Curve-Convex integration is a great application of composability. However, due to the nascency of cross-chain liquidity solutions at present, this particular “DeFi Lego” has yet to take off on other chains.&nbsp;</p><p>Interoperability creates a unified environment where dApps can interact across multiple chains. While composability is not fully dependent on interoperability, it is greatly enhanced by it. Interoperability allows dApps to scale horizontally across chains, while composability allows vertical expansion via the permissionless integration of various other protocols. Future multichain solutions could enable one to bring “dApp Legos” smoothly across chains, opening doors for greater innovation and collaboration.&nbsp;</p><div class="relative header-and-anchor"><h4 id="h-doing-away-with-interoperability-and-composability">Doing away with interoperability and composability</h4></div><p>Current limitations in blockchain technology, such as throughput constraints, network congestion, and high transaction fees, prevent certain applications from being fully supported on-chain. Some applications require highly specialised blockchain solutions with custom technical requirements. 
Instead of leveraging interoperability and composability across networks, they opt to develop custom chains tailored to their needs, prioritising control and optimisation over seamless integration with the broader ecosystem. Effectively, these applications remain within their own ecosystems, which raises questions about the importance of composability in Web 3.0.</p><p>One such example is Hyperliquid’s L1 blockchain. Hyperliquid is a high-performance perp DEX. Unlike other projects that leverage existing Layer 2s or Layer 1s with broad interoperability, Hyperliquid chose to deploy its own L1 to cater to the specific needs of its platform. By developing a custom L1, Hyperliquid can optimise throughput, reduce latency, and avoid the scalability and network congestion issues commonly faced by more generalised blockchains. This way, Hyperliquid maintains full control over its infrastructure, ensuring that the technical specifications align with the demands of high-frequency trading and liquidity provisioning. Ultimately, this approach prioritises performance and security over composability with other networks, enabling Hyperliquid to create a specialised environment tailored for its user base. Since its inception in June 2023, Hyperliquid has seen great success. It currently has a Total Value Locked (TVL) of US$671m, with a cumulative trading volume of more than US$290b.&nbsp;</p><div class="relative header-and-anchor"><h4 id="h-is-composability-oversold">Is composability oversold?</h4></div><p>In view of Hyperliquid’s success, is it fair to say that interoperability and composability are not necessary to build a great product? Although the application-specific approach worked for Hyperliquid, not all types of protocols and products fit this same mould.&nbsp;</p><p>Is composability oversold? Is interoperability really necessary? There are no easy answers to either of these questions. 
However, if application-specific chains and infrastructure continue to be popular, this further solidifies the bull case for interoperability solutions. As these solutions continue to improve, perhaps there will come a time when “dApp Legos” can be brought multichain seamlessly.&nbsp;</p><p>Given the limitations of the current L1 architecture, which lacks a clear path for scaling at the base layer, the concept of moving execution to an off-chain environment was introduced. Since then, we've seen the rise of nearly 100 rollups, all designed to massively scale developer and user experiences by reducing latency and fees. These rollups offer modular components that allow for the creation of either generalized execution environments or specialized ones, tailored to the needs of specific applications. The advent of rollups and Layer 2 solutions has scaled Ethereum but negatively impacted the developer and end-user experience. Developers are forced to choose specific L2 platforms for their projects, while users must navigate a complex landscape of interoperability tools just to interact with decentralized applications. This fragmentation is expected to worsen as more rollups and application-specific chains are introduced.<br><br>However, because these networks maintain sovereignty over how they operate, order, and sequence transactions, the largest rollups have become siloed within their own ecosystems while growing increasingly profitable, particularly through control of transaction sequencing. A rollup's control over sequencing creates a centralized and fragmented liquidity landscape, which does not reflect the original ideals of decentralization, censorship resistance, and the unified composability layer we see in several L1 environments. <br>Additionally, the notion that rollups inherit the censorship resistance and decentralization of the underlying L1 is misleading as long as the sequencer remains under the control of the rollup entity itself. 
This is evident in the significant revenues generated by platforms like Base and Arbitrum through their sequencing operations.<br>Hence, interoperability among L2s is currently fragmented and almost non-existent, available only through clunky third-party bridges and routing aggregators. Several teams are actively working on solutions to seamlessly unify liquidity and user experience across existing L2s, fostering collaboration and enabling them to opt into third-party solutions that could maximize Ethereum's network effects.&nbsp;</p><p><br>What follows is a closer look at the current building blocks aimed at addressing the challenges of interoperability and composability.</p><p><strong>Shared sequencing</strong><br>Shared sequencing, and the sequencer decentralization that comes with it, is among the most promising approaches to realizing the original vision behind rollups.<br><br>Shared sequencers are rollup-agnostic solutions that aim to enable synchronous composability by submitting transactions from multiple rollups to a shared, mempool-like environment. The sequencer then selects and batches these transactions to achieve synchronous, atomic transactions across rollups. Current shared sequencer designs vary; some focus solely on aggregating and ordering transactions, while others execute them too. <br><br><strong>Case study: Espresso’s shared sequencer</strong><br>To address the limitations and challenges faced by current Layer 2 (L2) solutions, Espresso introduces a shared sequencing network designed to provide secure, high-throughput, and low-latency transaction ordering and availability for the existing optimistic and zk-rollup landscape. Espresso focuses on preserving the inherent properties of L2 solutions while leveraging Ethereum's economic security through restaking. 
By breaking the current sequencing monopolies, its shared sequencing layer aims to enhance user experience and enable atomic cross-rollup transaction execution, while providing features like preconfirmations.&nbsp;<br><br>At its core, Espresso's shared sequencer is built on HotShot, a modified consensus protocol derived from HotStuff BFT, along with its own data availability layer, Tiramisu. This combination allows Espresso to optimize for both high throughput and fast finality for cross-rollup transactions, with security guarantees similar to those of the underlying L1.<br><br></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/9cd5fd0a80fac5319a3971643c6ebfe9.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAXCAIAAADlZ9q2AAAACXBIWXMAAAsTAAALEwEAmpwYAAAEWklEQVR4nK1V3U9cRRS/icQQ0wSzydoXUsJ2XQuLC5tdLnd2Pu7MnWX27l5WsrBsVUxKi4k1hbaAGiuUaIwYEtptqQ+mffI/MLaJaQgvvNBGKLq2YIjVPpHwaE32gZdr5g69WbeJYHRyHibn6zfnzPnNaISwQ4hFWI6wHKY5Qu3DheyLdqAHxpSwHE2XrMwwE+9QXkTYksr/BQAABABC2CKk1xOLEIEwAwBCSP4TAITEzwIA8tP5Z1fYhwewPJEbCEmPURts1RzWqvFUh0ApgJ831QFY1CpaoujVjiCEJqGUqphek1AEIcYmIZZJTF+8jjGMTQSRYRgpkMLY9NyYZxIKTJM71g/NEmIliDgk+3vM3pRjgylmQ8nUAGIDhNrSxxyC7CRmQzIRy1mZYewpdTRgoEErMyznTWHIjaUR0jt/xd3cdtfW3XLZ/fY79+G2e+8H91HFHZ34xRRnV5bdrd/d7++4V6+6P1bcBxV3fd1dWXaZPXb64sO1Jff+ulQ+qLirq+79u+57H/yGIZAYcqB7NUIzAyPfnL/867nprfzwrVPjyxdmt9/9cG185nG2OEfTJ89Mrk5+9njkwkr+7a/HZ7fGL2+OTW+embxnZYazg5+OXPz53CeV87ObY9OP3v+4cmr8J3vwc1k3ps8AWI4gnIy3dyc6MIIQ6Hoi1hk7kYy3m0TeLQSJ7vjrPXoXxtDQ412x9mQ8qic71ZX0JDv0RMzQ44ae0BOx7kTUMAxCPKLIFvVq0o/1MV4AKAuJw3gBsTcQ6UOkL4WEyRyTF9KZEmH9hPWzdJGLoskLiPSZzEkhITe8wHghRXKI5gnrT6GMjGIOoZk6Hlhq3hGiACDOhUmlhtJ0U1MgGDxKLZGCpKHhxdaW4wjLGaU0TWn6GSewGug6ctQTDQCk68BL+rKmaQhRIbKat2zbkQGaFgwe5VxEo12apkUibRjLA9Uy4wAm67pBCAsGX9E0TddBJNLW1tbe3HyspaVVSSh0PByOxONJBfAc/w8CUE7Kr6kpQCmdm/tC07TGxsYGuV44cuQlha3roO5FOiwApWnVFghxNNoZjydeO9HxaqTNq6YjHI5Eo52xmGxRS0sr51kf5rAAEJJotCsUClOaFiL
b0NAYCARt21HAoVCYcwEAam4+puuGuud/2yLIueBcEML8alRSTdMCgaB6ZzjP1mb/JwBqln2x7ZxtO7btOE6ec1Fr4lwok7LWBQqRVXrHyQshK1NHkQA+vm07Gxsb1Wp1d3e3Wq3evHlLcYLStK6DS5emn/759A9vPXnyZGTktKKhil1aWt7b29vZ2alWq7dv3/FpuF+B34qZmdly+fr8/JXFxRujo2c9fu7Dl0pvlcvXFxauLSxcm5v70nHytcMzNfXR4uINL/CriYkpCWDJD0PeQW2lug6SyR4lAOBaEyHMNyWTPXWBAKCaQFkZpfLD+QtiAmt9NKsMLwAAAABJRU5ErkJggg==" nextheight="539" nextwidth="758" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p style="text-align: center"><em>Source: </em><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://hackmd.io/@EspressoSystems/EspressoSequencer#IV-Building-the-Espresso-Sequencer"><em><u>https://hackmd.io/@EspressoSystems/EspressoSequencer#IV-Building-the-Espresso-Sequencer</u></em></a></p><p><br></p><p><strong>Sequencer Marketplace</strong><br>Espresso enables applications to sell their sequencing rights through an open marketplace governed by an auction, where participants acquire execution tickets for the sequencing rights of chains and, if successful, become the sequencer for a given slot on those chains. This coordination layer enables atomic cross-rollup transactions and fast pre-confirmations, boosting the utility of rollups and enhancing the UX. Chains can set a reserve price for their blockspace or designate their own sequencer if the price isn't met. Additionally, the Ethereum L1 proposer has a right of first refusal on sequencing rights sold in the marketplace, allowing participating chains to function as Ethereum rollups and enhancing composability with Ethereum, an idea derived from based sequencing. <br><br><br><strong>Cross-chain intent bridges&nbsp;</strong><br>Intents are paving the way for a transformative shift in the on-chain trading experience. At their core, intents represent a user’s desired action - such as swapping Asset A for Asset B. 
These intents allow for flexibility, as they can be fulfilled by any party (referred to as fillers or solvers) in various ways, providing a more adaptable and user-centric approach to trading. As the current DeFi landscape is fragmented, liquidity in AMMs is limited, causing a poor trading experience for every participant - from HFT traders to the everyday meme trader.&nbsp;</p><p>Intents introduce an elegant alternative, where a user can broadcast their intended trade to a wide array of fillers/solvers. These third parties have the flexibility to batch a set of transactions, tap into off-chain liquidity, or match two transactions in a coincidence of wants (CoW) to fulfil the intent with the best possible execution for the user, while earning a fee for facilitating this. A decentralized network of fillers/solvers thus competes over every intent.&nbsp;</p><p>Just as money flows almost seamlessly within current monetary frameworks, certain applications and user groups rely on robust and secure cross-chain solutions to unlock new levels of sophistication and innovation in product development and user experience. <br><br><strong>Intents x Interoperability&nbsp;</strong><br>Many existing cross-chain solutions use various approaches to enable communication between blockchains, with most relying on message passing (often off-chain) to facilitate end-to-end data exchange. These protocols require robust validation, transport, and security mechanisms to ensure reliable communication. Leading protocols such as Axelar, Across, and LayerZero enforce strict rules to relay only fully finalized transactions, ensuring security but introducing latency, especially on blockchains like Ethereum, where finalization can take several minutes. 
While messaging-based designs have been widely used, they often suffer from slow transfer finalization, high costs, and security trade-offs, highlighting the need for more efficient and secure cross-chain architectures.</p><p>Intent-based systems seek to optimize real-time, secure trading experiences by allowing users to define the desired end-state of a transaction. A network of fillers then competes to achieve the user’s outcome as quickly and cost-effectively as possible.</p><p><strong>Case Study<br>Axelar Cross-Chain Intent Communication</strong></p><p>Axelar has pioneered the connection of multiple blockchains and rollups, enabling near-instant cross-chain transactions through an intent-based system. In Axelar's framework, users define their desired actions - such as swapping asset <em>X</em> for asset <em>Y</em> from Chain <em>A</em> to Chain <em>B</em> - by specifying these actions as "intents." Once an intent is submitted, it is broadcast to a decentralized network of "solvers": third-party entities or validators that compete to identify the most efficient route and execution path for fulfilling the intent. These solvers then execute the transaction across the appropriate chains, leveraging Axelar's secure and standardized protocol. 
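<br><br>The submit-broadcast-compete flow above can be sketched as a toy model (the types, field names, and prices here are hypothetical illustrations, not Axelar's or any real system's interfaces): a user posts an intent describing the desired end-state, solvers quote fills, and the best valid quote wins.

```python
# Toy intent settlement: solvers compete to fill a user's cross-chain intent.
# Hypothetical types for illustration only; not any real bridge's API.
from dataclasses import dataclass

@dataclass
class Intent:
    sell_asset: str        # e.g. asset X on chain A
    buy_asset: str         # e.g. asset Y on chain B
    sell_amount: float
    min_buy_amount: float  # the user's acceptable end-state

@dataclass
class Quote:
    solver: str
    buy_amount: float      # what the solver promises on the destination chain

def select_winner(intent, quotes):
    """Pick the valid quote that delivers the most of the desired asset."""
    valid = [q for q in quotes if q.buy_amount >= intent.min_buy_amount]
    if not valid:
        return None  # intent goes unfilled; the user keeps their funds
    return max(valid, key=lambda q: q.buy_amount)

intent = Intent("ETH@chainA", "USDC@chainB", 1.0, 3000.0)
quotes = [Quote("solver-1", 3010.0), Quote("solver-2", 3025.0), Quote("solver-3", 2990.0)]
winner = select_winner(intent, quotes)  # solver-2 wins with the best valid fill
```

A real system would additionally verify the winning solver's execution on the destination chain before releasing the user's funds; this sketch covers only quote selection.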
This approach combines the security of on-chain execution with the flexibility of off-chain processing.<br></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/695928b1cf0ccb72787ab71d6aa549a9.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAVCAIAAACor3u9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAE1klEQVR4nJ2VXXATVRTH7xRxfJBBdHixIjpCkREYZYoFsXwNVBzUVqs+gR3L9MEBBHlQXpTyOZUifhQYoUKVdmYpNFObAGPaJg3phm2TbOi2aTZJk02abGjupt2E7DZNNx/XSbbGQsEB/3Me9p797/3tOXvvXYAQwnH8pl7fqdHc1OuJrMwkqdVqOzWacDgiSYlcpNLpVDodn5QkKYEQkpNET69ej5tJ0kySkiShewUQQrt273k+f8EHpWWFq1YXF6+v3Fm1a/ce+bq314gQGuUjbWq8aP3Wqj0HdlTt/bTiCw1uEcSYPMWRo8c2btpcWvbhxk2bfX7/AwDTJYpi9tUkCCHP8xBClmVx48BlVdeSFUX5i5YDMHvZqmKlhjCYKIfD6XA4WTbAsizP8/dN9S9AulcIIQihxXKbzyoSDpspWtdDfVNd89KrK/MXLatvUmh7+mxD3hDHways1kEcxx8KmCkIOZZlc0NJSjjdfpYbu9yqUnfhngB0++5M9/M8b7HcfgwAz/PTAShbJULoh1Onmpubc8N7AZb/XwFCKJVOI4QYhoEQ5oaPWEF6bGxEFEVJSsTGxSG7xWbrtdn67gPIstvtD8qn1X+pyj/+BCE0EY9xQa+TNrE+e0KKTQFIY7vDdms05HLY8P6+jiGnwUQo3a7+GRMhl9vNsoH7koIQufT7j41/1Ok0f6pvYK0t5661XWyoP3GtrWFcjGZa1H1T9dn290q2FO3f93nFjrINGwq/3LWdeRzAm4VLnnwCzJ4F3lr92tw5oKDgBQDAkoIF8biQAeh1ym3vrtuyec3+fZXlH20pfvuNr/ZWPDogIcV+PXusYHH+whfnAwCenZcHAMjLAw2/1UqTYgZAkVqKVNsH9X0mtd3W7fNaTIRqyE4+ImBcjGo7mq9gp69gZxrqT1zBziiu1l/FzrbfaBQFPgMI85zF1GUibug0Kr9/eHIyHvC7An7XfwAm4nGX20X09NI0zTCeW7d0g/04RWoZ1+1oNLOl4/GYb9g2Loanlqm87KzWQXkVPkyOzMGQWUU8z5eUvDPvufkrXl9ZtGbt+6VlM485WRlAKp1KpJIIIZqmR4JBOZNKp3Km1D/DYd8wQRB0VgzjYdkAw3gYxuPz++XkwMBAVIjKfvmRqQpk/unTZx52pMhKJpKjoTE5spsxMBIMjgSDEHL8KC/cFfr6KIZh0DRlAD7OZ/VYwzH+YO3htg4lJ0CrxzoaHc2ZQmKIjbB37t5hIyyMQjnuTkRyBi7CeXgPJ8BBt5UaomAUOoPOkBjKAFLpVDvV3tTddPjC0Z+af65T1NUp6jAcs3imzhZJklp0iur6Q4fOH6nFTh48V338Us3XvxzoJDvl7kmS1NXfdbHj4vFLNYcvHK3FTn77a3VTd5Pepp8C4A7DedX5pcXLQR5YWrx8645tSlKpJjpMRjNN05U7qzaVl8xZ+AwAYH35RjAfPL1gLngKfHfyoPw/qKiorDn3vZJUri1dB2aBV1YtfrlwcaOm0ejK/KwyLers11w2NLeaWuVoIa426hopL5VMJCVJio3H9DZ9i7FFQShyHoVRYWbMyURyIj6RTCQNtAHDsdzd69R1
DMcMtGEKMDI24oVeH+fLhRd6hZiQazHk4UwDL4RzBl4IzzTIX/Fvrx6EchWV5HkAAAAASUVORK5CYII=" nextheight="460" nextwidth="693" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Uniswap Labs and Across have also partnered to pioneer ERC-7683, an initiative aiming to improve cross-chain interoperability with a new standard.<br>While current intent-based systems fulfil user intents through dedicated networks or relayers, often operated by decentralized validators, they face significant challenges in attracting a broad range of solvers. These challenges stem from complex integration requirements, infrastructure inconsistencies, and other implementation hurdles, which often result in the centralization of execution among the most sophisticated and resourceful solvers.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7035b950b05113d762063dcc8d2f59d0.png" 
blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAQCAIAAAD4YuoOAAAACXBIWXMAAAsTAAALEwEAmpwYAAAEQUlEQVR4nH2U3U/bVhiHU2kXvViloWq721X/pUntbivRar2oKKKBoAi6jiFNuWAZLEsTUhvsGhMw0ZJgGsgXwW4ch0AamSRzEvLpOHGjAAFiDAmenLCsINRHR0fvhXUeve/vHKva7falfNld7Q6SJF10kCTp9OT0oH5QLBZ5nhduQ/mGaxznjk5Kyt4tTgrKOs4dnR2KKlmWm5XmQeJAFES5gyiKQk34VPvUaDRkWS4UC16fx+vzoigKwzCCIL19wbpQSOR2XtFhDRHWkNFf6egkHdaQ4dEP29pgaGgru5JWibVmYTUdX4wW1zKNyqEsy2NjY3dVd/u+7vv2m+8efP/g3lf3VCrV/b77BoPB6/WmUikMw0AQVBwL73JMJjJGUy8CjH637Cvx/nLSFAtrghEtRQ1sZhaSqpPCMbvE2KewfSxeSfBSS7qjuqO6jbfA20wmEwwGWZbFMMxsNiOLC/nYfmScptVEwZELzHrWfrdz7qK9f9H1dIUa2NxfSqlaYqsa5MqeokDxrbOWLMscx9md9rX3a6v4qmvDNfPnzMNHD/uf9Mfj8d3oLgAA+Xze7XZrtdo5aK6wl42M06GhrdR8oriRzrrYLMbSw1ueZw5vvyO7sq9k0D5vl9Pc+dm5Urfb3SR6dSqVmjXPQhCEomgsFhMEgSAIBEGMRiMwDxYZRUCriW1tMGlmWGAvMh7cHqWIgfeefvuVQJblbDYrSVLv3N4tkmU5Go3qdDqDwQDDsNVqdbvdEATBMGw2m+eguasRDRP0MEmrCVpNKgGM0TcFuWyuJ7jRAcMwer1+YmJCq9WOjIxoNBoAABAEMZlMc9B8d0T0MBEe/eD5yQ7+YEB+NKOPLPgTzP/UmcXSV4J8Pv8FwdTUlNFohGEYBEGowy0CjdJBaMhPqwNhDRl4jns/7+DLAv0fepPJhKIo0gGGYRRFZ82z10b0kki8YVLzCXqY7GSwflMgispDa3/GxcVFVzA5OTk9Pd3tAARBnU43ODj4+ufXEAL1rmniTYxz58q+PAswES0VeI67H9tuD/lGB/F4HACA5eVll8uF43h3x3Hc6XRasaXOQwtFxkl+k9+YWf37N2uVrEZ+IVYfL1EDfuUddE/JZDIcx9Xr9cMOjQ7dAlcOXXM4HCAIWiwWBEFsNhsMw+uudZdnI7PD7rwK02pib+Zj2Vfg/aWkmQlryG1tkH5JZDH2qoNms1mv15PJJEmQVJAiSTIcDgcCgQpf8fv9FosFx3GHw+F0OjEMs2E2DMOwZQxCoGw0HdFSoaEANbDJ6Hfjf32kXih1aGiLfObLoP8J/v/NdahWq4IgcBzXbrd5ng8EAt4Ofr8/Hk+wLJtMJv9JJtOZ9OnRaYXgy55S2VPi3CVuo8T7OYGuCHSlGiyflI6vCW7NoFarsSwbiUQYholGo93ke1y2LqVDqVlpNvmmWBPFmigdXMvyX3cXr+5R8B+zAAAAAElFTkSuQmCC" nextheight="327" nextwidth="655" class="image-node embed"><figcaption htmlattributes="[object Object]" class=""><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" 
href="https://blog.uniswap.org/uniswap-labs-and-across-propose-standard-for-cross-chain-intents">https://blog.uniswap.org/uniswap-labs-and-across-propose-standard-for-cross-chain-intents</a><br><br></figcaption></figure><p>ERC-7683 addresses these issues by proposing a standardized intent-based architecture. This framework establishes a unified network where any filler or solver can participate seamlessly, fostering global DeFi interoperability and supporting a diverse, decentralized network of solvers worldwide. While intents have proven to be an effective solution for cross-chain interoperability, current systems often operate within isolated networks, leading to potential latency issues, value extraction, and centralization risks due to reliance on a small group of solvers.&nbsp;</p><p>At the center of this proposed new standard are the "Cross-chain Order" and "Cross-chain Settler" functions. These functions define the constraints that settlement contracts must adhere to in order to resolve implementation-specific orders and initiate them on-chain. This new standard allows users to submit cross-chain intents that remain consistent across platforms, whether directly on Uniswap, Across, or any other system. This standardization enables cross-chain applications to share a common set of fillers/solvers through a universal network, eliminating siloed systems, reducing the risk of centralized dependencies, and improving the UX for dApps, fillers/solvers, and end users - all at once.&nbsp;</p><p><strong>Based sequencing - Based Rollups</strong> <br>Still in a very nascent stage, rollups are considered "based" or “L1 sequenced” when their sequencing is directly determined by the base layer - L1. Specifically, based rollups are a class of L2s that leverage the inherent sequencing capabilities of an L1 proposer. 
In this setup, L1 proposers, along with searchers and builders, can permissionlessly include the next rollup block (<em>n</em>+1) within the subsequent L1 block (<em>n</em>). This design significantly enhances composability among rollups, fostering seamless interactions and interoperability between them.</p><p>L1-based rollups achieve universal synchronous composability (USC) by leveraging the base layer for sequencing, which allows cross-rollup transactions to be finalized directly on the underlying L1. This approach stands in contrast to other rollup solutions that rely on third-party message-passing systems or shared sequencing marketplaces. By utilizing the decentralization, security, and liveness inherent to the L1, L1-based rollups not only optimize for scalability but also create a more interconnected rollup ecosystem.</p><p>Unlike current rollup solutions that depend on external interoperability protocols, this method offers an in-protocol solution that directly leverages the security and liveness guarantees of the Ethereum validator set. This integration enhances the overall reliability of cross-rollup transactions, while minimizing dependencies on external systems and protocols. As a result, it fosters a more robust, secure, and scalable cross-rollup ecosystem, building on the decentralized foundation of Ethereum itself and further reducing the risks associated with centralized or third-party intermediaries.<br><br>When multiple rollups are sequenced by the L1, they benefit from synchronous composability within any L1 slot where the proposer is granted the right to act as the L2 sequencer across multiple rollups. 
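The shared-sequencing role described above can be pictured with a minimal Python sketch (names and structure are my own illustration, not any protocol's API): during its slot, a single L1 proposer embeds the next block of every opted-in rollup into one L1 block, which is exactly what lets a cross-rollup bundle land atomically.

```python
def sequence_based_rollups(l1_slot: int, mempools: dict[str, list[str]]) -> dict:
    """Toy model of based sequencing: during its slot, a single L1 proposer
    orders the next block of every opted-in rollup inside one L1 block.
    Because one party sequences all of them, both legs of a cross-rollup
    bundle are included in the same L1 slot (or neither is)."""
    return {
        "l1_slot": l1_slot,
        # one rollup block per rollup, all committed in the same L1 block
        "rollup_blocks": {rollup: list(txs) for rollup, txs in mempools.items()},
    }

# A hypothetical atomic arbitrage spanning two rollups:
block = sequence_based_rollups(8_000_001, {
    "rollupA": ["buy ETH on rollupA DEX"],
    "rollupB": ["sell ETH on rollupB DEX"],
})
```

A non-based rollup would need a bridge or an external shared-sequencer marketplace to get the same atomicity; here it falls out of the L1's own ordering.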
Here, the L1 proposer functions as a shared sequencer among a set of based rollups, facilitating sophisticated cross-rollup transactions and interactions natively at the base layer.</p><p><strong>Chain abstraction </strong><br>As the multichain ecosystem evolves, efforts to scale EVM infrastructure and embrace Ethereum’s rollup-centric future have resulted in over <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://l2beat.com/scaling/summary"><u>100 rollups</u></a> and numerous standalone appchains. While these advancements address scalability and sovereignty, one critical challenge remains unresolved: user experience (UX). Despite progress in scalability - largely empowered by the modular thesis - user frustration continues to stem from fragmentation. Assets are dispersed across multiple wallets, bridging pathways are complex, chain configurations are diverse, and gas tokens vary widely. This added complexity not only increases the learning curve for both developers and users but also raises user acquisition costs and slows broader blockchain adoption. To overcome these challenges and create a seamless, user-friendly environment, the industry must introduce additional layers of abstraction, simplifying interactions for both users and developers.</p><p>Introducing and implementing account abstraction was the first step towards enhancing UX, but it doesn't address the coordination of assets in a multichain environment. <br>The core idea behind chain abstraction is to solve the coordination problem based on a pretty straightforward assumption: blockchain technology should be invisible to users. <br>In other words, users should not be aware that they are using a blockchain, nor should they know which blockchain they are using. 
Users should be able to execute logic from one point of execution, eliminating the need for them to switch networks, sign transactions on different chains, or manage independent balances across networks. For the first time, users can seamlessly interact with a dApp from any supported chain, using any token, all without ever leaving the dApp's UI.</p><p><strong>Particle Network is leading the charge in chain abstraction innovation</strong></p><p>Particle Network is an emerging Layer 1 blockchain with a single goal: providing a superior user experience that many others in the space aspire to achieve. Particle Network brings a solution to the industry's challenges through its implementation of an L1 that unifies all chains through Universal Accounts, simplifying the blockchain experience with Chain Abstraction. This approach addresses issues such as fragmented liquidity, cumbersome and expensive cross-chain transactions, and suboptimal user experiences across different blockchains. By leveraging its modular core components, Particle Network creates a seamless interaction experience, making it almost indistinguishable from using a single chain. In essence, Particle Network provides users with one account and one balance usable across the entire ecosystem.</p><p>To achieve the notion of a fully blockchain-abstracted future, Particle Network is built upon three critical components:&nbsp;</p><ol><li><p><strong>Universal Liquidity </strong><br>Particle Network’s Universal Liquidity architecture combines a modular node setup with a Decentralized Messaging Network to enable automatic, cross-chain operations. This system streamlines interactions by unifying balances across Universal Accounts and automating fund movement between chains, drawing liquidity directly from the user’s balances when necessary. By eliminating the need for manual bridging, it allows transactions with any token, all coordinated and settled through Particle Network. 
Users can interact across chains as though they were on a single network. When a user joins an app powered by Particle Network’s Universal SDK, their EOA (e.g., MetaMask, Keplr) signs into a Universal Account, enabling seamless funding and usage across multiple chains without the need for bridging, all backed by Universal Liquidity.</p></li><li><p><strong>Universal Accounts</strong><br>Universal Accounts are a core feature that enables Particle Network to achieve chain abstraction. These accounts provide users with a single address, balance, and point of interaction across the entire multi-chain ecosystem. With a Universal Account, users maintain one unified balance and address across all chains. Leveraging Universal Liquidity, users can execute cross-chain transactions automatically, delivering a seamless, cohesive experience without the complexity of managing multiple wallets or addresses.</p></li><li><p><strong>Universal Gas</strong><br>This feature allows users to pay gas fees on any blockchain using any token, eliminating the need to hold multiple tokens like SOL or ETH. Regardless of the token used for payment, all gas fees are ultimately settled in $PARTI, the network’s native token, even if the user doesn't hold it.&nbsp;</p></li></ol><p><strong>Protocol Architecture</strong></p><p>To create a universal coordination layer, Particle Network introduces a streamlined infrastructure that outsources data availability (DA) and employs an innovative consensus mechanism built around three core modules:</p><p><strong>Master Keystore Hub (MKH)</strong><br>The MKH serves as the central coordinator for smart contract deployments and state management across all networks. It automatically synchronizes configurations across all Universal Accounts (UAs), ensuring consistent state updates throughout the network. 
Acting as the authoritative source of truth, it securely stores account settings to maintain uniformity across all connected chains.</p><p><strong>Decentralized Messaging Network</strong><br>Particle’s Decentralized Messaging Network leverages relayer nodes to create a communication hub, enabling seamless cross-chain coordination. These relayer nodes monitor and update the execution status of operations across external chains and efficiently manage state changes within Universal Accounts, ensuring consistent and reliable cross-chain interactions.</p><p><strong>Transaction Bundler</strong><br>Unlike the centralized bundlers commonly seen with ERC-4337 (Account Abstraction), Particle implements a decentralized approach for its bundler. In this system, transactions from the public UserOps mempool are gathered and processed by node operators within the bundler network before being relayed and executed on external chains as needed.</p><p><strong>Aggregated Data Availability</strong><br>Designed with modularity in mind, Particle adopts a unique approach to data availability (DA), integrating multiple DA solutions like Celestia and Avail rather than depending on a single provider. By leveraging a diverse array of DA layers, Particle enhances the robustness, redundancy, and security of its data availability infrastructure.</p><p><strong>Dual Staking Model with Babylon</strong><br>The Particle-Babylon mechanism introduces a dual staking model featuring two distinct groups of node operators:</p><ol><li><p>$PARTI Staking Nodes — Nodes that stake Particle's native token, $PARTI, to secure the network directly.</p></li><li><p>$BTC Staking Nodes — Nodes that stake $BTC to provide cryptoeconomic security through Babylon.</p></li></ol><p>Both groups of validators are incentivized to independently validate blocks, fostering balanced participation in the consensus process. 
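A minimal sketch of how such a dual-quorum rule could work (my own illustration, not Particle's or Babylon's actual consensus logic): a block only counts as finalized when both operator sets independently reach a stake-weighted supermajority.

```python
def quorum(votes: dict[str, bool], stakes: dict[str, float], threshold: float = 2 / 3) -> bool:
    """Stake-weighted approval within a single operator set."""
    total = sum(stakes.values())
    approving = sum(stakes[v] for v, ok in votes.items() if ok)
    return total > 0 and approving / total >= threshold

def dual_finalize(parti_votes, parti_stakes, btc_votes, btc_stakes) -> bool:
    """Toy dual-staking rule: BOTH the $PARTI-staked set and the BTC-backed
    set must reach quorum independently; neither set can finalize alone."""
    return quorum(parti_votes, parti_stakes) and quorum(btc_votes, btc_stakes)

parti_stakes = {"p1": 60.0, "p2": 40.0}
btc_stakes = {"b1": 50.0, "b2": 50.0}
print(dual_finalize({"p1": True, "p2": True}, parti_stakes,
                    {"b1": True, "b2": False}, btc_stakes))  # False: BTC-side quorum fails
```

The point of the two independent quorums is that corrupting one token's staker set is not enough to finalize an invalid block.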
This model broadens the network’s security framework and enhances incentives for node operators, promoting higher participation and greater network stability.<br><em><br>*Disclosure: Particle Network is Signum Capital’s portfolio company and the information provided in this newsletter is for general informational purposes only and does not constitute professional or investment advice.</em></p><p><br><strong>Building a unified liquidity layer on Ethereum </strong><br>Ethereum and its rollup ecosystem face considerable challenges in establishing a unified liquidity layer, a problem less pronounced in more integrated networks like Solana. Rollups such as Arbitrum, Optimism, and zkSync boost Ethereum’s scalability, yet each operates as a distinct entity with separate liquidity pools, resulting in fragmented markets and limited asset flow across networks. Unlike the seamless composability Ethereum originally offered, rollups increasingly behave as appchains, prioritizing their own liquidity and user retention. This shift reflects their evolving role as standalone entities with business models that may be impacted by liquidity sharing with competitors. For instance, Optimism's OP Stack enables atomic composability across rollups built within its framework, effectively aiming to retain liquidity and users within its own network rather than Ethereum as a whole. This fragmentation presents a unique hurdle for Ethereum as it seeks to create a cohesive financial layer amid a decentralized landscape of independently operating rollups. The incentives between Ethereum’s base layer and its rollups have become misaligned. While rollups initially leveraged Ethereum’s liquidity and user base to scale, they are increasingly seen as parasitic to the Layer 1, contributing little value in return. 
Instead of enriching the broader Ethereum ecosystem, rollups are now focused on retaining their own liquidity and user base, often isolating themselves from Ethereum’s native network. This dynamic creates a tension where rollups benefit from Ethereum’s established network effects without providing reciprocal value, undermining the long-term symbiotic relationship that was once envisioned between Ethereum and its rollups. <strong><br><br><em>While interoperability is often touted as the future of blockchain, the reality is that many networks are more focused on building 'walled gardens' than truly interconnected ecosystems</em></strong><br><br>Interoperability across domains may, in some cases, be overstated, as many networks and applications are fundamentally incentivized to retain liquidity and users within their own ecosystems. This dynamic is particularly evident in the growing prevalence of rollups and appchains, which, despite promoting the idea of cross-chain functionality, are often more concerned with consolidating their own liquidity and fostering self-contained environments. These ecosystems are designed to ensure that assets and users remain within their "walled gardens," limiting the flow of value and interaction with external networks. As a result, the promise of seamless cross-chain communication may be hindered by the competitive interests of individual networks, which prefer to protect their liquidity pools and user bases, limiting the broader vision of a fully interconnected blockchain ecosystem.</p><p><br><strong>How we see the endgame</strong><br>While the vision for a seamless multichain ecosystem is not yet fully realised, there has been substantial progress with the development of new technologies like shared sequencers and arbitrary messaging protocols. Composability, although one of the hallmarks of DeFi, faces limitations without the full support of interoperability across chains. 
The Curve-Convex integration illustrates the potential of composability, though it is limited by the inability to bring this “DeFi Lego” across chains.</p><p>Ultimately, the future of the multichain endgame is still unfolding, and it remains to be seen whether interoperability and composability will be universally adopted, or are just an ideology that does not align with most business models. What is clear, however, is that these principles will continue to shape innovation in the space, allowing for more sophisticated, scalable, and user-friendly applications in the years to come. The path forward may be diverse, but with ongoing advancements in multichain solutions, the possibilities for a truly interconnected blockchain world seem closer than ever.</p>]]></content:encoded>
            <author>0xniko0x@newsletter.paragraph.com (0x4e)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/204f28e81476f12e8767b901b321ce85.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[(Based) Preconfirmations]]></title>
            <link>https://paragraph.com/@0xniko0x/based-preconfirmations</link>
            <guid>IQmlCkbsfC5IKUe5jijA</guid>
            <pubDate>Thu, 31 Oct 2024 10:07:55 GMT</pubDate>
            <description><![CDATA[Status quo Preconfs are a daily routine of our interactions with L2 before they settle on the L1, in the form of soft confirmations. The certainty of inclusion and settlement of the transaction on the L1 is provided by reputational collateral of the L2 & its sequencer. Preconfs TLDR; Preconfs serve as early guarantees for transaction inclusion in a block, giving users rapid feedback and securing their inclusion via staked collateral. The sub-millisecond latency in inclusion confirmation aims...]]></description>
            <content:encoded><![CDATA[<p><strong>Status quo</strong> <br>Preconfs are a daily routine of our interactions with L2 before they settle on the L1, in the form of <em>soft confirmations</em>. The certainty of inclusion and settlement of the transaction on the L1 is provided by <strong><em>reputational collateral</em></strong> of the L2 &amp; its sequencer. <br><br><strong>Preconfs TLDR; </strong><br>Preconfs serve as early guarantees for transaction inclusion in a block, giving users rapid feedback and securing their inclusion via staked collateral. The sub-millisecond latency in inclusion confirmation aims to enhance UX by clearly defining and reducing execution risk. <br><br><strong>Based Sequencing </strong><br>Based Sequencing provides a neutral, reliable shared sequencing layer designed to enable asynchronous composability both among rollups and between rollups and the Ethereum L1. Additionally, based preconfirmations enhance user experience by significantly improving transaction responsiveness. Usually provided by the L1 proposer, the preconf supply-chain setup looks like the following: <br><br><em><u>In a leader-based setup, the L1 proposer handles two endpoints</u></em><br></p><ol><li><p>Request endpoint: For users and searchers to request preconfirmations </p></li><li><p>Promise Endpoint: Enables real-time streaming of preconfirmation results to the public. 
This allows the requester (user/searcher) to instantly receive their result while simultaneously sharing the latest state updates with other users, even before they submit their own preconfirmation requests.</p></li></ol><ol start="2"><li><p>Similar to priority fees or tips, users can include a 'tip' with their preconfirmation request to incentivize the proposer (preconfirmer) to provide a preconfirmation, operating on a first-come, first-served (FCFS) basis.<br></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/99023fa3e2bb12d485af4725054c28a1.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAMCAIAAACMdijuAAAACXBIWXMAABYlAAAWJQFJUiTwAAABTklEQVR4nMWToW7DMBBAA4rzA/6GsLCisKEws7KQ1rAsf2BWGHjQzNAw0DBKieG16KC1gtUwk+I2S9ZJUzVVe8g+n/TuznYyvJjk/wUhBOec956IQgiPCfHoG977hcAYo5T6UeCcS9OUMZamadu2jwn7/b4oirclQoibIBa12WzW63Usx3tvjNF3zLhumgYApqDWWo0QUVVVUkqlFIwopQ6HA+f8ev346sA5h4hExDkHgDiZXxmGARHrus7zPEmS1WqVjHDOq6qKU7oJQghEdD6dOOdKKeectdbdiW2FEBBxCsaaoqAoimRGlmVCiIWgaZo8z621cduOWGu7rgMAxliWZYwxAOi6zt459kdE3O628xFpreOIYos3Qd/3Usrp6udc3i8AIKWs6xoRHxOEEGVZ8hllWQqxW7yiv0BEsdcJay0RPSeYbvVZXv6TPwHiSsz1Wa1YAgAAAABJRU5ErkJggg==" nextheight="418" nextwidth="1146" class="image-node embed"><figcaption htmlattributes="[object Object]" class=""><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://ethresear.ch/t/strawmanning-based-preconfirmations/19695">https://ethresear.ch/t/strawmanning-based-preconfirmations/19695</a></figcaption></figure><p><br><strong>However, this monopolistic design brings several important trade-offs</strong><br><br><strong>Latency games - the centralization trap, all over again </strong><br>The single entity with the most sophisticated infra and lowest latency gains the majority of the MEV backrunning profit. 
As we've seen in other parts of the transaction supply chain, participants have always been incentivized to minimize latency to the limit. <br>For example, a searcher could vertically integrate with preconfers to gain a constant advantage over other searchers and users in the space. <br><br>Now imagine this integration<em> </em>string<em>: searcher - builder - proposer/validator </em><br>Sound familiar?<em> more on that later on </em><br><br><strong>Congestion</strong><br>As L2 fees approach near zero, searchers aiming to avoid costly latency games may flood the network with probabilistic arbitrage attempts. This strategy involves repeatedly submitting arbitrage contracts that execute attempts and roll back if they fail. Such a scenario could echo the pre-priority gas auction days, where intense competition among searchers congested block space with unsuccessful arbitrage transactions, ultimately increasing gas fees for regular users. <br><br><strong>Tip pricing</strong> <br>In a profit-maximizing environment, preconfirmers may receive requests anytime before their slot, often without knowing all transactions that will be in their block (partial requests). Additional requests and non-preconfirmed transactions could arrive before their proposal deadline, requiring preconfirmers to decide whether to confirm a request based on an incomplete view of the potential value/MEV opportunity. This means that if a preconfirmer receives a request with a 'tip' of X amount, they may not be able to determine if this tip is appropriately priced relative to the value of the rest of the block.<br><br><strong>Fair trade</strong><br>Preconfers have the ability to delay preconfirmation promises, potentially failing to return them to users in a reasonable time. 
It's important to note that preconfirmers are motivated to withhold these promises as much as possible to maximize their chances of reordering and inserting transactions or to prioritise transactions from vertically integrated users, thereby increasing their MEV. <br><br><strong>Principal-agent risk </strong><br>When a proposer delegates the preconfirmation rights to an external entity or marketplace, the liveness and censorship resistance of the preconfirmations depend entirely on that single external entity for the duration of the preconfer's slot(s), potentially introducing a single point of failure. </p><p><strong>Access is key </strong><br>Any system with L1 preconfirmations will likely cause the preconfirmer and block builder to merge into a single entity. This is because most MEV (like CEX-DEX arbitrage) is captured in preconfirmed transactions, reducing the profitability of building non-preconfirmed blocks. <br>Since preconfirmed transactions must be placed at the head of the block to avoid disrupting the expected transaction order, builders must constantly update blocks to include the latest preconfirmed transactions, making it nearly impossible to separate the two roles. <br>This integration would shift block-building from the current JIT-auction model to an early selection process, where proposers delegate both preconfirmation and building rights to a single external entity in advance, offering limited control over centralization risk and fair execution in those off-chain environments. <br></p></li></ol><p><strong>Inclusion List </strong><br>Since the Merge, validators have largely outsourced block production to builders, who control transaction inclusion, leaving proposers to accept this or miss out on MEV. Inclusion Lists aim to shift some decision-making power back to proposers, balancing incentives and data availability. While PBS promotes competitive block creation through mev-boost, it gives builders full control over transactions. 
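The constraint an inclusion list imposes can be stated in a few lines. This is a simplified sketch in the spirit of proposals like EIP-7547, not the exact specification:

```python
def satisfies_inclusion_list(block_txs: set[str],
                             inclusion_list: set[str],
                             block_is_full: bool = False) -> bool:
    """Toy inclusion-list rule: the builder's block is only valid if every
    transaction the proposer listed appears in it. Real designs carve out
    an exemption when the block is full, modeled here by a simple flag."""
    return block_is_full or inclusion_list <= block_txs

# The builder omitted tx2, so the block is rejected unless it was full:
print(satisfies_inclusion_list({"tx1", "tx3"}, {"tx1", "tx2"}))  # False
```

The builder keeps full freedom over ordering and over the rest of the block; the proposer only pins a minimal set that must not be censored.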
Inclusion Lists allow proposers to ensure important transactions are included in the next block, maintaining proposer involvement and fair block production while preserving PBS's competitive environment. <br><br><strong>Out of protocol designs</strong><br> Out-of-protocol designs are solutions from third-party companies like Espresso, Chainbound, and Primev, using low-latency consensus, off-chain P2P environments, or proposer sidecars. <br><br><strong>MEV-commit by </strong><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out css-1jxf684 r-bcqeeo r-1ttztb7 r-qvutc0 r-poiln3 r-1loqt21" href="https://x.com/primev_xyz">@primev_xyz</a><br>MEV-commit is a p2p coordination layer for all transaction supply chain &amp; MEV participants. Bidders submit encrypted bids onto a mempool, where execution providers like builders, rollups, and sequencers evaluate those submissions, bid on them, and issue commitments to those transactions, which are submitted on chain. The system rewards these providers based on their execution performance and slashes them if confirmed transactions are not executed.<a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://x.com/primev_xyz/status/1813264680724668905…">https://x.com/primev_xyz/status/1813264680724668905…</a> <br><br><strong>In-protocol designs</strong><br>Protocol-based preconfirmations aim to deliver an enhanced UX with sub-millisecond latencies. These preconfs create an in-protocol standard for immediate inclusion guarantees in the next block. As preconfirmers, L1 proposers generate "signed promises" and earn tips from users by economically guaranteeing block inclusion. 
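The "signed promise" itself can be pictured as a small authenticated message. A toy sketch follows: a real implementation would use the proposer's BLS/ECDSA validator key and a slashing contract, and HMAC merely stands in for the signature here.

```python
import hashlib
import hmac

def sign_promise(proposer_key: bytes, tx_hash: str, slot: int, tip_wei: int) -> dict:
    """Toy preconfirmation promise: the proposer commits to including tx_hash
    in `slot` in exchange for `tip_wei`. HMAC-SHA256 stands in for a real
    validator signature."""
    msg = f"{tx_hash}|{slot}|{tip_wei}".encode()
    sig = hmac.new(proposer_key, msg, hashlib.sha256).hexdigest()
    return {"tx_hash": tx_hash, "slot": slot, "tip_wei": tip_wei, "sig": sig}

def verify_promise(proposer_key: bytes, promise: dict) -> bool:
    """What a user (or, later, a slashing mechanism) checks: the promise
    really binds this transaction, slot, and tip to the proposer's key."""
    msg = f"{promise['tx_hash']}|{promise['slot']}|{promise['tip_wei']}".encode()
    expected = hmac.new(proposer_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, promise["sig"])

promise = sign_promise(b"proposer-secret", "0xabc123", 9_001, 50_000)
```

If the promised transaction then fails to appear in the promised slot, the signed promise is the evidence a slashing mechanism would act on.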
To enable this in-protocol, we must assume that proposers are willing to:<br>• opt into additional slashing (via a third-party protocol such as EigenLayer)<br>• force the inclusion of transactions/blobs via inclusion lists<br>This method leverages L1’s decentralization and liveness for efficient sequencing, potentially channeling MEV back into L1. However, giving proposers the right to include and settle transactions allows them to exploit a larger design space for potential value extraction due to their centralized nature (in this case). This incentivizes proposers to vertically integrate with resource-rich entities (like builders), essentially controlling the entire transaction supply chain and potentially creating new monopolies. <br><br><strong>PP &amp; BFT by</strong> <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out css-1jxf684 r-bcqeeo r-1ttztb7 r-qvutc0 r-poiln3 r-1loqt21" href="https://x.com/EspressoSys">@EspressoSys</a><br>PP preconfirmations aim to provide fast preconfirmations by having preconfers act as slashable centralized sequencers during their preconfer slot. Similar to in-protocol solutions for based preconfs, they opt into an externally managed protocol and can be slashed for not executing their commitments. Preconfers are ranked by their next proposer slot and can offer anything from plain inclusion/execution confirmations to more specific preconfirmation types. <br><br>• <strong>BFT consensus</strong><br>Unlike PP preconfirmations, which rely on a single entity's honesty, BFT consensus protocols offer different guarantees for preconfirmations due to their internal designs. This low-latency consensus approach aims to speed up rollup pipelines, as BFT rollups begin settlement immediately after receiving the committed protocol output. In contrast, PP preconfirmations require rollups to wait until the preconfirmer's block is proposed and finalized before settlement on the L1. 
Both BFT and PP preconfirmations are fully composable, allowing them to run together and provide a variety of preconfirmations. <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://hackmd.io/@EspressoSystems/bft-and-proposer-promised-preconfirmations…">https://hackmd.io/@EspressoSystems/bft-and-proposer-promised-preconfirmations…</a> <br><br><strong>Bolt by Chainbound</strong> <br>Bolt is an out-of-protocol sidecar for Ethereum block proposers, enabling them to credibly commit to a set of transactions that will be included in a block. Using innovations like preconfirmations and inclusion lists, Bolt aims to increase block-space value and yields for stakers via inclusion and confirmation tips. It integrates seamlessly with the current block production pipeline and uses an externally managed protocol (Symbiotic) for economic security. Key features of Bolt include: <br>• Trustlessness: no new trusted entities; commitments are backed by economic assurances. <br>• Proposers are fully accountable for their commitments with penalties for breaches. <br>• A permissionless system, where any proposer can opt in, and any user can request commitments without a central authority. <br><br><strong>Conclusion &amp; personal thoughts</strong> <br>I appreciate the innovation behind preconfirmations, as well as the aim to enhance user experience and alter on-chain behavior. However, they currently seem more focused on constructing narratives around technical jargon, building infrastructure around those narratives, and marketing operations that may ultimately increase centralization in transaction execution. 
At the current point in time, they feel more like a 'nice-to-have' feature than an essential development, especially since ongoing discussions around shortening L1 block times could make preconfirmations unnecessary.</p><p>However, if preconfirmations indeed represent a meaningful intellectual advancement, we should carefully weigh the associated trade-offs in their design:</p><ul><li><p>In-protocol implementations risk overloading L1, requiring extensive testing before integration and potentially imposing restrictive design limits.</p></li><li><p>Out-of-protocol designs, while quicker to deploy, could introduce challenges, such as reduced credible neutrality, hardware demands for low-latency consensus (which may centralize control), and dependency on external protocols for liveness guarantees.<br><br></p></li></ul><p></p>]]></content:encoded>
            <author>0xniko0x@newsletter.paragraph.com (0x4e)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/5d7a1ef6b6f0f25c8f331c1e2cbd1f40.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Don’t Just Trust - Prove It.]]></title>
            <link>https://paragraph.com/@0xniko0x/dont-just-trust-prove-it</link>
            <guid>LRmeB9NixznJCBcZWdTs</guid>
            <pubDate>Sat, 26 Oct 2024 07:02:02 GMT</pubDate>
            <description><![CDATA[Kakarot + ETH = keth ]]></description>
            <content:encoded><![CDATA[<p><strong>Intro</strong><br>As we envision a future where blockchain underpins a global financial system and operates as a decentralized supercomputer, prioritizing fundamental principles is essential. However, the current landscape has fallen short in delivering core values like transparency, verifiability, and trustlessness. Instead, we grapple with issues such as the centralization of Layer 2 solutions, frequent bridge hacks, and concentrated validator control.</p><p>To create a truly cohesive and borderless internet of value, we must emphasize self-custody, decentralization, scalability and verifiability. These principles are not mere ideals; they are vital for constructing a resilient infrastructure that allows people and communities to coordinate seamlessly across borders. This digital evolution, driven by zero-knowledge proofs, will pave the way for a more secure and decentralized ecosystem.</p><p><strong>Kakarot </strong><br>The broader industry acknowledges Ethereum as the leading settlement layer, prompting a surge of initiatives aimed at scaling its infrastructure. While these efforts vary in their commercial and technological success, they have significantly contributed to research and development, steering focus toward the most effective scaling solutions for the EVM—zero-knowledge proofs. Notable projects currently in production, such as zkSync, Starknet, and Mina, exemplify this fundamental goal of enhancing the existing blockchain ecosystem through zk-proofs.</p><p>The Kakarot team is dedicated to developing the most lightweight and efficient proving engine in the blockchain space, emphasizing maximum performance while minimizing resource consumption. 
This groundbreaking engine will facilitate the convergence of key technologies, bringing us closer to a fully provable blockchain environment.<br>Kakarot's proving engine will provide EVM provability across various domains, including Layer 2 solutions, EVM-compatible blockchains, sidechains, and EVM-compatible rollups on Bitcoin.</p><p><strong>Let's quickly reiterate why zero-knowledge technology is essential for blockchain scalability</strong><br>At its core, zero-knowledge technology enables the compression of computation. Implemented through zero-knowledge proofs, it is a game-changer for blockchain scalability.</p><p>In existing implementations and architectures, there are two primary methods to verify that an on-chain action was executed correctly:</p><ol><li><p><strong>Re-execution of computation </strong><br>In decentralized networks, the only way to guarantee accuracy is to re-execute the action independently. For example, a user running a Bitcoin full node downloads each new block and verifies every transaction to ensure the block's validity.</p></li><li><p><strong>ZK proof verification <br></strong>With ZK, users can verify a proof that confirms a particular action was performed correctly, without re-executing the entire process.&nbsp;</p></li></ol><p>The main advantage of this approach is that verifying a succinct proof is dramatically cheaper than re-executing the entire computation: verification cost grows far more slowly than the size of the computation being proven. In ZK-powered networks, nodes only need to validate the proofs that confirm transactions were executed correctly, resulting in substantial savings in both cost and resources. This principle forms the foundation of ZK-rollups' scalability. However, a key challenge lies in ensuring that the underlying network - in this case Ethereum - can trust that the rollup transactions are legitimate. The solution is to generate a cryptographic proof that mathematically certifies the validity of all transactions.
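To make the contrast between the two verification methods concrete, here is a minimal sketch of verifying without re-executing. It uses a Merkle inclusion proof rather than a zk-proof (a real zk-proof is far more involved), and every name in it is illustrative:

```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256, the hash used throughout this toy example."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold a list of raw leaves up to a single 32-byte root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Collect the sibling hashes needed to recompute the root for one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))  # (hash, is_left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root, leaf, proof):
    """Check membership with O(log n) hashes instead of reprocessing all leaves."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

txs = [f"tx{i}".encode() for i in range(8)]   # a toy "block" of transactions
root = merkle_root(txs)                       # the published commitment
proof = merkle_proof(txs, 5)                  # short proof for tx5
assert verify(root, b"tx5", proof)            # accepted without re-execution
assert not verify(root, b"fake", proof)       # tampered data is rejected
```

The verifier touches only a logarithmic number of hashes instead of the whole data set; zk-proofs extend this idea from proving inclusion to proving that an arbitrary computation was carried out correctly.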
This is achieved through zero-knowledge proof technology, or zk-proofs, which allows the network to verify the integrity of transactions without needing to re-execute them.</p><p>This approach is also crucial for cross-network interoperability, particularly as the industry shifts towards a multi-chain future. With a rollup-centric roadmap on Ethereum, Bitcoin Layer 2s, and Solana network extensions, multiple networks will inevitably coexist. ZK technology enables trustless and asynchronous interaction and communication between these diverse ecosystems.</p><p><strong>The zkEVM</strong><br>For any rollup that leverages the Ethereum Virtual Machine (EVM) and benefits from Ethereum's inherent properties, compatibility with EVM smart contracts is crucial. This compatibility ensures that transactions, on-chain interactions, and dApps are not only mathematically secure but also run in an optimal execution environment. This need for seamless integration and enhanced security has given rise to the concept of the "zkEVM."<br><br>Very early on, Kakarot identified the limitations in achieving EVM compatibility with Cairo and decided to build a zkEVM, effectively creating an EVM layer on top of Starkware's CairoVM. This approach addresses compatibility bottlenecks and improves the integration of EVM capabilities within the Starknet ecosystem. Since Starknet's native smart contract language, Cairo, is not EVM compatible, this poses a significant friction point for developers and applications that wish to build on the most cost-effective and high-performance zkRollup. By implementing a zkEVM, Kakarot eliminates this barrier, making it easier for developers to deploy EVM-based solutions on Starknet.<br><br><strong>Kakarot in a Nutshell<br></strong>At its core, Kakarot is creating a zkEVM implementation in Cairo, enabling transactions to be provably verified.
This functionality is driven by the CairoVM, a Turing-complete, provable CPU architecture that forms the backbone of Kakarot. As a smart contract deployed on Starknet, Kakarot can execute arbitrary EVM bytecode, deploy EVM-compatible smart contracts seamlessly, and call functions of Kakarot-developed EVM contracts, effectively serving as a sophisticated EVM bytecode interpreter. Because Kakarot exposes an Ethereum-compatible JSON-RPC interface, developers can deploy any Solidity or EVM-based code on it just as they would on Ethereum itself.</p><p><br><strong>Kakarot's key design advantages</strong></p><ol><li><p><strong>Lightweight and modular design:</strong> By building the EVM on top of an intermediary zkVM like Cairo, rather than using specialized circuits, Kakarot can adapt rapidly to changes within Ethereum, such as the Shanghai and Dencun upgrades earlier this year. Additionally, by leveraging CairoVM, Kakarot brings EVM compatibility to the Starknet network, effectively bridging the gap between EVM-native applications and the Starknet ecosystem.&nbsp;</p></li><li><p><strong>Efficiency and performance: </strong>By empowering EVM-based applications to deploy natively on Kakarot's zkEVM, they are able to take full advantage of the technological improvements and innovations developed by Starkware.&nbsp;</p></li></ol><p>The best example is the upcoming STWO prover, which is expected to be several orders of magnitude (~1000x) more efficient than the current STARK prover.
Kakarot's design is able to easily implement such advancements without introducing additional cost nor computational complications for application developers building on Kakarot.&nbsp;</p><p><strong>The Grand Vision: Achieve Provable EVM Executions Across the Entire Ecosystem</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/c29e6946b23e83bd0b630997ddfe44fc.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAASCAIAAAC1qksFAAAACXBIWXMAABYlAAAWJQFJUiTwAAAGv0lEQVR4nBXS6VNTBwIA8Pfy8q6Qd+TlPiAJeS8JISGcAQkBkhDu+wiI4YYoFFzBCioUMlxWOaw2WikF3KWWbrZUW1l3XcVhLNZFFrpdbW13d3a7X7r90Bk/tDOd3ZmWTn9/ww/Y+7Xk/0+LD37c+v6f899+Gf5s+3RkRr51w/XRu/n3V/Mn+t0EBCSoieXjzv3XGx5PVG10ZW+O+p+EOx7ONN4LlSw36cdzRX12/KiNaLYIsjUog/N5AFCUoVmcypwbcQCXevEbY8T6BdW9a8z+u9yn66n3wqIfvug9+GnjcSQ7Jx7HISidFX14Jnd7quTjC/7tydoHg2XbU9WbofKHk+WRYwlTHml/OtVmI3qyVH4rI8chFASt0czJVvNw0Ai44ukMI+qxQ7XZ0NZC4n8flC2OCL+8W/Di2cnIqxJHDA8BwRYPu3u5YbknZcavO1ehuVRnuR6If6c94XrAMuFTH0siGy3CfB3S5NAE01Uais/j8Qg+0lqiDo8YgDxndLKRjDdQDIFmWoi8FEJK8VONqM8OtftZjztJGQVPH7F+ciUQGfRMl+m700VNHBRyq7Ymq4cyVckCXp4SLY3BssSQWw512RkvK5KhsJzA5n6l+OA1Gjg3crjEFeOMZ2LVlNdpOVzpstvjigt9Fo5VGFlzWU0GJ1sdrvjHB+Pb4a6Vk+XNTl2/g1k7kf/8zeBghsTABy0M5tbRxbHSUjU8kK7sSI3OUorqssi9VenOkhi4OlV7tIYtdFCcGk00y3IcMRqVKC5WaTEooSgMETE5ZtmtsaoXn6ztv9G0v9C+ftz5m/aUnbmGx5O+9b6c6apDboIX0IouFafc7Mnfne1fDBZVceIbY6ZHS+IbowTw+YPpvdtDf94YvxSqLXbKOQ0G8UArpzfrZIIYnZiNK4xTLLYa/31r9Mlc3Uav5fcvH9qbbdyfa9wZL/qwz70bfmkqP3YoXRnpyI505631lE6UpjQkMh8vmK6cijruh4Hvvr7z/MHoN39b+fb5yjdPl0Iv14AAYDZpQytX1z57fnJh0WeWLPe6//P+mScXm59e63oyXrJ5yvtotHw3VHrBq+pJkk5kx5xNjhpMITrNeLMZcyvhYpYqTYOtLKbTSYCfvt9ZO58aLOCfDmYefLdz53fnQQBQaGSX79/+4uDg8t07XpNowp+yvzTw7O3hv4aDf+px3OxKW623L1ZaOkxEkYLfGyc4m0ScsAqC8UQtK8iUY96YKBKBAB4EYRhw8MOzvz8ae28+u8Kragu4CzxpFC2CYViiFCd1dmf1vuIxq1tSpXfHyvfC7VuhstXO5Lda7EPZ0qMJRLEWSxChRXpBmwnrtuEBk8ClRDgKdqlwOQHzeGAUAgFf7b729advvHgW3tsYWbt2ur7Ko1BpSUoEAABpTva1vVSZzJ0N+N
4aqLg+UPhml3O6IbPaKKjiyGoz45TzzQSiwSEbjRToBKlSNCYKjhUiqVJcIUJJIaJgUODhH1/97ZWa5XPeyNXm/331/sXRJuAXIE2LWKu1oCi3IlFb57TVp2rr7UyrI/pwhrnTZR4pcoTcSUc0giI1nsjgeiFiJNFoASxHEb0QS5RhbgaqFMMdMhT41+e3BjqsbcWazgJ2tDP9kF1F0iKSoAiCksrlue6MXItGHwXmGlWNTktWNM0S0Klia6TPO1tj70skAmZhppzP0agSRxg+j4BAGgbaFNhtzrBBq+5RKmAz0lpmhSfTMmczvX4dTcGg2ZaQ6kjTaDUkJTBZ2RSDhCXBOBnhMqptMkxLIGYKrIuj3QZhoVmexxJWCSJDeDIUUZBROY60Ro/zfnv1R63+iwrJiIwEFmYCZQpo2mjoNnBFalxLYw4rlx2vbfc5Rhvy+z3WsQx9f3mmGgdZCk2NlihxKF4jT4qRuczROVZtSXZ6IqsjeIBBLo1n9RW+nKGG2vPdnW4bq8AhAgaB9aWpPo2oXI7FMHC9Gj8jIybF5JyQXpGpbnPco9K8v7S1bq6EMzjGl8D11JUqULClurS1rrIi12lS0Aa5WEZGNflrs9KSKYQfLeDXxSeEAr3JpoRfmqAg8M5sV1BNp0hwTil4RUveZKRrpOQyhk0g6LxK9YeWrvcm58MTZxvzkmwKQRqrUGK8WDFV4Ey3GfRCCKBgHgnDrFajlDJKAerj9PUprunAibeH5081dB7z5QBnGjkTDbEaym6U5WoInxDOIoUNScnDdYHwibHFqZmZof4zx+qDNbmuOKVFiugpWI7BDAaTMEQiEI1AJAwJQFCN8wZKS464XJWJGafL28bqe1YGZ1eHL/4MQcjsKN3vSY4AAAAASUVORK5CYII=" nextheight="546" nextwidth="960" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Kakarot + ETH = keth</strong>&nbsp;<br>With its latest product advancements, Kakarot is setting its vision far beyond simply bridging the gap between the EVM ecosystem and Starkware. The project is now driving a bold vision to fully integrate and unify these ecosystems, unlocking new possibilities for developers and accelerating the adoption of cutting-edge, zero-knowledge technology.&nbsp;</p><p><strong>Keth</strong> is set to emerge as a leading proving engine, bringing “STARKification” to every EVM-equivalent network, including rollups and appchains. Its primary aim is to facilitate the transition from Optimistic to ZK-rollups by employing a sidecar model that cryptographically proves each transaction within an Optimistic rollup, thereby offering a more efficient and secure scaling solution. 
Additionally, Keth will enhance existing EVM ZK-rollups by leveraging Starkware's technological advancements, providing a more cost-effective and high-performance proving stack. With the advanced capabilities of STWO, the proving engine behind Keth, this solution promises to achieve performance levels significantly superior to current provers. This improvement sets a new standard for efficiency and scalability in the proving landscape. Furthermore, Keth will accelerate the adoption of Stage 2 architecture across rollups by enabling multi-proof systems. <br>This approach allows rollups to validate their integrity using multiple simultaneous provers, such as Keth, Succinct's Reth, or TEEs. As a result, Keth ensures robust security and consistent state validation while sidestepping the high costs typically associated with traditional proof systems.</p><p>In pursuit of this vision, Kakarot's first major application will bring EVM execution to Starknet, a leading ZK-rollup scaling solution developed by Starkware. By enabling developers to write smart contracts in both Solidity and Cairo, Kakarot bridges the gap between Ethereum's familiar EVM environment and Starknet's high-performance infrastructure. This interoperability makes it easier for EVM-based dApps and bluechip infrastructure to launch seamlessly on Starknet, while allowing Starknet-native protocols to reach EVM users.&nbsp;</p><p>Beyond rollups, the emergence of various appchains has highlighted ongoing challenges with composability and interoperability across networks. Kakarot’s product suite aims to address these bottlenecks by introducing provability across a network of appchains, ensuring seamless integration and interaction.
Positioned to compete with solutions like ZKSync's ZK Stack and Polygon CDK in both speed and cost, Kakarot's stack will empower a new breed of appchains that leverage Starknet's underlying infrastructure while delivering the best EVM practices, enabling a more interconnected and efficient blockchain ecosystem.<br><br><strong>Verifiability on Bitcoin</strong><br>With the growing number of rollups and other scaling solutions on Bitcoin, that ecosystem is developing the same bottlenecks that rollups and appchains face in the EVM environment - the fragmentation of assets and liquidity. However, with the proposed reintroduction of the OP_CAT opcode, Starkware plans to extend the Starknet network to Bitcoin, making it the first Layer 2 to settle transactions on both Bitcoin and Ethereum. This integration allows ZK-STARK proofs to be verified directly on Bitcoin, paving the way for enhanced scalability with improved security and integrity, similar to the capabilities ZK-rollups provide for Ethereum.&nbsp;</p><p>Kakarot's proving engine will enable proof and transaction verifiability on Bitcoin. Much like its approach to transitioning optimistic rollups into ZK-rollups, Kakarot will enable self-custody and minimize trust requirements for bridged BTC assets, unlocking trust-minimized DeFi on Bitcoin. Furthermore, Kakarot will add EVM execution capabilities to Starknet as it settles on both Ethereum and Bitcoin, ultimately transforming Starknet into the first zkEVM Layer 2 on Bitcoin.&nbsp;</p><p><strong>Conclusion</strong><br>In summary, Kakarot is pioneering a vision for global EVM provability. At the heart of this initiative is its advanced proving engine, designed to deliver efficient and secure transaction verification across diverse applications, including Layer 2 solutions, EVM-compatible blockchains, and Bitcoin rollups.
Kakarot aspires to establish a future where provability is the norm across all blockchain environments, fostering a decentralized and interconnected digital economy. By prioritizing a trust-minimized and cryptographically secure framework, Kakarot is building a resilient infrastructure that facilitates seamless collaboration and coordination on a global scale.</p>]]></content:encoded>
            <author>0xniko0x@newsletter.paragraph.com (0x4e)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/dc6af38fbcd05a45063dd5987577fceb.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Prediction Markets - Wisdom of the Crowds]]></title>
            <link>https://paragraph.com/@0xniko0x/prediction-markets-wisdom-of-the-crowds</link>
            <guid>eORClZ8lWYTjYNaILBmh</guid>
            <pubDate>Sat, 19 Oct 2024 10:43:48 GMT</pubDate>
            <description><![CDATA[Introduction into the wisdom of crowds - Prediction markets ]]></description>
            <content:encoded><![CDATA[<p><strong>Introduction</strong><br>Prediction markets are platforms where participants can speculate on the outcome of future events by buying or selling contracts. These contracts pay out a fixed amount if the predicted event occurs and nothing if it does not.</p><p>Based on the "wisdom of crowds" principle, prediction markets can generate accurate forecasts by aggregating the collective knowledge and opinions of many individuals. Over the past decade, they have been used to predict various outcomes, including elections, sports results, economic indicators, corporate decisions, and even niche markets like scientific developments.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><a href="https://www.archetype.fund/media/crypto-powered-information-games" target="_blank" rel="noopener noreferrer nofollow ugc" style="cursor: pointer;"><img src="https://storage.googleapis.com/papyrus_images/74629f97aa55052dc016344a78f23f18.png" 
blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAUCAIAAABj86gYAAAACXBIWXMAAAsTAAALEwEAmpwYAAAA23RFWHRSYXcACmdlbmVyaWMgcHJvZmlsZQogICAgICA5NAo0OTQ5MmEwMDA4MDAwMDAwMDIwMDMxMDEwMjAwMDcwMDAwMDAyNjAwMDAwMDY5ODcwNDAwMDEwMDAwMDAyZTAwMDAwMDAwMDAwMDAwNTA2OTYzNjE3MzYxMDAwMDAyMDAwMDkwMDcwMDA0MDAwMDAwMzAzMjMyMzA4NjkyMDcwMDEyMDAwMDAwNGMwMDAwMDAwMDAwMDAwMDQxNTM0MzQ5NDkwMDAwMDA1MzYzNzI2NTY1NmU3MzY4NmY3NApC2R0+AAADGElEQVR4nLWVT0vjWhjGs/AL+AG6du/SDyJuXAsighvdih9ilrMR7sKFC5E7t5QUW2IaMJakNE1Jk3J6axpyGtIc8o/zh5zBHG+dO3cYZZz72yUh75PzvO/zRip/BMa4/B1gjKX1RVFTlqWmaZ7nMcYopYSQ9f2PCuAaSqltj1p3ymg88TwPQshqvtV4v+SrAMY4TVMIYVmWEMIEoTiOLcvqdruyLDuOQykVpbMsI4RgjLMse1PmRaAoCkqp67qqqpZlWT2bQ6oaSimE8OHhYTKZEEKen1bVcrlECHHOxbvvEsAYx3H815cv7c79FMzDMHx8fAQAzGazOI6rqsrznBCyWq1OT093dnYajcbR0VGSJOsmfce67LMAxrj+rmg4HC7jGNYMLcswjG632+l05vO57/uc8729PekfNjY2zs7OOOdFUYiJoJSKhjHGSE1ZlhLGOIqiIAgQQqvVCmPMGKv+TRAESZIEQSBJ0snJydbWliRJ29vbjUYjCIKiKJrNpmmad3d3iqLINQCAMAwJIRJCaDwez2azLMs45xhjCKHjOJ7n2bat63qv15tMJovFAgCwubmpKMru7u76HFEUQQg1TRsMBqZpGoah67phGMPhUFVVzrmU57njONfX161W689W253+PRiYsiwPa1zX9X0/CIL5fM4539/fX5eWJOn4+Jhznuf5zyyilMZxfHNzM7Is1/MShIRL9BuqqmKMcc5VVT04ODg/P7+4uDg8POz3+0LgZ01mjE2nU9M0OedVVRFC8rzI81yEjlJaFIXv+7ZtL5dL0zSbzaZSc3t7a1nWdxn8wZgSQmzbHgwGT09PruumaSpygBACAOi63m637+/vdV33PG8dN1FUmPBGDkTK0jS1LCsMw8ViMZ35D32z0+koiuK6rhgtcbj/+vCuJAsAAJeXl1dXV58+/9Ht6aIJjDGM8a/tO/HKiwAhJEkSWZajKEIIiWX9wT3KGHsREFF0XbfX64kN80EIIRDCfr//uosIIQCA0WiUZdnH/zYY4/F4rGnaq0VC9s2peD9idl520f9FPUVfAduHwfsDjbbrAAAAAElFTkSuQmCC" nextheight="992" nextwidth="1600" class="image-node embed"></a><figcaption htmlattributes="[object Object]" class=""><br></figcaption></figure><p><strong>History of prediction markets</strong><br>In the early 2010s, prediction markets like Intrade and Betfair gained prominence, especially in political forecasting, particularly for U.S. presidential elections. However, their growing popularity attracted increased regulatory scrutiny. 
This led to legal uncertainty in the U.S., and prediction markets faced mounting challenges. Intrade, for example, was forced to shut down in 2013, just a year after the 2012 election. Despite this, some academic and internal corporate prediction markets, like those at Google and Microsoft, continued to operate, forecasting project deadlines and other business outcomes.</p><p>In 2015, blockchain technology emerged as a new venue for prediction markets. The decentralized nature of blockchains allowed these markets to operate outside of a single regulatory framework, avoiding legal and political obstacles.</p><p>Augur, the first decentralized prediction market, capitalized on blockchain technology by launching on the Ethereum network. Users could create markets on any question or event. However, amid inconsistent user engagement in the blockchain space, Augur struggled with user retention, which hindered sustainable growth. </p><p>Other platforms, like Polymarket, emerged with unique approaches to market creation and settlement. Polymarket gained popularity due to its user-friendly interface and focus on timely, controversial topics, such as COVID-19 outcomes.<br><br><strong>Types of prediction markets</strong><br><strong>a) Event-based markets</strong><br>• Focus on specific events with outcomes tied to real-world occurrences.<br>• Examples: Political markets (predicting election results), sports markets (betting on sporting event outcomes), economic markets (forecasting economic indicators).</p><p><strong>b) Binary vs. multi-outcome markets</strong><br>• <strong>Binary markets:</strong> Have two possible outcomes (e.g., "YES" or "NO").
Contracts pay out if the predicted event occurs and are worthless otherwise.<br>• <strong>Multi-outcome markets:</strong> Offer more than two possible outcomes (e.g., predicting which candidate will win a multi-candidate race or which team will win a tournament).</p><p><strong>c) Continuous vs. categorical markets</strong><br>• <strong>Continuous markets:</strong> Predict outcomes that vary across a range of values (e.g., stock prices, weather temperatures, percentage of votes).<br>• <strong>Categorical markets:</strong> Center around discrete options (e.g., predicting which film will win the Best Picture award).</p><p><strong>d) AMM-based vs. order book-based markets</strong><br><strong>i. AMM-based markets (an algorithm adjusts prices based on supply and demand)</strong><br>• Example: the Logarithmic Market Scoring Rule (LMSR), which rebalances prices as shares are bought and sold.<br>• Ensure liquidity and continuous pricing.<br>• Prices are adjusted based on trade volume.<br>• The market remains liquid as prices are always available for buyers and sellers.</p><p><strong>ii. Order book-based markets (an order book matches buyers and sellers, as in stock markets)</strong><br>• Participants submit buy and sell orders.<br>• Prices are determined by matching these orders.<br>• The bid is the highest price a buyer will pay; the ask is the lowest price a seller will accept.<br>• Trades are made when both sides agree on a price.</p><p><strong>Landscape</strong><br>The landscape of companies developing new prediction markets or integrating this feature into their existing product suites is continuously evolving. This expansion reflects a growing interest in using market-based forecasting across various sectors, including finance, sports, politics, and technology.
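The Logarithmic Market Scoring Rule mentioned above can be sketched in a few lines. This is a minimal textbook illustration for a two-outcome market, not the implementation of any particular platform; the liquidity parameter b and all function names are assumptions for this example:

```python
import math

def lmsr_cost(q, b):
    """LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def lmsr_price(q, b, i):
    """Instantaneous price (the implied probability) of outcome i."""
    total = sum(math.exp(qj / b) for qj in q)
    return math.exp(q[i] / b) / total

def lmsr_trade_cost(q, delta, b):
    """A trader buying delta shares pays C(q + delta) - C(q)."""
    q_new = [qi + di for qi, di in zip(q, delta)]
    return lmsr_cost(q_new, b) - lmsr_cost(q, b)

q = [0.0, 0.0]   # outstanding YES / NO shares in a fresh binary market
b = 100.0        # liquidity parameter: larger b = deeper, slower-moving prices

print(lmsr_price(q, b, 0))                 # 0.5 before any trades
print(lmsr_trade_cost(q, [10.0, 0.0], b))  # cost of buying 10 YES shares
print(lmsr_price([10.0, 0.0], b, 0))       # YES price rises above 0.5
```

Because the prices of all outcomes always sum to 1 and a quote is available at any time, the market maker never runs out of liquidity; the liquidity parameter b controls how sharply a trade moves the price.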
As more organizations recognize the potential of prediction markets to provide valuable insights, enhance user engagement, and diversify revenue streams, the ecosystem is becoming increasingly dynamic and competitive.</p><p>As shown in the chart below, Polymarket has consistently maintained the highest trading volume among all prediction market categories, despite a modest daily active user (DAU) count of approximately 40,000 wallets. This surge in trading volume and user engagement can be attributed to several political events, including high-profile debates, Joe Biden's withdrawal from the 2024 presidential race, and other significant occurrences. These events not only drive participation but also demonstrate the unique ability of prediction markets to reflect and capitalize on real-time public sentiment.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/87acf1314f4a20b37b91e784abacf529.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAPCAIAAAAK4lpAAAAACXBIWXMAAAsTAAALEwEAmpwYAAACgElEQVR4nK1UzU8TQRydkzf/CeNVryTGi9GD8SLx5N1IggcTNerFRDRaJFQbAwiRpHyW1QYswQUxpWshKLEt60Fga2mWj26hdNpS7M50lt3uz2wXGkQPtPryMtmZ2fzevDcfaITjWltauIEBx5OnnR0dL12uvp4eRqmuadWTHWKxkEdqfnt5OY4xVgmp0ASoikVdx7kC/IFdStEupZIUxRgfnNCrAYAZk1Ptbj8AGEbpYB3NFoB/RkxO3XnkoST3dwdLknTIwdFhltuvYvyhcwTAPOhgXUlsZzOWQDaXI4TUJmCUK74eFC5fcwFAGmckKRpfjv+IxdydPVFx8f9E1NLBn7nyGAC+Lyx6vcM8P/6W4yb9s+mtbUsgMj+fTCar3VsbAKZKWFd/4NxVhxXLuiIIAs+PB4PBvoH3scU1pFFq51gbVMJSOP9uIlJX32SaVilCiCyvTAWmeN9HnNiyHKiEMMZqS7+rPyAurLq56br6JutcapYtUfz2SQh4RwV5pSywJEmpVOqIEe3uJ1NkGoB54sJ9hE4idBodv2QLAAArQ9N0pqo1brJprcYAAITOoz2crQgcvgcAQCllKqOFIlNZPvczh3cy6Z3NjaySyCgJvLaaXltNK4nM7BfJ2c63dU/2vpl5/mocnbq+Xx2hYxcBTFv1NwFS2NnY3Bwb46cDs+E50esZ+TwT4X1+btAXFMLd3d5hr3/UF3zh6p34MPeste/GTafTNXz7nutBk/vW3a6GxmZH81BDo6Ot3VOOzoKu64wxXStaj108Gg2FQpFwWJKWRHE+mVTsP0qGDlAq03rQTLPStUf2xisfJUMvGUbJsBxgnF6R5WQ0lFVivwAWnhT8TYaHZQAAAABJRU5ErkJggg==" 
nextheight="764" nextwidth="1600" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>The future - can crypto prediction markets stand the test of time?</strong><br><strong>a) Betting on prediction markets as an uninformed retail participant often leads to a net negative outcome</strong><br>• It's pure speculation with limited long-term upside, especially compared to the broader opportunities in the crypto space, which offers more attractive alternatives.<br>• Returns from on-chain yield, staking, or direct investments in liquid assets provide better risk-adjusted returns and hedging strategies for market volatility.</p><p><strong>b) The primary motivation for participating in prediction markets is often not financial gain but social and emotional factors</strong><br>• People place bets that align with their personal beliefs, values, and interests, often choosing outcomes or candidates that resonate with them deeply. This explains certain patterns in prediction market behavior, such as why some candidates may perform better despite questionable odds. Thus, prediction markets are not just about estimating the probability of events but also about capturing the intensity of sentiment and engagement among participants. 
<br>• This distinction highlights a broader trend within the crypto space, where speculation is not only about financial returns but also about participating in a community, experiencing excitement, and making a statement.</p><p><strong>c) Furthermore, this element of engagement and bias opens up opportunities for new crypto-native prediction markets</strong><br>• Participants can speculate on a range of niche areas, such as narratives, market sentiment, content popularity, asset listing probabilities, and other specialized topics unique to the crypto landscape.</p><p><strong>Challenges<br>Truth or fiction: which will you bet on?</strong><br>Earlier, we compared prediction markets to traditional financial markets. However, distortions matter more here than in traditional markets because the core purpose of prediction markets is to generate accurate information. While profit maximization is the primary motivation in traditional financial markets, participants in prediction markets are often driven by personal beliefs, political biases, or vested interests in particular outcomes. Consequently, they may be more willing to accept financial losses within the market if their bets align with their values or if they anticipate benefits outside the market itself. This creates a significant divide between prices that convey accurate information and prices driven by personal or financial incentives. When participants prioritize their beliefs over objective analysis, the market's ability to serve as a reliable forecasting tool is undermined, leading to a misalignment between the market's outcomes and the true probabilities of events.</p><p><strong>Market Inefficiencies</strong><br>For prediction markets to be efficient, they must continuously integrate new public information and adjust prices accordingly.
However, if participants rely on entirely different streams of information—essentially inhabiting separate realities—it becomes impossible for the markets to effectively incorporate public data. When traders cannot agree on basic facts, these disagreements manifest as noticeable and exploitable anomalies in market prices. The efficient market hypothesis assumes that information is uniformly interpreted by rational traders; yet, humans often disagree on how to distinguish fact from fiction and determine relevance. While one might argue that such issues do not apply to traditional financial markets, prediction markets face unique constraints, such as trade size caps that limit participation from large institutions, allowing these inefficiencies and anomalies to persist without correction.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5454b0c84b82b286baf160e4e2e9c4da.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAALCAIAAACRcxhWAAAACXBIWXMAABYlAAAWJQFJUiTwAAACsElEQVR4nI2ST4TcUBzH3y3HEPa2l7mMmDUMawkRVnQuQwgRoiFLU8My+mrY5tg5DWHZXqY7GpYQfYyOLmFIK4TorpJD9LBGOzqV/iHVNTXVEBtSJtvp7LSHfk6/9/u93/f7ft4P/PiZPHn+8viZ45yHgxevHj21h+55mqZZluV5nq2QL8iyrKgWwTJeLeUrgKur7N3nr+H4/dsPn6L48uO32ZfL70Wt6E8WLLWK0mw2S5JkqZKm6Xw+X5O+Nsjz/EG73ek8BADch/fu7O0176rNZtOyLADAxsZGqVSiaZogCAAAhmEQwqOjo3K5jOM4XOC6LsMwJEkSBKFp2mQyuWGQJIkgCI+Pj2/L8i2WFQSBoqjd3d1er8fzPMMwiqKYpklRVLVaZVmW4zhVVW3blmWZJMlms3l6espxnGEYmqaxLBsEwfoEpVKpWq1KkkTTdLfbpSgKw7CdnR3DMAiCqFS2dF2XJAkAoCiKpmk0TSuKghAiCKJeryuKUqlsbW9vd7vd4ifWDdrtNoRQFMXNzc3iniRJqqq2Wi0IIc/zoih6ntdoNOr1uizLgiBomra/v394eFir1XAcZximUtlqt9vFf9wwuLi4wDBMURSapnEcZ1mWYRhRFDudTq1WAwBACMvlMkmSuq4v+1eXalXu760DcRz3ej1d1zmOkySp1WodHBzwPA8htCxLFMVGo6GqKs/zCKGl4prK39v5Z4Isy3zf9xb4vn92duY4zmomCALXdT3PcxzHMAzLskzTRAidnJzYtt3v9xFCRdI0TcuyRqORYRj9fr9YJ5CmadFs27ZlWcPhcDAYIIQGg4Hrugih4XAYhuHkN1EUTafTKIqK43RBHMfT6XQ8HkdRFMfxmwXz+fza4PWCIAiKV49GI8/zXNcNw9D3fdd14z
j+5/j/wy+twddFo9tVvwAAAABJRU5ErkJggg==" nextheight="464" nextwidth="1328" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Manipulation</strong><br>Manipulation in crypto-based prediction markets presents considerable challenges and risks due to the relatively unregulated environment in which they operate. Unlike traditional financial markets, which benefit from established regulations and oversight mechanisms, blockchain-based prediction markets are more vulnerable to a range of manipulative practices. These actions can distort market prices and undermine the integrity of the information being conveyed. The efficient market hypothesis, discussed above, relies on the assumption that market prices reflect all available information, but manipulation disrupts this equilibrium, as malicious actors can create artificial market conditions that mislead participants. Furthermore, the decentralized nature of these markets complicates accountability, making it difficult to identify and penalize those responsible for manipulation. As discussed in previous sections, this lack of oversight not only invites exploitation but also contributes to the persistence of anomalies, ultimately eroding trust among participants and detracting from the potential value of prediction markets as tools for informed speculation.</p><p><strong>Conclusion</strong><br>In summary, prediction markets are inherently shaped by human passions and interests. While they experience significant spikes in activity during major events, such as U.S. presidential elections, the challenge lies in maintaining long-term engagement and ensuring a high-quality, truthful source of information. To sustain momentum, it is crucial to diversify into smaller, niche markets that enable communities and individuals to participate actively, thereby preserving excitement beyond high-profile events.
This diversification not only fosters ongoing interest but also enhances the potential for meaningful engagement within the crypto ecosystem. By adapting to these dynamics, prediction markets can continue to attract attention and remain relevant as valuable tools for speculation and community interaction in an ever-evolving landscape.</p>]]></content:encoded>
            <author>0xniko0x@newsletter.paragraph.com (0x4e)</author>
        </item>
    </channel>
</rss>