<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>CyberGen Lab</title>
        <link>https://paragraph.com/@cybergen-lab</link>
        <description>Come join us in our exploration of Web3 </description>
        <lastBuildDate>Tue, 14 Apr 2026 01:43:05 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en</language>
        <image>
            <title>CyberGen Lab</title>
            <url>https://storage.googleapis.com/papyrus_images/6d702cb95585e50ff96bd79a9c85d56de2ae46998be6abf79b2346dbb5bb65d9.png</url>
            <link>https://paragraph.com/@cybergen-lab</link>
        </image>
        <copyright>All rights reserved</copyright>
        <item>
            <title><![CDATA[2022 Derivatives Report]]></title>
            <link>https://paragraph.com/@cybergen-lab/2022-derivatives-report</link>
            <guid>eXJwwOs5HI76Sg7DG5xH</guid>
            <pubDate>Thu, 12 Jan 2023 15:48:18 GMT</pubDate>
            <description><![CDATA[As we move into 2023 after a particularly challenging year marked by several black swan events all throughout the DeFi space, it appears interesting to perform a broad assessment of the current state of the derivative market and analyze each of its key components in order to highlight the dynamics that will most likely impact, if not lead, the development of the market in this coming year. Main insightsThe Grayscale premium trade aftermath will most likely continue to have ripple effects on th...]]></description>
            <content:encoded><![CDATA[<p>As we move into 2023 after a particularly challenging year marked by several black swan events all throughout the DeFi space, it appears interesting to perform a broad assessment of the current state of the derivative market and analyze each of its key components in order to highlight the dynamics that will most likely impact, if not lead, the development of the market in this coming year. </p><h3 id="h-main-insights" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Main insights</h3><ul><li><p>The Grayscale premium trade aftermath will most likely continue to have ripple effects on the industry and could cause additional contagion among key industry players (DCG, Gemini, …) if no settlement can be found among the different parties.</p></li><li><p>Futures and perpetuals volumes should start to rise again from their 2022 year-end bottom levels, as the space starts to rebuild with more transparency and better product-market fit, attracting new institutional investors who favor derivatives to gain exposure to the BTC and ETH ecosystems.</p></li><li><p>Investors will have to remain cautious of regulatory risk given the increasing scrutiny of international regulators on crypto assets and the regulation-by-enforcement strategy employed by the SEC in the US.</p></li><li><p>The emergence of new investment vehicles in the derivative market (structured vaults, perpetual options…) could bootstrap interesting new opportunities for investors while enabling the inflow of additional liquidity into the market.</p></li></ul><h3 id="h-grayscale-premium-trade" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Grayscale premium trade</h3><p>Despite its already dramatic effect on the entire industry over the past year, which led to the bankruptcy of several key industry players (FTX, 3AC, …), the aftermath of the industry-wide Grayscale premium trade continues to spread contagion and fear throughout the market following the recent collapse of FTX and its ripple effects on Genesis, DCG and Gemini.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/e72e0e93d31b8a66c4fcffb0dbaf11e8f9ea1bd34de8e9fa572f7a691f83e552.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Furthermore, adding fuel to the fire, Grayscale’s refusal to show public proof of its BTC holdings, as well as the growing public tensions between Gemini and Grayscale’s parent company DCG over a $1B liability, recently increased the selling pressure on $GBTC shares, pushing the Grayscale discount beyond -40% towards the end of 2022. 
As such, it appears important for investors to remain cautious of this risk in their investment choices during the first half of 2023 and to price in the possibility of additional headwinds across the crypto sector coming from this unresolved situation.</p><p><strong>Grayscale premium TLDR</strong></p><p>Initially profitable, the Grayscale trade consisted, for eligible institutional investors, in exchanging a given amount of BTC with Grayscale for $GBTC shares and then reselling those shares on the secondary market at a premium to retail investors, who prior to 2019 couldn’t get any real exposure to BTC on traditional financial markets outside of $GBTC. However, the situation quickly reversed in 2021, with this premium becoming a discount following the added selling pressure on $GBTC coming from an increasing number of institutional investors entering the trade as well as the growing options at hand for retail investors to get exposure to BTC through BTC futures ETFs (CME, …).</p><h3 id="h-resilience-of-the-derivative-market" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Resilience of the derivative market</h3><p>All throughout this past year, the derivative market showed growing signs of maturity as it continued to gain market share against the spot market while also being able to ensure sufficient liquidity during the several periods of high volatility that we encountered.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/57793c0722ca608c8903b6b4cf533516383515a5a6a6b601d8e50f56f899d2a2.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Indeed, even though the futures and perpetuals markets suffered a steep decline in volume across the year following the collapse of numerous institutional market participants on both the seller and buyer side (3AC, FTX, …), those markets proved mature enough to enable investors to hedge their portfolio risks during periods of high risk, as shown by the peak in futures trading volume below during the collapse of FTX.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/62c9c3c82ac7d4c2e73328245620f449a11f959b1bf09b2cab9a435c8440ebd6.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-inability-of-the-option-market-to-reach-mainstream-adoption" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Inability of the option market to reach mainstream adoption</h3><p>Unlike perpetuals and futures, which soon after their release became among the most popular and most traded assets on both centralized and decentralized derivative marketplaces, option derivatives still remain a fairly nascent and low-volume market focused mostly on BTC and ETH.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/78810bd880ccccd967cb04de9da84c7d37f8fb9676f2a30953533f7ca7fec89f.png" alt="" 
blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Nonetheless, existing use-cases in risk management such as custom options strategies for Liquidity providers enabling to hedge the convexity of the impermanent loss risk or underdeveloped option products such as hash rate options enabling miners to hedge away the hash rate risk on their mining ROI could create new opportunities for this asset class in the short term future. Similarly, new type of options such as everlasting options still complex to price could also become a really interesting tool if able to find a good market fit in the ever-changing DeFi landscape. As such, investors should remain informed on this asset class as its current underdevelopment could hide an important untapped growth potential.</p><h3 id="h-increasing-regulatory-risk" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Increasing regulatory risk</h3><p>Following the recent demise of numerous high profile crypto companies involved in the Grayscale premium trade the hope for more regulatory transparency through the approval of a spot BTC ETF under the actual SEC chair Gary Gensler disappeared altogether. Worst, due to the massive implication of SBF in the draft of the forthcoming crypto bill scheduled to be soon presented in front of the Congress, chances of congressional vote on a “crypto bill” appear now pretty slim and investors will most likely have to wait for the election of a new administration to see the passing of a crypto bill bringing more clarity onto the sector. Consequently, investors will have to remain in the meantime cautious of the regulatory risk coming from the pretty aggressive legislation by enforcement led by the SEC when investing in tokenized derivatives.</p><p>On the other hand, for pure derivative products falling under the purview of the CFTC, investors will also need to remain aware of the regulatory risk and of the previous decisions of the commissions which despite being more consistent than the SEC still remain nonetheless willing to assert its authority on this new sector.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/67241628c95638c0aeee6a43d47c736d51780058eb5d43a06ebd54f2bf424978.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-dominance-of-centralized-derivative-marketplace" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Dominance of centralized derivative marketplace</h3><p> In line with the situation observed in the spot market, both from a volume and open interest standpoint centralized derivative marketplaces still remain ahead of fully decentralized marketplaces by several orders of magnitude despite the recent scandals which tainted several centralized derivative actors and the increasing trend among crypto native users to rely more and more on self-custody.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a4ac6c44c82a9e2757521356b9db3ae43dc03f71636473efff8f0f4a1e1abd49.png" alt="" 
blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>However, with the current growing momentum around Ethereum scaling solution enabling low gas fees and the increasing reliance of the DeFi sector on a growing number of AMM pools requiring liquidity hedging, it is to expect that the market share of the decentralized derivative marketplace will likely increase over the course of the upcoming year.</p><h3 id="h-tethers-monopoly-challenged" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Tether’s monopoly challenged</h3><p>Building-up on its growing momentum following the different 2021 USDT scandals, USDC continued all throughout 2022 to consolidate its place of best USDT alternative for investors wary of transparent auditing process and even came close to challenge Tether in terms of market cap despite a continued dominance of the latter in terms of daily volume traded. </p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/6b8af87a5771356e0eb4937d9866299ce108700b815275fcaaa6ea7be78c9100.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/c539d3ad8dafb93627b094f818c6ab8abbb98a4a2f3c292097c77d864f0c7731.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Nonetheless, given its highly transparent auditing process and partnership with major players such as VISA or BlackRock, USDC seems on a good path to be able to challenge further Tether’s monopoly throughout the coming year as it already did successfully inside the Ethereum ecosystem. </p><h3 id="h-brewing-derivative-innovation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Brewing derivative innovation</h3><p>Despite an interesting pace of innovation in the derivative sector with several new tools designed over the past year and a half (everlasting options, structured vault, floating strike options …) the adoption and infrastructural development around this new market segment was not able to follow-up given the larger focus on the numerous scandals that impacted the market over this past year which induced a liquidity crunch and a flight to safety from investors. Nonetheless, despite this mild set back, those new products seem now really well positioned to be able to finally hit the market and offer interesting new range of investment types all across the DeFi space during the course of 2023.</p><p><strong>Disclaimer: <em>All information/documents contained in this blog rely solely  on my personal beliefs, and do not constitute professional investment advice.</em></strong></p><p><strong><em>Be careful in your investment and do not invest more than you can afford to lose.</em></strong></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/f491aef8db866938ef74e346d7e86d8dd6f83be5d8f4f0113d0c162cdc5e503b.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[DevOps Introduction (part 4)]]></title>
            <link>https://paragraph.com/@cybergen-lab/devops-introduction-part-4</link>
            <guid>gxpnGHKt6J5YdoBJnlcg</guid>
            <pubDate>Sun, 11 Dec 2022 20:34:06 GMT</pubDate>
            <description><![CDATA[Hey there, nice to see you back for this fourth and final part of our DevOps tutorial series! 😃 Today’s tutorial is divided into two parts. First we’ll finalise our cloud data architecture and make sure that it updates itself automatically on a daily basis. Then we’ll hop onto the creation of our Grafana dashboard and see how we can connect our ethdb database to a Grafana instance in order to create our ETH analytics dashboard.Part 1: Cloud data architecture fi...]]></description>
            <content:encoded><![CDATA[<p>Hey there, nice to see you back for this fourth and final part of our DevOps tutorial series! 😃</p><p>Today’s tutorial is divided into two parts. First we’ll finalise our cloud data architecture and make sure that it updates itself automatically on a daily basis. Then we’ll hop onto the creation of our Grafana dashboard and see how we can connect our ethdb database to a Grafana instance in order to create our ETH analytics dashboard.</p><h2 id="h-part-1-cloud-data-architecture-finalization" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 1: Cloud data architecture finalization</h2><p>Alright so first things first: now that we have our basic cloud data architecture at our disposal, with each of our tables initialised, we need to create Python scripts enabling us to update each of those tables with new daily metrics.</p><p><strong>Observation:</strong> In the case of the daily_metrics and delayed_metrics tables, given that we only extract the latest daily metric value available, we don’t need to create additional Python scripts and can directly reuse the Python functions from our table_init2 and table_init3 scripts.</p><p>Alright so let’s focus on the historical_metrics and network_growth tables. The idea here is to create a Python script that appends the new daily metric value to each of those tables once a day.</p><p>Starting with our historical_metrics table, let’s create a new update script called historical_daily.py in our code folder using the following command:</p><pre data-type="codeBlock" text="$ nano historical_daily.py
"><code>$ nano historical_daily.py
</code></pre><p>From there copy and paste the code that you can find <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/historical_daily.py">here</a> in the corresponding file in Github.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a0ba070b53ed4b205ab8091e04a939f3f0804681fc49bafea06f0d332b7cb8b0.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Observation</strong>: Here the code is basically exactly the same that we used before in our previous tutorial in our table_init.py script at the difference that here we’re only focusing on extracting the metric value for the current day.</p><p>Alright from there using the same process let’s take care of our network_growth table. So first we’ll create a new python script using the following command:</p><pre data-type="codeBlock" text="$ nano network_growth.py
"><code>$ nano network_growth.py
</code></pre><p>From there copy and paste the content of the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/network_growth.py">github file</a> of the same name:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/13d61cae9e65ed7d63e5177355d0cb916a4e393e981d238951459add1633fb48.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, now we have to make those scripts executable and run them. To do so we’ll use the following commands presented in our previous tutorial:</p><pre data-type="codeBlock" text="$ chmod +x historical_daily.py 
$ chmod +x network_growth.py 
$ ./historical_daily.py 
$ ./network_growth.py
"><code>$ chmod <span class="hljs-operator">+</span>x historical_daily.py 
$ chmod <span class="hljs-operator">+</span>x network_growth.py 
$ ./historical_daily.py 
$ ./network_growth.py
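# chmod +x only needs to be run once per script; the ./ invocations can be re-run whenever you want a manual update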
</code></pre><p>Here, if everything goes well, you should be met with our usual prompt message “Process done” after running each of our Python scripts.</p><p><strong>Note</strong>: if you want to double check, feel free to use the same process explained in our previous tutorial using the Postgres user and your local instance of psql.</p><p>Alright so now that we’re all set and have our full data architecture, let’s automate this daily updating process. To do so, we first need to create a Python script that we’ll call main.py, which will call all the functions needed to perform a full daily update of all the tables composing our cloud data architecture.</p><p>So first let’s create that new Python script in our code folder using the following command:</p><pre data-type="codeBlock" text="$ nano main.py
"><code>$ nano main.py
</code></pre><p>From there as usual copy the content of the corresponding <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/main.py">github file</a> in your editor:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/939483528aa7ef954e0af92e36e6e9e88ad8fbcfe179b7f123a60fa0497e119a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>As you can see here, this file is pretty straightforward we basically just import each functions that we need from their respective python scripts and run them one after another.</p><p>Alright so from there, as always we need to make this script executable using the following command:</p><pre data-type="codeBlock" text="$ chmod +x main.py
"><code>$ chmod <span class="hljs-operator">+</span>x main.py
</code></pre><p>Once it’s done, the last thing we need to do is to ask our cloud server to automatically run this script on a daily basis. To do so, we’ll use a Linux tool called crontab, which lets us ask the computer to perform a set of tasks on a periodic schedule. So first let’s open crontab using the following command:</p><pre data-type="codeBlock" text="$ crontab -e
"><code><span class="hljs-variable">$ </span>crontab -e
</code></pre><p>Choose the easiest editor by typing 1:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/1dea89b270f8c20dff9ec3e7f4a15f9f7f7b6cf7207b20169a6d8cf300b79de3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Normally you should be met with the following editor:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f2fb3bfd020303a0723805ba0abc4668ff810e524588ea6ef7a3d3f82c651319.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there type the following command below the comments, save and exit the editor.</p><pre data-type="codeBlock" text="$ 0 9 * * * ./main.py
"><code>$ <span class="hljs-number">0</span> <span class="hljs-number">9</span> <span class="hljs-operator">*</span> <span class="hljs-operator">*</span> <span class="hljs-operator">*</span> ./main.py
</code></pre><p>See <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://crontab.guru/#0_9_*_*_*">here</a> for crontab time converter</p><p>And that’s it we finally have a complete full cloud data architecture querying and storing our needed metrics value on a daily basis 👍</p><p>Alright so that’s a pretty big part done, so don’t hesitate to make a quick coffee break here if you need it before heading to the second part ☕</p><h2 id="h-part-2-grafana-dashboard-creation" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 2 Grafana dashboard creation</h2><p>Alright so let’s start building our Grafana dashboard. First, if you haven’t already created one go on the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://grafana.com/">Grafana website</a> and follow the onboarding steps in order to create an account.</p><p>Once you’re done you should be able to see the following screen:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/e1a067c86f387b4b88ff26b4ab00daf36f0c0296672199c721ee74079de377e4.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>So first, let’s create our dashboard. To do so, hoover over the dashboard icon and click on “new dashboard”.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/3d9d547f7162f6e6533eb9a3b459c87d77029df5a4155d6710a4c39a17bbd05a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there click on &apos;“Add new panel” and click on “save” in order to save your new dashboard as shown below:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/df0c0fcb5b590ec3c06cb14f97c3579a5069b6f74ca83f74b72baa7c5a0d7602.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b20d04ac991f0db98018900856c4ce49bb9f28d332e0c85e09afadbf7f5dd730.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so now let’s connect our ethdb database stored in our cloud server to our Grafana instance in order to get access to our ethereum network metrics.</p><p>So first let’s click on configuration and go into Data sources.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/fe5741df1f3e28f1b5eefaa6bccc23d85b0cfadc9d20fe4c92a7adb4c0c8bcd7.png" alt="" 
blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Once you’re there click on add datasource</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f3ab7f33842df677c0b07cfaf25f11091b7563ef4a249934bc1701b2ee90924e.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there seach for postgreSQL and select it in order to add a new PostgreSQL data source to your Grafana instance.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/23110346347c07245b77c43634a1b8ba0bc0b2c1fd9e5109a9e8d0205a94768b.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Here we have to specify the following connection parameters:</p><ul><li><p><strong>Name</strong> -- the Grafana name we want to give to our data source. Here we’ll call it ethdb</p></li><li><p><strong>Host</strong> -- which corresponds to the IP address of our DO server and the port that Grafana should use to connect to it here it is “:5432” (as per Digital Ocean default parameters)</p></li><li><p><strong>User</strong> -- which corresponds to the user that we use to connect to our database located in our cloud server. Here we’ll use our previously created user called “myuser” the password being the password you assigned to your psql user.</p></li><li><p><strong>TLS/SSL Mode</strong> -- Disable</p></li></ul><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b6dcde35497d3b8cc10e36b82bbc7a8094be19ad282b9580ae3ea7f96124a882.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Once you specified all the parameters presented above, click on “Save &amp; Test” and if everything goes smoothly you should see appear the following message</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f308690c3858590a2b1744328f26635153938c8ea891e0a09b499fa48bc2ba6d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now that we connected successfully our ethdb database to our grafana instance let’s start building our Grafana dashboard. 
So first let’s get back to our dashboard and edit our panel</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/1cb3a318587470a43c77c613227a0530284d302bbb1f5685ad766de8598e9daf.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>There click on data source and select ethdb</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/3fcb139f0bd525fe32c8c07987d5bab47106c8c647c6437d6f00b9c7247df6d2.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Once it’s done, in the query section click on the code option in order to be able to start writing SQL queries and use the following command:</p><pre data-type="codeBlock" text="select date, historical_prices 
from historical_metrics
order by date asc;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, historical_prices 
<span class="hljs-keyword">from</span> historical_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">asc</span>;
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/9d6ea3122fc5effa220c36b9785e5b9e1376014b52904bdb9d5e39047438d2e1.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Upon running the query you should end up with the following result:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/c2b68eeb09cf9aa51cba63cb2a9f5d93e7ef5e84e4d9568a51dfa21faf4e056d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Don’t forget to select the “last 6 months” time range given that it is not the by default used time range by Grafana.</p><p>From there you can customize your graph as you want using the panel options and once you’re done don’t forget to save all your changes.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5bf782e234d888fdd312b5860892d2a3195527eaa88c7a097409fffd1069b082.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now that we’re down with our price chart let’s go ahead and create another panel in order to create our current price counter.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/841b4c2d2a3a7502f96e238e0d8eb5e9e9b8f9cdad1e3868254b847b14dffd20.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there as before choose ethdb but this time copy and paste the following sql command in order to extract the current ETH price from our daily_metrics table.</p><pre data-type="codeBlock" text="select date, price 
from daily_metrics 
order by date desc
limit 1
;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, price 
<span class="hljs-keyword">from</span> daily_metrics 
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>
;
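-- "order by date desc limit 1" keeps only the most recent row; the stat counters below reuse this same pattern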
</code></pre><p>Normally you should end up with the following result:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/10fd27644066d59730890c5040ff77ac70b3a5bc504815a27a588eca7d43f9ae.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Here, switch the visualization from time series to stats and delete the threshold in the panel options before saving it.</p><p>Alright so now moving on to our daily volume counter using the same process and the following SQL command we obtain a counter with our daily volume.</p><pre data-type="codeBlock" text="select date, volume 
from daily_metrics
order by date desc
limit 1; 
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, volume 
<span class="hljs-keyword">from</span> daily_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>; 
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/337c0fda3f282cf02ab1e0040fc6d77824745d2b270b64caa2f547ab42eac3ff.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Observation</strong>: Our volume here is expressed in billions so if you want to also show that in Grafana you can use a transformation by clicking on transformation and on the “Add field from calculation” button as shown below:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/81f15f29015c02ed0b551041e575ce7e75cc56758597462d247619250625bf4f.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Then in order to only get your value in billion showed in the counter go to the panel options and select your newly added columns in the Fields section.</p><p>Alright so for the two other counters we’ll basically use the same process and use the following SQL queries.</p><p><strong>Social Volume:</strong></p><pre data-type="codeBlock" text="select date, socialvolume
from delayed_metrics
order by date desc
limit 1
;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, socialvolume
<span class="hljs-keyword">from</span> delayed_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>
;
</code></pre><p><strong>24h active addresses:</strong></p><pre data-type="codeBlock" text="select date, activeaddress 
from daily_metrics
order by date desc
limit 1;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, activeaddress 
<span class="hljs-keyword">from</span> daily_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>;
</code></pre><p>Normally, if everything goes right, you should end up with a dashboard similar to the one below:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/57ddc736756f4516b5fafd1d380899f46cac21f44e705a79bdbc878f2d87073b.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there, since the building steps are mostly the same, I’ll let you create the rest of the dashboard below on your own as an exercise, relying on the code snippets that follow to know which SQL queries to use for each of the needed dashboard components.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/d1ef4a169d88fed8d7b561cfa0c967dbe2d97df0303bc36fbf88e75ac5844818.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-sql-queries" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">SQL queries:</h3><p><strong>Velocity</strong></p><pre data-type="codeBlock" text="select date, volatility
from delayed_metrics
order by date desc
limit 1;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, volatility
<span class="hljs-keyword">from</span> delayed_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>;
</code></pre><p><strong>Marketcap</strong></p><pre data-type="codeBlock" text="select date, historical_marketcap 
from historical_metrics
order by date asc
; 
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, historical_marketcap 
<span class="hljs-keyword">from</span> historical_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">asc</span>
; 
</code></pre><p><strong>Top holders</strong></p><pre data-type="codeBlock" text="select date, topholders
from delayed_metrics
order by date desc 
limit 1; 
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, topholders
<span class="hljs-keyword">from</span> delayed_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span> 
limit <span class="hljs-number">1</span>; 
</code></pre><p><strong>Github activity</strong></p><pre data-type="codeBlock" text="select date, githubactivity
from delayed_metrics
order by date desc
limit 1; 
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, githubactivity
<span class="hljs-keyword">from</span> delayed_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>; 
</code></pre><p><strong>Dev Activity</strong></p><pre data-type="codeBlock" text="select date, devactivity 
from delayed_metrics
order by date desc
limit 1;
"><code><span class="hljs-keyword">select</span> <span class="hljs-type">date</span>, devactivity 
<span class="hljs-keyword">from</span> delayed_metrics
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">desc</span>
limit <span class="hljs-number">1</span>;
</code></pre><p><strong>Network Growth</strong></p><pre data-type="codeBlock" text="select * 
from network_growth
order by date asc;
"><code><span class="hljs-keyword">select</span> <span class="hljs-operator">*</span> 
<span class="hljs-keyword">from</span> network_growth
<span class="hljs-keyword">order</span> <span class="hljs-keyword">by</span> <span class="hljs-type">date</span> <span class="hljs-keyword">asc</span>;
</code></pre><p>And that’s it!! 🥳🥳</p><p>So congrats on reaching the end of this tutorial series! You’re now well equipped to go further and start tackling new challenges, so be proud of yourself and, as always, don’t hesitate to play around with the code and fine-tune it to your liking 😃</p><p>In the future we’ll go further and introduce other DevOps notions such as Kubernetes, Docker and other tools, so stay tuned if you’re interested in learning more about those and in the meantime have fun with all this new knowledge.</p><p>Take care 🖖</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/aebb652d570f55509b311523385e1ea92462a148acb77eccacb79c320b3e0c8d.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Weekly News 12/09]]></title>
            <link>https://paragraph.com/@cybergen-lab/weekly-news-12-09</link>
            <guid>p1ShTf7vzHBDgdplf7A2</guid>
            <pubDate>Fri, 09 Dec 2022 11:43:32 GMT</pubDate>
            <description><![CDATA[DeFiOryen NetworkOryen Network a new upcoming staking protocol enabling investors to yield high-passive income through a new auto-compounding tool announced the roll-out of its 7th presale’s phase later on this week.AnkrEarlier last week, Ankr reported a successful $5M hack of its aBNBc token contract through an unlimited mint bug and halt trading for several days before resuming operations earlier this week and reimbursing all affected users.Aave & CompoundDeFi giants Aave and Compound recen...]]></description>
            <content:encoded><![CDATA[<h2 id="h-defi" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">DeFi</h2><h3 id="h-oryen-network" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Oryen Network</h3><p>Oryen Network a new upcoming staking protocol enabling investors to yield high-passive income through a new auto-compounding tool announced the roll-out of its 7th presale’s phase later on this week.</p><h3 id="h-ankr" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ankr</h3><p>Earlier last week, Ankr reported a successful $5M hack of its aBNBc token contract through an unlimited mint bug and halt trading for several days before resuming operations earlier this week and reimbursing all affected users.</p><h3 id="h-aave-and-compound" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Aave &amp; Compound</h3><p>DeFi giants Aave and Compound recently decided to cap loans and freeze a number of token markets in order to enhance the overall security of their cryptoeconomic designs and prevent liquidity attacks such as the one attempted earlier this year by Avraham Eisenberg</p><h2 id="h-web3" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Web3</h2><h3 id="h-aurox" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Aurox</h3><p>Aurox a leading DeFi-focused software development company, aiming to compete with Metamask, recently unveiled its new web3 wallet extension enabling users to profit from industry-leading security features as well as from an innovative reward program being applied on each transaction.</p><h3 id="h-permaswap" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Permaswap</h3><p>Arweave’s first cross-chain DEX Permaswap, recently announced the successful ending of its second testnet phase and officialy announced the upcoming release of its Beta version.</p><h3 id="h-redstone" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">RedStone</h3><p>Earlier this week, Redstone announced a new partnership with the Fantom network aiming to enable developers to build data-driven decentralised applications on top of the Fantom network in a fast and reliable way.</p><h2 id="h-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosystem</h2><h3 id="h-vitalik" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Vitalik</h3><p>In a recent blogpost, Vitalik Buterin, co-founder and chief scientist of Ethereum, weigh-in on the decentralised voting system debate arguing that token value should not be tied to their democratic utility given the risk of attacks by specialised investors and the negligible leverage of retails users in those type of governance framework relaunching as such the heated debate around the best type of possible decentralised governance.</p><h3 id="h-us-regulators" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">US regulators</h3><p>The Financial Crimes enforcement Network (FinCEN) recently stated that it is looking carefully at DeFi and referred to digital currencies as its key priority area making some market participants fear a strengthening of the future regulatory framework for DeFi companies in the US.</p><h2 id="h-market" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Market</h2><ul><li><p>BTC     $17,237</p></li><li><p>ETH $1285</p></li><li><p>BNB. $290</p></li><li><p>ADA. 
$0.31</p></li><li><p>SOL $13.89</p></li></ul><h3 id="h-perennial" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Perennial</h3><p>Perennial, a decentralized-finance derivative trading platform, recently closed a $12 million seed funding round led by Polychain Capital and Variant.</p><h3 id="h-auros-global" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Auros Global</h3><p>Crypto trading firm Auros Global appears to be suffering from FTX contagion after missing a principal repayment on a 2,400 Wrapped Ether (wETH) DeFi loan.</p><p><strong>Disclaimer</strong>: <em>All information/documents contained in this blog rely solely on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investments and do not invest more than you can afford to lose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/0677e02b21d3dc301f77a3449947ea471ade8aca3ecaaff341c201ddd9b2c4b7.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Enzyme Finance Market Intelligence]]></title>
            <link>https://paragraph.com/@cybergen-lab/enzyme-finance-market-intelligence</link>
            <guid>Lel1YwbHpqJ33ktXEVJT</guid>
            <pubDate>Fri, 02 Dec 2022 18:29:20 GMT</pubDate>
            <description><![CDATA[Product & EcosystemValidity of Enzyme Finance main use cases [ Score 4/5 ]Treasury managementFlexible purpose-built asset management toolkit enabling DAOs and companies to implement custom treasury management architecturePossibility to grant limited operational power on specific treasury management process to a small subset of members or to a specialized third party.Built-in integration of Gnosis Safe* allowing the creation of a layered treasury management structure implementing different sig...]]></description>
            <content:encoded><![CDATA[<h2 id="h-product-and-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Product &amp; Ecosystem</h2><h3 id="h-validity-of-enzyme-finance-main-use-cases-score-45" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Validity of Enzyme Finance main use cases [ Score 4/5 ]</h3><p><strong>Treasury management</strong></p><ul><li><p>Flexible purpose-built asset management toolkit enabling DAOs and companies to implement custom treasury management architecture</p></li><li><p>Possibility to grant limited operational power on specific treasury management process to a small subset of members or to a specialized third party.</p></li><li><p>Built-in integration of Gnosis Safe* allowing the creation of a layered treasury management structure implementing different signature thresholds for each treasury management actions.</p></li><li><p>Complete treasury management transparency thanks to Enzyme’s user-friendly UI enabling DAO members and token holders to easily monitor treasury state at all time</p></li><li><p>Publication of transparent and easily readable reports on assets being managed</p></li><li><p>Reduction of admin private key operational risk through the leveraging of Gnosis multi-sig wallet</p></li><li><p>Custom policy features allowing the limitation of asset managers rights to a limited number of actions pre-vetted by users through community vote on Snapshot.</p></li></ul><p><strong>Investment Clubs</strong></p><ul><li><p>Possibility to implement main DeFi services in Fund strategy thanks to Enzyme’s integration with several external DeFi protocol (Aave, Maple Finance, Compound …)</p><ul><li><p>Current available services:</p><ul><li><p>Yield Farming</p></li><li><p>Synthetic asset &amp; Derivative</p></li><li><p>DEX’s swap</p></li><li><p>AMM pools</p></li><li><p>Borrrowing &amp; Lending</p></li></ul></li></ul></li><li><p>Unique possibility to create strategies relying on external positions not resulting in a simple exchange of ERC20 assets (Compound cTokens, Uniswap v3 LP positions, …)</p></li><li><p>Unique possibility to create strategies relying on external positions not resulting in a simple exchange of ERC20 assets (Compound cTokens, Uniswap v3 LP positions, …)</p></li><li><p>Large set of available policy features allowing to create a variety of gated and public investment clubs with varying level of trust towards the asset manager</p></li><li><p>Possibility for investors to withdraw liquidity without restriction 24/7</p></li><li><p>Easy implementation of trading bot through Enzyme policy feature enabling the community to grant revocable trading rights to a given address</p></li></ul><h3 id="h-current-ecosystem-state-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Current Ecosystem State [ Score 3/5 ]</h3><ul><li><p>Development of interesting new features diversifying the type of holdings available for investment strategist (External position, airdrop claims, …)</p></li><li><p>Mature and successful decentralized governance architecture</p></li><li><p>Limited set of adapters enabling the integration of Enzyme with external DeFi protocol:</p><ul><li><p>16 available adapters on Ethereum</p></li><li><p>Only 8 adapters on polygon</p></li><li><p>2 upcoming adapters (Notional Finance &amp; Pool together)</p></li></ul></li><li><p>Asset pricing mechanism still heavily reliant on Chainlink price feeds</p></li><li><p>Ecosystem still mainly targeting crypto native users given the technological 
understanding required to use the platform (Web3 wallet private keys management, handling of different networks, …)</p></li><li><p>Investors token ownership preventing most institutional investors to take part in that ecosystem due to regulatory limitations</p></li><li><p>Limited set of supported networks (only Ethereum &amp; Polygon)</p></li></ul><h3 id="h-moat-long-term-stability-of-use-case-and-ecosystem-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">MOAT (Long-Term stability of use case and ecosystem) [Score: 3/5 ]</h3><ul><li><p>Recent move of crypto native users towards more decentralized investment platforms following the recent FTX meltdown which will most likely attract new users towards Enzyme asset management solution.</p></li><li><p>Continued growth of the DeFi space and of DAOs in a broad array of sectors ranging from gaming to social media (Lil Miquela, …) structurally supporting the demand for decentralized asset management solutions such as Enzyme</p></li><li><p>Huge future opportunities in the investment sector with hedge funds and investment banks most likely to start leveraging platforms like Enzyme once the regulatory framework around token ownership will have been clarified by regulators worldwide.</p></li><li><p>Strong insurance regarding the continuing development of the core Enzyme protocol thanks to the recent renewal of Avantgarde Finance contractor agreement as lead protocol developer up until September 2025.</p></li><li><p>Additionally with the recent growth of investment clubs, Enzyme could also become an ideal decentralized investment solution for retail traders communities such as WallStreetBets willing to invest on crypto</p></li></ul><h3 id="h-project-maturity-age-token-years-since-existence-score-45" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Project maturity / Age token (years since existence) [ Score: 4/5 ]</h3><ul><li><p><strong>July 2016</strong> — Creation of Melonport AG as a privately domiciled company in Zug, Switzerland with the sole mandate of developing an asset management computer built on top of Ethereum.</p></li><li><p><strong>February 2017</strong> — ICO raising $2.9M in one day</p></li><li><p><strong>February 2019</strong> — Launch of Mainnet v1.0</p></li><li><p><strong>August 2019</strong> — Ownership of the protocol transferred to the Melon Council DAO</p></li><li><p><strong>December 2020</strong> — Rebranding from Melon Protocol to Enzyme</p></li><li><p><strong>January 2021</strong> — Launch of Mainnet v2.0 (after failure of v1.0) implementing a brand new smart contract architecture as well as a full suite of new features developed in partnership with the community (lending, synthetic assets, yield farming, …)</p></li><li><p><strong>February 2022</strong> — Launch of Mainnet v4.0 (Sulu)  building on top of the success of mainnet v2.0 and implementing a whole suite of new features and integrations.</p></li><li><p><strong>Unknown</strong> — Release of Mainnet v5.0 (Eve) aiming to iterate on the success of Sulu and expand further the range of applications and potential use cases of Enzyme</p></li></ul><p><strong>Analyst Observation</strong></p><ul><li><p>High level of community governance enabling to pivot quickly and align at best with the need of users as shown by the quick pivot from mainnet v1.0 to mainnet v2.0</p></li><li><p>Continuous development led by a team of seasoned developers with deep understanding of the space and true beliefs in the long-term mission of Enzyme as shows the 
grant token vesting enforced in the recent <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://snapshot.org/#/enzymefinance.eth/proposal/0x713b98e78cbae5d7f4a0ee7f07b0b1c9bcf05c2e05b2c6c8b71e892127dc7327">contactor agreement renewal</a></p></li><li><p>Currently among the top three decentralized asset management solutions in terms of TVL.</p></li></ul><h2 id="h-revenue-potential" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Revenue Potential</h2><h3 id="h-last-12m-revenue-score-3-5" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Last 12M Revenue [ Score 3 /5 ]</h3><p>AAGR = - 83%</p><p><strong>Analyst comment</strong></p><p>By comparison with the annual growth returns exhibited by other major crypto assets such as Bitcoin (-72%) and Ethereum (-73%) the current AAGR of Enzyme does not seem worrying especially in the current context of severe contraction of the crypto space due to the consecutive occurrence of black swan events all throughout the year (Terra, Three Arrows Capital, FTX, …).</p><h3 id="h-revenue-trend-last-3-months-vs-last-12-months-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Revenue Trend (Last 3 months vs Last 12 months) [ Score 3/5 ]</h3><p>3 month Growth rate = <strong>- 23%</strong></p><p><strong>Analyst comment:</strong></p><p>The comparison here between those two revenue trends does not really help us getting a clear picture on the intrinsic protocol revenue generating power given the occurrence of consecutive black swan events during both of those time periods.</p><h3 id="h-revenue-flow-to-token-holder-score-25" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Revenue flow to token holder [ Score 2/5 ]</h3><p><strong>Cryptoeconomic design</strong></p><ul><li><p>Current token circulating supply: 2,038,518 MLN</p></li><li><p>Maximum mintable tokens per year: 300,600 MLN</p></li><li><p>Built-in burning mechanism of protocol fee</p></li><li><p>Governance burning of any unused residual amount of MLN minted for development purposes</p></li></ul><p><strong>Value accrual mechanism</strong></p><p>The burning of all protocol fees applies a deflationary pressure on the overall circulation supply of MLN tokens which in return enable token holders to benefit from the token price appreciation.</p><p>→ The goal being that the protocol becomes deflationary in the long run given enough usage.</p><p><strong>Current state</strong></p><ul><li><p>Overall MLN tokens burned since launch = 382,522 MLN</p></li><li><p>MLN Circulating supply = 2,038,518 MLN</p></li><li><p>MLN tokens burned / MLN circulating supply = 18.7%</p></li></ul><p><strong>Observation</strong></p><p>This means that in three years the Enzyme council has effectively burned 18.7% of the circulating supply of MLN tokens.</p><h3 id="h-valuation-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Valuation [ Score 3/5 ]</h3><p>Given that all protocol fees get burned by the protocol it appears more interesting here to compare the market cap with the TVL instead of with the amount of protocol fees.</p><p>Current market cap/TVL ratio = <strong>1.11</strong></p><p>As demonstrated by the current market cap to TVL ratio, the protocol seems a bit overvalued at present. 
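</p><p>For readers who want to double check the figures quoted in this section, the short snippet below recomputes the burn ratio (and the implied maximum yearly inflation) from the supply numbers cited above, and shows how the market cap/TVL ratio is obtained. It is only a quick sanity check sketched in Python: the market cap and TVL inputs are placeholders to be replaced with live values (for instance from DefiLlama), not figures taken from this report.</p><pre data-type="codeBlock"><code># Quick sanity check of the ratios quoted above.

# Supply figures cited in this report
mln_burned_since_launch = 382_522     # MLN
mln_circulating_supply  = 2_038_518   # MLN
yearly_mintable_cap     = 300_600     # MLN mintable per year at most

burn_ratio = mln_burned_since_launch / mln_circulating_supply
max_yearly_inflation = yearly_mintable_cap / mln_circulating_supply
print(f"Burned / circulating supply: {burn_ratio:.1%}")             # ~18.8%, in line with the ~18.7% above
print(f"Max yearly mint / circulating: {max_yearly_inflation:.1%}")  # ~14.7%

# Market cap / TVL: placeholder inputs, to be replaced with live data
market_cap_usd = 0.0  # e.g. latest MLN market cap
tvl_usd = 0.0         # e.g. Enzyme TVL from DefiLlama
if tvl_usd:
    print(f"Market cap / TVL: {market_cap_usd / tvl_usd:.2f}")  # report quotes 1.11
</code></pre><p>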
As such, it would appear better for investors to wait additional token price corrections before investing in the project.</p><p><strong>Comparison with competitors in terms of TVL (source DefiLlama)</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/14fd2f8380662a1437ac395cdaef3a014f1f7cb217587303ac6763f5752bfb91.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Global in/out of the money</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5ffcf9ff2672febd1024da4b17cedee2ffb9e9c6ee64a9db855ada5a6a35e0ad.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>As shows the above graph from IntoTheBlock, most of MLN token holders are currently out of the money suggesting a high selling pressure on the token and strong levels of resistance ahead, especially in the range $20 - $50.</p><p><strong>Number of Large Transactions</strong></p><p>Daily number of on-chain transactions greater than $100,000</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/50b3a7bfc647023823d63b31126a2de79e7ef45c4692d27af1c738056495e0d0.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Not a lot of activities currently on the protocol outside of small retail investors.</p><p><strong>Average Balance growth ($)</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/6aec245931d2a7ba943bcd80b5a644b702e22aa7a00b2f9ed6bdf13c76147528.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Average Balance growth = - 81%</p><p>AAGR = - 81%</p><p>Interesting to note that users didn’t massively exit the protocol over the past year and globally stayed invested in the protocol in the same relative proportion.</p><p><strong>Twitter sentiment</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/997545e687da3811ddb20c9ca357b379f2809b015f90e47c2f7de7e5f6c73a75.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Mixed user sentiment over the past week but overall good twitter sentiment over the past year.</p><h2 id="h-team" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Team</h2><h3 id="h-governance-architecture-score-45" class="text-2xl font-header !mt-6 !mb-4 
first:!mt-0 first:!mb-0">Governance architecture [ Score 4/5 ]</h3><p>The overall Enzyme protocol governance is ensured by the Enzyme Council DAO itself divided into two sub DAOs :</p><ul><li><p><strong>The</strong> <strong>Enzyme Technical Council (ETC)</strong> — Providing technical expertise and in charge of making decisions on core technical topics such as protocol upgrades, resource allocation or network parameters.</p></li><li><p><strong>The Enzyme Users Representatives (EUR</strong>) — Collecting, prioritizing and delivering users feedback to the Enzyme Council on behalf of users.</p></li></ul><p><strong>Current Enzyme Council members</strong></p><ul><li><p>Chain Security (ETC)</p></li><li><p>Exa: Co-founder of Exponent (ETC)</p></li><li><p>Janos Berghorn: Investor @KR1 (ETC)</p></li><li><p>Giel Detienne: User representative (EUR)</p></li><li><p>Mona El Isa: Founder &amp; CEO @Avantgarde Finance (ETC)</p></li><li><p>Felix Hartmann: Founder @Hartmann Capital &amp; User (EUR)</p></li><li><p>Will Harborne: Founder &amp; CEO @Deversifi (ETC)</p></li><li><p>Nick Munoz-McDonald: Smart Contract Auditor &amp; Researcher @G0 Group (ETC)</p></li><li><p>Paul Salisbury: Founder @Blockchain Labs (ETC)</p></li><li><p>Zahreddine Touag: Founder @Woorton (ETC)</p></li><li><p>Theophile Villard: Co-founder of Multis wallet, contributor to Enzyme codebase (ETC)</p></li></ul><h3 id="h-tech-leadership-and-ability-to-fill-gaps-score-45" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Tech leadership and ability to fill gaps [ Score 4/5 ]</h3><p>As per the recent approval by the Enzime Council of the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://snapshot.org/#/enzymefinance.eth/proposal/0x713b98e78cbae5d7f4a0ee7f07b0b1c9bcf05c2e05b2c6c8b71e892127dc7327">new protocol development grant</a> defended by Avantgarde finance, this latter will continue to be up until September 2025 the lead developer of the core Enzyme protocol as well as of its growing ecosystem.</p><p>Additionally, as per this new grant, Avantgarde finance will also support and develop in partnership with Enzyme a new marketing strategy on par with similar effort across the space in order to attract new users to the platform.</p><p>Past achievements of Avantgarde Finance as lead protocol developer:</p><ul><li><p>Three major releases in the space of 28 months.</p></li><li><p>Integrations with some of the biggest and most widely used protocols in DeFi</p></li><li><p>Development of a sophisticated set of subgraphs tracking important metrics</p></li><li><p>Complete UX/UI development</p></li><li><p>Management of numerous bug bounty programs</p></li></ul><p>Overall the Enzyme protocol benefits from a high level and <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://avantgarde.finance/company">dedicated core dev team</a> with an impressive past track record which will most likely have no difficulty to further continue the development of its core features in order to ensure its long term success.</p><h3 id="h-general-team-business-dev-and-marketing-capabilities-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">General team business dev and marketing capabilities [ Score 3/5 ]</h3><p>Generally speaking, the Enzyme Council is in charge of the marketing efforts alongside Avantgarde finance through the distribution of grants allocated through vote proposals.</p><p><strong>Example</strong></p><p><a target="_blank" 
rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://snapshot.org/#/enzymefinance.eth/proposal/0x4dbc6c4cfb714ae64d550aa53e7aa694e77c4f011174e4504cd43224b3a47940">$35,000 grant approval for Messari coverage</a> of Enzyme through a dedicated webinar as well as a serie of dedicated reports published on Messari’s website and distributed through its newsletter.</p><h3 id="h-backers-and-funding-strength-score-25" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Backers and funding strength [ Score 2/5 ]</h3><p>Enzyme finance exhibits two main sources of funding:</p><ul><li><p>ICO held in 2017:</p><ul><li><p>Start Date: Feb 15, 2017</p></li><li><p>End Date: Feb 16, 2017</p></li><li><p>Total raised: $2.9M</p></li><li><p>Country: Switzerland</p></li></ul></li><li><p>Yearly protocol minting allocation of 300,600 MLN for ensuring protocol development</p></li></ul><p>Given that the main current source of funding of the protocol is by design denominated in MLN, it is critical for the project that the MLN price doesn’t fall too low at the risk of hindering any future development and even create a sort of a death spiral if breaking a certain floor price (not enough funds to pursue development  further decrease of the price  even less found to upgrade the protocol …)</p><h2 id="h-security" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Security</h2><h3 id="h-token-governance-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token governance [ Score 3/5 ]</h3><p>As highlighted before, the token governance is ensured by the Enzyme Technical Council.</p><p>Main token governance topics:</p><ul><li><p><strong>Resource allocation:</strong> The yearly inflation of MLN being the sole resource available to fund the development of the Enzyme protocol, prioritizing and executing its allocation in a mindful manner is of the utmost importance.</p></li><li><p><strong>Network parameters:</strong> Asset management gas price per amgu as well as protocol fees needs to be set according to network usage and market conditions in order to ensure the optimal performance of the Enzyme protocol.</p></li></ul><h3 id="h-token-holder-concentration-score-35" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token holder concentration [ Score 3/5 ]</h3><p><strong>Top 10 holders distribution</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/e57b0237d55c8bf8dbbc8afd3609862475c98a83c12ad556a9327e7e46f2422e.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/843545e9d02f4ef3ca7ace65cdc8de5be518086a57d384dd54180d7d02cab4a3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Top holders incentives</strong>: Given the perspective of the protocol becoming deflationary in the mid/long term, top holders are incentivized to hold their tokens in order to then be able to profit from a passive source of profit due to the 
mechanical price increase of the MLN token.</p><h3 id="h-sources" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Sources</h3><p><strong>Protocol documentation and additional information sources</strong></p><ul><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://docs.enzyme.finance/">https://docs.enzyme.finance/</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://specs.enzyme.finance/">https://specs.enzyme.finance/</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://enzyme.finance/">https://enzyme.finance/</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/orgs/enzymefinance/repositories?type=all">https://github.com/orgs/enzymefinance/repositories?type=all</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://medium.com/@avantgardefi">https://medium.com/@avantgardefi</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://medium.com/enzymefinance">https://medium.com/enzymefinance</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://medium.com/@meloncouncil">https://medium.com/@meloncouncil</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://messari.io/asset/enzyme-finance">https://messari.io/asset/enzyme-finance</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://app.enzyme.finance/discover/assets">https://app.enzyme.finance/discover/assets</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.youtube.com/@enzyme9189">https://www.youtube.com/@enzyme9189</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://snapshot.org/#/enzymefinance.eth">https://snapshot.org/#/enzymefinance.eth</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://twitter.com/enzymefinance">https://twitter.com/enzymefinance</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://discord.enzyme.finance/">https://discord.enzyme.finance/</a></p></li></ul><p><strong>Data sources</strong></p><ul><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://etherscan.io/token/0xec67005c4e498ec7f55e092bd1d35cbc47c91892?a=0xd8f8a53945bcfbbc19da162aa405e662ef71c40d">https://etherscan.io/token/0xec67005c4e498ec7f55e092bd1d35cbc47c91892?a=0xd8f8a53945bcfbbc19da162aa405e662ef71c40d</a></p></li><li><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://coinmarketcap.com/fr/currencies/enzyme/">https://coinmarketcap.com/fr/currencies/enzyme/</a></p></li></ul><p>Disclaimer: <em>All information/documents contained in this blog rely solely  on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investment and do not invest more than you can afford to loose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/f1efe655e18990300f8e5f7fd63222e21b13acb514694e2099df60f26ca61746.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[DevOPs Introduction (part3)]]></title>
            <link>https://paragraph.com/@cybergen-lab/devops-introduction-part3</link>
            <guid>kUD3o2BAbvrdkWokt67U</guid>
            <pubDate>Thu, 01 Dec 2022 16:28:11 GMT</pubDate>
            <description><![CDATA[Hey there, nice to see you back for this third part of our DevOps tutorial series! 😃 Alright so today the roadmap will be a bit longer than usual so hang tight and don’t hesitate to make breaks between the different parties if necessary 😉 Tutorial roadmap:Creating a new PostgreSQL user and our databaseCreating our tables using python scriptsPerforming the initial data input of our tables with pythonPart 1: Creation of our Postgresql databaseAlright so first thing first we need to create a n...]]></description>
            <content:encoded><![CDATA[<p>Hey there, nice to see you back for this third part of our DevOps tutorial series! 😃</p><p>Alright so today the roadmap will be a bit longer than usual so hang tight and don’t hesitate to make breaks between the different parties if necessary 😉</p><p><strong>Tutorial roadmap:</strong></p><ul><li><p>Creating a new PostgreSQL user and our database</p></li><li><p>Creating our tables using python scripts</p></li><li><p>Performing the initial data input of our tables with python</p></li></ul><h2 id="h-part-1-creation-of-our-postgresql-database" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 1: Creation of our Postgresql database</h2><p>Alright so first thing first we need to create a new PostgreSQL database in which we will develop our data architecture with our different tables. So let’s do just that by ssh into our existing DO droplet.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7e0d9e1e723e0410d3cf9100d66d8cc0dc3ea4eef3bb8654c2cac4625a482b7d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there given the default installation of PostgreSQL in Ubuntu we have to switch from the root user to the postgres user, using the following command in order to be able to access our PostgreSQL instance.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f685a7c9c96b1f29d501d77ce30912a3a02223ae3191117d59547820d0de0420.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now let’s create our dedicated postgres user that we’ll use to access remotely our database. To do so use the following command and fill up the prompt as shown below:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8ffefe2e9205450f20b348739e0de2c1ede6e0c62dd4dfe8ba75926935bf843f.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/ac072b214caea4bb5ee4c4d7a87bc1a7551518a925bcb01608a9c36246ff87ab.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><em>Note:</em> Ideally it would be best to avoid creating our new user as a superuser given that we’ll use it later to connect our database to Grafana. 
However, given that this is just an introductory tutorial, we’ll consider a superuser good enough from a security standpoint, since the security stakes here are negligible.</p><p>To check that your new user has indeed been created, you can use the Postgres shell as follows:</p><p>Type psql in your terminal to access the Postgres shell as shown below.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/445dab92710ee36accf2893e4c822b82b5189fa2d3dd97b739e4a875cce227ea.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there, type the following command:</p><pre data-type="codeBlock" text="$ \du
"><code><span class="hljs-meta prompt_">$ </span><span class="bash">\<span class="hljs-built_in">du</span></span>
</code></pre><p>From there you should be met with a table of the following format, inside of which you’ll see your new user.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/cccfe0fa82c0822db3179f079496604b96a6393eebb5859c0e029e6acfc07409.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so from there exit the Postgres shell using the following command:</p><pre data-type="codeBlock" text="$ \q
"><code>$ \<span class="hljs-selector-tag">q</span>
</code></pre><p>Then, still using the postgres user, type the following command to create a new database called “ethdb” owned by our newly created user (the -O flag sets the database owner):</p><pre data-type="codeBlock" text="$ createdb -O myuser ethdb
"><code><span class="hljs-variable">$ </span>createdb -O myuser ethdb
</code></pre><p>Alright, and that’s it for our first part: we now have our database and a dedicated user to access it, so take a quick break if needed before jumping to part 2.</p><h2 id="h-part-2-creation-and-initialisation-of-our-metrics-tables-using-python-scripts" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 2: Creation and initialisation of our metrics tables using python scripts.</h2><p>Alright so now let’s create the tables that we’ll use to store the network metrics that we want.</p><p>⚠️ ⚠️ Don’t forget to come back to the root user before continuing by using the following command:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/38766333e961edfbf60c0a4cde3b8b762009edba62bed0bee5c5b78292639596.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so from there, use the following command to list all the available folders at the level where we currently are:</p><pre data-type="codeBlock" text="$ ls
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">ls</span></span>
</code></pre><p>Normally you should see the following:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/36de84a461a351f83ed9380b8a0b8d76d256e98a5eda94d04e22577f7e709062.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>So from there, use the following command to go one level higher up in the folder architecture of the server:</p><pre data-type="codeBlock" text="$ cd ..
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">cd</span> ..</span>
</code></pre><p>There if you use the previous “ls” command you should get the following result:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/1b2ba6f0098e2b9a6a381a713ff6417a4cd4826f3ed978911567083c97445078.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now that we’re at the right level let’s start by first creating the folder in which we’ll store our python script. To do so, use the following mkdir (make directory) command:</p><pre data-type="codeBlock" text="$ mkdir tutorials
$ mkdir tutorials/introduction 
$ mkdir tutorials/introduction/code 
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">mkdir</span> tutorials</span>
<span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">mkdir</span> tutorials/introduction</span> 
<span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">mkdir</span> tutorials/introduction/code</span> 
</code></pre><p>Now let’s enter our introduction folder:</p><pre data-type="codeBlock" text="$ cd tutorials/introduction
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">cd</span> tutorials/introduction</span>
</code></pre><p>From there, let’s create two additional text files:</p><ul><li><p>A ReadMe.txt file in which we’ll present an overview of our project</p></li><li><p>A Requirements.txt listing all the packages and version requirements needed to run this project</p></li></ul><p>To create those two files we’ll use the “touch” command, as follows:</p><pre data-type="codeBlock" text="$ touch ReadMe.txt 
$ touch Requirements.txt
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">touch</span> ReadMe.txt</span> 
<span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">touch</span> Requirements.txt</span>
</code></pre><p>Normally after all that you should end up with an “introduction” folder looking like the below snippet</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/3f6d99a8cbd6f5e14a13264de77f33dee67784852bb93843aa9e354915c35214.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now that we’re all set from a project architecture standpoint, let’s start creating our PostgreSQL tables.</p><p>Here given the limitation of the Santiment API provider enumerated in our first tutorial it appears better to create the four following tables:</p><ul><li><p>historical_metrics: Storing all historical prices and marketcap up to today</p></li><li><p>network growth: Storing all historical network growth values up to last month</p></li><li><p>daily_metrics: Storing daily value for the metrics where we only want to show in our dashboard the latest current value</p></li><li><p>delayed_metrics: Same as daily metrics table but for metrics where we only have access to past month values</p></li></ul><p>In order to create this data architecture, let’s get back to our terminal and enter into our code folder.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/45a9bf7e33409ab783a29858d3f2feee1284608e84a8ef5b1b590d699301388a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><em>Note:</em> Depending on where you are inside your folder architecture the command might changed so keep in mind that to enter inside a folder at a particular level you only need to use the following command:</p><pre data-type="codeBlock" text="$ cd [name of the folder you want to enter]
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">cd</span> [name of the folder you want to enter]</span>
</code></pre><p>and if you want to go back up one level, just use the following command:</p><pre data-type="codeBlock" text="$ cd ..
"><code><span class="hljs-meta prompt_">$ </span><span class="bash"><span class="hljs-built_in">cd</span> ..</span>
</code></pre><p>Alright so once you’re inside the code folder, go ahead and create a new python file using the following command:</p><pre data-type="codeBlock" text="$ nano init.py
"><code>$ nano init.py
</code></pre><p>From there you’ll end up in the nano text editor, which, alongside Vim, is one of the most common text editors used on Linux.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/6069c76ce07402dcbc0d54bf6c2426ed021d601167b1607eb09134957c46af2f.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there, go to the Github page of the tutorial and copy and paste the content of the init.py file inside the nano editor.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a807ca7d8eae1d162ea6377f97e1759eb35cba95a081590feabd876595502d9d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Let’s unpack the code that we used here a little bit.</p><p>First we import the psycopg2 python package, which enables us to connect to our Postgres instance through Python.</p><p>Then we define the create_tables() function, composed of two blocks through which we’ll create each of our tables.</p><ul><li><p>Block 1: We create a command variable composed of usual SQL commands in order to create each of our tables with their respective metrics.</p></li><li><p>Block 2: We use a try/except block in order to try to connect to our local PostgreSQL instance and create a cursor that we’ll then use to execute the content of our command variable.</p></li></ul><p>Finally, we run our function in order to create our data architecture in our PostgreSQL database.</p><p>Alright so now that we know what the code is doing, let’s run it on our remote server. To do so, first save all modifications made in the nano text editor by pressing ctrl + s and then exit by pressing ctrl + x.</p><p>Once back in the code folder, install the psycopg2 package with the following command:</p><pre data-type="codeBlock" text="$ pip install psycopg2
"><code><span class="hljs-variable">$ </span>pip install psycopg2
</code></pre><p>Then from there we have to make our script executable. To do so we’ll use the following command:</p><pre data-type="codeBlock" text="$ chmod +x init.py
"><code>$ chmod <span class="hljs-operator">+</span>x init.py
</code></pre><p>Normally if everything went well you should see the name of your init.py file appearing in green. From there, just type the following command to run your python script in your remote server:</p><pre data-type="codeBlock" text="$ ./init.py 
"><code>$ ./init.py 
</code></pre><p>Alright, so normally if everything worked you should not receive any message once the script has finished running. However, it’s always better to double check, so, reusing the same method as before, we can easily verify that those tables were indeed created with the following sequence of commands:</p><pre data-type="codeBlock" text="$ su - postgres 
$ psql 
#We enter here into the Postgres shell 
postgres=# \c ethdb 
ethdb=# \dt
"><code>$ su - postgres 
$ psql 
<span class="hljs-comment">#We enter here into the Postgres shell </span>
postgres=<span class="hljs-comment"># \c ethdb </span>
ethdb=<span class="hljs-comment"># \dt</span>
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/d65ea84d01bf591b835f503badb5ab7109e510420005eb8f1d755757e57d6b6d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><em>Note:</em></p><ul><li><p>The “\c ethdb” command is a psql command allowing to connect to the ethdb database</p></li><li><p>The “\dt” command is a psql shortcut enabling to list all available tables</p></li><li><p>Here the “user1” mentioned comes from the fact that I used another user name tag when creating this tables</p></li></ul><p>Alright so now that we have our table let’s go ahead and populate them.</p><h2 id="h-part-3-initial-data-populating-of-our-psql-tables" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 3: Initial data populating of our psql tables</h2><p>As before for the creation of our psql tables, we’re also going to use python scripts. However, this time we’re going to have to make API calls to Santiment API through reusing the functions that we previously created in our first tutorial.</p><p>As such before creating our init.py script let’s create before a new python script called functions.py using the following command inside the code folder:</p><pre data-type="codeBlock" text="$ nano functions.py 
"><code>$ nano functions.py 
</code></pre><p>And there just copy and paste the content of the function file that you can find <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro">here</a> on the tutorial Github repo.</p><p>Also don’t forget to make it executable by using as before the following command:</p><pre data-type="codeBlock" text="$ chmod +x functions.py
"><code>$ chmod <span class="hljs-operator">+</span>x functions.py
</code></pre><p>Alright, so now let’s move on to our data populating scripts per se.</p><h3 id="h-script-1-historicalmetric-init" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Script 1: historical_metrics init</h3><p>Let’s start with our historical_metrics table. First we create a dedicated python script that we’ll call table_init.py:</p><pre data-type="codeBlock" text="$ nano table_init.py 
"><code>$ nano table_init.py 
</code></pre><p>As before, copy and paste here in nano the content of the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/table_init.py">table_init</a> file available on Github.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7cb1805eb91e82f875f922ed4272c7ad19c0eb9f523f2826620920f3d00c4de0.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Let’s unpack our code here a little bit, given that a lot is going on at the same time 😅</p><p>First, as you can see in the code, we import the functions that we need from our functions.py file, as well as the psycopg2 package allowing us to connect to our local psql instance and other python packages giving us access to date variables.</p><p>Second, we create the update_db() function enabling us to populate our historical_metrics table for each date.</p><p>Third, we create the populate_table() function, which calls our “get_historical_price” and “get_historical_marketcap” functions to get all the data points for the past 6 months and uses our previous update_db() function on each of those data points to populate our psql table.</p><p><em>Note:</em> As you can see below, we also need to do a bit of formatting on our data, given that the data delivered by the Santiment API in python is not compatible with our integer psql type.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/3ca3e884fc003b0a1a95a2e7acd5abe5ae514e023a2a1f3b019224bc6b8c2bc3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so now let’s run this python script. As per usual, make it executable and run it with the following sequence of commands:</p><pre data-type="codeBlock" text="$ chmod +x table_init.py
$ ./table_init.py
"><code>$ chmod <span class="hljs-operator">+</span>x table_init.py
$ ./table_init.py
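</code></pre><p>For reference, the per-date insert that table_init.py performs through update_db() boils down to something like the following hypothetical snippet; the actual function is the one in the Github file above, and the connection parameters here are again placeholders.</p><pre data-type="codeBlock"><code># Hypothetical sketch of the per-date insert performed by update_db();
# the reference implementation is table_init.py in the tutorial repo.
import psycopg2

def update_db(day, price, marketcap):
    conn = psycopg2.connect(host="localhost", dbname="ethdb",
                            user="myuser", password="mypassword")  # placeholders
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO historical_metrics (day, price, marketcap) VALUES (%s, %s, %s)",
        # Santiment returns floats, hence the casts before storing into integer columns
        (day, int(price), int(marketcap)),
    )
    conn.commit()
    cur.close()
    conn.close()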
</code></pre><p>If everything goes smoothly at the end of the process we’ll be met with the following prompt message:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5b758eac91712665baab31fdc62bf634f17d1b7e930c2fb116f998ad85f33225.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>If you want to double check the code execution, run the following sequence of commands:</p><pre data-type="codeBlock" text="$ su - postgres
$ psql 
postgres=# \c ethdb
ethdb=# select * from historical_metrics; 
"><code>$ su <span class="hljs-operator">-</span> postgres
$ psql 
postgres<span class="hljs-operator">=</span># \c ethdb
ethdb<span class="hljs-operator">=</span># select <span class="hljs-operator">*</span> <span class="hljs-keyword">from</span> historical_metrics; 
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/864e0fb60af20652d8a2c87eadd6ab18197bb7fbb1958bc3036475b615f8c02a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><em>Note:</em> To exit that view press q</p><h3 id="h-script-2-networkgrowth-init" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Script 2 : network_growth init</h3><p>Alright so here using the same process, previously employed in the case of table_init.py we create a new python_script to populate our network_growth table called table_init1.py and copy and paste the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/table_init1.py">table_init1.py</a> file from Github:</p><pre data-type="codeBlock" text="$ nano table_init1.py 
"><code>$ nano table_init1.py 
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/82816429360c6f624ae1521f2f4e641b97bc729d34d0375b69e466e342cfe6b4.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>The code employed here is pretty much the same as the one used before in the case of table_init slightly modified to get the network growth value and take into account the fact that we can only get thet data as far as last month.</p><p>From there as per usual we make that file executable and run it.</p><h3 id="h-script-3-dailymetrics-init" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Script 3: daily_metrics init</h3><p>Alright so from there we’ll go a bit quicker in our explanation given that we’ll again use the same process.</p><p>So create here a file called table_init2.py, copy and paste the content from the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/table_init2.py">corresponding Github file</a> into nano and from there make the script executable and run it.</p><h3 id="h-script-4-delayedmetrics-init" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Script 4: delayed_metrics init</h3><p>Create a file called table_init3.py, copy and paste the content from the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro/blob/main/Code/table_init3.py">corresponding Github file</a> into nano, make the script executable and run it.</p><p>And that’s it!! 🥳🥳</p><p>We finally have a full data architecture into our DO server, with a database and a set of tables storing the value of our different metrics so congrats if you came as far as this, it was definitley not easy so give yourself a well deserved pat on the back 😃</p><p>Next time we’ll build onto our work to see how to create scripts that populate automatically our tables on a daily basis and also how to connect our psql database to grafana in order to create our ethereum dashboard, so stay tuned for the next and final part of this tutorial serie and in the meantime happy coding!</p><p>See you next time 🖖</p><h3 id="h-link-to-tutorial-github-repo" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Link to tutorial Github repo:</h3><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro">https://github.com/Cybergen300/DevOps-Intro</a></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/322da7c568248a8618cec8c997e9b9676d89f26cca4c23475340aa55a092f906.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Weekly news 11/29]]></title>
            <link>https://paragraph.com/@cybergen-lab/weekly-news-11-29</link>
            <guid>j1SfHHHf4FPpKM8u1mif</guid>
            <pubDate>Tue, 29 Nov 2022 17:27:08 GMT</pubDate>
            <description><![CDATA[DeFiBlockFiBlockfi officially filed for bankruptcy on Monday, following the collapse of FTX, its main financial backer, which earlier this year rescue the company with a revolving $250 million line of credit that later on morphed into a $400 million credit facility.KilnKiln a Paris-based startup enabling retail customers to easily partake into Ethereum PoS mechanism through its newly launched staking-as-a-service product recently raised $17.6M in a series A funding round led by Consensys, GSR...]]></description>
            <content:encoded><![CDATA[<h2 id="h-defi" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">DeFi</h2><h3 id="h-blockfi" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">BlockFi</h3><p>Blockfi officially filed for bankruptcy on Monday, following the collapse of FTX, its main financial backer, which earlier this year rescue the company with a revolving $250 million line of credit that later on morphed into a $400 million credit facility.</p><h3 id="h-kiln" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Kiln</h3><p>Kiln a Paris-based startup enabling retail customers to easily partake into Ethereum PoS mechanism through its newly launched staking-as-a-service product recently raised $17.6M in a series A funding round led by Consensys, GSR and Kraken’s venture-capital arm in order to continue the development of its current platform.</p><h3 id="h-wbtc" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">WBTC</h3><p>Following the spreading of a rumour claiming that Alameda Research was among the top wrapped Bitcoin merchants and custodiers, WBTC slightly depegged prompting Bitgo CEO to officially debunk the news on Twitter by exhibiting available on-chain data. Still, at the time of writing the peg still hasn’t been fully restored.</p><h2 id="h-web3" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Web3</h2><h3 id="h-bizi" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Bizi</h3><p>BIZI LABS, the Swiss based mobile platform allowing people to easily access Web3, recently announced that it will integrate the Polygon network into its flagship Web3 partner smartphone brand, ZMBIZI making it the first smartphone combining built-in Web3 features, user rewards, and multi chain functionality.</p><h3 id="h-consensys" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Consensys</h3><p>Consensys is currently facing community backlash following the use of a misleading wording concerning users data collection in its recent update of Metamask’s privacy policy which led users to start doubting the claim of fair data usage defended by Consensys and Infura regarding data collection in Infura supported applications.</p><h3 id="h-carv" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Carv</h3><p>Carv a Los-Angeles based start up developing a brand new decentralised identity solution for gaming applications enabling users to move freely between applications and games recently announced the closing of a $4M seed funding round led by Singaporean-based venture capital firm Veterex.</p><h2 id="h-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosystem</h2><h3 id="h-us-regulators" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">US regulators</h3><p>According to a recent report from Barron, several US state regulators are looking into whether crypto trading firm Genesis Global Capital may have violated securities laws with a specific focus on unregistered selling of crypto securities investment.</p><h3 id="h-japanese-regulators" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Japanese regulators</h3><p>The Financial Services agency (FSA) more commonly known as the top japanese financial regulator recently issued a call for DeFi players to take part in a new “fact-finding survey” on the sector signalling for many a willingness of japanese authority to start regulating the sector some time next year.</p><h2 id="h-market" 
class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Market</h2><ul><li><p>BTC     $16,230.82</p></li><li><p>ETH $1,171.9</p></li><li><p>BNB $292,95</p></li><li><p>ADA $0,30</p></li><li><p>SOL. $13,42</p></li></ul><h3 id="h-onomy" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Onomy</h3><p>Despite the current turbulent market conditions, Onomy a Cosmos blockchain based ecosystem aiming to merge decentralised finance and the foreign exchange market to bring the latter on-chain recently an additional $10M funding from Bitfinex and AVa Labs.</p><h3 id="h-ardana" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ardana</h3><p>Ardana, a once-promising decentralized finance project built on top of Cardano, has halted development citing funding and project timeline uncertainty as the main reason for the closing of the project.</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/0677e02b21d3dc301f77a3449947ea471ade8aca3ecaaff341c201ddd9b2c4b7.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[DevOps Introduction (part 2)]]></title>
            <link>https://paragraph.com/@cybergen-lab/devops-introduction-part-2</link>
            <guid>pBefIiAgVcLipTOTUC1i</guid>
            <pubDate>Wed, 16 Nov 2022 17:16:03 GMT</pubDate>
            <description><![CDATA[Hey there, good to see you back on this second part of our DevOps Intoduction series😃 Today’s roadmap we’ll be the following:Creation of our Digital Ocean Cloud serverInstallation of PostgreSQL on our cloud serverPart 1. Creation of a Digital Ocean (DO) cloud server.Alright, so first let’s get set up with Digital Ocean. If you don’t have a pre-existing DO account, go here to set it up. You’ll need to input your credit card info but usually there are available promo deals offering up to $200 ...]]></description>
            <content:encoded><![CDATA[<p>Hey there, good to see you back on this second part of our DevOps Intoduction series😃</p><p>Today’s roadmap we’ll be the following:</p><ul><li><p>Creation of our Digital Ocean Cloud server</p></li><li><p>Installation of PostgreSQL on our cloud server</p></li></ul><h2 id="h-part-1-creation-of-a-digital-ocean-do-cloud-server" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 1. Creation of a Digital Ocean (DO) cloud server.</h2><p>Alright, so first let’s get set up with Digital Ocean. If you don’t have a pre-existing DO account, go <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://cloud.digitalocean.com/registrations/new">here</a> to set it up. You’ll need to input your credit card info but usually there are available promo deals offering up to $200 worth of credit for new users for the first few months. Additionally, given the really small workload that we’ll use even without any promo plan, it wil not cost us more than $5 to perform this tutorial.</p><p>Cool, so right after onboarding you should be met with this initial screen, presenting the current state of our project.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/c6ae418c1eee9a6079fecc3cc3e777b74e7d7d7f98af3a13b111b93e8f666d5f.png" alt="Digital Ocean homepage" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Digital Ocean homepage</figcaption></figure><p>So first, let’s customise it a little bit by clicking on settings and modifying our project detail as follows:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/63e16eb5d18213e3d8d05bbe23d4c08b64c21e4e98694e8bd0a30b26dc5be698.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so now that we cleaned up a bit our project let’s move on to create our cloud server also called droplet inside the Digital Ocean ecosystem by simply clicking on the droplet tab inside the manage section.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/523409ce287e0544887ca1ee30bb02b89568897dbf559a1a83a1ed8e7c08dadd.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there click on create droplet and wait to get redirected to the droplet creation page.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a3fbb29b561a678563d16ddcf39a1fdf43f1f1af75968697ad82ec2941c65c20.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so here as you can see through scrolling the page we have a lot of option. 
However given the pretty basic project that we’re aiming to realise there is no need here to select ultra fancy parameters. As such we;’ll use the following parameters:</p><ul><li><p>Distribution: Ubuntu 22.10 x64</p></li><li><p>Plan: Basic Plan (Shared CPU)</p></li><li><p>CPU options: Regular with SSD ($6/mo)</p></li><li><p>Datacenter Region: Choose the one closer to your location</p></li><li><p>Authentication: Password</p></li><li><p>No additional options</p></li><li><p>How many Droplets: 1</p></li><li><p>Choose a hostname : DevOps-Intro (or any name of your liking)</p></li></ul><p>Once everything is set click on create droplet and wait for it to be created by the platform.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/06883707b88fcf1e2a8e6cca9b20f7ccd79a32d9c5b8408f5d0c3b853d2dd87b.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h2 id="h-part-2-installation-of-postgresql-in-do-server" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Part 2. Installation of PostgreSQL in DO server</h2><p>Great, so now that we have an up and running cloud server, let’s access it in order to install PostgreSQL on it. To do so, let’s open a terminal window on our computer.</p><p>➡️ <em>If you’re on Mac go in the finder, browse to application and then click on the Utilities folder and there double click on the Terminal application. If you’re on window, just open the start menu and type cmd before pressing enter.</em></p><p>From there we’re going to ssh into our server. To do so just copy and paste the IP of your droplet shown here:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/49ecf3be094f821ea040d716bb17f99c5274b6b0bb33b9feaa0a4bc5a3ade913.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>And type the following command into your terminal:</p><pre data-type="codeBlock" text="ssh root@(YOUR DROPLET IP)
"><code>ssh <span class="hljs-symbol">root@</span>(YOUR DROPLET IP)
</code></pre><p>If you’re prompted with a message about the authenticity of the host, type yes in your terminal and press Enter. Right after that you’ll be asked for your droplet password, so again just type it into your terminal and press Enter.</p><p>From there, if everything goes well, you should be greeted by the following welcome message from your Ubuntu cloud server.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/4ece620182d0825597e30d3bcc9804ec440736ea1e26ceb11a970c09a5ed3d33.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so now that we’re inside our server let’s install PostgreSQL. To do so, just copy and paste the following set of commands into your terminal:</p><p>Command 1: Create the file repository configuration:</p><pre data-type="codeBlock" text="sudo sh -c &apos;echo &quot;deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main&quot; &gt; /etc/apt/sources.list.d/pgdg.list&apos;
"><code>sudo sh <span class="hljs-operator">-</span>c 'echo <span class="hljs-string">"deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main"</span> <span class="hljs-operator">></span> <span class="hljs-regexp">/etc/</span>apt<span class="hljs-regexp">/sources.list.d/</span>pgdg.list'
</code></pre><p>Command 2: Import the repository signing key</p><pre data-type="codeBlock" text="wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
"><code>wget <span class="hljs-operator">-</span><span class="hljs-operator">-</span>quiet <span class="hljs-operator">-</span>O <span class="hljs-operator">-</span> https:<span class="hljs-comment">//www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -</span>
</code></pre><p>Command 3: Update the package lists</p><pre data-type="codeBlock" text="sudo apt-get update
"><code>sudo apt<span class="hljs-operator">-</span><span class="hljs-keyword">get</span> <span class="hljs-keyword">update</span>
</code></pre><p>Command 4: Install PostgreSQL 12, so that we don’t run into any issues later when integrating with Grafana</p><pre data-type="codeBlock" text="apt-get install postgresql-12
"><code>apt-get install postgresql<span class="hljs-number">-12</span>
</code></pre><p><em>Note:</em> While this last command is running, the terminal might ask you to confirm it; in that case just type Y and press Enter.</p><p>Alright, now let’s just make sure that everything is operational by checking our installed psql version.</p><pre data-type="codeBlock" text="psql -V
"><code>psql <span class="hljs-operator">-</span>V
</code></pre><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/78085a38512cc0579571d16fa314c6db255c431757e1fbfccafabba9dc551f1f.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>And that’s it! You succesfully created and set up your first very own cloud server using Digital Ocean 🥳🥳</p><p>So give yourself a pat on the back and don’t hesitate to play around with this new server in order to familiarise yourself with the terminal since next time, we’ll use it intensively in order to create our database architecture.</p><p>⚠️ ⚠️</p><p>Also don’t forget to exit your remote DO server by typing the following command in the terminal</p><pre data-type="codeBlock" text="exit
"><code><span class="hljs-built_in">exit</span>
</code></pre><p>See you next time 🖖</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/322da7c568248a8618cec8c997e9b9676d89f26cca4c23475340aa55a092f906.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Weekly News 11/15]]></title>
            <link>https://paragraph.com/@cybergen-lab/weekly-news-11-15</link>
            <guid>ZVvve5YO2Ej2Q8B7Q1xo</guid>
            <pubDate>Tue, 15 Nov 2022 20:41:28 GMT</pubDate>
            <description><![CDATA[DeFiUniswapIn the wake of the recent FTX meltdown, Uniswap recently surpassed Coinbase and became the world’s second largest Ethereum trading venue with over $1 billion in ETH daily trading volume.SolanaThe recent downfall of prominent Solana backer Sam Bankman Fried continues to ripple-down on the Solana ecosystem as shows the drop by more than 50% of the SOL price and overall liquidity drop in Solana’s ecosystem over the past two weeks.DeFi LendersWhile Maple Finance narrowly escaped disast...]]></description>
            <content:encoded><![CDATA[<h2 id="h-defi" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">DeFi</h2><h3 id="h-uniswap" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Uniswap</h3><p>In the wake of the recent FTX meltdown, Uniswap recently surpassed Coinbase and became the world’s second largest Ethereum trading venue with over $1 billion in ETH daily trading volume.</p><h3 id="h-solana" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Solana</h3><p>The recent downfall of prominent Solana backer Sam Bankman Fried continues to ripple-down on the Solana ecosystem as shows the drop by more than 50% of the SOL price and overall liquidity drop in Solana’s ecosystem over the past two weeks.</p><h3 id="h-defi-lenders" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">DeFi Lenders</h3><p>While Maple Finance narrowly escaped disaster after closing all loans to Alameda in September, fear of contagion in the sector following the demise of FTX and Alameda Research still remain present in the market. Indeed, despite the already known $12.4M liabilities owed to a variety of corporate lenders (Apollo Capital, Clearpool, …), many market participants fear the potential unveiling of additional hidden liabilities in the coming weeks.</p><h2 id="h-web3" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Web3</h2><h3 id="h-sarcophagus" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Sarcophagus</h3><p>Sarcophagus the famous dead man’s switch dApp developped on top of the Arweave network, recently reiterated its December mainnet release date while unveiling a new live demo of it’s faster and more user friendly V2 version leveraging the Bundlr Network.</p><h3 id="h-kwil" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Kwil</h3><p>Kwil, best known for its decentralised SQL development on Web3 recently announced a new partnership with BlockVision in order to help it decentralised its structured data storage architecture.</p><h3 id="h-trust-wallet" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Trust Wallet</h3><p>Trust Wallet, a major self-custodial and multi-chain wallet provider recently announced the launch on Chrome, Brave and Opera of its brand new browser extension wallet supporting Solana and all EVM chains.</p><h2 id="h-ecosytem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosytem</h2><h3 id="h-defi-regulation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">DeFi regulation</h3><p>Earlier this week, in an attempt to fill-up the void left by Sam Bankman Fried, Binance’s CEO Changpeng Zhao, took the stage at Blockbali to voice his stance on the nedeed sector regulation and indirectly propose a new industry partner to US regulators.</p><h2 id="h-market" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Market</h2><ul><li><p>BTC     $16,833.56</p></li><li><p>ETH     $1253,67</p></li><li><p>BNB     $275.36</p></li><li><p>ADA     $0.33</p></li><li><p>SOL     $14.30</p></li></ul><h3 id="h-sino-global" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Sino Global</h3><p>Sino Global, a blockchain and digital assets-focused investment firm, closely tied to SBF and directly owned by Alameda Research recently announced that despite having suffered mid-seven figures losses the company will nonetheless be able to continue its operations.</p><h3 id="h-ikigai" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 
first:!mb-0">Ikigai</h3><p>As confirmed by its founder, california-based hedge fund Ikigai Asset Management had a large majority of its assets on defunct crypto exchange FTX and will consequently most likely be forced to scale-down or stop all operations.</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/0677e02b21d3dc301f77a3449947ea471ade8aca3ecaaff341c201ddd9b2c4b7.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Dash 2 Trade Overview ]]></title>
            <link>https://paragraph.com/@cybergen-lab/dash-2-trade-overview</link>
            <guid>cIJBu7HfUxJtYHVWG6A0</guid>
            <pubDate>Sun, 13 Nov 2022 14:32:43 GMT</pubDate>
            <description><![CDATA[Advertised as the latest initiative of the Learn 2 Trade team, Dash 2 Trade is a new crypto analytics and social trading platform aiming to help seasoned and casual investors improve their trading performance through the leveraging of proprietary analytical features built on top of a user friendly trading dashboard. Dash 2 Trade advantagesTrading signals providing users with early buy & sell market opportunitiesSocial sentiment and on-chain analysis features helping users to spot trending coi...]]></description>
            <content:encoded><![CDATA[<p>Advertised as the latest initiative of the Learn 2 Trade team, Dash 2 Trade is a new crypto analytics and social trading platform aiming to help seasoned and casual investors improve their trading performance through the leveraging of proprietary analytical features built on top of a user friendly trading dashboard.</p><p><strong>Dash 2 Trade advantages</strong></p><ul><li><p>Trading signals providing users with early buy &amp; sell market opportunities</p></li><li><p>Social sentiment and on-chain analysis features helping users to spot trending coins</p></li><li><p>Custom building tools helping users to create or adopt new trading strategies</p></li><li><p>Bespoke scoring system for available and upcoming crypto presales</p></li><li><p>Listing alerts on new listing announcements</p></li><li><p>Incentivised intelligence sharing mechanisms among users</p></li></ul><h2 id="h-project-architecture" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Project Architecture</h2><p>From a design standpoint, Dash 2 Trade presents itself as a three-tiered trading subscription platform:</p><p><strong>Free</strong> <strong>Tier</strong> — Full access to data terminal and general community channels but limited access to protocol metrics</p><p><strong>Starter</strong> <strong>Tier</strong> — Full access to basic protocol tools &amp; features as well as to members only Discord channels.</p><p><strong>Premium</strong> — Full access to premium protocol features and possibility to take part into incentivised quarterly trading competition.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/12b1cb5824493416f6aa3f2f84a515ee8d157e2f1d0ec709bcc353c3c93d5336.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Interesting to note here, that instead of creating a novel monetization design leveraging staking and other cryptoeconomic features, Dash 2 trade is simply reusing a token denominated version of the usual pricing scheme used by Web2 tech companies (AWS, Google, …). As such, in the current state of things active users may very well, in the future, suffer from price variations on their monthly subscription due to the D2T price volatility.</p><p><strong>Observation:</strong> A design denominating price membership in FIAT currency and enforcing the payment of said membership fee in the D2T token thanks to the leveraging of an Oracle solution such as Chainlink would have maybe appear better suited from a design standpoint in order to avoid price membership volatility from one month to another.</p><h2 id="h-project-features" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Project Features</h2><h3 id="h-unique-social-indicators" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Unique social indicators</h3><p>Social signals are among the most important valuation factors in the crypto space as most investments in the space tends to be driven first by market perception. 
Consequently, the Dash 2 Trade platform offers several social metrics tracking various aspects of market sentiment, such as:</p><ul><li><p>Token/Project related activity on social media (Reddit, Twitter, …)</p></li><li><p>Overall sentiment on a network through the leveraging of NLP algorithms</p></li><li><p>Dev activity (Github commits, …)</p></li></ul><h3 id="h-bespoke-scoring-system-for-presales-and-icos" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Bespoke scoring system for presales and ICOs</h3><p>Commonly viewed as a community-powered seed funding phase, presales often provide interesting entry point opportunities on valuable projects. As such, through its dashboard, Dash 2 Trade provides dedicated overviews of available token presales, including report analysis and a custom Dash score assessing the general quality of the project.</p><h3 id="h-strategy-builder-and-backtester-tools" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Strategy Builder and backtester tools</h3><p>Developing trading strategies usually requires extensive CAPEX allocation, due to the need for live experimentation on top of the basic historical simulation operated on past data points. As such, Dash 2 Trade developed a suite of strategy-building tools relying on Dash 2 Trade metrics, as well as a complete backtesting platform mirroring live market conditions, in order to help users easily and cheaply develop and assess the efficiency of new trading strategies.</p><h3 id="h-trading-tips" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Trading tips</h3><p>Thanks to the myriad of metrics Dash 2 Trade is able to collect from its users’ activity, the platform’s internal algorithm can analyse each member’s trades and compare them to other similar strategies with a better ROI in order to deliver personalised advice on future trades and strategies.</p><h3 id="h-listing-alerts" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Listing Alerts</h3><p>New exchange listing announcements can result in significant price changes for a token. 
As such, for D2T subscribers, immediate access is provided to new crypto listings identified by the Dash 2 Trade platform, offering every D2T holder an opportunity to capitalise on any upcoming listing.</p><h3 id="h-on-chain-analytics" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">On-chain analytics</h3><p>Dash 2 Trade also serves on-chain statistics for token and wallet activity, allowing D2T holders to easily monitor whale movements and market maker dynamics across different blockchains.</p><h3 id="h-social-trading" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Social trading</h3><p>In the future, Dash 2 Trade will also be hosting a number of social trading features and trading competitions for its premium tier subscribers, in order to incentivise intelligence sharing across the community and consequently improve the overall level of the strategies used among premium-tier members.</p><h2 id="h-governance" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Governance</h2><p>N/A (Currently under development as per the project roadmap)</p><h2 id="h-cryptoeconomics" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Cryptoeconomics</h2><h3 id="h-token-utility" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token utility</h3><p>The Dash 2 Trade analytics dashboard operates on top of the Ethereum blockchain and uses the D2T ERC-20 token as its native digital asset. For now, the utility of this native token appears rather limited, given that the D2T token is solely used to pay the Dash 2 Trade subscription fee. However, given that the governance design is still under development, it is important not to draw any early conclusions as to the core necessity of the token in the overall project: once the governance model is unveiled, we could see staking mechanisms appear that grant governance rights inside the protocol ecosystem.</p><p>Furthermore, it’s also important to note that no transaction tax is enforced by the protocol, due to the core team’s belief that value should ultimately originate from the project and not from people’s transactions.</p><p><strong>Observation:</strong> It is important to flag that the currently available documentation does not really explain where the subscription fees paid by users ultimately end up and how those tokens get reused by the protocol (burning, selling on the secondary market, …). 
This latter point being a bit worrying for potential investors given that depending on the team’s choice, this cryptoeconomic flow could create a built-in upward or downward pressure on the D2T price.</p><h3 id="h-token-allocation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token allocation</h3><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7aec2e12d22731092c49f3ffde7d7dbe52c90783d38ea36b32e15d80e6c82fc3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h2 id="h-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosystem</h2><h3 id="h-dash-2-trade-team" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Dash 2 Trade team</h3><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/duco-van-rossem-85700a25/">Duco Van Rossem</a>, CPO</p><p>Extensive prior experience in quantitative market research at JP Morgan as well as in other various sectors such as machine learning and Big Data.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/andr%C3%A9-ferreira-b89a4981/">André Ferreira</a>, Development Lead</p><p>Prior experience in a broad range of tech related roles (Product owner, UX consultant, …) with as strong background in social media and digital marketing.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/gqorlando/">Orlando Gutierrez</a>, Head Trader</p><p>Head Trader of the Learn 2 Trade platform with a strong trading background on broad range of financial markets.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/jarmo-van-de-seijp/">Jarmo van de Seijp</a>, Web3 Engineer</p><p>Strong background in Web development and Cybersecurity and prior experience in a variety of Web3 ventures.</p><h3 id="h-current-partners" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Current Partners</h3><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7ef02a7473d789e82a8cbe9b84c14e28729f18306b12a61a77f15f2e4b4ba4b5.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Note:</strong> Important to highlight that Bullbear.cc is another data analytics venture co-founded by Duco Van Rossem the actual CPO of Dash 2 Trade.</p><h3 id="h-current-state-of-the-presale" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Current state of the presale</h3><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/6119310dee94ff9061929bcdf3bc2f857c6dee329d885565da9ded621329b0a3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" 
class="hide-figcaption"></figcaption></figure><h2 id="h-roadmap" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Roadmap</h2><p><strong>Q4 2022, VC Investment</strong></p><ul><li><p>Create founding team and governance (still in progress)</p></li><li><p>Development kick-off</p></li><li><p>Trader partnerships</p></li><li><p>Presale launch of D2T token</p></li><li><p>Full Security Audit</p></li></ul><p><strong>Q1 2023, Website Launch and Presale</strong></p><ul><li><p>Dashboard Launch and Beta Testing</p></li><li><p>Start of subscriptions giving users access to protocol features (Social metrics, technical indicators, watchlist, listing alerts, …)</p></li><li><p>CEX and DEX listing for D2T token</p></li></ul><p><strong>Q4 2023, Backtester</strong></p><ul><li><p>Backtester</p></li><li><p>Risk profiler</p></li><li><p>Trading competition launch</p></li><li><p>Trader AMA’s</p></li><li><p>More CEX partnerships, integration of D2T platform</p></li></ul><p><strong>Q2 2024, Auto-trader</strong></p><ul><li><p>Auto-trader launch</p></li><li><p>Trading competition launch</p></li><li><p>Integration with CEX API’s</p></li><li><p>Competition launch</p></li><li><p>Trader AMA’s</p></li><li><p>Dash 2 Trade Terminal</p></li><li><p>Tradelab</p></li><li><p>Mobile App</p></li><li><p>Widgets</p></li></ul><p><strong>Q4 2024 Social &amp; Copy trading functionality</strong></p><ul><li><p>Charting tools</p></li><li><p>D2T Social trading features</p></li><li><p>Community feedback features</p></li><li><p>Further CEX listings</p></li></ul><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Overall, as a crypto analytics trading platform, Dash 2 Trade appears like an interesting new proposition, servicing a solid one stop solution for investors willing to be able to profit easily from on-chain analytics and key network metrics without having to build their own custom data infrastructure. Furthermore, from a price standpoint with a current monthly tier fee premium around $50, the platform appears like a lucrative future alternative for retail investors hosting their own data dashboard and currently using blockchain API provider with monthly subscription ranging north of $150.</p><p>However from a core design stantdpoint, Dash 2 trade still appears to raise a bit too many flags at this stage to be considered as a finalised project. Indeed, from its unclear cryptoeconomic design failing to exhibit a true utility for the D2T token to its lack of membership price stability creating inequality among users depending on the timing of their buying of D2T tokens, Dash 2 Trade appears to need a bit more time to fully adjust its core design.</p><p>Nonetheless, before being too categorical it appears important to wait for the release of the protocol dashboard in Q1 next year and monitor the ongoing technical development related to the upcoming release of the governance design to see if the core team will tackle the several issues highlighted above and grant its D2T token additional utility functions instead of solely using it as technical prop giving a web3 feel to an overall interesting new crypto analytic platform.</p><p>Disclaimer: <em>All information/documents contained in this blog rely solely  on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investment and do not invest more than you can afford to loose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/a53bc5dc9d71adc478a8c0b3680b9f7d1bf587e161c92aec24f8324f75dd3126.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[DevOps Introduction (Part 1)]]></title>
            <link>https://paragraph.com/@cybergen-lab/devops-introduction-part-1</link>
            <guid>36pgdNEZ8vL2iWQODFRh</guid>
            <pubDate>Thu, 10 Nov 2022 20:11:27 GMT</pubDate>
            <description><![CDATA[Hey there, hope everyone is doing fine! 😃 Today we’re going to tackle the first part of our introductory DevOps tutorial that will be splitted in 4 distinct parts and through which we will learn among other things how to:Handle API calls through Python using Santiment APIHow to properly set up a coding projectHow to create and set up a distant cloud server on Digital OceanCreate and manage PostgreSQL database on a distant serverBuild-up data tunnel between an API provider and a distant data ...]]></description>
            <content:encoded><![CDATA[<p>Hey there, hope everyone is doing fine! 😃</p><p>Today we’re going to tackle the first part of our introductory DevOps tutorial, which will be split into 4 distinct parts and through which we will learn, among other things, how to:</p><ul><li><p>Handle API calls through Python using the Santiment API</p></li><li><p>Properly set up a coding project</p></li><li><p>Create and set up a distant cloud server on Digital Ocean</p></li><li><p>Create and manage PostgreSQL databases on a distant server</p></li><li><p>Build a data tunnel between an API provider and a distant data server</p></li><li><p>Connect a Grafana instance to a cloud-hosted database</p></li></ul><p>Phew! 😬</p><p>So as you can see we have quite an awful lot to do here. However, it will all be worth it, as by the end of this tutorial we will have a basic understanding of how to create a complete data architecture from scratch, so grab a cup of coffee and once you’re ready join me in the first part of this tutorial.</p><h2 id="h-project-presentation" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Project Presentation</h2><p>So first, before diving right into the technical aspects of this tutorial, let’s discuss a little bit what we’re trying to do here 🤔</p><p>Basically, the idea of this tutorial is to give us a “grand tour” of the main DevOps components and skills that are needed to create a full data infrastructure from scratch. As such, I thought that a good way to do just that would be to create the whole back-end data architecture supporting a Grafana analytical dashboard focusing on the Ethereum network and using the following metrics:</p><ul><li><p>Historical price</p></li><li><p>Volume</p></li><li><p>Token circulation</p></li><li><p>24h active addresses</p></li><li><p>Velocity</p></li><li><p>Market capitalisation</p></li><li><p>Top 10 holders holdings</p></li><li><p>Github activity</p></li><li><p>Social Volume</p></li><li><p>Dev activity</p></li><li><p>Network Growth</p></li></ul><h3 id="h-alright-so-now-that-we-have-an-idea-of-what-we-want-to-do-the-next-question-is-how-to-do-it" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Alright, so now that we have an idea of what we want to do, the next question is how to do it?</h3><p>Basically, here we’ll use the most simplistic approach in order to just give you a basic sense of how it can be done. As such, we’ll basically use the Santiment API provider in order to get our needed metrics and then store those values in a set of PostgreSQL databases that we’ll host on a Digital Ocean cloud server. 
Then, we’ll just connect those databases to a Grafana instance and simply build our Grafana dashboard from there.</p><h3 id="h-data-architecture-flow" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Data architecture flow</h3><ul><li><p><strong>Step 1</strong> — Our Python scripts stored in our Digital Ocean server make API requests to extract the needed metrics for our dashboard.</p></li><li><p><strong>Step 2</strong> — Santiment API return the metrics value that we reformat and store in our custom PostgreSQL databases hosted on Digital Ocean.</p></li><li><p><strong>Step 3</strong> — Our Grafana instance connected to those PostgreSQL tables upload the new metrics values and update accordingly our Ethereum analytical dashboard.</p></li></ul><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/83075e97c3932ada8f38aa92573206fb7a7da3f2c3e0340f7788b8545b5420ec.png" alt="Architecture Flow illustration " blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Architecture Flow illustration</figcaption></figure><h3 id="h-step-1-requesting-data-from-santiment-api" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Step 1 Requesting data from Santiment API</h3><p>Alright so first thing first before thinking about the rest of our architecture we need to create some code scripts in order to extract our needed metrics from the Santiment API provider.</p><p>Here we basically have two main possibilities to make our API requests either using curl command inside bash scripts or use the python library developed by Santiment. So even though we could dive right into bash scripting, given that the idea of this tutorial is to keep things simple we’ll instead use the handy python library at our disposal here.</p><p>(<em>For those interested, I’ll also do a bonus part of this tutorial where we’ll create the same architecture but using Bash so stay tuned if you’re interested in that</em> 😉)</p><p>So first thing first, you’ll need a python IDE, you’re free to use the one you prefer but on my end I’ll use the Spyder IDE coming with the Anaconda distribution so if you don’t have any IDE or want to have the same screen as me just download anaconda from <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.anaconda.com/products/distribution">here.</a></p><p>Alright so now that we’re set up, before being able to start coding our python API requests, we’ll need to install the python library developed by Santiment API. So to do that open your terminal and if you’re not using any virtual environment copy and paste the following command inside your command line before pressing enter.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/aace98be1d85a1affcf4dff9f352b5393bb4b0442cc1576a60ef35d18b55ccf3.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Cool, we’re now all set up to start coding and creating our API requests. So go ahead and create a dedicated folder where we’ll store the code for our project. 
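In case the install command in the screenshot above is hard to read, it most likely just installs the Santiment Python library (sanpy) from PyPI, i.e. something along the lines of:</p><pre data-type="codeBlock" text="pip install sanpy"><code>pip install sanpy
</code></pre><p>Regarding the folder structure itself: 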
Here is a basic example, but feel free to use whichever template you’re most comfortable with.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/77132fd95f1143f60ed6a9945bd546cd821e5ce82a4a6fec80c1a5e9276cbd75.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there, let’s open our python IDE and create our first code script that we’ll call function.py, in which we’ll code as functions all the API requests that we need to make to the Santiment API. The idea here being to keep things organised.</p><p>Alright so first stop, let’s create a function to request from Santiment the ETH historical price using the guidelines provided by Santiment in their <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/santiment/sanpy#getting-the-data">documentation</a>:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/7c6099baf26f9313863ec016dfa66f12b78d4bde59a4102007f3af73c2068b59.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>As you can see it is pretty straightforward and pretty much speaks for itself. So let’s try it out now in order to check that everything is working.</p><p>So first we’ll need to import a few packages among which datetime and sanpy.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/e10360fece380ed856bc6a2916a2af4cfdbb1abccbabe1ab4d806b5ab5e13b00.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, and from there, let’s us do a basic testing of this first function:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b64e1a1983afe80fd5886a03d8f9153d04824e041b3be72ef047fa45206c2625.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/92bec2b2f084289de9e4526965cb8d6aba26b60317da6b89c44ad4c594e333cb.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>And it works as expected, as you can see from the above illustration extracted from the Spyder variable explorer.</p><p>Alright, so now that we did the first example, we’re basically going to reiterate the same process for each case and create a function for each of our metrics.</p><p><strong>Marketcap 
Function</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/638c9269f5ab37d6616916bf6f162490a3b6efd18ef73fb7d9506f92bbaa9f34.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Network Growth</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8732e7ca5159471f3956a808411e2586bc0c45da9e5bc2c99f4906bda5a7f489.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>⚠️⚠️</p><p>In the following cases given that we only want those metric for the current date, there is no need for a start date and an end date and can instead use here a unique date input that we’ll use both as “from_date” and “to_date”.</p><p>⚠️⚠️</p><p><strong>Current Price</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/1faf5af5063b417c3adfa257dd5d779760dc7b922ddd7abe652a8fb68982a6a8.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Current Volume</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/86213a1517e307d26dc2a6a63e33a10cbb6a50684ba76007e64bb6569290318c.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Daily Active Address</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/6af6a9d54cbcb4ee7ee1efcef8801b4bb1f3d920bb462093342289a295052e1c.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Token Circulation</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/33f496d133addd3efe03bbc8593d7ed4c17d13e181dfffc1f6135f5929e8142d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Velocity</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/46b7b4d4180551e733d5cf2dfde44a4b48fa70706480611c90be41be8ae084e6.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" 
class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Social Volume</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8010159d947b7b399ac14379a9a03f57ed3c7205705e180c0f68ad5344e08e72.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Top 10 holders holdings</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/32c4ca3fa3c7f67aaa739cffb4a66fdec4faa23f4a90cc8fa774aa8e41a58f05.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Dev Activity</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8df61eb29565f253f75d18acf16ce383d61808d04b5c2568efdf94288df48ece.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Github Activity</strong></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a6b7bc46417be3ed652ba8ad9f588ca4611970dfa591b2f5f55bda6f79482219.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>⚠️⚠️</p><p><strong>Important note:</strong> Given that we are using the free Santiment API plan here, for some metrics we don’t have access to the latest value and view ourselves limited to only being able to request at most last month values. So it’s not ideal but given that the goal of this tutorial is just to learn how to create an overall data architecture it is not really critical here.</p><p><strong>List of restricted metrics:</strong> Circulation, Velocity, Social Volume, Amount in Top holders, Dev activity, Github activity and Network growth.</p><p>⚠️⚠️</p><p>Alright and that’s it we have now at our disposal all the needed functions to request all our metrics from the Santiment API. So if you want to test it for yourself, here is the full test code:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f3d9ed0486f9a0975e13cb75089d181799bfb0b38f9bb769426b7de3ee908737.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>So congrats, you made it through this first part of our DevOps tutorial! 
🥳🥳</p><p>Next time, we’ll spin up our cloud server on Digital Ocean, create our PostgreSQL database and write our main code script calling those Python functions in order to populate our database, so stay tuned and happy coding in the meantime.</p><p>Take care 🖖</p><p>N.B. As per usual, the full code is available <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/Cybergen300/DevOps-Intro">here</a> on the CyberGen Lab Github repo</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/6ef8ad984012b6b6e022a8c0a7957a6ea048f37dbed2f972c5d0b4c10eb37280.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Weekly News 11/09]]></title>
            <link>https://paragraph.com/@cybergen-lab/weekly-news-11-09</link>
            <guid>gSYFbjEtQc3WesSRlceA</guid>
            <pubDate>Wed, 09 Nov 2022 00:38:04 GMT</pubDate>
            <description><![CDATA[DeFiBebopFollowing its recent soft launch over the summer to an exclusive set of whitelisted address on the Ethereum network, Bebop has now released its public mainnet version on Polygon coming with a really well developed UI and some novel features such as the one-to-many or many-to-one services enabling users to seamlessly consolidate a multi-token portfolio into a single token or inversely spread their holding across multiple tokens. Furthermore, leveraging the expertise of its backing par...]]></description>
            <content:encoded><![CDATA[<h2 id="h-defi" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">DeFi</h2><h3 id="h-bebop" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Bebop</h3><p>Following its recent soft launch over the summer to an exclusive set of whitelisted address on the Ethereum network, Bebop has now released its public mainnet version on Polygon coming with a really well developed UI and some novel features such as the one-to-many or many-to-one services enabling users to seamlessly consolidate a multi-token portfolio into a single token or inversely spread their holding across multiple tokens. Furthermore, leveraging the expertise of its backing partner as a professional market maker, Bebop made the choice of not becoming an automated market maker and ensure its platform liquidity through the more than 60 liquidity platforms offered by Wintermute which comes as interesting choice at a time where the yield appetite of DeFi users is definitely on the rise.</p><h3 id="h-rubic" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Rubic</h3><p>Multi-chain swaps protocol and decentralised exchange Rubic has lost over one million worth of tokens after attackers gained access to the private keys of an administrator’s wallet most likely through a malicious software. Consequently to this hack and of the selling of the coins by the hacker, the RCB lost more than half of its prior value and still hasn’t recovered yet despite the recent core team offer to the hacker promising up to 20% of the RCB stolen as a bug bounty and no further legal action against the return of the remaining 80% RCB stolen.</p><h3 id="h-aave" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Aave</h3><p>The Aave community recently approved the Matter Labs’ proposal to deploy the decentralised lending protocol Aave on its Ethereum scaling product zkSync at a very large majority. However, this recent vote is for now only enabling the deployment of Aave to zkSync’s testnet. Indeed, Matter Labs will need to propose a new vote before being able to operate mainnet deployment.</p><h2 id="h-web3" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Web3</h2><h3 id="h-iota" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">IOTA</h3><p>TanglePay, IOTA’s self-custody wallet recently announced a partnership with the Mises browser in order to enable users to profit from TanglePay wallet features both on desktop and mobile while using Mises.</p><h3 id="h-exodus" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Exodus</h3><p>Moving forward on its effort to support an ever growing number of chains and offer the highest possible flexibility to its users, exodus one of the leading self-custodial crytpocurrency software platform recently announced the addition of the BNB smart Chain to its Exodus’s browser-based Web3 wallet.</p><h3 id="h-flux" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Flux</h3><p>Flux, the frontrunner in building decentralized infrastructure to power Web3 development, today announced the launch of Jetpack 2.0, a system update enabling easier and cheaper deployment of decentralized applications (Dapps) onto the Flux decentralized cloud. 
Additionally, Jetpack 2.0 also brings a variety of new features, including an improved Dapp registration and management process as well as direct fiat payment and settlement services.</p><h2 id="h-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosystem</h2><h3 id="h-ripple-vs-sec" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ripple vs SEC</h3><p>In a surprising turn of events, crypto exchange Coinbase, which was among the first to halt trading of the XRP token following the filing of the SEC lawsuit against Ripple, recently filed an amicus brief in the ongoing lawsuit between the SEC and Ripple Labs, supporting Ripple Labs and highlighting the lack of clear SEC regulatory guidelines over the past couple of years.</p><h3 id="h-japan" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Japan</h3><p>NTT Docomo, Japan’s largest mobile operator with over $40 billion in annual revenue, recently partnered with multichain smart contract platform Astar Network to accelerate Web3 adoption in the country through the build-up of a dedicated consortium giving individuals and corporations the ability to utilize tokens for governance.</p><h2 id="h-market" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Market</h2><ul><li><p>BTC     $18,576.50</p></li><li><p>ETH     $1,332.40</p></li><li><p>BNB     $330.97</p></li><li><p>ADA     $0.37</p></li><li><p>SOL     $23.89</p></li></ul><h3 id="h-ftx-ventures" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">FTX Ventures</h3><p>FTX Ventures, the venture arm of FTX, recently made its first Web3 social media related investment by participating, for an undisclosed amount, in a recent funding round held by Aave’s social media graph Lens Protocol.</p><h3 id="h-fordefi" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Fordefi</h3><p>Fordefi, a financial technology and software company, today announced an $18 million seed round led by Lightspeed Venture Partners and Alameda Research in order to further support the development of its newly launched institutional MPC wallet.</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/0677e02b21d3dc301f77a3449947ea471ade8aca3ecaaff341c201ddd9b2c4b7.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Dune Analytics Introduction ]]></title>
            <link>https://paragraph.com/@cybergen-lab/dune-analytics-introduction</link>
            <guid>EkFmjMrbVdj9clMY2qW4</guid>
            <pubDate>Sun, 06 Nov 2022 18:59:35 GMT</pubDate>
            <description><![CDATA[Hey everyone, hope everything is going well on your side! 😃 Today we’ll start introducing a pretty powerful analytical tool, called Dune Analytics largely used all throughout the space and enabling power users to freely and easily access on-chain data through simple SQL queries in order to create dedicated analytics dashboard supporting market research. So to do that, we’ll create the below dashboard analysing a set of key metrics of the Chainlink ETH/USD data feed in order to learn how to q...]]></description>
            <content:encoded><![CDATA[<p>Hey everyone, hope everything is going well on your side! 😃</p><p>Today we’ll start introducing a pretty powerful analytical tool, called Dune Analytics largely used all throughout the space and enabling power users to freely and easily access on-chain data through simple SQL queries in order to create dedicated analytics dashboard supporting market research.</p><p>So to do that, we’ll create the below dashboard analysing a set of key metrics of the Chainlink ETH/USD data feed in order to learn how to query data through Dune analytics and also present those data using the different types of modules offered by the platform (counters, graph, tables …)</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5b1e8707352433befe1202635c4d672b2863765fc44f3da3d811c63d64748545.png" alt="Chainlink ETH/USD Feed Analytical dashboard  (Ethereum) " blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Chainlink ETH/USD Feed Analytical dashboard (Ethereum)</figcaption></figure><p>Alright so let’s get right into it, first if you haven’t already created one you’ll need to create a Dune analytics account using the following <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://dune.com/auth/register">link</a>.</p><p>Once you’re set you should be met with the following homepage upon connecting to the platform.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b8809671fa42a8cefa7aa5b555efb7155101994304c66b41e543df839efef2cb.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there click on “New query” in order to get redirected toward the query builder.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b244ade9caf91e9aa1f0c17d22cf13dee433f117edf61e98501f092c3691a937.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>There as shown on the previous captcha we are going to be met with 3 main components:</p><p>Component 1 — The query section where we are going to write our SQL queries</p><p>Component 2 — The Query result section where we’ll be able to visualize and fine-tune the visual rendering of our queries (tables, graphs, …)</p><p>Component 3 — The table section listing all the available Dune Analytics tables per integrated network.</p><p>Alright so let’s start building our first component here and create our SQL query for the number of requests made to the Chainlink ETH/USD price feed over the last hour. 
First we need to get the address of the Chainlink Price feed contract on the Ethereum mainnet that you can find <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://data.chain.link/ethereum/mainnet/crypto-usd/eth-usd">here</a>.</p><p>Chainlink ETH/USD address: 0x5f4ec3df9cbd43714fe2740f5e3616155c5b8419</p><p>Then from there,we can start building our query.</p><p>So first we need to extract all the requests which interacted with the Chainlink contract over the past hour. So to do that, we’ll use the <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://dune.com/docs/reference/tables/v1/raw/ethereum-mainnet/traces/#ethereumtraces">ethereum.traces database</a> allowing us among other things to extract the address of the requesters and use the COUNT() SQL function in order to obtain the overall number of requests per requester.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5f8d08937dddb532e5afe32932789363833de7e479c690bf179f114401af44ef.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/985805728117916c966ba031786e9b612ada66240f8e2b225192502ac2a3547c.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there we need to calculate the overall sum of total requests over the past hour. So to do that we’ll need to fine tune a litle bit our code and encapsulate our previous code inside a table called here table1 in order to be able to use a second table calling table1 and be able to apply the SUM() function over our request_nb column to obtain our total request number.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/3141cc33f02cbc25ca8bee93889267c633394496536df0a4411426819bbe298a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8e6bbb2e9a089f64c97273e01ed971d49183b9f924eaebcf9b5552a6865272de.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright so now that we have our value, let’s create a new visualization in order to get that counter view that we want. So first go ahead and save your new query. 
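For reference, the full query we just assembled should look roughly like this (a sketch only, assuming the PostgreSQL-based Dune engine and the v1 ethereum.traces columns that were current when this was written):</p><pre data-type="codeBlock"><code>-- count calls to the Chainlink ETH/USD aggregator per requester over the past hour,
-- then sum them into the single value used by our counter widget
WITH table1 AS (
    SELECT "from" AS requester, COUNT(*) AS request_nb
    FROM ethereum.traces
    WHERE "to" = '\x5f4ec3df9cbd43714fe2740f5e3616155c5b8419'
      AND block_time > now() - interval '1 hour'
    GROUP BY 1
)
SELECT SUM(request_nb) AS total_requests
FROM table1;
</code></pre><p>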
Then once it’s done click on “New visualization” and select Counter type before clicking on Add Visualization.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/4d491a4f372155b8995935cd6e6a7a3452f7799f57a87a0960db45a04b6b1812.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there feel free to use the names and style that you want or if you want to follow strictly what I did enter the parameters showned below.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/b149b6e7d1d755bd9bea0665ce1de1a9b8e430a12ab5e07cb0d2dafcf4996abb.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>When you’re ready, click on add to dashboard and click on the add button linked to the dashboard your willing to use.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/44a18daecae3e268cc794e5e01d7027a241d97df5123aa8418f2a98672c7130a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>And here it is ! If you go back to your dashboard you’ll see that we created our first live widget, on our dashboard 😃</p><p>Alright so now reiterating the same process we could manually create two new queries and go through all our previous steps to create the daily and weekly request counters.</p><p>However, it would be cumbersome and Dune Analytics allows us to create similar queries in a much more clever way so let’s just use it 😉</p><p>So first, let’s get back to our previous query and click on the fork button on the upper right part of the screen.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/dd1579efb1ab69fdcd822a89ea9c1885ee8cf9a2c352b81a86f8ed4b09a8db2a.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>As you can see, we’ll get automatically redirected to a new copy or fork of our previous query enabling us to use right off the shelf our previous work.</p><p>So from there first, we’ll save this new query as our daily counter and change accordingly our time specification in our SQL queries before adding this counter to our new dashboard.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/d0d08f05f57032820eda8a8faa1e45cfeae78cdc8b2c2bfd76a552fb16378250.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" 
class="hide-figcaption"></figcaption></figure><p>And then, repeating the same process, we can easily create our weekly counter as follows.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/905cc82531f2eb4ffbb1c7fd9b91324378c63e1f22cb819a1b5fec3ff47293fa.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, now let’s move on to our first table, showcasing the latest ETH/USD price feed requests. Here we want to display five key pieces of information:</p><ul><li><p>Time of the request</p></li><li><p>Address requesting the ETH/USD price</p></li><li><p>The gas limit used</p></li><li><p>The actual gas price paid for the request</p></li><li><p>The transaction hash</p></li></ul><p>So first, let’s fork our initial query. As you can see by looking at the ethereum.traces dataset details, all of this information is already at our disposal.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/fb7519012fffcc3d0998aedbb70b4881e73de985d612d53a752a728ce6fa17e2.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>As such, the only thing we need to do to obtain our table is to once again fork our initial query and fine-tune it to extract all the needed metrics.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/44d12818d559810ef78370161321f11a0e5741ba543566fd41f36919e0c5d995.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure>
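<p>For reference, a minimal sketch of that query is shown below. The gas-related column names are assumptions here: depending on the exact ethereum.traces schema, the gas price may actually live in ethereum.transactions and require a join on the transaction hash.</p><pre><code>-- Sketch only: column names (block_time, "from", gas_limit, gas_price, tx_hash) are assumptions.
SELECT block_time AS time,
       "from"     AS requester_address,
       gas_limit,
       gas_price,
       tx_hash
FROM ethereum.traces
WHERE "to" = '\x5f4ec3df9cbd43714fe2740f5e3616155c5b8419'
ORDER BY block_time DESC
LIMIT 100;
</code></pre>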
<p>From there, as done previously, we need to save this new query and create a new visualisation in order to incorporate our table into our dashboard. The only difference is that this time we’ll go with the table visualisation.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/bbae1b23a55edc190e92adac8d37f8f89b24d9fd745d5f55dbd9847e200b68da.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there again, feel free to use any style or title you want, and remember to add this new element to the dashboard by clicking on the add to dashboard button.</p><p>Alright, now let’s move on to our next counter, the daily gas cost. This one is again pretty straightforward, so try to build it yourself before reading the rest and see if you are starting to get the hang of it.</p><p><strong>Solution</strong></p><p>Here, as previously, we’ll fork our initial query. However, this time we don’t want the sum of all transactions but the average gas cost over a one-day period, so we’ll use the AVG() function as follows:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/49894ff35bf1a265215b26f7edd8349c1e9d2de29a765bc6cf861bf5eeb037bd.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Observation: Given the limitations of Dune Analytics, we cannot simply truncate our dataset to keep only the current date’s data using CURDATE(). To work around this issue, we order the extracted dataset by descending time and then keep only the first value using LIMIT.</p><p>From there, as usual, use the counter visualisation and add it to your dashboard.</p><p>Alright, now let’s get to our graph visualisation, representing the evolution of the average gas cost over the past 14 days. To do so, let’s start by forking our previous gas cost query. From there, just change the interval from 1 day to 14 days and you end up with the following table summarising the average ETH/USD query gas cost for each day over the past two weeks.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/85a6b53ea4c34d043a951bb7b702b1add2565a775d8e1055d2f1755a370937e1.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>From there, click on new visualisation and choose Area chart.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/76bdb94d129d9a14c4830004f6105106ba819f8fd540f44c4f91db1b04847789.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure>
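<p>For reference, a rough sketch of the daily gas cost query (and of its 14-day fork) could look like the following; using gas_used as the cost metric is an assumption, so align it with whatever field the original query screenshots use.</p><pre><code>-- Daily average gas cost (sketch). For the 14-day area chart, switch the
-- interval to '14 days' and drop the LIMIT so one row per day is returned.
SELECT date_trunc('day', block_time) AS day,
       AVG(gas_used) AS avg_gas_cost
FROM ethereum.traces
WHERE "to" = '\x5f4ec3df9cbd43714fe2740f5e3616155c5b8419'
  AND block_time > now() - interval '1 day'
GROUP BY 1
ORDER BY day DESC  -- newest day first, so LIMIT 1 keeps only today's value
LIMIT 1;
</code></pre>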
<p>And here it is! You have a brand new live graph chart showing the evolution of the gas cost over the past 14 days.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/2d77ad8ffccaf9686ed64a0917516bc6bc9b7daf49e8324d2aae4c845a1994d0.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so now let’s create our last component: a table showcasing the 10 biggest ETH/USD price feed requesters for the current day. So first, let’s fork our initial query.</p><p>The first thing we need to do is work around a small limitation of Dune Analytics that prevents us from keeping only the current day’s values without yesterday’s. Indeed, if we only use the COUNT() function and sort our table by the number of requests, we end up with the following result, which is not what we want.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f5858c83ddfaff536811a2f1af7685c7a3b9b8ee722706b7661b370452798509.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>To avoid this situation, a quick workaround is to couple our previous request number ordering with another ordering applied on the time column, as follows:</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/bef3e6a1fb1107e7f3ef10fd4d5da59f160349f7ae5d695a52fd96f484517468.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/f695b9b4cd913368978a726cd94513616df507088a2e6e95319eb9adb0ab7c24.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure>
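<p>Put together, a sketch of this top requesters query might look as follows (again, the column names are assumptions to check against the table reference):</p><pre><code>-- Top ETH/USD feed requesters for the current day (sketch).
-- Sorting on the day column first keeps yesterday's rows out of the top 10.
SELECT date_trunc('day', block_time) AS day,
       "from"   AS requester_address,
       COUNT(*) AS request_nb
FROM ethereum.traces
WHERE "to" = '\x5f4ec3df9cbd43714fe2740f5e3616155c5b8419'
  AND block_time > now() - interval '1 day'
GROUP BY 1, 2
ORDER BY day DESC, request_nb DESC
LIMIT 10;
</code></pre>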
<p>Alright, so from there, even though we could stop here, it is worth replacing each requester address with its real name whenever possible. To do so, let’s head over to Etherscan. There, using the contract address from our table, let’s look up the owner of the \xcf7fe2e614f568989869f4aade060f4eb8a105be contract.</p><p><strong><em>Note:</em></strong> It is important to be aware that, by convention, Dune Analytics replaces the initial 0 of each Ethereum address with “\”, so when using Etherscan we have to reformat the address into 0xcf7fe2e614f568989869f4aade060f4eb8a105be.</p><p>Furthermore, finding the owner of a contract can be a bit tricky at times and might in some instances require browsing around Etherscan or Google.</p><p>However, in our particular case it is pretty straightforward, as shown in the Etherscan capture below, where we can easily spot that this contract belongs to the ENS domain protocol.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/111c375c0197f4764d3ab40af0267df05d58f4542fc59f2c5997310e5cdd5f31.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Additionally, if you want to double-check that point, one way to do so is to extract the latest transaction hashes in Dune Analytics, select those originating from that particular contract address, and then inspect the internal transactions of one of them in Etherscan to see whether the contract is indeed used in the suspected context.</p><p><strong>Example for the 0xcf7fe2e614f568989869f4aade060f4eb8a105be ENS domain contract:</strong></p><p><em>Used transaction hash:</em> 0x12cb364247f1fd1c93bbfb255c238ed4e9ea039482248a3ae5866d51a92fd130</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/95b54a53f204e52d45305a1bccd3e943c92d17e03e827dcb4b583bd41d4b0511.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/e27698ee7c4e1921b2de4311028caf720cedd04982418b6069a510292e0bdc25.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Alright, so once we have a corresponding name for each address, let’s integrate them into our table. 
To do so, we’ll need to refine our code a bit and, as before, wrap our existing code in a table that we’ll call table1 so that we can then call it from another query and apply a set of transformations to our requester_address column using a CASE WHEN statement, allowing us to replace each Ethereum address with its corresponding human readable name.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/bef3e6a1fb1107e7f3ef10fd4d5da59f160349f7ae5d695a52fd96f484517468.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a40101807b76c6b5e7da5d87f7281d36640fbad85cfcf6f7e1e0ef476989c3b0.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>And from there you should be able to obtain a table with much more understandable names.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/18ca3bfb888417bee5f7f4676a3db11931f69c26523805ec4791f1d98969e4fb.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p><strong>Observation:</strong> In some instances it is just not possible to find the contract address owner, so don’t sweat it and simply show the reformatted contract address in your table. Furthermore, given the SQL function used, if a new address appears in your table the following day, the request origin will show up blank, so I’ll let you fine-tune that part as a small exercise if you want a more polished dashboard.</p><p>Alright, and after adding this new component you should have all the needed components on your dashboard. From there, just go ahead and click on the edit button located at the top right of your dashboard to move things around and adjust the visual design of your dashboard as you prefer.</p><p><strong>Note:</strong> If you want to create a custom widget like the one showing the CyberGen Lab logo, just click on Add text widget and use the available markdown.</p>
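<p>For reference, the final named requesters query might look roughly like this; the address-to-name mapping is illustrative and should be extended with the owners you identify on Etherscan (adding an ELSE branch also avoids the blank rows mentioned in the observation above):</p><pre><code>-- Sketch of the top 10 requesters with human readable names.
WITH table1 AS (
    SELECT date_trunc('day', block_time) AS day,
           "from"   AS requester_address,
           COUNT(*) AS request_nb
    FROM ethereum.traces
    WHERE "to" = '\x5f4ec3df9cbd43714fe2740f5e3616155c5b8419'
      AND block_time > now() - interval '1 day'
    GROUP BY 1, 2
)
SELECT day,
       CASE
           WHEN requester_address = '\xcf7fe2e614f568989869f4aade060f4eb8a105be' THEN 'ENS domain'
           -- add one WHEN line per identified owner
           ELSE requester_address::text  -- fall back to the raw address
       END AS request_origin,
       request_nb
FROM table1
ORDER BY day DESC, request_nb DESC
LIMIT 10;
</code></pre>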
<p>And that’s it! Congratulations, you made your first Dune Analytics dashboard 🥳🥳</p><p>So as always, don’t hesitate to build upon that, create your own dashboards for other projects and have fun with the platform 😃</p><p>Until next time, see y’all 🖖</p><h3 id="h-resources" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Resources</h3><p>Link to Chainlink ETH/USD price feed dashboard (Ethereum mainnet):</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://dune.com/cybergenlab/chainlink-ethusd-feed">https://dune.com/cybergenlab/chainlink-ethusd-feed</a></p><p>Link to Chainlink ETH/USD price feed dashboard (Polygon mainnet):</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://dune.com/cybergenlab/polygon-chainlink-ethusd">https://dune.com/cybergenlab/polygon-chainlink-ethusd</a></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/4e0dd35f36fae400f9161442bf09ef67bb17941e682ee9b9e1608d3767220f0e.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Spacemesh Deep Dive Analysis]]></title>
            <link>https://paragraph.com/@cybergen-lab/spacemesh-deep-dive-analysis</link>
            <guid>e849VMKRshktnKIyqBg8</guid>
            <pubDate>Thu, 03 Nov 2022 17:23:50 GMT</pubDate>
            <description><![CDATA[Over the past year or so, the blockchain industry and more particularly Proof of Work networks have become the target of increasing critiscim from both mainstream media and regulators starting to perceive this new industry as a new emerging threat to global environmental policies as shows the recent regulatory chatter around a potential winter seasonal ban on crypto mining in the EU or the recent damning White House report on PoW mining. Furthermore, from a pure networking standpoint, most Po...]]></description>
            <content:encoded><![CDATA[<p>Over the past year or so, the blockchain industry and more particularly Proof of Work networks have become the target of increasing critiscim from both mainstream media and regulators starting to perceive this new industry as a new emerging threat to global environmental policies as shows the recent regulatory chatter around a potential winter seasonal ban on crypto mining in the EU or the recent damning <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.whitehouse.gov/wp-content/uploads/2022/09/09-2022-Crypto-Assets-and-Climate-Report.pdf">White House report</a> on PoW mining.</p><p>Furthermore, from a pure networking standpoint, most PoW network are also starting to face a fair number of issues trickling down from their ever-growing mining process cartelization around a small subset of miners investing massive amount of CAPEX in order to spin up highly efficient and specialised mining facilities pricing out small casual miners out of of the ecosystem and creating a de facto a massive barrier of entry for any newcomer.</p><p>As such, in the light of this potential future regulatory breakdown on PoW network it appears interesting to start analysing other viable alternatives such as Spacemesh aiming to offer a new form of decentralised payment relying on a novel form of consensus algorithm called Proof of SpaceTime leveraging miners available disk storage space.</p><h2 id="h-spacemesh-overview" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh Overview</h2><p>Spacemesh is usually advertised by its core project team as the natural evolution of PoW decentralisation payment network into a better and more efficient form enabling fairer token distribution and lesser energy consumption. 
Nonetheless, it appears important to bear in mind here that no first generation PoW blockchain ever truly succeeded to become viable decentralised payment networks (e.g Venezuela) and for the most part never got past the store of value use case.</p><p>Else relatively to its technical attributes, the Spacemesh protocol is most well described as a blockmesh protocol leveraging a custom proof of Spacetime consensus algorithm and a dedicated mesh network architecture avoiding the centralisation and energy consumption pitfalls of PoW protocols while offering at the same time a higher degree of token and network distribution.</p><h2 id="h-spacemesh-architecture" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh Architecture</h2><p>The Spacemesh protocol infrastructure is built on top of four core components each maintained separately but working in concert to enable the broader support of the Spacemesh network and ecosystem.</p><ul><li><p><strong>Go Spacemesh</strong> — Open source codebase implementing the bulk of the Spacemesh core protocol and enabling each miner to run a protocol-compatible full node as well as a mining node on a variety of different systems and architectures</p></li><li><p><strong>PoET server</strong> — Web service enabling the implementation of temporality inside the Spacemesh network.</p></li><li><p><strong>Proof of SpaceTime (PoST)</strong> — Collective set of consensus algorithms employed simultaneously to ensure the build-up of a Spacemesh ledger implementing an ordered BFT canonical view of blocks.</p></li><li><p><strong>Spacemesh App</strong> — GUI-based desktop application acting as a wallet and running a Go-spacemesh full node instance in order to enable users to mine.</p></li></ul><h2 id="h-spacemesh-design" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh Design</h2><p>PoW network, such as Bitcoin, usually tackle the Sybil attack problematic (where one entity secretly create a large number of miners) through the implementation of a computing intensive process leveraging miner’s electrical consumption cost as a scarce resource in order to impose an implicit limit on the number of nodes that a single actor can realistically spin-up.</p><p>However, as mentioned above despite being extremely robust from a technical standpoint, this solution is also extremely energy intensive. Consequently, Spacemesh is replacing Proof of Work with a novel type of consensus algorithm called Proof of SpaceTime (PoST) allowing miners to use disk space rather than computation as the scarce resource for preventing Sybil attacks.</p><p><strong>How does it work?</strong></p><p>Technically speaking let’s take the example of Miner A willing to pledge 500GB of disk space to the Spacemesh network to start mining. 
First, Miner A is going to go through a phase known as the init phase during which the network is going to require her to perform a one-time Proof of Work process in order to fill up her 500GB of storage with cryptographic junk.</p><p>Once, this phase is done, our Miner A is then set to start mining on Spacemesh and is going to be able to prove its eligibility for taking part in the mining process by just sending prior to each mining period a proof of its storage allocation, also known as activation transaction (ATX), build on top of the stored cryptographic junk.</p><p><strong>How to make sure that miners effectively allocate space to the network between two activation transaction?</strong></p><p>One of the issue with our previous process is that our Miner A could effectively perform the one-time proof of work process in order to be able to start mining and then supress the process output before reprocessing it right before the next period putting us back effectively in the situation where the scarce resource is not anymore the available disk space owned by Miner A but again its processing power leading us straight back to our previous PoW situation. As such, in order to avoid this problematic and enforce the disk space as the sole usable scarce resource, Spacemesh is incorporating the time primitive inside it’s ecosystem and more particurlarly in the build-up of each activation transaction in order to ensure that Miner A indeed allocated continuously her 500 GB to the network between each activation transaction.</p><p><strong>How is time integrated in the Spacemesh protocol?</strong></p><p>In order to implement time inside its mining process, Spacemesh is leveraging a cryptographic primitive called Proof of Elapsed Time (PoET) providing a proof that a sequential work cycle has been performed and thus that a given period of time has elapsed.</p><p>Furthermore in the case of Spacemesh, given than what is needed is merely a proof that a certain amount of time has elapsed there is no need for every miners to perform this sequential work themselves as it would defeat the initial purpose of Spacemesh to avoid continuous CPU work. Indeed, here it appears sufficient to have several PoET instances shared among multiple miners in order to minimise the overall electrical cost especially since each miner can easily check the integrity of the proof send by the PoET server and used another PoET server in case of malicious behaviour.</p><h2 id="h-consensus-in-spacemesh" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Consensus in Spacemesh</h2><p>On the contrary of PoW networks enforcing a racing mechanism between miners where the first miner to produce and communicate a valid block to the network earns the right to produce the current block of the chain and where each block are produce one after the other. 
In the case of Spacemesh, multiple blocks get created simultaneously and every miner that expands sufficient space-time resource is deterministically eligible to generate a block.</p><p>As such, given its inherent lack of block ordering and randomness features normally generated by the racing mechanism, Spacemesh needs to implement a way of ordering newly creating blocks in order to form its canonical ledger as well as an external source of randomness in order to make difficult to predict who will be eligible to generate a block</p><h3 id="h-spacemesh-block-ordering" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Spacemesh block ordering</h3><p>Spacemesh employs simultaneously two consensus mechanisms in parallel to determine the canonical set of blocks at each layer:</p><ul><li><p>Hare protocol — Byzantine Fault Tolerant consensus algorithm</p></li><li><p>Tortoise Protocol — Slow vote based mechanisms requiring each node to count the votes “for” and “against” each previous block and tallying these votes to eventually achieve consensus.</p></li></ul><h3 id="h-tortoise-protocol" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Tortoise protocol</h3><p>The Tortoise protocol is the mechanism by which the Spacemesh network achieves final consensus on the set of blocks and transactions that form the canonical ledger.</p><p>Technically speaking, each time a miner produces a new block, they include in that block a “votes” field that lists one or more previous blocks that the miner also considers valid at the time the block is produced.</p><p>→ <em>Each time, a new block links to a given older block, the vote tally for that older block increases by one. On the contrary, any block generated after a given older block that doesn’t vote for that block is counted as a vote against the block.</em></p><p>Furthermore, the Tortoise protocol is using a majority-rule fashion such that any block with a net tally greater than the irreversibility threshold becomes part of the canonical ledger.</p><p><strong><em>Observation</em></strong>: Currently each vote is weighted proportionally to the amount of space-time resources it represents, e.g, votes cast by a miner that has committed 200GB count twice as much as votes cast by a miner that has committed 100GB to the protocol. As such, given the current lack of upper storage limit this design feature could represent a critical risk to the network if not fixed before mainnet launch given that a cartel or single massive miner could seize control of the Tortoise protocol voting and by extension of the canonical ledger.</p><p><strong><em>Why the need for two consensus protocols?</em></strong></p><p>Despite being extremely robust, the Tortoise protocol is also pretty slow to run and cannot run on the most recent layer since there are not yet any newer blocks voting on these blocks. 
Therefore, Spacemesh is using a quicker consensus mechanism in order to bootstrap the Tortoise protocol called the Hare protocol which purpose is to allow each node to quickly determine the set of blocks in the layer to be voted on by all honest miners.</p><h3 id="h-hare-protocol" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Hare protocol</h3><p>The Hare protocol, run once per layer, is a BFT-compatible algorithm achieving consensus in four round and levergaing the participation of a randomly selected subset of available eligible block producers.</p><p>Technically speaking, at the beginning of each round, each miner draws through the use of a Verifiable Random function a role that tells it whether it’s active (i.e participating) or passive (i.e just observing) in this round.</p><p><em>If the Verifiable Random Function (VRF) output passes some threshold depending on the number of total active miners then the miner is active else it remains passive for that round.</em></p><p><strong>Hare protocol flow</strong></p><p>The protocol takes place over four rounds, which are preceded by a pre-round:</p><ul><li><p><strong>Pre-round</strong> — Each active participants shares their current view of blocks that are valid. At the end of the round, each then factors in the views shared by other participants and updates their view by removing blocks that didn’t receive enough support from other participants.</p></li><li><p><strong>Status round</strong> — Each active participant broadcasts a status message reporting its updated view.</p></li><li><p><strong>Proposal round</strong> — Each active participant broadcasts a proposal to the group, based on the results of the previous round. One of these participants will be randomly chosen to be the leader.</p></li><li><p><strong>Commit round</strong> — Each active participant independently determines who was elected leader, reviews the proposal from this leader, and signal its willingness to commit to it to the group. By the end of this round, each participant that received a valid proposal from the leader and a sufficient number of commit messages from other participants creates a commit certificate including all of this information.</p></li><li><p><strong>Notify round</strong> — Each active participant holding a commit certificate broadcasts it to the group. Then, if a sufficient number of commit certificates are received from other participants, the protocol terminates and each participant knows the proposed canonical set of blocks.</p></li></ul><h3 id="h-finality-in-spacemesh" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Finality in Spacemesh</h3><p>Contrary to PoW network the Spacemesh goes through two distinct phase of finalisation. 
A first phase corresponding to a temporary canonical consensus produce by the Hare protocol and a second stage corresponding to the final canonical consensus once the Tortoise protocol reaffirm the Hare proposed canonical order.</p><p><em>Note:</em> In the case where the Tortoise reject the proposed canonical order output by the Hare protocol, both rewards and transactions are rolled back and reapplied based on the new set of valid blocks.</p><p>See Appendix for greater details on Spacemesh block creation and transactions structure</p><h2 id="h-spacemesh-features" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh features</h2><h3 id="h-beacon" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Beacon</h3><p>Spacemesh runs a beacon protocol to generate a random value at the beginning of every epoch. In order to take part in this process, a miner must satisfy the following conditions:</p><ul><li><p>Be online</p></li><li><p>Be synced and having listened to network gossip for at least 2 layers</p></li><li><p>Be an eligible miner since at least epoch N-1</p></li></ul><p>Technically speaking the beacon protocol is essentially a simplified Tortoise consensus protocol where each sampled miner first propose its own seed of randomness before the casting of all votes and the reaching of a global network consensus on the beacon value to be used in the next epoch.</p><p><em>Note:</em> Having the correct beacon determines weather a miner can generate proposals and participate in the Hare consensus protocol to earn mining rewards.</p><h3 id="h-self-healing" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Self-Healing</h3><p>While the Tortoise protocol is very secure it is also fragile in the sense that its security rely solely on the assumption that the protocol holds all the way from genesis. However, over a long enough period of time, the likelihood of some events to occurs and cause the protocol to faill approaches one. 
As such, in order to easily recover, Spacemesh includes a feature known as self-healing through which all nodes agree at every layer on a random coin-toss in order to easily make a decision regarding a block in the case where the margin of vote casted for and against this latter is very narrow.</p><h3 id="h-smesh" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">SMESH</h3><p>Future high-level programming language that will compile to SVM WebAssembly code and enable developer to easily build on top of the Spacemesh ecosystem and integrate smart contract logic inside their dApps.</p><h2 id="h-spacemesh-governance" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh Governance</h2><p>While being fully open source, the governance of the Spacemesh protocol still remains pretty much centralised around the core team relatively to the protocol development and other critical aspects of the projects such as the coin issuance schedule.</p><p>Furthermore, it is important to note that there is currently no plans or discussion for a potential future on-chain governance of the upcoming Spacemesh Mainnet which could appear to some a bit worrying given the current track record of the core team and the numerous development delays suffered by the project originally set to hit market years ago.</p><h2 id="h-spacemesh-cryptoeconomics" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh cryptoeconomics</h2><p>As every other layer 1 blockchains aiming to become viable decentralised payment networks, Spacemesh economical design is revolving around a native token called MESH whose two sole purposes are:</p><ul><li><p>Facilitating token denominated network reward for active agents supporting the network</p></li><li><p>Acting as a payment vehicle between users.</p></li></ul><p>Note: The network being still in its testnet phase there is for now no actual reliable market metrics available.</p><h3 id="h-token-distribution" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token Distribution</h3><p>Accordingly to its core goal of ensuring a fairer token distribution, the core team avoided any pre-mint mechanism that would have created a front-loading situation for early adopters and innovators and hinder any future token distribution to the rest of the ecosystem.</p><p>Furthermore, from a token distribution standpoint, similarly to Bitcoin, Spacemesh is using a custom gradual decay pattern enforced after each layer in order to implement an overall token issuance half-life of 29 years. 
However, on the contrary of PoW network where rewards are attributed solely to the block producer, in the case of Spacemesh given that multiple blocks get produced at the same time, all fees and network reward get pooled together during each mining period before then being proportionally reallocated to each participating miner up to the extent of work effectively performed across the period.</p><p><strong><em>Observation</em></strong>: As it stands, it appears that despite its claim of ensuring a fairer token distribution Spacemesh is here falling a bit short given its lack of innovation from a token emission standpoint compared to other existing PoW networks as well as its current network design which would in the current state of things inevitably lead to a token front-loading situation benefiting big miners right after mainnet launch due to the cryptoeconomic choice of allocating proportionally more mining slots to miners allocating more space to the network without any limitation.</p><h2 id="h-spacemesh-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Spacemesh Ecosystem</h2><h3 id="h-spacemesh-team" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Spacemesh team</h3><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/tomerafek/">Tomer Afek</a>, Co-founder, CEO</p><p>Extensive prior experience in M&amp;A and finance as well as in the tech industry with various ventures in the media and and web publishing sectors.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/ramikasterstein/">Rami Kasterstein</a>, Co-founder</p><p>Web entrepreneur and venture capitalist involved in several Web3 media and software related ventures.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/avive/">Aviv Eyal</a>, CPO, Co-founder</p><p>Blockchain specialist with extensive experience in mobile gaming and Web3 consulting.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://talmoran.net/">Tal Moran</a>, Chief Scientist</p><p>Blockchain specialist, computing science researcher and theoretical cryptography professor at the Reichman University.</p><h3 id="h-ecosystem-current-state" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ecosystem current state</h3><p>As of now given the current state of development of the project, there is no ecosystem per se developed around the Spacemesh protocol. Furthermore, in the light of the numerous delay in the core project development witnessed over the past threeyears, it appears sound to say that the development of the ecosystem around the core protocol will most likely require several additional years.</p><h3 id="h-roadmap" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Roadmap</h3><p>As of now the core team is currently working on the testnet version of the Spacemesh protocol and aims to release its mainnet version later in December. 
However, in the light of the numerous delay witnessed since the initial launch date set back in 2018 it is important to remain cautious as the team might pushed back again the launch of its mainnet toward later in 2023.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Given the current Spacemesh mining process rewarding miner proportionally to their allocated disk space without any max cap and the cryptoeconomic choice of using a decay pattern the claim relative to a fairer token distribution appears for now somewhat misleading given that the nature of the network itself will inevitably create a front-loading situation for professional miners in the early days of the network if no max storage cap is implemented until then. Additionally, the proportional weighting of miners votes inside the Tortoise protocol appears to also be another critical reason why the implementation of a max cap would be a critical need given the risk of Sybil attack from an entity with extensive disk-space resouces upon mainnet launch.</p><p>Additionally, the protocol claim relative to becoming the main decentralised payment network protocol seems for now quite doubtful given the current protocol technical limitations compared to other more centralised alternatives and the fact that potential users will most likely not switch towards a more decentralised alternative if not at least as efficient from a settlement time standpoint as the ones they’re currently using.</p><p>So to conclude, despite the fact that this protocol is proposing an interesting alternative network structure that could bring new opportunities for miners through its novel token distribution and network architecture it appears that for now Spacemesh still need more time to further develop and fine-tune its core network and cryptoeconomic design before being able to be considered as a viable upcoming alternative to existing PoW networks such as Bitcoin.</p><h2 id="h-appendix" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Appendix</h2><p>Spacemesh uses the the account model for storing value, meaning that that users are encouraged to reuse accounts for multiple transactions and that from different transactions increasing the account balance become fungible and indistinguishable from on another.</p><p><em>Note:</em> Each account store a transaction counter in order to avoid any possiblity of replay attack inside the network.</p><h3 id="h-transaction-structure" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Transaction Structure</h3><p>The transaction data structure contains the following elements:</p><ul><li><p>Recipient’s address</p></li><li><p>Amount to transfer</p></li><li><p>Amount of the fee to be paid</p></li><li><p>Transaction counter</p></li><li><p>Signature</p></li></ul><p><em>Note:</em> Transaction do not mention explicitely the sender’s address given that each transaction is signed using the private key of the sender’s account from which the public key can be easily derived.</p><h3 id="h-transaction-processing" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Transaction processing</h3><p>Each miner receive each incoming unprocessed transactions locally over GRPC or via network gossip and automatically save those in its mempool after performing a serie of basic check ensuring the contextual validity of the transaction. 
Furthermore, the miner also further gossip the transaction in order to ensure its propagation to all peer in the network.</p><p><em>Note:</em> A transaction is deemed contextually valid if at the moment when the transaction is applied to the global state, the following condition apply:</p><ul><li><p>The transaction appears in a contextually valid block</p></li><li><p>The origin account exists</p></li><li><p>The counter on the account matches the transaction counter</p></li><li><p>The account balance is greater than or equal to the transaction amount + fee</p></li></ul><h3 id="h-transaction-ordering" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Transaction ordering</h3><p>The transaction ordering in Spacemesh is defined as follows:</p><ul><li><p>Blocks in layer n comes before those in layer n+1</p></li><li><p>The IDs of all contextually valid blocks in layer n are listed in ascending order</p></li><li><p>These block IDs are concatenated and hashed</p></li><li><p>The hash sum is used as the seed for a Mersenne Twister</p></li><li><p>A Fisher-Yates shuffle is performed on the list of block IDs using the output of the Mersenne Twister</p></li><li><p>Transactions from the ordered blcoks are then applied in the order they appear in each block ignoring duplicates.</p></li></ul><h3 id="h-" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0"></h3><p>Disclaimer: <em>All information/documents contained in this article rely solely  on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investment and do not invest more than you can afford to loose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/7e8d3e6e98f2fce5fdbd37e7217f16d1e4ad040b03877605202d132193af7e56.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[Weekly News 11/01]]></title>
            <link>https://paragraph.com/@cybergen-lab/weekly-news-11-01</link>
            <guid>BTkka4x7s5ONK9lpJoR3</guid>
            <pubDate>Tue, 01 Nov 2022 13:53:17 GMT</pubDate>
            <description><![CDATA[DeFiArco ProtocolAdding on the to disastrous launch of the Aptos network last month, the Arco Protocol a new cross-chain decentralised finance platform developed on the Aptos blockchain recently experienced a disastrous fundraise DEX offering which led to a temporary shutdown of the project and the loss of key industry partnerships (Celer Network, Wormhole). At press time, the core team is polling the community in order to decide whether the funds received should be returned, kept or if the p...]]></description>
            <content:encoded><![CDATA[<h2 id="h-defi" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">DeFi</h2><h3 id="h-arco-protocol" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Arco Protocol</h3><p>Adding on the to disastrous launch of the Aptos network last month, the Arco Protocol a new cross-chain decentralised finance platform developed on the Aptos blockchain recently experienced a disastrous fundraise DEX offering which led to a temporary shutdown of the project and the loss of key industry partnerships (Celer Network, Wormhole). At press time, the core team is polling the community in order to decide whether the funds received should be returned, kept or if the project itself should be handed over to the community. Nonetheless the future for the protocol appears pretty bleak as the large majority of users are currently requesting full refund.</p><h3 id="h-blockchaincom" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Blockchain.com</h3><p>In response to the recent association between BitOasis and Mastercard aiming to bring digital assets to population in the Middle-East and North Africa region through a series of crypto card programs, Blockchain.com and Visa recently revealed a new key partnership which will enable US residents to pay for daily expenses using their crypto or cash balance even though from a US tax revenue standpoint this new form of debit cards still raises a lot of questions.</p><h3 id="h-team-finance" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Team Finance</h3><p>Team finance recently announced a successful $14.5M exploit of its platform and temporarily shut down all activities in order to prevent further exploits and try to get in touch with the hacker for a potential bug bounty payment. Technically speaking, the hack took advantage of a vulnerability of the Uniswap V2 to V3 migration function which hasn’t been spotted by the team during the testing phase nor during the previous external code audits raising again the question around the need for better overall code auditing and code standardisation if blockchain protocols are ever to be globally adopted.</p><h2 id="h-web3" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Web3</h2><h3 id="h-metawork" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">MetaWork</h3><p>MetaWork a communication and collaboration operating system for Web3 creating new standard of messaging and notifications for teams, enterprises and DAOs recently announced a new key partnership with Spheron which will enable it to build its web app in a fully decentralised manner using Spheron’s decentralised storage network.</p><h3 id="h-binance" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Binance</h3><p>Earlier last week, Binance announced the release of its native oracle service aiming to enable smart contracts to run on real-world inputs and outputs starting with the BNB chain ecosystem and its over 1,400 application. 
Furthermore, being chain-agnostic, this new oracle service will in-time integrate with other additional blockchains.</p><h3 id="h-zerion" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Zerion</h3><p>Zerion a Web3 investing tool and crypto wallet provider which launch its native wallet earlier this year with the goal of attracting seasoned crypto users recently closed an additional $12.3 million funding round in order to continue developing additional features (pre-trade verification, vetted cryptocurrencies verification…) and become a viable alternative to Metamask for degen users.</p><h2 id="h-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Ecosystem</h2><h3 id="h-defi-regulation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">DeFi regulation</h3><p>Vitalik recently publicly took part in the current debate around DeFi regulation in the US and highlighted through an extensive twitter thread that in his opinion the blockchain industry is still not quite ready for fully integrated with mainstream finance and should take more time to mature in order to not fall victim of bad regulations that would intrude on how crypto work internally.</p><h3 id="h-us-regulation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">US regulation</h3><p>In line with the recent commission bid to gain more regulatory oversight over the crypto industry, Commodity Future Trading Commission’s Christy Goldsmith Romero recently pointed out that the collapse of the Terra ecosystem and its flow-on effect was a perfect example of why the agency should be given additional authority to regulate this new upcoming industry given how similar contagion risks within crypto market are to those experienced in the traditional financial system.</p><h2 id="h-market" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Market</h2><ul><li><p>BTC     $20,524.50</p></li><li><p>ETH     $1592.17</p></li><li><p>BNB     $341,42</p></li><li><p>ADA     $0.40</p></li><li><p>SOL     $32.90</p></li></ul><h3 id="h-sei" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Sei</h3><p>Sei, a Layer 1 blockchain and Alpha Venture DAO a renowned Web3 venture builder recently unveiled a new incubation program called Alpha Incubate Batch 2 aiming to support the growth of new DeFi projects.</p><h3 id="h-meta" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Meta</h3><p>Meta joined forces with fashion company L’Oréal and French college HEC to support startups developing Web3 technologies, including avatar development through a future accelerator program geared toward facilitating creativity in the Metaverse.</p><h3 id="h-reap" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Reap</h3><p>Reap a Hong-Kong based start-up recently closed a $40M funding round aiming at developing infrastructure to help facilitates payments between Web3 projects and traditional businesses.</p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/0677e02b21d3dc301f77a3449947ea471ade8aca3ecaaff341c201ddd9b2c4b7.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Pi Network Overview]]></title>
            <link>https://paragraph.com/@cybergen-lab/pi-network-overview</link>
            <guid>6YvNyiLBWNnj5UGU3etj</guid>
            <pubDate>Sat, 29 Oct 2022 14:33:52 GMT</pubDate>
            <description><![CDATA[Originally founded by a Stanford Research group in response to the ever-growing centralisation of PoW network mining around a small subset of node operators, Pi network is a new form of L1 blockchain network enabling mining for everyday people through the leveraging of a custom version of the Stellar Consensus Protocol (SCP) enforcing network decentralisation through the integration of individual devices such as mobile phones, laptop and computers. Why using Pi network?Enable casual users to ...]]></description>
            <content:encoded><![CDATA[<p>Originally founded by a Stanford Research group in response to the ever-growing centralisation of PoW network mining around a small subset of node operators, Pi network is a new form of L1 blockchain network enabling mining for everyday people through the leveraging of a custom version of the Stellar Consensus Protocol (SCP) enforcing network decentralisation through the integration of individual devices such as mobile phones, laptop and computers.</p><p><strong>Why using Pi network?</strong></p><ul><li><p>Enable casual users to earn tokens through simple daily KYC procedure on the Pi mobile app</p></li><li><p>Better throughput than PoW network such as Bitcoin thanks to the leveraging of the SCP</p></li><li><p>Fair and meritocratic token distribution proportional to the daily network support of each active participant</p></li></ul><h2 id="h-pi-network-architecture" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pi Network architecture</h2><p>As mentioned above, the Pi network consensus algorithm is built atop the Stellar Consensus Protocol designed by André Mazière and currently implemented within the Stellar Network. However, at the difference of this latter relying on a set of trusted nodes run by well-known corporations such as IBM and the likes, Pi network made the architectural choice of relying solely on a decentralised network of nodes composed of individual devices ranging from mobile phones to computers.</p><p>Furthermore from a design standpoint the Pi blockchain is leveraging four core network roles:</p><ul><li><p>Pioneer — Users of the Pi mobile app who simply confirm that they are not bots on a daily basis through connecting and signing on the app.</p></li><li><p>Contributor — Users of the Pi mobile app contributing and providing a list of pioneers she knows and trusts enabling in aggregate the build-up of a global trust graph.</p></li><li><p>Ambassador — Users of the Pi mobile app introducing new users to the app</p></li><li><p>Node — Users that are pioneer, contributors and also running the Pi node software which runs the core SCP algorithm taking into account the trust graph information provided by the contributors.</p></li></ul><h2 id="h-pi-features" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pi Features</h2><p>Currently the project being still in its pre-release phase, the core team hasn’t developed any additional features outside of the core blockchain stack. 
However, as per the whitepaper, the core team is planning the future development of a full suite of features building-up on the current existing Pi mobile app:</p><ul><li><p>Pi attention marketplace — Scare Instagram like social media channel open to any pioneer to post their content and further build-up the community liveliness with debates, pools, exchange of media …</p></li><li><p>Pi’s Barter marketplace — Development of the existing Pi mobile app in order to enable each pioneer to sales goods and skills to other member of the Pi ecosystem.</p></li><li><p>Pi App store — Development of a dedicated app store enabling builders to bootstrap their dApps launch using the existing Pi community.</p></li></ul><h2 id="h-pi-governance" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pi Governance</h2><p>Mindful of the necessity of having a lot of flexibility in the development of such a protocol, the core team has assembled a governance scheme evolving with the development of the Pi ecosystem and user base.</p><p>As such, in the first stage of development while the overall user base remains under 5M members, the Pi project will used a provisional governance model resembling off-chain governance models currently employed by other protocols such as Ethereum or Bitcoin in order to enable the core team to guide at best the development of the protocol. Then, following this first stage, once the Pi ecosystem will reach its target of 5M members, a provisional committee will be formed based on prior active agents contributions and be charged with the responsability of leading the future development of the protocol.</p><h2 id="h-pi-cryptoeconomics" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pi Cryptoeconomics</h2><h3 id="h-main-token-metrics" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Main token metrics</h3><p>The network being still in its early stage of development the Pi token is not currently listed on any exchanges nor tradable for any currency. Nonetheless, it is interesting to note that in order to bootstrap interest for the project the team launched a test version of the token which will become later on redeemable for live Pi token once the network will reach its mainnet stage in order to reward early supporters of the project.</p><p>Note: In order to avoid any amlicious attack on the network during its pre-mainnet phase with agents spinning up multiple wallet address in order to perform parallel mining, the core team develop a really tight KYC process relying on manual verification from other active agents of the Pi network (video, picture, ID papers …)</p><h3 id="h-token-utility" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token utility</h3><p>The design being still in the midst of profound development, nothing can be asserted here with absolute certainty regarding the full range of utility functions that the Pi token will indeed fulfill upon mainnet launch. 
<h3 id="h-token-utility" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token utility</h3><p>The design still being in the midst of profound development, nothing can be asserted here with absolute certainty regarding the full range of utility functions that the Pi token will fulfill upon mainnet launch. Nonetheless, as per the project documentation it appears that the Pi token will at least fulfill the following core utility functions:</p><ul><li><p>Governance — Grants voting rights on governance votes</p></li><li><p>Mining rewards — Grants proportional rewards to all agents actively supporting the network</p></li><li><p>Fee payment — Enables users to prioritise the processing of their transactions in case of high throughput on the network</p></li></ul><h3 id="h-token-allocation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token allocation</h3><p>As per the protocol documentation, the Pi network exhibits a max supply of 100 billion Pi tokens allocated as follows:</p><ul><li><p>20 billion for pre-mainnet</p></li><li><p>45 billion for future mainnet</p></li><li><p>5 billion for liquidity pool</p></li><li><p>10 billion for foundation and community grants</p></li><li><p>20 billion for the core team, vested based upon the percentage of coins unlocked at a given time on the network (4 billion unlocked during pre-mainnet, corresponding to 20% of the 20 billion tokens unlocked during pre-mainnet)</p></li></ul><p><strong>Observation:</strong> This is an extremely inefficient token allocation, creating a worrying token distribution problem as well as massive front-loading in favour of the core team: as per the current vesting schedule, the team of roughly 30 people has already received 4 billion tokens, i.e. a stack of approximately 130M tokens per team member, against a potential stack of only approximately 571 tokens per active user if we take the latest report of 35 million active Pi agents (see the sketch below).</p><h3 id="h-token-emission" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token emission</h3><p>The token emission policy of the network is governed by the following equation:</p><p>Total max supply = Mining supply (M) + Referral supply (R) + Developer reward (D)</p><p>where:</p><p>The mining supply corresponds to the aggregate amount of Pi tokens minted for each new user joining the network, up until the 100 millionth new user.</p><p>The referral supply corresponds to an additional protocol incentive minting a fixed amount of Pi tokens as a referral bonus, allocated equally to the referrer and the referee over their lifetime on the network.</p><p>The developer reward is minted alongside each new Pi coin minted for mining or referral purposes, in order to build a protocol treasury that grows proportionally to the size of the Pi network and funds future protocol development and the grants program.</p>
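<p>As a quick back-of-the-envelope check of the figures above, the short sketch below recomputes the per-team-member and per-user stacks from the numbers quoted in this section; the 30-person team and 35 million active users are the figures reported above, and the rest is straightforward arithmetic.</p><pre><code># Back-of-the-envelope check of the allocation figures quoted above.
MAX_SUPPLY = 100e9  # 100 billion Pi

allocation = {
    "pre_mainnet": 20e9,
    "future_mainnet": 45e9,
    "liquidity_pool": 5e9,
    "foundation_and_grants": 10e9,
    "core_team": 20e9,
}
assert sum(allocation.values()) == MAX_SUPPLY  # buckets add up to the max supply

team_unlocked = 0.20 * allocation["pre_mainnet"]  # 4 billion unlocked for the team (20% of the 20B pre-mainnet bucket)
team_members = 30        # approximate team size quoted above
active_users = 35e6      # latest reported number of active agents

print(f"per team member: {team_unlocked / team_members:,.0f} Pi")              # ~133M
print(f"per active user: {allocation['pre_mainnet'] / active_users:,.0f} Pi")  # ~571
</code></pre>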
<h2 id="h-pi-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Pi Ecosystem</h2><h3 id="h-pi-team" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Pi Team</h3><p>Dr Nicolas Kokkalis, Head of Technology <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://minepi.com/team/member/1">https://minepi.com/team/member/1</a></p><p>Stanford PhD and instructor of Stanford’s first decentralized applications class; combining distributed systems and human computer interaction to bring cryptocurrency to everyday people.</p><p>Dr Chengdiao Fan, Head of Product <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://minepi.com/team/member/2">https://minepi.com/team/member/2</a></p><p>Stanford PhD in Computational Anthropology harnessing social computing to unlock human potential on a global scale.</p><h3 id="h-ecosystem-current-state" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ecosystem current state</h3><p>As of now, given the early stage of development of the project, there is no ecosystem per se developed around the Pi network.</p><h2 id="h-roadmap" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Roadmap</h2><p>The roadmap published by the team is opaque, to say the least. Even though it gives the broad direction of the project, there is no indication of the current state of core development and no public GitHub repository to audit the code implemented by the project. This again raises the problem of the opacity surrounding the project, and casts a shadow over the claim of future AI integration for app users given the fairly basic state of the technical stack they currently experience.</p><h3 id="h-pi-network-roadmap-as-per-the-whitepaper" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Pi network roadmap (as per the whitepaper):</h3><p><strong>Phase 1</strong> - Design, Distribution, Trust Graph Bootstrap.</p><p>The Pi server is operating as a faucet emulating the behaviour of the decentralised system as it will function once it is live. During this phase improvements in the user experience and behaviour are possible and relatively easy to make compared to the stable phase of the mainnet.</p><p><strong>Note:</strong> All minting of coins to users will be migrated to the live net once it launches, meaning that the livenet will pre-mint in its genesis block all account holder balances generated during Phase 1, and continue operating just like the current system but fully decentralized.</p><p><strong>Phase 2</strong> - Testnet</p><p>Before the mainnet launch, the Node software will be deployed on a testnet. The testnet will use the same exact trust graph as the mainnet but on a testing Pi coin. The Pi core team will host several nodes on the testnet, but will encourage more Pioneers to start their own nodes on the testnet.</p><p><strong>Phase 3</strong> - Mainnet</p><p>When the community feels the software is ready for production, and it has been thoroughly tested on the testnet, the official mainnet of the Pi network will be launched. An important detail is that, in the transition into the mainnet, only accounts validated to belong to distinct real individuals will be honored. After this point, the faucet and Pi network emulator of Phase 1 will be shut down and the system will continue on its own forever.</p><h2 id="h-conclusion" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">Conclusion</h2><p>Despite being conceptually interesting from an architectural and network design standpoint, the questionable cryptoeconomic design of the Pi network, which creates massive token front-loading for the core team, as well as the overall opacity regarding the current development and state of the project, raise many red flags that will undoubtedly dampen the interest of potential investors in the future.
Consequently, rather than a groundbreaking new L1 protocol ripe for future success, it appears better to consider the Pi Network as a good proof of concept of what can be built with the SCP consensus algorithm, and a useful theoretical foundation for future similar protocols integrating better cryptoeconomic and governance designs.</p><p>Disclaimer: <em>All information/documents contained in this blog rely solely on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investments and do not invest more than you can afford to lose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/46bf8dde5a6cb9c5297af14e79ba54b89f13a0f672c8a9794129e2382012674e.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[API3 Network Overview]]></title>
            <link>https://paragraph.com/@cybergen-lab/api3-network-overview</link>
            <guid>oDjW0G7lJ4SNMd57PqMC</guid>
            <pubDate>Sat, 29 Oct 2022 14:12:40 GMT</pubDate>
            <description><![CDATA[Advertised as the next generation of oracle solution, API3 is a community-owned oracle protocol relying on a decentralised network of first-party oracles enabling a direct integration between off-chain API providers and on-chain smart-contract processes. Indeed, on the contrary of other more established protocols like Chainlink using the legacy third party oracle framework where incentivised anonymous third parties run oracle nodes, API3 is able to benefits from a transparent and streamline n...]]></description>
<content:encoded><![CDATA[<p>Advertised as the next generation of oracle solutions, API3 is a community-owned oracle protocol relying on a decentralised network of first-party oracles, enabling direct integration between off-chain API providers and on-chain smart-contract processes. Indeed, unlike more established protocols like Chainlink, which use the legacy third-party oracle framework where incentivised anonymous third parties run oracle nodes, API3 benefits from a transparent and streamlined network architecture leveraging existing off-chain API providers to deliver off-chain data in a reliable and trust-minimised way.</p><p><strong>Why use API3?</strong></p><ul><li><p>No middle-man tax — Redirection of cryptoeconomic flows towards the API providers</p></li><li><p>High level of transparency — Data cryptographically signed by well-known API providers</p></li><li><p>Collaborative governance — Interesting use of the DAO framework offering a large degree of community power over protocol development compared to competitors such as Chainlink</p></li></ul><h2 id="h-api3-architecture" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Architecture</h2><p>API3’s network architecture is built upon five core components:</p><ul><li><p>Airnode — Easy-to-set-up serverless oracle node enabling API providers to transmit proprietary data on-chain with little to no maintenance.</p></li><li><p>Oracle Integration Specifications (OIS) — Technical component defining, through a JSON object, all the API specifications as well as the Airnode endpoints linked to API operations (see the sketch below).</p></li><li><p>Beacons — Continuously updated streams of off-chain data provided by known API providers.</p></li><li><p>API3 market — Data feed marketplace enabling users to search, monitor and consume existing data feeds on a broad range of networks, similar to the one developed by Chainlink.</p></li><li><p>API3 staking pool — Staking pool acting as collateral for the payment of valid service coverage claims and enabling token holders to obtain representation in the API3 DAO and earn staking rewards for their support of the network.</p></li></ul><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/258f8ef6d3fda22873ba6cbd0d89d1998159ff0a96e76c6d1a8ba51a5e19f251.png" alt="Overview of API3 design " blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">Overview of API3 design</figcaption></figure><p><strong>Note:</strong> Interesting use of the staking pool as protocol collateral, providing a cryptoeconomic incentive for active agents to govern the protocol efficiently.</p>
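<p>To make the Oracle Integration Specifications component more tangible, here is a deliberately simplified, hypothetical sketch of an OIS-style object expressed as a Python dictionary; the field names and endpoint are illustrative assumptions and do not reproduce the actual Airnode OIS schema.</p><pre><code># Hypothetical, simplified OIS-style mapping (illustrative only, not the real Airnode schema):
# it ties an off-chain API operation to the Airnode endpoint that on-chain requesters reference.
ois_sketch = {
    "title": "example-price-api",                  # assumed provider name
    "apiSpecifications": {
        "baseUrl": "https://api.example.com",      # assumed off-chain API
        "operations": {"GET /price": {"parameters": ["asset", "quote"]}},
    },
    "endpoints": [
        {
            "name": "getAssetPrice",               # endpoint name used by on-chain requesters
            "operation": "GET /price",
            "fixedParameters": {"quote": "USD"},
        }
    ],
}
</code></pre>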
<h2 id="h-api3-features" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Features</h2><p>Despite being relatively new compared to other more established oracle protocols, API3 already offers an interesting range of products bringing additional flexibility and features compared to what is currently available in the space:</p><ul><li><p>dAPIs — Highly flexible data feed service enabling users to access general data feeds or develop new ones with custom curation features (number of beacons, …)</p></li><li><p>ChainAPI — Intuitive off-the-shelf solution enabling API providers to seamlessly configure and deploy an Airnode integrated with their API without the need for a dedicated in-house DevOps team.</p></li><li><p>QRNG — On-demand quantum random value generator offering a higher level of randomness than pseudo-random generation functions such as Chainlink VRF.</p></li></ul><h2 id="h-api3-governance" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Governance</h2><p>The API3 network is a community-owned protocol governed through a Decentralized Autonomous Organisation (DAO), enabling every API3 token holder to participate in the governance of the protocol proportionally to their token holdings.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/a15ce4bb1e8dcf583da8e6144cf88165d0f070d5187f0f4b189f4004422b986f.png" alt="API3 DAO design overview" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="">API3 DAO design overview</figcaption></figure><p>The API3 DAO only votes on high-level parameters regarding core protocol mechanics and developments (roadmap, grants …), while more granular tasks are conducted through a custom hierarchical team structure in order to avoid unnecessary overlap during the development process and ensure that critical operations such as dAPI management are executed swiftly and based on expert opinion.</p><h3 id="h-governance-flow" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Governance flow</h3><ul><li><p>Step 1 — Tasks are conducted by off-chain teams of specialists appointed by the DAO through grants</p></li><li><p>Step 2 — Once a task reaches a scale that can no longer be handled by a single team, a subDAO with its own set of affiliated off-chain teams is created to replace it.</p></li></ul><h2 id="h-api3-cryptoeconomics" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Cryptoeconomics</h2><h3 id="h-main-token-metrics-102122" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Main token metrics (10.21.22)</h3><p>Source: <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://coinmarketcap.com/currencies/api3/">coinmarketcap</a></p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/5a76c9d2a82dd72262c71d386e4ac691801af9e450ed09e929d882e895473ba2.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-token-utility" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token Utility</h3><p>The native API3 ERC20 token fulfills three core utility functions:</p><ul><li><p>Staking — Grants inflationary rewards to users staking tokens in the API3 pool</p></li><li><p>Governance — Grants direct representation in the API3 DAO</p></li><li><p>Collateral — Ensures financial service coverage backing against damages caused by dAPI malfunctions</p></li></ul><p><strong>Observation:</strong> It is critical for these three utilities to coincide: all governing entities must receive staking rewards so that they govern in a way that maximises usage, and must also have their funds used as collateral to ensure responsible protocol governance that minimises security risks. The sketch below illustrates this coupling.</p>
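<p>The following toy Python sketch, an assumption-laden illustration rather than the actual API3 staking contracts, shows how a single staked position could simultaneously confer DAO voting weight and absorb a pro-rata share of a service coverage payout.</p><pre><code># Toy illustration (not the API3 contracts): one staked balance doubles as
# governance weight and as collateral that absorbs service coverage claims.
class StakingPool:
    def __init__(self):
        self.stakes = {}  # address -> staked tokens

    def stake(self, address: str, amount: float) -> None:
        self.stakes[address] = self.stakes.get(address, 0.0) + amount

    def voting_weight(self, address: str) -> float:
        # Representation in the DAO is proportional to the staked balance
        total = sum(self.stakes.values())
        return self.stakes.get(address, 0.0) / total if total else 0.0

    def pay_claim(self, claim_amount: float) -> None:
        # A successful service coverage claim is paid from the pool,
        # reducing every staker's position pro rata
        total = sum(self.stakes.values())
        payable = min(claim_amount, total)
        for address in self.stakes:
            self.stakes[address] -= payable * self.stakes[address] / total

pool = StakingPool()
pool.stake("dao_member_a", 600.0)
pool.stake("dao_member_b", 400.0)
pool.pay_claim(100.0)                      # both stakers share the loss 60/40
print(pool.voting_weight("dao_member_a"))  # still 0.6: weights unchanged, balances lower
</code></pre>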
id="h-token-allocation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token Allocation</h3><p>As of today the total supply of AP3 tokens is of  114,855,860 tokens. Out of this total supply, 100 million tokens were initially allocated in November 2020 during the initial token distribution on Mesa DEX and the remaining 14,855,860 API3 has been minted as staking rewards since then.</p><p><strong>Initial token distribution breakdown:</strong></p><ul><li><p>30 million for API3 founders and team vested over three years with a six-month cliff</p></li><li><p>25 million for the API3 DAO to build the project&apos;s community and ecosystem vested over three years with a six month cliff</p></li><li><p>20 million for public investors unlocked after the initial Mesa DEX offering</p></li><li><p>10 million for API3 partners vested over three years with a six month cliff</p></li><li><p>10 million for seed investors over two years</p></li><li><p>5 million for pre-seed investors vested over two years</p></li></ul><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/35c2bdee4f88ac8ba2ed0a3d94563bdd28805f179e6037f519af95ed392cfb3d.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><h3 id="h-token-inflation" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Token inflation</h3><p>To secure the network through staking, API3 maintains an inflationary issuance of staking rewards. The staked tokens are primarily used as collateral in the on-chain service coverage fund. However, if the non-profit API3 DAO has excess revenue after subtracting all its costs, it use this excess revenue to buy API3 tokens that are afterwards burned. Similarly, any USDC left over from operations will be used to buy API3 from the open market and burn it.</p><p>Furthermore, the inflation rate is calculated using the staking pool size as a percentage of the total supply that is set by the API3 DAO. 
If the staking target is met, the APR is decreased by 1% each week; otherwise, the APR is increased by 1% each week, as illustrated below.</p><p><strong>Note:</strong> Staking rewards are paid out weekly, and are locked for one year after they have been paid.</p>
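<p>A minimal sketch of this weekly adjustment rule, assuming a hypothetical 50% staking target and treating the 1% step as a percentage-point change (both assumptions are for illustration only, not API3's published parameters), could look like this:</p><pre><code># Minimal sketch of the weekly APR adjustment described above.
# The 50% target and the percentage-point step are illustrative assumptions.
STAKING_TARGET = 0.50  # assumed target: staked supply as a fraction of total supply

def adjust_apr(current_apr: float, staked: float, total_supply: float) -> float:
    staked_ratio = staked / total_supply
    if staked_ratio >= STAKING_TARGET:
        return max(current_apr - 0.01, 0.0)  # target met: decrease APR
    return current_apr + 0.01                # target missed: increase APR

apr = 0.25
apr = adjust_apr(apr, staked=40_000_000, total_supply=114_855_860)  # below target, so APR rises
print(f"new APR: {apr:.2%}")  # 26.00%
</code></pre>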
<h2 id="h-api3-ecosystem" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Ecosystem</h2><h3 id="h-api3-team" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">API3 Team</h3><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/heikki-v%C3%A4nttinen-83a86380/">Heikki Vanttinen</a>: Co-founder, Strategy</p><p>Extensive prior experience in the blockchain space, particularly in domains such as smart contracts and decentralised identity.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/burak-benligiray-b3055715b/">Burak Benligiray:</a> Co-founder, Core technical team lead</p><p>Former Google scholar and CTO at CLC Group and Honeycomb, with prior experience in computer vision and artificial intelligence.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/sasa-milic/">Saša Milić</a>: Co-founder</p><p>Software engineer, data scientist and cryptocurrency researcher with prior software development experience at Facebook.</p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://www.linkedin.com/in/masonburkhalter/">Mason Burkhalter:</a> Operations Lead</p><p>Extensive prior experience in business consulting and software development.</p><h3 id="h-ecosystem-current-state" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Ecosystem current state</h3><ul><li><p>Airnode deployed on 13 chains and EVM compatible</p></li><li><p>Multi-feed data sources still in development</p></li><li><p>Over 150 data providers currently available</p></li><li><p>Lack of traction on flagship products (more than 2/3 of QRNG requests still originate from the Rinkeby testnet, ...)</p></li><li><p>Little to no effort to attract new data providers to the ecosystem, given the core team's current focus on data curation quality</p></li></ul><h3 id="h-key-partnerships" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">Key Partnerships</h3><p><strong>API3 X AmberData</strong></p><p>Key partnership with Amberdata which enabled the development of a broad set of high-quality price data feeds on a small subset of EVM-compatible chains (Ethereum, Polygon, BNB, Avalanche and RSK). However, the sole focus on cryptocurrency prices still somewhat limits the potential use cases of those API3 dAPIs.</p><p><strong>API3 X Silta</strong></p><p>Development of a new class of financial products leveraging API3’s oracle solution in order to build credit score oracles as well as novel financial assets enabling physical infrastructure developers to use on-chain collateral.</p><h2 id="h-api3-roadmap" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">API3 Roadmap</h2><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/8673dd73f29ae2c527e764b9fb84b705907ea52aff50f94f4ce054e2e61d3b7e.png" alt="" blurdataurl="data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACwAAAAAAQABAAACAkQBADs=" nextheight="600" nextwidth="800" class="image-node embed"><figcaption HTMLAttributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Disclaimer: <em>All information/documents contained in this article rely solely on my personal beliefs, and do not constitute professional investment advice.</em></p><p><em>Be careful in your investments and do not invest more than you can afford to lose.</em></p>]]></content:encoded>
            <author>cybergen-lab@newsletter.paragraph.com (CyberGen Lab)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/23463b14a946ccc4902b3474ec890d6da67b328e078d946eade07063422660e5.png" length="0" type="image/png"/>
        </item>
    </channel>
</rss>