
CTO & Co-Founder of dex.guru, block explorer https://b2b.dex.guru/explorer and data warehouse https://warehouse.dex.guru/

As we run our own EVM infrastructure rather than relying on third-party services, we have to deal with every new trend that lets bots and users push Geth performance to new lows on the same hardware. In December, it was Inscriptions.
For whatever reason, someone decided that porting BRC-20 to EVM chains was a good idea, and it can be implemented cheaply by submitting transactions with nothing but JSON data in the input field.

That data is easily parsed with ClickHouse's JSONExtractString function.
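As a minimal illustration (the payload here is a made-up example, not a real transaction), extracting a field from a decoded inscription looks like this:

```sql
-- Pull the ticker out of a decoded brc-20-style payload
SELECT JSONExtractString('{"p":"brc-20","op":"mint","tick":"gwei","amt":"1000"}', 'tick') AS ticker;
-- ticker = 'gwei'
```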

So everyone started deploying and "minting" whatever they liked, nodes caught fire, infrastructure on multiple chains got massively de-synced, block explorers lagged, and all the other things we love happened as we went through another round of "performance testing" like that.

Following @cygaar's post on Inscriptions
https://twitter.com/0xcygaar/status/1736581628259107314
and intrigued by what kind of thunderstorm was causing nodes to de-sync on Polygon and BSC, I did my own research. In a nutshell, whoever wants a dApp dealing with inscriptions needs to run their own indexer, so that whenever a transaction with an inscription arrives, they can update the asset's state accordingly. The indexer operates off-chain, and if every participant follows the brc-20 standard in the metadata they submit in the transaction input, anyone indexing all transactions can reconstruct the balance state of every ticker. Tickers themselves are not unique, just like symbols on ERC-20, so the unique combination to track is the deployer (from address) plus the ticker name.
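The indexer logic described above can be sketched in a few lines of Python. This is a simplified model under my own assumptions (class and field names are mine; a real indexer would also handle transfers, per-ticker mint limits, and chain reorgs):

```python
import json
from collections import defaultdict


def parse_inscription(input_hex: str):
    """Decode a transaction's input field and extract the brc-20-style JSON payload.

    Inscription calldata is typically 'data:,{...}' encoded as hex, so we
    decode the hex, find the first '{', and try to parse JSON from there.
    """
    raw = bytes.fromhex(input_hex.removeprefix("0x"))
    try:
        text = raw.decode("utf-8")
    except UnicodeDecodeError:
        return None
    brace = text.find("{")
    if brace == -1:
        return None
    try:
        payload = json.loads(text[brace:])
    except json.JSONDecodeError:
        return None
    # 'p' (protocol) and 'op' (operation) are required by the convention
    if "p" not in payload or "op" not in payload:
        return None
    return payload


class InscriptionIndexer:
    """Minimal off-chain state: balances keyed by (deployer, ticker, holder)."""

    def __init__(self):
        self.deployers = {}               # ticker -> first deployer address
        self.balances = defaultdict(int)  # (deployer, ticker, holder) -> amount

    def apply(self, from_address: str, input_hex: str):
        ins = parse_inscription(input_hex)
        if ins is None:
            return
        tick = ins.get("tick", "")
        if ins["op"] == "deploy" and tick not in self.deployers:
            self.deployers[tick] = from_address
        elif ins["op"] == "mint" and tick in self.deployers:
            key = (self.deployers[tick], tick, from_address)
            self.balances[key] += int(ins.get("amt", 0))
```

Feeding every transaction through `apply` reconstructs the same state any other honest indexer would reach, which is the whole trust model here.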
Following that pattern, and looking into the EVM Inscriptions Dune dashboard created by @hildobby, I decided to tackle the thing myself and dropped a couple of materialized views on top of our transactions table in ClickHouse, powering Guru Warehouse:
CREATE TABLE `inscriptions` $on_cluster
(
`block_number` UInt64,
`block_timestamp` UInt32,
`transaction_hash` String CODEC(ZSTD(1)),
`from_address` String CODEC(ZSTD(1)),
`to_address` Nullable(String) CODEC(ZSTD(1)),
`transaction_index` UInt32,
`gas_used` UInt64,
`gas_price` UInt256,
`standard` LowCardinality(String),
`operation` LowCardinality(String),
`ticker` String CODEC(ZSTD(1)),
`amount` String CODEC(ZSTD(1)),
`inscription` JSON CODEC(ZSTD(1)),
INDEX blocks_timestamp block_timestamp TYPE minmax GRANULARITY 1
)
ENGINE = ${replicated}ReplacingMergeTree
ORDER BY (block_number, transaction_hash)
SETTINGS index_granularity = 8192;
CREATE MATERIALIZED VIEW `inscription_mv_parsed_transactions` $on_cluster
TO `inscriptions`
AS
SELECT
block_number as block_number,
block_timestamp as block_timestamp,
hash AS transaction_hash,
from_address AS from_address,
to_address AS to_address,
transaction_index AS transaction_index,
ifNull(receipt_gas_used, 0) as gas_used,
ifNull(receipt_effective_gas_price, 0) as gas_price,
JSONExtractString(input_json, 'p') AS standard,
JSONExtractString(input_json, 'op') AS operation,
JSONExtractString(input_json, 'tick') AS ticker,
JSONExtractString(input_json, 'amt') AS amount, -- kept as a string to match the column type
input_json AS inscription
FROM
(
SELECT
block_number,
block_timestamp,
hash,
from_address,
to_address,
transaction_index,
receipt_gas_used,
receipt_effective_gas_price,
substring(unhexed_input, position(unhexed_input, '{')) AS input_json, -- Extracts the JSON part from the input
unhex(input) AS unhexed_input -- Converts hex input to binary string
FROM transactions
)
WHERE
JSONHas(input_json, 'p') AND
JSONHas(input_json, 'op'); -- Check if JSON extraction is successful
Having those tables in the warehouse allowed me to start building queries:
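For example, a leaderboard of the most-minted tickers can be sketched like this (a query over the table defined above, not verified against the live warehouse):

```sql
-- Most-minted tickers, with unique minter counts
SELECT
    ticker,
    count() AS mints,
    uniq(from_address) AS minters
FROM inscriptions
WHERE operation = 'mint'
GROUP BY ticker
ORDER BY mints DESC
LIMIT 20;
```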

I added some queries:


Some funny tickers have been overminted tremendously:

We've decided to embed the Inscriptions Data Warehouse dashboard into our Markets Overview page and test how the Warehouse <-> DexGuru integration looks overall, so we can use the same idea for more dashboards in the future, placing them on different pages and keeping community-driven analytics in the same place where trading happens.
https://dex.guru/markets/inscriptions
All these queries are available via API, so anyone can build their own dashboards and analytics on top of them. https://warehouse.dex.guru/queries/158#-1
While doing that research, I kept toying with the idea of using inscriptions for something useful, and landed on using inscriptions as reward points in an off-chain/on-chain rewards system: a user completes a quest off-chain (on a Web2 website) and can then claim points represented on-chain in inscription form. The benefits:
Inscriptions are cheap to mint.
Activities can happen off-chain (Web2) and still be submitted to the inscriptions indexer.
So I'm looking forward to building something on top of them.
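A claim in such a system could be encoded the same way as any other inscription. The sketch below is purely illustrative: the "grs-20" protocol tag, the "claim" op, and the field names are my own assumptions, not an existing standard:

```python
import json


def build_points_inscription(points: int, quest_id: str) -> str:
    """Encode a hypothetical reward-points claim as transaction calldata.

    The indexer would verify off-chain that the sender actually completed
    the quest before crediting the points to their balance.
    """
    payload = {
        "p": "grs-20",      # hypothetical protocol tag
        "op": "claim",      # claim points earned off-chain
        "quest": quest_id,  # which Web2 quest was completed
        "amt": str(points), # amounts as strings, as in brc-20
    }
    text = "data:," + json.dumps(payload, separators=(",", ":"))
    return "0x" + text.encode("utf-8").hex()
```

The user would submit this as the input of a zero-value self-transfer, and the same indexer pattern described earlier would pick it up and update the points balance.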