
CTO & Co-Founder of dex.guru, block explorer https://b2b.dex.guru/explorer and data warehouse https://warehouse.dex.guru/

I was bored on Christmas Eve (a Saturday night), so I took on the problem of predicting the block number for the New Year date (UTC 1704067200). I sniffed around and found it an interesting problem in itself to tackle by applying linear regression to the data we have on blocks in the Guru Data Warehouse. I created a Public API and a Warehouse SDK out of it, preparing the block time/number prediction feature to be released in Guru Block Explorers from the backend side.
For whatever reason, I needed the New Year date calculated in blocks on different chains.
epoch timestamp: 1704067200
Timestamp in milliseconds: 1704067200000
Date and time (GMT): Monday, January 1, 2024 12:00:00 AM
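For reference, the same conversion can be reproduced with a couple of lines of standard-library Python:

```python
from datetime import datetime, timezone

new_year = 1704067200  # epoch timestamp, seconds

# epoch seconds -> human-readable UTC date
print(datetime.fromtimestamp(new_year, tz=timezone.utc))
# 2024-01-01 00:00:00+00:00

# seconds -> milliseconds
print(new_year * 1000)
# 1704067200000
```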

I tried the DefiLlama API (https://defillama.com/docs/api), but it wasn't working.




To predict blocks in the future, we need info on blocks in the past for each chain, and we have the Guru Warehouse for that.

GURU AI Assist:
I asked our assistant to help me get a linear regression of block number vs. timestamp out of the blocks table in the warehouse:

Unfortunately, Guru AI Assist had no clue about ClickHouse's linear regression functionality :( so I needed to figure it out myself.
Since we are using ClickHouse, we can utilize ClickHouse's linear regression for number/timestamp predictions (and back), so I created the following query:
SELECT
    simpleLinearRegression(number, timestamp) AS lr
FROM (
    SELECT
        number,
        timestamp
    FROM {{ network }}.blocks FINAL
    ORDER BY number DESC
    LIMIT {{ training_set_size }}
)
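For readers unfamiliar with it: ClickHouse's simpleLinearRegression(x, y) performs a least-squares fit of y ≈ k·x + b over the aggregated rows and returns the (k, b) pair. Here x is the block number and y is the timestamp, so k is roughly the chain's seconds-per-block. A plain-Python sketch of the same fit (toy numbers, not real chain data):

```python
def simple_linear_regression(xs, ys):
    """Least-squares fit of ys ~ k * xs + b, mirroring what
    ClickHouse's simpleLinearRegression(x, y) computes."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    k = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - k * mean_x
    return k, b

# Toy chain where a block lands exactly every 12 seconds:
numbers = [100, 101, 102, 103, 104]
timestamps = [1200, 1212, 1224, 1236, 1248]
k, b = simple_linear_regression(numbers, timestamps)
print(k, b)  # 12.0 0.0
```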

Thinking a bit about how to combine these into meaningful answers, I sketched methods like:
def get_block_by_timestamp(self, network: str, timestamp: int) -> Optional[BlockModel]:
    pass

def get_block(network: str, block_number: int) -> Optional[BlockModel]:
    pass
I decided to create a Warehouse SDK under dex-guru and opened a PR there: https://github.com/dex-guru/dg-warehouse-sdk/pull/1
The main idea is to query the Warehouse for a linear-regression prediction when the requested block/timestamp is in the future, and to return the indexed block when it's in the past:
Example:
def get_block_by_timestamp(self, network: str,
                           timestamp: int) -> Optional[BlockModel]:
    """
    Either predicts or returns a block by timestamp.
    :param network:
    :param timestamp:
    :return:
    """
    timestamp = self._transform_timestamp(timestamp)
    last_indexed_block = self.get_last_indexed_block(network)
    if not last_indexed_block:
        return None
    last_indexed_block = last_indexed_block[0]
    if timestamp > last_indexed_block['timestamp']:
        # Future timestamp: predict. Train on the most recent 10% of all blocks.
        training_set_size = round(last_indexed_block['number'] * 0.1)
        lr_coefficients = self.get_lr_coefficients(network, training_set_size)
        if not lr_coefficients:
            return None
        lr_coefficients = lr_coefficients[0]['lr']
        # The fit is timestamp = k * number + b, so invert it for the number.
        block_number = round((timestamp - lr_coefficients["b"]) / lr_coefficients["k"])
        return BlockModel(number=block_number, timestamp=timestamp)
    else:
        # Past timestamp: the block is already indexed, just look it up.
        block_by_timestamp = self.get_indexed_block_by_timestamp(network,
                                                                 timestamp)
        if not block_by_timestamp:
            return None
        block_by_timestamp = block_by_timestamp[0]
        return BlockModel(**block_by_timestamp)
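As a sanity check of the inversion step, here is the prediction arithmetic in isolation, with hypothetical coefficients (made-up numbers, not real chain data):

```python
# simpleLinearRegression(number, timestamp) fits timestamp = k * number + b,
# so a future timestamp maps back to number = (timestamp - b) / k.
lr = {"k": 12.0, "b": 1_500_000_000.0}  # hypothetical coefficients
target_ts = 1_704_067_200               # New Year timestamp
predicted_block = round((target_ts - lr["b"]) / lr["k"])
print(predicted_block)  # 17005600
```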
I embedded it into the DexGuru Public API, free of charge (no API key needed):

Let's figure out the block number for the 1704067200 (New Year) timestamp:

Cool! Next, I'd put it into the Guru Block Explorer to have an Etherscan-type page with time prediction there :)
