Cross-Chain NFT Marketplace MEV Strategy with Artemis: A Technical Commentary
"A strategy implementing atomic, cross-market NFT arbitrage between Seaport and Sudoswap. At a high level, we listen to a stream of new seaport orders, and compute whether we can atomically fulfill the order and sell the NFT into a sudoswap pool while making a profit." Paradigm recently released Artemis: An Open-Sourced MEV Bot Framework with a cross-market NFT arbitrage strategy implementation included. This is a brief technical commentary focusing on how the initial strategy is set up and ho...
Free Historical Blockchain Extraction with Cryo + Merkle Reth Nodes
Introduction
Historical blockchain data poses challenges for analysis. Despite its general accessibility, obtaining and analyzing such data has been historically hindered by paywalls and restrictions imposed by node service providers. Setting up a personal archive node is also a non-trivial task, introducing extra steps before data analysis becomes feasible. By leveraging Cryo to extract historical data and utilizing the free Merkle RPC with archive node support, researchers can now easily acc...
Old mirror - https://mirror.xyz/evandekim.eth https://mirror.xyz/evandekim.eth/kowg_VFD7lp5p12C4wcytc2rooVXgKnUwBd-KUKtndQ
DataStreams is a Subgraph query utility package that allows users to execute complex Subgraph queries. It provides extended functionality on top of Subgrounds, The Graph's Python data-access package. The main benefit is that anyone can now query Subgraph data, save it to local storage as CSV files, and perform data analytics immediately. DataStreams makes data pipeline creation reproducible in a transparent, lightweight manner. No database needed!
The main package in DataStreams is the Streamer class. Streamer streamlines the Subgraph query process in Python, exposing key information such as the queryable Subgraph schemas and filter clauses. In v1.0.0, functionality includes:
DRY Subgraph Query Code - reuse code to reduce complexity of Subgraph queries
Parallelized Subgraph Queries - leverage the Python standard-library concurrent.futures module to unlock concurrency at the Subgraph-query level
Query the Chainlink Subgraph node for multiple token prices over a customized time range. Example notebook found here
Query the Cowswap Subgraph node for four specific schemas with parallelization support.
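The parallelized-query pattern above can be sketched with the standard-library concurrent.futures module. This is a minimal illustration of the concurrency pattern only, not the Streamer API: fetch_schema is a hypothetical stand-in for a real Subgrounds query against a Subgraph endpoint.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one Subgraph query: in DataStreams, a call like
# this would hit a Subgraph endpoint via Subgrounds and return rows for one
# schema (e.g. "orders" or "settlements" on the Cowswap Subgraph).
def fetch_schema(schema_name: str) -> dict:
    return {"schema": schema_name,
            "rows": [f"{schema_name}_row_{i}" for i in range(3)]}

def fetch_all(schemas: list[str]) -> list[dict]:
    # Run one query per schema concurrently; pool.map preserves input order,
    # so results line up with the schema list the caller passed in.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(fetch_schema, schemas))

results = fetch_all(["orders", "trades", "settlements", "tokens"])
print([r["schema"] for r in results])
```

Because Subgraph queries are network-bound, a thread pool is the natural fit here: threads overlap the HTTP round-trips without needing multiprocessing.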