0G Labs is growing. ∅ A community account in the 0G Labs (Zero Gravity) ecosystem

🐼 Developer Office Hours #1: Unveiling the 0G Architecture & SDK Demo | Recap
On 26 July 2024, 0G Labs held its inaugural developer office hours, an event dedicated to introducing and exploring the intricacies of their "0G" architecture. Hosted by Gathin, a developer relations engineer, and Peter, a developer at 0G Labs, along with contributions from Yon, the session provided a comprehensive overview of the 0G architecture and its implications for decentralized data availability and storage. This recap delves into the key points discussed during the session…

🐼 Summary of: Testnet V2 AMA
Abstract: The AMA session primarily focused on updates to the 0G Testnet V2, including the introduction of new features, the validator selection process, and future plans for the 0G network. Key discussions included the improvements made from Testnet V1 to V2, stability and performance enhancements, and changes to the economic model. Future goals for the 0G chain, collaborations, and support for developers were also highlighted. Highlights: introduction of new economic models…

🐼 | Research: Theoriq
Review of TheoriqAI and its Partnership with 0G. Introduction to TheoriqAI: TheoriqAI represents a paradigm shift in the AI landscape, aiming to decentralize and democratize AI technology. Founded by Ron Bodkin, Arnaud Flament, David Mueller, and Ethan Jackson under ChainML Labs in 2022, TheoriqAI focuses on creating a community-governed AI ecosystem. This initiative addresses the critical issue of AI centralization, where a few entities wield significant control over AI advancements. TheoriqAI…


Subscribe to 0G Panda

X Space:
https://x.com/i/spaces/1nAJEaAlpLlJL
This episode of Hook on Web3 Mastery Space features a discussion on pioneering modular AI chains for infinite scalability with Zero Gravity Labs. Ming, the Co-Founder and CTO of Zero Gravity Labs, shares insights into their innovative blockchain infrastructure designed to support high-performance applications like on-chain AI gaming and decentralized finance. Key components include a decentralized storage network, a unique consensus protocol, and a scalable data availability layer.
* Innovative consensus protocol achieving high efficiency without sacrificing decentralization and security.
* Scalable data solutions critical for AI training and inference tasks.
* Strategic partnerships within the blockchain and AI ecosystems, including Ethereum, Polygon, and Solana.
Hello, ladies and gentlemen. Welcome to another episode of Hook on Web3 Mastery Space. In our past episodes, we have had the pleasure of discussing the future of web3 with numerous partners, and today we are excited to delve into another very exciting topic, AI and the innovation of blockchain technology, with our new partner, Zero Gravity Labs, a transformative blockchain infrastructure project aimed at enabling high-performance applications such as on-chain AI gaming and decentralized finance.

As we have just started the space, I would like to encourage you to interact with us to spice up the environment. You can send us a comment, a question, or anything for the guest, and we will read them at the end of the first session. While I'm here sipping on my peach juice, you can also comment what your favorite snack to go with the space is, or simply type GM from wherever you are in the world. It will greatly boost the confidence of the guest and encourage us to produce more meaningful content for you in the future. Give us a like and a retweet as usual, and don't forget to follow the official X (Twitter) accounts of Zero Gravity Labs and Hook Protocol right now for the latest updates.

If you are a frequent listener to our space, you are most likely familiar with me, your host, Claire. But if you are a first-timer here, let me give you a quick introduction to what Hook 2.0 is. It is a visionary expansion of the current Hook social learning ecosystem with a deeper focus on web3 education. Here we introduce a more immersive social learning experience, a metaverse, and app-chain infrastructure powered by HOOK as the native token, with multiple opportunities to generate yield through rewards.

Today, we are honored to have a representative from Zero Gravity Labs here. Let's meet Ming, co-founder and CTO of Zero Gravity, joining us. The topic we are covering today is pioneering modular AI chains for infinite scalability.
It's time to show us what is the product about, as well as some introduction about yourself. Ming, over to you.
Yeah, thank you, Claire. I'm Ming, and I'm the co-founder and CTO of 0G. My background is mainly on the technical side. Before joining the web3 area, I worked at Microsoft Research Asia for 11 years, and my research direction was mainly distributed systems, covering all kinds of aspects in that area, including computation, storage, and even AI platforms. Later, I co-founded the Conflux project with Fan Long, a high-performance layer-1 blockchain in which we designed an innovative consensus protocol to achieve high efficiency without sacrificing decentralization or security.

Recently we were looking for new challenges in this area and wanted to create the best web3 project and the best web3 team. So we met with Michael, discussed and brainstormed, and came up with the idea of 0G, and we started this project. 0G is a decentralized data solution with infinite scalability that can support all kinds of application scenarios requiring a huge amount of data throughput: for example, the data availability scenario required by Ethereum layer-2 applications, and also a bunch of decentralized AI projects with large demands for data engineering and data transformation.

0G consists of three major components. One is a decentralized storage network; another is a consensus protocol, which can also be treated as a layer-1 blockchain. The storage network connects with the consensus protocol in the sense that we use the consensus protocol to incentivize decentralized storage mining: each storage node in the storage network periodically participates in storage mining by providing a proof of storage for some specific piece of data in the entire history.
The node then submits this proof to a smart contract on the layer-1 blockchain, the 0G chain, where the incentive mechanics are implemented. On top of this decentralized storage, we also provide a data availability layer with extra functionality specific to the Ethereum layer-2 scenario. The DA layer employs an erasure-coding mechanism combined with KZG commitments: it splits the data into multiple chunks and submits the chunks to different storage nodes in the network. So that is the basic functionality of the three components of 0G at a very high level. That's my answer to the first question, and I'll be happy to answer more detailed questions along with the host.
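To make the chunking idea concrete, here is a toy sketch in Python. It uses a single XOR parity chunk as a stand-in for the real erasure coding; production DA layers use Reed-Solomon-style codes that tolerate many missing chunks, plus KZG commitments for verifiability. All function names here are illustrative, not part of any 0G SDK.

```python
from functools import reduce

def split_with_parity(data: bytes, k: int) -> list[bytes]:
    """Split data into k equal chunks plus one XOR parity chunk.

    A single parity chunk lets us survive the loss of any one chunk;
    real erasure codes generalize this to many losses.
    """
    size = -(-len(data) // k)  # ceiling division
    chunks = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)
    return chunks + [parity]

def recover(chunks: list) -> list:
    """Recover one missing chunk (marked None) by XOR-ing the survivors."""
    missing = chunks.index(None)
    survivors = [c for c in chunks if c is not None]
    chunks[missing] = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), survivors)
    return chunks
```

In the scheme Ming describes, each of these chunks would be sent to a different storage node, so losing any one node still lets a client reconstruct the original data.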
Thanks. I think it is super interesting to see the OGs, no pun intended, of the industry finally come together and figure out a new solution for this space. May I be a little bit nosy about the first time you met Michael, and how you decided to work together? Is there any background story, or was it just like-minded people sitting together and saying, hey, let's do something so different that it changes the world?
So I actually got connected with Michael through another co-founder of 0G, Thomas, who was a classmate of Michael's at Stanford Business School. Thomas was also one of the investors in Conflux, so both Fan Long and I know Thomas very well. Michael was trying to migrate from his previous web2 career into web3, and Fan Long and I were also trying to explore new topics, new interesting areas, and new challenging problems to solve. So we got connected by Thomas, met each other, and discussed some of the really important challenges that exist in the current web3 area. We figured out that a scalable data solution for the entire industry is quite challenging and very much needed, and it covers very extensive scenarios, from Ethereum layer 2, which may target financial applications, to a bunch of AI scenarios, which have recently become a very hot trend in this area. That's why we have this 0G product and this idea.
That is super inspiring. And while building such a life-changing product, what is the biggest challenge you face in a field where the innovation and integration of web3 and AI are so eagerly anticipated?
Yeah, so I think the major challenge here is how to build a very scalable data solution that still satisfies the requirements for decentralization and security. Many blockchain systems face similar challenges and trade-offs, but for a data solution it's even more challenging, because the scale can be very large, larger than anything that has happened before in any layer-1 blockchain. We are targeting tens of gigabytes per second of data throughput, which we envision will be needed in the near future, because we are at a stage where a lot of layer 2s are being built around Ethereum, and a lot of AI systems are coming into this area, which will bring a huge amount of user requests for AI inference, AI training, and that kind of workload. This is really the opportunity where we can bring web3 technologies into the web2 world and help the mass adoption of web3 technologies.

The way we address this challenge is to maximize horizontal scalability through well-designed partitioning mechanisms for all the components of the system. For example, in our storage network we employ a well-designed partitioning mechanism that lets different storage nodes store different data while still earning reasonable incentives from the smart contract monitored by the consensus layer. From this perspective, you may think our decentralized network resembles other existing systems with similar functionality, like Filecoin or Arweave, but we have a different incentive mechanism and a different partitioning design.
So we can achieve much higher scalability compared to Arweave and Filecoin with respect to data throughput. We also think the consensus layer may eventually become the bottleneck, because every piece of data coming into the storage network involves some interaction with the blockchain, and with the smart contract on it, to get settled and incentivized. The transaction-processing speed of the blockchain would become the bottleneck if we keep increasing the data throughput we want to support.

So we also want to further scale out the blockchain itself. The way we do that is a different kind of sharding design for consensus: we employ multiple consensus layers running independently and simultaneously in our system, while still trying to guarantee that the consensus protocol has the same level of security. In this scheme, the different consensus protocols share the same set of validators and the same staking status of those validators. We know that for a PoS-based blockchain, the real thing that guarantees security is how much capital has been staked to support the consensus, so by sharing the same staking status, all these consensus protocols share the same staked capital. In that sense, we can also leverage EigenLayer and borrow restaked ETH to guarantee the security of our consensus protocols.

So we can run an arbitrary number of consensus protocols. Each consensus protocol represents a single log for the decentralized data storage, and the different consensus protocols can be treated as different logs running in parallel.
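The shared-security idea above, many independent consensus lanes backed by one validator set and one pool of stake, can be sketched as a toy model. This is an illustration of the concept only, not 0G's implementation; all class and method names are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Validator:
    name: str
    stake: float  # staked capital securing every lane simultaneously

@dataclass
class SharedSecurityNetwork:
    validators: list
    lanes: list = field(default_factory=list)  # independent consensus instances

    def add_lane(self, name: str) -> None:
        # Adding a lane multiplies throughput without splitting the stake:
        # each lane is still backed by the full staked capital.
        self.lanes.append(name)

    def security_per_lane(self) -> float:
        # Every lane is secured by the total stake, not a shard of it.
        return sum(v.stake for v in self.validators)

    def aggregate_throughput(self, per_lane_tps: float) -> float:
        # Independent applications on independent lanes add up linearly.
        return per_lane_tps * len(self.lanes)
```

The design choice this captures is exactly Ming's point: sharding the workload (lanes) without sharding the economic security (stake).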
When we have different layer-2 or different AI applications and frameworks, they are actually independent of each other, with no dependencies among them, so they can be deployed on different decentralized logs, giving us aggregate throughput with this infinite scalability. We can also extend the restaking idea beyond EigenLayer by introducing restaking of BTC, for example by integrating with the Babylon project, and even restaking mechanisms in other blockchain ecosystems like Solana. That way we can make our security larger than any of these ecosystems, which means we can have BTC-level security, which could be larger than Ethereum-level security.

So that is how we achieve scalability on the storage network and the consensus network. On the data availability layer, we also want to maximize parallelism through a data-partitioning mechanism: when data comes into the DA layer, we first use erasure coding to split it into multiple chunks, and users can recover the original data by acquiring only part of those chunks. These different chunks are then submitted to different storage nodes in the network so that the data throughput can be parallelized. Through this, we achieve horizontal scalability for the DA pipeline. Every storage node that stores this data also has some stake on our layer-1 blockchain, and we use a VRF, a verifiable random function, to select a quorum from all the validators of our blockchain to hold this data.
Because this selection uses a VRF-based mechanism, it is very decentralized and can guarantee security under the honest-majority assumption. The client then collects signatures from a majority of the storage nodes for the different parts of the data, and once it has accrued enough signatures, it submits them to the smart contract on our blockchain to decide whether the DA property has been achieved. That step could become a bottleneck of the system if we had to push a lot of data through the protocol, so you can see that through our design we try to make every part of the data pipeline independent and parallel, and we minimize the amount of data that has to enter the consensus layer and be broadcast. That is how we achieve the scalability to support these applications, and I think that is the biggest challenge we are facing in any area that requires a large-scale data solution.
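The quorum selection and signature check Ming outlines might look like the following sketch, with a plain hash standing in for a real VRF. A real VRF makes the selection both unpredictable in advance and publicly verifiable afterwards; the hash here only captures the deterministic-ranking part. Names and the 2/3 threshold are illustrative assumptions, not 0G's published parameters.

```python
import hashlib

def select_committee(validators: list, seed: bytes, size: int) -> list:
    """Deterministically rank validators by a hash of (seed, id) and
    take the top `size`. A real protocol would use a VRF here."""
    ranked = sorted(validators,
                    key=lambda v: hashlib.sha256(seed + v.encode()).digest())
    return ranked[:size]

def quorum_reached(committee: list, signatures: set) -> bool:
    """Treat DA as attested once strictly more than 2/3 of the
    committee has signed for its assigned chunks."""
    signed = sum(1 for v in committee if v in signatures)
    return 3 * signed > 2 * len(committee)
```

A client would gather signatures from committee members holding the chunks, then submit the set on-chain once `quorum_reached` is true.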
Thanks, Ming. I think we have received a large amount of knowledge from different sectors. We have gone deep into the validator system you shared with us on 0G, and we have talked about security and scalability, and how to balance performance across different chains. So is it safe to say that these are all the criteria that support high performance for apps building on 0G? And is there anything else that a user applying AI technology for the first time can expect?
Yes, I think there are a lot of interesting scenarios from the AI area that can leverage 0G's capabilities. For example, a scaled AI infrastructure needs to support a large number of users for AI inference or training tasks, which requires storing a large amount of data as well as the models. The currently popular models are large language models, and a single model can be very big, hundreds of gigabytes, so these scenarios can perfectly leverage our scalable data storage and retrieval.

Another scenario is the data engineering of the AI pipeline. Before data and models can be used for inference and training, the data needs to be cleaned, and it may need to be annotated. Research has also shown that a large portion of the data used in AI training is synthetic data, because real-world data is not enough to train a very large model. These kinds of data activities really need a scalable data solution as a platform to support their data storage and transformation, and a bunch of related work. We have collaborations with other AI projects working on these specific scenarios, which we think are very interesting and quite exciting.

Besides data storage, we have also started to build a decentralized serving platform. The goal is to address how decentralized resources for data serving can be incentivized through blockchain-based mechanisms, which introduces a fair fee marketplace for service providers.
Users want to acquire highly efficient service at a reasonable cost, and servers need to provide that service and get rewarded, governed by the smart contract on the layer-1 blockchain. This service could cover a very extensive scope of service types, including plain data retrieval from the blob store and the key-value store, as well as more complex services like AI inference and training tasks. But this is a long-term plan for 0G. So yes, I think that answers the question.
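The fee marketplace described above, where users escrow fees and providers earn per served request under contract control, can be caricatured in a few lines. This is entirely illustrative; 0G had not published such an API at the time of this Space, and every name here is invented.

```python
from collections import defaultdict

class ServingMarket:
    """Toy settlement ledger: users escrow fees, providers earn per request."""

    def __init__(self, price_per_request: int) -> None:
        self.price = price_per_request
        self.escrow = defaultdict(int)   # user   -> deposited balance
        self.earned = defaultdict(int)   # server -> accumulated rewards

    def deposit(self, user: str, amount: int) -> None:
        self.escrow[user] += amount

    def serve(self, user: str, provider: str) -> bool:
        # On-chain, this transfer would only settle against a signed
        # receipt proving the provider actually served the request.
        if self.escrow[user] < self.price:
            return False
        self.escrow[user] -= self.price
        self.earned[provider] += self.price
        return True
```

The same pattern would extend from plain blob retrieval to pricier services like inference, simply by varying the per-request price.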
I think that makes a lot of sense, and it would be very meaningful for AI builders to use 0G right now, because we all understand how expensive and how risky it is to combine web3 and AI, given the complexity and the scalability demands you just shared. Speaking of public chains, we have seen a lot of questions from the community asking which public chains 0G plans to connect with in the future. Which chains are you supporting right now, and why were these chains chosen?
Yeah, so with respect to connecting to other chains, I think there are two angles to this question. One is connecting with other chains through a bridge solution that can move assets from different chains to the 0G chain and vice versa. In this case the other chains are still external to the 0G system, but we exchange information with each other. In that sense, we will first connect with chains that have very strong ecosystems, like Ethereum, some of the popular layer 2s such as Polygon, Arbitrum, and OP Stack chains, and other popular layer 1s like Sui and Avalanche. The consideration for this kind of chain is how valuable that ecosystem is to the 0G ecosystem, so that both can flourish through the interaction between them.

The other angle is that, as I mentioned, 0G will employ a multi-consensus system with a unified 0G token across all of these consensus protocols. These consensus protocols can be deployed by ourselves, so we can build custom blockchains with specific features, highly efficient and highly secure, as some of them. But I think we can also integrate some existing chains, like Solana or Aptos, into this multi-consensus system. In that case, the choice would be based on criteria such as how secure and how efficient that chain is.

Basically, in that sense those chains become part of the 0G ecosystem, running the storage network and the DA layer for 0G, so we think performance and security are very important features, because we are trying to provide a large-scale system with very high security. Of course, we will first consider building our own chain, with our own innovative technologies, as the major part of our system, where we can guarantee its security and scalability because it is designed and implemented by ourselves. So that's my answer to this question, I think.
Thanks, Ming. I think that makes sense, because at the end of the day, connection and decentralization are the real necessities in this blockchain world, and striving for more connections and partnerships is the end goal for any project, though we also have to weigh the criteria of security. While we're on the topic of ecosystem, is there any program specialized for apps building on 0G? Is there an incubation program going on, or any grants that users and builders can look forward to when building with 0G?
Yes. We are actually interacting with some potential partners and potential customers, like layer-2 projects and rollup-as-a-service projects, for example Polygon CDK, Arbitrum, OP Stack, and AltLayer, and we are actively working with them to build proof-of-concept prototypes for integration. That is still an ongoing effort. We also have a collaboration with a project working on a fully on-chain game, one called Blade Game. A fully on-chain game tries to keep all the activities that happen in the game tracked on-chain, and it generates a huge amount of ZK-proof data for those activities to be settled on the layer-1 blockchain. So they require a huge amount of data to be stored in a very scalable way, which is a perfect fit for 0G's target scope. There are also AI projects, like io.net, and one called Public AI: some of them are building decentralized AI computation frameworks, and some are working on AI data engineering.

Our current testnet has already been launched, with our storage network deployed and running, and many participants are actively running our software by providing storage nodes and validator nodes in the network. We are still refactoring our design for the partitioned data storage layer and the data availability layer, so we don't yet have a real integration with the new version of the system, but this refactoring will be done very soon, I think one or two weeks from now.

We will then upgrade our testnet framework to include the new version of the storage and DA layers, so we can do the real, serious integration work with all those customers and with the new applications, and encourage developers to contribute. In the following weeks and months, we will have several incubation and accelerator programs focused on different application areas. The first one could focus on the web3 AI scenario, encouraging participants to build different kinds of AI applications and systems using the 0G data framework. For retail users, the storage system of 0G can be treated as an archive store for any kind of user data, so a normal user can use it like a OneDrive-style service and put their data into our system for long-term storage. In that sense, we also encourage users to actively use our system by uploading and retrieving their data, which also tests the functionality of our system. So yeah, that's mainly it.
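The personal "archive store" use mentioned above comes down to chunked, content-addressed storage: upload returns a root key, and download reassembles the chunks. The sketch below gives the flavor with an in-memory store; the client API is hypothetical and is not the real 0G storage SDK, and real chunk sizes are kilobytes to megabytes, not 4 bytes.

```python
import hashlib

CHUNK = 4  # tiny chunk size purely for illustration

class ToyStore:
    """Content-addressed chunk store: each chunk is keyed by its hash,
    and a manifest under a root key records the chunk order."""

    def __init__(self) -> None:
        self.chunks = {}     # chunk id -> chunk bytes
        self.manifests = {}  # root id  -> ordered list of chunk ids

    def upload(self, data: bytes) -> bytes:
        ids = []
        for i in range(0, len(data), CHUNK):
            piece = data[i:i + CHUNK]
            cid = hashlib.sha256(piece).digest()
            self.chunks[cid] = piece  # each chunk could live on a different node
            ids.append(cid)
        root = hashlib.sha256(b"".join(ids)).digest()
        self.manifests[root] = ids
        return root

    def download(self, root: bytes) -> bytes:
        return b"".join(self.chunks[cid] for cid in self.manifests[root])
```

Because chunks are addressed by hash, a client can verify each retrieved chunk independently, which is what makes spreading them across untrusted storage nodes workable.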
Thank you. I think we are all looking forward to a world where gamified projects can run 100% of their activities on-chain. That would have been quite impossible last year, but this year, thanks to the advancement of technology and inventions like 0G itself, I think it is not too ambitious for a project, especially in gaming or AI, to make all of its activities transparent and secure on a blockchain system. So for the last question of today, I will pick one from our community: what kind of development can we expect from Testnet Newton v2?
In Testnet Newton v2, we will have our partitioning mechanism implemented in the storage network, so we can really test the performance and scalability of the system. We will also have the DA capability enabled, with the mechanism I just described. Users can expect some integration with existing layer-2 pipelines, so if you are a user of a real layer-2 scenario, you can probably observe how the 0G data availability layer scales layer-2 activity. I think that is the major functionality we will provide in Newton v2. Beyond that, I think we will all be very excited to see how the applications and the 0G ecosystem boom in the period after Newton v2 comes online. That is the most exciting thing all of us can expect.
Thanks, Ming. I think it is very meaningful to share this information with the community today, because within our session I can see a lot of builders getting interested and hyped up by what 0G can provide. So, guys, if you are planning to apply AI in your new product, or if you want to build a dApp that can track everything 100% on the blockchain with the lowest cost, higher security, and higher scalability, you know where to go.

We have reached the finale of our panel discussion, so I would like to give a massive shout-out and heartfelt thanks to our incredible panelist, Ming from 0G. I also have very good news for you: as usual, two days from now, on the 30th of May at 8 PM (GMT+8), Hook Protocol will launch a course about 0G. We hope that all the information you heard today will help you win the leaderboard, which runs until June 6th, and split the prize of 500 HOOK tokens. It is very exciting to finally apply everything you learned from this space and learn more about 0G. At the same time, don't forget that you only have 7 days to enjoy this particular session and gift. And now we are at the final part of today's space, Ming. Is there anything else you would like to share with the audience today?
I think I have shared everything I wanted to say. Thank you, Claire, for hosting, and I'm very glad to have joined this AMA.
The feeling is mutual.
The 0G team is one of the most appreciated teams I have ever had the chance to work with. And guys, before we close this space, don't forget to participate in the upcoming course in the next few days. If there is any good project you would like us to visit, any content you would like to see, or any topic you want featured on Hook, don't forget to comment below. We really appreciate all of your support so far, and we read the comments one by one. For now, we will have to say goodbye. I will see you next week. Goodbye, guys.
Bye bye.
0G Twitter:
My Twitter:
Warpcast:
X Space:
https://x.com/i/spaces/1nAJEaAlpLlJL
This episode of Hook on Web3 Mastery Space features a discussion on pioneering modular AI chains for infinite scalability with Zero Gravity Labs. Ming, the Co-Founder and CTO of Zero Gravity Labs, shares insights into their innovative blockchain infrastructure designed to support high performance applications like on-chain AI gaming and decentralized finance. Key components include a decentralized storage network, a unique consensus protocol, and a scalable data availability layer. * Innovative consensus protocol achieving high efficiency without sacrificing decentralization and security. * Scalable data solutions critical for AI training and inference tasks. * Strategic partnerships within the blockchain and AI ecosystems, including Ethereum, Polygon, and Solana.
Hello, ladies and gentlemen. Welcome to another episode of Hook on Web3 Mastery Space. In our past episodes, we have had the pleasure of discussing the future of Web3 with numerous partners, and today we are excited to delve into another fascinating topic, AI and blockchain innovation, with our new partner Zero Gravity Labs, a transformative blockchain infrastructure project aimed at enabling high-performance applications such as on-chain AI gaming and decentralized finance. As we have just started the space, I would like to encourage you to interact with us to liven up the environment. You can send us a comment, a question, or anything for the guest, and we will read them at the end of the first session. While I'm here sipping on my peach juice, you can also comment what your favorite snack to go with the space is, or simply type GM from wherever you are in the world. It will greatly boost the confidence of the guest and encourage us to produce more meaningful content for you in the future. Give us a like and a retweet as usual, and don't forget to follow the official X accounts of Zero Gravity Labs and Hook Protocol right now for the latest updates. If you are a frequent listener to our space, you will mostly be familiar with me, your host, Claire. But if you are a first-timer here, let me give you a primer on what Hook 2.0 is. It is a visionary expansion of the current Hook social learning ecosystem with a deeper focus on Web3 education. Here we introduce a more immersive social learning metaverse and app-chain infrastructure, powered by HOOK as the native token, along with multiple opportunities to generate yield through rewards. Today, we are honored to have a representative from Zero Gravity Labs here. Let's meet Ming, co-founder and CTO of Zero Gravity, joining us. The topic we are covering today is pioneering modular AI chains for infinite scalability.
It's time to show us what the product is about, as well as to give some introduction about yourself. Ming, over to you.
Yeah, thank you, Claire. I'm Ming, the co-founder and CTO of 0G, and my background is mainly on the technical side. Before joining the Web3 space, I worked at Microsoft Research Asia for 11 years, and my research direction was mainly distributed systems, covering all kinds of aspects in that area, including computation, storage, and even AI platforms. Later, I co-founded the Conflux project with Fan Long, a high-performance layer-1 blockchain, where we designed an innovative consensus protocol to achieve high efficiency without sacrificing decentralization and security. Recently, we were seeking new challenges in this area and wanted to create the best Web3 project and the best Web3 team. So we met with Michael, and after discussing and brainstorming we came up with the idea of 0G and started this project. 0G is a decentralized data solution with infinite scalability that can support all kinds of application scenarios requiring a huge amount of data throughput: for example, the data availability that Ethereum layer-2 applications require, and the many decentralized AI projects with large demands for data engineering and data transformation. 0G consists of three major components. One is a decentralized storage network; another is a consensus protocol, which can also be treated as a layer-1 blockchain. The storage network connects with the consensus protocol in the sense that we use the consensus protocol to incentivize decentralized storage mining: each storage node in the storage network periodically participates in storage mining by providing a proof of storage for some specific piece of data in the entire history.
It then submits this proof to a smart contract on the layer-1 blockchain, the 0G chain, where the incentive mechanisms are implemented. On top of this decentralized storage, we also provide a data availability layer with some extra functionality specific to the Ethereum layer-2 scenario. The DA layer employs an erasure-coding mechanism combined with KZG commitments: it splits the data into multiple chunks and submits the chunks to different storage nodes in the network. So this is the basic functionality of the three components of 0G at a very high level. That's my answer to the first question, and I'll be happy to answer more detailed questions along with the host.
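The split-and-recover idea Ming describes can be illustrated with a toy single-parity code: cut a blob into k data chunks plus one XOR parity chunk, so any single lost chunk can be rebuilt from the survivors. (0G's actual DA layer uses a real erasure code with KZG commitments; this sketch only shows the recovery principle and makes no claims about the real implementation.)

```python
# Toy erasure-coding sketch: k data chunks + 1 XOR parity chunk.
# Any ONE missing chunk can be rebuilt by XOR-ing the remaining chunks.
from functools import reduce

def split_with_parity(blob: bytes, k: int) -> list:
    """Split blob into k equal chunks (zero-padded) and append an XOR parity chunk."""
    size = -(-len(blob) // k)           # ceiling division
    blob = blob.ljust(k * size, b"\0")  # pad so every chunk has equal length
    chunks = [blob[i * size:(i + 1) * size] for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)
    return chunks + [parity]

def recover(chunks: list) -> list:
    """Rebuild the single missing chunk (marked None) by XOR-ing the survivors."""
    missing = chunks.index(None)
    survivors = [c for c in chunks if c is not None]
    chunks[missing] = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), survivors)
    return chunks

chunks = split_with_parity(b"hello 0G data availability", k=4)
chunks[2] = None            # simulate one storage node going offline
restored = recover(chunks)  # the lost chunk is rebuilt from the other four
```

In the real system each chunk would be sent to a different storage node, so retrieval bandwidth scales horizontally and the data survives individual node failures.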
Thanks. I think it's super interesting to see the OGs, no pun intended, of the industry finally come together and figure out a new solution for this industry. May I be a little bit nosy about the first time you met Michael, or how you guys decided to work together? Is there any background story to this, or is it just like-minded people sitting together and saying, hey, let's do something so different that it changes the world?
Yeah, so I actually got connected with Michael through another co-founder of 0G, Thomas, who was a classmate of Michael's at Stanford Business School. Thomas was also one of the investors in Conflux, so both Fan Long and I know Thomas very well. Basically, Michael was trying to migrate from his previous Web2 career into Web3, while Fan Long and I were also trying to explore new topics, new interesting areas, and new challenging problems to solve. We got connected by Thomas, met with each other, and discussed some of the really important challenges that exist in the current Web3 space. We figured out that a scalable data solution for this entire industry is quite challenging and very much needed, and that it covers very extensive scenarios, from Ethereum layer 2, which may target financial applications, to the many AI scenarios that have recently become a very hot trend in this area. That's why we came up with the 0G product and this idea.
That is super inspiring. And while building such a life-changing product, what is the biggest challenge you face in a field where the innovation and integration of Web3 and AI are so eagerly anticipated?
Yeah, so I think the major challenge here is how we can build a very scalable data solution that still satisfies the requirements of decentralization and security. Many blockchain systems face similar challenges and trade-offs, but for a data solution it's even more challenging, because the scale can be very large, larger than anything seen before in any layer-1 blockchain. We are targeting tens of gigabytes per second of data throughput, which we envision will be needed in the near future, because we are at a stage where a lot of tools are being built around Ethereum and a lot of AI systems are coming into this area, which will bring a huge amount of user requests for AI inference and AI training. This is really the opportunity where we can bring Web3 technologies into the Web2 world and help the mass adoption of Web3. The way we address this challenge is by maximizing horizontal scalability through well-designed partitioning mechanisms for every component of the system. For example, our storage network employs a well-designed partitioning mechanism that lets different storage nodes store different data while still receiving reasonable incentives from the smart contract monitored by the consensus layer. From this perspective, you may think our decentralized storage network resembles other existing systems with similar functionality, like Filecoin or Arweave, but we have a different incentive mechanism and a different partitioning system design.
So we can achieve much higher scalability compared to Arweave and Filecoin with respect to data throughput. We also think the consensus layer may eventually become the bottleneck, because every piece of data coming into the storage network involves some interaction with the blockchain, and with the smart contract on it, to get settled and incentivized. The transaction-processing speed of the blockchain would therefore become the bottleneck as we keep increasing the data throughput we want to support. So we also want to further scale out the blockchain itself. The way we do that is to apply a different kind of sharding design to the consensus: we employ multiple consensus layers running independently and simultaneously in our system, while still trying to guarantee that each consensus protocol has the same level of security. In that scheme, the different consensus protocols share the same set of validators and the same staking status of those validators. We know that for a PoS-based blockchain, the real thing that guarantees its security is how much capital has been staked to support the PoS consensus. By sharing the same staking status, all these consensus protocols share the same staked capital. In that sense, we can leverage EigenLayer and borrow restaked ETH from EigenLayer to guarantee the security of our consensus protocols. So we can run an arbitrary number of consensus protocols, where each consensus protocol represents a single log of the decentralized data storage, and the different consensus protocols can be treated as different logs that run in parallel.
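The shared-stake design just described can be sketched as a simple data structure: many independent logs, each ordered by its own consensus instance, but all backed by one validator stake table, so every log is secured by the full staked capital rather than a fraction of it. (The class and method names below are illustrative assumptions, not 0G's actual implementation.)

```python
# Minimal sketch of shared-security parallel logs: one stake table,
# arbitrarily many independent data logs that all inherit the FULL stake.
from collections import defaultdict

class SharedStakeLogs:
    def __init__(self):
        self.stake = {}                    # validator id -> staked amount
        self.logs = defaultdict(list)      # log id -> ordered entries

    def bond(self, validator: str, amount: int) -> None:
        """A validator stakes capital once, backing every log at the same time."""
        self.stake[validator] = self.stake.get(validator, 0) + amount

    def append(self, log_id: str, entry: bytes) -> None:
        # Each log advances independently (in parallel in the real system),
        # so aggregate throughput grows with the number of logs.
        self.logs[log_id].append(entry)

    def securing_stake(self, log_id: str) -> int:
        # Every log is secured by the SAME total stake, not a shard of it.
        return sum(self.stake.values())

net = SharedStakeLogs()
net.bond("val-a", 70)
net.bond("val-b", 30)
net.append("rollup-1", b"blob-1")
net.append("ai-training", b"blob-2")
```

The key property is in `securing_stake`: adding a new log raises throughput without diluting the economic security of any existing log.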
So when we have different layer-2 or different AI applications and AI frameworks, they are actually independent of each other, because they have no dependencies among one another. They can therefore be deployed on different decentralized logs, giving us aggregate throughput with this infinite scalability. We can also extend the restaking story beyond EigenLayer by introducing restaking of BTC, for example by integrating with the Babylon project, and even other restaking mechanisms in other blockchain ecosystems like Solana, so that our security can be larger than any single ecosystem's; that means we can have BTC-level security, which could be larger than Ethereum-level security. So this is how we achieve scalability on the storage network and the consensus network. On the data availability layer, we also want to maximize parallelism by employing a data-partitioning mechanism: when data comes into the DA layer, we first use erasure coding to split the data into multiple chunks, and users can recover the original data by acquiring only part of these chunks. These chunks are then submitted to different storage nodes across the network so that the data throughput can be parallelized. Through this, we achieve horizontal scalability for the DA pipeline. Every storage node that stores this data also has some stake in our layer-1 blockchain, and we use a VRF, a verifiable random function, to select from all the validators of our blockchain to form a committee to hold the data.
Because this selection uses a VRF-based mechanism, it is very decentralized and can guarantee security under the honest-majority assumption. The client then collects a majority of signatures from the storage nodes holding the different parts of the data, and once it has acquired enough signatures, it submits them to the smart contract on our blockchain to decide whether the DA property has been achieved. The consensus layer could become the bottleneck of the system if a lot of data had to pass through that protocol, so through our design we try to make every part of the data pipeline independent and parallel, and we minimize the amount of data that has to enter the consensus layer and be broadcast. This is how we achieve the scalability to support these applications, and I think this is the biggest challenge we face in any area that requires a large-scale data solution.
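The committee-selection and quorum step above can be sketched as follows. A real deployment uses a proper VRF with publicly verifiable proofs; here SHA-256 over a public seed stands in for the VRF output purely for illustration, and the "signatures" are just member ids, both assumptions of this sketch rather than 0G's actual protocol.

```python
# Sketch: deterministic committee sampling from a public seed, plus a
# majority-quorum check over committee signatures for the DA attestation.
import hashlib

def select_committee(validators: list, seed: bytes, size: int) -> list:
    """Sample `size` validators, ranked by hash(seed || id) as a VRF stand-in."""
    ranked = sorted(validators,
                    key=lambda v: hashlib.sha256(seed + v.encode()).digest())
    return ranked[:size]

def quorum_reached(committee: list, signers: set) -> bool:
    """DA is attested once a strict majority of the committee has signed."""
    votes = sum(1 for v in committee if v in signers)
    return votes * 2 > len(committee)

validators = [f"val-{i}" for i in range(10)]
committee = select_committee(validators, seed=b"epoch-42", size=5)
```

Because the seed is public, anyone can recompute the committee and check that the signatures submitted to the contract really came from the sampled nodes.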
Thanks, Ming. I think we have received a large amount of knowledge from different sectors. We have gone through the validator system you shared with us on 0G, we have talked about security and scalability, and how to balance the performance of different chains. So is it safe to say that these are all the criteria that support high performance for apps building on 0G? And is there anything else that a user applying AI technology for the first time can expect?
Yes, I think there are a lot of interesting scenarios from the AI area that can leverage 0G's capability. For example, a scalable AI infrastructure will support a large number of users for AI inference or training tasks, which requires storing a large amount of data and also the models. The currently popular models are large language models, and a single model can be very big, hundreds of gigabytes. These scenarios can perfectly leverage our scalability for data storage and retrieval. Another scenario is the data-engineering side of the AI pipeline. Before the data and the model can be used for inference and training, the data needs to be cleaned and might need to be annotated. Some research has also shown that a large portion of the data used in AI training is synthetic data, because real-world data is not enough to train a very large model. These kinds of data activities really need a scalable data solution as a platform to support their data storage and transformation, and a bunch of related work. We have collaborations with other AI projects working on these different specific scenarios, which we think are very interesting and quite exciting. Besides data storage, we are also starting to build a decentralized serving platform, whose goal is to address how decentralized resources for data serving can be incentivized through a blockchain-based methodology. This will introduce a fair fee marketplace for service providers.
Users want to acquire highly efficient service at a reasonable cost, and servers provide that service and get rewarded, monitored by the smart contract on the layer-1 blockchain. This service could cover a very extensive scope of service types, including plain data retrieval from the blob store and the key-value store, as well as more complex services like AI inference and training tasks. But this is a longer-term plan for 0G. Yes, I think that answers the question.
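The fee-marketplace idea could work along the lines of this escrow sketch: the user locks a fee up front, the provider serves the request (a blob retrieval, a KV lookup, or an inference call), and payment is released only after the user acknowledges delivery. None of these names come from 0G's actual contracts; this is an assumed design, written in Python where a real version would be an on-chain contract.

```python
# Hypothetical escrow for a decentralized serving marketplace:
# lock fee -> serve request -> acknowledge -> pay the provider.
class ServingEscrow:
    def __init__(self):
        self.locked = {}     # job id -> (user, provider, fee)
        self.balances = {}   # provider -> earned fees
        self.next_job = 0

    def open_job(self, user: str, provider: str, fee: int) -> int:
        """User locks the fee up front; returns a job id."""
        job = self.next_job
        self.locked[job] = (user, provider, fee)
        self.next_job += 1
        return job

    def acknowledge(self, job: int) -> None:
        """User confirms delivery; the escrow releases the fee to the provider."""
        user, provider, fee = self.locked.pop(job)
        self.balances[provider] = self.balances.get(provider, 0) + fee

escrow = ServingEscrow()
job = escrow.open_job("alice", "provider-1", fee=10)
escrow.acknowledge(job)   # provider-1 is now credited 10
```

A production design would also need dispute handling for unacknowledged jobs, which is exactly the part the smart contract's incentive rules would have to govern.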
I think that makes a lot of sense, and it would be very meaningful for AI builders to use 0G at the moment, because we all understand how expensive and how risky it is to combine Web3 and AI together, given the complexity and the scalability requirements you just shared. Speaking of public chains, we have seen a lot of questions from the community asking which public chains 0G plans to connect with in the future. Which chains are you supporting right now, and why were these chains chosen?
Yeah, with respect to connecting to other chains, I think there are two angles to this question. One is connecting with other chains through a bridge solution that can move assets from different chains to the 0G chain and vice versa. This means the other chains are still external to the 0G system, but we exchange information with each other. In that sense, we will first connect with the major chains that have very good ecosystems, like Ethereum, some of the popular layer 2s like Polygon, Arbitrum, and OP Stack, and other popular layer 1s like Sui, Avalanche, and so on. The consideration for this kind of chain is how valuable those ecosystems are to the 0G ecosystem, so that we can flourish together through the interaction among them. The other aspect of connecting is that, as I mentioned, 0G employs a multi-consensus system with a unified 0G token across all these consensus protocols. These consensus protocols can be deployed by ourselves, so we can have custom blockchains, highly efficient and highly secure, with specific feature requirements, as some of them. I think we can also integrate some existing chains, like Solana or Aptos, into this multi-consensus system. In that sense, the choice would be based on criteria such as how secure and how efficient the chain is.
Basically, in that sense, these chains would become part of the 0G ecosystem, supporting and running the storage network and the DA layer for 0G. So we think performance and security are very important criteria, because we are trying to provide a large-scale system with very high security. But of course, we will first consider building our own chain, with our own innovative technologies, to be the major part of our system, where we can guarantee its security and scalability because it is designed and implemented by ourselves. So that's my answer to this question, I think.
Thanks, Ming. I think that makes sense, because at the end of the day, connections and decentralization are the most necessary things in this blockchain world, and striving for more connections and partnerships would be the end goal for any project, though we also have to weigh the criterion of security. While we're on the topic of the ecosystem, is there any program specialized for apps building on 0G? Is there any incubation program going on, or any grants that users and builders can look forward to when building with 0G?
Yes, we are actually interacting with some potential partners, who are also potential customers, like layer-2 projects and rollup-as-a-service projects, for example Polygon CDK, Arbitrum, OP Stack, and also AltLayer. We are actively working with them to build proof-of-concept prototypes for the integration; it is still an ongoing effort. We also have a collaboration with a project working on a fully on-chain game, one called the Blade Game. A fully on-chain game tries to keep all the activities that happen in the game tracked on-chain, and it also generates a huge amount of ZK proof data for those activities to be settled on the layer-1 blockchain. In that sense, they require a huge amount of data stored in a very scalable way, which is a perfect fit for 0G's target scope. There are also AI projects like io.net and one called Public AI: some of them are building decentralized AI computation frameworks, and some are working on AI data engineering. Our current testnet has already been launched, with our storage network and the production system deployed and running, and many participants are actively deploying our software by providing storage nodes and validator nodes in our network. We are currently refactoring our design for the partitioned data storage layer and the data availability layer, so we don't yet have the real integration with the new version of this system, but this refactoring will be done very soon, I think one or two weeks later.
We will then upgrade our testnet framework to include the new version of the storage and DA layers, so that we can do real, serious integration with all those customers and the new applications, and encourage developers to contribute. In the following weeks and months, we will have several incubation and accelerator programs focused on different application areas. I think the first one could focus on the Web3 AI scenario, encouraging participants to build different kinds of AI applications and AI systems using the 0G data framework. For retail users, because the storage system of 0G can be treated as an archive store for any kind of user data, a normal user can use it like OneDrive and put their data into our system for long-term storage. In that sense, we also encourage users to actively use our system by uploading and retrieving their data, which also tests the functionality of our system. Yeah, that's mainly it.
Thank you. I think we are all looking forward to a world where a GameFi project can do 100% of its activities on-chain. That would have been quite impossible last year, let's say, but this year, thanks to the advancement of technology and inventions like 0G itself, I think it is not too ambitious for a project, especially in games and AI, to make all of its activities transparent and secure on a blockchain system. So for the last question of today, I will pick a question from our community: what kind of development can we expect from Testnet Newton v2?
In Testnet Newton v2, we will have our partitioning mechanism implemented in our storage network, so that we can really test the performance and scalability of the system. We will also have the DA capability enabled, with the mechanism I just described. Users can expect some integration with existing layer-2 pipelines, so if you are a user of a real layer-2 scenario, you can probably observe how the 0G data availability layer scales layer-2 activities. I think that is the major functionality we will provide in Newton v2. Beyond that, I think we will be very excited to see how the applications and the ecosystem of 0G boom in the period after Newton v2 goes online. This is, I think, the most exciting thing that all of us can expect.
Thanks, Ming. I think it is very meaningful to have this information in the community today, because within our session I can see a lot of builders getting interested and hyped up by what 0G can provide. So guys, if you are planning to apply AI in your new product, or if you want to build a dApp that can track everything 100% on the blockchain with lower cost, higher security, and higher scalability, you know where to go. We have reached the finale of our panel discussion, so I would like to give a massive shout-out and heartfelt thanks to our incredible panelist, Ming from 0G. I also have very good news for you: as usual, two days from now, on the 30th of May at exactly 8 PM GMT+8, Hook Protocol will launch a course about 0G, and we hope that all the information you heard today will help you win the leaderboard until June 6th and split the prize of 500 HOOK tokens. It is very exciting to finally apply everything you know from this space and learn more about 0G. At the same time, don't forget that you only have 7 days to enjoy this particular session and gift. And we are at the final part of today's space, Ming. Is there anything else you would like to share with the audience today?
I think I have shared everything I wanted to say. Thank you, Claire, for hosting. I'm very glad to have joined this AMA.
The feeling is mutual.
The 0G team is one of the most admirable teams I have ever had the chance to work with. And guys, before we close this space, don't forget to participate in the 0G course in the next few days. If you see any good project that you would like us to visit, any content you would like to see, any topic, anything you want featured on Hook, don't forget to comment below. We really appreciate all of your support so far, and we read the comments one by one. For now, we will have to say goodbye. I will see you next week. Goodbye, guys.
Bye bye.