This edition of the newsletter dives into DeepSeek, which has disrupted the AI industry by creating high-performing open-source models at a fraction of its competitors' cost, with API prices roughly 95% cheaper as well. Their breakthrough challenges traditional assumptions about AI development and impacts both the tech and crypto sectors, though they face challenges with Chinese regulations and censorship requirements. We'll also share some interesting articles, portfolio updates and market highlights.
a) Why everyone in AI is freaking out about DeepSeek
• DeepSeek, a Chinese AI company, has released DeepSeek-R1, a new large language model that matches or exceeds OpenAI's best model in performance while being open-source, significantly cheaper (reportedly about $5 million to train), and offering features like web search integration.
• The model's success has caused significant discussion in Silicon Valley, particularly due to its ability to challenge American AI dominance at a fraction of the cost, though some have raised concerns about potential censorship issues given the company's Chinese origins.
b) JupNet
• Jupiter has announced Jupnet, an ambitious omnichain network designed to aggregate all cryptocurrency activity into a single decentralized ledger, built on three key systems: the DOVE Network (combining validators, oracles, and executors), an Omnichain Ledger Network, and Aggregated Decentralized Identity (ADI).
• They aim to create a unified global market where users can access all chains, currencies, and commodities through a single account, moving away from traditional wallet-based systems toward a more user-friendly account-based approach.
c) DeepSeek
• DeepSeek's release of their R1 model, which matches OpenAI's performance at a fraction of the cost ($6 million vs billions) and is open-source, has sparked intense discussion about US-China AI competition, though the author argues this should be viewed as a positive open-source contribution to the field rather than a geopolitical threat.
• The article also covers the launch of Venice's privacy-focused AI chat platform and its VVV token, which reached a $350 million market cap on day one, while noting some misleading aspects of its privacy claims and controversy over Coinbase's quick listing of the token.
a) MyShell
• MyShell has announced their integration of DeepSeek's new AI models, completing the DeepSeek R1 integration in just 4 hours and expressing enthusiasm about DeepSeek V3's decentralized training approach.
• The partnership aims to democratize AI development by making powerful language models more accessible and cost-effective for a wider range of users.
b) Jambo
• Jambo is building a Web3 ecosystem centered around $99 smartphones targeted at emerging markets, having sold nearly 1 million phones to date and aiming to provide crypto access to the approximately 3 billion people globally who lack smartphone access.
• The project, recently valued at $500 million and backed by major investors like Paradigm and Pantera Capital, has just launched its $J token across multiple cryptocurrency exchanges as part of its strategy to drive ecosystem engagement through rewards and cost-offset mechanisms.
c) Tap Protocol
• Tap Protocol has partnered with Internet Computer Protocol (ICP) and the Dfinity Foundation to develop DeFi applications directly on Bitcoin's Layer-1 blockchain, with their first achievement being the launch of Taparoo Swap, the first decentralized exchange built directly on Bitcoin.
• The collaboration aims to enhance Bitcoin's functionality beyond just being a store of value by enabling developers to create and deploy smart contracts on Bitcoin's native blockchain without requiring third-party bridges or intermediaries.
We are just at the start of 2025, and already we have another emergent AI player in DeepSeek, a Chinese AI company that has sent shockwaves through Silicon Valley and the broader AI sector. Both AI and tech stocks have been plunging ever since the news of DeepSeek broke and took mainstream media by storm. A spinoff of a Chinese quant trading firm, DeepSeek has achieved what many thought was impossible: creating state-of-the-art AI models at a fraction of the cost incurred by industry giants.
Background and technical innovation
One of DeepSeek's most outstanding achievements is training its V3 model for just $5.576 million (exact numbers vary depending on the source), using 2.788 million H800 GPU hours – a step that typically costs competitors (specifically their US counterparts) hundreds of millions. DeepSeek's efficiency stems from several groundbreaking innovations in model architecture and training methodology.
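As a sanity check, the headline figure follows directly from the GPU-hour count and the roughly $2-per-H800-GPU-hour rental rate that DeepSeek assumed in its own cost estimate (the rate is their stated assumption, not a market guarantee):

```python
# Back-of-the-envelope reproduction of DeepSeek-V3's reported training cost.
# The $2/GPU-hour H800 rental rate is the assumption DeepSeek itself used.
gpu_hours = 2.788e6        # ~2.788 million H800 GPU-hours
cost_per_gpu_hour = 2.00   # USD per GPU-hour (assumed rental rate)

total_cost = gpu_hours * cost_per_gpu_hour
print(f"${total_cost / 1e6:.3f}M")  # -> $5.576M
```

Note that this covers only the final training run's compute; prior research, failed experiments, and staff costs are excluded, which is one reason quoted figures vary between sources.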
The company developed Multi-Head Latent Attention (MLA), a compression technique that dramatically reduces memory usage during training and inference. Their Mixture of Experts (MoE) architecture selectively activates only the necessary model components, which significantly reduces computational costs. Additionally, their implementation of FP8-based mixed-precision training decreases resource requirements while maintaining industry-grade performance.
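To make the selective-activation idea concrete, here is a minimal, illustrative sketch of top-k MoE routing in NumPy. This is not DeepSeek's actual implementation (which also uses shared experts and load-balancing mechanisms); it only shows why compute scales with the number of *activated* experts rather than the total:

```python
import numpy as np

def moe_forward(x, expert_weights, router_weights, top_k=2):
    """Toy Mixture-of-Experts layer: a router scores all experts,
    but only the top-k experts actually run, so per-token compute
    scales with k instead of the total expert count."""
    scores = x @ router_weights                   # router logits, one per expert
    top = np.argsort(scores)[-top_k:]             # indices of the k best experts
    gates = np.exp(scores[top])
    gates /= gates.sum()                          # softmax over the selected experts
    # Weighted sum of only the selected experts' outputs
    return sum(g * (x @ expert_weights[i]) for g, i in zip(gates, top))

# Toy usage: 8 experts, hidden size 4 -- only 2 expert matmuls run per token
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
experts = rng.standard_normal((8, 4, 4))
router = rng.standard_normal((4, 8))
y = moe_forward(x, experts, router)
print(y.shape)  # (4,)
```

With 8 experts and `top_k=2`, three-quarters of the expert parameters sit idle for any given token, which is the source of the cost savings the paragraph above describes.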
The company's R1 reasoning model further demonstrates their capabilities, matching or exceeding OpenAI's o1 in reasoning ability. It achieves impressive benchmarks, including 79.8% accuracy on the AIME 2024 mathematics competition, 97.3% on MATH-500, and the 96.3rd percentile on Codeforces programming competitions (figures pulled from various sources).
What's unique about DeepSeek?
DeepSeek distinguishes itself in several ways. Its API pricing is approximately 95% lower than competitors like OpenAI and Anthropic, making advanced AI capabilities accessible to a broader range of organizations. Companies built on OpenAI's technology or wrappers around it now have a cheaper alternative that will ultimately bring down their own costs as well.
Unlike many Western counterparts, they maintain a strong commitment to open source, providing full access to model weights and detailed technical documentation. Their performance matches or exceeds leading closed-source alternatives, challenging the assumption that proprietary models are superior. However, we note that responses on sensitive topics, especially politics, are still censored.
How we feel about DeepSeek's impact on AI x Crypto
The dramatic reduction in training costs opens new possibilities for distributed and decentralised AI development, addressing one of decentralized AI's biggest bottlenecks. Smaller teams can now compete effectively, potentially accelerating the development of specialized, niche AI models in crypto.
The 95% reduction in API costs could revolutionize AI-powered crypto dApps, making advanced AI capabilities accessible to projects of all sizes. DeepSeek's success provides a template for future AI projects in crypto, demonstrating how open-source principles can align with tokenomics and decentralized governance models. We have already seen projects integrating DeepSeek, such as MyShell, which completed its DeepSeek R1 integration in just 4 hours.
However, we also note that most AI projects in crypto carry inflated valuations that are hard to justify now that DeepSeek has delivered comparable results on a far smaller budget and valuation. This will severely compress the valuations of most AI x Crypto projects going forward.
Key risks and challenges
Despite its achievements, DeepSeek faces several notable challenges. Operating under Chinese regulations means certain content restrictions and censorship requirements, as evidenced by the model's refusal to discuss topics like Tiananmen Square and other political questions. Questions persist about potential Chinese government influence and control over the company.
Training data transparency remains a concern, with uncertainty about the sources and nature of the data used to train their models. The ultra-low cost model raises questions about long-term business sustainability and ability to maintain competitive advantages. Security concerns may limit adoption by Western organizations, particularly those handling sensitive data.
Market impact
DeepSeek's innovations have triggered significant responses from established players. Major tech companies face pressure to justify their higher prices and closed-source approaches. The efficiency gains suggest that state-of-the-art results can be delivered with far fewer resources than previously thought necessary, which helps explain why Nvidia and many other tech stocks took a nosedive.
The impact extends beyond technical achievements. DeepSeek's success challenges fundamental assumptions about AI development costs and resource requirements. Their approach suggests a future where advanced AI capabilities become increasingly accessible and cost-effective, potentially accelerating the democratization of AI technology.
Future outlook
As CEO Liang Wenfeng stated, "In the face of disruptive technologies, moats created by closed source are temporary." This philosophy, combined with their technical innovations, positions DeepSeek at the forefront of AI democratization (which also aligns with the ethos of Crypto x AI). Their success demonstrates that efficient, open-source alternatives can compete with well-funded, closed-source incumbents with far larger budgets and valuations.
Looking ahead, DeepSeek's trajectory suggests a fundamental shift in how AI models are developed and deployed. Their achievements challenge the notion that cutting-edge AI development requires massive resources and proprietary technology. Instead, they point toward a future where innovation in efficiency and open collaboration drive progress.
The company's success will ultimately depend on navigating complex technical, regulatory, and security challenges while maintaining their innovative edge. Their ability to balance open-source principles with commercial viability, while addressing concerns about government influence and data privacy, will be crucial.
As the AI landscape continues to evolve, DeepSeek's approach could serve as a template for future development, potentially reshaping how we think about AI model training, deployment, and commercialization. Their impact extends beyond technical achievements, suggesting a future where advanced AI capabilities become more accessible, efficient, and collaborative.