<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>Niket Chauhan</title>
        <link>https://paragraph.com/@niket-chauhan</link>
        <description>Crypto Hodler</description>
        <lastBuildDate>Sun, 19 Apr 2026 04:02:37 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en</language>
        <image>
            <title>Niket Chauhan</title>
            <url>https://storage.googleapis.com/papyrus_images/47b6cb4012bdbce50be013611ad6172c53fb2ffbac66b63c6af07d2ace326908.jpg</url>
            <link>https://paragraph.com/@niket-chauhan</link>
        </image>
        <copyright>All rights reserved</copyright>
        <item>
            <title><![CDATA[AI Researchers Claim They Can Double the Efficiency of Chatbots]]></title>
            <link>https://paragraph.com/@niket-chauhan/ai-researchers-claim-they-can-double-the-efficiency-of-chatbots</link>
            <guid>k3Tgvq00zhOBNgRejw8Y</guid>
            <pubDate>Sat, 05 Aug 2023 14:33:56 GMT</pubDate>
            <description><![CDATA[Abacus AI claims to have found a way to fine-tune LLMs, making them capable of processing 200% of their original context token capacity. Have you ever noticed that your AI chatbot gets lost in the middle of a conversation, or simply says it cannot handle prompts that are too long? Well, that is because each model has a limit on how much it can process at once, and it starts to suffer once it goes over that limit, pretty much as if it suffered from some kind of digital attention deficit disorde...]]></description>
            <content:encoded><![CDATA[<h2 id="h-abacus-ai-claims-to-have-found-a-way-to-fine-tune-llms-making-them-capable-of-processing-200percent-their-original-context-token-capacity" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Abacus AI claims to have found a way to fine-tune LLMs, making them capable of processing 200% of their original context token capacity.</strong></h2><p>Have you ever noticed that your AI chatbot gets lost in the middle of a conversation, or simply says it cannot handle prompts that are too long? Well, that is because each model has a limit on how much it can process at once, and it starts to suffer once it goes over that limit, pretty much as if it suffered from some kind of digital attention deficit disorder. But this could soon change thanks to a new method for supercharging LLM capabilities.</p><p>Current LLMs have limited context capacities. For example, ChatGPT taps just 8,000 tokens of context, while Claude handles 100,000. Tokens are the basic units of text or code that an LLM uses to process and generate language. This limit restricts how much background information the models can harness when formulating replies. Abacus AI has developed a method that allegedly doubles the usable context length for open-source LLMs like Meta’s Llama without compromising the model&apos;s accuracy in practical applications.</p><p>Their technique involves &quot;scaling&quot; the position embeddings that track word locations in input texts. According to their <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://github.com/abacusai/Long-Context">Github page</a>, Abacus AI claims that its scaling method drastically increases the number of tokens that a model can handle.</p>
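<p>The post does not spell out how the scaling works, but the core idea can be sketched in a few lines of code. What follows is a minimal sketch of linear position interpolation for rotary position embeddings (RoPE), one of the scaling schemes evaluated in the Abacus repo; the function names and the default scale factor of 16 are illustrative assumptions, not Abacus AI&apos;s actual code.</p><pre><code>import torch

def scaled_rope_angles(seq_len: int, head_dim: int, scale: float = 16.0,
                       base: float = 10000.0):
    """RoPE angles with linear position interpolation.

    Dividing every position index by `scale` squeezes a long sequence
    back into the position range the model saw during training, so a
    model trained on seq_len / scale tokens can attend over seq_len
    tokens after fine-tuning.
    """
    # Standard RoPE inverse frequencies, one per pair of channels.
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    # Interpolated positions: 0, 1/scale, 2/scale, ... instead of 0, 1, 2, ...
    positions = torch.arange(seq_len).float() / scale
    angles = torch.outer(positions, inv_freq)  # (seq_len, head_dim // 2)
    return torch.cos(angles), torch.sin(angles)

def apply_rope(x, cos, sin):
    """Rotate query/key channel pairs by the position-dependent angles."""
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.stack((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
    return out.flatten(-2)
</code></pre><p>The only change from vanilla RoPE in this sketch is the division by <code>scale</code>; the rest of the model is untouched, which is why the trick can be applied to an already-trained model and then refined with fine-tuning.</p>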
<p>The researchers evaluated two scaled Llama variants on tasks like substring location and open-book QA. The scale 16 model maintained accuracy on real-world examples up to 16,000-word contexts, versus only 2,000 words for baseline Llama. It even showed some coherence at 20,000+ words, something that was not possible to achieve with fine-tuning alone.</p><p>The significance of context extension cannot be overstated. A narrow context window keeps a model accurate but makes it of little use for complex tasks that require background information. Conversely, naively expanding the context lets an LLM draw on more material, but it then either takes more time to respond or returns sub-par results. Handling longer contexts efficiently could enable LLMs to absorb whole documents, or multiple documents, as background when generating text. This may lead to outputs that are more knowledge-grounded and consistent across long conversations.</p><p>However, the gains are not perfectly proportional to the scale factors.</p><p>Fine-tuning strategies are still necessary, because scaling alone doesn’t guarantee high-quality outputs. The Abacus team is also exploring advanced position encoding schemes from recent papers to further extend context capacity.</p><p>Their work suggests that scaling up existing LLMs is a viable path to expanding usable context length. This could democratize access to Large Language Models capable of handling lots of context at once.</p>]]></content:encoded>
            <author>niket-chauhan@newsletter.paragraph.com (Niket Chauhan)</author>
        </item>
        <item>
            <title><![CDATA[Colombia Could Use Waterfalls to Produce Bitcoin, Not Cocaine: Senator Petro]]></title>
            <link>https://paragraph.com/@niket-chauhan/colombia-could-use-waterfalls-to-produce-bitcoin-not-cocaine-senator-petro</link>
            <guid>hMQR1kAu95rWCfqtbzgr</guid>
            <pubDate>Wed, 06 Oct 2021 08:07:22 GMT</pubDate>
            <description><![CDATA[A leftist senator and former presidential candidate in Colombia says the country should follow in El Salvador’s footsteps. In brief: A Colombian senator said that the country should look to El Salvador for inspiration. Gustavo Petro said that the South American country could use renewable energy to mine Bitcoin instead of making cocaine. A high-profile Colombian politician has suggested the country look to El Salvador for inspiration and mine Bitcoin using renewable energy instead of ...]]></description>
            <content:encoded><![CDATA[<h2 id="h-a-leftist-senator-and-former-presidential-candidate-in-colombia-says-the-country-should-follow-in-el-salvadors-footsteps" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">A leftist senator and former presidential candidate in Colombia says the country should follow in El Salvador’s footsteps.</h2><figure float="none" data-type="figure" class="img-center"><img src="https://storage.googleapis.com/papyrus_images/ac08499605e1fb65c7061b41f89202ebf79ddd672ab08f4ea5a9141296452695.jpg" alt="Crypto Coins" class="image-node embed"><figcaption>Crypto Coins</figcaption></figure><h3 id="h-in-brief" class="text-2xl font-header !mt-6 !mb-4 first:!mt-0 first:!mb-0">In brief:</h3><ul><li><p>A Colombian senator said that the country should look to El Salvador for inspiration.</p></li><li><p>Gustavo Petro said that the South American country could use renewable energy to mine Bitcoin instead of making cocaine.</p></li></ul><p>A high-profile Colombian politician has suggested the country look to El Salvador for inspiration and mine Bitcoin using renewable energy—instead of producing cocaine.</p><p>Gustavo Petro, a leftist senator, retweeted a story about El Salvador President Nayib Bukele announcing that the country had started mining Bitcoin using volcanic energy and commented: “What if the Pacific coast took advantage of the steep falls of the rivers of the western mountains to produce all the energy of the coast and replace cocaine with energy for cryptocurrencies?”</p><p>President Bukele, who made Bitcoin legal tender in El Salvador last month, claims the Central American nation will mine Bitcoin using geothermal power from its volcanoes.</p><p>Bitcoin mining is the process of using powerful computers to verify transactions on the blockchain and produce digital assets. It uses lots of energy, though, so mining companies are now looking for clean energy to produce the asset.</p>
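<p>For a rough sense of what those powerful computers are actually doing, here is a toy sketch of the proof-of-work loop at the heart of Bitcoin mining. It is deliberately simplified (real miners hash an 80-byte block header and compare the result against a full 256-bit difficulty target; the header bytes and zero-prefix check below are illustrative assumptions), but it shows why mining burns so much energy: the work is nothing more than hashing candidate after candidate until one clears the difficulty bar.</p><pre><code>import hashlib

def mine(header: bytes, difficulty_prefix: str = "0000") -> int:
    """Toy proof-of-work: try nonces until the double-SHA256 hash of
    header + nonce starts with the required number of zero hex digits."""
    nonce = 0
    while True:
        candidate = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

nonce = mine(b"toy block header")
print(f"found nonce {nonce} after {nonce + 1} hashes")
</code></pre><p>Each additional hex zero in the target makes the expected number of hashes sixteen times larger, which is where the industrial-scale electricity demand comes from.</p>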
<p>Petro, a frontrunner for Colombia’s presidential election next year, added via Twitter that “virtual currency is pure information and therefore energy.”</p><p>It wasn’t clear what Petro meant, but Latin America energy policy analyst Wesley Tomaselli told <em>Decrypt</em> that the senator was referring to the potential of Colombia’s Pacific coast to use renewable energy.</p><p>“He&apos;s right in that Colombia has great potential for Bitcoin mining because about three-fourths of its electricity generation comes from hydroelectric power,” he said. “The problem is, Petro appears to be selling Bitcoin mining as an alternative model for development to coca growing and cocaine shipments. It&apos;s not. Unless Petro has a magic wand.”</p><p>Colombia is the world’s largest producer of cocaine, according to the United Nations, and its Pacific coast is where a lot of the coca—the drug’s base ingredient—is grown.</p><p>The Pacific coast is also a potential renewable energy hotspot. “Colombia&apos;s severe Andean mountain geography and river systems make it prime for hydroelectric power generation,” noted Tomaselli.</p><p>But he added that mega projects in the area “have grown less popular here because of resistance from environmentalists who see them as damaging to local communities and ecosystems.”</p><p>El Salvador’s government has said that it will use only renewable and clean energy to produce the currency, an idea that has been both praised by Bitcoin believers and slammed by Bukele’s critics.</p><p>Could Colombia be next?</p><figure float="none" data-type="figure" class="img-center"><img src="https://storage.googleapis.com/papyrus_images/d4440e04ad251a22625770f36e284190aee51964eae63c7eb95f7184cbbe7334.png" alt="Twitter Tweet" class="image-node embed"><figcaption>Twitter Tweet</figcaption></figure>]]></content:encoded>
            <author>niket-chauhan@newsletter.paragraph.com (Niket Chauhan)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/c94518b6fa0b0db1e8c14a5dd062af9f6bb91ad10c913f5a7a384155a52e8c40.png" length="0" type="image/png"/>
        </item>
    </channel>
</rss>