<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>Code &amp; Codex</title>
        <link>https://paragraph.com/@code_codex</link>
        <description>Code &amp; Codex is a newsletter I started in early 2025. It’s a cyberpunk-inspired project that explores the liminal space between software engineering, encrypted knowledge, and the aesthetics of digital subculture.</description>
        <lastBuildDate>Wed, 22 Apr 2026 22:32:50 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en</language>
        <image>
            <title>Code &amp; Codex</title>
            <url>https://storage.googleapis.com/papyrus_images/80bc2a2bb1b3a41f5de0390d6da13d2018f1459d40a5723d42890b3d2e9a3e50.jpg</url>
            <link>https://paragraph.com/@code_codex</link>
        </image>
        <copyright>All rights reserved</copyright>
        <item>
            <title><![CDATA[The Brainrot Industrial Complex]]></title>
            <link>https://paragraph.com/@code_codex/the-brainrot-industrial-complex</link>
            <guid>lCUtlEXNjpqui5DdHujO</guid>
            <pubDate>Sun, 12 Apr 2026 04:06:41 GMT</pubDate>
            <description><![CDATA[Peace be upon you, fellow digital wanderer. There is perhaps no better term to describe the condition of modern society than the internet slang: “brainrot.” A phrase often used jokingly by the younger digital native generation to describe that familiar state where the mind feels dulled, overstimulated, unable to hold a single thought for long. “This meme gave me brainrot” The term is often said in humour, but beneath the joke lies something real. There are underlying signals of, for the lack ...]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>There is perhaps no better term to describe the condition of modern society than the internet slang: <strong>“brainrot.”</strong> A phrase often used jokingly by the younger, digital-native generation to describe that familiar state where the mind feels dulled, overstimulated, unable to hold a single thought for long.</p><p><em>“This meme gave me brainrot”</em></p><p>The term is often said in humour, but beneath the joke lies something real. There are underlying signals of, for lack of a better term, <em>a rotting of the brain, of one’s conscious mind.</em></p><p>Interestingly, the term did not emerge from critics, nor did it come from academics. The term <strong>“brainrot”</strong> came from the very generation that is immersed in it. A kind of self-aware diagnosis. They feel something is off, even if they don’t fully articulate why.</p><p>Here is how I would define <strong>“brainrot”</strong>:</p><blockquote><p><em>Brainrot is the gradual erosion of one’s ability to think, focus, and reflect, caused by continuous exposure to high-stimulation, low-substance digital input.</em></p></blockquote><p>The term <strong>“Industrial Complex”</strong> here comes from the phrase popularised by Dwight D. Eisenhower, <strong>“The Military-Industrial Complex”</strong>. In simple terms, it describes the tight relationship between government, military, and private industry, where each benefits from the continued production of weapons and wars. It is not merely an industry that produces weapons. It is a complete system designed to sustain the need for them.</p><p>And we are witnessing something similar today. The <strong>“brainrot industrial complex”</strong> is a whole industry designed to hijack your attention by making your brain’s reward centres produce dopamine as cheaply and abundantly as possible, to maximise profits.</p><p>The system offers these “cheap thrills” as a remedy for boredom, for restlessness, and perhaps for something deeper we struggle to name.</p><p>The modern internet is no longer designed to inform you; it is now designed to dissolve your ability to think clearly. The <strong>“brainrot”</strong> phenomenon is not just distraction, but decay.</p><p>There was a time when the word <strong>“distraction”</strong> carried a far heavier meaning than it does today. In older English, to be “distracted” did not mean to lose focus momentarily; it meant <strong>“to be mentally disturbed”</strong>, a mind pulled apart, unable to hold itself together.</p><p>The term itself comes from the Latin <em>distrahere</em>, which means to draw apart, to tear in different directions.</p><p>Today, when we get distracted, what are we drawn apart from? Think about this.</p><p>What was once a condition of disorder has now become a condition of normalcy. What was once described as <strong>the mind being “pulled apart”</strong> is now the <em>default state</em> of the modern internet user.</p><p>Distraction is not a new crisis. Humans have always been susceptible. The Romans worried about the spectacle of the Colosseum and lamented moral decay from games and gossip. The Victorians panicked over novels that stole attention from more serious pursuits.</p><p>What is different today is the scale, intensity, and intent.
The <strong>brainrot industrial complex</strong> is not creating a new weakness; it is weaponising one that has always existed.</p><h2 id="h-what-can-we-do-about-it" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">What can we do about it? </h2><p>We are not yet in a position to dismantle the whole system. That requires policy reform, platform regulation, and a lot of activism. I’m not saying we should not pursue that; for now, I just want to leave you with practical remedies for the individual self.</p><p>Awareness is the first step. We cannot dismantle the system, but we can <strong>navigate it with intention</strong>.</p><p>Start noticing the pulls on your attention; pause before you scroll. Observe the triggers: boredom, anxiety, existential itch. These are what the <strong>brainrot industrial complex</strong> exploits. Take small steps to reclaim your attention.</p><p>Most important is to be more intentional in your digital media consumption. Do not let the algorithm decide what you indulge in. Choose consciously what enters your mind.</p><p>Understand that you are not bored; you are being processed. It is not a lack of discipline that makes you easily distracted; it is that you are inside a machine optimised for <strong>brainrot</strong>.</p><p>And for those who build, understand this: the systems we inhabit are not inevitable. They are designed.</p><p>The feeds, the loops, and the endless scroll: these are choices, not laws of nature. It is possible to build differently. To design tools that respect attention, rather than exploit it. To create protocols that serve users, rather than process them.</p><p>A different internet is not a fantasy. <strong>It is a matter of intention.</strong></p><p><em>Stay glitched, stay human.</em><br>Jibone.</p><hr><p>On-chain edition / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://jshamsul.com/code-and-codex"><strong>Archived at </strong></a><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="http://jshamsul.com"><strong>jshamsul.com</strong></a> / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://codeandcodex.substack.com/"><strong>Published on Substack</strong></a> / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://primal.net/p/nprofile1qqsqsgehv75h9hvla3d39gzh87w5karednhcaj6z3jhy5tmtsxmzgjgpwsmcx#reads"><strong>Relayed across Nostr</strong></a>.</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <category>social-media</category>
            <category>attention-economy</category>
            <enclosure url="https://storage.googleapis.com/papyrus_images/cb74339799bb4179fec4c7ef1fdf62fcbc5f61438e199a2e07cd9ccec82d6225.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Fear + Hate = Power]]></title>
            <link>https://paragraph.com/@code_codex/fear-hate-power</link>
            <guid>tNUfYZlkJrnbLEsTAW1u</guid>
            <pubDate>Sun, 05 Apr 2026 02:00:49 GMT</pubDate>
            <description><![CDATA[The old formula to keep people obedient is cracking.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>“f + h = p (fear plus hate equals power)” was the marketing tagline for Eugene Burdick’s 1956 novel ‘The Ninth Wave’. He was a political scientist interested in the psychology of mass persuasion, political marketing, and how fear and emotion can be engineered to manipulate democratic societies.</p><p>The Ninth Wave is a political novel about a charismatic, ruthless political strategist who uses the “fear + hate” formula to gain power. The same formula has been applied again and again in global politics, and most prominently in Western politics.</p><p>Historically, political mobilisation has often relied on external threats, moral panic, and outgroup construction. An outgroup is constructed that is closely related to, but distinct from, the larger group being targeted. For example, the terrorist group ISIS is an outgroup constructed to reference the larger Islamic demographic, thus making the bigger group, in this case Muslims, the target. They made Muslims, and the religion of Islam, something to fear and something to hate, all for political mileage.</p><p>This “fear + hate” formula has been at work for years and years. In the U.S. it appears that there is always something to ‘fear’, and for that reason, there is always something to ‘hate’; together they yield ‘power’. It could be Islam, or communism, or socialism, or China, or Russia, or Iran, and so on; every day there is something new for U.S. citizens to fear, and something new to hate.</p><blockquote><p><strong><em>Fear keeps them obedient; hate gives them a target.</em></strong></p></blockquote><p>This is not the Orwellian world of 1984; it’s Huxley’s Brave New World.</p><p>Just like in the novel Brave New World, people have been kept walled up in their perfect ‘World State’, controlled and conditioned in a caste-based society. Everyone is given Soma, the pleasure drug, to keep them distracted all the time. In the novel, the rest of the outside world is referred to as the ‘Savage Reservations’, which to them are exotic, uncontrolled, traditional, and primitive.</p><p>For the longest time, this was the case. The ‘powers that be’ walled up their people in a cultural enclosure, creating a wall separating them from the outside world. The wall is not physical; it is psychological, informational, and cultural. Built out of hate and fear of the unfamiliar. Behind their walled fortress, their people are constantly being told that the “barbarians” are at the gates. “We can protect you; as long as we are in power, you’ll be safe,” they keep saying.</p><p>Things have changed today. The internet’s global interconnectedness has torn down the wall, breaking down the information silos that once made fear propaganda sustainable. A total context collapse. It is more difficult now to maintain an isolated, one-sided political narrative.</p><p>We used to sit at different tables. One table could speak ill of the other, invent false narratives, and frame the group at the other table in a bad light, just to create something to fear, something to hate. The group at the other table knew nothing about it.</p><p>Today, we all sit at the same table. You can’t speak ill of another group without that group calling it out. This is why the “fear + hate” formula is cracking.</p><p>Before the Internet, access to information was monopolised.
People relied on broadcast news and published newspapers, all controlled and filtered. There was little peer-to-peer information exchange. Sure, during the industrial era, transportation allowed people to travel far, but not everyone travelled. Transportation carried people and goods; ideas and information, not so much.</p><p>The Internet, in its pure protocol and distributed nature, allows peer-to-peer information exchange. You can learn something directly from the source.</p><p>For example, Western politicians might ramp up an anti-Islam agenda by flooding the channels with fake narratives about Islam, inventing an external threat, planting false ideas of what Islam is, all with the goal of creating fear that leads to hate, thus giving them power. Today, with the global interconnectedness of the Internet, Muslims from other parts of the world can refute their narrative. Non-Muslim social influencers read the Quran and debunk the false claims about the religion live on social media platforms.</p><p>The internet has given counter-narratives a chance to compete. When information is highly asymmetric, centrally controlled, and gatekept, it is easier to manipulate for “fear + hate”. In a distributed access model, with lower information asymmetry, truth becomes accessible directly, reducing the effectiveness of propaganda.</p><p>The “fear + hate = power” formula is cracking, and the powers that be know this.</p><h2 id="h-the-new-formula-for-the-digital-age" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0">The new formula for the digital age</h2><p>While the old wall of ignorance is falling, a new digital wall is being built. A wall made algorithmically, creating new information silos and echo chambers.</p><p>They can’t stop the internet. What they do instead is control what you see on the internet. This is not censorship; you can still say what you want online, but whether your content reaches the audience you want is not in your control. Your content is at the mercy of the algorithm. It is no longer about content control; it’s about distribution control.</p><p>The powers that be, with their tight hold on the owners of the algorithms, decide what the people see and what the people should not see. Algorithms that create fear, uncertainty, and doubt push content that amplifies outrage within enclosed echo chambers.</p><blockquote><p>The new formula becomes:<br>fear + hate = outrage engagement<br>outrage engagement = algorithmic amplification<br>amplification = power</p></blockquote><p>They have found a way to scale the formula in the digital age.</p><p>We have to resist this. We have to be more intentional and conscious in consuming informational media. Do not sit passively scrolling your feed, letting the algorithm feed you the things they want you to be fed. Consciously choose what enters your mind. Go out and seek information actively, and consciously.</p><p>In future Code &amp; Codex newsletters, I’ll break down what active information-seeking actually looks like against algorithmic walls specifically designed to distract you.</p>
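<p>To make the loop concrete, here is a toy sketch in Python of the amplification formula above. Everything in it is a deliberate simplification: the weights, the field names, and the reaction types are my own assumptions, not any real platform’s ranking code. The shape is what matters: the metric rewards outrage, and the feed is just a sort by the metric.</p><pre><code># Toy model of "fear + hate = outrage engagement = amplification".
# All weights and fields are illustrative assumptions, not any platform's API.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int         # mild approval
    shares: int        # strong engagement
    angry_reacts: int  # the outrage signal

def engagement_score(post: Post) -> float:
    # Outrage is weighted heaviest because it is the cheapest,
    # most reliable engagement signal the ranker can optimise for.
    return 1.0 * post.likes + 3.0 * post.shares + 5.0 * post.angry_reacts

def rank_feed(posts: list[Post]) -> list[Post]:
    # Amplification is nothing more exotic than a sort: whatever
    # scores highest is shown first, seen most, and engaged with most.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, nuanced explainer", likes=120, shares=10, angry_reacts=2),
    Post("THEY are coming for YOU", likes=40, shares=90, angry_reacts=300),
])
for post in feed:
    print(engagement_score(post), post.title)</code></pre><p>Run it and the outrage post scores 1810 against the explainer’s 160. No one had to censor the calm post; the weights did the work. The weights are the politics.</p>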
Check out the new Code &amp; Codex manifesto: <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://jshamsul.com/code-and-codex/manifesto">https://jshamsul.com/code-and-codex/manifesto</a></p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://paragraph.com/@code_codex">Subscribe to Code &amp; Codex</a> if you haven’t.</p><p><em>Stay glitched, stay human.</em><br>Jibone.</p><hr><p>On-chain edition / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://jshamsul.com/code-and-codex"><strong>Archived at </strong></a><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="http://jshamsul.com"><strong>jshamsul.com</strong></a><a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://jshamsul.com/code-and-codex"><strong> </strong></a>/ <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://codeandcodex.substack.com/"><strong>Published on Substack</strong></a> / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://primal.net/p/nprofile1qqsqsgehv75h9hvla3d39gzh87w5karednhcaj6z3jhy5tmtsxmzgjgpwsmcx#reads"><strong>Relayed across Nostr</strong></a> .</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <category>politics</category>
            <category>internet</category>
            <category>fear</category>
            <category>hate</category>
            <category>algorithm</category>
            <enclosure url="https://storage.googleapis.com/papyrus_images/6caf4677402966b14fec083f13b5c0a1b5b7cc7f5d2c21d78e63713524d4d593.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[AI is a Bubble]]></title>
            <link>https://paragraph.com/@code_codex/ai-is-a-bubble</link>
            <guid>pRcmSFZuheH9FZRYzwma</guid>
            <pubDate>Sat, 27 Dec 2025 00:16:38 GMT</pubDate>
            <description><![CDATA[Loud, flashy, overhyped... until it is not.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>AI at the moment, towards the end of 2025, is a bubble. AI right now is loud: <strong>“AI Powered”</strong> this, <strong>“AI Powered”</strong> that. You see it everywhere.</p><p>Every product demo, every pitch deck, every landing page feels compelled to announce it. AI has become nothing but a buzzword, a marketing hook, a shiny sticker slapped onto products to make them feel current, relevant, and, most importantly, investable.</p><p>The bubble is not the technology itself. The bubble is the attention economy around it. This distinction matters.</p><p>We’ve seen this pattern before.</p><p>Databases were once exciting. The idea that information could be stored, queried, and retrieved at scale felt revolutionary. Today, databases are boring. No one brags about using a database anymore; everyone assumes there is some sort of datastore that just works.</p><p>Search engines were once magical. The ability to type a few words and retrieve knowledge from across the world felt almost supernatural. Now they are the norm. Their absence would be highly inconvenient, but their presence is unremarkable.</p><p>Cloud computing was once exotic. A server somewhere far away, doing things you didn’t fully understand. Today, cloud infrastructure is invisible, until it breaks the internet.</p><p>This is how technologies mature. They begin as spectacle and evolve into tools. Eventually, they fade into the background.</p><p><strong>AI has not reached that stage yet.</strong></p><p>Instead, we are still in the phase where visibility is mistaken for value. Where the loudest claims receive the most attention. Where novelty is rewarded more than reliability. Where saying “AI” is sometimes more important than explaining why it’s there.</p><p>That is the bubble. Not the models, not the research, not the real incremental improvements happening quietly behind the scenes.</p><p>The bubble is the belief that a technology must constantly announce itself to justify its existence.</p><p><strong>AI will stop being a bubble when it no longer needs to introduce itself.</strong> When AI becomes invisible, when AI is taken for granted, when AI is boring, that is when you know AI has made it.</p><p>When it quietly improves workflows instead of redefining them every six months. When it fades into the infrastructure layer, doing the job without demanding applause. When we stop talking about “AI-powered” products in the same way we no longer talk about “database-powered” websites.</p><p>This is not a dismissal of AI’s importance. It’s an acknowledgment of its future.
</p><p>The most transformative technologies are not the ones we endlessly debate; they’re the ones we forget are even there.</p><p><em>Stay glitched, stay human.</em></p><p><strong>Jibone.</strong></p><hr><p>On-chain edition / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://jshamsul.com/code-and-codex"><strong>Archived at </strong></a><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="http://jshamsul.com"><strong>jshamsul.com</strong></a> / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://codeandcodex.substack.com/"><strong>Published on Substack</strong></a> / <a target="_blank" rel="noopener noreferrer nofollow" class="dont-break-out graf markup--anchor markup--anchor-readOnly" href="https://primal.net/p/nprofile1qqsqsgehv75h9hvla3d39gzh87w5karednhcaj6z3jhy5tmtsxmzgjgpwsmcx#reads"><strong>Relayed across Nostr</strong></a>.</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <category>ai</category>
            <enclosure url="https://storage.googleapis.com/papyrus_images/32e54e4f3345454bed7086293a8bdeb60e4bdca2204ed8333342c3bbbb92d8ee.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[The Story of Theuth and King Thamus]]></title>
            <link>https://paragraph.com/@code_codex/the-story-of-theuth-and-king-thamus</link>
            <guid>DcT9r1QdeecKnGunTSbn</guid>
            <pubDate>Wed, 17 Dec 2025 00:41:39 GMT</pubDate>
            <description><![CDATA[Applying ancient wisdom to modern AI discourse]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>Every now and then, I see people celebrating AI as a magic shortcut tool. It can generate boilerplate code, draft your email, and smooth over all the rough edges of work.</p><p>Most people would agree that AI excels at these mundane tasks, and that it can help offload some of the thinking involved.</p><p>This reminds me of one of Plato’s dialogues. In Plato’s Phaedrus, there is a story about Theuth the inventor and King Thamus.</p><p>In Phaedrus, Socrates tells a story from ancient Egypt. Theuth was a great inventor, credited with many inventions, one of which was writing (grammata).</p><p>He proudly presents his inventions to the Egyptian King, Thamus, saying that writing will bring much benefit to humanity.</p><p>He claims that writing will improve memory and help preserve knowledge. He believes this to be his greatest gift to humanity.</p><p>King Thamus disagrees. He says the inventor is not the best person to judge their own inventions. It is the users who determine whether an invention is beneficial.</p><p>King Thamus argues that writing will actually weaken our memory. People will rely on written pages instead of using their minds to commit information into memory.</p><p>He argues that writing only gives the appearance of wisdom, not true understanding. We will recite information without genuinely learning.</p><p>It will make people seem knowledgeable while actually remaining ignorant.</p><p>This was King Thamus’s famous critique of writing:</p><blockquote><p><em>This invention will produce forgetfulness in the souls of those who learn it... They will be hearers of many things and will have learned nothing. They will appear wise but will not be so.</em></p></blockquote><p>Ironically, this story was preserved by Plato through his written dialogues.</p><p>This is the classic Platonic philosophical tension between knowledge and information.</p><p>According to Plato, and probably his teacher Socrates, true knowledge is living, interacting, and dialectical. Writing freezes ideas; it cannot answer new questions. Written words seem wise but cannot defend themselves.</p><p>I do think that this little story applies to our modern discourse about AI technology. Replace ‘writing’ with ‘AI’ and King Thamus’s critique feels eerily contemporary.</p><p>Writing, like AI, was once presented as a cognitive superpower, but Plato (via King Thamus) saw that it also outsourced memory and gave the illusion of understanding.</p><p>It is undeniable that writing became the backbone of civilisation, preserving knowledge across generations. But acknowledging its benefits does not invalidate King Thamus’s concern.</p><p>His concern isn’t about the existence of writing; it is about its effect on human cognition. Most warnings about technological advancement aren’t about the technology itself, but about its effect on human societies and culture.</p><p>It is true that modern technology gives us tools that remove friction, but we forget that friction is often where thinking happens.</p><p>Technology can be a great tool if we use it deliberately. Unfortunately, many of us treat technology as a crutch, something we rely on rather than wield.</p><p><strong>If we see technology as a crutch and use it as a crutch, we will be crippled by it.</strong> Over-reliance creates dependence, and dependence creates weakness.</p><p>The current state of AI and LLMs does create the appearance of wisdom.
It is a great amplifier. However, many people have misused it as an <strong>intellectual prosthetic</strong>.</p><p><em>Stay glitched, stay human.</em><br>Jibone.</p><hr><p>On-chain edition / <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://jshamsul.com/code-and-codex">Archived at jshamsul.com</a> / <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://codeandcodex.substack.com">Published on Substack</a> / <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://primal.net/p/nprofile1qqsqsgehv75h9hvla3d39gzh87w5karednhcaj6z3jhy5tmtsxmzgjgpwsmcx#reads">Relayed across Nostr</a>.</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <category>ai</category>
            <category>plato</category>
            <enclosure url="https://storage.googleapis.com/papyrus_images/571af9656dcf4454d7e554a89d3769b7c9daf597cdffb36d91eb6a1fe99d9874.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Digital Sufism]]></title>
            <link>https://paragraph.com/@code_codex/digital-sufism</link>
            <guid>ckb9YR6JTA787Bv1RxgH</guid>
            <pubDate>Mon, 15 Dec 2025 08:52:42 GMT</pubDate>
            <description><![CDATA[The soul, encoded in flesh and logic.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>A computer program does not know that it is a computer program.</p><p>It does not know that its existence is dependent on many other things that it <strong>does not have control over</strong>.</p><p>It needs other written programs for it to execute its written instructions.</p><p>In its digital form, it needs hardware to exist. It needs hardware to store its code. It needs hardware to send and receive the data it processes. It needs hardware to execute the tasks that are written for it to do.</p><p>In many ways, we are like a computer program.</p><p><strong>Our existence</strong> is dependent on other things that we have no control over.</p><p>We exist in the confines of space and time. I am here, you are there. I am writing this then, you are reading this now.</p><p>We need air to breathe. We need food to eat. We need water to drink.</p><p>All the things we need are dependent on other things, and those things are dependent on other things.</p><p>There is an entire cosmos designed by the <strong>ultimate Creator</strong> that in every moment is constantly shifting, moving, expanding. Every element down to a tiny atom has its purpose.</p><p>However, unlike a computer program that does not know it is a computer program, <strong>we know what we are</strong>.</p><p><strong>We know our purpose</strong>.</p><p>Well, I do, but do you? This program remembers its <strong>Maker</strong>.</p><p><em>Stay glitched, stay human.</em><br>Jibone</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/1100dfc73026e434485ce1747acb512e67268d68f339f3b4f76b7c62941ec5f2.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[The Queen’s Magic Mirror]]></title>
            <link>https://paragraph.com/@code_codex/the-queens-magic-mirror</link>
            <guid>zQXmSUw4fZN1bErNDrdy</guid>
            <pubDate>Mon, 15 Dec 2025 08:38:37 GMT</pubDate>
            <description><![CDATA[Reflections On Vanity, Algorithms, and the Devices We Obey]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p>I’m sure most of you are familiar with the story of <strong>Snow White</strong>. Maybe not the original 1812 version by the <strong>Brothers Grimm</strong> in their <em>Children’s and Household Tales</em>, but certainly the sanitised retelling by Disney.</p><p>The Disney version transforms what was once a cautionary tale of envy, survival, and justice into a romantic love story.</p><p>In the original, there was a Queen. And there was a <strong>magic mirror</strong>, a mirror that told who was the fairest of them all. One day, the mirror stopped saying the Queen was the fairest and named a girl by the name of <strong>Snow White</strong> instead.</p><p>The Queen sent a huntsman to kill Snow White; this much remains the same in both versions. But the Grimm version is far more gruesome. The huntsman was ordered to return with Snow White’s lungs and liver as proof. He faked it by killing a wild boar and bringing its organs to the Queen, who cooked and ate them.</p><p>There were seven dwarves in the original, but they were not given names, nor much in the way of personality.</p><p>The Queen attempted to kill Snow White three times. First, with a lace bodice, a garment so tightly laced it suffocated her. Then with a poisoned comb that, as it stroked her hair, made her deathly ill. The third attempt is the one we all remember: the <strong>poisoned apple</strong>. The first two times, the dwarves returned just in time to revive her. But with the apple, we know what happened.</p><p>Snow White wasn’t saved by a prince’s kiss. In the Grimm version, the prince sees her in her glass coffin and insists on taking it back to his kingdom. During the journey, one of his servants stumbles. The coffin falls. A piece of the apple dislodges from Snow White’s throat, and she awakens.</p><p>She marries the prince. The evil Queen is captured. As a punishment, the prince forces her to wear red-hot iron shoes and dance in them until she dies. This part was left out of Disney’s version; in that one, I believe, the Queen simply falls to her death.</p><h2 id="h-magic-mirror-symbolism" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Magic Mirror Symbolism</strong></h2><p>There are many symbols and interpretations one could draw from the original version of Snow White compared to Disney’s retelling. I’ll leave you to contemplate the rest. In this Code &amp; Codex dispatch, I want to focus on the <strong>‘magic mirror’</strong>.</p><p>I trust most of you are familiar with the <strong>magic mirror</strong>; it appears in both versions of the tale. A strange, enchanted object that always tells the truth, but only when asked.</p><p>I’ve often wondered: what exactly is the <strong>magic mirror</strong>? Is it a device you can ask anything, and it gives you the answer? Perhaps the mirror <em>answered based on what it was trained</em> to reflect. Has the Queen ever asked it anything other than who is the fairest in the land? If the mirror truly possessed magical intelligence, wouldn’t it be more useful for other kinds of questions?</p><p>Why not ask it for dirt on a political rival? Or for intel on a hostile state? Or for secrets that could expand her power?</p><p><em>(I’ve read some interpretations that suggest “beauty” represents power.
If Snow White is more beautiful, then she may one day hold more influence and authority than the Queen.)</em></p><p>But if the <strong>magic mirror</strong> is limited, if the only answer it ever gives is who is the fairest, then what good is it? What purpose does it serve, beyond stoking vanity, envy, and eventual chaos?</p><p>I prefer to believe the <strong>magic mirror</strong> could answer far more profound questions. It was the Queen’s own vanity that blinded her. She never used it to its full potential. Like many powerful tools, the mirror didn’t limit itself; it was limited by the user.</p><h2 id="h-our-magic-mirror" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Our Magic Mirror</strong></h2><p>We are living in the age of <strong>‘magic mirrors’</strong>. Everyone now has their own. It's small enough to carry in our pockets wherever we go. When inactive, its black glass surface reflects our face. When awakened, it provides us with access to whatever information we desire.</p><p>But are we using this <strong>magic mirror</strong> to its full potential, or are we just as vain as the Queen in Snow White? As we scroll through algorithmically generated feeds, we keep asking: who is the fairest? Who is the richest? Who has more than us?</p><p>The algorithmic nature of this magic mirror is designed to keep us engaged by reminding us of what we don’t have, blurring the line between our <em>‘wants’</em> and our <em>‘needs’</em>.<br><br>Another thought I had when rereading Snow White: was the Queen using the mirror, or was the mirror using the Queen? Mirrors reflect. But reflections can mislead, especially when we fall in love with them.</p><p>We think we're in control of our digital media consumption, but is that truly the case? Or are we <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://codeandcodex.substack.com/p/spiders-and-their-web"><u>caught like flies in the algorithmic web, spun by big-tech spiders</u></a> waiting to suck us dry?</p><p>Our <strong>‘magic mirror’</strong> today is more magical than ever, thanks to <strong>AI and large language models (LLMs)</strong>. You can literally ask it anything, and it will generate answers that seemingly ring true. I know that fluency isn’t truth, and coherence isn’t wisdom. Nevertheless, the generative capabilities of these next-gen AI models are powerful and wondrous.</p><p>And yet, what do we use it for? To turn our photos into Studio Ghibli animations? To roast strangers on the internet?</p><p>Let’s pause and reflect. Let’s reflect on our use of this mirror. Let’s be mindful of our digital consumption. This mirror can be wondrous, but it can also be deceptive. Let’s not be like the Queen, using it only for vanity.</p><p>Both <strong>mirrors</strong> and <strong>microchips</strong> share a common ancestor: <strong>silicon</strong>. One reflects our face. The other powers the algorithms that reflect our desires, fears, and vanity back at us.</p><p>In folklore, mirrors told the truth.</p><p>In our world, algorithms do, or claim to.</p><p><em>Stay glitched, stay human.<br></em>Jibone</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/9149369a585b27dc9fe674e41acf60e928b531d5d5d9e90f4eca7ab644f0afce.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Click Here to Enter]]></title>
            <link>https://paragraph.com/@code_codex/click-here-to-enter</link>
            <guid>EJmszLRT57fLDRaC6r7H</guid>
            <pubDate>Mon, 15 Dec 2025 08:19:19 GMT</pubDate>
            <description><![CDATA[There used to be a door to the digital world, and a ritual to log in and log out. Now the door is gone.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital traveller.</em></p><p>There was a time when the “<strong>family computer</strong>” sat in the family room. Back in the ’90s, not every household, especially in developing countries, had a computer, but I was lucky to grow up in a household that had one. And then came <strong>the Internet</strong>.</p><p>I remember coming back from school: I would immediately drop my bag and rush to the computer. I only had a few hours until my parents came home from work. I turned on the family <strong>Gateway PC</strong>, and the <strong>56k modem</strong> made a loud screeching noise as it attempted to establish a connection to the ISP that provided my access to the Internet.</p><p>I checked my <strong>Yahoo Mail</strong> inbox for any e-mail I may have received. I replied to any <strong>ICQ messages</strong> that had come in while I was offline. I then checked the front page of <strong>Yahoo</strong> for news, logged on to <strong>Yahoo Chat</strong> to see who was around, and launched <strong>mIRC</strong> right after that. Then, if there was nothing interesting, I just surfed random webpages. And without my realising it, time was up.</p><p>I disconnected and logged off, freeing the phone line. I clicked “shut down” to initiate the shutdown sequence, and watched the CRT screen go black.</p><p>The <strong>‘old’ internet</strong> had a ritualistic nature to it. There were things you had to do to get online, and once done, things you did to log off. The Internet used to be a place that you went to. It had a doorway that you opened and walked through, and after you were done, you walked out and closed it behind you.</p><p>There was a clear separation between the real world and the digital realm. The Internet used to have a sense of ‘space’ or ‘location’ to it. Things on the Internet metaphorically symbolised a sense of ‘place’ that you went to, and it is reflected in the online jargon we used at the time: <strong>Home</strong>page, note the word ‘<strong>Home</strong>’; Web <strong>address</strong>, note the word ‘<strong>address</strong>’; Chat <strong>room</strong>, note the word ‘<strong>room</strong>’.</p><p>If you had the chance to experience the old <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/GeoCities"><u>Geocities</u></a> internet, you would have noticed that many of the old web designs took the term ‘Homepage’ literally. Many had virtual ‘<strong>doorways</strong>’. You would often see links that said “<strong>click here to enter</strong>”, as if you were entering a place.</p><p>Most websites in those days would display a visitor counter and a <strong>guestbook</strong>, where visitors were encouraged to leave a note about their experience.</p><p><a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/GeoCities"><u>Geocities</u></a>, one of the earliest platforms that let users build personal websites, fully embraced this sense of place. Its communities were organised into themed "neighbourhoods" named after real-world locations: Area 51, Hollywood, Capitol Hill. Your little site didn’t just exist online; it lived in a <strong>neighbourhood</strong>.</p><p>I never gave much thought to the act of logging in and logging off. I never noticed these boundaries between the real world and the digital realm. We moved through doorways, we arrived, and we left.
The internet lived in a place, and that place had a door. It took me a long time to notice the door had gone missing.</p><p>Today, in 2025, the internet is no longer a place we go to; it is always there with us. We take out our phones, and we are on the web. Notifications from the digital realm vibrate the physical phones in our pockets, demanding our attention. There is no more sense of boundaries and place. We no longer log in and log off; we no longer arrive and leave. <strong>We can’t leave</strong>.</p><blockquote><p><em>Mirrors on the ceiling, the pink champagne on ice</em></p><p><em>And she said, "We are all just prisoners here of our own </em><strong><em>device</em></strong><em>"</em></p><p><em>And in the master's chambers, they gathered for the feast</em></p><p><em>They stab it with their steely knives, but they just can't kill the beast</em></p><p><em>Last thing I remember, I was running for the door</em></p><p><em>I had to find the passage back to the place I was before</em></p><p><em>"Relax," said the night man, "We are programmed to receive</em></p><p><em>You can check out any time you like, but you can never leave"</em></p></blockquote><p>Eerie lyrics from <em>The Eagles — Hotel California</em>.</p><h2 id="h-psychology-of-a-door" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>Psychology of a Door</strong></h2><p>Spaces with defined entrances and exits, even symbolic ones, give us a sense of <strong>orientation and belonging</strong>. They offer safety and control. A door is a threshold, a passage between one state or environment and another.</p><p>There’s a well-documented phenomenon in psychology called the “<a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/Doorway_effect"><u>doorway effect.</u></a>” Have you ever walked into a room intending to do something, only to forget what it was once you got there? Studies suggest our brains compartmentalise experiences by location; the doorway acts as an event boundary, segmenting our memory.</p><p>Perhaps we didn’t think much about this when we removed the door to the internet. Without it, we’ve lost the ability to compartmentalise the offline and the online.</p><p>Was it for the sake of convenience and efficiency that we discarded the ritual of logging in and logging off?</p><p>In <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://jshamsul.com/bookshelf/neil-postman-amusing-ourselves-to-death"><u>Amusing Ourselves to Death: Public Discourse in the Age of Show Business (1985)</u></a>, Neil Postman warned us about electronic media. He was deeply critical of television and the way it redefined truth, knowledge, and culture. In the final chapter, he emphasised the importance of understanding <strong>the politics and the epistemology of media</strong>, how each medium shapes the structure of our discourse.</p><p>Maybe we haven’t thought hard enough about the politics and epistemology of the internet. Television once reshaped modern culture. Now, the <strong>always-on Internet</strong>, with its bottomless pit of digital media, has assumed that role: faster, deeper, and more immersive.</p><p>Previously, there was a symbolic door that separated the Internet from the rest of the world; now that door has ceased to exist.
Should we let this new medium of communication take full control of our culture’s direction, unchecked?</p><p>There is a lot to ponder, but for now, I just want you to realise that <strong>the door is missing</strong> and <strong>there is no way to close it anymore</strong>.</p><p><em>Stay glitched, stay human.</em><br>Jibone</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/43962b954220516b488dc9dceb3d84fd651ba243cb57e50abf50b53cb9a82284.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Spiders And Their Web]]></title>
            <link>https://paragraph.com/@code_codex/spiders-and-their-web</link>
            <guid>1Wn2ZtWrVJNhn9Bj332c</guid>
            <pubDate>Mon, 15 Dec 2025 08:00:46 GMT</pubDate>
            <description><![CDATA[Peace be upon you, fellow digital travellers. Spiders are fascinating creatures. They are not insects but arachnids, distinguished by their eight legs, unlike insects, which have six. Spiders are solitary by nature. While many insects form large colonies, most spider species are individualistic and often aggressive toward their own kind. Some even display cannibalistic tendencies, whether by devouring their offspring, being eaten by their young, or engaging in sibling cannibalism shortly afte...]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital travellers.</em></p><p><strong>Spiders</strong> are fascinating creatures. They are not insects but arachnids, distinguished by their eight legs, unlike insects, which have six.</p><p>Spiders are solitary by nature. While many insects form large colonies, most spider species are individualistic and often aggressive toward their own kind. Some even display cannibalistic tendencies, whether by devouring their offspring, being eaten by their young, or engaging in sibling cannibalism shortly after hatching.</p><p>Though rare, a few spider species exhibit social behaviours, forming colonies that cooperate in web-building and hunting. But for the most part, <strong>spiders walk alone</strong>.</p><p>One of the spider’s most distinctive traits is its ability to spin webs. These silk structures serve multiple purposes: shelter, reproduction, egg protection. But their primary function, the one we all recognise, is <strong>to ensnare their prey</strong>.</p><p>Orb-weaving spiders construct intricate, sticky webs strategically placed to catch flying insects. When prey is trapped, vibrations ripple through the silk, signalling the spider to strike. It swiftly subdues its victim, injects digestive enzymes, and slowly liquefies the insides, <strong>sucking them dry and leaving behind only an empty husk</strong>.</p><p>Who else builds webs to ensnare prey and drain them until nothing is left?</p><p><strong>Tech companies.</strong></p><p>Mainly the ones involved in entertainment and media, distribution and creation. They disguise their platforms as something that <em>‘empowers’</em> the masses. They say: now you can reach more people with your message; the content you create can impact more lives.</p><p>However, the number of impactful pieces of content or lives changed for the better isn’t what they’re tracking. They are looking at the amount of time their users spend on their platform, and every tweak, every new feature they ship, is meant to increase this metric. <strong>The more time their users spend on the platform, the more ads they can push to them</strong>.</p><p>Like spiders weaving their traps, tech companies create digital online webs. They even call it by that name, <strong>‘web’</strong>, the World Wide Web.</p><p>Like solitary spiders with cannibalistic instincts, tech giants devour their competitors, absorbing them or wiping them out.</p><p>Like spiders sensing vibrations in their web, the tech industry tracks every movement, cookies, analytics, behavioural profiling, all signalling the presence of potential prey. They don’t call them <strong>‘prey’</strong>; they call them <strong>‘users’</strong>.</p><p>If you think about it, a spider’s web is, at its core, a net. And what does a net do? <strong>It catches things</strong>.</p><p>The tech industry doesn’t even try to hide it; they literally call it <strong>‘the net.’</strong> They claim it stands for <strong>‘internet’</strong> or <strong>‘network,’</strong> but we all know what it is really meant to do.</p><p>What are they trying to catch with this ‘net’? Not our physical bodies; they want to capture <strong>our attention</strong>.</p><p><strong>What we pay attention to defines our reality</strong>, and algorithm-driven feeds are increasingly what we pay attention to these days.
<strong>Those who control these algorithmic feeds control your reality</strong>, if that is the only thing you are paying attention to.</p><p>Spiders employ clever tactics to enhance their traps. Some webs exploit electrostatic attraction: positively charged flying insects are drawn into the web, increasing their chances of capture. Others use web decorations (stabilimenta) to attract prey.</p><p>Tech companies operate the same way. Their ‘webs’ are reinforced by aggressive advertising, designed to constantly pull your attention toward them. The industry calls it ‘advertising campaigns’. The word ‘campaigns’ is a military term. It involves strategic manoeuvres to capture valuable targets.</p><blockquote><p><em>“There is a war going on. The battlefield is our minds, and the price is our soul.” — </em><a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://jshamsul.com/essays/2024-08-08-the-war"><em><u>The artist formerly known as Prince.</u></em></a></p></blockquote><p>In the advertising sense, an ‘advertising campaign’ is a carefully orchestrated operation, a series of coordinated efforts (ads, promotions, social media, etc.) designed to achieve a singular goal: ensnaring more <strong>prey</strong> that they call <strong>‘users’</strong>. The larger their user base, the more they can extract.</p><p>And just as spiders subdue and immobilise their prey, draining the life out until what's left is an empty shell, tech companies, typically those offering media platforms, subdue users with <strong>endless feeds, draining their focus, their time, their sense of self, until what remains is a hollow shell, mindlessly consuming content.</strong></p><p>They want you to have your head up in the <strong>clouds</strong>. They are not hiding this; they literally named it <strong>‘clouds’</strong>.</p><p>Some spiders go even further, using deceptive signals to lure prey, mimicking mating cues to entice victims into their trap. Tech companies do the same, <strong>unapologetically leveraging hyper-sexualised marketing and manipulative design to keep users hooked.</strong></p><p>Unlike a lion that hunts actively, a spider builds its web and waits. Occasionally, it will repair or expand the web as needed.</p><p>Tech companies do the same. They build a platform once and let the <strong>‘For You’ algorithms</strong> do the rest, feeding users a steady stream of content designed to keep them docile and ensnared.</p><p>The spider’s web, however, can only catch small insects like flies, creatures that flit from place to place, always seeking stimulation, always searching for food, breeding sites, or mates.</p><p>A fly has two large compound eyes, each made up of thousands of lenses, granting it an almost panoramic field of vision. Looking at everything, everywhere, all at once. Yet despite this, it cannot see the web until it’s too late.</p><p>Tech companies, like spiders, don’t ensnare everyone. They primarily trap those who go online aimlessly, seeking only fleeting stimulation, hedonistic distractions that keep them stuck in the web.</p><p>But if you go online with purpose, if you <strong>remain mindful of what you consume</strong>, you are not a fly.</p><p>You can see the web for what it truly is.</p><p><strong>Don’t be a fly.</strong></p><p><em>Stay glitched, stay human.<br></em>Jibone.</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/d34bfcbecfdc1f11a26635c771b460533bd33f6f1f27772db08617e61686de16.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[AI is the UI]]></title>
            <link>https://paragraph.com/@code_codex/ai-is-the-ui</link>
            <guid>X3s2SHRprYueTIB7cxsj</guid>
            <pubDate>Mon, 15 Dec 2025 07:26:55 GMT</pubDate>
            <description><![CDATA[We started with the command prompt, then we were given the GUI, now we are back to the prompt.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital traveler.</em></p><p>The AI revolution is upon us. In the near future, AI will be as ubiquitous as computers are today. But this future isn’t about <strong>killer robots taking over</strong> or <strong>sinister machines plotting against their masters</strong>. Just as networked computers reshaped the way we work, do business, socialise, and entertain ourselves, AI will introduce a new paradigm for all these things.</p><p>But before we get there, we must start at the beginning, the command prompt.</p><h2 id="h-first-there-was-the-command-prompt" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>First there was the command prompt</strong></h2><p>The earliest interface I used to interact with a computer was the <strong>‘command prompt’</strong>. A beige plastic enclosure housed a CRT monitor, its black screen sometimes reflecting my face at just the right angle. Within that black mirror, it displayed white or green text, and a small blinking rectangle marking where the next letter would appear.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/d983af2692c8a22a6f51945cb20f653a4cb2656d8e1029bfe48c73e724cbf8a8.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAB8AAAAgCAIAAABl4DQWAAAACXBIWXMAAAsTAAALEwEAmpwYAAAFBklEQVR4nM1WLY+kShQlaUGCIISEFCGEpEJIKggSkhKIShAIBALRogwCgUCUKEOCQSAQLdog2iAQLUaMaLNizIoVa0eumT+wf2Hdy3DnzUfv7L55myfeEZXugrrcj3MOKMuyTNMkhCiK4nQ67ff7YRjWdeWcf//+ves6XdcNw9jtdsofQFVVy7LgvG3bGGNVVbUNhmFomrbboPwfgRDyNyCE1A26rmuaBg2B3FVVNQxD32BsgE2o8rkyuEfTtJfo8zx/+fLl4eGhaRq2IQzDeZ7bti3L8nK5ZFnGOZdSjuO4LMv5fO66Lo7jm5ubdV3HcZymCSFkmuYwDMuy9H1vWdaHKtO2Ip7/Ps8AfqiqutvtTNN83oSdl/NxHHueRykNgiCKIimlECJJEinlNE1N0/R9P00TpZQxNk0TIcR13TRNP5QdIQRjHMex7/tRFNV1zTmnlFZV1fd9URRVVUkpgyBgjAkhfN/HGFNKPxRdVdUP3fdec3519YXEP378eHh4OBwOlNJxHIdhmKZpHEe64XA4LMvi+/4wDLe3t2VZEkKWZWnbtmmatm2hLM75sgF25nlmjL08UFEU0JRpmgghINZut4Opappm27bnecBFIOVrxYEkHcd5zenH0MMw1HXNGNM0zfO8MAyhs6ZpEkIYY7quu66LMXYcx7KsOI4xxkEQwJwsy7JtOwxDz/OAF2EYEkKeEgfaNk0ThqGUknPeti0Qo23baZp832+a5nA4SCkdx4EbiqKIomgcR4jV9/3xeOQbQChP0bMs832fc951XVEU/7Gl7N6GezPxV/gVQ65EdH0cY1wURZIkZVlSSi3LopQihIqiCMMwz3OEECHENE1KKcZ4WRZCiJSyaRrovvIbBEEgpazr+nQ6SSkxxmAjzWY7wzBwzhljIDTP87quC8PwdDqt69p1XZZlhBDf99/3FkJI13VlWcKKMZZSUkrrus6yLI5jMKZ5npMkgZp0XRdCXC4XznkQBMMwdF0XRdE7DYzjeBzHsiyFEHVdCyH6vg+CQAiBMWaMJUnyUc/7exhvug/2bW0wTdN1XXPDleiv1g+xC6T0rts4jpOmKef8RR1vr0LH8zzXdb0oCtM0FUUJNwRBEMexcn9//+nTJ8YYxhiimKYZRRFIv2maruvAbxFCtm0bhoEx1nU9y7K+78uyXJYliqJ1XRFCIKCqqsqylFIqQoi7u7v7Dbe3t6qqRlH09etXIcRVN4qiEEKkaQrZpGnaNE2SJMMw2LbddR14i2VZeZ6Dhyvn8xnMYJ5njLGiKFEUffv2DRz8dX91XS/LMgzDm5ubNE3zPBdC5Hl+PB6DIJimCd5ilmXt9/s8z4dhUPq+v7u7Ox6PTdM4jgPRl2UBC/1Zya7rAjsJIfv9njE2jqPjOOM4QmfSNN3v9zAwZV3Xz58/n04neFMjhMqyXNe1KIp3DcB13cvlAkUIIbIsg9zXdXVdV1EUznlVVUVRPPZ9nueqqoDm8zxTSuEA24gEJv76GQih8/kMHlmW5X6/7/seNOh5nqIoMBJKaZIkT/U+r4SQNE3jOG7btuu6asPVZwH4xzRNnueBM4MbQ4rQn3ekdSWz3U9+iRCCzwIp5fl8dhxnmqbD4RBF0fF47PteSmnbNudcCPHY96ukfg9d1zHGhmG4rhtFkaZpICjTNBljQRDAVcdxfN9/HIPjOKZpGobxj6H/BM/zeRrCv8dviv4LKK3WQfvaJCMAAAAASUVORK5CYII=" nextheight="400" nextwidth="387" class="image-node embed"><figcaption htmlattributes="[object Object]" class="">MS-DOS Command Prompt</figcaption></figure><p>I was 11 or 12 when I joined my school’s newly formed "computer club." 
In the early '90s, computers were still expensive machines used mainly for serious business work, and the club was my only chance to experience one.</p><p>We learned <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/WordStar"><u>WordStar</u></a>, <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/Lotus_1-2-3"><u>Lotus 1-2-3</u></a>, and basic <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/MS-DOS"><u>MS-DOS</u></a> operations. We were assigned simple tasks, formatting a document in WordStar, creating a spreadsheet in Lotus 1-2-3. Everyone rushed to finish because once we were done, we had free time. For me, that meant booting up <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/Prince_of_Persia"><u>Prince of Persia</u></a>.</p><p>Another favourite was <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/Space_Quest"><u>Space Quest</u></a>, a sci-fi adventure game. Instead of pointing and clicking, you type commands into the prompt, “search the body,” “look at window,” “pick up card,” “open box.” Each command made the character take action, a text-driven interface for an interactive world.</p><p>With MS-DOS, UNIX, and other operating systems of the time, the command prompt was the main interface to the computer. You had to type the correct command to get it to do what you wanted.</p><figure float="none" width="469px" data-type="figure" class="img-center" style="max-width: 469px;"><img src="https://storage.googleapis.com/papyrus_images/862492daef0b3e64a9aa8606f1277b3f9147f5b4f04ce43c7d0f46695ca90133.png" 
blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAUCAIAAABj86gYAAAACXBIWXMAAAsTAAALEwEAmpwYAAAF40lEQVR4nLVVaVBTZxT9XvIeSRADQmVTCoGAZGMLIYSwBEgBWZVdXLAIlCXYgAhIKy4MLiAWWjv+YKZYq4BsDpQygoiCIJQRR6mOClisyKqCS6Uo0tvJA8VO7fRXz5z53v1+vHPuPTP3PTQ8PFxRUdHS0lJf39DZ2d3W1nb//m9NTc0dlzvGxsbz8vJOnDgxMzMz/2Z+7l8w/2Z+5uVMWVlZdVV1TU1NSUlJb29vRUXl4OAgACCEEIZhCCEqTqFQKVScguNU1ZWqOnECp+AU9A9gqrcWHkskZUg1slq4vgMNoWUIqSNEZ9Co6gwGlWDgNDpBMNRoywiCQRAMKkHDyUKNxsAJBoOqzsQYTIzOxOhUROBUGkHQ1On0FUwajU5FGO1tL6SZrV6n3OGlm8kTLzZIVg8q7WqOyc8ellUHrU4/4FpR4H72mFdjFGf3F06lRz3qiuUNkWZZwe6HfJVNfspm37RWO699efLKQu+WRGG9RP+Kq/EjD9PfKRRNUpwMwFP6dOcYpA5AwjnYdBA+iRmP3A8RORC+ByL3Q2Suigv1BrIOSgCRT3vmPGQ+haxXIIltc7OZCM0C95AHEckQnQ8h7oDQClKcNJBJJjMmIOkXUNyBmGsQ3QW7AXbNg3xvkYsiK37H3s3K7NjU3Z7p+8zjU31rOhJvgMT/6vYxSOoDxRA4Bnb5K18rn4JP6dUvAXIAopIAIU0MxxYNPJym0x+C/6XyiO6h5D4IqZlKfwjbusGtoMXz2w5Z4Xn/k/0+Jdfdiy55FF9aW3p9ayM4eHZGNsxtqYVtneAR88An/kXcHfA+3qccgZT7ELpRNcGSgZd8NrwUnHMuyIrOb+2A9WfGlUOwuRW2j0DGY9hYDiEdl73Lm3dMQuYziG+HyErgCqs8v+pKuAkJt0Ec0OmbOJs8Bn6ld1MnIXkIgiMAIe2liLx9Xq0/AO6s536uEHDqnsXm3J1jEN0GadOQ8QjWFczVjU5nNvalTULWc/js3KKB7MjFhH5IewQO/u1+itnUSZAX/5zxHDKmICIGENJamsDbbza44E8X4wlvt7mgkxO82KIdDyG6HRT98Pl1WH90TjEGGy5OpgyA8hbE/QiRVcAT1noWdybcVhlIgnt8Y15v7HoW0fws/QlkPIOoxHcRkevgI4PggyAzmfJ1gcDSUcvognyngO+5knpxYJ1j2BlrrwpZyBlnv2qHoHpxaL046Duut5NpstuB1vj2uc0dYBtWm6QbUcnj5wQqYm5CSg/I7f5ASG1BXDUCTS1MZjYlN3vtKXoRUj4dLgmdcjHtE7l32FgPiLn9YsFla85NEXdAzOsVci9a8Uek/EtWdpVrpNfsXXqELg0Czwln1riUNe1itst4j9jyJgXTJtUXt5mMieosM5nyWwvS3NoxKeuB1OuUhSRZ2+AHnluNlUMCx+GIuWMVz/G0vd32VfzKNcLjLGGSNr/MwqrcwmanruUFu+BhJ5MB8cci5iGE0Sjvqb9baQZ/ZZPI6Nppc/ZjZ9NWp61NPH2vohthuRD9DUTlQ1gybPkawgpBFvJrsD0EWEKwAwRYw3ouGDIObtJedU8iH3I0tmFsRxiiLDS9JK46dYxpmb3CVY+dTUc8Yn/iGvl/mhZSCLyPGl2MJ2z1rriznrpyR52NhlnL8lP1Nyn1wpV6G1P0NmUYxq1Tt5arqZ82NxqRsm3pyW8zWdRWgYIhjKIj1NhzS2R81zGgR3a4VcCKrAK56/M6jrhXaNlta9Vta99ty+u04fcKuaNOJu9zRMpW6Jhn6xkMOLpZqWeTBn/PZwFqeKgWkcLEE5fjcUw8UXdZNhNP1MBjNPBtGnjMcjxWE09a4Aoi9X1qESlahEITT9JR24nh8rdfdBIcEubmFhyOpYG+NlOTyeEZ2wktDI30DA1W8gUmXIEpV8DiCljaKzU+0NR/QiAQ2NjYsNlsoVBoamrGXK4lEjlyLPm6uvr6+qvFYmcuh8/l8MViKZttieN0DOEfJCJJFh/4Qf2P+AuIx4TDEVNXrgAAAABJRU5ErkJggg==" nextheight="200" nextwidth="320" class="image-node embed"><figcaption htmlattributes="[object Object]" class="">Space Quest</figcaption></figure><p>Want to change directories? — <strong>‘cd’</strong></p><p>Want to create a new directory? — <strong>‘mkdir’</strong></p><p>Want to list all files in a directory? — <strong>‘ls’</strong> (or <strong>‘dir’</strong> for MS-DOS)</p><p>Command prompts were rigid. You had to be precise, no room for ambiguity. To many users, this was cumbersome; you either memorised commands or relied on a manual. For some, it was downright intimidating, full of jargon that seemed alien.</p><p>That’s why the arrival of the <strong>Graphical User Interface (GUI)</strong> changed everything. 
It made computers accessible by using familiar real-world metaphors, the <strong><em>"desktop," "folders," "documents," and "trash bin."</em></strong> People instantly understood these concepts, and computing was no longer limited to those who could decode cryptic text commands.</p><h2 id="h-the-graphical-user-interface-revolution" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Graphical User Interface Revolution</strong></h2><p>In December 1979, Steve Jobs, accompanied by a team of Apple engineers, visited <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://en.wikipedia.org/wiki/PARC_(company)"><u>Xerox’s Palo Alto Research Center (PARC)</u></a>. The visit was part of a deal, Xerox invested $1 million in Apple before its IPO in exchange for allowing Apple engineers to tour PARC and explore its technological innovations.</p><p>What they saw changed the future of personal computing.</p><p>The Xerox researchers introduced the Apple team to several groundbreaking technologies developed at PARC. Among them was the <strong>Graphical User Interface (GUI)</strong>, a revolutionary way to interact with computers. Unlike the rigid command prompt, the GUI presented a world of visual metaphors: windows, icons, and clickable elements that mimicked real-world objects.</p><figure float="none" data-type="figure" class="img-center" style="max-width: null;"><img src="https://storage.googleapis.com/papyrus_images/57ccba5b8d2ce251749ef2bf35506175b9af15b02d63bb46a62a419657899b69.png" blurdataurl="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAVCAIAAACor3u9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAEGUlEQVR4nJWVf2Q6fRzAz8h4yDnnXOdOls09liRNXyllMqZ/tr+WSZn+m9l8n8mUzCz9kZLJFBOZ/bXJ45zMGdkfM1NEZpLMnGRO+uMeSZIz96jP1rMf376Pvf7qPne9X73f73u/g25vbxOJxOXl5eHhYaVS8fv9hUIhk8kEAoFSqRQdEYlEHh8fn5+fm7+i0WjwH3l6eqrVajzP39/fQzRNUxSF47hKpSIIgiRJiqJIklSpVNgIFEXHH8ADxEfUavXsO2ZmZiwWy/r6utlsnp2dhWAY1ul0brdbr9drtdo/R9A0rdVqvV7vz59/bW9v7+zs+Hw+v99vNBppmtbpdPo3FhYWSJLEMIwgCJVKRZKkQqFwOBzpdNrhcCgUCmh6enptbe38/DwYDJ6ennIcl8/nWZblOO7u7u7g4CCVSp2dnUWj0VKpVCgUGIaJxWKhUCiZTO7v74fDYY/HgyAIOQLDsB8/fsRisXw+H41GbTbbUOByuXK5XCwW4ziuUqmU37i5uTk6Ojo+Pj45OQF3y+VypVLJ5XKZTCabzSaTyUgksrq6CgQEQeA4rtfrbTabw+GwWq1GoxGCICgYDMqy3Gq1RFHs9/uDwaDf73c6HUmSZFl+eUOSpMFgwPN8o9EAl7Is8zzvcrnGAlCoqakpCIL0ej2CIENBKBSSZblcLvM83+/3ZVlut9v1ep3n+cFgMHbIsixJUjqdTiQSgiCAy16vF4/HlUolRVHgFYBheHFxMZvNer3eYQ9ABoIg/D3i4eHh5eWlWCxeX1+zLFuv1+V3iKJ4cXHBMEy73QYn7wUkSYIShcNhnufD4bDVan0V9Pv9Wq3WbDZbrVa3261UKrUR1WoVBAIZNJvNQqHAsmyz2QSHnzLAcVyr1ZrNZpPJtLS09KEHIG6v1wMRO50OiPKVccVABrFYbCwADnQEDMMYhg0FgUAA1D2ZTDIMA75ZLpdTqRTo5KTon0oEBOQ7CIL4r8nFYvHq6orjuG63K8tyrVZjWVaSpJcJALcgCFtbWzAMf51wAKRQKLxeb7vdbjQawghRFEF9qtWqKIr/TKDT6QiCwDCM0+lEUXSiAMMwjUbj8XhCodDe3l4gENgbEQgEwMkkgsGgz+ezWCxga00UgNFAEET5kT9GKH8LaCPYkpMYCgiCoChKrVZ/6g9YXv/LL+OOz18FMAyDbTy+PTc3Nz8/P343vguO468CsALdbvfKygrYukAAvUF930GSJE3ToHTDDFAUjcfjm5ubYGehKGq32zUaDQRBbrdbrVb/vspffztN07u7u0qlciigKMrtdvtHbGxsqNVqMO52u91ms1mt1m9lQJIkgiBOp5NlWbvdPpxkFEXBn5TBYNBqteA5DMMQBAET/93iYBhmMpmWl5cNBgMMw/8CazEeL1VEQbYAAAAASUVORK5CYII=" nextheight="342" nextwidth="512" class="image-node embed"><figcaption htmlattributes="[object Object]" class="hide-figcaption"></figcaption></figure><p>Jobs immediately grasped its potential. He saw how this interface could make computers accessible to the masses. He wasted no time directing his team to integrate similar GUI elements into Apple’s upcoming products, most notably the <strong>Lisa</strong> and <strong>Macintosh</strong> computers. 
Microsoft followed not long after, bringing GUI-driven computing to an even wider audience.</p><p>New computer users were introduced to a digital environment that felt familiar. The <strong>"desktop" metaphor</strong> turned abstract computing tasks into intuitive actions. Instead of typing arcane commands, you could move files into <strong>"folders"</strong> and delete them by dragging them to the <strong>"trash bin."</strong> No manuals, no memorisation, just point and click.</p><p>This was the dawn of <strong>"user-friendly"</strong> computing. The GUI made computers approachable, transforming them from tools for specialists into household appliances.</p><p>But as we embraced this new interface, we unknowingly locked ourselves into its paradigm. The desktop metaphor became the standard, shaping how we think about and interact with digital systems. Decades later, we’re still inside this invisible prison, constrained by the very interface that once set us free.</p><h2 id="h-the-desktop-metaphor-prison" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Desktop Metaphor Prison</strong></h2><p>The <strong>Graphical User Interface (GUI)</strong> as we know it was first developed at <strong>Xerox’s Palo Alto Research Center (PARC)</strong>. Xerox, a company rooted in paper-based workflows, photocopiers, printers, and document management, naturally designed its GUI around the concept of paper and documents.</p><p>For decades, this is how the average user has understood computers: as digital representations of desks, files, and folders. We <strong>"save"</strong> documents into digital <strong>"folders."</strong> We <strong>"delete"</strong> them by dragging them into a virtual <strong>"trash bin."</strong> These actions feel intuitive because they mimic the physical world.</p><p>But in reality, these are just metaphors, abstractions designed to make computers more <strong>"user-friendly."</strong> And like all powerful illusions, they obscure a deeper truth.</p><p>As the spoon-bending child in <strong>The Matrix</strong> tells Neo:</p><div data-type="youtube" videoid="uAXtO5dMqEI">
      <div class="youtube-player" data-id="uAXtO5dMqEI" style="background-image: url('https://i.ytimg.com/vi/uAXtO5dMqEI/hqdefault.jpg'); background-size: cover; background-position: center">
        <a href="https://www.youtube.com/watch?v=uAXtO5dMqEI">
          <img src="https://paragraph.com/editor/youtube/play.png" class="play">
        </a>
</div></div><blockquote><p><em>“Do not try to bend the spoon, that is impossible. Instead, only try to realise the truth. There is no spoon. Then you’ll see that it is not the spoon that bends, it is only yourself.”</em></p></blockquote><p>As Neo begins to grasp this truth, he is interrupted: <em>"The Oracle will see you now."</em> Ironically, <strong>Oracle</strong> is the name of a database company, one that deals not with documents, but with data.</p><p>The truth is: there are no <strong>"documents."</strong> No <strong>"folders."</strong> No <strong>"trash bin."</strong></p><p>A document is nothing more than a structured series of binary bits, 1s and 0s, etched onto a storage medium, whether it’s a magnetic hard disk, a solid-state drive, or another form of digital storage. Even the act of <strong>"saving" is a metaphor</strong>. In the early command-line era, the term <strong>"save"</strong> was seldom used; the action was more naturally referred to as <strong>“write”</strong>.</p>
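<p>You can watch the metaphor dissolve with a few lines of Python. This is only a sketch, and the filename and text are mine, purely illustrative, but it shows the point: “saving” a document is nothing more than writing bytes onto a medium, and reading it back returns exactly that, bytes.</p><pre><code># There is no "document", only bytes written to a storage medium.
# (The filename and text here are illustrative.)
text = "There is no spoon."

with open("note.txt", "wb") as f:      # open for *writing* bytes
    f.write(text.encode("utf-8"))      # the older, honest verb: write

with open("note.txt", "rb") as f:
    raw = f.read()

print(raw)            # b'There is no spoon.'
print(list(raw[:5]))  # [84, 104, 101, 114, 101], numbers all the way down
</code></pre>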
<p>And what about deleting? When you drag a file into the trash and empty it, the system doesn’t destroy it. It simply marks the space those bits occupy as free, available to be overwritten. The file remains, invisible but recoverable, until it is replaced by something new.</p><p>Folders? Another illusion. In the command-line world, they were never called folders; they were directories. A directory is nothing more than a list of files. You don’t “place” a document inside a directory the way you would with a physical folder. You append it to a list, or remove it from one.</p><p>And even a file, the fundamental unit of computing, is just another metaphor, another attempt to impose a familiar structure onto something inherently different. Layer upon layer of abstraction separates us from the raw truth of computing.</p><p>That’s why I call <strong>the desktop metaphor a prison</strong>. It confines us within an interface designed for paper-based workflows, limiting the true potential of digital computing.</p><p>Take the concept of folders. In the physical world, a sheet of paper can only exist in one folder at a time. If you want it in multiple folders, you must create copies. Editing one copy does not affect the others.</p><p>But this is not a limitation of computers, it’s a limitation of paper.</p><p>A digital document doesn’t have to exist in only one place. The same file can be referenced by multiple directories without duplication. Advanced users might recognise this as a <strong><em>symbolic link</em></strong> or an <strong><em>alias</em></strong>. Modern operating systems like <strong>macOS</strong> allow documents to be tagged and dynamically grouped into <strong>“smart folders”</strong> based on metadata, rather than rigid hierarchical structures.</p>
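<p>Here is that idea as a runnable sketch, assuming a Unix-like system (the directory and file names are hypothetical). One file, two directory entries, zero copies:</p><pre><code>import os

# Two "folders", which are really just lists of names.
# (Directory and file names here are illustrative.)
os.makedirs("work", exist_ok=True)
os.makedirs("archive", exist_ok=True)

with open("work/report.txt", "w") as f:
    f.write("draft one")

print(os.listdir("work"))  # ['report.txt'], a directory is a list

# Reference the same file from a second directory, without duplicating it.
os.symlink(os.path.abspath("work/report.txt"), "archive/report.txt")

with open("work/report.txt", "w") as f:
    f.write("draft two")               # edit it through one path...

with open("archive/report.txt") as f:
    print(f.read())                    # ...'draft two': the other path sees it
</code></pre>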
<p>These aren't just features bolted onto folders; they reflect a more natural way for computers to operate, one that is hidden away from the average computer user.</p><p>And this is just one example of how the desktop metaphor limits the true capabilities of our machines. If you look deeply enough, and if you realise that <strong>“there is no spoon”</strong>, you’ll find countless others.</p><p>I could write more on this, but I’ll leave it for a future Code &amp; Codex dispatch. Today’s topic is about the AI revolution.</p><h2 id="h-the-ai-revolution" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The AI Revolution</strong></h2><p>There have been attempts to break free from <strong>the prison of the desktop metaphor</strong>. The modern smartphone interface is one such effort. It introduced new ways to interact with computational devices: pinching, swiping, rotating, gestures that suggested a possible escape from the constraints of the desktop.</p><p>Yet, despite these innovations, mobile UI design has plateaued. App interfaces have settled into predictable patterns: the back button in the top left, primary actions neatly lined up at the bottom. We've traded one rigid metaphor for another.</p><p>Perhaps we’ve <strong>confused familiarity with user-friendliness</strong>. If something feels familiar, we assume it’s easy to use. But is that truly the case?</p><p>Another attempt at escaping the desktop metaphor came through AR/VR. When Apple introduced the <strong>Vision Pro</strong>, it promised a new way to interact with computers. But instead of breaking free, it doubled down on the old paradigm. What we got was... <strong>floating windows</strong>. The same desktop metaphor, just projected into 3D space.</p><p>Which brings us to AI, specifically, <strong>generative AI and large language models (LLMs)</strong>. This, I believe, is our best shot at truly escaping the graphical user interface.</p><p>Instead of manipulating digital representations of physical objects, icons, buttons, folders, we can now <strong>prompt the machine in natural language</strong>. No documents, no menus, no rigid workflows. Just intent and execution.</p><p>And yet, there's a strange sense that we have been here before, a <strong>déjà vu</strong>.</p><p>We started with the <strong>‘command prompt.’</strong> Now we’re back to the <strong>‘prompt.’</strong></p><p>The difference is that today, our commands don’t need to be precise. <strong>No memorising cryptic syntax. No rigid structure.</strong> We can simply describe what we want, and the system interprets it.</p><p>But here’s the problem: we're not actually moving towards this vision. The current trajectory of AI is reinforcing the desktop metaphor, not replacing it.</p><p><a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://www.anthropic.com/news/3-5-models-and-computer-use"><u>Anthropic’s ‘computer use’</u></a> feature lets AI take control of your computer to perform tasks. <a target="_blank" rel="nofollow ugc noopener" class="dont-break-out" href="https://openai.com/index/introducing-operator/"><u>OpenAI’s ‘Operator’</u></a> lets AI use a browser to execute commands. These AI agents are not operating at a lower abstraction; they’re interacting with computers just like we do, navigating the same buttons, windows, and menus.</p><p>In other words, <strong>they’re just as trapped in the desktop metaphor as we are.</strong></p><p>But AI doesn’t need to be confined to our interface conventions. It doesn’t need to click buttons or open files. It could, and should, <strong>operate at a much lower level of abstraction.</strong></p><p>This is where the generative aspect of AI becomes fascinating. Presently, we’re using AI to generate intermediary artifacts: code, images, text, things that humans are already capable of creating.</p><p>Take the <strong>‘vibe coding’</strong> trend, for example. We’re seeing people build entire games over a weekend using AI. But AI isn’t generating games; it’s generating code that still needs to be compiled or interpreted.</p><p>Why have these extra steps?</p><p>Why should AI generate code for a compiler when it could generate the executable binary directly?</p><p>And beyond that, why should we even need ‘software’ in the traditional sense? If AI is advanced enough, it could simply do the task instead of generating software to do it. Need a tool for a specific job? Describe it, and AI generates a bespoke solution on the fly.</p><p>This is what I mean by <strong>"AI is the UI."</strong></p><p>AI itself is the interface. There’s no need for icons, buttons, or menus. We return to the minimal prompt, but this time, instead of rigid commands, we use natural language.</p><p>Maybe we’ve been thinking about "AI" all wrong. What if it doesn’t stand for <strong>Artificial Intelligence</strong>, but rather <strong>Anthropomorphic Interface</strong>, a system that adapts to us instead of forcing us to adapt to it? And if AGI ever arrives, perhaps it won’t be <strong>"Artificial General Intelligence"</strong> but <strong>“Anthropomorphic Graphical Interface”</strong>, a system that understands human intent so deeply that it eliminates the need for traditional UI altogether.</p><p>The desktop metaphor is a relic of a world where humans had to learn how to use computers. AI presents an opportunity to reverse that equation, to have computers that understand us.</p><h2 id="h-the-chatbot-metaphor" class="text-3xl font-header !mt-8 !mb-4 first:!mt-0 first:!mb-0"><strong>The Chatbot Metaphor</strong></h2><p>Just as the <strong>Graphical User Interface (GUI)</strong> ushered in the personal computing revolution, making computers accessible to the masses, I believe generative AI has the potential to do the same. It could redefine how we interact with machines, breaking free from the constraints of the desktop metaphor and introducing a new era of computing.</p><p>But here’s the catch: are we just trading one prison for another?</p><p>If the desktop metaphor trapped us in the abstraction of paper, AI might <strong>confine us to the chatbot metaphor</strong>, where our interaction with computers is reduced to how well we can phrase a prompt. Instead of being limited by icons and folders, we might soon be limited by language itself, bound by grammar, syntax, and how precisely we can articulate our intent.</p><p>Something to think about. Until the next dispatch,</p><p><em>Stay glitched, stay human.<br></em>Jibone</p><br>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/248d5e7cc67dfe4bcbf415462bb877d568a114d34e0b6e810558ebdcfbad151b.jpg" length="0" type="image/jpg"/>
        </item>
        <item>
            <title><![CDATA[Paragraph Genesis Entry]]></title>
            <link>https://paragraph.com/@code_codex/paragraph-genesis-entry</link>
            <guid>efb8kToqrMuLv9Y5XIpc</guid>
            <pubDate>Mon, 15 Dec 2025 06:41:26 GMT</pubDate>
            <description><![CDATA[Code & Codex is a newsletter I started on Substack, a space where technology, storytelling, and philosophy braid together into something deeper than dev logs and industry chatter. I no longer feel that Substack should be the only platform for Code & Codex, which is why I’ve ventured out here to Paragraph.]]></description>
            <content:encoded><![CDATA[<p><em>Peace be upon you, fellow digital wanderer.</em></p><p><a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://codeandcodex.substack.com">Code &amp; Codex is a newsletter</a> I started on Substack, a space where technology, storytelling, and philosophy braid together into something deeper than dev logs and industry chatter.</p><p>I’ve spent most of my life inside systems: writing them, debugging them, scaling them, sometimes breaking them to understand them. But over time, I found myself returning to a certain question:</p><p><strong><em>What does it mean to create in a world where everything is both ephemeral and permanent?</em></strong></p><p>This newsletter is my way of exploring that question.</p><p>I no longer feel that Substack should be the only platform for Code &amp; Codex, which is why I’ve ventured out here to Paragraph. You can read more about it in <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://codeandcodex.substack.com/p/intermission-dispatch">“Intermission dispatch: An update on the state of Code &amp; Codex”</a>.</p><p>I’ll start by reposting previously published essays here, and new write-ups will be posted concurrently on <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="https://codeandcodex.substack.com">Substack</a>, Paragraph, my website <a target="_blank" rel="noopener noreferrer nofollow ugc" class="dont-break-out" href="http://jshamsul.com">jshamsul.com</a>, and on Nostr as a long-form addressable event.</p><p>This is my attempt at making <strong>Code &amp; Codex exist across multiple domains and protocols</strong>, not locked into one platform’s vision of the future.</p><p><em>Stay glitched, stay human.</em><br>Jibone.</p>]]></content:encoded>
            <author>code_codex@newsletter.paragraph.com (Code &amp; Codex)</author>
            <enclosure url="https://storage.googleapis.com/papyrus_images/14f06607c851ace77d51b16a8efc7eaa47d26caf0695f97aff881df395e85ed6.jpg" length="0" type="image/jpg"/>
        </item>
    </channel>
</rss>