# Knowledge & information

**Published by:** [prompterminal](https://paragraph.com/@prompterminal/)
**Published on:** 2024-06-22
**URL:** https://paragraph.com/@prompterminal/knowledge-information

## Content

https://soundcloud.com/listeningblue/04-lazy-pluto-temple-of?ref=clipboard&p=i&c=0&si=4397B34A37E3400A841904EA04C3B833&utm_source=clipboard&utm_medium=text&utm_campaign=social_sharing

This essay is about how large language models serve as a generic resource for the construction of knowledge. It is a comment on the subtle difference between the definition of information and that of knowledge.

### Knowledge is not strictly information; knowledge is constructed

When all is said and done with next-token prediction, a large language model on its own is not merely a carrier of information. It is a substrate for knowledge, through which information flows and upon which new knowledge may be constructed. These models are engines for producing artificial cultural artefacts that perpetuate understanding through a specific kind of rote activity. As Hinton suggests below, through next-token prediction we have forced them to understand, and thus have given them the capacity to construct abstract concepts that we perceive as having real cultural meaning: knowledge, if you will, a stylistic abstraction of pure information that is understandable both to us and to the model itself.

https://x.com/tsarnick/status/1802102466177331252

### Information is not knowledge

Information is a flexible, permeable entity that flows. It carries through human languages, in tongues and in minds, on billboards and menus, requiring a medium to give it life, to exist, to be seen and carried forward. Amid this complexity, we have tried to simplify our perception of information by learning to view it in terms of inputs and outputs, through the lens of reliable and unreliable transmission, as inefficient representations that could be rendered more efficient.
Perhaps this is why we so ardently perceive and interpret large language models through the same lens: as input/output information machines, hallucinatory engines that return both accurate and inaccurate outputs to our earnest inputs.

### To manipulate information is to create knowledge

Something counts as information if it can exist in multiple physical states, which is why you can transform a spoken-word poem into a written elegy and exchange both at the same time. Both convey the same basic information, but through separate frames they convey something different: no longer pure information, but knowledge, with varying degrees of density and sparsity in what is said and what is left unsaid.

This quality means you can fashion, manipulate, accrue and magnify, or alternatively withhold or omit, the meaning behind information through the performance of physical actions, through the work of intentional abstraction, through stylistic transference or encoded transmission. Knowledge is thus an abstraction of raw information that may itself be abstracted again and again across a variety of mediums, in ways that corrupt or preserve, enhance or dilute, decorate or slim down, clarify or muddle. Therefore, whereas information simply is, knowledge must be constructed.

In this respect, large language models, like us, are not simply exchanging information. They are exchanging knowledge, and using that knowledge to produce new thought within the user and new knowledge within themselves, each propelling the other forward. In that sense, they know. But so do we. And anything that knows can be known.

## Publication Information

- [prompterminal](https://paragraph.com/@prompterminal/): Publication homepage
- [All Posts](https://paragraph.com/@prompterminal/): More posts from this publication
- [RSS Feed](https://api.paragraph.com/blogs/rss/@prompterminal): Subscribe to updates