In the pre–brain-computer-interface era, algorithms and AI systems function less as neutral tools of computation and more as extensions of existing power structures. They operate as transitional instruments of cognitive regulation — provisional mechanisms that simulate a form of behavioral governance while masking the deeper asymmetry of control embedded in political economy and institutional power.
Technology is often described as neutral, capable of enabling emancipation as much as domination. Yet neutrality collapses the moment power asymmetries enter the frame. The actors who possess capital, infrastructure, and institutional legitimacy — states, corporations, financial networks, security bureaucracies — inherit the first-mover advantage in defining how technologies are designed, deployed, narrated, and normalized. What appears as technological inevitability is, in practice, a continuation of cognitive hegemony by technical means.
Algorithms do not merely classify behavior; they encode normative worlds. They sort attention, stabilize desirable forms of compliance, and render resistance statistically irrational. Their promise of personalization is inseparable from a deeper logic of behavioral enclosure: to pre-structure possibility spaces before political consciousness can emerge. Instead of coercion in the classical sense, we encounter a soft regime of preference-shaping — where desire is managed, not suppressed; where silence is not imposed, but statistically produced.
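To make the notion of statistically produced silence concrete, here is a minimal sketch in Python, entirely illustrative and not drawn from any particular platform or system named in this essay: a toy recommender whose only rule is to show more of whatever was engaged with before. The topic list, the `ENGAGE_P` click probability, and the 1.01 reinforcement factor are arbitrary assumptions. Nothing in the loop forbids any topic, yet exposure concentrates on a few of them; the rest fade into statistical silence.

```python
# A minimal sketch, assuming a crude engagement optimizer: it never
# bans a topic, it only reinforces whatever was already shown and
# clicked. All names and parameters here are hypothetical.
import random

TOPICS = ["politics", "labor", "sports", "celebrity", "local news"]
weights = {t: 1.0 for t in TOPICS}  # the model's belief about the user
ENGAGE_P = 0.6                      # chance any shown item is clicked

def recommend(w):
    """Sample a topic in proportion to its current weight."""
    r = random.uniform(0, sum(w.values()))
    for topic, weight in w.items():
        r -= weight
        if r <= 0:
            return topic
    return TOPICS[-1]

for _ in range(10_000):
    topic = recommend(weights)
    if random.random() < ENGAGE_P:
        weights[topic] *= 1.01  # reinforce what was already shown

total = sum(weights.values())
for t in TOPICS:
    print(f"{t:12s} {weights[t] / total:.3f}")
# Typical outcome: one or two topics approach a share of 1.0 while
# the others fall toward 0.0, with no explicit act of prohibition.
```

The point of the sketch is the mechanism, not the numbers: enclosure emerges from the feedback loop itself, before any deliberate decision to suppress is ever made.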
This regime is transitional rather than final. It bridges the gap between legacy power formations and more comprehensive forms of cognitive integration envisioned in future human-machine systems. The present moment is a pre-disciplinary phase: algorithms simulate governance outcomes without the totalizing reach of direct neural interfacing. Their authority remains contingent, dependent on data feedback loops, market legitimacy, and infrastructural belief. Precisely because this regime is incomplete, it requires mythologies of inevitability — progress narratives, innovation fetishism, and the moral aura of efficiency.
Yet technology remains structurally ambivalent. The same infrastructures that enable enclosure can also enable cognitive liberation: collective archives, counter-public epistemologies, solidaristic knowledge networks. The decisive variable is not the tool but the political distribution of interpretive power — who defines meaning, who frames legitimacy, who determines which forms of knowing are permitted to scale.
The struggle over AI is therefore not a struggle over machines, but over cognitive sovereignty. To reclaim thought from algorithmic mediation is to contest the silent architecture of classification itself: to expose the cultural will embedded in code, the ideology nested in optimization, the historical choices hidden inside technical form. Liberation in this sense is neither technological rejection nor acceleration, but the refusal to accept cognitive supervision as the price of participation in social life.
Algorithms regulate the present only because the future remains under negotiation. The question is whether this transitional era becomes a corridor to deeper domination — or a brief and fragile interlude before new forms of collective intelligence emerge that re-anchor technology in justice, reciprocity, and human autonomy.