Mediocrity as a Technological Imperative
No.investigues and 28.research are Mexico-based experimental projects exploring new avenues for discussion, interaction, and conversation.
> For in fact the cybernetic hypothesis calls for a radically new structuring of the individual or collective subject, in the direction of a hollowing out. It dismisses interiority as a myth and along with it the entire psychology of the 19th century, including psychoanalysis. It’s no longer a matter of separating the subject from their traditional external ties as the liberal hypothesis had demanded, but of reconstructing the social bond by stripping the subject of any substance. Everyone must become an envelope without flesh and blood, the best possible conductor of social communication, the locus of an endless recursive loop that rids itself of kinks.
>
> — Tiqqun, *The Cybernetic Hypothesis*
What happens when the radical optimization of our cybernetically mediated reality comes to treat humanity itself as waste? Not in the science-fiction horror scenario in which a superior machine intelligence decides that humanity must be extinguished, but in a far simpler algorithmic logic—one in which humans cease to be the center of systemic efficiency and become, instead, its primary obstacle.
Some years ago, objects of consumption were clearly delimited: a car, a chocolate bar, a household appliance, a shampoo, and so on. Today, however, consumption dynamics are so ubiquitous that reality itself appears configured for that single purpose. Our feeds, messages, news, food, experiences, and even our thoughts—now mediated by algorithmic tools—both imply and enable specific patterns of consumption. Everything we do, everything we observe or interact with, is potentially a product, a commercial, or an advertising placement. We ourselves move through the collective illusion of authentic individuality, when in fact we are operational accumulations of marketing narratives: walking advertisements in disguise.
For this to be possible, a technical structure with clearly defined goals must sustain it. Popular wisdom suggests that technology converges toward better outcomes by iteratively optimizing its parameters. Under this assumption, technological progress has long carried an ontological promise of a better future. Why, then, do we experience the opposite? Why does it feel as though, even as technologies advance, everything declines—everything we consume appearing as a mediocre version of something that already existed and was better years ago?
Functionality, robustness, practicality, and aesthetics have been relegated to the background in favor of efficiency in production, consumption, and cultural signification. Many of the objects we acquire or use no longer respond—at least not primarily—to any concrete practical function; instead, they have been transformed into rituals, markers of class differentiation, or components of a wholly irrational social inertia. It is therefore unsurprising that, when consumption no longer depends on functional qualities but on factors that facilitate circulation and symbolic value, the technological media designed to support this system will optimize those factors rather than any substantive notion of quality. If quality is secondary to ease of sale, distribution, and signification as an object of consumption, then these are precisely the variables that algorithmic and productive systems will optimize across successive iterations.
This produces two complex conditions. The first is the decoupling of the user’s needs from the objects they consume. The second—and more severe—is the transformation of the user or consumer into yet another element of the technical network to be optimized. The former explains why the satisfaction of subjective needs operates on a secondary plane, and why perceived quality appears to be in free fall. The latter converts us into instrumental components of a supply chain optimized exclusively for profit generation. We are no longer merely consumers or even products; we are technical objects embedded within the operational life cycle of everything we consume. In that sense, we too are absorbed in an inertia of operational mediocrity masquerading as optimization.
The initial conclusion is straightforward: human needs were never the primary vector of optimization within these systems. Operational efficiency itself was the central parameter to maximize. The cybernetic evolution of liberalism beginning in the 1970s did not emerge through a deliberate conspiracy, but through systemic optimization—the gradual redirection of collective goals toward friction reduction within consumption cycles. This logic materialized in how states, institutions, and corporations operate. Collective needs became a variable, not an objective.
This has two critical implications. First, humanity itself—inasmuch as it articulates needs—became noise: inefficiencies, interruptions to be debugged and flattened. Not only did we become products, with our time, energy, and labor converted into commodities for external demand; we became integral components of the production and consumption process itself. Whatever mechanisms now generate value, we function merely as inputs within them—parameters to be optimized, operational waste to be reduced. The myth of the rational individual gives way to a subject to be hollowed out, mined for whatever information the system needs to run seamlessly.
Second, mediocrity emerges as the technical ethos of machinic systems. No longer is the objective to satisfy particular needs or preserve the artistic and human dimensions of cultural objects. Instead, the system rewards precisely that which erases the human from the equation of consumption.
Artificial intelligence represents the limit case. It simulates how humans think and act at their most homogeneous and mediocre point, because that is where the probability of utility is highest. It produces the impression of interacting with the world syntactically, while remaining incapable of semantic engagement beyond its probabilistic model. The fleeting sense that these systems understand the world—or understand us—is merely a projection of our own socially normalized mediocrity. Their output is mediocre by definition. AI slop is not a bug, but a feature. The refinement of AI-generated content is a technical process operating on what already exists, designed not to elevate quality but to more effectively simulate the average of existing production.
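The claim that these systems gravitate toward the probabilistic average can be made concrete with a minimal sketch of temperature-scaled sampling, the standard mechanism by which generative models choose their next output. The option names and scores below are invented for illustration; the point is that as the temperature drops, the distribution collapses onto the single most probable option and the tails vanish.

```python
import math
import random

def softmax(logits, temperature):
    """Convert raw scores into a sampling distribution at a given temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores: one "safe" modal option, two rarer ones.
options = ["cliché", "unusual", "idiosyncratic"]
logits = [2.0, 0.5, 0.1]

random.seed(0)
for t in (1.0, 0.1):
    probs = softmax(logits, t)
    draws = [random.choices(options, weights=probs)[0] for _ in range(1000)]
    share = draws.count("cliché") / len(draws)
    print(f"temperature={t}: P(cliché)={probs[0]:.3f}, sampled share={share:.2f}")
```

Nothing in the sampler asks whether the modal option is good; it is simply the least risky, which is the operational sense of mediocrity at issue here.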
Programs, films, series, and other media content thus become derivatives: reheated leftovers, plastic and soulless pastiches of whatever surface-level indicators register as relevant and safe within consumption cycles. Consumer goods—phones, appliances, consoles, peripheral devices—likewise become repetitions of standardized elements, aesthetically inert and technologically ambiguous in their marginal improvements. We use products burdened with overpricing, overengineering, and overdesign that appear as ghostly remnants of technical assemblies and innovations from eras we no longer comprehend. Quality is reduced to a floating constellation of signs and signifiers that brands evoke—ancient specters haunting our commercial environment.
The system consequently moves toward a cybernetic stability whose parameters reward mediocrity—not through conscious moral judgment or calculated political intent, but through the basic logic that structures consumption itself. As productive chains become increasingly technified, their machinic and algorithmic character imposes a clear operational direction: convergence, homogeneity, and statistical anticipation. The system is averse to risk, interruption, noise, pause, and recontextualization. Its technical essence is mediocrity expressed as probabilistic normality. The extremes of the Gaussian curve are targets for normalization, optimization, or elimination—and it is precisely in those margins that humanity resides.
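The normalization of the Gaussian tails can be sketched numerically. This is an illustrative toy, assuming an invented population and z-score cutoff, not a model of any real pipeline: each pass pulls outliers back to the mean, and the fixed point of the process is total homogeneity.

```python
import statistics

def normalize_tails(values, z_cut=1.5):
    """Pull any point lying beyond z_cut standard deviations back to the mean."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    if sigma == 0:
        return values  # already fully homogeneous
    return [mu if abs(v - mu) > z_cut * sigma else v for v in values]

# A "population" of outputs with genuine outliers at both tails.
population = [1, 2, 5, 5, 5, 5, 5, 8, 9]

for step in range(4):
    print(f"step {step}: spread={statistics.pstdev(population):.3f}")
    population = normalize_tails(population)
```

Each pass is individually mild, since only the extremes are "corrected", yet within a few iterations nothing but the mean remains: convergence, homogeneity, and statistical anticipation as a fixed point.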
This is crucial to articulate, as it dismantles many objectively false narratives about the supposed creativity or superhuman capacities of these systems. AI treats mediocrity as a primary functional parameter. While mediocre performance may suffice for many mundane tasks, the persistent calibration toward this median has long-term consequences for our capacity to appreciate, demand, and value—particularly in the realm of culture. This is without even addressing the erosion of cognitive abilities resulting from sustained reliance on systems of simulated mediocrity.
The human–machine pairing is evident, but its operational conditions are often obscured. The central fact is that, by participating in this framework, humans themselves become subject to optimization as instrumental elements of the system. The issue is no longer how to articulate our needs or consumption desires, but how to recognize what is fundamentally at stake: the position of the human within the system itself.
AI, as a limit case, reveals how easily expectations can be normalized downward. It dehumanizes not in a romantic or ethical sense, but by homogenizing the human, reducing it to its average condition, and eliminating any interruption or disturbance that threatens optimization. In cybernetic terms, it seeks to reduce the entropy of the world’s systemic complexity—and in doing so, it reduces our intention and will.
If mediocrity is indeed the guiding principle of the technological structures governing production and consumption, then this recognition must orient us toward a technical and political vision capable of resisting it. If the current system privileges optimization and distribution over human notions of quality, then it becomes necessary to reinsert the human into our relationships with machines—to resist the thermodynamic death implied by cybernetic closure through rupture, interruption, and refusal.
To progressively transform ourselves into products and instruments within a machinic network that diffuses human will through the technical and cultural propagation of mediocrity is, ultimately, a way of reducing ourselves to nothing. Understanding how this principle operates within our technical reality is the first step toward confronting it consciously.