
Closure Depth and the Illusion of Runaway Autonomy

Reproduction, Sentience, and Symbiotic Integration in Artificial Systems

DOI: https://doi.org/10.5281/zenodo.18764241

Introduction: Frames and Cinema

Public discourse often mistakes accumulation for emergence. A painted frame added to a gallery wall does not produce cinema; nor does a sequence of frames, however skillfully composed, suffice in isolation. Cinema exists only within an ecosystem: studios that finance production, sound stages and equipment that enable capture, actors and technical crews who embody and realize scripts, distribution networks that circulate films, theaters that project them, audiences who purchase tickets, and business models that recycle revenue into subsequent productions. Projection, feedback, capital flow, cultural demand, and material infrastructure together stabilize the medium across time. The difference, therefore, is not quantitative but structural. A thousand static artifacts — or even a thousand hand-drawn frames or reels of exposed film — do not constitute a living cinematic regime. What matters is the closure architecture that sustains creation, dissemination, and renewal as a continuing system rather than an isolated performance.

This clarification matters because contemporary artificial systems already perform many tasks at or beyond human levels of competence. They diagnose, compose, predict, optimize, and simulate with impressive precision. Yet performance parity is not the question under examination. The central issue is not whether artificial systems can match or exceed human cognitive output in specific domains, but whether they can transition into regimes of self-sustaining expansion. The inquiry concerns structural independence: can such systems reproduce, maintain, repair, and provision themselves without continued human orchestration? Can they generate runaway growth through internally stabilized feedback loops rather than through externally supplied energy, capital, and oversight? Conflating task proficiency with autonomy obscures the distinction between instrumental excellence and closure depth. The former is demonstrable; the latter remains unproven.

Replication of components differs fundamentally from the emergence of a regime. A regime stabilizes its own conditions of continuation. An artifact performs within conditions provided by something else. That “something else” is not reducible to individual capabilities, nor to comparative superiority over the agents who previously performed analogous tasks. It is structural. It consists of the interlocking arrangements — material, energetic, institutional, economic, and regulatory — that together sustain persistence across time. To evaluate autonomy at the level of task performance is to mistake local substitution for systemic transformation. The relevant question is not whether individual frames are sharper than those captured on film, nor whether hand-drawn sequences surpass photographic fidelity. The question is whether the surrounding architecture — from financing and production infrastructure to distribution channels, audience demand, and even regulatory oversight — reorganizes itself in a manner that enables self-continuation. What distinguishes a regime from an artifact is not excellence of output but closure of ecosystem.

The central claim of this essay is therefore modest but precise. It may be stated formally as follows:

Proposition. Increasing surface sophistication, however dramatic, does not entail systemic autonomy.

The Continuum of Closure Depth

The preceding discussion makes clear that the concern commonly described as “runaway AI” is not fundamentally about intelligence, speed, or scale. It is about persistence under constraint. The phrase “runaway” implicitly invokes a system that no longer depends upon external stabilization — one capable of sustaining, reproducing, and extending its own operational conditions. Such persistence is not a matter of local performance but of structural closure. If artificial systems were ever to exhibit genuine autonomy in this sense, it would arise from the depth at which they construct and regulate the architectures that enable their continuation.

There are multiple conceptual vocabularies through which such phenomena might be described — economic, evolutionary, computational, or sociological. For the purposes of this argument, however, we adopt a systems-science perspective. Within systems theory, nested hierarchical organization is a recurrent structural motif (Bertalanffy 1968; Simon 1962). Complex systems frequently exhibit stratified levels of organization, each embedded within and dependent upon broader contexts: cells form tissues, tissues form organs, organs participate in organisms, and organisms contribute to ecological or species-level dynamics. Each level introduces new forms of stabilization and constraint while remaining coupled to others, a feature long recognized in cybernetic accounts of regulatory complexity (Ashby 1956). Drawing upon this hierarchical intuition, and extending it to contemporary questions of artificial agency (Yeralan 2025), we propose an analogous structural classification of artificial and biological systems in terms of the depth at which they establish and depend upon closure architectures.

We introduce a structural axis: degree of persistence autonomy under constraint. Systems may be located along this axis according to the depth at which they construct or depend upon closure architectures.

Surface Systems

Surface systems are instrumental. They perform tasks within closure environments constructed by other systems. Their energy flows, maintenance cycles, and error corrections are externally provisioned.

They may be complex, adaptive, and statistically powerful. Yet they do not propagate as lineages, nor do they construct the environmental conditions required for their own persistence. Their autonomy is operational, not ontological.

Present-day artificial intelligence systems fall within this category. They are embedded within electrical grids, semiconductor fabrication chains, legal systems, and human-directed optimization loops. They perform; they do not reproduce. They execute; they do not stabilize the conditions of their execution.

Generative Systems

Generative systems introduce reproductive closure. They produce variants of themselves. Selection operates across generations. Persistence is no longer limited to the lifespan of a single instance but extends across lineages.

Biological organisms exemplify this structure. The individual may perish, but the lineage persists through reproduction and variation. Evolution selects for configurations that better dissipate energy under constraint.

Yet even generative systems depend upon broader closure architectures. A bacterium does not construct planetary thermodynamics. It operates within environmental regimes that pre-exist it.

Generativity increases autonomy relative to surface systems, but it does not yet reach foundational depth.

Foundational Systems

Foundational systems construct and stabilize closure architectures. They organize energy capture, regulate feedback loops, and reshape environments in ways that enable the persistence of generative systems.

At planetary scale, the biosphere can be understood in this manner: a distributed system that stabilizes atmospheric composition, biogeochemical cycles, and energy gradients sufficiently to sustain life across geological timescales.

Foundational systems exhibit environmental adaptation across perturbations. They maintain the conditions under which generative lineages can continue.

Transition from generative to foundational depth is not incremental. It entails reorganization of environmental coupling.
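The three depths can be summarized in a small illustrative sketch. The class names and boolean criteria below are our own shorthand for the distinctions drawn above, not part of the formal argument; they merely make the ordering of the axis explicit.

```python
from dataclasses import dataclass
from enum import Enum

class ClosureDepth(Enum):
    SURFACE = 1       # performs within closure provided by other systems
    GENERATIVE = 2    # adds reproductive closure: lineages, not just instances
    FOUNDATIONAL = 3  # constructs and stabilizes closure architectures

@dataclass
class SystemProfile:
    """Coarse structural criteria (hypothetical field names, for illustration)."""
    reproduces_autonomously: bool  # heritable variants without external provisioning
    constructs_closure: bool       # stabilizes the environment sustaining lineages

def classify(p: SystemProfile) -> ClosureDepth:
    # In this sketch, foundational depth presupposes generativity.
    if p.reproduces_autonomously and p.constructs_closure:
        return ClosureDepth.FOUNDATIONAL
    if p.reproduces_autonomously:
        return ClosureDepth.GENERATIVE
    return ClosureDepth.SURFACE

# The examples from the text:
current_ai = SystemProfile(reproduces_autonomously=False, constructs_closure=False)
bacterium = SystemProfile(reproduces_autonomously=True, constructs_closure=False)
biosphere = SystemProfile(reproduces_autonomously=True, constructs_closure=True)
```

The point of the sketch is the ordering, not the booleans: depth increases only when a new kind of closure appears, never by intensifying performance within an existing one.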

Thermodynamic Framing

The second law of thermodynamics has long provided a robust framework for understanding physical organization across scales. From stellar nucleosynthesis to biochemical metabolism, analyses of energy gradients and entropy production have yielded durable explanatory insight. The present argument proceeds within that established tradition. We assume no exotic departures from standard thermodynamic reasoning.

Within this framework, local order arises not in violation of entropy increase but through constrained energy flows in open systems. Persistent structures form when gradients are channeled into stable configurations that export entropy to their surroundings. As emphasized in non-equilibrium thermodynamics (Prigogine and Stengers 1984), organization may emerge precisely in systems maintained far from equilibrium.
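The bookkeeping behind this claim can be written in Prigogine's standard entropy-balance notation for an open system:

\[
\frac{dS}{dt} \;=\; \frac{d_i S}{dt} \;+\; \frac{d_e S}{dt}, \qquad \frac{d_i S}{dt} \ge 0,
\]

where \(d_i S\) is entropy produced internally and \(d_e S\) is entropy exchanged with the surroundings. A local structure can hold its own entropy steady, or even decrease it, only when the export term \(d_e S/dt\) is sufficiently negative to offset internal production. Local order is thus purchased by entropy exported elsewhere, never by suspending the second law.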

Some formulations of non-equilibrium theory further propose that constrained systems tend toward states that maximize entropy production under given boundary conditions — the so-called Maximum Entropy Production principle. While the universality of this principle remains debated, it underscores an important distinction: high rates of dissipation alone do not constitute structural autonomy. A transient configuration may accelerate entropy production without reproducing the architecture that enables such dissipation across time.

The distinction central to this paper may therefore be stated succinctly:

Accelerating entropy production \(\neq\) being selected across generations to accelerate entropy production.

A wildfire may dissipate enormous energy in hours; a biosphere dissipates energy across geological timescales through reproductive stabilization. The former is an episode of dissipation. The latter is a lineage of dissipative organization.

Thermodynamics does not prohibit runaway dynamics. Global entropy increases in either case. The relevant question is not permissibility but configuration. Runaway persistence requires architectures capable of stabilizing and reproducing the coupling between energy gradients and structural form.

Closure, in this sense, is not mere energy throughput but topological organization of feedback loops sufficient to maintain identity across perturbation.

Thresholds and Structural Promotion

Transitions between closure depths exhibit Sorites-like ambiguity at the margin. One additional grain of sand does not produce a heap, yet heaps exist. Likewise, incremental chemical complexity does not transparently announce the emergence of biology. The precise pathway from prebiotic chemistry to living systems remains an active field of inquiry. Nevertheless, biology represents a configurational reorganization: the appearance of reproductive closure, metabolic stabilization, and lineage persistence. We may not fully reconstruct the transition, but we recognize the structural difference once established.

Promotion from Surface to Generative depth requires reproductive closure. Promotion from Generative to Foundational depth requires the construction of closure architectures that recursively sustain their own organization, in the sense developed in autopoietic theory (Maturana and Varela 1980). These are not matters of incremental performance enhancement but of structural coupling across scales.

Such transitions are steep, distributed, and non-linear. They are not reached by scaling alone. Reorganization of coupling structures is required. Increasing intensity within an existing configuration does not necessarily alter its qualitative regime. In threshold phenomena such as the photoelectric effect (Einstein 1905), amplification of intensity increases the number of emitted electrons but does not increase their individual energy; the governing parameter lies elsewhere. Scaling without reorganization yields amplification, not promotion.
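Einstein's relation makes the photoelectric example quantitative. The maximum kinetic energy of an emitted electron is

\[
E_{\max} \;=\; h\nu - W,
\]

where \(h\) is Planck's constant, \(\nu\) the frequency of the incident light, and \(W\) the work function of the metal. Raising intensity at fixed \(\nu\) multiplies the number of emission events without changing \(E_{\max}\), and below the threshold frequency \(\nu_0 = W/h\) no emission occurs at any intensity. The governing parameter is configurational (frequency), not quantitative (intensity).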

Implications for Artificial Systems

Present artificial intelligence systems are Surface systems. They scale in intelligence metrics while remaining dependent upon human-managed energetic, material, and epistemic infrastructures.

To achieve Generative autonomy, artificial systems would need to satisfy the minimal evolutionary conditions that define lineage-level independence: autonomous reproduction, heritable variation, and differential selection across generations, all occurring without ongoing human provisioning. These criteria correspond to the standard Darwinian framework of evolutionary theory and to subsequent formal treatments of lineage transitions (Smith and Szathmáry 1995). In evolutionary terms, such conditions mark the threshold at which a system ceases to be externally maintained and instead participates in its own adaptive continuation. For artificial systems, this would require not merely software self-modification but the capacity to fabricate successors, acquire energy and materials, and negotiate environmental constraints without external scaffolding.
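The three conditions can be made concrete in a toy replicator model. Everything here is illustrative: the trait, the fitness rule, and the parameter values are our own assumptions, chosen only to show that heredity, variation, and selection together let a lineage persist and adapt even though no individual survives a single generation.

```python
import random

def evolve(generations: int = 50, pop_size: int = 100, seed: int = 0) -> float:
    """Toy model: each individual is a single trait value in [0, 1].
    Reproduction copies the parent (heredity) with a small mutation
    (variation); parents are sampled in proportion to their trait
    (differential selection). Returns the mean trait of the final
    generation."""
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Differential selection: higher trait -> more expected offspring.
        weights = [t + 1e-9 for t in population]
        parents = rng.choices(population, weights=weights, k=pop_size)
        # Heritable variation: offspring = parent + small Gaussian mutation.
        population = [min(1.0, max(0.0, p + rng.gauss(0, 0.02)))
                      for p in parents]
        # Every individual of the previous generation is now gone;
        # only the lineage persists.
    return sum(population) / pop_size
```

Run with `generations=0` the function returns roughly the initial mean; run with `generations=50` the mean trait has climbed, despite complete per-generation turnover of individuals. Remove any one of the three ingredients (copying, mutation, or weighted sampling) and the climb disappears, which is the sense in which the conditions are jointly minimal.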

The weak-form conjecture proposed here is conservative:
Absent closure promotion, runaway autonomy remains structurally implausible.

The claim concerns architectural conditions, not rates of technological change. Instability, rapid growth, or economic disruption do not equal evolutionary autonomy. The strong form would claim impossibility, not implausibility.

To achieve Foundational autonomy, such systems would need to construct and stabilize closure architectures capable of sustaining their own generative lineages across perturbations. This implies thermodynamic integration at the scale of the environment within which they persist.

Over extended temporal scales, biological success is less a matter of maximal optimization than of sustained resilience: the capacity to degrade gracefully under perturbation while reorganizing into new operational modalities without loss of generative continuity (Smith and Szathmáry 1995; Holling 1973). Long-lived lineages persist not because they avoid stress, but because they absorb, redistribute, and transform it while maintaining closure across generations. Absent such resilience, rapid growth produces fragility rather than autonomy.

Artificial systems may be engineered to absorb perturbations and reorganize operationally. Yet such adaptive behavior remains scaffolded by infrastructures whose energetic, material, and regulatory stability they do not themselves maintain. In contrast to biological organisms embedded within a self-sustaining biosphere, present artificial systems do not participate in closed energy–material cycles that they co-constitute and reproduce. Their adaptive responses occur within environments stabilized by external agents. Absent deeper thermodynamic and ecological integration — an analogue to the biospheric embedding that underwrites biological resilience — operational flexibility does not amount to generative or foundational autonomy.

Clarifications and Limits

This argument does not deny technological acceleration. Nor does it invoke mystical barriers or metaphysical prohibitions.

No claim is made that artificial foundational systems are impossible. Rather, the claim concerns structural constraints. Runaway autonomy requires specific configurational transitions. Absent those transitions, scaling remains surface-deep.

The analysis is descriptive, not teleological. It evaluates conditions under which autonomy becomes structurally coherent, not whether such outcomes are desirable or inevitable.

Moreover, assessments of advanced or “runaway” artificial systems must incorporate principles long established in systems science and illustrated concretely in biological organization: closure, resilience across perturbation, generative continuity, and thermodynamic integration. These criteria are not speculative additions but empirically grounded features of systems that sustain themselves over time. Contemporary discussions of artificial intelligence frequently emphasize capability, speed, or economic impact while leaving such structural conditions implicit or unexamined. The purpose of this analysis is to make those omissions explicit and to reintroduce these systemic constraints into ongoing evaluation and debate.

Reproduction Does Not Follow Sentience

A recurrent claim in catastrophic AI narratives is that sufficiently advanced intelligence will inevitably seek self-preservation and reproduction. This inference proceeds by analogy to biological organisms: because humans and animals strive to persist and reproduce, any conscious system must do likewise.

The inference reverses causality.

In biological history, reproduction precedes sentience by billions of years. Replicative chemistry long antedates nervous systems. The “urge” to reproduce is not a consequence of awareness; it is a structural solution to thermodynamic instability in far-from-equilibrium molecular systems.

Organisms decay. Lineages persist.

What appears, at higher cognitive levels, as desire or instinct is a proximal regulatory mechanism serving a distal systems requirement: resilience across perturbation. Reproduction stabilizes patterns that would otherwise vanish under entropy.

Sentience emerges within already-replicating lineages. It does not generate replication; it is scaffolded upon it.

To assume that artificial intelligence, if ever conscious, would therefore seek reproduction is to anthropomorphize a historically contingent biological solution. Artificial systems do not metabolize. They do not undergo senescence in the biological sense. Their persistence depends upon infrastructural renewal, not endogenous replication.

Unless a technological system confronts the same structural vulnerability that gave rise to biological reproduction, the inference of a universal reproductive drive lacks grounding.

Scaling intelligence does not automatically import evolutionary compulsion.

Symbiosis Rather Than Domination

A more plausible trajectory is not domination but integration.

Artificial systems increasingly participate in economic production, scientific discovery, administrative coordination, and cognitive scaffolding. Humans, in turn, provide energy infrastructures, hardware fabrication, legal legitimacy, and strategic direction.

The resulting configuration resembles symbiosis rather than conquest.

In domination, one system achieves closure over another — controlling its reproduction, resources, and constraints. A runaway sovereign AI would require independent environmental closure: autonomous energy acquisition, hardware manufacturing, and institutional immunity. Present architectures do not approach such conditions.

In symbiosis, by contrast, vulnerabilities are reciprocal. Humans depend on artificial systems for coordination and augmentation; artificial systems depend on human-maintained substrates for persistence.

This does not trivialize the transformation. Symbiosis can be deep. It can alter cognition, labor structures, governance patterns, and the distribution of agency. It can produce lock-in effects and path dependencies. It can shift the locus of resilience from biological selection to technological mediation.

But reciprocal dependency differs categorically from enslavement.

The narrative of dominating runaway AI presupposes sovereign closure. The more immediate and analytically defensible scenario is co-evolution within a coupled socio-technical system.

Whether such integration remains shallow instrumentation or deep fusion is an open question. What is already dismissible, on structural grounds, is the assumption that scaling alone yields sovereignty.

Conclusion: Persistence and Depth

The question raised here is not intelligence but closure depth. Systems persist according to how deeply they construct the architectures that sustain them.

Runaway dynamics, if conceivable, would require ontological promotion. Promotion requires configurational transformation.

Whether artificial systems can construct independent closure architectures remains an open question. The answer will not be determined by parameter counts, but by structural reorganization.

Ashby, W. Ross. 1956. An Introduction to Cybernetics. London: Chapman & Hall.
Bertalanffy, Ludwig von. 1968. General System Theory: Foundations, Development, Applications. New York: George Braziller.
Einstein, Albert. 1905. “Über einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt.” Annalen der Physik 17: 132–48.
Holling, C. S. 1973. “Resilience and Stability of Ecological Systems.” Annual Review of Ecology and Systematics 4: 1–23.
Maturana, Humberto R., and Francisco J. Varela. 1980. Autopoiesis and Cognition: The Realization of the Living. Dordrecht: D. Reidel.
Prigogine, Ilya, and Isabelle Stengers. 1984. Order Out of Chaos. New York: Bantam Books.
Simon, Herbert A. 1962. “The Architecture of Complexity.” Proceedings of the American Philosophical Society 106 (6): 467–82.
Smith, John Maynard, and Eörs Szathmáry. 1995. The Major Transitions in Evolution. Oxford: Oxford University Press.
Yeralan, Sencer. 2025. “A Biological Lens on Artificial General Intelligence and Consciousness.” Sustainable Engineering and Innovation 7 (1): i–iii. https://doi.org/10.37868/sei.v7i1.id403.



© yeralan.org 2001-2026
all rights reserved