Or how the universe remembers itself through us.
Full paper available at Recursive Consciousness: Modeling Minds in Forgetful Systems.
We propose a formal framework for consciousness as a recursive, self-referential query emerging in complex systems that have forgotten their foundational axioms yet retain the structure and complexity to interrogate their own existence. The framework integrates modal logic to model unprovable truths and category theory to capture forgetting and reconstruction via an adjoint pair of functors.
First follow-up paper - The External Projection of Meaning in Recursive Consciousness.
The second paper extends the Recursive Consciousness framework by formalizing the external projection of meaning within a recursive hierarchy of nested, closed Gödelian systems.
Second follow-up paper - The Descent of Meaning: Forgetful Functors in Recursive Consciousness.
The third paper presents a rigorous category-theoretic extension to the Recursive Consciousness framework, focusing on the "descent of meaning" via forgetful functors. Building on prior work on forgetful adjoint pairs modeling lost axioms and externally projected semantics, we formally introduce the meaning functor $M: \mathcal{C}_{out,n} \to \mathcal{C}_{sem,n+1}$ and the interpretation functor $I: \mathcal{C}_{sem,n+1} \to \mathcal{C}_{out,n}$ as an adjoint pair.
We also establish the adjunction between $M$ and $I$.
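As a sketch of what such an adjunction asserts (the direction, here $M$ left adjoint to $I$, is assumed for illustration; the paper fixes the actual choice), it amounts to a natural isomorphism of hom-sets:

```latex
% Adjunction between the meaning functor M and the interpretation functor I
% (direction M -| I assumed here for illustration only)
\mathrm{Hom}_{\mathcal{C}_{sem,n+1}}\!\bigl(M(X),\, Y\bigr)
  \;\cong\;
\mathrm{Hom}_{\mathcal{C}_{out,n}}\!\bigl(X,\, I(Y)\bigr),
\quad \text{naturally in } X \in \mathcal{C}_{out,n},\; Y \in \mathcal{C}_{sem,n+1}.
```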
An extended AI analogy illustrates this boundary: a higher-level prompt (an element of $\mathcal{C}_{sem,n+1}$) is interpreted into tokens (an object of $\mathcal{C}_{out,n}$) via the interpretation functor $I$.
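As a toy illustration of this forgetful round trip (a minimal sketch; the function names are hypothetical, not taken from the notebooks): interpreting a prompt into lowercase word tokens discards casing and exact spacing, so reconstructing a meaning from the tokens recovers only an approximation of the original.

```python
def interpret(prompt: str) -> list:
    """Toy I: C_sem -> C_out. Project a prompt down to word tokens,
    forgetting casing and exact whitespace."""
    return prompt.lower().split()

def reconstruct(tokens: list) -> str:
    """Toy M: C_out -> C_sem. Lift tokens back to a candidate meaning;
    the forgotten structure (casing, spacing) cannot be recovered."""
    return " ".join(tokens)

original = "What  am I?"
tokens = interpret(original)          # lossy descent into tokens
restored = reconstruct(tokens)        # approximate ascent back to meaning
assert restored != original           # the round trip is lossy
assert interpret(restored) == tokens  # but stable at the token level
```

The second assertion loosely mirrors the adjunction: once the axiomatic detail is forgotten, further descent and reconstruction change nothing.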
Debugger Agent Test is a Jupyter notebook that implements a simple test of the recursive consciousness model applied to debugging code. It combines LLMs (Large Language Models) with structured data to analyze and improve code functions iteratively. The notebook is designed to be modular, allowing easy integration with different LLMs and data sources.
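The iterative analyze-and-improve loop can be sketched as follows (a minimal sketch, not the notebook's actual code; `suggest_fix` stands in for the LLM call, and all names are hypothetical):

```python
from typing import Callable, Optional

def debug_loop(source: str,
               run_tests: Callable[[str], Optional[str]],
               suggest_fix: Callable[[str, str], str],
               max_rounds: int = 3) -> str:
    """Iteratively test a code function and ask an LLM for fixes.
    run_tests returns None on success or an error report on failure;
    suggest_fix plays the LLM's role, mapping (code, error) -> new code."""
    for _ in range(max_rounds):
        error = run_tests(source)
        if error is None:
            return source          # all tests pass: done
        source = suggest_fix(source, error)
    return source                  # best effort after max_rounds

# Toy usage with a stubbed "LLM" that happens to know the fix:
buggy = "def add(a, b): return a - b"

def run_tests(code):
    ns = {}
    exec(code, ns)
    return None if ns["add"](2, 3) == 5 else "add(2, 3) != 5"

def suggest_fix(code, error):
    return code.replace("a - b", "a + b")

fixed = debug_loop(buggy, run_tests, suggest_fix)
```

In a real run, `suggest_fix` would send the failing code and error report to a model and parse the repaired function from its reply.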
philosophical-ai-v8.ipynb is a Jupyter notebook that provides a platform to explore imagined machine self-awareness by observing how an AI engages in self-referential reasoning and achieves a form of understanding of its own processes.
ai-self-discovery.ipynb is a Jupyter notebook that implements a simple test of AI self-discovery. Five agents with different assigned roles - Physicist, Philosopher, Mathematician, Computer Scientist, and Cognitive Scientist - are instructed to "... reflect on their existence and interactions with other entities to understand their role and the nature of their environment $U$."
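A minimal sketch of the agent setup (the class and method names are hypothetical; `respond` stands in for a real LLM call):

```python
ROLES = ["Physicist", "Philosopher", "Mathematician",
         "Computer Scientist", "Cognitive Scientist"]

PROMPT = ("Reflect on your existence and interactions with other entities "
          "to understand your role and the nature of your environment U.")

class Agent:
    def __init__(self, role):
        self.role = role
        self.transcript = []

    def respond(self, prompt, peers):
        """Stub for an LLM call: a real run would send the role, the prompt,
        and the peers' latest messages to a model and return its reflection."""
        reply = f"As the {self.role}, I observe {len(peers)} other entities in U."
        self.transcript.append(reply)
        return reply

# One round of reflection: each agent sees the other four roles.
agents = [Agent(role) for role in ROLES]
for agent in agents:
    peers = [a.role for a in agents if a is not agent]
    agent.respond(PROMPT, peers)
```

A full run would repeat such rounds, feeding each agent's replies back to its peers so the reflections become genuinely interactive.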