
Emergent Causality & the Foundation of Consciousness
Michael Timothy Bennett
The Australian National University

Corresponding Author: [email protected]


Abstract

To make accurate inferences in an interactive setting, an agent must not confuse passive observation of events with having participated in causing those events. The “do” operator formalises interventions so that we may reason about their effect. Yet there exist at least two Pareto-optimal mathematical formalisms of general intelligence in an interactive setting which, presupposing no explicit representation of intervention, make maximally accurate inferences. We examine one such formalism. We show that, in the absence of an operator, an intervention can still be represented by a variable. Furthermore, the need to explicitly represent interventions in advance arises only because we presuppose abstractions. The aforementioned formalism avoids this and so, initial conditions permitting, representations of relevant causal interventions will emerge through induction. These emergent abstractions function as representations of one’s self and of any other object, inasmuch as the interventions of those objects impact the satisfaction of goals. We argue (with reference to theory of mind) that this explains how one might reason about one’s own identity and intent, those of others, one’s own as perceived by others, and so on. In a narrow sense this describes what it is to be aware, and it offers a mechanistic explanation of aspects of consciousness.
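As an illustration of the claim that an intervention can be represented by a variable rather than an operator, the following Python sketch shows the standard augmented-model idea in a toy structural causal model. It is not the paper's formalism; the model, and names such as `sample` and `intervention`, are assumptions introduced only for this example. Conditioning on the explicit intervention variable reproduces the effect of do(X=1), which differs from merely observing X=1 because of a hidden confounder.

```python
import random

random.seed(0)
N = 100_000

def sample(intervention=None):
    """One draw from a toy structural causal model.

    U is a hidden confounder; X normally copies U, but when the
    intervention variable is set, X takes that value instead.
    Y = X XOR U, so the confounder matters.
    """
    u = random.randint(0, 1)
    x = u if intervention is None else intervention
    y = x ^ u
    return x, y

# Observational: P(Y=1 | X=1) -- biased, because seeing X=1 implies U=1.
obs = [sample() for _ in range(N)]
n_x1 = sum(1 for x, _ in obs if x == 1)
p_obs = sum(y for x, y in obs if x == 1) / max(1, n_x1)

# Interventional: P(Y=1 | do(X=1)), expressed here by conditioning on an
# explicit intervention variable rather than applying a "do" operator.
intv = [sample(intervention=1) for _ in range(N)]
p_do = sum(y for _, y in intv) / N

print(f"P(Y=1 | X=1)     ~ {p_obs:.3f}")  # ~0.0 (confounded observation)
print(f"P(Y=1 | do(X=1)) ~ {p_do:.3f}")   # ~0.5 (intervention as a variable)
```

The intervention never appears as an operator; it is just another variable in the augmented model, which is the sense in which the abstract says an intervention "can still be represented by a variable".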
Submitted to TechRxiv: 11 Apr 2024
Published in TechRxiv: 17 Apr 2024