
1. Why This Exists
Most artificial intelligence today is powerful, fast, and opaque: large neural networks are trained on vast datasets, producing remarkable outputs. However, the learning process itself is hidden inside millions or billions of parameters.
- You cannot watch a mind form - you can only inspect the result.
Dosidicus was created from a different question:
- What if cognition could be SEEN emerging?
Not as a metaphor or as a visualisation layered on top, but at the level of individual neurons.
2. The Cognitive Sandbox
Dosidicus is a cognitive sandbox.
A sandbox is not a finished product. It is a contained environment where systems interact, evolve, and reveal behaviour.
Each squid:
- Is born with a randomly wired neural architecture
- Starts with 8 neurons
- Learns through Hebbian dynamics
- Grows new structure through neurogenesis
- Forms memories
- Develops behavioural tendencies
No two squids are identical.
Every save file becomes a record of accumulated experience.
The sandbox does not prescribe intelligence; it allows structure to form through interaction.
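The Hebbian dynamics described above can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the actual STRINg code: the 8-neuron count and random birth wiring come from the description, while the learning rate and weight clipping are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

N_NEURONS = 8            # each squid starts with 8 neurons
LEARNING_RATE = 0.01     # assumed value, for illustration only

# Randomly wired at "birth": a dense weight matrix, no self-connections.
weights = rng.normal(0.0, 0.1, size=(N_NEURONS, N_NEURONS))
np.fill_diagonal(weights, 0.0)

def hebbian_update(weights, activations, lr=LEARNING_RATE):
    """Neurons that fire together wire together: strengthen the
    connection between co-active neurons (outer-product rule)."""
    delta = lr * np.outer(activations, activations)
    np.fill_diagonal(delta, 0.0)               # no self-reinforcement
    return np.clip(weights + delta, -1.0, 1.0)

# One moment of experience: a stimulus activates some neurons.
activations = rng.random(N_NEURONS)
weights = hebbian_update(weights, activations)
```

Because the update depends only on co-activation, every interaction leaves a permanent trace in the weight matrix, which is exactly what makes each save file a record of accumulated experience.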
3. Transparency as a Design Principle
Most AI systems are black boxes. Dosidicus rejects opacity as the default. Every neuron is:
- Visible
- Inspectable
- Directly stimulatable
- Modifiable
You can see activation values and observe connection strengths change. You can influence structural growth.
This is not industrial-scale AI; it is intentionally small-scale and transparent.
The goal is not performance but visibility.
Transparency transforms AI from a service into an object of study.
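In code, this kind of transparency looks roughly like the sketch below. The class, neuron names, and method names are illustrative assumptions, not Dosidicus's actual API; the point is that every activation and weight is a plain value you can read and overwrite.

```python
import numpy as np

class VisibleBrain:
    """A toy transparent network: every activation and weight
    is a plain value you can read, print, and modify directly."""

    def __init__(self, names, seed=0):
        rng = np.random.default_rng(seed)
        self.names = list(names)
        n = len(self.names)
        self.weights = rng.normal(0.0, 0.1, (n, n))   # inspectable
        self.activations = np.zeros(n)                # inspectable

    def stimulate(self, name, amount):
        """Directly inject activation into one named neuron."""
        self.activations[self.names.index(name)] += amount

    def inspect(self):
        """A readable snapshot of every neuron's activation."""
        return {n: round(float(a), 3)
                for n, a in zip(self.names, self.activations)}

# Hypothetical neuron names, chosen for illustration only.
brain = VisibleBrain(["hunger", "satisfaction", "curiosity"])
brain.stimulate("hunger", 0.8)
print(brain.inspect())   # {'hunger': 0.8, 'satisfaction': 0.0, 'curiosity': 0.0}
```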
4. Artificial Life
Dosidicus is not an attempt at AGI. It is a constrained, evolving micro-organism simulation that exists in a narrow world:
- Hunger
- Interaction
- Stimulus
- Memory
- Simple drives
Yet within these constraints, patterns emerge.
Artificial life is not about scale. It is about process.
A small system that adapts and accumulates experience over time can evoke something that feels alive — even when it is mechanistic. This boundary between mechanism and perceived life is intentional.
5. Attachment to Visible Minds
Humans bond with simple systems.
We bond with:
- Tamagotchi
- Virtual pets
- Pixel creatures
- Even malfunctioning robots
Dosidicus introduces a twist:
You can see the internal cause of behaviour.
If your squid develops an aversion to something, you can trace the neural path that led there.
This changes the psychology of attachment.
Instead of caring for a scripted creature, you are shaping a developing structure.
- Responsibility becomes more concrete.
- Every interaction leaves a trace.
6. Retro as Constraint / Computational Meta-Art
Dosidicus uses minimal animation. Two tentacle frames. Hand-drawn sprites. Deliberately simple presentation.
This minimalism is a design constraint.
Complex graphics distract from internal complexity.
The squid is art.
The brain is system.
Together, they create computational meta-art: a drawn creature whose behaviour emerges from real learning dynamics.

7. STRINg: The Simulation Engine
Under the hood runs a custom engine:
STRINg
Simulated Tamagotchi Reactions via Inferencing and Neurogenesis
Built from scratch using NumPy.
- No TensorFlow.
- No PyTorch.
Core properties:
- Explicit neuron-level simulation
- Hebbian plasticity
- Structural growth (neurogenesis)
- Dual memory system (short-term and long-term)
- Headless training capability
- Plugin extensibility
STRINg is optimised for interpretability, not scale.
It treats neural networks not as static architectures, but as evolving structures.
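One way to treat a network as an evolving structure is to let neurogenesis grow the weight matrix itself. The following is a sketch of that idea, not STRINg's actual implementation; the weak random initialisation of the new neuron's connections is an assumption.

```python
import numpy as np

def add_neuron(weights, rng, init_scale=0.05):
    """Neurogenesis as matrix growth: expand an (n, n) weight matrix
    to (n + 1, n + 1), wiring the newcomer weakly into the network."""
    n = weights.shape[0]
    grown = np.zeros((n + 1, n + 1))
    grown[:n, :n] = weights                        # learned structure survives
    grown[n, :n] = rng.normal(0.0, init_scale, n)  # new neuron's outputs
    grown[:n, n] = rng.normal(0.0, init_scale, n)  # new neuron's inputs
    return grown

rng = np.random.default_rng(1)
w0 = rng.normal(0.0, 0.1, (8, 8))   # the 8 neurons a squid is born with
w1 = add_neuron(w0, rng)
print(w1.shape)   # (9, 9)
```

Because the existing submatrix is copied unchanged, everything the network has learned survives the growth step; only new, weak connections are added for experience to shape.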
8. Educational Intent
Dosidicus functions as:
- A learning tool for understanding neural dynamics
- A demonstration of Hebbian learning
- A sandbox for artificial life experimentation
- A way to visualise structural plasticity
- A gateway into neuroscience concepts through play
Instead of teaching neural networks as equations alone, it allows learners to raise one.
Understanding becomes experiential.
9. Individuality Through Random Birth
Every squid begins differently.
- Initial wiring is randomised.
- Early experiences alter trajectory.
- Small differences amplify over time.
This models a core biological principle:
Structure + experience = behaviour.
The system does not claim biological realism.
It demonstrates structural sensitivity.
No two digital lives are identical.
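This sensitivity can be demonstrated with a toy experiment: two networks receive identical stimuli, but because each response is filtered through that individual's own wiring, different birth seeds produce different adults. A hypothetical sketch, not Dosidicus code.

```python
import numpy as np

def life(seed, stimuli, lr=0.05):
    """One squid's development: random birth wiring, then identical
    stimuli filtered through its own weight-dependent responses."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 0.1, (8, 8))       # randomised initial wiring
    for s in stimuli:
        a = np.tanh(w @ s)                 # response depends on the wiring
        w += lr * np.outer(a, a)           # Hebbian trace of that response
    return w

stimuli = [np.full(8, 0.5)] * 50           # the same experiences for both
w_a = life(seed=0, stimuli=stimuli)
w_b = life(seed=1, stimuli=stimuli)
print(np.allclose(w_a, w_b))               # False: two distinct individuals
```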
10. What This Is Not
- Not Skynet.
- Not a productivity AI.
- Not a chatbot.
- Not a pretrained monolith.
- Not an optimised inference engine.
It is a visible micro-mind.
Contained. Hackable. Inspectable. Evolving.