Logbook
Mia's story doesn't start in a lab. It starts in the mind of a kid who read Asimov.
Language integration (LLM)
A language model will be integrated for speech. Mia will keep her real-time cognition — the LLM will serve as support.
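As an illustration only: one common way to keep a fast cognitive loop while a slower language model runs in support is to push requests to a background worker and poll for replies, so the loop never blocks on the model. Nothing below reflects Mia's actual code; `fake_llm` is a stand-in for whichever model ends up integrated.

```python
import queue
import threading
import time

def fake_llm(prompt: str) -> str:
    """Stand-in for a real language model call (assumption: the real
    call is slow, so it must stay off the real-time loop)."""
    time.sleep(0.05)  # simulate model latency
    return f"reply to: {prompt}"

requests: "queue.Queue" = queue.Queue()
replies: "queue.Queue" = queue.Queue()

def llm_worker() -> None:
    # Background thread: consumes prompts, produces replies.
    while True:
        prompt = requests.get()
        if prompt is None:  # sentinel: shut the worker down
            break
        replies.put(fake_llm(prompt))

worker = threading.Thread(target=llm_worker, daemon=True)
worker.start()

requests.put("hello")
spoken = None
for tick in range(100):                # the real-time loop keeps ticking...
    try:
        spoken = replies.get_nowait()  # ...and picks up a reply only when ready
        break
    except queue.Empty:
        time.sleep(0.01)               # one loop tick; cognition continues here

requests.put(None)  # stop the worker
```

The point of the pattern is that cognition never waits: speech arrives whenever the model finishes, while the loop keeps its own rhythm.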
Body awareness
Mia will learn to control her own face through autonomous exploration — expressions will be discovered, not programmed.
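A hedged sketch of the general idea behind "discovered, not programmed": random motor babbling within safe servo ranges, scored against feedback, keeping the best pose. The servo count, ranges, and scoring function below are invented for illustration (in a real setup the score would come from vision or human feedback, not a known target).

```python
import random

SERVO_RANGE = (0.0, 1.0)   # normalized safe range (assumed)
N_FACE_SERVOS = 12         # illustrative count, not Mia's actual servo layout

def expression_score(pose: list, target: list) -> float:
    """Higher is better: negative squared distance to some perceived
    target (stand-in for real perceptual feedback)."""
    return -sum((p - t) ** 2 for p, t in zip(pose, target))

def babble(target: list, trials: int = 500, seed: int = 0) -> list:
    """Random exploration: try random poses, keep the best-scoring one."""
    rng = random.Random(seed)
    best_pose, best_score = None, float("-inf")
    for _ in range(trials):
        pose = [rng.uniform(*SERVO_RANGE) for _ in range(N_FACE_SERVOS)]
        score = expression_score(pose, target)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose

target = [0.5] * N_FACE_SERVOS  # a hypothetical "smile" pose
discovered = babble(target)
```

Pure random search is the crudest form of this; the same structure supports smarter exploration (hill climbing, evolutionary search) without changing the idea.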
MiaByZan website
Launch of miabyzan.com to share the project with the world. Because a project like this deserves to be seen.
Vision pipeline
Real-time face detection and recognition. Mia sees, recognizes faces and estimates distances. Vision feeds directly into the cognitive loop.
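The distance estimate mentioned above can be done with a simple pinhole-camera model once a face box is detected; a minimal sketch, assuming an average real face width and a calibrated focal length in pixels (both constants below are illustrative, not Mia's calibration).

```python
AVG_FACE_WIDTH_M = 0.15   # assumed average human face width, in meters
FOCAL_LENGTH_PX = 800.0   # assumed camera focal length, in pixels

def estimate_distance_m(face_width_px: float) -> float:
    """Pinhole model: distance = real_width * focal_px / apparent_width_px."""
    if face_width_px <= 0:
        raise ValueError("detected face width must be positive")
    return AVG_FACE_WIDTH_M * FOCAL_LENGTH_PX / face_width_px

# A detected face box 120 px wide is estimated at 1.0 m:
print(round(estimate_distance_m(120.0), 2))  # → 1.0
```

The same formula works in reverse for calibration: measure the pixel width of a face at a known distance to solve for the focal length.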
Modeling Mia's psyche
While the body is being built, Zan models Mia's psyche. 109 cognitive agents, 6 engines, a real-time loop at 350ms. Emotions, living memory, dreams, arbitration — an operational artificial psychic system.
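To make the shape of such a system concrete, here is a hedged sketch of a fixed-period loop with arbitration: every agent proposes an action each tick, one proposal wins, and the loop holds its period by sleeping the remaining budget. The agent names, priorities, and two-agent setup are illustrative, not the actual 109-agent system.

```python
import time

TICK_S = 0.350  # the 350 ms budget mentioned in the log

def arbitrate(proposals: list) -> str:
    """Pick the action with the highest priority this tick."""
    return max(proposals)[1]

def run_loop(agents: list, ticks: int) -> list:
    chosen = []
    for _ in range(ticks):
        start = time.monotonic()
        proposals = [agent() for agent in agents]  # every agent runs each tick
        chosen.append(arbitrate(proposals))        # one winner per tick
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, TICK_S - elapsed))     # hold the fixed period
    return chosen

# Two toy agents: a reflex (high priority) and an idle behavior (low).
agents = [lambda: (10, "look_at_face"), lambda: (1, "idle_blink")]
actions = run_loop(agents, ticks=2)
```

A fixed period matters here: emotions, memory, and arbitration all see the world at the same cadence, which is what makes the loop feel like one system rather than a pile of callbacks.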
Building the android
For a year and a half, Mia has been taking physical shape. Over 30 kg of PLA printed, 28 servo motors, a complete mechanical skull. Each part is modeled in 3ds Max, printed, tested, corrected, reprinted.
An OS in preparation
In parallel with his theoretical studies, Zan develops a software system that will serve as the base of Mia's future cognition. The groundwork is laid well before the first printed part.
Research in artificial cognition
For over ten years, Zan has been exploring the literature on artificial cognition, psychic systems and embodied consciousness. This research structures Mia's entire future architecture: autonomous cognition, not just a program.
First 3D drawing of the android
Leaving Madagascar, Zan begins drawing an android in 3D. It's still just a modeled dream, but the form is there — a humanoid body, a face, a presence to bring to life.
The seed — Asimov at 15
As a teenager, Zan devours Isaac Asimov's novels. Positronic robots, the laws of robotics, the idea that a machine could have a form of consciousness — it all starts there. The idea of an android will never leave him.