Technologies
The current learning surfaces, plus the connected systems that are still under active development.
Today the public product line centers on structured learning, memory review, library and reading surfaces, profile and home entry points, and AI-assisted workflows. The broader architecture still aims to keep learning, retention, AI, and knowledge structures inside one environment.
The runtime has also shifted to the current VPS stack: PostgreSQL, Redis, S3-compatible storage, and internal AI services now back the official web experience, while local-first and BYOS (bring-your-own-storage) paths remain part of the product direction.
Core Systems
Each system has a distinct role, but none of them are meant to operate in isolation.
Learning Engine
The current public release ships the learning home, course detail, section reading, and roadmap surfaces that keep structured study moving.
Memory System
Uses FSRS-driven spaced repetition, review cards, and review history to convert recent work into durable recall.
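To make the scheduling idea concrete, here is a deliberately simplified sketch of how a spaced-repetition system grows review intervals from ratings. This is an illustration only, not Akari's actual FSRS implementation: real FSRS also models card difficulty and retrievability, and the `Card` class, `review` function, and growth factors below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Card:
    # Rough "memory stability" in days; real FSRS tracks more state.
    stability: float = 1.0

def review(card: Card, rating: int) -> float:
    """Toy interval update: rating 1=again, 2=hard, 3=good, 4=easy.

    A failed review shrinks stability; a successful one grows it,
    so well-known cards come back less and less often.
    """
    growth = {1: 0.5, 2: 1.2, 3: 2.5, 4: 3.5}[rating]
    card.stability = max(0.5, card.stability * growth)
    return card.stability  # next review interval, in days
```

The key property this sketch shares with FSRS-style schedulers is that each review updates per-card state, and the next interval is derived from that state rather than from a fixed sequence.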
AI Tools
Provides dialog assistants, model access, and workflow actions for summarization, generation, and automation.
Knowledge Layer
Connected notes, graph-aware structure, and richer knowledge views remain under active development, and are not yet fully finished today.
Operating Boundaries
The surrounding infrastructure is part of the product promise because it shapes privacy, ownership, and portability.
Local-First
Core knowledge can remain on-device by default, giving the user direct control over their primary workspace.
VPS Runtime And BYOS
The current stack uses PostgreSQL, Redis, and S3-compatible storage on the VPS, while BYOS paths such as WebDAV and private storage remain open.
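One way to picture the BYOS boundary is as a single configuration switch between the hosted S3-compatible backend and a user-supplied WebDAV endpoint. The function and config shape below are hypothetical, shown only to illustrate the idea of swappable storage backends; they are not Akari's actual settings schema.

```python
def storage_backend(mode: str, endpoint: str) -> dict:
    """Return a backend config for the chosen storage mode.

    Hypothetical shape for illustration: 's3' points at any
    S3-compatible service, 'webdav' at a private WebDAV server.
    """
    if mode == "s3":
        return {"kind": "s3", "endpoint": endpoint, "path_style": True}
    if mode == "webdav":
        return {"kind": "webdav", "url": endpoint}
    raise ValueError(f"unsupported storage mode: {mode}")
```

The point of the sketch is that nothing above the storage layer needs to know which backend is in use, which is what keeps the BYOS path open alongside the hosted stack.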
On-Demand AI Transmission
API calls are scoped to the task the user triggers rather than exporting the full knowledge base to model providers.
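The scoping rule above can be sketched as a request builder that only ever packages the snippet the user acted on. The payload shape, field names, and size cap here are assumptions made for illustration, not Akari's actual API.

```python
def build_ai_request(action: str, selection: str, max_chars: int = 4000) -> dict:
    """Build a model-provider request from a user-triggered action.

    Only the selected text (truncated to max_chars) is included;
    the rest of the knowledge base never leaves the workspace.
    """
    return {"action": action, "input": selection[:max_chars]}
```

Because the payload is constructed from the selection alone, there is no code path that could transmit the full knowledge base as a side effect.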
Why These Systems Stay Together
Integration matters because learning quality is shaped by transitions between stages, not only by the stages themselves.
Shared Context
Learning, reading, memory, and AI surfaces can reference the same material instead of being rebuilt inside separate apps.
Clear Handoffs
Course progress, reading context, review history, and future knowledge-layer work all stay easier to connect when the boundaries are explicit.
Lower Friction
The fewer boundaries between modules, the easier it is to sustain a real daily workflow instead of abandoning it halfway through.
Continuity Matters More Than Isolated Features
The point of keeping these systems together is simple: fewer handoffs, less duplicated work, and a better chance that what you read, practice, remember, and connect will continue to strengthen each other over time.
More
Continue Exploring
Learning Engine
The Learning Engine page explains the current scope, usage, and product boundaries for this part of Akari.
Memory System
The Memory System page explains the current scope, usage, and product boundaries for this part of Akari.
AI Tools
The AI Tools page explains the current scope, usage, and product boundaries for this part of Akari.
Knowledge Graph
The Knowledge Graph page explains the current scope, usage, and product boundaries for this part of Akari.