Technologies

The Technologies page explains the current scope, usage, and product boundaries for this part of Akari.

The current learning surfaces, plus the connected systems that are still under active development.

Today the public product line centers on structured learning, memory review, library and reading surfaces, profile and home entry points, and AI-assisted workflows. The broader architecture still aims to keep learning, retention, AI, and knowledge structures inside one environment.

The runtime has also shifted to the current VPS stack: PostgreSQL, Redis, S3-compatible storage, and internal AI services now back the official web experience, while local-first and BYOS paths remain part of the product direction.

Core Systems

Each system has a distinct role, but none of them are meant to operate in isolation.

Learning Engine

The current public release already ships the learning home, course detail, section reading, and roadmap surfaces to keep structured study moving.

Memory System

Uses FSRS-driven spaced repetition, SRS cards, and review history to convert recent work into durable recall.
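The scheduling idea behind FSRS can be sketched in a few lines. This is a simplified illustration using the published FSRS-4.5 forgetting curve, not Akari's actual implementation or parameter values:

```python
# Simplified sketch of FSRS-style scheduling. The constants follow the
# published FSRS-4.5 forgetting curve; they are illustrative and do not
# reflect Akari's real configuration.

DECAY = -0.5
FACTOR = 19 / 81  # chosen so that retrievability at t = stability is 0.9


def retrievability(elapsed_days: float, stability: float) -> float:
    """Estimated probability of recall after `elapsed_days`,
    given the card's current stability (in days)."""
    return (1 + FACTOR * elapsed_days / stability) ** DECAY


def next_interval(stability: float, target_retention: float = 0.9) -> float:
    """Days until retrievability is expected to fall to `target_retention`,
    i.e. when the card should come up for review again."""
    return (stability / FACTOR) * (target_retention ** (1 / DECAY) - 1)
```

With a target retention of 0.9, the next interval equals the card's stability, which is exactly the property the 0.9 calibration of FACTOR is chosen to produce.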

AI Tools

Provides dialog assistants, model access, and workflow actions for summarization, generation, and automation.

Knowledge Layer

Connected notes, graph-aware structure, and richer knowledge views remain under active development; the site does not claim they are fully finished today.

Operating Boundaries

The surrounding infrastructure is part of the product promise because it shapes privacy, ownership, and portability.

Local-First

Core knowledge can remain on-device by default, giving the user direct control over their primary workspace.

VPS Runtime And BYOS

The current stack uses PostgreSQL, Redis, and S3-compatible storage on the VPS, while BYOS paths such as WebDAV and private storage remain open.

On-Demand AI Transmission

API calls are scoped to the task the user triggers rather than exporting the full knowledge base to model providers.
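Task-scoped transmission can be sketched as a request builder that only ever forwards the material the user acted on. The function and payload shape below are hypothetical illustrations, not Akari's real API:

```python
# Hypothetical sketch of on-demand, task-scoped AI transmission: only the
# text the user selected for this action leaves the device, never the full
# knowledge base. Names and fields are illustrative, not Akari's real API.

def build_ai_request(action: str, selected_text: str, max_chars: int = 4000) -> dict:
    """Build a provider payload scoped to a single user-triggered task."""
    snippet = selected_text[:max_chars]  # hard cap on what is transmitted
    return {
        "task": action,    # e.g. "summarize" or "generate"
        "input": snippet,  # only the user's selection, nothing else
    }


request = build_ai_request("summarize", "Chapter 3 notes on spaced repetition.")
```

The design choice is that scoping happens at request-construction time, so no code path exists that serializes the whole workspace for a model provider.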

Why These Systems Stay Together

Integration matters because learning quality is shaped by transitions between stages, not only by the stages themselves.

Shared Context

Learning, reading, memory, and AI surfaces can reference the same material instead of being rebuilt inside separate apps.

Clear Handoffs

Course progress, reading context, review history, and future knowledge-layer work all stay easier to connect when the boundaries are explicit.

Lower Friction

The fewer boundaries between modules, the easier it is to sustain a real daily workflow instead of abandoning it halfway through.

Continuity matters more than isolated features

The point of keeping these systems together is simple: fewer handoffs, less duplicated work, and a better chance that what you read, practice, remember, and connect will continue to strengthen each other over time.