System Protocols
Cognito is a functional, tactical workstation for sovereign intelligence. This documentation outlines the exact technical standards for local deployment.
Architecture Overview
Designed for absolute locality, the stack is partitioned into three functional layers:
- Command Shell (Electron/React): A tactical UI built for high legibility and local resource binding.
- Inference Core (FastAPI): The localized gateway service that maps neural instructions to the llama.cpp engine.
- Context Module (RAG): An in-memory retrieval engine using TF-IDF protocols to inject private knowledge archives into active context.
Boot Sequence
Node deployment requires an isolated environment. We explicitly support source-level builds to ensure operational transparency.
Prerequisites
- Node.js v18.0.0+ (Interface Runtime)
- Python v3.10.0+ (Processing Kernel)
- C++ Compiler (Hardware Layer Mapping: Metal/CUDA)
# CLONE_SOURCE
git clone https://github.com/ArjunDeshwal/cognitoai.git
cd cognitoai
# INITIALIZE_VENV
python -m venv venv
source venv/bin/activate
pip install -r backend/requirements.txt
# EXECUTE_INIT
cd app && npm install && npm run electron:dev
Retrieval (RAG)
Local knowledge injection is handled via the Context Module. Artifacts (PDF/TXT) are parsed and tokenized into 400-word segments.
Similarity is ranked via Cosine Weighting against the query vector, enabling the model to retrieve specific excerpts without external indexing services.
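The chunking and ranking flow above can be sketched with only the standard library. This is an illustrative model of the Context Module, not its exact implementation: the tokenizer, the TF-IDF weighting variant, and the function names are assumptions; only the 400-word segment size and cosine ranking come from the docs.

```python
# Minimal sketch of the Context Module's retrieval flow (assumed details).
import math
import re
from collections import Counter

CHUNK_WORDS = 400  # segment length stated in the docs

def chunk(text: str, size: int = CHUNK_WORDS) -> list[str]:
    """Split a parsed artifact into fixed-size word segments."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z0-9]+", text.lower())

def tfidf_vectors(docs: list[str]) -> list[dict[str, float]]:
    """Build TF-IDF vectors held purely in memory; nothing touches disk."""
    tokenized = [tokenize(d) for d in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log((1 + n) / (1 + df[t]))
                        for t, c in tf.items()})
    return vectors

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(query: str, segments: list[str], top_k: int = 3) -> list[str]:
    """Return the segments most similar to the query (Cosine Weighting)."""
    vectors = tfidf_vectors(segments + [query])
    qvec, seg_vecs = vectors[-1], vectors[:-1]
    scored = sorted(zip(segments, seg_vecs),
                    key=lambda p: cosine(qvec, p[1]), reverse=True)
    return [s for s, _ in scored[:top_k]]
```

Because the index is a plain Python structure, it vanishes when the process exits, which is what makes the zero-trace guarantee in the security notice below possible.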
SECURITY_NOTICE
All RAG indexes are held exclusively in volatile RAM. No persistent search database is created, ensuring a zero-trace operation for sensitive document analysis.
Agent Sensory
The system possesses Autonomous Discovery capabilities. When the neural core identifies an information void, it initiates a [SEARCH:] protocol to scan global networks.
Command API
Technical interfaces for direct node control:
Neural Mounting
POST /v1/load_model
Mounts a GGUF neural archive to the inference core.
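A client call to the Neural Mounting endpoint could look like the sketch below. The host, port, model path, and payload field name (`model_path`) are assumptions; consult the running node's schema for the exact contract.

```python
# Illustrative client for POST /v1/load_model; payload fields are assumed.
import json
import urllib.request

def build_load_request(model_path: str,
                       host: str = "http://127.0.0.1:8000") -> tuple[str, str]:
    """Assemble the endpoint URL and JSON body for a mount request."""
    body = json.dumps({"model_path": model_path})
    return f"{host}/v1/load_model", body

def load_model(model_path: str, host: str = "http://127.0.0.1:8000") -> dict:
    """Send the mount request to a running inference core."""
    url, body = build_load_request(model_path, host)
    req = urllib.request.Request(url, data=body.encode(),
                                 headers={"Content-Type": "application/json"},
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```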
Discovery Flow
POST /v1/chat/completions
Initiates a neural reasoning stream with agentic flags enabled.
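A Discovery Flow request body might be assembled as follows. The `agentic` and `stream` field names are assumptions about the local schema; only the endpoint path and the OpenAI-style `messages` shape are taken from the docs.

```python
# Illustrative request body for POST /v1/chat/completions (fields assumed).
import json

def build_chat_request(user_message: str, agentic: bool = True) -> str:
    """Build a JSON body enabling the agentic reasoning stream."""
    return json.dumps({
        "messages": [{"role": "user", "content": user_message}],
        "stream": True,      # token-by-token reasoning stream
        "agentic": agentic,  # permit [SEARCH:] discovery on information voids
    })
```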