Mission Control

The December Capstone: Total System Integration.

Project Synopsis

Neural-Nexus is the culmination of Operation Reconstruction. It is not a tutorial or a prototype; it is a full-fledged Intelligent Simulation.

We are merging the procedural worlds of Project 03, the autonomous agents of Project 09, the vision systems of Project 06, and the dialogue engines of Project 07 into a single executable. This is the proof of your transformation from student to researcher.

Why This Matters

Most engineers build isolated demos. Few can integrate disparate AI systems (Vision + NLP + RL) into a cohesive, performant product. This project serves as the "Magnum Opus" for your Master's application.

Tech Stack

  • Unity (Core)
  • ML-Agents (Brains)
  • LLM / NLP (Speech)
  • MEAN Stack (Data)

Project Checkpoints

  • Phase 1: Total World Integration (Merge Assets)
  • Phase 2: The Unified Intelligence (Hybrid AI)
  • Phase 3: Telemetry & Full-Stack Bridge (Live Data)
  • Phase 4: The 360-Day Victory Lap (Showcase)

Field Notes & Learnings

Key engineering concepts for Complex Systems Integration.

1. The Hybrid Brain Architecture

Concept: Pure RL is too chaotic to hit scripted plot beats; a pure FSM is too rigid for uneven terrain and open-ended dialogue.

Solution: Implement a Meta-Controller that routes between three layers (see the sketch after this list).
• Use FSM for high-level goals (e.g., "Idle", "Engage Player").
• Use ML-Agents for low-level execution (e.g., "Navigate uneven terrain").
• Use LLM for social interaction (e.g., "Negotiate truce").
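
A minimal sketch of that routing, assuming hypothetical `ILocomotionBrain` (the ML-Agents policy from Proj 09) and `IDialogueBrain` (the LLM client from Proj 07) hooks; the state names, radii, and component names are placeholders, not the final `BrainSwitcher.cs`:

```csharp
using UnityEngine;

// Hypothetical hooks into the other subsystems; the real components
// come from Proj 09 (RL locomotion) and Proj 07 (LLM dialogue).
public interface ILocomotionBrain { void MoveToward(Vector3 target); } // ML-Agents policy
public interface IDialogueBrain  { void Negotiate(Transform player); } // LLM / NLP client

public class MetaController : MonoBehaviour
{
    // High-level FSM: cheap, deterministic, authorable by a designer.
    private enum Goal { Idle, EngagePlayer, NegotiateTruce }

    [SerializeField] private Transform player;
    [SerializeField] private float engageRadius = 15f;
    [SerializeField] private float talkRadius = 3f;

    private Goal goal = Goal.Idle;
    private ILocomotionBrain locomotion;   // low-level execution (ML-Agents)
    private IDialogueBrain dialogue;       // social interaction (LLM)

    private void Awake()
    {
        locomotion = GetComponent<ILocomotionBrain>();
        dialogue = GetComponent<IDialogueBrain>();
    }

    private void Update()
    {
        float dist = Vector3.Distance(transform.position, player.position);

        // The FSM decides WHAT to do...
        goal = dist < talkRadius   ? Goal.NegotiateTruce
             : dist < engageRadius ? Goal.EngagePlayer
             : Goal.Idle;

        // ...and delegates HOW to do it to the specialised brains.
        switch (goal)
        {
            case Goal.EngagePlayer:
                locomotion.MoveToward(player.position);  // RL handles uneven terrain
                break;
            case Goal.NegotiateTruce:
                dialogue.Negotiate(player);              // LLM handles the conversation
                break;
            default:
                break; // canned idle animation, no brain needed
        }
    }
}
```

The design point: the FSM only picks the current goal each frame; the expensive learned behaviours run only while their state is active.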

2. Thread Management

Concept: A Python NLP call takes ~2 seconds, an OpenCV frame takes ~16 ms, and Unity has only ~16 ms per frame at 60 fps.

Solution: Asynchronous coroutines plus background threads. Never block the main Unity thread: run the socket listeners on background threads and use a thread-safe queue to pass their results back to the main game loop (minimal sketch below).
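
A minimal sketch of the pattern, assuming the Python NLP server streams newline-delimited replies over a raw TCP socket on a hypothetical local port; the class name and port are illustrative:

```csharp
using System.Collections.Concurrent;
using System.IO;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class NlpSocketClient : MonoBehaviour
{
    // Thread-safe handoff: the background thread enqueues, the main thread dequeues.
    private readonly ConcurrentQueue<string> replies = new ConcurrentQueue<string>();
    private Thread listenerThread;
    private volatile bool running;

    private void Start()
    {
        running = true;
        listenerThread = new Thread(ListenLoop) { IsBackground = true };
        listenerThread.Start();
    }

    // Runs OFF the main thread: blocking reads never stall rendering.
    private void ListenLoop()
    {
        using (var client = new TcpClient("127.0.0.1", 5005))   // hypothetical NLP server port
        using (var reader = new StreamReader(client.GetStream()))
        {
            while (running)
            {
                string line = reader.ReadLine();   // may block ~2 s per NLP reply
                if (line != null) replies.Enqueue(line);
            }
        }
    }

    // Runs ON the main thread: only cheap dequeues, never blocking I/O.
    private void Update()
    {
        while (replies.TryDequeue(out string reply))
            Debug.Log($"NLP reply: {reply}");   // hand off to the dialogue UI here
    }

    private void OnDestroy() => running = false;
}
```

The same queue-drain pattern works for the OpenCV bridge; only the payload type changes.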

3. Live Telemetry

Concept: A portfolio piece is static. A live dashboard is dynamic proof.

Solution: Connect the game to the Portfolio 2.0 Backend. Every time a user plays your downloadable build, send anonymous stats (K/D ratio, playtime) to MongoDB and display them on your website in real time.
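
A minimal client-side sketch, assuming the backend exposes the `POST /api/telemetry` endpoint named in Phase 3; the URL and field names (`kdRatio`, `playtimeSeconds`) are placeholders:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class TelemetryReporter : MonoBehaviour
{
    [SerializeField] private string endpoint = "https://aditya-dev.com/api/telemetry"; // assumed URL

    [System.Serializable]
    private class SessionStats
    {
        public float kdRatio;
        public float playtimeSeconds;
    }

    public void Report(float kdRatio, float playtimeSeconds)
    {
        StartCoroutine(Send(new SessionStats { kdRatio = kdRatio, playtimeSeconds = playtimeSeconds }));
    }

    private IEnumerator Send(SessionStats stats)
    {
        string json = JsonUtility.ToJson(stats);   // anonymous: no user identifiers included
        using (var req = new UnityWebRequest(endpoint, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(json));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");

            yield return req.SendWebRequest();     // coroutine: the frame loop keeps running

            if (req.result != UnityWebRequest.Result.Success)
                Debug.LogWarning($"Telemetry failed: {req.error}");
        }
    }
}
```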

4. Security

Protecting your infrastructure (client-side sketch after this list):

  • API Keys: Embed a restricted "Game Client" JWT in the build to allow it to POST telemetry data without exposing admin privileges.
  • Rate Limiting: Ensure the game doesn't accidentally DDoS your own server if a bug causes a loop.
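
A minimal client-side sketch of both ideas, assuming the restricted JWT is baked into the build as a constant and a 30-second cooldown between sends; the token, scope, and interval are placeholders:

```csharp
using UnityEngine;
using UnityEngine.Networking;

public static class TelemetryGuard
{
    // Restricted "Game Client" token: may only POST telemetry, no admin scopes.
    private const string GameClientJwt = "<baked-in restricted JWT>";   // placeholder

    // Client-side rate limit: at most one report every 30 s,
    // so a bugged loop cannot hammer the server.
    private const float MinIntervalSeconds = 30f;
    private static float lastSendTime = -Mathf.Infinity;

    public static bool TryAuthorize(UnityWebRequest request)
    {
        if (Time.realtimeSinceStartup - lastSendTime < MinIntervalSeconds)
            return false;                               // drop the send, don't queue it

        lastSendTime = Time.realtimeSinceStartup;
        request.SetRequestHeader("Authorization", "Bearer " + GameClientJwt);
        return true;
    }
}
```

The `TelemetryReporter.Send` sketch above would call `TelemetryGuard.TryAuthorize(req)` before sending and bail out when it returns false. The same limit should still be enforced server-side (e.g. per token or IP), since anything shipped in a build can be extracted and replayed.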

Implementation

Step-by-step Execution Plan.

Phase 1: Integration (Week 1)

  • Arena: Generate final map using Proj 03 code.
  • Agents: Import Proj 09 agents with Proj 06 vision.
  • Polish: Bake final NavMesh and Lightmaps.

Phase 2: Intelligence (Week 2)

  • Controller: Script `BrainSwitcher.cs` (FSM <-> RL).
  • Vision: Activate OpenCV "LookAt" tracking (sketch after this list).
  • Dialogue: Connect `ChatManager` to Python NLP server.
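
A minimal sketch of the "LookAt" step, assuming the Proj 06 vision bridge already converts detections into a world-space target and calls `SetTarget` on the main thread; the component name and smoothing speed are placeholders:

```csharp
using UnityEngine;

public class VisionLookAt : MonoBehaviour
{
    [SerializeField] private Transform head;          // bone to rotate
    [SerializeField] private float turnSpeed = 5f;    // arbitrary smoothing factor

    private Vector3? target;   // latest target from the OpenCV pipeline, if any

    // Called by the vision bridge when a new detection arrives (main thread).
    public void SetTarget(Vector3 worldPosition) => target = worldPosition;

    private void LateUpdate()
    {
        if (target == null) return;

        // Smoothly turn the head toward the detected target each frame.
        Quaternion look = Quaternion.LookRotation(target.Value - head.position);
        head.rotation = Quaternion.Slerp(head.rotation, look, turnSpeed * Time.deltaTime);
    }
}
```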

Phase 3: Telemetry (Week 3)

  • API: Endpoint `POST /api/telemetry` on Portfolio.
  • Frontend: Leaderboard Widget on `aditya-dev.com`.
  • Logs: Auto-report crashes to a Discord/Slack webhook (sketch after this list).
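
A minimal sketch of the crash auto-report, assuming a Discord-style webhook that accepts a JSON body with a `content` field; the URL is a placeholder (Slack's incoming webhooks use a `text` field instead):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class CrashReporter : MonoBehaviour
{
    [SerializeField] private string webhookUrl = "<Discord webhook URL>";   // placeholder

    private void OnEnable()  => Application.logMessageReceived += OnLog;
    private void OnDisable() => Application.logMessageReceived -= OnLog;

    private void OnLog(string condition, string stackTrace, LogType type)
    {
        // Only forward genuine failures, not routine Debug.Log noise.
        if (type != LogType.Exception && type != LogType.Error) return;
        StartCoroutine(Post($"[Neural-Nexus] {type}: {condition}\n{stackTrace}"));
    }

    private IEnumerator Post(string message)
    {
        string json = JsonUtility.ToJson(new Payload { content = message });
        using (var req = new UnityWebRequest(webhookUrl, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(json));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");
            yield return req.SendWebRequest();
        }
    }

    [System.Serializable]
    private class Payload { public string content; }
}
```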

Phase 4: Victory Lap (Week 4)

  • Trailer: Cinematic 4K video of all systems interacting.
  • Release: `v1.0` Tag on GitHub. Clean Readme.

Dev Logs

Engineering notes & daily updates.

Entry 000: Planning

Date: Feb 3, 2026

Project 11 queued for December. The final integration of a year's worth of engineering.