## Features
- Three Learning Modes: Basic factual answers, step-by-step reasoning, and deep research investigations
- Voice-first interaction for hands-free learning with natural language processing
- Real-time 3D visualizations of concepts using Three.js & WebXR
- Curated educational YouTube video integration from trusted sources
- Multi-modal feedback combining text, speech (via ElevenLabs), and spatial panels
- VR-optional design for immersive experiences without requiring a headset
- Accessibility-focused interface for mobility- and vision-impaired users
## Prerequisites
- Node.js 18 LTS or newer
- Modern web browser (Chrome, Edge, or Firefox recommended)
- Microphone for voice interaction
- Optional: VR headset for immersive mode (WebXR compatible)
- Perplexity API key, ElevenLabs API key, and YouTube API key
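The three API keys are typically supplied via environment variables; a sample `.env` is sketched below (the variable names are assumptions, not confirmed by the project):

```
PERPLEXITY_API_KEY=your-perplexity-key
ELEVENLABS_API_KEY=your-elevenlabs-key
YOUTUBE_API_KEY=your-youtube-key
```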
## Installation
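No steps are listed here; a typical setup for a Node.js 18 project might look like the following (the repository URL, `.env.example` file, and `dev` script are assumptions):

```shell
# Clone the repository (URL is a placeholder)
git clone <repo-url>
cd <repo-dir>

# Install dependencies
npm install

# Provide API keys (file and variable names are assumptions)
cp .env.example .env   # then add your Perplexity, ElevenLabs, and YouTube keys

# Start the development server
npm run dev
```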
## Usage
- Launch the app in your browser
- Say “Hey Monday” to activate the assistant
- Ask a question in one of three modes:
  - Basic Mode – “What is photosynthesis?”
  - Reasoning Mode – “Think about how blockchain works.”
  - Deep Research Mode – “Research the history of quantum mechanics.”
- View answers as floating text panels, voice responses, and interactive 3D models
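Based on the example phrasings above, the mode could be selected from how the question starts. A minimal sketch (the function name and exact trigger phrases are assumptions, not the project's actual logic):

```typescript
type LearningMode = "basic" | "reasoning" | "deep-research";

// Pick a learning mode from the phrasing of a spoken question.
// Trigger phrases mirror the usage examples: "think about ..." selects
// reasoning, "research ..." selects deep research, everything else is basic.
function classifyMode(question: string): LearningMode {
  const q = question.toLowerCase().trim();
  if (q.startsWith("research")) return "deep-research";
  if (q.startsWith("think about")) return "reasoning";
  return "basic";
}

console.log(classifyMode("What is photosynthesis?"));                     // basic
console.log(classifyMode("Think about how blockchain works."));           // reasoning
console.log(classifyMode("Research the history of quantum mechanics.")); // deep-research
```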
## Code Explanation
- Frontend: TypeScript with Three.js for 3D visualizations and WebXR for VR support
- Backend: Node.js with Socket.IO for real-time voice command processing
- AI Integration: Perplexity Sonar API for intelligent responses with reasoning extraction
- Voice Processing: ElevenLabs for natural-sounding text-to-speech synthesis
- Content Curation: YouTube API integration with smart keyword extraction for educational videos
- Accessibility: Voice-first design with spatial audio and haptic feedback support
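The "smart keyword extraction" step for video curation is not specified; one simple approach is stop-word filtering, sketched below (the stop-word list and function name are assumptions, not the project's actual algorithm):

```typescript
// Reduce a spoken question to search keywords for a YouTube query.
// Strategy (an assumption): drop punctuation and common stop words,
// keeping only the content-bearing terms.
const STOP_WORDS = new Set([
  "what", "is", "the", "a", "an", "of", "how", "does", "do",
  "about", "into", "think", "research", "explain", "works", "work",
]);

function extractKeywords(question: string): string[] {
  return question
    .toLowerCase()
    .replace(/[^\w\s]/g, "") // strip punctuation
    .split(/\s+/)
    .filter((w) => w.length > 0 && !STOP_WORDS.has(w));
}

console.log(extractKeywords("Research the history of quantum mechanics."));
// ["history", "quantum", "mechanics"]
```

The resulting keywords would then be joined into the query string passed to the YouTube Data API's search endpoint.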