🧠 Aura-Agent: Neural Guardian – Seizure Prediction via Teacher-Student Knowledge Distillation
#13
by Babajaan - opened
🧠 Aura-Agent: Neural Guardian
A proactive, mobile-first seizure prediction system with a 30-minute pre-ictal warning, built end-to-end with PyTorch and smolagents.
🚀 Live Demo: Babajaan/Aura-Agent-Neural-Guardian
What It Does
Aura-Agent predicts epileptic seizures 30 minutes before onset from raw 19-channel EEG signals and deploys to Android via an ultra-lightweight student model.
Architecture
| Component | Model | Params |
|---|---|---|
| Teacher | Transformer EEG Classifier (4-layer, 4-head, GELU, attention pooling) | 661K |
| Student | RGF-Net – Ring-Buffer GRU + FiLM biometric conditioning | 37.5K (< 100K ✓) |
| Agent | smolagents CodeAgent with 4 custom tools | LLM-powered |
Key Technical Highlights
1. RGF-Net Student Model (< 100K params, ~36.7 KB quantized)
- Ring-Buffer GRU enables O(1) constant-memory streaming inference on-device
- FiLM conditioning (Perez et al., 2018) modulates GRU features using patient biometrics: (1 + γ)⊙h + β, where γ, β are generated from [age, sex, HR, HRV, seizure_history, medication_adherence]
- EEGNet-inspired depthwise-separable convolutions for temporal + spatial feature extraction
- Supports ONNX export and INT8 dynamic quantization for Android Edge-AI
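To make the FiLM-modulated streaming idea concrete, here is a minimal PyTorch sketch of the concept: a GRU over 19-channel EEG chunks whose output features are scaled and shifted by parameters generated from a 6-value biometric vector. All class names, layer sizes, and the single-layer structure are illustrative assumptions, not the actual RGF-Net code.

```python
import torch
import torch.nn as nn

class FiLMGRU(nn.Module):
    """Toy sketch: GRU features modulated as (1 + gamma) * h + beta."""
    def __init__(self, n_channels=19, hidden=32, n_biometrics=6, n_classes=2):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden, batch_first=True)
        # FiLM generator: biometrics -> concatenated (gamma, beta)
        self.film = nn.Linear(n_biometrics, 2 * hidden)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, eeg, biometrics, h=None):
        # eeg: (B, T, 19), biometrics: (B, 6), h: carried GRU state
        out, h = self.gru(eeg, h)
        gamma, beta = self.film(biometrics).chunk(2, dim=-1)
        feat = (1 + gamma) * out[:, -1] + beta   # FiLM modulation
        return self.head(feat), h

model = FiLMGRU()
h = None
# Stream 5 chunks of 50 samples each; only the fixed-size state h persists
# between chunks, which is what gives O(1)-memory on-device inference.
for chunk in torch.randn(5, 1, 50, 19):
    logits, h = model(chunk, torch.randn(1, 6), h)
```

The ring-buffer aspect in the real model presumably manages the incoming sample window; here the carried hidden state `h` plays the equivalent constant-memory role.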
2. Three-Component Knowledge Distillation Loss
- Soft-label KD (Hinton, 2015): Temperature-scaled KL-divergence (T=4.0)
- Hard-label CE: Class-weighted cross-entropy for pre-ictal/inter-ictal imbalance
- Feature distillation (TimeKD, 2025): SmoothL1 alignment of projected student embeddings to the teacher embedding space – critical for cross-architecture (Transformer → GRU) distillation
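The three terms above can be sketched in a few lines of PyTorch. The loss weights, feature dimensions, and projection layer below are illustrative assumptions; only the structure (temperature-scaled KL + weighted CE + SmoothL1 on projected features) follows the post.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels,
            student_feat, teacher_feat, proj, T=4.0,
            class_weights=None, a=0.5, b=0.3, c=0.2):
    # 1) Soft-label KD (Hinton, 2015): KL between temperature-softened
    # distributions, rescaled by T^2 to keep gradient magnitudes comparable
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * T * T
    # 2) Hard-label CE, optionally class-weighted for pre-ictal imbalance
    hard = F.cross_entropy(student_logits, labels, weight=class_weights)
    # 3) Feature distillation: project student embedding into teacher space
    feat = F.smooth_l1_loss(proj(student_feat), teacher_feat)
    return a * soft + b * hard + c * feat

# Toy usage: batch of 4, 2 classes, 32-d student vs 128-d teacher features
proj = nn.Linear(32, 128)
loss = kd_loss(torch.randn(4, 2), torch.randn(4, 2),
               torch.tensor([0, 1, 0, 0]),
               torch.randn(4, 32), torch.randn(4, 128), proj)
```

The projection layer is what makes the cross-architecture (Transformer → GRU) alignment possible, since the two embedding spaces have different widths.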
3. smolagents CodeAgent for Autonomous Emergency Response
- `RGFNetSeizureRiskTool` (Tool subclass with lazy `setup()`) – runs model inference
- `send_sos_alert`, `get_patient_gps`, `dispatch_first_aid` (`@tool` decorators)
- Agent follows a strict medical protocol: Assess Risk → if ≥ 0.75 → GPS → SOS → First-Aid
- CodeAgent chosen over ToolCallingAgent specifically for `if/else` conditional branching
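The protocol above is exactly the kind of plain-Python branching a CodeAgent generates, which is what a ToolCallingAgent's flat tool calls cannot express directly. The sketch below shows the protocol with hypothetical stub functions standing in for the real tools; the 0.75 threshold and the SOS → first-aid ordering come from the post, everything else is illustrative.

```python
# Hypothetical stubs standing in for the real smolagents tools
def assess_risk(eeg_window):          # stand-in for RGFNetSeizureRiskTool
    return 0.82                       # fixed value for illustration

def get_patient_gps():
    return (40.7128, -74.0060)

def send_sos_alert(location):
    return f"SOS alert sent for {location}"

def dispatch_first_aid(location):
    return f"First aid dispatched to {location}"

# The protocol as the agent would write it: conditional branching in code
risk = assess_risk(eeg_window=None)
if risk >= 0.75:                      # threshold from the medical protocol
    loc = get_patient_gps()
    actions = [send_sos_alert(loc), dispatch_first_aid(loc)]
else:
    actions = ["continue monitoring"]
```
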
Grounded in Literature
The architecture and training recipe draw from:
- STAN (arXiv 2511.01275) – Spatio-temporal attention for seizure forecasting
- FiLM (arXiv 1709.07871) – Feature-wise Linear Modulation
- Hinton KD (arXiv 1503.02531) – Knowledge Distillation
- TimeKD (arXiv 2505.02138) – Feature distillation for time series
- Edge DL (arXiv 2012.00307) – Quantized models for neural implants
Interactive Demo Features
The Gradio Space includes 6 tabs:
- 🏗️ Architecture – Inspect both models with full parameter breakdowns
- 🎯 Train & Distill – Run the KD pipeline live with tunable hyperparameters
- 🎮 Live Inference – Test RGF-Net with synthetic EEG and adjustable patient biometrics
- 🤖 Agent Workflow – Visualize the smolagents CodeAgent architecture
- 📐 KD Loss – Mathematical walkthrough of the distillation loss
- 📂 Source Code – Project structure and references
Built with PyTorch, smolagents, and Gradio. Feedback welcome!