
Core components:
- Micro:bit + Nezha V2 Controller – Executes the robot’s core control functions and interfaces with sensors and actuators.
- Companion Compute Unit – Provides advanced data processing capabilities and supports AI-based decision-making.
- Cloud-based LLM Service – Delivers intelligent natural language processing for emergency communication.
- Semantic Kernel Framework – Orchestrates AI workflows and integrates LLM capabilities into the control architecture.
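The division of labor above (micro:bit controller gathers readings, companion unit aggregates them, cloud LLM interprets them) can be sketched as the message-building step on the companion unit. This is a minimal illustration, not the project's actual code; the `SensorReadings` type and `build_emergency_prompt` function are hypothetical names.

```python
# Illustrative sketch: how the companion compute unit might package
# sensor readings into a natural-language prompt for the cloud LLM
# service. All names here are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class SensorReadings:
    person_detected: bool   # from the AI Lens
    temperature_c: float    # from the temperature sensor
    obstacle_cm: float      # from the ultrasonic sensor


def build_emergency_prompt(r: SensorReadings) -> str:
    """Summarize the current readings as text the LLM can reason over."""
    lines = [
        "Emergency-response robot status report:",
        f"- Person detected: {'yes' if r.person_detected else 'no'}",
        f"- Measured temperature: {r.temperature_c:.1f} C",
        f"- Nearest obstacle: {r.obstacle_cm:.0f} cm",
        "Advise the next action in one short sentence.",
    ]
    return "\n".join(lines)
```

In an architecture like this, the prompt string would be handed to the Semantic Kernel pipeline, which forwards it to the LLM service and routes the reply back to the robot's display.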

Sensor technology:
- AI Lens – Detects individuals via facial recognition.
- Temperature Sensor – Measures body temperature as a basic diagnostic input.
- Ultrasonic Sensor – Detects obstacles to ensure safe navigation.
- Display – Shows the interactive chat driven by the Semantic Kernel.
- Light Sensor – Detects ambient light levels.
- Infrared Sensor – Detects motion.
- LED – Provides visual signaling to attract attention.
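The ultrasonic sensor's role in safe navigation reduces to threshold logic on the measured distance. The sketch below shows one plausible mapping from distance to drive command; the thresholds and function name are assumptions, not the project's actual firmware.

```python
# Hypothetical obstacle-avoidance logic fed by the ultrasonic sensor.
# Threshold values are illustrative assumptions.

STOP_CM = 10.0   # halt if an obstacle is closer than this
SLOW_CM = 30.0   # reduce speed inside this range


def drive_command(distance_cm: float) -> str:
    """Map an ultrasonic distance reading (cm) to a drive command."""
    if distance_cm < STOP_CM:
        return "stop"
    if distance_cm < SLOW_CM:
        return "slow"
    return "forward"
```

On the real robot, the equivalent check would run in the micro:bit control loop, with the resulting command sent to the Nezha V2 motor driver.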
