# Camera Controller Module - Story 4.2 Implementation

This document describes the implementation of the Camera Controller module, which captures video frames and publishes them to the event bus for real-time processing.

## Features Implemented

✅ **Event-Driven Architecture**: The camera controller runs as an independent Tokio task and communicates via the central EventBus

✅ **Configurable Frame Rate**: Supports a configurable FPS (default: 30 FPS)

✅ **Multiple Input Sources**:
- Physical camera devices (requires OpenCV installation)
- Video files (requires OpenCV installation)
- **Simulated camera** (works without OpenCV; currently active)

✅ **High-Precision Timestamps**: Each frame includes a precise UTC timestamp

✅ **Frame Metadata**: Each `FrameCapturedEvent` includes:
- Unique frame ID (auto-incrementing)
- High-precision timestamp
- Frame dimensions (width/height)
- Compressed frame data (JPEG format)

## Current Implementation: Simulated Camera

The current implementation uses a **simulated camera controller** that generates synthetic video frames. This allows the entire event-driven system to be tested without requiring an OpenCV installation.

### Running the Simulated Camera

```bash
# Run the edge client with the simulated camera
cargo run -- run
```

### Sample Output

```
🎯 Initializing Event-Driven Meteor Edge Client...
📊 Application Statistics:
   Event Bus Capacity: 1000
   Initial Subscribers: 0
🚀 Starting Meteor Edge Client Application...
✅ Received SystemStartedEvent!
🎥 Starting simulated camera controller...
   Source: Device(0)
   Target FPS: 30
   Resolution: 640x480
📸 Received FrameCapturedEvent #1
   Timestamp: 2025-07-30 17:39:35.969333 UTC
   Resolution: 640x480
   Data size: 115206 bytes
```

## Configuration

### App Configuration File

The camera can be configured via a TOML configuration file. The system looks for:

1. `/etc/meteor-client/app-config.toml` (system-wide)
2. `~/.config/meteor-client/app-config.toml` (user-specific)
3. `meteor-app-config.toml` (local directory)

### Sample Configuration

```toml
[camera]
source = "device"  # "device" for camera, or path to video file
device_id = 0      # Camera device ID (for source = "device")
fps = 30.0         # Target frame rate
width = 640        # Frame width (optional)
height = 480       # Frame height (optional)
```

### Configuration Examples

**Physical Camera:**

```toml
[camera]
source = "device"
device_id = 0
fps = 30.0
width = 1280
height = 720
```

**Video File:**

```toml
[camera]
source = "/path/to/video.mp4"
fps = 25.0
```

## Enabling Physical Camera Support (OpenCV)

To enable real camera and video file support, install OpenCV and uncomment the dependency.

### Installing OpenCV

**macOS:**

```bash
brew install opencv
```

**Ubuntu/Debian:**

```bash
sudo apt update
sudo apt install libopencv-dev clang libclang-dev
```

**Enable OpenCV in Code:**

1. Edit `Cargo.toml` and uncomment the `opencv` dependency:

   ```toml
   opencv = { version = "0.88", default-features = false }
   ```

2. Update `src/camera.rs` to use the OpenCV implementation instead of the simulated version (a rough sketch follows below).
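For orientation, here is a minimal sketch of what an OpenCV-backed capture in `src/camera.rs` might look like, using the `opencv` crate's `videoio` and `imgcodecs` modules. The function name `capture_jpeg_frame` and its error handling are illustrative assumptions, not the module's actual API:

```rust
use opencv::{core, imgcodecs, prelude::*, videoio};

// Illustrative sketch only: names and error handling are simplified and
// do not reflect the actual `src/camera.rs` implementation.
fn capture_jpeg_frame() -> opencv::Result<(u32, u32, Vec<u8>)> {
    // Open the default camera device; CAP_ANY lets OpenCV pick a backend.
    let mut cam = videoio::VideoCapture::new(0, videoio::CAP_ANY)?;
    if !cam.is_opened()? {
        return Err(opencv::Error::new(core::StsError, "camera not available".to_string()));
    }

    // Grab a single frame into a Mat.
    let mut frame = core::Mat::default();
    if !cam.read(&mut frame)? {
        return Err(opencv::Error::new(core::StsError, "failed to read frame".to_string()));
    }

    // JPEG-compress the frame, matching FrameCapturedEvent's data format.
    let mut buf = core::Vector::<u8>::new();
    imgcodecs::imencode(".jpg", &frame, &mut buf, &core::Vector::new())?;

    Ok((frame.cols() as u32, frame.rows() as u32, buf.to_vec()))
}
```

Inside the camera's Tokio task, a loop around a call like this would pace iterations to the configured FPS and publish one `FrameCapturedEvent` per frame.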
## Event Structure

### FrameCapturedEvent

```rust
pub struct FrameCapturedEvent {
    pub frame_id: u64,            // Unique frame identifier (1-based)
    pub timestamp: DateTime<Utc>, // High-precision capture timestamp
    pub width: u32,               // Frame width in pixels
    pub height: u32,              // Frame height in pixels
    pub frame_data: Vec<u8>,      // JPEG-encoded frame data
}
```

### Event Publishing

The camera controller publishes events to the central EventBus:

```rust
event_bus.publish_frame_captured(frame_event)?;
```

### Event Subscription

Other modules can subscribe to frame events:

```rust
let mut receiver = event_bus.subscribe();
while let Ok(event) = receiver.recv().await {
    match event {
        SystemEvent::FrameCaptured(frame_event) => {
            println!(
                "Frame #{}: {}x{}, {} bytes",
                frame_event.frame_id,
                frame_event.width,
                frame_event.height,
                frame_event.frame_data.len()
            );
        }
        _ => {}
    }
}
```

## Performance Considerations

- **Frame Rate Control**: Precise timing control maintains a consistent FPS
- **Memory Efficient**: Frames are JPEG-compressed to reduce memory usage
- **Async Processing**: The camera runs in an independent Tokio task and never blocks other modules
- **Configurable Buffer**: EventBus capacity can be tuned to match downstream processing speed

## Testing

### Unit Tests

```bash
cargo test camera
```

### Integration Test

```bash
cargo run -- run
```

## Architecture Integration

The Camera Controller fits into the larger event-driven architecture:

```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│                 │    │                 │    │                 │
│     Camera      │───▶│    EventBus     │───▶│    Detection    │
│   Controller    │    │                 │    │     Module      │
│                 │    │                 │    │    (Future)     │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌─────────────────┐
                       │                 │
                       │     Storage     │
                       │     Module      │
                       │    (Future)     │
                       └─────────────────┘
```

## Next Steps

With the Camera Controller implemented, the system is ready for:

1. **Detection Module**: Process frames for motion/object detection
2. **Storage Module**: Save frames and metadata to persistent storage
3. **Analytics Module**: Generate insights from captured data
4. **Network Module**: Stream/upload data to cloud services

## Troubleshooting

**"Failed to publish frame captured event"**: This is normal when no subscribers are active; the camera keeps generating frames (see the sketch below).

**OpenCV Build Errors**: Ensure the OpenCV development libraries are installed on your system before enabling the `opencv` dependency.

**High CPU Usage**: Reduce the FPS in the configuration if system resources are limited.
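To make the first item concrete: assuming the EventBus wraps a `tokio::sync::broadcast` channel (an assumption suggested by the subscribe/`recv` pattern above, not a confirmed implementation detail), `send` returns an error exactly when no receivers are attached, so the failure is benign. A minimal, self-contained sketch:

```rust
use tokio::sync::broadcast;

// Stand-in types mirroring the document's event definitions (simplified).
#[derive(Clone, Debug)]
struct FrameCapturedEvent {
    frame_id: u64,
}

#[derive(Clone, Debug)]
enum SystemEvent {
    FrameCaptured(FrameCapturedEvent),
}

// Hypothetical publish helper: with a broadcast channel, `send` fails only
// when no receiver is currently subscribed, so the error is logged and the
// capture loop keeps running.
fn publish_frame(tx: &broadcast::Sender<SystemEvent>, event: FrameCapturedEvent) {
    if let Err(err) = tx.send(SystemEvent::FrameCaptured(event)) {
        eprintln!("no subscribers for frame event (continuing): {err}");
    }
}

fn main() {
    let (tx, rx) = broadcast::channel(1000);
    drop(rx); // simulate "no subscribers active" from the troubleshooting note
    publish_frame(&tx, FrameCapturedEvent { frame_id: 1 });
}
```

Once a subscriber attaches (for example, the future Detection module), `send` succeeds again with no action needed on the camera's side.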