Course Outline

Foundations of TinyML for Robotics

  • Key capabilities and constraints of TinyML
  • Role of edge AI in autonomous systems
  • Hardware considerations for mobile robots and drones

Embedded Hardware and Sensor Interfaces

  • Microcontrollers and embedded boards for robotics
  • Integrating cameras, IMUs, and proximity sensors
  • Energy and compute budgeting (sketched below)
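
A minimal sketch of the energy-budgeting arithmetic behind the last bullet, in Python, assuming illustrative current draws, duty cycle, and battery capacity rather than figures for any specific board:

    # Rough duty-cycle energy budget for a battery-powered robot node.
    # All figures below are illustrative assumptions, not vendor data.
    active_current_ma = 60.0   # MCU + camera during inference (assumed)
    sleep_current_ma = 1.5     # deep-sleep draw between inferences (assumed)
    supply_voltage_v = 3.3
    inference_time_s = 0.15    # per-frame inference latency (assumed)
    period_s = 1.0             # one inference per second
    battery_mah = 2000.0

    duty = inference_time_s / period_s
    avg_current_ma = duty * active_current_ma + (1 - duty) * sleep_current_ma
    avg_power_mw = avg_current_ma * supply_voltage_v
    runtime_h = battery_mah / avg_current_ma

    print(f"average current: {avg_current_ma:.2f} mA")
    print(f"average power:   {avg_power_mw:.2f} mW")
    print(f"runtime on 2000 mAh: {runtime_h:.1f} h")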

Data Engineering for Robotic Perception

  • Collecting and labeling data for robotics tasks
  • Signal and image preprocessing techniques
  • Feature extraction strategies for constrained devices (sketched below)
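
As one illustration of lightweight feature extraction for constrained devices, the Python sketch below reduces windows of raw 3-axis IMU samples to a handful of summary statistics; the window length and feature set are arbitrary choices for the example, not a prescribed pipeline:

    import numpy as np

    def imu_window_features(samples, window=64):
        """Reduce raw 3-axis IMU samples to per-window summary features.

        samples: array of shape (N, 3) with accelerometer x/y/z readings.
        Returns shape (num_windows, 9): mean, std, and peak magnitude per
        axis. The feature choice here is illustrative only.
        """
        samples = np.asarray(samples, dtype=np.float32)
        num_windows = len(samples) // window
        feats = []
        for i in range(num_windows):
            w = samples[i * window:(i + 1) * window]
            feats.append(np.concatenate([w.mean(axis=0),
                                         w.std(axis=0),
                                         np.abs(w).max(axis=0)]))
        return np.stack(feats) if feats else np.empty((0, 9), dtype=np.float32)

    # Example: 5 seconds of synthetic 100 Hz accelerometer data
    demo = np.random.randn(500, 3).astype(np.float32)
    print(imu_window_features(demo).shape)  # (7, 9) with 64-sample windows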

Model Development and Optimization

  • Selecting architectures for perception, detection, and classification
  • Training pipelines for embedded ML
  • Model compression, quantization, and latency optimization (sketched below)
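
A sketch of post-training integer quantization with the TensorFlow Lite converter, one common route to the compression and latency goals listed above; the model and representative dataset are stand-ins, and other toolchains follow a similar flow:

    import numpy as np
    import tensorflow as tf

    # Stand-in model; in practice this would be a trained perception network.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(32, 32, 1)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

    def representative_data():
        # A real pipeline would yield preprocessed sensor frames here.
        for _ in range(100):
            yield [np.random.rand(1, 32, 32, 1).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8

    tflite_model = converter.convert()
    with open("model_int8.tflite", "wb") as f:
        f.write(tflite_model)
    print(f"quantized model size: {len(tflite_model)} bytes")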

On-Device Perception and Control

  • Running inference on microcontrollers
  • Fusing TinyML outputs with control algorithms (see the sketch after this list)
  • Real-time safety and responsiveness
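
On the target device, inference typically goes through a C/C++ runtime such as TensorFlow Lite Micro; the Python sketch below mirrors the same invoke cycle with the desktop tf.lite.Interpreter, which is a convenient way to validate the quantized model and a simple fusion rule before flashing. The model path, class labels, and steering mapping are assumptions for the example:

    import numpy as np
    import tensorflow as tf

    # Host-side check of the int8 model produced in the previous module.
    interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    def classify(frame):
        """Run one inference on a preprocessed frame, handling int8 scaling."""
        scale, zero_point = inp["quantization"]
        q = np.clip(np.round(frame / scale + zero_point), -128, 127).astype(np.int8)
        interpreter.set_tensor(inp["index"], q[np.newaxis, ...])
        interpreter.invoke()
        return interpreter.get_tensor(out["index"])[0]

    def steer_from_scores(scores):
        """Toy fusion rule mapping class scores to a turn command (illustrative)."""
        labels = ["clear", "obstacle_left", "obstacle_right", "blocked"]
        decision = labels[int(np.argmax(scores))]
        turn = {"clear": 0.0, "obstacle_left": 0.5,
                "obstacle_right": -0.5, "blocked": 0.0}[decision]
        return turn, decision

    frame = np.random.rand(32, 32, 1).astype(np.float32)
    turn, label = steer_from_scores(classify(frame))
    print(label, turn)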

Autonomous Navigation Enhancements

  • Lightweight vision-based navigation
  • Obstacle detection and avoidance (see the sketch after this list)
  • Environmental awareness under resource constraints
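
A compact sketch of the reactive avoidance idea in this module, assuming per-sector range estimates (from time-of-flight sensors or a model's per-sector output); the sector layout and safety threshold are placeholders:

    import numpy as np

    def choose_heading(sector_ranges_m, safe_distance_m=0.5):
        """Pick a heading from per-sector range estimates.

        sector_ranges_m: distances for sectors spread from -90 (left) to
        +90 (right) degrees. They could come from ToF sensors or a TinyML
        depth/occupancy output; the layout is an assumption for the example.
        """
        d = np.asarray(sector_ranges_m, dtype=np.float32)
        angles = np.linspace(-90.0, 90.0, len(d))
        if d.min() > safe_distance_m:
            return 0.0                  # path clear: keep going straight
        open_sectors = np.where(d > safe_distance_m)[0]
        if len(open_sectors) == 0:
            return 180.0                # everything blocked: turn around
        # Steer toward the open sector closest to straight ahead.
        best = open_sectors[np.argmin(np.abs(angles[open_sectors]))]
        return float(angles[best])

    print(choose_heading([0.3, 0.4, 0.45, 1.5, 0.6]))  # 45.0: veer right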

Testing and Validation of TinyML-Driven Robots

  • Simulation tools and field testing approaches
  • Performance metrics for embedded autonomy (see the sketch after this list)
  • Debugging and iterative improvement
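
As an example of the metrics gathered during validation, the sketch below times repeated inferences and reports mean and tail latency; the inference function here is a stub standing in for the real on-device or host-side call:

    import time
    import numpy as np

    def fake_infer(frame):
        # Stand-in for the real inference call being profiled.
        time.sleep(0.002)
        return np.zeros(4, dtype=np.float32)

    def latency_report(infer_fn, frames, warmup=10):
        """Time per-inference latency and report mean and tail percentiles."""
        for f in frames[:warmup]:
            infer_fn(f)                 # warm caches and allocators
        times_ms = []
        for f in frames[warmup:]:
            t0 = time.perf_counter()
            infer_fn(f)
            times_ms.append((time.perf_counter() - t0) * 1000.0)
        times_ms = np.array(times_ms)
        return {"mean_ms": float(times_ms.mean()),
                "p95_ms": float(np.percentile(times_ms, 95)),
                "p99_ms": float(np.percentile(times_ms, 99))}

    frames = [np.random.rand(32, 32, 1).astype(np.float32) for _ in range(110)]
    print(latency_report(fake_infer, frames))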

Integration into Robotics Platforms

  • Deploying TinyML within ROS-based pipelines (see the sketch after this list)
  • Interfacing ML models with motor controllers
  • Maintaining reliability across hardware variations
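
A minimal sketch of a ROS 1 node (rospy) that feeds camera frames to a TinyML classifier and publishes velocity commands; the topic names, stubbed classifier, and class-to-command mapping are assumptions for illustration, not part of any particular platform:

    #!/usr/bin/env python
    # Minimal ROS 1 node: camera frames in, TinyML scores to velocity commands out.
    # Topic names and the stubbed classifier are assumptions for illustration.
    import numpy as np
    import rospy
    from sensor_msgs.msg import Image
    from geometry_msgs.msg import Twist

    def classify(frame):
        # Stub: replace with the TFLite interpreter call from the earlier sketch.
        return np.array([0.7, 0.1, 0.1, 0.1], dtype=np.float32)

    class TinyMLNode:
        def __init__(self):
            self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
            rospy.Subscriber("/camera/image_raw", Image, self.on_image,
                             queue_size=1)

        def on_image(self, msg):
            # Assumes an 8-bit encoding; a real node would check msg.encoding.
            frame = np.frombuffer(msg.data, dtype=np.uint8).reshape(
                msg.height, msg.width, -1)
            scores = classify(frame)
            cmd = Twist()
            cmd.linear.x = 0.2 if np.argmax(scores) == 0 else 0.0   # drive if "clear"
            cmd.angular.z = 0.5 if np.argmax(scores) == 1 else 0.0  # turn on obstacle
            self.cmd_pub.publish(cmd)

    if __name__ == "__main__":
        rospy.init_node("tinyml_perception")
        TinyMLNode()
        rospy.spin()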

Summary and Next Steps

Requirements

  • An understanding of robotics system architectures
  • Experience with embedded development
  • Familiarity with machine learning concepts

Audience

  • Robotics engineers
  • AI researchers
  • Embedded developers

Duration

  • 21 hours
