Hand-Held Camera Cube Reference Design Enables AI at the Edge

Date: 07/21/2021

MAXREFDES178# camera cube executes low-latency AI vision and hearing inferences on a coin-cell power budget, with reduced cost and size

Maxim Integrated Products, Inc. unveiled the MAXREFDES178# camera cube reference design, which demonstrates how artificial intelligence (AI) applications previously limited to machines with large power and cost budgets can be embedded in space-constrained, battery-powered edge devices. The MAXREFDES178# enables ultra-low-power internet of things (IoT) devices to implement both hearing and vision, and it showcases the MAX78000 low-power microcontroller with a neural network accelerator for audio and video inferences. The system also contains the MAX32666 ultra-low-power Bluetooth microcontroller and two MAX9867 audio CODECs. The entire system is delivered in an ultra-compact form factor to show how AI applications such as facial identification and keyword recognition can be embedded in low-power, cost-sensitive products such as wearables and IoT devices.
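For context, firmware on the MAX78000 typically drives the on-chip CNN accelerator through network-specific code generated by Maxim's ai8x-synthesis tool. The sketch below outlines that flow for a keyword-spotting inference; the cnn_* calls follow the general pattern of that generated API, but the exact signatures, the clock-selection constants, and the input-loading step vary per network, so treat this as an illustrative assumption rather than the reference design's actual firmware.

```c
/*
 * Illustrative keyword-spotting inference flow on the MAX78000 CNN
 * accelerator. The cnn_* calls mirror the pattern of code generated by
 * ai8x-synthesis for a given network; CLOCK_SOURCE, CLOCK_DIVIDER, and
 * the input-loading step are placeholders, not real SDK symbols.
 */
#include <stdint.h>
#include "cnn.h"               /* network-specific header generated by ai8x-synthesis */

#define NUM_KEYWORDS 21        /* illustrative number of output classes */

static int32_t class_scores[NUM_KEYWORDS];
static volatile int cnn_done;  /* set from the CNN-complete interrupt in a real build */

int classify_audio_frame(void)
{
    cnn_enable(CLOCK_SOURCE, CLOCK_DIVIDER); /* power and clock the accelerator */
    cnn_init();                              /* reset the accelerator state machine */
    cnn_load_weights();                      /* kernels into on-chip CNN memory */
    cnn_load_bias();
    cnn_configure();                         /* per-layer registers for this network */

    /* Copy preprocessed audio features (e.g., one second of captured audio
       reshaped for the network) into the CNN data SRAM. The address map is
       network-specific, so it is left as a comment here. */

    cnn_start();                             /* kick off the inference */
    while (!cnn_done) { /* sleep until the accelerator signals completion */ }

    cnn_unload((uint32_t *)class_scores);    /* read back the output layer */
    cnn_disable();

    /* Return the index of the highest-scoring keyword. */
    int best = 0;
    for (int i = 1; i < NUM_KEYWORDS; i++) {
        if (class_scores[i] > class_scores[best]) {
            best = i;
        }
    }
    return best;
}
```

In the reference design, a flow like this runs alongside the facial-identification network, while the MAX32666 handles Bluetooth connectivity and the MAX9867 CODECs capture the audio that feeds the keyword-spotting model.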

AI applications require intensive computation that is usually performed in the cloud or on expensive, power-hungry processors that fit only in applications with large power budgets, such as self-driving cars. The MAXREFDES178# camera cube demonstrates that AI can run on a small power budget, enabling time- and safety-critical applications to operate from even the smallest batteries. The MAX78000’s AI accelerator cuts the power consumed by vision and hearing inferences by up to 1,000x compared with other embedded solutions. The AI inferences running on the MAXREFDES178# also show dramatic latency improvements, executing more than 100x faster than on a conventional embedded microcontroller.

The camera cube’s compact form factor of 1.6in x 1.7in x 1.5in (41mm x 44mm x 39mm) shows that AI can be implemented in wearables and other space-constrained IoT applications. The MAX78000 solution itself is up to 50 percent smaller than the next-smallest GPU-based processor and does not require additional components such as external memory or complex power supplies to implement cost-effective AI inferences.

Key Advantages

  • Ultra-Low Power: Reduces the power budget for AI imaging applications by up to 1,000x
  • Lower Cost and Size: Cuts the cost of AI with integrated components, reduced board space and a compact form factor
  • Low Latency: Reduces the latency of AI inferences by over 100x compared to other embedded solutions

Learn more at Maxim's site.
