Summary
Overview
This invention targets edge AI systems that must minimize energy per inference while still satisfying latency constraints.
Abstract
A controller running on an ARM-based system predicts latency and energy for each candidate execution policy and selects the policy that minimizes predicted energy while meeting the current latency constraint, given the present request and hardware conditions. Policy dimensions include SRAM residency, no-waste DMA prefetch, precompiled graph-variant selection, and performance-state choices such as DVFS; runtime telemetry continuously refines the predictive models.
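The selection loop described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the policy dimension values, the `Policy` class, and the closed-form `predict` model are all hypothetical stand-ins for the telemetry-fitted predictors the abstract describes. The controller enumerates candidate policies, predicts latency and energy for each, discards those that miss the latency budget, and keeps the lowest-energy survivor.

```python
from dataclasses import dataclass
from itertools import product

# Hypothetical values for the policy dimensions named in the abstract.
SRAM_RESIDENCY = ("weights_resident", "streaming")
DMA_PREFETCH = ("prefetch_on", "prefetch_off")
GRAPH_VARIANT = ("int8_fused", "fp16_unfused")
DVFS_STATE = ("low", "nominal", "boost")


@dataclass(frozen=True)
class Policy:
    sram: str
    dma: str
    graph: str
    dvfs: str


def predict(policy: Policy, batch_size: int) -> tuple[float, float]:
    """Stand-in predictive model: returns (latency_ms, energy_mJ).

    A real controller would use models fitted to runtime telemetry; these
    closed-form multipliers exist only to make the selection loop runnable.
    """
    lat, eng = 10.0 * batch_size, 5.0 * batch_size
    if policy.sram == "weights_resident":   # on-chip weights: faster, slightly cheaper
        lat, eng = lat * 0.7, eng * 0.9
    if policy.dma == "prefetch_on":         # prefetch hides transfer latency, costs energy
        lat, eng = lat * 0.85, eng * 1.05
    if policy.graph == "int8_fused":        # quantized fused graph: faster and cheaper
        lat, eng = lat * 0.6, eng * 0.5
    if policy.dvfs == "boost":              # higher clock: lower latency, higher energy
        lat, eng = lat * 0.5, eng * 1.8
    elif policy.dvfs == "low":
        lat, eng = lat * 1.6, eng * 0.6
    return lat, eng


def select_policy(batch_size: int, latency_budget_ms: float):
    """Return (policy, energy, latency) minimizing predicted energy
    subject to the latency budget, or None if no policy is feasible."""
    best = None
    for dims in product(SRAM_RESIDENCY, DMA_PREFETCH, GRAPH_VARIANT, DVFS_STATE):
        policy = Policy(*dims)
        lat, eng = predict(policy, batch_size)
        if lat <= latency_budget_ms and (best is None or eng < best[1]):
            best = (policy, eng, lat)
    return best
```

With a generous budget the selector drifts toward low-power settings (e.g. the low DVFS state); as the budget tightens, it trades energy for speed until no policy fits, at which point it reports infeasibility rather than violating the constraint.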