An EEMBC Benchmark
This benchmark will be released in Q2'2021; this document will be changing until release.


EEMBC's ULPMark-ML is a machine learning benchmark for measuring the energy cost associated with embedded neural-net inference. Co-developed with MLCommons, this benchmark is the result of a year of collaboration among dozens of leading-edge member companies that comprise both organizations, making it the most accurate survey of embedded machine learning to date.


The benchmark consists of four neural nets for common tasks: image classification, anomaly detection, visual wake words, and keyword spotting. It collects three key metrics:

  • Performance - Inferences per second
  • Energy - Joules per inference
  • Accuracy - Top-1 accuracy and AUC (area under the ROC curve)
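
As a rough sketch (illustrative only, not EEMBC's scoring code; the function names here are hypothetical), the three reported metrics relate to raw measurements from a run like this:

```python
# Illustrative only: how the three reported metrics relate to raw
# measurements collected during a benchmark run.

def performance(inference_count, elapsed_seconds):
    """Performance: inferences per second."""
    return inference_count / elapsed_seconds

def energy_cost(total_joules, inference_count):
    """Energy: joules per inference."""
    return total_joules / inference_count

def top1_accuracy(predictions, labels):
    """Accuracy: fraction of inputs whose top-scoring class is correct."""
    hits = sum(1 for scores, label in zip(predictions, labels)
               if max(range(len(scores)), key=scores.__getitem__) == label)
    return hits / len(labels)

# Example: 500 inferences in 10 s at 0.25 J of total energy.
print(performance(500, 10.0))   # 50.0 inferences/s
print(energy_cost(0.25, 500))   # 0.0005 J (500 uJ) per inference
```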

A benchmark runner GUI facilitates bring-up of the hardware components and automates data collection. For performance and energy, five input files are sent to the device and the median score is recorded; this helps account for data-dependent execution. For accuracy, up to 1,000 inputs may be sent to the device to verify that optimizations have not degraded the model for the sake of speed or low power. Each neural net has a minimum accuracy requirement.
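
The scoring flow above can be sketched as follows (the helper name and error handling are assumptions for illustration; the real runner automates this through its GUI):

```python
# Sketch of the median-of-five scoring with an accuracy gate.
from statistics import median

def score_run(per_file_joules, measured_accuracy, min_accuracy):
    """Take the median energy over the five input files, after checking
    that the model still meets its minimum accuracy requirement."""
    if len(per_file_joules) != 5:
        raise ValueError("the benchmark sends exactly five input files")
    if measured_accuracy < min_accuracy:
        raise RuntimeError("optimizations degraded the model below spec")
    return median(per_file_joules)

# Five per-file energy readings (joules), 87% accuracy, 85% floor:
print(score_run([4.8e-4, 5.1e-4, 5.0e-4, 4.9e-4, 5.3e-4], 0.87, 0.85))
# -> 0.0005
```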

Table 1. Description of neural nets used in the benchmark.

  • Image Classification - A ResNet-8 model trained on CIFAR-10 that performs inference on 10 classes.
  • Visual Wake Words - Based on the TFLite Micro person-detection model, it determines whether an image matches one of two classes: person or no-person. This is useful for devices that perform an action when a person is detected in-frame.
  • Keyword Spotting - Based on Arm's KWS model, this neural net attempts to discern 12 different verbal commands.
  • Anomaly Detection - An autoencoder based on the ADMOS contest that determines whether an audio waveform contains an anomalous noise, used for predicting whether machinery is malfunctioning.
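
The decision each model makes can be illustrated with a toy post-processing sketch (the class list follows the table above; the score values and the anomaly threshold are made-up assumptions, not benchmark values):

```python
# Hypothetical post-processing: turning raw model outputs into decisions.

VWW_CLASSES = ["no-person", "person"]  # the two visual-wake-words classes

def classify(scores, classes):
    """Pick the top-scoring class, as in image classification, visual
    wake words, and keyword spotting."""
    best = max(range(len(scores)), key=scores.__getitem__)
    return classes[best]

def is_anomalous(reconstruction_error, threshold):
    """The anomaly-detection autoencoder flags an anomaly when its
    reconstruction error exceeds a threshold (value assumed here)."""
    return reconstruction_error > threshold

print(classify([0.2, 0.8], VWW_CLASSES))  # person
print(is_anomalous(0.7, threshold=0.5))   # True
```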

Instructions for downloading, porting, and running can be found on the EEMBC GitHub page.

How does it work?

ULPMark-ML uses a framework that, in conjunction with ULPMark-specific firmware, enables sophisticated behavioral benchmarking. The framework electrically isolates the device under test (DUT) by providing power through an energy monitor (EMON) and using a proxy gateway, the IO Manager, to handle I/O. This combination of hardware, firmware, and a Host UI runner program allows for intelligent, scripted flow control that makes the benchmark possible.
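
A host-side view of that flow might look like the following (these interfaces are hypothetical, not the real runner's API; the point is only that the DUT stays electrically isolated while inputs go in through the IO Manager and energy is captured by the EMON):

```python
# Illustrative host-side flow: the Host UI coordinates the EMON (which
# powers the DUT and logs energy) and the IO Manager (which proxies I/O).

class BenchmarkRun:
    def __init__(self, emon, io_manager):
        self.emon = emon   # powers the DUT, records energy consumed
        self.io = io_manager  # proxy gateway for stimulus and results

    def run_one(self, input_blob):
        """Send one input, collect the result and its energy cost."""
        self.emon.start_capture()
        self.io.send(input_blob)           # stimulus to the DUT
        result = self.io.receive()         # inference result back
        joules = self.emon.stop_capture()  # energy used for this input
        return result, joules
```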

Smart Devices and Battery Life

How much energy does it take to make a device smart?

If you're a researcher or an industry analyst, this is the question of the day. If you're an IoT embedded designer, this question keeps you up at night.

The term "smart device" has been a moving target for decades: it appeared in the 1990s and described palmtop computing devices with handwriting recognition; a decade later it came to mean mobile phones that could connect to the internet; and now it applies to devices that use neural nets to understand speech or identify objects in a video without needing to connect to the cloud.

Neural nets are solving complex problems all over the computer science landscape, and for the most part, the biggest ones get the most attention. You've probably heard of OpenAI's GPT-3, a 175-billion-parameter behemoth capable of uncanny human conversation, or AlphaZero, which taught itself to become a chess grandmaster in a mere four hours. But when will these capabilities start to leave the cloud and inhabit the battery-powered IoT gadgets that are populating every crevice of daily life? That answer depends on the energy cost.

Energy cost is the amount of energy required to perform an inference. An inference is the fundamental operation of a neural net. An inference is a guess: a guess that a word is "yes" or "no", that a picture is of a car or a bird, or that a vibration in a pipe is merely a flow change rather than an impending rupture. Different neural nets running on different edge nodes have different energy costs, and the higher the energy cost, the higher the drain on the battery.
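
To make the battery-drain connection concrete, here is some illustrative arithmetic (all numbers are made up for the example, not measurements):

```python
# Illustrative arithmetic: how energy cost per inference translates into
# battery life for an always-on device. Numbers are assumptions.

BATTERY_MWH = 1000                 # e.g. a small LiPo cell: 1000 mWh
BATTERY_JOULES = BATTERY_MWH * 3.6 # 1 mWh = 3.6 J

def days_of_inference(joules_per_inference, inferences_per_second):
    """Days until the battery is spent on inference alone."""
    joules_per_day = joules_per_inference * inferences_per_second * 86400
    return BATTERY_JOULES / joules_per_day

# At 500 uJ per inference, one inference per second:
print(round(days_of_inference(500e-6, 1), 1))  # 83.3 days
```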

If you're an IoT engineer, you may be asking yourself:

  • Can I add object detection to this wireless security camera so that shadows don't set off the alarm, but not have to charge its batteries every other day? or,
  • What's the smallest core I need to enable keyword spotting in this smart headset? or,
  • How can I figure out which inference accelerator fits my design's power budget?

Since the number one challenge to smartening up IoT devices with neural nets is battery life, engineers need to optimize for energy cost. This means right-sizing their design by picking the appropriate hardware for a particular neural net. ULPMark-ML is instrumental in evaluating these tradeoffs.

Copyright © EEMBC
