Embedded AI - Intelligence at the Deep Edge
“Intelligence at the Deep Edge” is a podcast exploring the intersection of embedded systems and artificial intelligence. Dive into cutting-edge technology as we discuss how AI is revolutionizing edge devices, enabling smarter sensors, efficient machine learning models, and real-time decision-making on-device.
Discover more on Embedded AI (https://medium.com/embedded-ai) — our companion publication where we detail the ideas, projects, and breakthroughs featured on the podcast.
Help support the podcast - https://www.buzzsprout.com/2429696/support
A chip that controls a balancing propeller on seven microwatts
Every battery-powered device you own has a quiet energy hog in it that nobody talks about. It is not the processor, it is not the radio, and it is not the screen. It is the analog-to-digital converter, the small piece of circuitry that translates the messy real world into the clean ones and zeros a computer can think about. For thirty years it has been the line item that decides how long your hearing aid, your pacemaker, or your soil sensor lasts on a battery.
In March 2026, a team at the University of Michigan published a result that quietly removes that converter from the picture for a specific class of problems. Their bismuth selenide memristor runs a closed-loop control task at about seven microwatts, roughly a millionth of what a household LED bulb pulls. The chip does not run code in any conventional sense. The physics does the arithmetic, and the answer drives the motor directly.
In this episode, we walk through what the device actually is, why removing the converter changes the energy budget by orders of magnitude, and which products land first when microwatt-class intelligence becomes buildable. We talk about hearing aids, implants, environmental sensors, and the small drones that have been waiting for this kind of result for a decade. We also talk about what this chip cannot do, because the press releases tend to skip that part. It will not run a language model. It will not recognize your face. It will run the reflexes underneath all of that, and the through-line of the episode is the case that those reflexes matter more than the cortex gets credit for.
If you are interested in learning more, please subscribe to the podcast or head over to https://medium.com/@reefwing, where there is lots more content on AI, IoT, robotics, drones, and development. To support us in bringing you this material, you can buy me a coffee or simply provide feedback. We love feedback!