The company claims the Ergo 2 is up to four times faster than Perceive’s first-generation Ergo chip and can handle larger models, including those used for natural language processing (NLP).
Edge AI is coming into its own, with a variety of chips launching that deliver low cost, low power, and high performance. While training AI models gets the most attention in the media, inference processing will end up generating the most revenue, especially at the edge. Global Market Insights, a reputable market intelligence firm, predicts that the edge AI market will exceed $5 billion in 2023, growing at a compound annual growth rate of 20% over the next decade. We suspect the $5 billion figure is too high, and the 20% growth forecast too low.
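As a rough sanity check on what those figures imply (our own back-of-envelope arithmetic, not Global Market Insights’ model; the ten-year horizon is simply taken from the forecast period above), a $5 billion base compounding at 20% per year works out to roughly $31 billion after a decade:

```python
# Back-of-envelope projection: $5B base compounding at a 20% CAGR.
# Inputs mirror the cited forecast; the horizon is illustrative.
base_2023 = 5e9        # estimated 2023 edge AI market, in US dollars
cagr = 0.20            # forecast compound annual growth rate
years = 10             # projection horizon

projected = base_2023 * (1 + cagr) ** years
print(f"Implied market size after {years} years: ${projected / 1e9:.1f}B")  # ~$31.0B
```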
In any case, the market is attracting many competitors, including Perceive, which spun off from Xperi Corporation in 2018 to focus on this opportunity and already has its second product ready for market.
What did Perceive announce?
The company isn’t replacing its Ergo product; rather, it is adding a higher-performance, more capable chip for high-end applications. As the table below shows, the new device provides a significant boost in image processing performance while consuming less than 20 milliwatts (a milliwatt is 1/1000 of a watt). We don’t know of any competitor that can claim that while still delivering around 1,000 inferences per second.
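To put those numbers in perspective (our own rough calculation, assuming the chip actually sustains about 1,000 inferences per second within the sub-20-milliwatt envelope), that works out to on the order of 20 microjoules per inference:

```python
# Rough energy-per-inference estimate from the figures above.
# 20 mW and 1,000 inferences/second are the claimed ceiling and throughput;
# real workloads will vary with model size and duty cycle.
power_watts = 0.020           # < 20 mW claimed power envelope
inferences_per_second = 1000  # ~1,000 inferences per second

energy_per_inference_j = power_watts / inferences_per_second
print(f"~{energy_per_inference_j * 1e6:.0f} microjoules per inference")  # ~20 µJ
```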
The new Ergo 2 offers four to five times the image processing performance of its predecessor …
There are several companies readying or shipping edge AI chips, including SiMa.ai, Hailo Technologies, AlphaICs, Recogni, EdgeCortix, Flex Logix, Roviero, BrainChip, Syntiant, Untether AI, Expedera, Deep AI, Andes, and Plumerai, as well as Intel, AMD (Xilinx) and, of course, NVIDIA. Some, like NVIDIA and SiMa.ai, are going the SoC route, with the chip offering a more complete solution that includes Arm or RISC-V CPU cores and I/O.
Both Perceive products demonstrate excellent energy efficiency, but we note that their competitors do not …
In contrast, Perceive (and others like Hailo) have focused on customers looking for an AI accelerator that attaches to an SoC for a specific application. Interestingly, the Ergo chip does not require external DRAM, although it does support a connection to NOR flash to hold the weights of larger models. This can be a cost advantage for applications such as speech-to-text conversion, audio processing, and video processing tasks such as ultra-high-resolution video and mode detection. Compared to existing products such as Hailo’s Hailo-8 accelerator, the Ergo aims for far lower power (tens of milliwatts versus 2-4 watts for the Hailo-8), albeit at lower performance.
The Perceive Ergo is designed to plug into a device’s application processor (SoC).
Conclusions
As we’ve always said, penetrating the edge market is much easier than penetrating the data center, because different edge applications have vastly different requirements. While an image processor for self-driving vehicles requires higher performance at higher power levels, an embedded smart camera or text processor needs less power and costs less. And many automakers prefer off-the-shelf SoCs like NVIDIA Drive over designing their own; Tesla is the exception to the rule.
Thus, there’s plenty of room for application-specific low-power processors like Perceive’s, and the company is smart to quickly extend its first foray with a faster processor at sub-watt power. Software will be key to its success, as it will enable larger models to run efficiently on the new Ergo 2.
We will note that many competitors are entering this market, so check back soon to stay up to date!
Disclosures: This article expresses the author’s opinion and should not be taken as advice to purchase from or invest in the companies mentioned. My firm, Cambrian-AI Research, is fortunate to have many semiconductor companies as clients, including NVIDIA, Intel, IBM, Qualcomm, Esperanto, Graphcore, SiMa.ai, Synopsys, Cerebras Systems, Tenstorrent, and Ventana Microsystems. We have no investment positions in any of the companies mentioned in this article. For more information, please visit our website at https://cambrian-AI.com.