News & Analysis

Synaptics’ Wedge in Edge AI is 'Astra'

Synaptics wants to incorporate AI in a host of easy, ubiquitous, and everyday IoT devices.
Synaptics AI Platform (Source: Synaptics)

By Junko Yoshida

What’s at stake:
Synaptics is joining the edge AI club with a platform called “Astra” – a glimpse of which the company teased out this week. Synaptics is banking on Astra to win a crowded, fragmented, and highly competitive edge AI segment. Can it pull it off?

Synaptics developed the Astra platform to enable system designers to make AI as easy, ubiquitous, and quotidian as possible in a host of products, including some not typically viewed as needing AI.

Astra consists of software (Linux, Android and RTOS), AI frameworks (AI models and model-creation tools) and development kits (for compute and connectivity modules). All will be applied to Synaptics’ embedded processors, ranging from high-performance MPUs and MCUs to connectivity MCUs.

Given that practically every player in the MCU/IoT chip space is already touting AI + IoT product offerings, the edge AI market won’t be an easy ring in which to pick a fight, much less win.

Vikram Gupta, senior vice president and general manager of IoT Processors at Synaptics, disagrees.

First, Synaptics’ strength lies in its numerous sensing technologies – audio, video, voice, touch, wireless – all critical for capturing the data AI needs. Second, Synaptics provides a broad range of compute capabilities, including high-performance MPUs – an area in which traditional MCU suppliers’ grip is tenuous. Third, Synaptics possesses a robust wireless/connectivity technology portfolio acquired from Broadcom.

Altogether, Gupta believes Synaptics has a credible shot at cornering what one might call the “Edge AI 2.0” market, where AI is no longer an afterthought in IoT products.

AI is disrupting the entire IoT market, observed Gupta. Ten or even five years ago, AI might have been considered a novel feature in connected devices, or a cool thing to have maybe in the future. “Today,” he said, “no product manager would come in and tell us that I am developing a product that has nothing to do with AI.”

How the market is “thinking about this whole AI thing” alters how customers select processors, explained Gupta. “Once people get latched onto some processing element, they want to stick with them because of all the software investment that comes with them.”

Starts with high-performance MPUs
Whereas competitors – especially MCU vendors – tend to bolt AI accelerators onto their popular MCUs to enable AI on the edge, Synaptics believes it can set itself apart by starting its own AI/IoT journey from its high-performance MPUs.

Synaptics’ highest-end MPU, for example, is already loaded with enough compute power to “take AI model and optimize it and make it run on our MPU,” claimed Gupta, enabling AI-native compute. Moreover, those higher-end MPUs have neural processing units (NPUs) built into their silicon. “In some cases, when we don’t have an NPU, we have a GPU,” he added.

Inside look at Synaptics’ high-performance MPU (Source: Synaptics)

While maximizing tera operations per second (TOPS) on the MPU is important, Gupta stressed, “We’re not just going for raw speed. We are also looking at inferences per second as a more measurable metric.” In the end, what counts is “the entire system and how fast you’re making decisions,” he noted.
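Gupta’s distinction between raw TOPS and inferences per second can be sketched with some back-of-the-envelope arithmetic. The numbers and the `inferences_per_second` helper below are illustrative assumptions, not Synaptics specifications:

```python
# Illustrative math (assumed figures, not Synaptics specs): peak TOPS alone
# does not determine inferences per second; the model's compute cost and the
# utilization the software toolchain actually achieves matter just as much.

def inferences_per_second(peak_tops, ops_per_inference_g, utilization):
    """Estimate inferences/s from peak TOPS, GOPs per inference, and utilization."""
    achieved_ops_per_s = peak_tops * 1e12 * utilization
    return achieved_ops_per_s / (ops_per_inference_g * 1e9)

# A nominally faster NPU (4 TOPS) with poor utilization can be outrun by a
# slower one (2 TOPS) that better tooling keeps well fed with data.
fast_but_idle = inferences_per_second(peak_tops=4.0, ops_per_inference_g=10.0, utilization=0.2)
slow_but_busy = inferences_per_second(peak_tops=2.0, ops_per_inference_g=10.0, utilization=0.6)

print(round(fast_but_idle))  # 80 inferences/s
print(round(slow_but_busy))  # 120 inferences/s
```

The point mirrors Gupta’s: system-level decision speed, not peak silicon throughput, is what the customer ultimately experiences.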

With that in mind, Synaptics is investing heavily in software tooling. The goal, said Gupta, is to give customers “the right kind of tooling” so they can take models and have them optimized and targeted for the AI engines they have.

We are on a 12nm process node
Synaptics’ differentiating factor is that its MPUs are on a 12nm node, which Gupta described as a “sweet spot, from a silicon, process and IP perspective.”

Because AI pushes the need for more compute and more memory, Gupta declared, “Being in the 12nm sweet spot, we are better off than our competitors for the next few years from an IoT perspective and from a silicon standpoint.”

In summary, Synaptics cites three big trends changing the market trajectory for AI/IoT in its favor.

First, AI is infiltrating every product. “AI has been very disruptive to the IoT market,” said Gupta. Synaptics is keen on serving not only tier-one developers of AI products, but also “tier twos,” according to Gupta.

Second, IoT system designers need a good set of AI tools. At Synaptics, “We’ve got tooling geared towards making AI experience frictionless and easier. We are trying to say to our customers, ‘Come to us, if you want to solve an AI problem in your products.’”

Third, geopolitics. Gupta said he was surprised by some customers who say that “we want to go to U.S.-based suppliers.” Additionally, customers are looking for an AI/IoT solution that’s future-proof. “They want to have the right amount of processing and the right amount of memory, even though, in many cases, they don’t know what they might need it for… But the AI space is evolving so fast that they are hedging to actually go for more powerful processing elements.”

Bottom line:
Synaptics plans to release its MPUs, MCUs and connectivity MCUs in phases, all supported by the Astra platform. With the exact rollout schedule and specifications of products still under wraps, it’s hard to say how competitive Synaptics will be in a crowded edge-AI market. But if Astra is executed properly and applied to all product lines as promised, Synaptics’ chances of leading the high-end MPU-based AI/IoT segment improve.

Junko Yoshida is the editor in chief of The Ojo-Yoshida Report. She can be reached at

Copyright permission/reprint service of a full Ojo-Yoshida Report story is available for promotional use on your website, marketing materials and social media promotions. Please send us an email at for details.
