Are DSPs Dead ?
Former Texas Instruments Sr. Fellow Gene Frantz and former TI Fellow Alan Gatherer wrote a 2017 IEEE article about the "death and rebirth" of DSP as a discipline, explaining that signal processing now provides indispensable building blocks in widely popular and lucrative areas such as data science and machine learning. The article implies that DSP will now be taught in university engineering programs like its linear systems and electromagnetics predecessors: necessary background, but no longer a popular career path. If anyone has reliable insight, it's Gene, known as "the father of DSP" (those of us who started in signal processing in the 1980s knew that long before Google).
But DSP as a discipline and DSP chips as a technology are not the same thing. What Gene and Alan don't cover is what happened to TI. Three years later, I think we can answer two questions about DSPs: why TI is not an industry leader in the AI and 5G CPU and SoC markets, and why TI's top-end DSPs, which were actually advanced multicore CPUs, are no longer at the height of their popularity.
In 2014 Texas Instruments had a multi-billion dollar opportunity to transition into HPC and AI, the latter for both real-time inference and training. These technologies rely heavily on calculation intensive devices, as does signal processing. By that time, TI's most advanced devices were multicore CPUs, each core roughly equivalent to an Intel Xeon E5-2660 for calculation intensive processing. Third-party hardware and software were already available, including PCIe cards with 64 c66x cores, dual GbE ports, Linux drivers with support for concurrent users (even VMs), and neural net based c66x speech and image recognition software -- all fully operational. These combined hardware + software solutions had significant competitive advantages vs. GPUs, including advanced SIMD, mixed precision, lower PCIe latency, concurrent threads (e.g. multiuser), direct I/O (physically on the card), extremely high per-thread throughput for matrix and convolution operations, very robust Linux command line tools, and far lower SWaP. At that point in time, TI was neck-and-neck with Nvidia and FPGA vendors in the eyes of calculation intensive customers. The future market for AI processing was up for grabs.
The decision required TI executives to fully engage and promote their solutions in the server market, because without server cards and an HPC + AI roadmap they -- and their third parties -- had no business case. What led TI to this crossroads ? As with other business decision inflection points that in retrospect seem inexplicable, you have to factor in corporate history and culture. TI always had an aversion to servers, starting with PCs in the 1990s. They focused on embedded products; their mantra was "we're a component company, we don't make systems".

Around 2010 that philosophy needed an upgrade. The world's embedded product engineers were inexorably moving towards development on servers, deciding only later which embedded SoCs and cores to port their products to. This gave developers both flexibility and some degree of "future proofing", as their codebase stayed on servers, maintained, debugged, and rigorously tested in private or public clouds. This was not only a technology shift but also a generational one, as millennial engineers were increasingly trained in software engineering. Some embedded chip vendors recognized this shift and were willing to hold their customers' hands as they walked the path from servers to embedded SoCs.

Fast forward to today: both AI and 5G applications demand, without exception, "edge native" and open source support to make development and data flow seamless and transparent with cloud native applications. As one example, major carriers are now deploying "edge servers" to support massive wireless data with lower latency. AT&T, Google, et al. will deploy hundreds of thousands of such servers to handle the combination of 5G and AI. That means that even TI's former market stronghold for advanced DSPs -- wireless basestations (surprise ! does that sound like 5G ?) -- remains a potential market for them, but only if they offer server solutions. A "DSP inside" philosophy, i.e. one transparent to the enclosing form factor or platform, would have served TI well. It would also have helped to label their devices something like coCPUs, to emphasize both their processing potency and their coexistence with general purpose x86 devices inside servers.
Unfortunately it didn't happen. Now the world is server first and TI still treats servers like Voldemort -- "the box that must not be named" -- choosing instead to continue promoting eval boards, JTAG emulators, and Windows IDEs. It's a stubborn, woefully outdated strategy. TI's inability to transition has not been lost on their formerly loyal, dedicated customer base. Starting in 2016 I watched firsthand as customers reluctantly abandoned their TI multicore CPU projects one by one, as they lost faith TI would ever make the jump to server-centric solutions. Looking back now, that was the end of the line for thriving commercial DSPs and a third-party ecosystem.
Do DSPs Have a Comeback Path ?
In 2016 TI halted development and the future roadmap for its multicore CPU product line and purged most of its world-leading, DSP-knowledgeable employees, many of whom landed at big tech outfits working on AI and 5G. For the last three years, I've been amazed at how many former TI engineers and managers from Dallas and Houston I meet at AI related conferences and tradeshows in the Bay Area. They are doing fine, but TI faces endless and intractable challenges due to the breakdown of globalization and a Washington leadership asking why the US is no longer leading in hardware and semiconductors, instead ceding the advantage to China.
TI does continue to offer effective DSP based SoC solutions in automotive electronics and other niche areas, but unless they rediscover executive boldness, competitive hunger, and a vision to embrace servers, it's hard to see a comeback path. Embracing servers would also mean embracing high performance PCIe cards, DPDK, VM and container support, and a wide range of open source groups. In addition, TI would need to commit to careful software engineering (i.e. not offshoring it) and extensive R&D in AI and 5G. The list is long and difficult, and it would take driven and relentless leadership -- in the mold of Amazon or Tesla -- to make it happen. But whoever figures out server compatible processing chips for AI will earn those billions.
Comments
When it comes to DSP, you can have a "deluxe" chip that does it all, or you can go the "lite" path. For example, Analog Devices now offer an RF ADC with a programmable divider, providing I/Q data and a basic downconverter in a single chip.
Surely TI could do similar? Adding basic CIC / FIR / IIR elements to their data converters ? The more functionality you pack into interface chips, the lower the workload on the processing core. Lowering data transfer rates also eases PCB design constraints, leading to more robust systems. And no matter how strong the server market is, there should always be a market for low power, reliable data interfaces.
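To make the "lite" idea concrete, here is a rough sketch in plain C of the kind of CIC decimator that could live in the converter's digital front end instead of on the host processor. The order and decimation ratio are placeholder values I picked for illustration, and fixed-point scaling and the usual compensation FIR are left out.

/* Sketch of a 3-stage CIC decimator (decimate by 8) -- illustrative values only */
#include <stdint.h>

#define CIC_ORDER 3   /* integrator/comb pairs (assumed) */
#define CIC_DECIM 8   /* decimation ratio (assumed) */

typedef struct {
    int64_t integ[CIC_ORDER];   /* integrator accumulators, run at the ADC rate */
    int64_t comb[CIC_ORDER];    /* comb delay elements, run at the decimated rate */
} cic_t;

/* Consume CIC_DECIM input samples, produce one decimated output sample. */
static int64_t cic_decimate(cic_t *s, const int16_t *in)
{
    int64_t x = 0;

    for (int n = 0; n < CIC_DECIM; n++) {        /* integrator section, high rate */
        x = in[n];
        for (int k = 0; k < CIC_ORDER; k++) {
            s->integ[k] += x;
            x = s->integ[k];
        }
    }
    for (int k = 0; k < CIC_ORDER; k++) {        /* comb section, low rate */
        int64_t y = x - s->comb[k];
        s->comb[k] = x;
        x = y;
    }
    return x;   /* DC gain is R^N (512 here); scale or truncate downstream */
}

Even something this simple, sitting next to the converter, cuts the data rate into the processing core by 8x before a single DSP cycle is spent.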
Alan-
They could, but lite RF DSPs are a small market. TI is increasingly an analog company and they continue to lead Analog Devices in analog market metrics by a large margin. If AD has a hot product I'm sure TI would match them.
Now if we're talking top 10 overall semiconductor companies, that's where TI had their opportunity of a generation. They were well prepared for it, as DSP is adjacent to AI in a calculation intensive sense. In recent years TI has slipped from #4 to #8. They could drop out of the top 10 if they don't get with the program on AI and 5G. For example, Nvidia is now #10 and will soon overtake TI due to their many years of investment in AI. Also, TI is now competing with semiconductor wannabes in China, due to decoupling and the general unwinding of globalization. We can be sure guys in China are building super dense multicore CPUs (they even call them DSPs), like this one:
https://www.nextplatform.com/2019/05/02/china-fles...
-Jeff
I would like to thank you, Jeff, for this long awaited article. It answered a sad question that has haunted me for a long time: "Where did the DSP chips go?"
However, I would like to hear your thoughts about a potential "backdoor" for the return of DSPs: the IoT market. It is expected to thrive just as much as the AI & 5G markets. In fact, in my opinion at least, IoT will provide the massive data, carried by 5G networks, to be processed by AI algorithms.
Furthermore, what about the incarnation of DSPs as IP inside processors like ARM and its kind, rather than as hard processors? Do you think this will contribute to the return of DSP?
Sami-
As far as I can tell, TI, Analog Devices, and others continue to make a strong DSP effort in the IoT, or "far edge", market. So what you see there is what you can continue to expect: various types of ARM based SoCs with DSP cores included depending on the application, and in some cases DSPs without ARM cores where the combination of low power and heavy calculation warrants it (such a device is what Alan asked about). To answer your question, that's probably the future for DSP chips, and there will continue to be a slow decline.
But you're right, AI and 5G applications should be a natural driver of DSP chip technology, and the IoT market would appear to be a natural transition from the embedded systems of yesterday.
As I pointed out in the article, AI and 5G application developers demand a well-defined path from servers to IoT devices. For the advanced DSP solutions TI still promotes (such as ADAS, which combines ARM and c66x cores), TI seems to think they can tell AI developers "once you get it working on x86 servers, then you can use our closed source 'translation tool' to port your app to our SoCs". That isn't going to fly -- I've actually seen Google and AWS guys laugh when that comes up at customer meetings and conferences. Telling developers to spend six months building their complex, data intensive app on servers and then go break out the JTAG emulator just doesn't cut it. TI needs to give developers true server solutions and build up their credibility inside open source groups. Every c66x multicore and SoC solution they offer should have a PCIe card that developers can drop into Linux x86 servers and integrate into their development and test efforts right from the start. Without this approach, over time they will lose out in the IoT market to Nvidia and other outfits who continue to reduce power consumption, increase calculation cores (e.g. Tensor cores), and build their presence in servers and open source. TI's advantages in power consumption, small package size, and calculation intensive CPU architecture can only last so long.
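To illustrate what a true server-first workflow looks like at the code level, here is a minimal single-source sketch: the same kernel builds and runs on a Linux x86 box for development and regression testing, with an optional path that uses a C6x intrinsic when rebuilt for the SoC. The USE_C66X_INTRINSICS guard and the build setup are assumptions I've made for illustration, not TI's documented flow.

/* Single-source Q15 dot product: develop and unit test on x86, rebuild for the DSP later. */
#include <stdint.h>

#if defined(USE_C66X_INTRINSICS)
#include <c6x.h>                              /* TI toolchain header exposing _dotp2() */
#endif

int32_t dotp_q15(const int16_t *a, const int16_t *b, int n)   /* n assumed even */
{
    int32_t acc = 0;
#if defined(USE_C66X_INTRINSICS)
    const int32_t *pa = (const int32_t *)a;   /* two packed 16-bit values per word; assumes alignment */
    const int32_t *pb = (const int32_t *)b;
    for (int i = 0; i < n / 2; i++)
        acc += _dotp2(pa[i], pb[i]);          /* two 16x16 MACs per call on c66x */
#else
    for (int i = 0; i < n; i++)               /* portable reference path */
        acc += (int32_t)a[i] * b[i];
#endif
    return acc;
}

The kernel itself is trivial; the point is that the reference path lets an app be built, profiled, and regression tested on servers from day one, with the DSP-specific path swapped in at the end instead of forcing a JTAG-and-eval-board detour up front.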
-Jeff
Jeff -
Thank you for the great article!
I sincerely think you nailed the issues with TI's leadership and culture. If I am not mistaken, they had a chance to dominate the game console market in the 1980s, or at least become a major player against Atari, Commodore, Coleco, Intellivision, etc. But the CEO at the time made a fatal financial decision by moving the plant elsewhere, which cost TI millions unnecessarily -- that set them back big time!
From my personal experience, I enjoyed working with embedded TI DSPs, especially the C55xx series. As much as I want to keep working with that embedded tool chain, most of my projects require implementing DSP in software, in combination with FPGA or FPRF optimizations. Basically, as you mentioned, all the collected I/Q data and other DSP related data is continually recorded on multicore servers with fast indexing and retrieval.
I wish TI leadership would listen to people like you who have insight into these matters.
The SDR Monkey
SDR Monkey-
TI is an exceptionally well run, solid outfit. Any product they put out exceeds the data sheet specs by 25%. We used to run Advantech 64-core c66x PCIe cards at 1.6 GHz when they were rated for 1.2 GHz. No issues -- that's amazing.
Normally I'd give them maximum credit for being cautious and sticking to a conservative game plan, but since 2015 everyone has been able to see AI coming down the tracks. Google acquired DeepMind in 2014 for a reported $500M. If that's not a harbinger of calculation intensive demand, I don't know what is, and calculation intensive processing is a major reason for TI's existence. There is no excuse for TI execs to have missed that train.
You are right in the sense that they could dominate any major semiconductor product area they put their mind to. Maybe TI could not foresee Trump and "de-globalization", but those are now major issues too. I hope somebody in Washington DC gives them a kick and says "you know what, if you want to get off that Huawei entity list, you need to regain your leadership position in today's key technologies, including AI and 5G".
They are a historically great American company. They need to back up their reputation and take some initiative.
-Jeff
Good summary, Jeff. Sadly, I think TI lost their way a decade before 2014.
I agree with your comment about TI and analog. This is where the higher margins are so it made commercial sense for them.
Back to DSP - Unlike Gene and Alan I haven't seen the DSP market go away, at all. It just moved and, unfortunately, TI didn't.
Wireless was TI's strength, but it is now just one of many markets for DSP devices and probably not even the largest.
These days, the vast majority of DSP applications can be implemented quite happily using any number of ARM based devices that include one of the many ARM DSP extensions.
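For instance, a block FIR on a Cortex-M or Cortex-A part is only a few lines with ARM's CMSIS-DSP library. This is a sketch with placeholder tap count, block size, and coefficients rather than a designed filter:

/* Block FIR using CMSIS-DSP (float32) -- placeholder parameters, sketch only */
#include "arm_math.h"

#define NUM_TAPS   16
#define BLOCK_SIZE 64

static float32_t fir_coeffs[NUM_TAPS];                    /* fill in from your filter design */
static float32_t fir_state[NUM_TAPS + BLOCK_SIZE - 1];    /* state size required by the library */
static arm_fir_instance_f32 S;

void fir_setup(void)
{
    arm_fir_init_f32(&S, NUM_TAPS, fir_coeffs, fir_state, BLOCK_SIZE);
}

void fir_process(float32_t *in, float32_t *out)
{
    arm_fir_f32(&S, in, out, BLOCK_SIZE);   /* uses the core's SIMD/MAC extensions where available */
}

That covers a surprising share of what used to need a dedicated DSP chip.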
Cheers, John
John-
I agree TI has focused on high margins, efficiency, and stock price -- and they are outstanding at this, just like other top American multinationals. But my concern is they're not doing what they are capable of; they are squandering their potential, both for themselves and for the US. As I mentioned in another comment, maybe TI could not foresee "de-globalization" and AI + 5G competition with China, but now we really need TI, and where are they ?
IMHO, they don't deserve to get an entity list waiver to sell to Chinese companies unless they are competing in areas crucial to our future, and high margin analog chips are not that.
In any case it's good to hear your view that the DSP application market remains strong. I assume you're referring to embedded systems and IoT (e.g. automotive, RF, control systems, etc.), not servers. Is that mostly "low core count", for example one ARM core + 1-4 DSP cores ? Do you have some numbers or links ? I'm always looking for compelling use cases for multicore-intensive DSP beyond AI, 5G, and proprietary mobile phone SoCs.
-Jeff