By “AI chips”, I mean specialized computer chips designed to accelerate artificial intelligence or machine learning applications. Such chips exist because machine learning involves certain kinds of computations, large matrix multiplications in particular, that are rather slow on standard CPUs. AI chips are optimized for these computations and therefore perform them much faster.
Major chip makers such as AMD, ARM, Intel, NVIDIA, Qualcomm, and Xilinx are working on such AI chips. So are Amazon, Baidu, and Google, for example. Then there are startups like Graphcore, Groq, NeuroBlade, or Syntiant. You can read more about AI chip companies here or here, for instance.
In fact, you can read so much about AI chips that it almost seems like conventional wisdom by now: IF one does machine learning THEN one needs AI chips.
Perhaps not (or not for every scenario). Here is part of a weekly update email that I received from Mergeflow a few weeks ago:
To give you some context: Mergeflow lets you search across VC investments, R&D, news, patents, and other sources. In addition, you can subscribe to weekly updates on your topics. In my example here, the topic is “deep learning”. The screenshot above shows deep-learning-related VC funding rounds that Mergeflow identified during that particular week.
Of particular interest to me was Neural Magic. Neural Magic is an MIT spinoff whose investors include Comcast Ventures, NEA, and Andreessen Horowitz, and whose claim is “GPU-class performance on commodity CPUs. No special hardware required.”
Without going into the technical details here (contact Neural Magic and they will explain their approach to you): yes, this claim is plausible.
What does this mean?
If you are an investor…
I’m not an investor, so I am certainly not trying to give you any advice here. But if I were an investor and somebody were to approach me about funding an AI chip company, I’d probably ask them what they think they can do that Neural Magic cannot do. In other words, my “comparables basket” would now not only include other AI chip companies but also Neural Magic.
If you are a product manager…
If I were a product manager at a big computing services company (Amazon AWS, Google Cloud, Microsoft Azure, or a more ‘vertical’ company such as BigML or Cloudera), I’d probably think a lot about GPUs and other special-purpose hardware, and about how these might help my customers. Now, Neural Magic might just put a whole other option on the table for me and my customers.
If you are a software engineer…
I’m not really a software engineer either, but a cognitive scientist who sometimes writes software. If I were a software engineer, though, I would get in touch with Neural Magic and ask for a demo, to evaluate whether I could run my models on their solution (hint: Neural Magic can handle ML models represented in ONNX).
…my personal takeaway from this: whenever I encounter conventional wisdom, some version of “this is just the way it’s done” or “this will never work”, I try to stay alert and on the lookout for surprises. It may just be that a “Sputnik moment” is lurking somewhere.