Hacker News

If I were Intel, I would be going straight for the TPU market. GPUs carry a bunch of baggage from the G = Graphics legacy. The real money maker is not likely to be gamers (although it has been a healthy enough market). The future of those vector processing monsters is going to be ML (and maybe crypto). This is the difference between attempting to leapfrog and trying to catch up.


> The future of those vector processing monsters is going to be ML (and maybe crypto)

That's a heavy bet on ML and crypto(-currency? -graphy?). Has ML, so far, really made any industry-changing inroads in any industry? I'm not entirely discounting the value of ML or crypto, just questioning the premature hype train that exists in tech circles (especially HN).


>That's a heavy bet on ML and crypto

Well, yes, that is the point. My theory is that the gaming market for GPUs is well understood. I don't think there are any lurking surprises in the number of new gamers buying high-end PCs (or mobile devices with hefty graphics capabilities) in the foreseeable future.

However, if one or more of the multitude of new start-ups entering the ML and crypto (-currency) space end up being the next Amazon/Google/Facebook then that would be both unforeseeable and unbelievably transformative. Maybe it won't happen (that is the risk) but my intuition suggests something is going to come out of that work.

I mean, it didn't work out for Sony when they threw a bunch of SPUs in the PS3. They went back to a traditional design for their next two consoles. So not every risk pans out!


> Has ML, so far, really made any industry-changing inroads in any industry?

Does the tech industry not count, or are you only considering industries that are typically slower moving?


The tech industry claims it applies machine learning all over the place, but I doubt it actually moves the needle much.


Is that opinion based on anything firmer than a pessimistic outlook?

Lots of people jump on any trend; that doesn't mean the hype is unjustified or that nobody is "moving the needle" with it. Recommendations (e-commerce, advertising, music, etc.) have been pretty revolutionized by it.


As a personal anecdote, the quality of the recommendations I receive across the board has been roughly inversely proportional to the level of hype around ML in the tech press and academia.

Cf. YouTube, Amazon, and Netflix (products that bet BIG on recommendations) being incapable of recommending compelling material.


> Recommendations (e-commerce, advertising, music, etc) have been pretty revolutionized by it.

I'd like to learn more about how they've been revolutionized by it. Any sources you could share?


ML contributes to a significant fraction of revenue at three of the world's largest companies (Amazon, Google, Facebook - largely through recommendations and ad ranking). It also drives numerous features that other tech companies build into their products to stay competitive (think FaceID on iPhone). Hard to argue that it doesn't move the needle...


A ton. Look at the nearest device around you, chances are it runs Siri, Alexa, Cortana, or Google voice assistant. This will only grow.

Same with machine vision. It's going to be everywhere — not just self-driving trucks (which, unlike cars, are going to be big soon), but also security devices, warehouse automation, etc.

All this is normally run on vector / tensor processors, both in huge datacenters and on local beefy devices (where a stock built-in GPU alongside ARM cores is not cutting it).

This is a growing market with a lot of potential.

Licensing CUDA could be quite a hassle, though. OpenCL is open but less widely used.


> Has ML, so far, really made any industry-changing inroads in any industry?

It is (IIRC) a pretty fundamental part of self driving tech. I honestly think this is what drives a lot of Nvidia's valuation.


nvidia's largest revenue driver, gaming, made 1.4B dollars last year (up 56% YoY). nvidia's second largest, "data center" (AI), made 968M (up 43% YoY). Other revenue was 661M. Up to you whether nvidia's second largest revenue center, at nearly a billion/year, is "industry changing".


> crypto(-currency? -graphy?)

TPUs are massively parallel Float16 engines - not really applicable to anything outside of ML.


> The future of those vector processing monsters is going to be ML (and maybe crypto).

Hopefully some of those cryptocurrencies (until they get proof-of-stake fully worked out) move to memory-hard proof-of-work using Curve25519, Ring Learning With Errors (New Hope), and ChaCha20-Poly1305, so cryptocurrency accelerators can pull double duty as quantum-resistant TLS accelerators.

I'm not necessarily meaning dedicated instructions, but things like vectorized add, xor, and shift/rotate instructions, at least 53 bit x 53 bit -> 106 bit integer multiplies (more likely 64 x 64 -> 128), and other somewhat generic operations that tend to be useful in modern cryptography.
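To illustrate what I mean by "generic operations": the ChaCha20 quarter-round, for example, is built entirely out of 32-bit add, xor, and rotate — exactly the primitives listed above. A quick sketch in Python (the test vector is from RFC 8439, section 2.1.1):

```python
def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def quarter_round(a, b, c, d):
    """ChaCha20 quarter-round: nothing but 32-bit add, xor, and rotate."""
    a = (a + b) & 0xFFFFFFFF; d = rotl32(d ^ a, 16)
    c = (c + d) & 0xFFFFFFFF; b = rotl32(b ^ c, 12)
    a = (a + b) & 0xFFFFFFFF; d = rotl32(d ^ a, 8)
    c = (c + d) & 0xFFFFFFFF; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, section 2.1.1:
out = quarter_round(0x11111111, 0x01020304, 0x9b8d6f43, 0x01234567)
print([hex(w) for w in out])
# -> ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```

Hardware that vectorizes those three operations over many lanes accelerates ChaCha20 whether it's being used for proof-of-work or for TLS.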


This is what they tried to do with Nervana and are trying again to do with Habana.


The one thing I don't get is, there are a lot of machines out there that would gain a lot from specialized search hardware (think about Prolog acceleration engines, but lower level). For a start, every database server (SQL or NoSQL) would benefit.

It is also hardware similar to ML acceleration: it needs better integer and boolean (algebra, not branching) support, and has a stronger focus on memory access (which ML acceleration also needs, but gains less from). So how come nobody even speaks about this?


I don't understand how database servers would benefit. You would have to add the search hardware directly to DRAM for any meaningful gains.


You would need large memory bandwidth and a good set of cache pre-population heuristics (putting it directly on the memory is a way to get the bandwidth).

ML would benefit from both too, as would highly complex graphics and physics simulation. The cache pre-population is probably at odds with low latency graphics.


They have. Their first purchase (Nervana) hasn’t worked out for them so they are now working through their purchase of the more conventional Habana.


Intel did go for the TPU market. It was called the Nervana chip, and they cancelled it.




