In any computationally intensive discipline, there are usually several complementary dynamics driving it forward. General-purpose computing power itself moves fast and is one of the best examples of an exponential field. Beyond that, the creation of specialist computing devices will drive massive gains in power and productivity.
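To make the compounding concrete, here is a quick back-of-envelope sketch. The doubling periods are illustrative assumptions, not measurements: a two-year doubling cadence (roughly the classic Moore's-law figure) compounds to about 32x over a decade, while a hypothetical one-year cadence for specialist silicon compounds to roughly 1,000x over the same period.

```python
# Illustrative back-of-envelope arithmetic (assumed parameters, not measured data):
# how capability compounds under a fixed doubling period.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Return the multiplicative gain after `years` given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    # Classic Moore's-law cadence: doubling every ~2 years -> ~32x per decade.
    print(f"2-year doubling: {growth_factor(10, 2.0):.0f}x over a decade")
    # Hypothetical faster cadence for specialist silicon: doubling every year -> ~1000x.
    print(f"1-year doubling: {growth_factor(10, 1.0):.0f}x over a decade")
```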
At Google I/O, Google announced the second-generation Tensor Processing Units (TPUs). These dedicated chips massively accelerate machine learning workloads and compete directly with Nvidia, which is now also building dedicated silicon for machine learning.
This exact process played out in Bitcoin mining, which moved from general-purpose CPUs, to graphics processors, and finally to chips (ASICs) built from the ground up to do nothing but mine Bitcoin. The results were explosive.
The message here is that machine-learning capabilities are going to outpace even the already-frantic pace of Moore's law. I can't wait to see what happens …