For those of you who have attended any of my conference sessions over the past six months or so, I’ve been including a segment where I switch to a math demo and handwrite some equations on the screen to try to make what I believe are some really important points. And, of course, who doesn’t like to do math?
Well, OK, fine – so rather a large number of people don’t like to do math, which is why I felt it necessary to try to make those points.
But I think it’s important to start to get our heads around this, as we enter into an era of probabilistic computing.
Perfection has always been expensive. As technology platforms begin to move more quickly, can we afford to keep pursuing perfection? Does it even make mathematical (or business) sense to try? In server data centers, we see that going with commodity servers that can and do fail all the time leads to the right economics. There is research suggesting that the same benefits could be had by applying probability rather than perfection to chip design. And, yes, tolerating and addressing failure is the most sensible way to approach application compatibility and platform agility.
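To make the commodity-server argument concrete, here is a minimal sketch of the underlying arithmetic. The numbers (99.99% uptime for a "perfect" server, 99% for a cheap one, three-way redundancy) are hypothetical, chosen only to illustrate the shape of the trade-off, and the independence assumption is an idealization:

```python
from math import comb

def availability(n: int, k: int, p_up: float) -> float:
    # Probability that at least k of n independent servers are up,
    # assuming each is up with probability p_up (binomial model).
    return sum(comb(n, i) * p_up**i * (1 - p_up)**(n - i)
               for i in range(k, n + 1))

# Hypothetical comparison: one premium server at 99.99% uptime
# versus three commodity servers at 99% uptime, where any single
# surviving server is enough to keep the service running.
premium = availability(1, 1, 0.9999)      # 0.9999
commodity = availability(3, 1, 0.99)      # 1 - 0.01**3 = 0.999999

print(f"premium  : {premium:.6f}")
print(f"commodity: {commodity:.6f}")
```

Under these assumed numbers, three unreliable machines deliver two more "nines" than the perfect one, which is why tolerating failure, rather than preventing it, wins economically.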
Here’s the article: