The next great leap of computing
Changes this profound have not been seen since microcomputers supplanted minicomputers and mainframes.
In my last post, I spoke at length about John Backus’s Turing Award lecture and his critique of the von Neumann architecture. Today, I’m going to centre something I touched on there that I believe is of fundamental importance:
This framing allows me to break out of existing economic paradigms that normally define and limit programming as we have come to know it in business. I can avoid wasting time (and therefore money) in ways that no one else can.
By more strongly separating data from code in how computing is modelled (and therefore implemented), we can gain anywhere from one to several orders of magnitude in the energy efficiency of our computations.
I am not the first to make this observation. REX Computing, for instance, makes it the backbone of their business model:
Moving 64 bits from memory takes over 40x more energy than the actual double precision floating point operation being performed with that data.
REX Computing is rethinking the traditional hardware managed cache hierarchy, and in removing unnecessary complexity, is able to significantly reduce power consumption and total area.
Double-precision floating point operations are among the most energy-intensive computations one can perform on data. How much cheaper is integer multiplication? Or addition? Or bitwise operations? We’re potentially looking at hundredths or thousandths of the cost, compared to the energy it took to move that data into a place where it could be operated on.
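To put rough numbers on that, here is a small sketch. The per-operation energies below are my own ballpark assumptions, of the kind quoted in hardware-efficiency talks, not measurements; the exact values shift with process node and memory technology, but the ratios are the point.

```python
# Ballpark per-operation energies in picojoules. Illustrative assumptions
# only; real figures vary with process node and memory technology.
ENERGY_PJ = {
    "64-bit read from DRAM": 1300.0,   # moving one double in from main memory
    "double-precision FLOP":   20.0,   # the arithmetic itself
    "32-bit integer add":       0.1,
    "64-bit bitwise AND":       0.05,
}

dram = ENERGY_PJ["64-bit read from DRAM"]
for op, pj in ENERGY_PJ.items():
    # How many of these operations fit into the energy budget of one DRAM read?
    print(f"{op:>24}: {pj:8.2f} pJ  ~{dram / pj:8.0f} per DRAM read")
```

Run it and the data movement dwarfs everything else: the floating point operation is a rounding error next to the fetch, and the integer and bitwise operations are rounding errors next to that.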
This cost is a general consequence of the von Neumann architecture and its archaic insistence on shuttling every word of data across a narrow channel between memory and the processor. This is what John Backus was working on before he retired. He questioned whether we could break free of the von Neumann style, and I’m here to tell you it is an inevitability.
There are fundamental forces at work here that underpin the very nature of computing as we know it. These forces dictate the parameters of how we use energy and what sorts of things we can make the computer do. John Backus called these the “two worlds”: one composed of statements, the other of expressions. I call the former mechanicalism, and the latter functionalism.
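A toy contrast of my own, rather than Backus’s notation: the first function below lives in the world of statements, issuing orders that overwrite a memory cell step by step; the second lives in the world of expressions, describing nothing but the value to be produced.

```python
from functools import reduce

# The world of statements: a sequence of commands that mutate state,
# one assignment at a time. This is what I am calling mechanicalism.
def total_statements(xs):
    acc = 0
    for x in xs:
        acc = acc + x   # an order: overwrite acc with a new value
    return acc

# The world of expressions: one expression whose meaning is its value,
# with no visible state to mutate. This is what I am calling functionalism.
def total_expression(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

print(total_statements([1, 2, 3, 4]), total_expression([1, 2, 3, 4]))  # 10 10
```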
I like to think of mechanicalism as analogous to the punch card machines and assembly lines that coloured the 20th century. Their existence and utility are not directly intertwined with the things they make, yet they are magnificent for being highly efficient, and they are as perfect as their casts and moulds.
Functionalism can be thought of as more of an art or existential study. What matters is how the canvas ends up, or what expressions you make, not how much paint or ink you spent to get there. This is why functionalism has been the cornerstone of experimental research in computer science for the past 40 years.
However, it has increasingly proven exhausted: new developments in the field turn out barren, riddled with idiosyncrasies and fluff that don’t actually make the work of programming any better. This is the right time for such a dichotomy to come into view.
People are going to have to care about mechanicalism in computing in the near future, if for no other reason than that it is vastly more efficient than the default, von Neumann-styled functionalist computing we live with today. The ceiling is so much higher here, whereas scaling the alternative these days is increasingly a financial problem, not a technical one. Look towards this, or towards AWS. Those are your choices.
As ventures like REX and even Parallella have shown, programming with this architecture is not easy. The glut of the software industry will be ripped raw by the need to move to this development style for the energy efficiency gains it presents.
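To give a feel for why, here is a schematic sketch; the buffer size and the staging helper are purely illustrative, not any vendor’s API. On these machines the programmer, not a hardware cache, decides which data sits in the small, cheap local memory and when it gets moved there.

```python
# Schematic illustration only: software-managed data movement in miniature.
# On real hardware the "tile" copy would be an explicit bulk transfer into
# a per-core scratchpad; nothing here is a vendor API.

LOCAL_BUFFER_WORDS = 256  # pretend this is a core's tiny scratchpad

def staged_sum(big_array):
    total = 0
    for start in range(0, len(big_array), LOCAL_BUFFER_WORDS):
        # Explicitly stage one tile from far, expensive memory...
        tile = big_array[start:start + LOCAL_BUFFER_WORDS]
        # ...then do all the arithmetic against the cheap local copy.
        for x in tile:
            total += x
    return total

print(staged_sum(list(range(10_000))))  # 49995000
```

Every movement of data becomes a decision the programmer has to make and schedule; that is the discipline caches and virtual memory were invented to hide, and it is why the transition will hurt.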
Even if these machines can be brought to support languages people know, like Python and JavaScript, they are fundamentally incompatible with much of the UNIX design that everyone today takes for granted. Or at least, if you try to put something Unix-like on these machines, you’re not getting anything for your trouble…
Thanks for reading this post. I run this Substack where I aim to have a third of my content available for free, and as of this writing I am working sixty hours each week doing security contracting to pay my debts and get ahead. While I work on funding the products that will use the magnificent theses described here, a subscription to this would really help me out a lot. Every bit counts these days, no pun intended.