The Power of Computation
July 4th 2022
In our day it is very clear that computation is a powerful force. Computation has become involved in every aspect of our lives. Yet we recognize very little of its full power. It is a vast, infinite power that can only be harnessed when it is understood.
Right now this power is harnessed by massive teams of experts and professionals. Some of these teams are corporations, some are teams of academics, and still others are volunteers. These teams tell us things like:
"It takes our specially vetted methods to create new technologies. It takes our massive organization to handle all of the necessary work. It takes our giant pool of resources to ensure high standards of quality, accessibility, and security. It takes our monopoly to make systems of technology that work together."
The TRUTH is that these corporations and organizations have harnessed very little of the power of computation.
They have scratched a little bit off of an infinite power and with just that little bit they have made a new world. But if we truly had access to the power of computation, what we have right now would appear to be nothing at all.
I believe that a time will come when a single person equipped with a deep understanding of the power of computation will be able to do everything that once took massive corporations, organizations, and teams. The only barrier to this future is understanding.
Vast Infinite Power
First I will explain precisely what I mean when I say that the power of computation is "infinite".
Consider how many potential structures of code and data there are. For instance, how many ways can memory management be handled in the C language? Answer: the number of options is practically infinite.
Most of these potential systems are useless, but within the infinity of options there are hidden gems.
In just a few hundred lines of code one can express an Arena, an extraordinary improvement over malloc and free. Embracing this new paradigm throughout a codebase tremendously simplifies every other part. To a user who works to deeply understand it, the Arena is a revolution.
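To make the idea concrete, here is a minimal sketch of an Arena (a linear "bump" allocator) in C. The names (`Arena`, `arena_push`, and so on) and the fixed 8-byte alignment are illustrative assumptions, not taken from any particular codebase; a production arena would add growth, scopes, and configurable alignment.

```c
#include <stdlib.h>

/* A minimal arena sketch: one big block, a bump offset, and a
 * single release for everything allocated from it. Names and the
 * fixed 8-byte alignment are illustrative choices. */
typedef struct {
    unsigned char *base; /* backing memory block */
    size_t size;         /* total capacity in bytes */
    size_t used;         /* bytes handed out so far */
} Arena;

Arena arena_make(size_t size) {
    Arena a = {0};
    a.base = (unsigned char *)malloc(size);
    a.size = a.base ? size : 0;
    return a;
}

void *arena_push(Arena *a, size_t size) {
    /* round the current offset up to an 8-byte boundary */
    size_t aligned = (a->used + 7) & ~(size_t)7;
    if (aligned + size > a->size) return NULL; /* out of space */
    void *p = a->base + aligned;
    a->used = aligned + size;
    return p;
}

/* "Free" every allocation at once by resetting the offset. */
void arena_clear(Arena *a) { a->used = 0; }

/* Return the backing block to the system. */
void arena_release(Arena *a) {
    free(a->base);
    a->base = NULL;
    a->size = a->used = 0;
}
```

The simplification comes from grouping lifetimes: instead of pairing every allocation with its own free, a whole phase of work allocates from one arena and ends with a single clear or release.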
How many more ways are there to put together a few hundred lines of code to create a revolution? How many more discoveries are there to make by merging exactly the right two problems? Or tweaking the relationship between a few concepts? We cannot know because the space to explore is practically infinite.
Size vs Understanding
The path we are on is not the path of understanding, but the path of size. On the path of size, we attempt to harness the power of computation by being large. It does not work very well.
On the surface it seems that large teams do deliver the tools of computation. Computation is such a powerful force that even a little bit of it is worth a lot.
But as we go further down this path, the decline of computation grows more apparent. Modern software is plagued with ever more random failures, giant load times, confusing interfaces, incompatibilities, inconsistencies, and limitations. Using software created this way regularly means surrendering privacy, security, and choice. At their best, large teams are a chaotic scramble to keep an essential system functioning. At their worst they become hostile to users and innovators.
The logic of the path of size suggests that the solution to these problems is more contributors and more resources. The path of size is rooted in a belief that computation is weak, and that it is only by being large that large things can be accomplished. Those who follow the path of size use size, instead of understanding, to wield the power of computation. When this approach inevitably creates more problems, they respond by growing even larger.
The TRUTH is that computation DOES MORE by BEING LESS. This idea is expressed by veteran programmers who lament complexity and urge peers to "keep it simple". This is the path of understanding. Those who follow the path of understanding face hard problems by deepening their understanding, rather than by growing their code, teams, or resources.
Earlier I said:
"A time will come when a single person equipped with a deep understanding of the power of computation will be able to do everything that once took massive corporations, organizations, and teams. The only barrier to this future is understanding."
Obviously I do not know the future. What I really mean is that such a future is a worthwhile vision for us to pursue.
To make this discussion more grounded I will switch from discussing a "future person" to discussing a "hypothetical person". That way I can describe how I believe things could be without making claims along the way.
Our hypothetical person has a few advantages over today's massive teams.
First: they do not have to generate many millions of lines of code to succeed. The software they create has all of the useful effects of modern software, but wastes no computation on useless and detrimental effects. This is the power of their deep understanding: they are able to achieve what we do with much less.
Second: our hypothetical person lives in a computational environment shaped by deep understanding. Many of their peers and predecessors embrace understanding, and create tools accordingly. The tools they use do much more than the tools we have today.
Third: they avoid the consequences of Conway's Law brought on by relying on size. They do not have to handle conflicting styles, opinions, and priorities. They are not bound to operate within the constraints of legacy systems. Just like their software, their development process does more by being less.
This is the vision I propose we pursue. In this world individuals would be empowered to integrate computation into their lives in the ways they choose. Computation would become a powerful tool for self-determination. Iteration and understanding would compound and unlock whole new possibilities for computation. We just need a way to make this hypothetical future a reality.
The Way Forward
Almost every detail about this hypothetical future is already true of us now. We have much more now than we did when computers were new. Our understanding is deeper. Our tools are more advanced. We already benefit from deeper understanding and better tools. The only difference between ourselves and our hypothetical future person is more understanding.
Therefore the process of iteration needs to be further strengthened. New ambitious projects need to pop up more and more frequently. We must not be taken in by the illusion that tools are progress. No tool needs to last forever. We need to embrace a full project life cycle: birth, life, and death. We need to embrace understanding as the way to make lasting progress, and let the tools we create be iterations in the process.
We are seeing many projects from individuals and lean teams that directly refute the necessity of size to handle hard problems.
One example: a web browser for Serenity OS.
The ideas of the old era are being refuted, and a new era is beginning.
The old era ideas were rooted in slow iteration, upholding legacy, devaluing understanding, and solving problems with resources.
"Don't reinvent the wheel", "Don't try to rewrite everything", "Do you really think you can do better?", "Leave that to the experts", "It's too complex for any one person to handle", "It's not going to make a difference anyway"
The new era ideas promote iteration, innovation, and deeper understanding.
"Question dogma and tradition", "Leave no stone unturned", "Nothing is so good that it cannot be improved", "Don't leave it to the experts, become the expert", "Why is it so complex to begin with?", "Deepening understanding always makes a difference"
If you are reading this and finding that it resonates, I hope you will find a place for yourself in this new era! For my own part, I will seek out new opportunities to deepen my understanding in projects that attract me, and pass on whatever I can through this website and through the connected projects.
- Mr. 4th