Levels of Analysis in Bits and Atoms
In hacking, a computer pretends to be made out of things like ‘buffers’ and ‘lists’ and ‘objects’ with rich meaningful semantics, but really, it’s just made out of bits which mean nothing and only accidentally can be interpreted as things like ‘web browsers’ or ‘passwords’, and if you move some bits around and rewrite these other bits in a particular order and read one string of bits in a different way, now you have bypassed the password.
Gwern, On Seeing Through and Unseeing: the Hacker Mindset
Levels of analysis: atoms
When we go to college (at least at the time of writing, in the United States), we take a bunch of general education classes and then, after one or two years, we pick a major. In the sciences, this is often divided by levels of analysis. At the lowest level, we have math. Then there is physics, which is heavily math-reliant, to tell us about big stuff (galaxies) and small stuff (subatomic particles). If we focus on the latter, from the subatomic particles we move to atoms and molecules, which gives us chemistry. From atoms and molecules, we move to their combinations in replicators, which gives us biology. Within biology, when it comes to studying the brain, we move to the study of the mind, which brings us psychology. From a single mind, we move to groups of minds, and that gives us sociology. There is a classic xkcd comic that explains this here, calling it "levels of purity" rather than levels of analysis, but the concept is similar enough.
Given all of this, there is another major that has exploded in popularity since the 2010s: computer science. In my corner of the world (biology), there are lots of biologists trying to "learn how to code" in order to re-tool for the modern workforce. It has gotten to the point where I've seen "familiarity with Python" listed as a requirement for a wet-lab job.
The reality here is that while some biologists are indeed enthused by the digital world and fall in love with the computational side of things (I am one of them; I made the transition in the mid-2010s), there are plenty who see coding as a means to an end. They want to learn just enough to get the job done and no more, because they would rather focus on reading papers and designing experiments. I get it. So this is a tough problem that many are facing.
But lately I have been thinking that this push to learn how to code, and to measure biologists by their ability to program, may be a category error that goes right back to the levels of analysis I was just talking about. A biologist learning the ins and outs of programming languages to re-tool is a bit like a biologist learning the ins and outs of physics to re-tool: it might not be what they are naturally drawn to.
Levels of analysis: bits
To go into this more, let's talk about where computer science fits in the levels of analysis that we were just talking about. My gut reaction is to place it somewhere between math and physics. But I think that the recent developments in AI are giving us a bit more clarity here, and I think this might help us re-frame the problem.
I propose that there may be a "levels of analysis" hierarchy on the "bits" side of things that is analogous to the one I laid out in terms of "atoms." First, I'll give the explicit mapping, which I will explain in the subsequent paragraphs.
The mapping is as follows:
- math -> discrete math, theories of computation
- physics -> digital electronics, computer architecture
- chemistry -> programming languages, algorithms
- biology -> AI/ML, adaptive software systems
- psychology -> agentic AI studies
- sociology -> multi-agent, swarm, and botnet studies
Computing also starts with math, but it focuses especially on discrete math. Lingering at the math level, we get information theory, which is grounded in the algebra of 0's and 1's, also known as Boolean algebra. At this level we also get the fundamental theories of computation, which include things like Turing machines and the halting problem.
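To make "the algebra of 0's and 1's" concrete, here is a minimal sketch (Python is my choice of notation here, not anything tied to this level itself) that exhaustively checks De Morgan's laws over the two Boolean values:

```python
from itertools import product

# De Morgan's laws, checked exhaustively over the two-element Boolean algebra {0, 1}:
#   not (a and b) == (not a) or (not b)
#   not (a or b)  == (not a) and (not b)
for a, b in product([0, 1], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's laws hold for all 0/1 inputs")
```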
Then we move up a level to digital electronics and computer architecture: things like logic gates and the circuits we compose out of them. This might be analogous to physics.
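As a toy illustration of this level, here is a sketch (again in Python as a stand-in notation, since real hardware is transistors rather than code) that builds NOT, AND, XOR, and a half adder out of a single NAND primitive:

```python
# Everything below is built from a single primitive: the NAND gate.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def xor(a: int, b: int) -> int:
    # Classic 4-NAND construction of XOR.
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum, carry)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))
```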
This in turn leads us to the study of programming languages that can express sequences of computation, and of the algorithms they implement, which might be analogous to the rules and molecular motifs of chemistry.
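For a small example of what lives at this level, here is the classic recursive Towers of Hanoi solution (which also comes up again later in this piece), written as a Python sketch rather than C++:

```python
def hanoi(n: int, source: str, target: str, spare: str) -> None:
    """Print the moves that transfer n disks from source to target."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)   # move n-1 disks out of the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)   # move them onto the target

hanoi(3, "A", "C", "B")  # 2**3 - 1 = 7 moves
```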
A level up from that, we have AI/ML and adaptive software systems at large (e.g. genetic algorithms). A lot of my work on non-linear dimensionality reduction tools fits in here. This might be thought of as the biology of the "bits" world.
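As a minimal sketch of what I mean by adaptive software, here is a toy genetic algorithm for the "OneMax" problem (evolving a bitstring of all 1's); the parameter values are arbitrary choices for illustration:

```python
import random

random.seed(0)

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 40, 0.02

def fitness(genome):
    # Toy objective ("OneMax"): count the 1-bits.
    return sum(genome)

def mutate(genome):
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    # Selection: keep the fitter half, breed replacements from it.
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)))
```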
And perhaps a level up from that, we have the psychology of agentic and generalist (LLM) systems. Anyone who has messed around with changing the prompt to see what output an LLM will give them has done this.
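To sketch what such an "experiment" might look like in code: hold the task fixed, vary one piece of the prompt, and record the outputs. The query_llm function and the conditions below are hypothetical placeholders for illustration, not any real API:

```python
# Sketch of a prompt-variation experiment: hold the task fixed, vary one
# instruction, and log the outputs for comparison. `query_llm` is a placeholder
# for whatever client you actually use; it is not a real library call.

def query_llm(prompt: str) -> str:
    raise NotImplementedError("swap in your own LLM client here")

TASK = "Summarize the abstract below in two sentences.\n\n{abstract}"
CONDITIONS = {
    "baseline": "",
    "persona": "You are a careful scientific editor. ",
    "audience": "Explain it so a high-school student could follow. ",
}

def run_experiment(abstract: str) -> dict[str, str]:
    results = {}
    for name, prefix in CONDITIONS.items():
        prompt = prefix + TASK.format(abstract=abstract)
        results[name] = query_llm(prompt)
    return results
```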
And perhaps a level up from that, we have the sociology of multi-agent systems, swarms, and botnets. An example here is the stock market flash crash of 2010, where some emergent pattern in the automated trading bots used on Wall Street led to an almost 9% drop in the Dow Jones Industrial Average within minutes. A more commonly discussed example is the use of bots on social media to influence the masses.
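Here is a toy sketch of how emergent collective behavior can come out of simple individual rules. It is not a model of the actual 2010 event, and all the numbers are arbitrary: each simulated bot sells once the price has fallen past its own threshold, and each sale pushes the price down a little further.

```python
import random

random.seed(1)

# Toy illustration of emergent collective behavior: individually sensible
# rules, collective cascade.
N_BOTS = 500
price = 100.0
peak = price
thresholds = [random.uniform(0.01, 0.10) for _ in range(N_BOTS)]  # 1-10% drawdown tolerance
has_sold = [False] * N_BOTS

price *= 0.985  # a small external shock: price slips 1.5%
for step in range(50):
    drawdown = (peak - price) / peak
    sellers = 0
    for i in range(N_BOTS):
        if not has_sold[i] and drawdown > thresholds[i]:
            has_sold[i] = True
            sellers += 1
    if sellers == 0:
        break
    price *= 1 - 0.0005 * sellers  # each sale nudges the price down a bit more
    print(f"step {step}: {sellers} bots sold, price = {price:.2f}")
```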
The right specialty for the person
What does this mean, from a practical standpoint? A lot of the above has yet to be really fleshed out (at the time of writing). On the AI side, I am particularly drawn to some of the "interpretability" studies that Anthropic is doing, here. But I think what you can do, assuming this is inevitable, is to think about where you fit in the stack. For example, I am a biologist by training, so although I love to nerd out at the lower levels, I find myself doing biology-like experiments (e.g. knock-out, knock-in) on relevant ML algorithms in my field (an example). But perhaps a mechanistic chemist would be drawn to the analysis of algorithms. Perhaps a sociologist would be drawn to the study of botnets on social media. For the psychologists, there is plenty of work to be done on how the more complex agentic systems behave across many different conditions.
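To sketch what a "knock-out" experiment on an ML pipeline might look like, here is a toy example that zeroes out one input feature at a time and measures the drop in cross-validated accuracy. The dataset, the model, and the use of scikit-learn are stand-ins for illustration, not a description of my actual workflow:

```python
# A "knock-out" style experiment on an ML pipeline: remove (zero out) one input
# feature at a time and measure how much cross-validated accuracy drops.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

baseline = cross_val_score(model, X, y, cv=5).mean()
print(f"baseline accuracy: {baseline:.3f}")

for j in range(X.shape[1]):
    X_ko = X.copy()
    X_ko[:, j] = 0.0  # "knock out" feature j
    score = cross_val_score(model, X_ko, y, cv=5).mean()
    print(f"feature {j} knocked out: {score:.3f} (delta {score - baseline:+.3f})")
```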
What this would suggest is that rather than thinking in terms of needing to learn how to code, you should think about where in the stack your interests fit. If you're naturally a psychologist, then learning how to code and understanding recursion and Dijkstra's algorithm might be totally uninteresting to you, while understanding prompt-output behavior in increasingly complex AI systems might be fascinating. So focus there, and measure yourself by how good you are there, rather than by whether you can write a C++ program that solves the Towers of Hanoi.
And if we go back to how we learn stuff in school, it might be that there is some minimum viable familiarity with "bits" that everyone will need. This could very well include a course on writing simple computer programs in Python without the help of ChatGPT (the same way we learn how to do arithmetic on paper and memorize times tables despite calculators). But that might be "general ed" the same way I took math, physics, and chemistry as a biology major. But then it might be that your major is something more specific, from algorithms to bot nets.
So you can ask yourself: in the levels of analysis on the "bits" side, what lights you up? What are you naturally drawn to? If you don't know yet, then what is the "bits" equivalent of where you are at on the "atoms" side?
The right person for the job
And for the sake of solving problems in general, this framing might help us bring together technical working groups to address major issues (e.g. climate change, cancer) as they evolve. If the big problem is the way botnets are affecting the cultural zeitgeist on social media, and you need some bits-level experts, then perhaps you need to look beyond someone's ability to code. Yes, you'll need an algorithms expert, but you might also need a botnet expert, and that might be a completely different way of thinking, the same way sociology is a completely different way of thinking than chemistry.
Furthermore, this has applications in recruiting, something my company is doing a lot of right now. "Familiarity with R or Python" is much too vague a requirement. The same goes for arbitrary "checklist" words like AI/ML, React, Flask, and so forth, or, for that matter, "familiarity with LLMs." I think we need to get a lot more granular here. So you do code-based stuff: do you develop analysis pipelines, or do you tinker under the hood of analysis pipelines? Do you speed up algorithms? Do you manage large code bases? Do you write packages and libraries, or do you write one-off scripts?
But then it gets interesting when you add the AI layer. Because perhaps in the future, we'll get questions like: do you use LLMs, or do you tinker under the hood of LLMs? Do you operate at the level of a single LLM, or at the "communities of LLMs working together" layer? Have you done ablation studies in agentic systems? Do you do interpretability work in general? Do you have any experience with botnet management?
Conclusions
To wrap this all up, I think that as the field of AI gets more fleshed out, it is going to change the way we look at the "bits" side of things, such that we will think of the distinct levels of analysis as specialties different enough to have different names, rather than as sub-specialties of computer science.
Imagine what it would be like if "science" were a single major, and physics and psychology were sub-specialties you could choose once you're in. That's what computer science is like right now. And I think that once it divides into distinct fields the same way science did, it might be easier to wrap our heads around where we fit into it, how we solve problems within it, and how we teach it to the next generation.