You’re smarter than you think
One of the more interesting measures in nature is entropy.
Entropy is defined the same way in statistical physics and information theory. That’s one of the great discoveries of Claude Shannon. As you may or may not know, entropy is a measure of disorder. It can be disorder in the positions and momenta of molecules, or disorder (noise) in signals on a transmission wire. It’s one of those little wonders that the same mathematical object governs both the behavior of steam engines and the behavior of information in computational machines and telegraph wires. It’s actually not much of a wonder once you understand the math, but stepping back and looking at it “big picture” makes it pretty amazing and glorious. The first time I figured it out, I am pretty sure I frothed at the mouth. Heat is a measure of disorderliness in information? Hot dang! I can’t convey the spectacularness of it all, as it requires an undergrad degree in physics to really grok it, but I will give it the old college try anyhow.
People who get degrees in computer science like to think of computers as “Turing machines.” I suspect this is because it is easier to prove theorems about Turing machines than von Neumann machines. That’s fine by me, though I will note in passing that I have never read a proof (or anything like a proof) that von Neumann machines like the ones you write code on are computationally equivalent to Turing machines. Someone smart please point me to the proof if it exists. We will assume for the moment that they are equivalent; it isn’t important if you don’t believe it. I, personally, like to think of universal computers as shift maps. A shift map takes a string of symbols and shifts them around. Or, if you want to think in terms of Mr. George Boole, I think of a computer as a bunch of NOR (or NAND; it doesn’t really matter) gates. NOR gates flip bits. It’s not really important that you believe you can build a computer of any given complexity out of NANDs and NORs; the important thing is that you believe that by taking two input bits, operating on them, and getting one output bit, you can build a universal computational machine. It’s quite true; I wouldn’t lie to you. Trust me; I’m a scientist.
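In case you don’t trust scientists, here is a minimal Python sketch (the function names are mine, purely for illustration) showing that NOT, AND, OR, and XOR all fall out of a single two-input, one-output NAND:

```python
# The single primitive: two bits in, one bit out.
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

# Everything else is just NANDs wired together.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Sanity check: print the XOR truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```

Wire enough of these together and you get adders, multiplexers, and eventually the whole machine.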
Important fact to get from this picture: two inputs, one output
It is crucial to note here that these are not reversible operations. One of the bits goes away. You can’t take the one-bit output and reconstruct the two bits of input, since several different input combinations map to the same output. This is sort of the crux of the whole argument; what happens when the bit goes away? If you believe in the statistical definition of entropy, you believe that entropy is Boltzmann’s constant times the natural log of the number of accessible states. Erasing a bit cuts the number of states of a NAND or NOR gate in half, so the gate’s entropy changes by -k*ln(2), and at least k*ln(2) of entropy has to be dumped into the environment. So every time you do a calculation, a bit dies, and disorder increases by k*ln(2). Most of you who are still reading have glossy eyes by now. Who gives a pair of foetid dingo’s kidneys, right? Well, when you know what the entropy increase is, you know what the dissipated heat is: temperature times entropy. k is 1.38×10^-23 joules per kelvin. If you are dissipating the heat to room temperature (about 300 kelvin), each bit that dies dissipates kT*ln(2), about 3×10^-21 joules of heat. Stop and think about this for a second. For every irreversible bit operation done in a computer, I know the minimum physically possible amount of heat dissipated. Assuming a 64 bit calculation erases at least 64 bits (an underestimate for almost any kind of calculation), a 64 bit op costs at least 1.84×10^-19 joules.
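If you’d rather not trust my arithmetic either, the whole Landauer calculation fits in a few lines of Python (just a sketch of the numbers above):

```python
import math

k = 1.38e-23   # Boltzmann's constant, joules per kelvin
T = 300        # room temperature, kelvin

# Landauer limit: minimum heat dissipated per erased bit.
e_bit = k * T * math.log(2)
print(f"per erased bit: {e_bit:.2e} J")   # ~2.87e-21 J, i.e. ~3e-21

# A 64-bit op erases at least 64 bits.
e_op = 64 * e_bit
print(f"per 64-bit op:  {e_op:.2e} J")    # ~1.84e-19 J
```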
What good is this, you might ask? Well, consider: the average human’s resting heat dissipation is something like 2000 kilocalories per day. Making a rough approximation, assume the brain dissipates 1/10 of this: 200 kilocalories per day. That works out (you do the math) to about 9.7 joules per second. This puts an upper limit on how much the brain can calculate, assuming it is an irreversible computer: about 5×10^19 64 bit ops per second.
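Here is the “you do the math” part, spelled out, with the same back-of-the-envelope assumptions as above:

```python
# Brain power budget: 1/10 of 2000 kcal/day, as assumed in the text.
kcal_per_day = 200
watts = kcal_per_day * 4184 / 86400      # joules per second
print(f"brain power: {watts:.1f} W")     # ~9.7 W

# Divide by the Landauer cost of a 64-bit op to bound ops per second.
e_op = 1.84e-19                          # joules, from the previous snippet
print(f"upper bound: {watts / e_op:.1e} 64-bit ops/s")   # ~5e19
```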
Considering all the noise various people make over theories of consciousness and artificial intelligence, this seems to me a pretty important number to keep track of. Assuming a 3GHz Pentium is actually doing 3 billion 64 bit calculations per second (it can’t), it is about 17 billion times less powerful than this upper bound on a human brain. Even if brains are computationally only 1/1000 as efficient as the theoretical limit, computers need to get 17 million times quicker to beat a brain. There are actually indications that brains run close to the theoretical limit: signal transmission in nervous systems appears to happen at very nearly the thermodynamically ideal cost per bit. Too bad we don’t know how calculation takes place in your noggin.
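Those two ratios, for the skeptical (the Pentium figure is the charitable one-op-per-cycle assumption from above):

```python
brain_bound = 5.3e19   # 64-bit ops/s, from the previous snippet
pentium = 3e9          # a 3GHz chip doing one op per cycle (it can't)

print(f"{brain_bound / pentium:.1e}")          # ~1.8e10: ~17 billion
print(f"{brain_bound / 1000 / pentium:.1e}")   # ~1.8e7:  ~17 million
```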
I leave it as an exercise to the reader to calculate when Moore’s law should give us a computer as powerful as a human brain (though Moore’s law appears to have failed in recent years). I leave it as another exercise to determine if this happens before the feature size on a piece of silicon approaches that of an atom (in which case, it can no longer really be a feature at room temperature).
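If you want to attempt the first exercise, here is one hypothetical setup (the 18-month doubling period is the classic Moore’s law cadence, and a big assumption given the law’s recent failures):

```python
import math

start = 3e9            # ops/s: the charitable Pentium figure
target = 5e19          # ops/s: the Landauer bound on the brain
doubling_years = 1.5   # assumed Moore's law doubling period

doublings = math.log2(target / start)
print(f"~{doublings:.0f} doublings, ~{doublings * doubling_years:.0f} years")
```

Whether silicon features hit atomic size before that date is the second exercise.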
Penrose thinks brains are super-Turing quantum gravitic computers. Most people think he is full of baloney, but you never know. I am pretty sure quantum computers get around the Landauer limit. People like Cristopher Moore, Michel Cosnard, and Olivier Bournez have also shown that analog computers are potentially vastly more powerful than digital ones, though they still don’t get around the thermodynamic limit.
Antikythera mechanism: an early analog computer
Incidentally, if you google on Landauer entropy (that log(2) thing), you will find many people who don’t know what they are talking about who think they have refuted this bone-simple calculation. All they have done is (re)discovered reversible computing (aka Toffoli and Fredkin) without admitting or realizing it. Such people also must believe in perpetual motion machines, as it is the exact same logical error.
Reversible computing is, theoretically, unbounded in power: there is no thermodynamic floor on the energy cost per operation. It is also an inadvertent restatement of what “chaos” means, not to mention a restatement of what heat means. In reversible computing, you need to keep around all the bits which would be destroyed in irreversible computing. That’s a lot of semirandom bits. What are they, really, but the heat your computer dissipates when it does a calculation?
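To see what “keeping the bits around” looks like, here is a toy Python version of the Toffoli (controlled-controlled-NOT) gate mentioned above. With the third input fixed to 1, its third output is NAND of the first two, but both inputs survive, so the map is a bijection and nothing is erased:

```python
def toffoli(a: int, b: int, c: int):
    # Flip c iff both a and b are 1; a and b pass through untouched.
    return a, b, c ^ (a & b)

# With c = 1 the third output is NAND(a, b), and the whole map is
# its own inverse: run it twice and you get your inputs back.
# No bit dies, so no mandatory k*ln(2) of entropy.
for a in (0, 1):
    for b in (0, 1):
        print((a, b, 1), "->", toffoli(a, b, 1))
```

The price, as stated above, is carting around all those extra output bits; physically, they are the heat.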