## You’re smarter than you think

One of the more interesting measures in nature is entropy.

Entropy is defined the same way in statistical physics and in information theory; that's one of the great discoveries of Claude Shannon. As you may or may not know, entropy is a measure of disorder. It can be disorder in the positions and momenta of molecules, or disorder (noise) in signals on a transmission wire. It's one of those little wonders that the same mathematical object governs both the behavior of steam engines and the behavior of information in computing machines and telegraph wires. It's actually not much of a wonder once you understand the math, but stepping back and looking at the big picture makes it pretty amazing and glorious. The first time I figured it out, I am pretty sure I frothed at the mouth. Heat is a measure of disorderliness in information? Hot dang! I can't convey the spectacularness of it all, as it requires an undergrad degree in physics to really grok it, but I will give it the old college try anyhow.

People who get degrees in computer science like to think of computers as "Turing machines." I suspect this is because it is easier to prove theorems about Turing machines than about von Neumann machines. That's fine by me, though I will note in passing that I have never read a proof (or anything like a proof) that the von Neumann machines you write code on are computationally equivalent to Turing machines. Someone smart please point me to the proof if it exists. We will assume for the moment that they are equivalent; it isn't important if you don't believe it. I, personally, like to think of universal computers as shift maps. A shift map takes a string of symbols and shifts them around. Or, if you want to think in terms of Mr. George Boole, I think of a computer as a bunch of NOR (or NAND; it doesn't really matter) gates. NOR gates flip bits. It's not really important that you believe you can build a computer of any given complexity out of NANDs and NORs; the important thing is that you believe that by taking two input bits, operating on them, and getting one output bit, you can build a universal computational machine. It's quite true; I wouldn't lie to you. Trust me; I'm a scientist.
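If you don't want to take my word for it, the universality claim is easy to poke at in code. A minimal sketch (the function names are mine, nothing standard about them) building NOT, AND, OR, and XOR out of nothing but a two-input NAND:

```python
# Every gate below is built only from nand(): two bits in, one bit out.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Exhaustive check against Python's own boolean operators:
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor(a, b) == (a ^ b)
```

From these you can wire up adders, multiplexers, and so on; the point is only that the two-in, one-out primitive suffices.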

*Important fact to take from this picture: two inputs, one output.*

It is crucial to note here that these are not reversible operations. One of the bits goes away. You can't take the one bit of output and regenerate the two bits of input, since several input combinations map to the same output. This is sort of the crux of the whole argument: what happens when the bit goes away? If you believe in the statistical definition of entropy, you believe that the change in entropy is Boltzmann's constant times the natural log of the change in the number of accessible states. A NAND or NOR gate maps four input states onto two output states, halving the number of states, so the change is -k*log(2). So every time you do a calculation, a bit dies, and disorder increases by k*log(2). Most of you who are still reading have glassy eyes by now. Who gives a pair of foetid dingo's kidneys, right? Well, when you know what the entropy increase is, you know what the dissipated heat is. k is 1.38*10^-23 joules per kelvin. If you are dissipating the heat to room temperature (about 300 kelvin), each bit that dies dissipates kT*log(2), about 3*10^-21 joules of heat. Stop and think about this for a second. For every bit operation done in a computer, I know the minimum physically possible amount of heat dissipated. Assuming a 64 bit calculation destroys 64 bits (an underestimate for almost any kind of calculation), a 64 bit op costs at least 1.84*10^-19 joules.
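The arithmetic is easy to check. A few lines reproducing the numbers above (Landauer's kT*log(2) per erased bit, at room temperature):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
T = 300.0            # room temperature, kelvin

# Landauer limit: minimum heat dissipated per erased bit.
e_bit = k_B * T * math.log(2)
e_64 = 64 * e_bit    # cost of destroying 64 bits

print(f"per bit: {e_bit:.3g} J")        # ~2.87e-21 J, the ~3e-21 figure above
print(f"per 64-bit op: {e_64:.3g} J")   # ~1.84e-19 J
```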

What good is this, you might ask? Well, consider: the average human's resting heat dissipation is something like 2000 kilocalories per day. As a rough approximation, assume the brain dissipates 1/10 of this: 200 kilocalories per day. That works out (you do the math) to about 9.5 joules per second. This puts an upper limit on how much the brain can calculate, assuming it is an irreversible computer: 5*10^19 64 bit ops per second.
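Again, the back-of-the-envelope numbers, under the same assumptions as above (the brain gets 1/10 of 2000 kilocalories per day and dumps heat at roughly room temperature):

```python
import math

k_B = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                                 # kelvin; assumed heat-sink temperature
e_64 = 64 * k_B * T * math.log(2)         # Landauer cost of one 64-bit op

kcal_per_day = 200.0                      # assumed brain share of a 2000 kcal/day diet
watts = kcal_per_day * 4184.0 / 86400.0   # joules per second, ~9.7 W

ops_per_sec = watts / e_64
print(f"{watts:.1f} W -> {ops_per_sec:.2g} 64-bit ops/s")   # ~5e19 ops/s
print(f"ratio to a 3 GHz chip: {ops_per_sec / 3e9:.2g}")    # ~1.8e10: the "17 billion"
```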

Considering all the noise various people make over theories of consciousness and artificial intelligence, this seems to me a pretty important number to keep track of. Assuming a 3 GHz Pentium actually does 3 billion calculations per second (it can't), it is about 17 billion times less powerful than the upper bound on a human brain. Even if brains run at only 1/1000 of their theoretical efficiency, computers need to get 17 million times quicker to beat a brain. There are actually indications that brains are very close to their theoretical efficiencies; signal transmission in nervous systems appears to happen very close to the theoretical limit. Too bad we don't know how calculation takes place in your noggin.

I leave it as an exercise to the reader to calculate when Moore’s law should give us a computer as powerful as a human brain (though Moore’s law appears to have failed in recent years). I leave it as another exercise to determine if this happens before the feature size on a piece of silicon approaches that of an atom (in which case, it can no longer really be a feature at room temperature).
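A sketch of the first exercise, under the (debatable) assumption that Moore's law doubles compute every two years, starting from the roughly 17-billion-fold gap above:

```python
import math

gap = 1.7e10           # brain upper bound / 3 GHz chip, from the article
doubling_years = 2.0   # assumed Moore's-law doubling time

doublings = math.log2(gap)   # doublings needed to close the gap
print(f"{doublings:.0f} doublings, roughly {doublings * doubling_years:.0f} years")
```

About 34 doublings; whether feature sizes hit atomic scale first is the second exercise.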

Penrose thinks brains are super-Turing quantum gravitic computers. Most people think he is full of baloney, but you never know. I am pretty sure quantum computers get around the Landauer limit. People like Cristopher Moore, Michel Cosnard and Olivier Bournez have also shown that analog computers are potentially vastly more powerful than digital ones, though they still don't get around the thermodynamic limit.

*Antikythera mechanism: an early analog computer*

Incidentally, if you google Landauer entropy (that log(2) thing), you will find many people who don't know what they are talking about who think they have refuted this bone-simple calculation. All they have done is (re)discover reversible computing (a la Toffoli and Fredkin) without admitting or realizing it. Such people must also believe in perpetual motion machines, as it is the exact same logical error.

Reversible computing is, theoretically, infinitely powerful. It is also an inadvertent restatement of what "chaos" means; not to mention a restatement of what heat means. In reversible computing, you need to keep around all the bits which are destroyed in irreversible computing. That's a lot of semirandom bits. What are they, really, but the heat your computer dissipates when it does a calculation?

Brian Potter said, on May 18, 2010 at 1:07 pm: I'm pretty far above my intellectual pay-grade here, but doesn't the computational equivalence of von Neumann and Turing machines fall right out of the Church-Turing thesis? Granted it's no proof, but is there any reason to suspect Church-Turing might be incorrect?

Scott Locklin said, on May 18, 2010 at 6:01 pm: Hey Brian: yeah, that's what I've heard before from various computer scientists, but lots and lots of people say Church-Turing says nothing of the sort.

http://plato.stanford.edu/entries/church-turing/#Bloopers

I confess I never read the original paper. Kind of outside my field.

Bryce said, on June 19, 2010 at 6:43 pm: A Turing machine is so much simpler than the commodity PC hardware we all use. It's almost trivial to write a working program that simulates a limited form of Turing machine with a fixed-length tape. So PC hardware is not really equivalent to a Turing machine in computational power, because it only has a finite amount of storage, while the Turing machine has an infinite amount. But of course no one has ever built a Turing machine with an infinite tape either.

Here's my favorite definition of entropy: entropy is hidden information.

The best example of this definition is a black hole, where the information inside really is hidden. But the holographic principle says that a non-spinning, neutrally-charged black hole's surface area is proportional to the amount of hidden information. This places an upper limit on information density in the universe. Also, the information content of any region of space can be considered to be encoded on its surface. A black hole just has a surface with the maximum possible information content.
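The fixed-tape simulator Bryce mentions really is almost trivial. A minimal sketch; the rule table here is a made-up example machine that just flips bits until it hits a blank:

```python
# Minimal fixed-length-tape Turing machine simulator.
# rules maps (state, read symbol) -> (new state, symbol to write, head move).
def run_tm(tape, rules, state="start", pos=0, max_steps=10_000):
    tape = list(tape)
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = tape[pos]
        state, write, move = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape), state

# Example machine: flip every bit, halt on the blank symbol "_".
rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm("0110_", rules))  # ('1001_', 'halt')
```

The fixed `max_steps` and finite tape are exactly the concessions to physical hardware Bryce is talking about.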

William O. B'Livion said, on May 18, 2010 at 1:59 pm:

> If you are dissipating the heat to room temperature (about 300 degrees kelvin), when each bit dies, it dissipates 3E-21 joules of heat.

So every click of your mouse contributes, in a small way, to the heat death of the universe.

> Too bad we don't know how calculation takes place in your noggin.

We could model it, but we don't have access to 34 billion computers.

Scott Locklin said, on May 18, 2010 at 6:17 pm: The thing is, we don't even really know if we *can* model it, even if we know all about brains. People have been proposing this sort of thing, literally, since the 1960s. In principle you should be able to model a brain which runs 34 billion times slower than a real one, like, nowish. Or some small part of the brain which is interesting. Nobody seems to have done this yet. For example, every couple of years, something like this comes up:

http://www.livescience.com/culture/artificial-cat-brain-intelligence-100430.html

My prediction is nothing will come of it. Of course, my prediction is informed by stuff like this:

http://www.dailytech.com/Leading+Neuroscientists+Blasts+DARPA+CatBrain+Project+Calling+it+a+Scam/article16912.htm

John Mount said, on May 18, 2010 at 11:32 pm: Hopcroft and Ullman say the simulation of a RAM machine by a Turing machine (the interesting part of showing Turing machine/RAM equivalence) is the topic of Cook and Reckhow, "Time bounded random access machines," J. Computer and System Sciences 7, 4, pp. 354-374.

And Feynman in his *Lectures on Computation* expanded the entropy/energy notion of computation to reversible gates and machines, which gives the same k*log(2) step cost, but also counts N squares of blank tape as an N*k*log(2) bonus (because you can disorder the tape). Basically, it looks like if you had blank tape (which would cost something to make), you can compute for free, if you are willing to wait arbitrarily long for a ratchet process that is kind of like a legitimate version of Maxwell's Demon to do the work. If you want the result sooner, you are going to pay additional energy linearly proportional to speed (the work you do biasing the system to move forward through its states). The funny thing, as you point out, is that in these situations (waiting for a Turing machine to accept, or waiting for a molecule to bind to a receptor) entropy and enthalpy work very similarly.

Scott Locklin said, on May 19, 2010 at 3:02 am: Thanks John; I'll have to look into that paper one of these days when I'm in a library.

It was the Feynman lectures which originally caused me to think of this, about 6 years ago. For some reason I hadn't read the chapter on reversible computing in his book; the bookmark was right before that place. I had read some of the literature about it, though. His explanations make more intuitive sense than the arguments I have read about adiabatic reversible computing, but I guess I shouldn't be surprised.

John Mount said, on May 29, 2010 at 3:17 pm: Re-reading it, I see I conflated a few things. The work Feynman refers to (Bennett) gives a lower bound on work per step for non-reversible computation, and allows for free computation using reversible machines if you are willing to essentially wait forever.

John Flanagan said, on June 9, 2010 at 8:29 am: The second part sounds a bit infinite-monkeys-infinite-typewriters to me. Not saying it's wrong; it just sounds sort of equivalent.

Patrick said, on May 19, 2010 at 1:48 am: "For every bit operation done in a computer, I know the minimum physically possible amount of heat dissipated."

Did you just link heat and computation? Holy shit! /mental worlds collide/

Scott Locklin said, on May 19, 2010 at 1:51 am: Weird, ain't it?

maggette said, on May 19, 2010 at 12:51 pm: Great stuff. Again, an interesting read.

Regarding the artificial intelligence scam:

have a look here:

http://www.semanticsystem.com/en/company.html

Vilhelm S said, on June 22, 2010 at 1:53 pm: "In reversible computing, you need to keep around all the bits which are destroyed in irreversible computing. That's a lot of semirandom bits. What are they really, but the heat that your computer dissipates when it does a calculation?"

Feynman's lectures on computing also have a reply to this. He asks: what's really the difference between saying that you could build a reversible computer if you could just keep track of the junk bits it produces, and saying that you could build a reversible car engine if you could just keep track of the heat it produces?

Well, the difference is that with the computer you do have a way to keep track of them! You build two copies of the computer, and run one of them (wait for it) in reverse on the output! This unscrambles all the random-looking bits. You need to irreversibly make a copy of the answer you are interested in, but all the intermediate calculations can be undone without assuming a heat sink.
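This compute/copy/uncompute trick (due to Bennett) can be demonstrated on a single gate. A toy sketch (the function names are illustrative, not from any library) using a Toffoli gate, which is its own inverse:

```python
# Reversible gates acting in place on a list of bits.

def toffoli(bits, a, b, t):
    # controlled-controlled-NOT: flip bits[t] iff bits[a] and bits[b] are both 1
    if bits[a] and bits[b]:
        bits[t] ^= 1

def cnot(bits, c, t):
    # controlled-NOT: flip bits[t] iff bits[c] is 1
    if bits[c]:
        bits[t] ^= 1

# layout: [x, y, ancilla=0, answer=0]
bits = [1, 1, 0, 0]
toffoli(bits, 0, 1, 2)   # compute: ancilla = x AND y
cnot(bits, 2, 3)         # copy the answer out (the one irreversible-looking step)
toffoli(bits, 0, 1, 2)   # uncompute: ancilla restored to 0, no junk bits left

print(bits)  # [1, 1, 0, 1]: inputs intact, ancilla clean, answer kept
```

The intermediate "junk" bit lives only between the two Toffoli applications; running the computation backwards erases it for free.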

Oh my god said, on June 22, 2010 at 3:46 pm: you've been doing shrooms lately, eh?

A.I. in the Ditch | A View from the Right said, on May 11, 2014 at 10:02 pm: […] "You're Smarter Than You Think" […]