Locklin on science

Quantum computing as a field is obvious bullshit

Posted in non-standard computer architectures, physics by Scott Locklin on January 15, 2019

I remember spotting the quantum computing trend when I was  a larval physics nerdling. I figured maybe I could get in on the chuckwagon if my dissertation project didn’t work out in a big way (it didn’t). I managed to get myself invited to a Gordon conference, and have giant leather bound notebooks filled with theoretical scribblings containing material for 2-3 papers in them. I wasn’t real confident in my results, and I couldn’t figure out a way to turn them into something practical involving matter, so I happily matriculated to better things in the world of business.

When I say Quantum Computing is a bullshit field, I don’t mean everything in the field is bullshit, though to first order, this appears to be approximately true. I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible.  I also do not have a mathematical proof that we can make the artificial bacteria of K. Eric Drexler’s nanotech fantasies. Yet, I know both fields are bullshit. Both fields involve forming new kinds of matter that we haven’t the slightest idea how to construct. Neither field has a sane ‘first step’ to make their large claims true.

Drexler and the “nanotechnologists” who followed him assume that because we know about the Schroedinger equation, we can make artificial forms of life out of arbitrary forms of matter. This is nonsense; nobody understands enough about matter in detail or life in particular to do this. There are also reasonable thermodynamic, chemical and physical arguments against this sort of thing. I have opined on this at length, and at this point, I am so obviously correct on the nanotech front, there is nobody left to argue with me. A generation of people who probably would have made first-rate chemists or materials scientists wasted their early, creative careers following this overhyped and completely worthless woo. Billions of dollars were squandered down a rat hole of rubbish and wishful thinking. Legal wankers wrote reviews of regulatory regimes to protect us from this nonexistent technology. We even had congressional hearings on this nonsense topic back in 2003 and again in 2005 (and probably some other times I forgot about). The Russians built a nanotech park to cash in on the nanopocalyptic trillion-dollar nanotech economy which was supposed to have happened by now.

Similarly, “quantum computing” enthusiasts expect you to overlook the fact that they haven’t a clue as to how to build and manipulate the quantum coherent forms of matter necessary to achieve quantum computation. A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates that would be required if we didn’t already know how to factor 15. The actual number of gates needed to factor an n-bit number is 72 * n^3; 15 is a 4-bit number, so that’s 4608 gates; not happening any time soon.
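The 72 * n^3 figure is easy to sanity-check. A quick back-of-the-envelope sketch (the function name is mine; the formula is just the estimate quoted above, and real circuit constructions vary):

```python
# Rough gate-count estimate from the text: factoring an n-bit number
# with Shor's algorithm needs on the order of 72 * n^3 quantum gates.
def shor_gate_estimate(n_bits: int) -> int:
    """Back-of-the-envelope gate count for factoring an n-bit integer."""
    return 72 * n_bits ** 3

print(shor_gate_estimate(4))     # 15 is a 4-bit number: 4608 gates
print(shor_gate_estimate(2048))  # an RSA-2048 modulus: ~6.2e11 gates
```

Note that the 2048-bit case, the smallest RSA modulus anyone cares about breaking, already calls for roughly 6 x 10^11 gate operations, every one of which has to work.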

It’s been almost 25 years since Peter Shor had his big idea, and we are no closer to factoring large numbers than we were … 15 years ago when we were also able to kinda sorta vaguely factor the number 15 using NMR ‘quantum computers.’

I had this conversation with a pal at … a nice restaurant near one of America’s great centers of learning. Our waiter was amazed and shared with us the fact that he had done a Ph.D. thesis on the subject of quantum computing. My pal was convinced by this that my skepticism is justified; in fact he accused me of arranging it. I didn’t, but I am motivated to write to prevent future Ivy League Ph.D.-level talent from having to make a living by bringing a couple of finance nerds their steaks.

In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place. That argument still stands. No observable progress has taken place. However, 8 years is a very long time. Ph.D. dissertations have been written, and many of these people have gone on to careers … some of which involve bringing people like me delicious steaks. Hundreds of quantum computing charlatans achieved tenure in that period of time. According to Google Scholar, a half million papers have been written on the subject since then.


There are now three major .com firms funding quantum computing efforts: IBM, Google and Microsoft. There is at least one YC/Andreessen-backed startup I know of. Of course there is also D-Wave, which has somehow managed to exist since 1999, almost 20 years, without actually delivering something usefully quantum or computing. How many millions have been flushed down the toilet by these turds? How many millions which could have been used building, say, ordinary analog or stochastic computers which do useful things? None of these have delivered a useful quantum computer with even one usefully error-corrected qubit. I suppose I shed not too many tears for the money spent on these efforts; in my ideal world, several companies on that list would be broken up or forced to fund Bell Labs moonshot efforts anyway, and most venture capitalists are frauds who deserve to be parted with their money. I do feel sad for the number of young people taken in by this quackery. You’re better off reading ancient Greek than studying a ‘technical’ subject that eventually involves bringing a public school kid like me a steak. Hell, you are better off training to become an exorcist or a feng shui practitioner than getting a Ph.D. in ‘quantum computing.’

I am an empiricist and a phenomenologist. I consider the lack of one error-corrected qubit in the history of the human race to be adequate evidence that this is not a serious enough field to justify using the word ‘field.’ Most of it is, frankly, a scam. Plenty of time to collect tenure and accolades before people realize this isn’t normative science or much of anything reasonable.

As I said last year

All you need do is look at history: people had working (digital) computers before Von Neumann and other theorists ever noticed them. We literally have thousands of “engineers” and “scientists” writing software and doing “research” on a machine that nobody knows how to build. People dedicate their careers to a subject which doesn’t exist in the corporeal world. There isn’t a word for this type of intellectual flatulence other than the overloaded term “fraud,” but there should be.

“Computer scientists” have gotten involved in this chuckwagon. They have added approximately nothing to our knowledge of the subject, and as far as I can tell, their educational backgrounds preclude them ever doing so. “Computer scientists” haven’t had proper didactics in learning quantum mechanics, and virtually none of them have ever done anything as practical as fiddling with an op-amp, building an AM radio or noticing how noise works in the corporeal world.

Such towering sperg-lords actually think that the only problems with quantum computing are engineering problems. When I read things like this, I can hear them muttering mere engineering problems. Let’s say, for the sake of argument, that this were true. The SR-71 was technically a mere engineering problem after the Bernoulli effect was explicated in 1738. Would it have been reasonable to have a hundred or a thousand people writing flight plans for the SR-71 as a profession in 1760? No.

A reasonable thing for a 1760s scientist to do is invent materials making a heavier than air craft possible. Maybe fool around with kites and steam engines. And even then … there needed to be several important breakthroughs in metallurgy (titanium wasn’t discovered until 1791), mining, a functioning petrochemical industry, formalized and practical thermodynamics, a unified field theory of electromagnetism, chemistry, optics, manufacturing and arguably quantum mechanics, information theory, operations research and a whole bunch of other stuff which was unimaginable in the 1760s. In fact, of course the SR-71 itself was completely unimaginable back then. That’s the point.

 

it’s just engineering!


Physicists used to be serious and bloody-minded people who understood reality by doing experiments. Somehow this sort of bloody-minded seriousness has faded out into a tower of wanking theorists who only occasionally have anything to do with actual matter. I trace the disease to the rise of the “meritocracy” out of cow colleges in the 1960s. The post-WW2 neoliberal idea was that geniuses like Einstein could be mass-produced out of peasants using agricultural schools. The reality is, the peasants are still peasants, and the total number of Einsteins in the world, or even of merely serious thinkers about physics, is probably something like a fixed number. It’s really easy, though, to create a bunch of crackpot narcissists who have the egos of Einstein without the exceptional work output. All you need to do there is teach them how to do some impressive-looking mathematical Cargo Cult science, and keep their “results” away from any practical men doing experiments.

The manufacture of a large caste of such boobs has made any real progress in physics impossible without killing off a few generations of them. The vast, looming, important questions of physics; the kinds that a once in a lifetime physicist might answer -those haven’t budged since the early 60s. John Horgan wrote a book observing that science (physics in particular) has pretty much ended any observable forward progress since the time of cow collitches. He also noticed that instead of making progress down fruitful lanes or improving detailed knowledge of important areas, most develop enthusiasms for the latest non-experimental wank fest; complexity theory, network theory, noodle theory. He thinks it’s because it’s too difficult to make further progress. I think it’s because the craft is now overrun with corrupt welfare queens who are play-acting cargo cultists.

Physicists worthy of the name are freebooters; Vikings of the Mind, intellectual adventurers who torture nature into giving up its secrets and risk their reputation in the real world. Modern physicists are … careerist ding dongs who grub out a meagre living sucking on the government teat, working their social networks, giving their friends reach arounds and doing PR to make themselves look like they’re working on something important. It is terrible and sad what happened to the king of sciences. While there are honest and productive physicists, the mainstream of it is lost, possibly forever to a caste of grifters and apple polishing dingbats.

When a subject claims to be a technology, yet lacks even the rudiments of the experiments which might one day make it into a technology, you can know with absolute certainty that this ‘technology’ is total nonsense. Quantum computing is less physical than the engineering of interstellar spacecraft; we at least have plausible physical mechanisms to achieve interstellar space flight.

We’re reaching peak quantum computing hyperbole. According to a dimwit at the Atlantic, quantum computing will end free will. According to another one at Forbes, “the quantum computing apocalypse is imminent.” Rachel Gutman and Shlomo Dolev know about as much about quantum computing as I do about 12th century Talmudic studies, which is to say, absolutely nothing. They, however, think they know smart people who tell them that this is important: they’ve achieved the perfect human informational centipede. This is unquestionably the right time to go short.

Even the National Academy of Sciences has taken note that there might be a problem here. They put together 13 actual quantum computing experts who poured cold water on all the hype. They wrote a 200-page review article on the topic, pointing out that even with the most optimistic projections, RSA is safe for another couple of decades, and that there are huge gaps in our knowledge of how to build anything usefully quantum computing. And of course, they also pointed out that if QC doesn’t start solving some problems which are interesting to … somebody, the funding is very likely to dry up. Ha, ha; yes, I’ll have some pepper on that steak.


 

There are several reasonable arguments against any quantum computing of the interesting kind (i.e., the kind that can demonstrate supremacy on a useful problem) ever having a physical embodiment.

One of the better arguments is akin to that against P=NP. No, not the argument that “if there was such a proof someone would have come up with it by now” -but that one is also in full effect. In principle, classical analog computers can solve NP-hard problems in P time. You can google around on the “downhill principle” or look at the work on Analog super-Turing architectures by people like Hava Siegelmann. It’s old stuff, and most sane people realize this isn’t really physical, because matter isn’t infinitely continuous. If you can encode a real/continuous number into the physical world somehow, P=NP using a protractor or soap-bubble. For whatever reasons, most complexity theorists understand this, and know that protractor P=NP isn’t physical.  Somehow quantum computing gets a pass, I guess because they’ve never attempted to measure anything in the physical world beyond the complexity of using a protractor.
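As a toy illustration of why the protractor gambit isn’t physical: packing k bits into a single continuous value means your readout has to survive noise smaller than roughly 2^-k, which shrinks exponentially with problem size. This is my own sketch, not anybody’s published construction:

```python
# Encode a bit string as a single real number in [0, 1), then read it back.
# The tolerable measurement noise halves with every extra bit you store.
def encode(bits):
    """Pack bits into one 'analog' value: bit i contributes b / 2^(i+1)."""
    return sum(b / 2 ** (i + 1) for i, b in enumerate(bits))

def decode(x, k):
    """Recover k bits from the continuous value x."""
    return [int(x * 2 ** (i + 1)) % 2 for i in range(k)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
x = encode(bits)
assert decode(x, 8) == bits             # exact readout works fine
assert decode(x + 2 ** -10, 8) == bits  # noise well below 2^-8: still fine
assert decode(x + 2 ** -6, 8) != bits   # noise above 2^-8: readout garbled
print("8 bits survive noise ~2^-10 but not ~2^-6")
```

The same exponential-precision objection is exactly what gets waved away in the quantum case.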

In order to build a quantum computer, you need to control each qubit, which is a continuous value, not a binary value, in its initial state and subsequent states precisely enough to run the calculation backwards. When people do their calculations ‘proving’ the efficiency of quantum computers, this is treated as an engineering detail. There are strong assertions by numerous people that quantum error correction (which, I will remind everyone, hasn’t been usefully implemented in actual matter by anyone -that’s the only kind of proof that matters here) basically pushes the analog requirement for perfection to the initialization step, or subsumes it in some other place where it can’t exist. Let’s assume for the moment that this isn’t the case.

Putting this a different way: for an N-qubit computer, you need to control, transform, and read out 2^N complex (as in complex numbers) amplitudes to a very high degree of precision. Consider even a classical analog computer with N oscillators, each of which must be precisely initialized, precisely controlled, transformed and individually read out, to the point where you could reverse the computation by running the oscillators through the computation backwards: that alone is an extremely challenging task. The quantum version is exponentially more difficult.
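To put rough numbers on that bookkeeping: merely storing the 2^N amplitudes on a classical machine (at 16 bytes per double-precision complex number) blows up almost immediately. A quick sketch:

```python
# Number of complex amplitudes in an N-qubit state, and the classical
# memory needed just to store them at 16 bytes per complex128 value.
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (10, 30, 50):
    n_amps = amplitudes(n)
    print(f"{n} qubits: {n_amps} amplitudes, {n_amps * 16 / 1e9:.3g} GB")
# 30 qubits already need ~17 GB; 50 qubits need ~18,000 TB.
```

A quantum computer doesn’t store these numbers explicitly, of course, but the computation still has to steer all of them coherently.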

Making it even more concrete: if we encode the polarization state of a photon as a qubit, how do we perfectly align the polarizers between two qubits? How do we align them for N qubits? How do we align the polarization direction with the gates? This isn’t some theoretical gobbledygook; when it comes time to build something in physical reality, physical alignments matter, a lot. Ask me how I know. You can go amuse yourself and try to build a simple quantum computer with a couple of hard-coded gates using beamsplitters and polarization states of photons. It’s known to be perfectly possible and even has a rather sad wikipedia page. I can make quantum polarization-state entangled photons all day; any fool with a laser and a KDP crystal can do this, yet somehow nobody bothers sticking some beamsplitters on a breadboard and making a quantum computer. How come? Well, one guy recently did it: got two whole qubits. You can go read about this *cough* promising new idea here, or if you are someone who doesn’t understand matter, here.

FWIW, in the early days of this idea, it was noticed that the growth in the number of components needed was exponential in the number of qubits. Well, this shouldn’t be a surprise: the growth in the number of states in a quantum computer is also exponential in the number of qubits. That’s both the ‘interesting thing’ and ‘the problem.’ The ‘interesting thing’ because an exponential number of states, if possible to trivially manipulate, allows for a large speedup in calculations. ‘The problem’ because manipulating an exponential number of states is not something anyone really knows how to do.

The problem doesn’t go away if you use spins of electrons or nuclei; which direction is spin up? Will all the physical spins be perfectly aligned in the “up” direction? Will the measurement devices agree on spin-up? Do all the gates agree on spin-up? In the world of matter, of course they won’t; you will have a projection. That projection is, in effect, correlated noise, and correlated noise destroys quantum computation in an irrecoverable way. Even the quantum error correction people understand this, though for some reason people don’t worry about it too much. If they are honest in their lack of worry, this is because they’ve never fooled around with things like beamsplitters. Hey, making the noise uncorrelated; that’s just an engineering problem, right? Sort of like making artificial life out of silicon, controlled nuclear fusion power or Bussard ramjets is “just an engineering problem.”

engineering problem; easier than quantum computers

 

Of course at some point someone will mention quantum error correction, which allows us to not have to precisely measure and transform everything. The most optimistic estimate of the required precision is something like 10^-5 per qubit/gate operation for quantum error corrected computers. This is a fairly high degree of precision. Going back to my polarization angle example, this implies all the polarizers, optical elements and gates in a complex system are aligned to 0.036 degrees. I mean, I know how to align a couple of beamsplitters and polarizers to 628 microradians, but I’m not sure I can align a few hundred thousand of them AND Pockels cells and mirrors to within 628 microradians of each other. Now imagine something with a realistic number of qubits for factoring large numbers; maybe 10,000 qubits, and a CPU’s worth of gates, say 10^10 or so (an underestimate of the number needed for cracking RSA, which, mind you, is the only reason we’re having this conversation). I suppose it is possible, but I encourage any budding quantum wank^H^H^H algorithmist out there to have a go at aligning 3-4 optical elements to within this precision. There is no time limit, unless you die first, in which case “time’s up!”
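The unit conversion is trivial to check. (This only verifies that 0.036 degrees and 628 microradians are the same tolerance; mapping a 10^-5 per-gate error budget onto an alignment angle depends on the details of the error model.)

```python
import math

# Convert the claimed alignment tolerance from degrees to microradians.
angle_deg = 0.036
angle_urad = math.radians(angle_deg) * 1e6
print(round(angle_urad))  # prints 628
```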

This is just the most obvious engineering limitation for making sure we don’t have obviously correlated noise propagating through our quantum computer. We must also be able to prepare the initial states to within this sort of precision. Then we need to be able to measure the final states to within this sort of precision. And we have to be able to do arbitrary unitary transformations on all the qubits.

Just to interrupt you with some basic facts: the number of states we’re talking about here for a 4000-qubit computer is ~2^4000! That’s 10^1200 or so continuous variables we have to manipulate to at least one part in a hundred thousand. The number of protons in the universe is about 10^80. This is why a quantum computer is so powerful; you’re theoretically encoding an exponential number of states into the thing. Can anyone actually do this using a physical object? Citations needed; as far as I can tell, nothing like this has ever been done in the history of the human race. Again, interstellar space flight seems like a more achievable goal. Even Drexler’s nanotech fantasies have some precedent in the form of actually existing life forms. Yet none of these are coming any time soon either.
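The arithmetic here is a one-liner to verify:

```python
import math

# 2^4000 expressed as a power of ten, versus ~10^80 protons in the
# observable universe.
digits = 4000 * math.log10(2)
print(round(digits))       # prints 1204: 2^4000 is about 10^1204
print(round(digits) - 80)  # ~1124 orders of magnitude beyond the proton count
```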

There are reasons to believe that quantum error correction, too, isn’t even theoretically possible (examples here and here and here; this last one is particularly damning). In addition to the argument above that the theorists are subsuming some actual continuous number into what is inherently a noisy and non-continuous machine made out of matter, the existence of a quantum error corrected system would mean you can make arbitrarily precise quantum measurements, effectively giving you back your exponentially precise continuous number. If you can do exponentially precise continuous numbers in a non-exponential number of calculations or measurements, you can probably solve very interesting problems on a relatively simple analog computer. Let’s say, a classical one like a Toffoli gate billiard ball computer. Get to work; we know how to make a billiard ball computer work with crabs. This isn’t an example chosen at random. This is the kind of argument allegedly serious people submit for quantum computation involving matter. Hey man, not using crabs is just an engineering problem muh Church Turing warble murble.

Smurfs will come back to me with the press releases of Google and IBM touting their latest 20 bit stacks of whatever. I am not impressed, and I don’t even consider most of these to be quantum computing in the sense that people worry about quantum supremacy and new quantum-proof public key or Zero Knowledge Proof algorithms (which more or less already exist). These cod quantum computing machines are not expanding our knowledge of anything, nor are they building towards anything for a bold new quantum supreme future; they’re not scalable, and many of them are not obviously doing anything quantum or computing.

This entire subject does nothing but eat up lives and waste careers. If I were in charge of science funding, the entire world budget for this nonsense would be below what we allocate for the development of Bussard ramjets, which are also not known to be impossible, and are a lot more cool looking.

 

 

As Dyakonov put it in his 2012 paper:

“A somewhat similar story can be traced back to the 13th century when Nasreddin Hodja made a proposal to teach his donkey to read and obtained a 10-year grant from the local Sultan. For his first report he put breadcrumbs between the pages of a big book, and demonstrated the donkey turning the pages with his hoofs. This was a promising first step in the right direction. Nasreddin was a wise but simple man, so when asked by friends how he hopes to accomplish his goal, he answered: “My dear fellows, before ten years are up, either I will die or the Sultan will die. Or else, the donkey will die.”

Had he the modern degree of sophistication, he could say, first, that there is no theorem forbidding donkeys to read. And, since this does not contradict any known fundamental principles, the failure to achieve this goal would reveal new laws of Nature. So, it is a win-win strategy: either the donkey learns to read, or new laws will be discovered.”

Further reading on the topic:

Dyakonov’s recent IEEE popsci article on the subject (his papers are the best review articles of why all this is silly):

https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing

IEEE precis on the NAS report:

https://spectrum.ieee.org/tech-talk/computing/hardware/the-us-national-academies-reports-on-the-prospects-for-quantum-computing (summary: not good)

Amusing blog from 11 years ago noting the utter lack of progress in this subject:

http://emergentchaos.com/archives/2008/03/quantum-progress.html

“To factor a 4096-bit number, you need 72*4096^3 or 4,947,802,324,992 quantum gates. Let’s just round that up to an even 5 trillion. Five trillion is a big number.”

Aaronson’s articles of faith (I personally found them literal laffin’ out loud funny, though I am sure he is in perfect earnest):

https://www.scottaaronson.com/blog/?p=124

 

41 Responses


  1. pucenoise said, on January 15, 2019 at 2:42 am

    Yeah this shitwagon is full bore. Normally I would be working on industry related transport modelling for devices but $1.2 billion of funding punted me into making transport models for their optical gizmos.

    At least non-equilibrium quantum statistical mechanics is universal; one (un)surprising discovery we’ve made so far is that not only do the quantum information people have their heads up their asses, but so do the quantum optics people, so I’m not entirely wasting my time.

    • Scott Locklin said, on January 15, 2019 at 2:51 am

      Back in my day, guys like Serge Haroche were first in line to declare all of this to be the sheerest, most obvious bullshit for basically being cavities with Q-factor infinity. Then I guess he won the Nobel Prize, someone slapped “quantum computing” on his achievements (which were nothing of the sort) and the chuckwagon got that much more amped up.

      Anyway, milk it for what it is worth.

      • pucenoise said, on January 15, 2019 at 3:53 pm

        Yes indeed. What is most irksome is how experts in adjacent disciplines often have the answers, but are not asked. It seems many engineers and applied physicists like Dyakonov see through QC the way a lightsaber cuts butter, but they are infrequently consulted.

        In other news, this guy seems to be your doppelganger w.r.t. AI (although I vaguely remember you excoriating AI a while back too):

        https://blog.piekniewski.info/2018/05/28/ai-winter-is-well-on-its-way/

        That post went viral and created an enormous fuss, so I’ll be sure to spread this post on QC like herpes at a rave to make sure it gets the same treatment.

        • Scott Locklin said, on January 15, 2019 at 4:07 pm

          Yes, I passed that one around myself, and I appreciate you spreading my disease.

          It’s true; if you work in data science or AI or whatever they call it today, it’s abundantly obvious we’ve passed our Wile E. Coyote over-the-cliff moment. The OpenAI institute was the last straw; it looked too much like the “center for responsible nanotechnology.” I hadn’t heard of Piekniewski before reading his blog, but he’s absolutely right about the fundamental problems with AI. So many things we can’t do. I have no idea if his program is any good, but at least he’s trying. The rest of the field are just camp followers.
          https://blog.piekniewski.info/about/

  2. Oleh Danyliv said, on January 15, 2019 at 10:01 am

    Nicely put. Very good read. I like your thoughts about Physics and Physicists. I totally share your opinion about the current state of science, although charlatans also existed in the times of Newton and Einstein. They were promising everlasting life and gold from lead, but similarly they were paid by the governments (kings). Those guys were artists and in many cases deserve admiration; the current state of Physics is non-adventurous plugging of data into computers hoping to publish a new paper.

    I must disagree with your argument: “In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place. That argument still stands.” Similarly we try to overcome problem of getting energy from fusion reaction. When did we start? Fifty years ago? Huge amount of money and effort and the problem is still there. But it is solvable. I hope we will get a fully working fusion power plant in our lifetime.

    • maggette said, on January 15, 2019 at 12:01 pm

      IMHO Oleh has a point here. I think that is a kind of survivorship bias. We remember the heroes who ex post contributed major ideas.

      Obviously we don’t remember all the charlatans and crackpots…even if they had the newspaper front covers back then.

    • Scott Locklin said, on January 15, 2019 at 4:16 pm

      There is a really simple reason why we may never get this: the sun has the approximate energy density of a dung hill (seriously; go look it up). Sure, we can make hydrogen bombs with significantly higher energy densities. That doesn’t mean we can produce an arbitrary density of energy which is both self-sustaining fusion and at an energy scale convenient to human beings. I’ve debunked various press releases over time which seem to imply “fusion right around the corner” but which actually conceal failures of their program.

      Mind you, I’d love to see it, and I got a little excited by Tri-Alpha’s efforts. IMO aneutronic fusion is the only fusion worth anything for power generation; otherwise just use fission, which is cheap and much easier to achieve. The “no nuclear waste” thing is a canard when you’re generating a buttload of neutrons.

  3. Gaw said, on January 15, 2019 at 3:04 pm

    “There isn’t a word for this type of intellectual flatulence other than the overloaded term “fraud,” but there should be.” Boondoggle?

    • Scott Locklin said, on January 15, 2019 at 6:45 pm

      Good word, but it seems more serious and … sinful… than a boondoggle.

  4. roberthenryfischat said, on January 15, 2019 at 3:04 pm

    Reblogged this on robert's space.

  5. bertbert said, on January 15, 2019 at 3:45 pm

    I have related feelings about all the hype surrounding both “automation” and “genetic engineering.” There’s an endless stream of thought pieces about how soon we’ll all be unemployed and there will be a genetically engineered immortal overclass. This is so divorced from the reality in either field. Genetics stalled out without much to show for itself well over ten years ago (zero proven treatments), and robots are ages and ages away from making a decent sandwich or reliably driving a car. Everybody thought deep learning would be the trick, but it’s not turning out to be that simple, and nobody knows what to do next.

    • pucenoise said, on January 15, 2019 at 3:55 pm

      Not my field, but I always had a hazy feeling that DNNs looked an awful lot like a brute-force series expansion with slightly better than blindly tuned coefficients, and these guys argue it’s even worse than that:

      https://matloff.wordpress.com/2018/06/20/neural-networks-are-essentially-polynomial-regression/

    • Scott Locklin said, on January 15, 2019 at 6:38 pm

      People talk about automation removing jobs all the time. I always challenge people with this: “name one job which has been automated away by AI.” Nobody ever has a good answer.

      You’d think something involving simple calculations like accounting would be automated by now with all the predictions of singularities and so on, but while there are labor saving tools available, I need an accountant just as much as I did when I started paying taxes.

  6. Stanislav Datskovskiy said, on January 15, 2019 at 6:00 pm

    The “QC” racket is not merely a “string theory”-style grant-eater feeding trough; it is also a key component of NSA’s two-decade FUD campaign against the use of actually-strong crypto by the public (RSA, Cramer-Shoup, and variants) and in favour of questionable replacements (“elliptic curve” algos.)

    Hence the seemingly-bottomless pool of freshly-printed greenbacks made available to just about any charlatan who can pronounce “qubit”.

    • Scott Locklin said, on January 15, 2019 at 6:44 pm

      If there were real QCs, elliptic curves also submit to QC attacks; it’s basically the same thing as factoring.

      I don’t think it’s the spooks, who can’t even defeat a hawker of steaks in asymmetric warfare. I think it’s just a social problem that allows modern “scientists” to get away with goldbrick non-productive work which looks like the magical future. Almost all of science is turning into this. It happened to the humanities in the US decades ago. You can get a Ph.D. in English without knowing anything about the history of the language, or even having read very much; you just write essays about your feelings about people’s gonads and relate them to some famous novel.

      • Stanislav Datskovskiy said, on January 15, 2019 at 7:27 pm

        As you described — QCism is fundamentally powered by the post-war “Confucian” rot of NATO Reich academia, verily.

        But it is also an (apparently quite effective) disinformation sleight of hand trick: successfully scared quite a few gullible people/organizations out of the use of effective (4096+-bit RSA) public key crypto, and into the use of e.g. 256-bit ECDSA, via fraudulent proofs of “equivalent strength”, “QC resistance”, etc.

  7. Christopher said, on January 16, 2019 at 5:26 am

    How do you feel about the “renewable energy” industry as it stands today (i.e. solar, wind and maybe some other tech, depending on who’s talking)? Because I feel like a lot of the same problems apply there as well.

    True, solar and wind have a leg up on quantum computing and nanomachines (son!) in actually existing in real physical space in the current year, but the problem is some people (mostly leftists) want solar and wind to replace ALL fossil fuels and nuclear ASAP. I think that’s just not possible: first, because the power density of those energy sources is low (so deployment would take time, and they would litter the countryside in ways even fossil fuels don’t). Also, they inconveniently deliver power at the wrong times, meaning you have to dump the excess and import any deficits … which usually means from other power sources, given the weakness of battery technology. I believe you wrote a post on that topic a couple of years ago.

    They probably have their place in the “energy mix”, like hydroelectricity or nuclear (which also has its own hype-proponents, who ignore the high costs and the reality that to run those plants safely you need competent nuclear power companies and government regulators, which probably come right after the nanomachines, the quantum computers and the perpetual motion machines). But I doubt that’s on the minds of many of the solar-wind boosters, who want to use the Fight Against Climate Change as a replacement for the World-Wide Socialist Revolution that never happened (but totally should have).

    • Scott Locklin said, on January 17, 2019 at 2:18 am

      They certainly ruined the skyline of Germany with windmills.

      One of the best things I have ever seen is Jonathan Meades’ “Remember the Future”; you might like it.

      • ᴹᵃˣ Friedrich Hartmann (@mxfh) said, on January 21, 2019 at 6:29 pm

        While I dig Jonathan Meades (https://vimeo.com/109482700#t=1580s), the reversible impact of windmills on the German landscape is a joke compared to the memories of the smog smell of my childhood, and of my grandparents’ home village being plowed under in the post-GDR days of the late 90s. https://youtu.be/yzpGUEROpCs?t=1570

        • Scott Locklin said, on January 21, 2019 at 6:41 pm

          I’m sure it was awful. I’m guessing Germany could have done without some of the windmills if they had kept their nukes open. As I understand things, they burn more coal now for lack of the things, and they still have to contend with the French ones nearby if they’re worried about them exploding.

  8. John Baker said, on January 16, 2019 at 5:19 pm

    “Billions of dollars squandered down a rat hole of rubbish and wishful thinking.”

    Sounds like a great tag line for a lot of government contracting.

    This is your best post yet. I am willing to entertain any crazy idea but ultimately theories have to pay off — in actual monetary forms — to persist and prosper. Eventually the masses will observe that no interesting programs are running on so called quantum computers. I would love to be proved wrong here but as a hard ass skeptic I have to see “quantum supremacy” with my own eyes before granting it reality.

    • Scott Locklin said, on January 16, 2019 at 7:20 pm

      To me the real tragedy is people wasting their lives on “woo” garbage like this. My waiter really had a Ph.D. from MIT, and was making a living bringing clowns like me a steak. There aren’t many worse life outcomes than this which come of studying a complex mathematical topic for a half decade or whatever it takes. He could have gotten a degree in acupuncture or faith healing!

  9. Aravind Srinivas said, on January 17, 2019 at 6:19 am

    I noticed you dismissed complexity and network theory in the post. Why do you think those fields are bunk/have nothing new to offer? People like those over at the Santa Fe institute and Yaneer Bar-Yam at NECSI had some interesting things to say.

    • Scott Locklin said, on January 17, 2019 at 4:59 pm

      “Interesting things to say” versus “actual insights which increase our understanding of nature” basically. You can wank all the live long day about complexity and networky rubbish and not have a single explanatory or predictive output, and these subjects have basically done just this. Such nonsense is great for the academic who comes up with it; you become your mini Freud with infinite unfalsifiable wanking, which will be taken up by lots of other wankers, bumping up your ref count for what amounts to a few pages of creative writing and some grad student generated plots.
      Network theory is particularly bad, as it’s been caught in outright fraud, as I recall. Horgan’s book has chapters on this dating from the 90s when he wrote it. They’re worth reading for examples of “woo.” FWIIW: I was taken in by some of the woo myself.

  10. Anonymous said, on January 17, 2019 at 12:09 pm

    Never change :) I had almost lost hope of seeing another post, but here we are. Here are a couple of similar blogs; I don’t know if you know them:

    https://backreaction.blogspot.com – a physicist bashing string theorists and such
    https://tritonstation.wordpress.com – an astrophysicist bashing dark matter fans

    The physicist wrote a book and got some feedback from non-physicists:
    https://backreaction.blogspot.com/2019/01/letter-from-reader-we-get-trapped-doing.html – a physician complaining about sepsis research
    https://backreaction.blogspot.com/2019/01/letter-from-reader-whats-so-bad-about.html – a classical music critic complaining about musicology

    As for me, I’m a software developer/cs researcher and I noticed a while ago that a major trend in CS these days is to take a popular but idiotic technology like JS/Unix/C/whatever and try to make it better and call _that_ an important contribution. I mean, sure, a better C compiler makes the turd that is the Linux kernel also better but the quality ceiling is pretty low there.

    • Scott Locklin said, on January 17, 2019 at 5:30 pm

      This one involved a lot of checking and rechecking; I think blog posts are quadratic in length complexity. I also wrote some blockchain economics post which took away a lot of my writerly juices a few months ago (here if you care). Thanks for the links; will check them out.

      CS … I guess I have seen some idiotic technology contributions out of academia. I mean, it’s better than what physics is doing these days. At least they’re writing things that others may be able to use.

  11. 0xFF_ (@0xFF_) said, on January 21, 2019 at 5:11 pm

    “I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible.”

    It’s like you never heard of valence and conduction band quantization in semiconductors. #quantumMechanics

    https://en.wikipedia.org/wiki/Electronic_band_structure

    • Scott Locklin said, on January 21, 2019 at 5:21 pm

      Yay, I am typing on a quantum computer!

  12. victor yodaiken said, on January 21, 2019 at 10:04 pm

    >True, solar and wind have a leg up on quantum computing and nanomachines (son!) in actually existing in real physical space in the current year

    In other words, they work in the real world, their costs are rapidly going down, they are cheap to operate, and they outcompete nuclear/coal in the market despite massive public subsidies for the latter. Therefore, they must be fraudulent, because “liberals” like them. Fascinating.

    • Scott Locklin said, on January 21, 2019 at 10:57 pm

      I’m pretty sure solar and wind also have massive public subsidies.

      FWIIW I don’t dislike either option, and windmills have been reasonably cost efficient for a while now, but trying to run an industrialized society at present standards of living based on them doesn’t pass the sniff test. Nukes or hydrocarbons will be needed; you got to pick one of them. Germany picked hydrocarbons.

      • victor yodaiken said, on January 21, 2019 at 11:33 pm

        Sniff test can’t compete with real engineering and market pricing. In practice, firms are shuttering coal and nukes and turning to wind/solar and the huge power source of efficiencies. If you can replace 100w incandescent with 12w led you are effectively generating 88w. Obviously a mix will be needed for some time, but there is enormous room for improvement using these technologies derided as ephemeral by people who don’t get the power distribution system.

        • Scott Locklin said, on January 21, 2019 at 11:44 pm

          I guess I have a hard time imagining aluminum plants running on battery power from stored solar/wind sources. Maybe I just lack imagination.

          https://scottlocklin.wordpress.com/2010/01/09/u-s-energy-independence-hard-numbers/

          I certainly didn’t think of fracking back when I wrote this.

            • Scott Locklin said, on January 22, 2019 at 2:34 am

              Germany is about as dedicated to this sort of thing as is possible, and is easily the most likely country to have the engineering chops to pull it off. Yet somehow they only have 15% renewables in their power generation stack. Seems like spinning up a few nuke plants is more prudent than burning a shitload of lignite. Maybe noticing this makes me a bad person.

              I mean, energy from the sun ain’t real dense there. They also have clouds.

              • Toddy Cat said, on February 1, 2019 at 8:08 pm

                Maybe the Germans will make this work, but this isn’t the first blind alley they have barrelled down, yapping happily as they go. For a highly intelligent people, the Germans have certainly fallen for some doozies in the last few hundred years…

  13. Dave said, on January 21, 2019 at 10:47 pm

    Could you elaborate on what you mean by the following passage:

    “In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor a n-bit number is 72 * n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.”

    In particular, how did they leave out gates? and how did that depend on knowing how to factor 15? Thanks!

    • Scott Locklin said, on January 21, 2019 at 10:54 pm

      Well, if you know what the answer is, you don’t need to perform 72*n^3 operations on the bits. You can perform the operations that say “3” and “5.” That’s what they all do. Sure, they do some quantum-ish thing in between and the answer comes out right: it’s not the real calculation.

      Since we’re only interested in factoring big numbers where we don’t know the answer in advance, this seems rather … deceptive.

      • victor yodaiken said, on January 21, 2019 at 11:03 pm

        During an earlier AI triumph, in the 90s, there was a stunning success where an AI system at Yale was shown to be able to translate a paragraph of Chinese into English and back. Later it was discovered that the system was limited to that specific paragraph.

      • Dave said, on January 21, 2019 at 11:44 pm

        Are you referring to the original NMR paper linked below?

        https://arxiv.org/pdf/quant-ph/0112176.pdf

        In Figure 1a on page 15 (in the arXiv version above), they show the Phase Estimation circuit with the “modular exponential oracle” as prescribed by Shor’s factoring algorithm.

        In Figure 2a they show their compilation for a = 7 (see Figure 1 caption and main text on page 3) and how they “omit” gates, but this is standard compiling and independent of knowing the solution to the problem. E.g., any worthwhile compiler discards two successive CNOT gates or a CNOT gate controlled on a zero qubit.

        Is this what you’re referring to?

        • Scott Locklin said, on January 21, 2019 at 11:56 pm

          Did they do 4000 gate operations on their states? Nope.

          “The quantum circuit of Fig. 1 was realized with a sequence of ∼300 (a = 7) spin-selective radio-frequency (RF) pulses separated by time intervals of free evolution under the Hamiltonian (Fig. 3). The pulse sequence is designed such that the resulting transformations of the spin states correspond to the computational steps in the algorithm.”

  14. zephirawt said, on January 25, 2019 at 7:57 pm

    See also The Case Against Quantum Computing, Gil Kalai’s argument Against Quantum Computers
    and Quantum strategies fail to improve capacity of quantum optical channels

    It comes as no big surprise, because quantum computers are potentially fast but very noisy, and they operate at low precision (a low number of qubits), whereas classical computers are slower (at least in principle) but far more reliable and reproducible. The computational power of quantum computers is given by the product of processing speed and precision, and it remains limited by the uncertainty principle in the same way as that of classical ones. At the moment the computational power of classical computers hits its "physical limits", the application of quantum computers cannot bring any improvement. The same applies to the bandwidth of quantum links.

