Locklin on science

Quantum computing as a field is obvious bullshit

Posted in non-standard computer architectures, physics by Scott Locklin on January 15, 2019

I remember spotting the quantum computing trend when I was  a larval physics nerdling. I figured maybe I could get in on the chuckwagon if my dissertation project didn’t work out in a big way (it didn’t). I managed to get myself invited to a Gordon conference, and have giant leather bound notebooks filled with theoretical scribblings containing material for 2-3 papers in them. I wasn’t real confident in my results, and I couldn’t figure out a way to turn them into something practical involving matter, so I happily matriculated to better things in the world of business.

When I say Quantum Computing is a bullshit field, I don’t mean everything in the field is bullshit, though to first order, this appears to be approximately true. I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible.  I also do not have a mathematical proof that we can or can’t make the artificial bacteria of K. Eric Drexler’s nanotech fantasies. Yet, I know both fields are bullshit. Both fields involve forming new kinds of matter that we haven’t the slightest idea how to construct. Neither field has a sane ‘first step’ to make their large claims true.

Drexler and the “nanotechnologists” who followed him assume that because we know about the Schroedinger equation, we can make artificial forms of life out of arbitrary forms of matter. This is nonsense; nobody understands enough about matter in detail or life in particular to do this. There are also reasonable thermodynamic, chemical and physical arguments against this sort of thing. I have opined on this at length, and at this point, I am so obviously correct on the nanotech front, there is nobody left to argue with me. A generation of people who probably would have made first rate chemists or materials scientists wasted their early, creative careers following this overhyped and completely worthless woo. Billions of dollars squandered down a rat hole of rubbish and wishful thinking. Legal wankers wrote legal reviews of regulatory regimes to protect us from this nonexistent technology. We even had congressional hearings on this nonsense topic back in 2003 and again in 2005 (and probably some other times I forgot about). The Russians built a nanotech park to cash in on the nanopocalyptic trillion dollar nanotech economy which was supposed to happen by now.

Similarly, “quantum computing” enthusiasts expect you to overlook the fact that they haven’t a clue as to how to build and manipulate the quantum coherent forms of matter necessary to achieve quantum computation. A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor an n-bit number is 72 * n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.
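A back-of-the-envelope sketch of that arithmetic, in Python. The 72 * n^3 constant is the figure quoted in this post; real resource estimates vary with the circuit construction, so treat it as illustrative:

```python
# Gate-count estimate for Shor-style factoring of an n-bit number,
# using the 72 * n^3 figure quoted in the text.
def shor_gate_estimate(n_bits: int) -> int:
    return 72 * n_bits ** 3

print(shor_gate_estimate(4))     # 15 is a 4-bit number: 4,608 gates
print(shor_gate_estimate(4096))  # a 4096-bit modulus: 4,947,802,324,992 gates
```

The second number is the same roughly-five-trillion-gate figure quoted from the 2008 blog post near the end of this article.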

It’s been almost 25 years since Peter Shor had his big idea, and we are no closer to factoring large numbers than we were … 15 years ago when we were also able to kinda sorta vaguely factor the number 15 using NMR ‘quantum computers.’

I had this conversation with a pal at … a nice restaurant near one of America’s great centers of learning. Our waiter was amazed and shared with us the fact that he had done a Ph.D. thesis on the subject of quantum computing. My pal was convinced by this that my skepticism is justified; in fact he accused me of arranging it. I didn’t, but I am motivated to write to prevent future Ivy League Ph.D. level talent from having to make a living by bringing a couple of finance nerds their steaks.

In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place. That argument still stands. No observable progress has taken place. However, 8 years is a very long time. Ph.D. dissertations have been completed, and many of these people have gone on to careers … some of which involve bringing people like me delicious steaks. Hundreds of quantum computing charlatans achieved tenure in that period of time. According to Google Scholar, a half million papers have been written on the subject since then.

[screenshot: Google Scholar results for quantum computing papers]

There are now three major .com firms funding quantum computing efforts: IBM, Google and Microsoft. There is at least one YC/Andreessen backed startup I know of. Of course there is also D-Wave, which has somehow managed to exist since 1999; almost 20 years, without actually delivering something usefully quantum or computing. How many millions have been flushed down the toilet by these turds? How many millions which could have been used building, say, ordinary analog or stochastic computers which do useful things? None of these have delivered a useful quantum computer which has even one usefully error corrected qubit. I suppose I shed not too many tears for the money spent on these efforts; in my ideal world, several companies on that list would be broken up or forced to fund Bell Labs moonshot efforts anyway, and most venture capitalists are frauds who deserve to be parted with their money. I do feel sad for the number of young people taken in by this quackery. You’re better off reading ancient Greek than studying a ‘technical’ subject that eventually involves bringing a public school kid like me a steak. Hell, you are better off training to become an exorcist or a feng shui practitioner than getting a Ph.D. in ‘quantum computing.’

I am an empiricist and a phenomenologist. I consider the lack of one error corrected qubit in the history of the human race to be adequate evidence that this is not a serious enough field to justify using the word ‘field.’ Most of it is, frankly, a scam. Plenty of time to collect tenure and accolades before people realize this isn’t normative science or much of anything reasonable.

As I said last year:

All you need do is look at history: people had working (digital) computers before Von Neumann and other theorists ever noticed them. We literally have thousands of “engineers” and “scientists” writing software and doing “research” on a machine that nobody knows how to build. People dedicate their careers to a subject which doesn’t exist in the corporeal world. There isn’t a word for this type of intellectual flatulence other than the overloaded term “fraud,” but there should be.

“Computer scientists” have gotten involved in this chuckwagon. They have added approximately nothing to our knowledge of the subject, and as far as I can tell, their educational backgrounds preclude them from ever doing so. “Computer scientists” haven’t had proper didactics in learning quantum mechanics, and virtually none of them have ever done anything as practical as fiddling with an op-amp, building an AM radio or noticing how noise works in the corporeal world.

Such towering sperg-lords actually think that the only problems with quantum computing are engineering problems. When I read things like this, I can hear them muttering mere engineering problems.  Let’s say, for the sake of argument this were true. The SR-71 was technically a mere engineering problem after the Bernoulli effect was explicated in 1738. Would it be reasonable to have a hundred or a thousand people writing flight plans for the SR-71  as a profession in 1760? No.

A reasonable thing for a 1760s scientist to do is invent materials making a heavier than air craft possible. Maybe fool around with kites and steam engines. And even then … there needed to be several important breakthroughs in metallurgy (titanium wasn’t discovered until 1791), mining, a functioning petrochemical industry, formalized and practical thermodynamics, a unified field theory of electromagnetism, chemistry, optics, manufacturing and arguably quantum mechanics, information theory, operations research and a whole bunch of other stuff which was unimaginable in the 1760s. In fact, of course the SR-71 itself was completely unimaginable back then. That’s the point.

 

it’s just engineering!

Physicists used to be serious and bloody minded people who understood reality by doing experiments. Somehow this sort of bloody minded seriousness has faded out into a tower of wanking theorists who only occasionally have anything to do with actual matter. I trace the disease to the rise of the “meritocracy” out of cow colleges in the 1960s. The post WW-2 neoliberal idea was that geniuses like Einstein could be mass produced out of peasants using agricultural schools. The reality is, the peasants are still peasants, and the total number of Einsteins in the world, or even merely serious thinkers about physics is probably something like a fixed number. It’s really easy, though, to create a bunch of crackpot narcissists who have the egos of Einstein without the exceptional work output. All you need to do there is teach them how to do some impressive looking mathematical Cargo Cult science, and keep their “results” away from any practical men doing experiments.

The manufacture of a large caste of such boobs has made any real progress in physics impossible without killing off a few generations of them. The vast, looming, important questions of physics; the kinds that a once in a lifetime physicist might answer -those haven’t budged since the early 60s. John Horgan wrote a book observing that science (physics in particular) has pretty much ended any observable forward progress since the time of cow collitches. He also noticed that instead of making progress down fruitful lanes or improving detailed knowledge of important areas, most physicists develop enthusiasms for the latest non-experimental wank fest; complexity theory, network theory, noodle theory. He thinks it’s because it’s too difficult to make further progress. I think it’s because the craft is now overrun with corrupt welfare queens who are play-acting cargo cultists.

Physicists worthy of the name are freebooters; Vikings of the Mind, intellectual adventurers who torture nature into giving up its secrets and risk their reputation in the real world. Modern physicists are … careerist ding dongs who grub out a meagre living sucking on the government teat, working their social networks, giving their friends reach arounds and doing PR to make themselves look like they’re working on something important. It is terrible and sad what happened to the king of sciences. While there are honest and productive physicists, the mainstream of it is lost, possibly forever to a caste of grifters and apple polishing dingbats.

But when a subject which claims to be a technology lacks even the rudiments of experiment which may one day make it into a technology, you can know with absolute certainty that this ‘technology’ is total nonsense. Quantum computing is less physical than the engineering of interstellar spacecraft; we at least have plausible physical mechanisms to achieve interstellar space flight.

We’re reaching peak quantum computing hyperbole. According to a dimwit at the Atlantic, quantum computing will end free will. According to another one at Forbes, “the quantum computing apocalypse is imminent.” Rachel Gutman and Shlomo Dolev know about as much about quantum computing as I do about 12th century Talmudic studies, which is to say, absolutely nothing. They, however, think they know smart people who tell them that this is important: they’ve achieved the perfect human informational centipede. This is unquestionably the right time to go short.

Even the National Academy of Sciences has taken note that there might be a problem here. They put together 13 actual quantum computing experts who poured cold water on all the hype. They wrote a 200 page review article on the topic, pointing out that even with the most optimistic projections, RSA is safe for another couple of decades, and that there are huge gaps in our knowledge of how to build anything usefully quantum computing. And of course, they also pointed out that if QC doesn’t start solving some problems which are interesting to … somebody, the funding is very likely to dry up. Ha, ha; yes, I’ll have some pepper on that steak.


 

There are several reasonable arguments against any quantum computing of the interesting kind (aka can demonstrate supremacy on a useful problem) ever having a physical embodiment.

One of the better arguments is akin to that against P=NP. No, not the argument that “if there was such a proof someone would have come up with it by now” -but that one is also in full effect. In principle, classical analog computers can solve NP-hard problems in P time. You can google around on the “downhill principle” or look at the work on Analog super-Turing architectures by people like Hava Siegelmann. It’s old stuff, and most sane people realize this isn’t really physical, because matter isn’t infinitely continuous. If you can encode a real/continuous number into the physical world somehow, P=NP using a protractor or soap-bubble. For whatever reasons, most complexity theorists understand this, and know that protractor P=NP isn’t physical.  Somehow quantum computing gets a pass, I guess because they’ve never attempted to measure anything in the physical world beyond the complexity of using a protractor.
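A toy illustration of the “matter isn’t infinitely continuous” point. The numbers below are mine and purely hypothetical; the idea is just that a finite noise floor caps how much information one analog quantity can carry, while a protractor/soap-bubble P=NP scheme needs precision that grows with problem size:

```python
# Toy sketch: how many bits a single analog value really buys you.
# Full scale and noise floor are assumed numbers, not measurements.
import math

full_scale = 1.0     # volts
noise_floor = 1e-6   # volts of thermal/readout noise, assumed
distinguishable_levels = full_scale / noise_floor
print(math.log2(distinguishable_levels))  # ~19.9 bits, not the exponentially many
                                          # an NP-in-P "protractor" scheme needs
```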

In order to build a quantum computer, you need to control each qubit, which is a continuous value, not a binary value, in its initial state and subsequent states precisely enough to run the calculation backwards. When people do their calculations ‘proving’ the efficiency of quantum computers, this is treated as an engineering detail. There are strong assertions by numerous people that quantum error correction (which, I will remind everyone, hasn’t been usefully implemented in actual matter by anyone -that’s the only kind of proof that matters here) basically pushes the analog requirement for perfection to the initialization step, or subsumes it in some other place where it can’t exist. Let’s assume for the moment that this isn’t the case.

Putting this a different way, for an N-qubit computer, you need to control, transform, and read out 2^N complex (as in complex numbers) amplitudes to a very high degree of precision. Even building a classical analog computer with N oscillators which must be precisely initialized, precisely controlled, transformed and individually read out, to the point where you could reverse the computation by running the oscillators backwards, is an extremely challenging task. The quantum version is exponentially more difficult.
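To put a number on the classical bookkeeping alone (a sketch; this is only the cost of storing the amplitudes at double precision on a classical machine, never mind controlling them to reversible precision):

```python
# Memory needed just to *store* an N-qubit state vector as complex doubles
# (16 bytes per amplitude).
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(n, state_vector_bytes(n) / 1e9, "GB")
# 10 qubits -> ~16 KB, 30 qubits -> ~17 GB, 50 qubits -> ~18 million GB (18 petabytes)
```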

Making it even more concrete; if we encode the polarization state of a photon as a qubit, how do we perfectly align the polarizers between two qubits? How do we align them for N qubits? How do we align the polarization direction with the gates? This isn’t some theoretical gobbledygook; when it comes time to build something in physical reality, physical alignments matter, a lot. Ask me how I know. You can go amuse yourself and try to build a simple quantum computer with a couple of hard coded gates using beamsplitters and polarization states of photons. It’s known to be perfectly possible and even has a rather sad Wikipedia page. I can make quantum polarization-state entangled photons all day; any fool with a laser and a KDP crystal can do this, yet somehow nobody bothers sticking some beamsplitters on a breadboard and making a quantum computer. How come? Well, one guy recently did it: got two whole qubits. You can go read about this *cough* promising new idea here, or if you are someone who doesn’t understand matter here.

FWIIW in early days of this idea, it was noticed that the growth in the number of components needed was exponential in the number of qubits. Well, this shouldn’t be a surprise: the growth in the number of states in a quantum computer is also exponential in the number of qubits. That’s both the ‘interesting thing’ and ‘the problem.’ The ‘interesting thing’ because an exponential number of states, if possible to trivially manipulate, allows for a large speedup in calculations. ‘The problem’ because manipulating an exponential number of states is not something anyone really knows how to do.

The problem doesn’t go away if you use spins of electrons or nuclei; which direction is spin up? Will all the physical spins be perfectly aligned in the “up” direction? Will the measurement devices agree on spin-up? Do all the gates agree on spin-up? In the world of matter, of course they won’t; you will have a projection. That projection is in effect, correlated noise, and correlated noise destroys quantum computation in an irrecoverable way. Even the quantum error correction people understand this, though for some reason people don’t worry about it too much. If they are honest in their lack of worry, this is because they’ve never fooled around with things like beamsplitters. Hey, making it have uncorrelated noise; that’s just an engineering problem right? Sort of like making artificial life out of silicon, controlled nuclear fusion power or Bussard ramjets is “just an engineering problem.”

engineering problem; easier than quantum computers

 

Of course at some point someone will mention quantum error correction, which allows us to not have to precisely measure and transform everything. The most optimistic estimate of the required precision is something like 10^-5 per qubit/gate operation for quantum error corrected computers. This is a fairly high degree of precision. Going back to my polarization angle example; this implies all the polarizers, optical elements and gates in a complex system are aligned to 0.036 degrees. I mean, I know how to align a couple of beamsplitters and polarizers to 628 microradians, but I’m not sure I can align a few hundred thousand of them AND Pockels cells and mirrors to 628 microradians of each other. Now imagine something with a realistic number of qubits for factoring large numbers; maybe 10,000 qubits, and a CPU worth of gates, say 10^10 or so gates (an underestimate of the number needed for cracking RSA, which, mind you, is the only reason we’re having this conversation). I suppose it is possible, but I encourage any budding quantum wank^H^H^H algorithmist out there to have a go at aligning 3-4 optical elements to within this precision. There is no time limit, unless you die first, in which case “time’s up!”
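For the record, the degrees-to-microradians conversion in the paragraph above checks out (trivial sketch):

```python
# 0.036 degrees expressed in microradians.
import math

angle_deg = 0.036
angle_urad = math.radians(angle_deg) * 1e6
print(round(angle_urad))  # ~628 microradians
```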

This is just the most obvious engineering limitation for making sure we don’t have obviously correlated noise propagating through our quantum computer. We must also be able to prepare the initial states to within this sort of precision. Then we need to be able to measure the final states to within this sort of precision. And we have to be able to do arbitrary unitary transformations on all the qubits.

Just to interrupt you with some basic facts: the number of states we’re talking about here for a 4000 qubit computer is ~ 2^4000 states! That’s 10^1200 or so continuous variables we have to manipulate to at least one part in ten thousand. The number of protons in the universe is about 10^80. This is why a quantum computer is so powerful; you’re theoretically encoding an exponential number of states into the thing. Can anyone actually do this using a physical object? Citations needed; as far as I can tell, nothing like this has ever been done in the history of the human race. Again, interstellar space flight seems like a more achievable goal. Even Drexler’s nanotech fantasies have some precedent in the form of actually existing life forms. Yet none of these are coming any time soon either.
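Checking the magnitudes in that paragraph (simple arithmetic, nothing more):

```python
# 2^4000 written as a power of ten, next to the ~10^80 protons-in-the-universe figure.
import math

exponent = 4000 * math.log10(2)
print(round(exponent))       # ~1204, i.e. 2^4000 is "10^1200 or so"
print(round(exponent / 80))  # the exponent alone is ~15x the exponent of the proton count
```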

There are reasons to believe that quantum error correction, too, isn’t even theoretically possible (examples here and here and here -this one is particularly damning). In addition to the argument above that the theorists are subsuming some actual continuous number into what is inherently a noisy and non-continuous machine made out of matter, the existence of a quantum error corrected system would mean you can make arbitrarily precise quantum measurements; effectively giving you back your exponentially precise continuous number. If you can do exponentially precise continuous numbers in a non-exponential number of calculations or measurements, you can probably solve very interesting problems on a relatively simple analog computer. Let’s say, a classical one like a Toffoli gate billiard ball computer. Get to work; we know how to make a billiard ball computer work with crabs. This isn’t an example chosen at random. This is the kind of argument allegedly serious people submit for quantum computation involving matter. Hey man, not using crabs is just an engineering problem muh Church Turing warble murble.

Smurfs will come back to me with the press releases of Google and IBM touting their latest 20 bit stacks of whatever. I am not impressed, and I don’t even consider most of these to be quantum computing in the sense that people worry about quantum supremacy and new quantum-proof public key or Zero Knowledge Proof algorithms (which more or less already exist). These cod quantum computing machines are not expanding our knowledge of anything, nor are they building towards anything for a bold new quantum supreme future; they’re not scalable, and many of them are not obviously doing anything quantum or computing.

This entire subject does nothing but eat up lives and waste careers. If I were in charge of science funding, the entire world budget for this nonsense would be below what we allocate for the development of Bussard ramjets, which are also not known to be impossible, and are a lot more cool looking.

 

 

As Dyakonov put it in his 2012 paper:

“A somewhat similar story can be traced back to the 13th century when Nasreddin Hodja made a proposal to teach his donkey to read and obtained a 10-year grant from the local Sultan. For his first report he put breadcrumbs between the pages of a big book, and demonstrated the donkey turning the pages with his hoofs. This was a promising first step in the right direction. Nasreddin was a wise but simple man, so when asked by friends how he hopes to accomplish his goal, he answered: “My dear fellows, before ten years are up, either I will die or the Sultan will die. Or else, the donkey will die.”

Had he the modern degree of sophistication, he could say, first, that there is no theorem forbidding donkeys to read. And, since this does not contradict any known fundamental principles, the failure to achieve this goal would reveal new laws of Nature. So, it is a win-win strategy: either the donkey learns to read, or new laws will be discovered.”

Further reading on the topic:

Dyakonov’s recent IEEE popsci article on the subject (his papers are the best review articles of why all this is silly):

https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing

IEEE precis on the NAS report:

https://spectrum.ieee.org/tech-talk/computing/hardware/the-us-national-academies-reports-on-the-prospects-for-quantum-computing (summary: not good)

Amusing blog from 11 years ago noting the utter lack of progress in this subject:

http://emergentchaos.com/archives/2008/03/quantum-progress.html

“To factor a 4096-bit number, you need 72*4096^3 or 4,947,802,324,992 quantum gates. Lets just round that up to an even 5 trillion. Five trillion is a big number.”

Aaronson’s articles of faith (I personally found them literal laffin’ out loud funny, though I am sure he is in perfect earnest):

https://www.scottaaronson.com/blog/?p=124

 

107 Responses


  1. pucenoise said, on January 15, 2019 at 2:42 am

    Yeah this shitwagon is full bore. Normally I would be working on industry related transport modelling for devices but $1.2 billion of funding punted me into making transport models for their optical gizmos.

    At least non-equilibrium quantum statistical mechanics is universal; one (un)surprising discovery we’ve made so far is that not only do the quantum information people have their heads up their asses, but so do the quantum optics people, so I’m not entirely wasting my time.

    • Scott Locklin said, on January 15, 2019 at 2:51 am

      Back in my day, guys like Serge Haroche were first in line to declare all of this to be the sheerest, most obvious bullshit for basically being cavities with Q-factor infinity. Then I guess he won the nobel prize, someone slapped “quantum computing” on his achievements (which were nothing of the sort) and the chuckwagon got that much more amped up.

      Anyway, milk it for what it is worth.

      • pucenoise said, on January 15, 2019 at 3:53 pm

Yes indeed. What is most irksome is how experts in adjacent disciplines often have the answers, but are not asked. It seems many engineers and applied physicists like Dyakonov see through QC the way a lightsaber cuts butter, but they are infrequently consulted.

In other news, this guy seems to be your doppelganger w.r.t. AI (although I vaguely remember you excoriating AI a while back too):

        https://blog.piekniewski.info/2018/05/28/ai-winter-is-well-on-its-way/

        That post went viral and created an enormous fuss, so I’ll be sure to spread this post on QC like herpes at a rave to make sure it gets the same treatment.

        • Scott Locklin said, on January 15, 2019 at 4:07 pm

          Yes, I passed that one around myself, and I appreciate you spreading my disease.

          It’s true; if you work in data science or AI or whatever they call it today, it’s abundantly obvious we’ve passed our wile-e-coyote over the cliff moment. OpenAI institute was the last straw; looked too much like the “center for responsible nanotechnology.” I hadn’t heard of Piekniewski before reading his blog, but he’s absolutely right about the fundamental problems with AI. So many things we can’t do. I have no idea if his program is any good, but at least he’s trying. The rest of the field are just camp followers.
          https://blog.piekniewski.info/about/

  2. Oleh Danyliv said, on January 15, 2019 at 10:01 am

Nicely put. Very good read. I like your thoughts about Physics and Physicists. I totally share your opinion about the current state of science, although charlatans also existed in the times of Newton and Einstein. They were promising everlasting life and gold from lead, but similarly they were paid by the governments (kings). Those guys were artists and in many cases deserve admiration; the current state of Physics is non-adventurous plugging of data into computers hoping to publish a new paper.

I must disagree with your argument: “In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place. That argument still stands.” Similarly, we are trying to overcome the problem of getting energy from fusion reactions. When did we start? Fifty years ago? A huge amount of money and effort and the problem is still there. But it is solvable. I hope we will get a fully working fusion power plant in our lifetime.

    • maggette said, on January 15, 2019 at 12:01 pm

IMHO Oleh has a point here. I think that is a kind of survivorship bias. We remember the heroes that ex post were contributing major ideas.

Obviously we don’t remember all the charlatans and crackpots… even if they had the newspaper front covers back then.

    • Scott Locklin said, on January 15, 2019 at 4:16 pm

There is a really simple reason why we may never get this: the sun has the approximate energy density of a dung hill (seriously; go look it up). Sure, we can make hydrogen bombs with significantly higher energy densities. That doesn’t mean we can achieve an energy density which is both self-sustaining fusion and at a scale convenient to human beings. I’ve debunked various press releases over time which seem to imply “fusion right around the corner” -but which actually conceal failures of their program.

Mind you, I’d love to see it, and I got a little excited by Tri-Alpha’s efforts. IMO aneutronic is the only fusion worth anything for power generation; otherwise just use fission, which is cheap and much easier to achieve. The “no nuclear waste” thing is a canard when you’re generating a buttload of neutrons.

    • Anastasia said, on October 13, 2020 at 6:36 pm

      As someone who studied Ancient Greek, and works with shamans and exorcists and used to be a software engineer, I approve of this message. I LOVE your writing. Thanks for the great read, and thanks for confirming my thesis about quantum computing. You should consider writing a book. You are hilarious and informative.

    • Bobby said, on February 9, 2022 at 3:34 pm

      There is nothing to indicate that economically viable fusion power is solvable. Nothing. There’s research, which is all that ITER is. But saying ‘it’s solvable’ means (to a true scientist/engineer with integrity) that all the parts are known, AND we know how to practically achieve/implement all the parts. (Which is simply not the case). We do not know how to implement many of the steps necessary to implement economically viable fusion power, and ITER will not change that. A physicist has written an article similar to this one, but it’s drowned out due to the ability to show a neat glowing plasma, and because so much has been thrown at a big experiment (ITER).

  3. Gaw said, on January 15, 2019 at 3:04 pm

    “There isn’t a word for this type of intellectual flatulence other than the overloaded term “fraud,” but there should be.” Boondoggle?

    • Scott Locklin said, on January 15, 2019 at 6:45 pm

      Good word, but it seems more serious and … sinful… than a boondoggle.

  4. roberthenryfischat said, on January 15, 2019 at 3:04 pm

    Reblogged this on robert's space.

  5. bertbert said, on January 15, 2019 at 3:45 pm

I have a related feeling about all the hype surrounding both “automation” and “genetic engineering.” There’s an endless stream of thought pieces about how soon we’ll all be unemployed and there will be a genetically engineered immortal overclass. This is so divorced from the reality in either field. Genetics stalled out without much to show for itself well over ten years ago (zero proven treatments), and robots are ages and ages away from making a decent sandwich or reliably driving a car. Everybody thought deep learning would be the trick, but it’s not turning out to be that simple and nobody knows what to do next.

    • pucenoise said, on January 15, 2019 at 3:55 pm

      Not my field, but I always had a hazy feeling that DNN’s looked something awful like a brute force series expansion with slightly better than blindly tuned coefficients, and these guys argue it’s even worse than that:

      https://matloff.wordpress.com/2018/06/20/neural-networks-are-essentially-polynomial-regression/

    • Scott Locklin said, on January 15, 2019 at 6:38 pm

      People talk about automation removing jobs all the time. I always challenge people with this: “name one job which has been automated away by AI.” Nobody ever has a good answer.

      You’d think something involving simple calculations like accounting would be automated by now with all the predictions of singularities and so on, but while there are labor saving tools available, I need an accountant just as much as I did when I started paying taxes.

    • RS said, on February 17, 2021 at 1:24 am

Yeesh this hits hard. Got a Ph.D. in quantum optics that used quantum computing as the buzzword bingo to get funded… in reality this means 4 years trying to align optical components to do a few measurements (and do them badly, I am a retard).
Any quantum computing architecture that doesn’t use pre-existing tech to enable being scaled to the same order of magnitude of bits as classical computers is dead on arrival, and adiabatic computation is, imho, bullshit (I never understood how you calculate arbitrary and experimentally implementable Hamiltonians to make this feasible, but again I am a retard so hey ho).

      • Scott Locklin said, on February 17, 2021 at 9:19 am

        Hey man, we’re all retards. Admitting it is probably the first step towards non retardation. The other path apparently leads to tenure in quantum computing.

    • Bobby said, on February 9, 2022 at 3:36 pm

You’re right on genetic engineering, but wrong on automation, unfortunately. It’s happening right before your eyes and it’s accelerating. Already, most aspects of energy and large scale food production are automated. What will follow is full automation of transport and most services. Eventually automation of almost all medicine will be here. And more things will follow.

  6. Stanislav Datskovskiy said, on January 15, 2019 at 6:00 pm

    The “QC” racket is not merely a “string theory”-style grant-eater feeding trough; it is also a key component of NSA’s two-decade FUD campaign against the use of actually-strong crypto by the public (RSA, Cramer-Shoup, and variants) and in favour of questionable replacements (“elliptic curve” algos.)

    Hence the seemingly-bottomless pool of freshly-printed greenbacks made available to just about any charlatan who can pronounce “qubit”.

    • Scott Locklin said, on January 15, 2019 at 6:44 pm

If there were real QCs, elliptic curves would also submit to QC attacks; it’s basically the same thing as factoring.

      I don’t think it’s the spooks, who can’t even defeat a hawker of steaks in asymmetric warfare. I think it’s just a social problem that allows modern “scientists” to get away with gold brick non-productive work which looks like magical future. Almost all of science is turning into this. It happened to the humanities in the US decades ago. You can get a Ph.D. in english without knowing anything about the history of the language, or even having read very much; you just write essays about your feelings about people’s gonads and relate it to some famous novel.

      • Stanislav Datskovskiy said, on January 15, 2019 at 7:27 pm

        As you described — QCism is fundamentally powered by the post-war “Confucian” rot of NATO Reich academia, verily.

        But it is also an (apparently quite effective) disinformation sleight of hand trick: successfully scared quite a few gullible people/organizations out of the use of effective (4096+-bit RSA) public key crypto, and into the use of e.g. 256-bit ECDSA, via fraudulent proofs of “equivalent strength”, “QC resistance”, etc.

  7. Christopher said, on January 16, 2019 at 5:26 am

How do you feel about the “renewable energy” industry as it stands today (i.e. solar, wind and maybe some other tech depending on who’s talking)? Because I feel like a lot of the same problems apply there as well.

True, solar and wind have a leg up on quantum computing and nanomachines (son!) in actually existing in real physical space in the current year, but the problem is some people (mostly leftists) want to have solar-wind replace ALL fossil fuels and nuclear ASAP. I think it’s just not possible, first because the power density of those energy sources is low (so it would take time to deploy, and they will have to litter the countryside in ways even fossil fuels don’t). Also, they inconveniently deliver power at the wrong times, meaning you will have to dump the excess and import any deficits … which usually means from other power sources, given the weakness of battery technology. I believe you wrote a post on that topic a couple of years ago.

They probably have their place in the “energy mix”, like hydroelectricity or nuclear (which also has its own hype-proponents, who ignore the high costs and the reality that to run those plants safely, you need competent nuclear power companies and government regulators, which probably come right after the nanomachines, the quantum computers and the perpetual motion machines). But I doubt that is on the minds of many of the solar-wind boosters, who want to use the Fight Against Climate Change as a replacement for the World-Wide Socialist Revolution that never happened (but should have totally happened).

    • Scott Locklin said, on January 17, 2019 at 2:18 am

      They certainly ruined the skyline of Germany with windmills.

      One of the best things I have ever seen is Jonathan Meades “Remember the Future” -you might like it.

      • ᴹᵃˣ Friedrich Hartmann (@mxfh) said, on January 21, 2019 at 6:29 pm

        While I dig Jonathan Meades (https://vimeo.com/109482700#t=1580s), the reversible impact of windmills on the German landscape is a joke compared to the memories of the smog smell of my childhood, and grandparents home village being plowed under in post-GDR days of the late 90s. https://youtu.be/yzpGUEROpCs?t=1570

        • Scott Locklin said, on January 21, 2019 at 6:41 pm

          I’m sure it was awful. I’m guessing Germany could have done without some of the windmills if they had kept their nukes open. As I understand things, they burn more coal now for lack of the things, and they still have to contend with the French ones nearby if they’re worried about them exploding.

  8. John Baker said, on January 16, 2019 at 5:19 pm

    “Billions of dollars squandered down a rat hole of rubbish and wishful thinking.”

    Sounds like a great tag line for a lot of government contracting.

    This is your best post yet. I am willing to entertain any crazy idea but ultimately theories have to pay off — in actual monetary forms — to persist and prosper. Eventually the masses will observe that no interesting programs are running on so called quantum computers. I would love to be proved wrong here but as a hard ass skeptic I have to see “quantum supremacy” with my own eyes before granting it reality.

    • Scott Locklin said, on January 16, 2019 at 7:20 pm

      To me the real tragedy is people wasting their lives on “woo” garbage like this. My waiter really had a Ph.D. from MIT, and was making a living bringing clowns like me a steak. There aren’t many worse life outcomes than this which come of studying a complex mathematical topic for a half decade or whatever it takes. He could have gotten a degree in acupuncture or faith healing!

  9. Aravind Srinivas said, on January 17, 2019 at 6:19 am

    I noticed you dismissed complexity and network theory in the post. Why do you think those fields are bunk/have nothing new to offer? People like those over at the Santa Fe institute and Yaneer Bar-Yam at NECSI had some interesting things to say.

    • Scott Locklin said, on January 17, 2019 at 4:59 pm

      “Interesting things to say” versus “actual insights which increase our understanding of nature” basically. You can wank all the live long day about complexity and networky rubbish and not have a single explanatory or predictive output, and these subjects have basically done just this. Such nonsense is great for the academic who comes up with it; you become your mini Freud with infinite unfalsifiable wanking, which will be taken up by lots of other wankers, bumping up your ref count for what amounts to a few pages of creative writing and some grad student generated plots.
Network theory is particularly bad, as it’s been caught in outright fraud as I recall. Horgan’s book has chapters on this dating from the 90s when he wrote the book. They’re worth reading for examples of “woo.” FWIIW: I was taken in by some of the woo myself.

      • deepquant said, on October 16, 2021 at 2:17 pm

        i wish there’s an upvote button. This reply is pure gem.

  10. Anonymous said, on January 17, 2019 at 12:09 pm

    Never change:) I almost lost hope to see another post but here we are. Here’s a couple of similar blogs, don’t know if you know them:

    https://backreaction.blogspot.com – a physicist bashing string theorists and such
https://tritonstation.wordpress.com – an astrophysicist bashing dark matter fans

The physicist wrote a book and got some feedback from non-physicists:
    https://backreaction.blogspot.com/2019/01/letter-from-reader-we-get-trapped-doing.html – a physician complaining about sepsis research
    https://backreaction.blogspot.com/2019/01/letter-from-reader-whats-so-bad-about.html – a classical music critic complaining about musicology

    As for me, I’m a software developer/cs researcher and I noticed a while ago that a major trend in CS these days is to take a popular but idiotic technology like JS/Unix/C/whatever and try to make it better and call _that_ an important contribution. I mean, sure, a better C compiler makes the turd that is the Linux kernel also better but the quality ceiling is pretty low there.

    • Scott Locklin said, on January 17, 2019 at 5:30 pm

      This one involved a lot of checking and rechecking; I think blog posts are quadratic in length complexity. I also wrote some blockchain economics post which took away a lot of my writerly juices a few months ago (here if you care). Thanks for the links; will check them out.

      CS … I guess I have seen some idiotic technology contributions out of academia. I mean, it’s better than what physics is doing these days. At least they’re writing things that others may be able to use.

    • Isaac said, on September 20, 2020 at 11:58 am

You might also be interested in [“The DIM Hypothesis”](https://www.amazon.com/DIM-Hypothesis-Lights-West-Going-ebook/dp/B0090UMLLC/), by Leonard Peikoff. His thesis is essentially that cultures are defined by their predominant “mode of thinking”, which can either be “integrated”, “misintegrated”, “disintegrated” or some mixture. String theory or the kind of abstract musicology described in the blog post are examples of Platonist “misintegration” — overly abstract, rationalistic systems of thought deduced from **a priori** principles without reference to observations of nature.

Leonard Peikoff was a student of Ayn Rand’s, so he tends to be dismissed simply by virtue of his association with her (I guess her “brand” has become too tarnished amongst modern academia for anyone associated with her to be taken seriously), but his book is very interesting and dissects some of the deeper philosophical issues behind the warped thinking styles we observe in various fields today. I wrote a review of it [here](https://thoughtandactionblog.com/2016/05/29/review-of-peikoffs-the-dim-hypothesis/).

  11. 0xFF_ (@0xFF_) said, on January 21, 2019 at 5:11 pm

    “I don’t have a mathematical proof that Quantum Computing isn’t at least theoretically possible.”

    It’s like you never heard of valence and conduction band quantization in semiconductors. #quantumMechanics

    https://en.wikipedia.org/wiki/Electronic_band_structure

    • Scott Locklin said, on January 21, 2019 at 5:21 pm

      Yay, I am typing on a quantum computer!

  12. victor yodaiken said, on January 21, 2019 at 10:04 pm

    >True, solar and wind have leg up on quantum computing and nanomanchines (son!) in actually existing in real physical space in the current year

    In other words, they work in the real world, their costs are rapidly going down, they are cheap to operate, and they outcompete nuclear/coal in the market despite massive public subsidies for the second. Therefore, they must be fraudulent, because “liberals” like them. Fascinating.

    • Scott Locklin said, on January 21, 2019 at 10:57 pm

      I’m pretty sure solar and wind also have massive public subsidies.

      FWIIW I don’t dislike either option, and windmills have been reasonably cost efficient for a while now, but trying to run an industrialized society at present standards of living based on them doesn’t pass the sniff test. Nukes or hydrocarbons will be needed; you got to pick one of them. Germany picked hydrocarbons.

      • victor yodaiken said, on January 21, 2019 at 11:33 pm

        Sniff test can’t compete with real engineering and market pricing. In practice, firms are shuttering coal and nukes and turning to wind/solar and the huge power source of efficiencies. If you can replace 100w incandescent with 12w led you are effectively generating 88w. Obviously a mix will be needed for some time, but there is enormous room for improvement using these technologies derided as ephemeral by people who don’t get the power distribution system.

        • Scott Locklin said, on January 21, 2019 at 11:44 pm

          I guess I have a hard time imagining aluminum plants running on battery power from stored solar/wind sources. Maybe I just lack imagination.

          https://scottlocklin.wordpress.com/2010/01/09/u-s-energy-independence-hard-numbers/

          I certainly didn’t think of fracking back when I wrote this.

            • Scott Locklin said, on January 22, 2019 at 2:34 am

              Germany is about as dedicated to this sort of thing as is possible, and is easily the most likely country to have the engineering chops to pull it off. Yet somehow they only have 15% renewables in their power generation stack. Seems like spinning up a few nuke plants is more prudent than burning a shitload of lignite. Maybe noticing this makes me a bad person.

              I mean, energy from the sun ain’t real dense there. They also have clouds.

              • Toddy Cat said, on February 1, 2019 at 8:08 pm

Maybe the Germans will make this work, but this isn’t the first blind alley they have barrelled down, yapping happily as they go. For a highly intelligent people, the Germans have certainly fallen for some doozies in the last few hundred years…

      • Paul said, on August 12, 2019 at 6:55 am

As regards solar, the economics are really good with no govt subsidies, now that you can get solar panels at under 50c/watt from unsubsidized companies that are making a profit.
50c/watt means $5000 for 10 kW, and with 4 hrs/day of sun on average that’s 40 kWh/day, or 1.2 MWh/month, or 14.4 MWh per year, worth $1440 at 10c/kWh, which is an annual return of 28.8% on your investment. In practice it’s half that because you need another $5k of shit around the solar panels, but it’s still an economic no-brainer.

  13. Dave said, on January 21, 2019 at 10:47 pm

    Could you elaborate on what you mean by the following passage:

    “In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor a n-bit number is 72 * n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.”

    In particular, how did they leave out gates? and how did that depend on knowing how to factor 15? Thanks!

    • Scott Locklin said, on January 21, 2019 at 10:54 pm

      Well, if you know what the answer is, you don’t need to perform 72*n^3 operations on the bits. You can perform the operations that say “3” and “5.” That’s what they all do. Sure, they do some quantum-ish thing in between and the answer comes out right: it’s not the real calculation.

      Since we’re only interested in factoring big numbers where we don’t know the answer in advance, this seems rather … deceptive.

      • victor yodaiken said, on January 21, 2019 at 11:03 pm

        In the earlier AI Triumph, in the 90s, there was a stunning success where an AI system at Yale was shown to be able to translate a paragraph of Chinese into English and back. Later it was discovered that the system was limited to that specific paragraph.

      • Dave said, on January 21, 2019 at 11:44 pm

        Are you referring to the original NMR paper linked below?

        Click to access 0112176.pdf

        In Figure 1a on page 15 (in the arXiv version above), they show the Phase Estimation circuit with the “modular exponential oracle” as prescribed by Shor’s factoring algorithm.

        In Figure 2a they show their compilation for a = 7 (see Figure 1 caption and main text on page 3) and how they “omit” gates, but this is standard compiling and independent of knowing the solution to the problem. E.g., any worthwhile compiler discards two successive CNOT gates or a CNOT gate controlled on a zero qubit.

        Is this what you’re referring to?

        • Scott Locklin said, on January 21, 2019 at 11:56 pm

          Did they do 4000 gate operations on their states? Nope.

“The quantum circuit of Fig. 1 was realized with a sequence of ∼300 (a = 7) spin-selective radio-frequency (RF) pulses separated by time intervals of free evolution under the Hamiltonian (Fig. 3). The pulse sequence is designed such that the resulting transformations of the spin states correspond to the computational steps in the algorithm.”

          • Kyle said, on April 16, 2022 at 9:09 am

Hi Scott, sorry if this is a very basic question, but I don’t quite understand how you managed to get 4000+ gates. Shor’s algorithm claims to have a complexity of logN, so why does the equation you use to obtain the number of gates have a cubic term?

  14. zephirawt said, on January 25, 2019 at 7:57 pm

    See also The Case Against Quantum Computing, Gil Kalai’s argument Against Quantum Computers
    and Quantum strategies fail to improve capacity of quantum optical channels

It comes as no big surprise, because quantum computers are potentially fast but very noisy and they operate with low precision (a low number of qubits), whereas classical computers are slower (at least in principle) – but their reliability and reproducibility are much higher. The computational power of quantum computers is given by the product of processing speed and precision, and it remains limited by the uncertainty principle in the same way as that of classical ones. At the moment when the computational power of classical computers already hits its “physical limits”, the application of quantum computers cannot bring any improvement. The same applies to the bandwidth of quantum links.

  15. SlapJackson said, on March 14, 2019 at 2:05 pm

    Love your blog, Scott! What’s your take on this latest “breakthrough?”

    https://www.independent.co.uk/life-style/gadgets-and-tech/news/time-reverse-quantum-computer-science-study-moscow-a8820516.html

  16. mitchellporter said, on May 30, 2019 at 12:56 am

    “In principle, classical analog computers can solve NP-hard problems in P time.”

    Hopefully you understand that this is a red herring with respect to the complexity-theoretic question of P vs NP? which (if we are to characterize it in terms of physical paradigms of computation) is strictly about algorithms for digital computers, and whether there is a “digital” polynomial-time algorithm that can solve NP-hard problems. P time on a classical analog computer, on a digital computer, and on a quantum computer are different complexity classes (I guess they would be denoted P_R, P, and BQP, respectively).

    If your point is just to use analog hypercomputation as an example of a computing paradigm whose theory can be studied but which is impossible for physical reasons, that’s OK, but I wasn’t sure.

    • Scott Locklin said, on May 30, 2019 at 9:05 am

Hey Mitch! The overall point is they’re all examples of theorists encoding infinite bit strings in ways which are meaningless when actual matter and, like, op amps and things are involved. Models aren’t reality.

      People regularly make the “P=NP on my protractor” error: everyone from physicists with soap bubble “computers” to U-Mass professors of unphysical neural net architectures. I just said this in a private thread with someone, but at this point the last 50 years of theorists have such an absurdly bad track record, one could frog march the lot of them to the firing squad, and it might actually improve the rate of scientific and technological progress. My lone possible counter example is the quantum Hall bunch, and none of them believe in magic computers either.

  17. DALE JAKE CORNER said, on June 6, 2019 at 1:16 am

More overpriced whirligigs. However, unlike whirligigs, the quantum computing, “ai”, and now cyber security insurance scams don’t ever meet their own standard of functionality. Everything I have seen on “ai” and “quantum computing” (which, mind you, does not exist and is not even an accurate name for what they are doing) is like this whirligig, if it did not actually work. Or if the bell was replaced with a pre-recorded ding noise repeated by a bluetooth speaker that played regardless of whether any wind was blowing.


    At least the above does what it proposes to do and what its designer and maker says it does. Unlike “ai” and “quantum computing”.

    I wonder if this is a consequence of stolen ideas, that were set to never work, by a man who knew he was being screwed.

  18. HankScorpio said, on June 25, 2019 at 9:04 am

Any colour on Google’s recent announcement re their 72 qubit processor? I gather it is still bullshit?

    • Scott Locklin said, on June 25, 2019 at 9:19 pm

      All quantum computing, autonomous vehicle and “AI” announcements are bullshit until thoroughly proven otherwise.

      • Paul said, on August 12, 2019 at 6:39 am

        Definitely agree

  19. Tony said, on July 10, 2019 at 7:59 pm

    Nobel Laureate Robert Laughlin Dismisses Quantum Computing:

Although he should make an exception for quantifying multi-partite entanglement, which cannot be dismissed as wrong or useless.

    Keep this blog post alive!

    • Tony said, on September 27, 2019 at 4:18 am

      If they can’t figure it out (which they won’t) then there is no risk to a cashless society, one based on digital currencies.

    • Tony said, on December 8, 2019 at 4:09 pm

Just so viewers are aware, as of December 2019, if you type the title of the video, “Nobel Laureate Robert Laughlin Dismisses Quantum Computing”, into the search bar for both YouTube and Google, you will notice that it has been shadow banned.

      • Matthew Cory said, on September 28, 2022 at 10:24 pm

Yeah, I posted that video a while back. From what I have been able to understand, nobody has explained how basic thermodynamic limits will ever be transcended. Is it not true that the speed of an actual reversible computer scales linearly with heat dissipation? In other words, the speed of a computational step scales linearly with applied force and energy dissipation? From the Heisenberg E-T bound and the bound on total entropy over a complete computation (Planck-Boltzmann law), a QC is lower bounded in runtime by (h*S^2)/(k*T), where S is the number of steps, h is Planck’s constant, k is Boltzmann’s constant and T is the ambient temperature. The runtime bounds on Grover’s and Shor’s algorithms don’t look too impressive under this basic analysis. MT and ML bounds dramatically overestimate the speed of quantum evolution. I have never met anybody who could dispel this argument.

  20. Aram Harrow said, on August 1, 2019 at 9:04 am

    This objection is pretty incoherent. You imply the difficulty grows exponentially, and then admit that there is a constant threshold but don’t like the constants. You say there has been zero experimental progress but then the response to the recent experimental progress by IBM and others (which actually is steady progress over decades, see Schoelkopf’s review article for example) is that you are “not impressed”. You bring up a source of correlated noise as though no one has ever thought about it before. Legitimate worries about quantum computing exist but I couldn’t find any in this post.

    • Scott Locklin said, on August 1, 2019 at 10:56 am

      You seem to be objecting to things I didn’t say because you don’t like the conclusion. OK.

      The only difference between now and the late 90s is there are more people at the conferences. Otherwise, nothing useful has happened.

      • tgm1024 said, on September 10, 2019 at 3:40 pm

I completely agree about your “nothing useful” statement. But there are two competing concepts here:

        1. Quantum computation supporters, who are /honestly/ trained in QEM, seem to have forgotten what it is they *themselves* have said.

        2. There is a belief system (which you’ll never unseat) that is emotionally carved in tungsten tablets. It goes like this:

        K = what we know
        DK = what we don’t know
        DKDK = what we don’t know what we don’t know

        And here’s where the above variables are just weirdly applied:

        The fact that DKDK is both forgotten /while simultaneously/ relied upon in the scientific community as a basis for reasoning “everything is eventually possible” is a conflict that I don’t think you’re going to be able to mitigate.

        So, “I hope QC is possible” and “I already know it can’t be” are in superposition.

        (Author runs for cover)

        • Scott Locklin said, on September 10, 2019 at 6:13 pm

          We’re using classical probability here, contra Penrose speculations; I think there is a small probability I am wrong and QC is right around the corner. It’s an extremely small probability.
          Just based on the preposterous lack of progress in other bullshit hyped fields, I think quantum computing is actually less likely to arrive than a working Bussard ramjet.

          • tgm1024 said, on September 10, 2019 at 7:51 pm

            As far as I can tell, Penrose runs into a logical ditch, because he doesn’t understand a fundamental principle regarding determinism. At least that’s as much as I can squeeze from his multiple writings. Here’s what he can’t quite grok:

            A lower level abstraction can be 100% deterministic.
            A higher level abstraction (of that lower abstraction) need not be if the rules are confined to that abstraction level.

            1. We don’t need quantum anything to emulate the active neural connectivity of the mind. We just need ungodly fast (perhaps impossibly so) systems to have them behave 1:1 with a brain.

            2. Despite the first glance, the mind is still “free” to choose a random number… no need for quantum anything. Why? Take this conversation between two Ph.D.s, both named after characters from Mad Comics in the ’50s.

            Potrzebie: Can I choose a random number if my brain is deterministic?
            Furshlugginer: Sure.
            Potrzebie: ????? How? You’ve contradicted yourself.
            Furshlugginer: At the abstract layer of my mind, I can choose, say, 5.
            Potrzebie: But that’s *still* based upon a deterministic process you idiot!!! The brain patterns themselves have no such thing as random numbers.
            Furshlugginer: Can you get angry?
            Potrzebie: Sure.
            Furshlugginer: No you can’t. The brain patterns themselves have no such thing as “angry”.
            Potrzebie: ……………………..(stammers)……..

      • Tony said, on September 22, 2019 at 5:52 am

        He’s a prof at MIT. Of course he’s going to object when you attack his field.

        • Scott Locklin said, on September 22, 2019 at 4:53 pm

          Maybe he should explain why one of his students is a waiter who delivers steaks for a living.

          • Tony said, on September 22, 2019 at 5:16 pm

            I’ve met some graduate students in this field. Some are now professors at other universities, and I should point out that in Canada, unlike at most elite U.S. institutions, the written PhD comprehensive/candidacy exam has been scrapped. They will defend this decision as allowing a focus on research, but it is nothing more than an excuse for laziness. The level of competency in physics ought to be a concern, especially when they are teaching physics, not mathematics.

            Recall the following statement, “It’s really easy, though, to create a bunch of crackpot narcissists who have the egos of Einstein without the exceptional work output.”

          • Tony said, on September 22, 2019 at 5:41 pm

            Because when you have been indoctrinated into, or strongly identify with, a belief system (e.g. libertarian economics, neoliberal economics, etc.), it’s very unlikely you will give your students the benefit of the doubt, especially when you’re enjoying tenure as a professor. You won’t have to worry about the consequences until after you retire anyway.

            • Sally said, on February 23, 2023 at 2:06 pm

              What does “identifying with” “libertarian economics” have to do with not giving students the benefit of the doubt as a tenured professor? What are you talking about?

              • Tony said, on February 23, 2023 at 5:03 pm

                Identify with a belief system.

                I used libertarian economics as an example.

                If you’re a graduate student and you write in your thesis that Keynesian fiscal policy is subject to political parameters that lessen its effectiveness as a stabilization tool, and later in life, as Prime Minister, you impose Keynesian policy in your country and it works (although critics would argue it could have been done better), only to have your supervisor come out and defend that student by blaming the opposition parties for forcing the policy, then that defense is indefensible.

                Nevertheless, both schools of thought are fantasies.

                Get lost.

  21. Paul said, on August 12, 2019 at 6:36 am

    Excellent warning against believing stuff that nobody can explain to you. AI is the same kind of scam. As for quantum computing, I see no use for a logic element that is in the 0 state and the 1 state simultaneously. You can’t do squat with that!

  22. tgm1024 said, on September 10, 2019 at 4:02 pm

    Three analogies come to mind:

    1. Anyone remember the irresponsible press and money that poured into cold fusion research?
    2. (Much harder) Anyone recall a Ph.D. here and there in the late ’70s insisting that they had created a perpetual motion machine?
    3. Angle-trisector folks actually still exist. (Here’s a great article on how to handle them: https://web.mst.edu/~lmhall/WhatToDoWhenTrisectorComes.pdf)

  23. Tony said, on September 22, 2019 at 5:03 pm

    https://www.ft.com/content/b9bb4e54-dbc1-11e9-8f9b-77216ebe1f17

    https://fortune.com/2019/09/20/google-claims-quantum-supremacy/

    https://gizmodo.com/google-says-its-achieved-quantum-supremacy-a-world-fir-1838299829

    “We don’t have many details as to what calculation the computer performed, nor can we independently verify the Financial Times report. But previous proposals essentially involve the quantum computer racing a classical computer simulating a random quantum circuit. The achievement would not be a surprise—we’ve long known that Google has been testing a 72-qubit device called Bristlecone with which it hoped to achieve quantum supremacy. Financial Times reports that the supremacy experiment was instead performed with a 53-qubit processor codenamed Sycamore.”

    https://news.ycombinator.com/item?id=21029598
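
    To make the “racing a classical computer simulating a random quantum circuit” idea in the quote above concrete, here is a minimal state-vector sketch in Python (numpy assumed; the circuit is illustrative, not Google’s): simulating n qubits classically means storing 2^n complex amplitudes, which is why ~50-qubit random circuits are hard to simulate head-on.

        import numpy as np

        def apply_1q(state, gate, q, n):
            # contract a 2x2 gate with qubit axis q of the 2^n state vector
            psi = np.tensordot(gate, state.reshape([2] * n), axes=([1], [q]))
            return np.moveaxis(psi, 0, q).reshape(-1)

        def apply_2q(state, gate, q1, q2, n):
            # contract a 4x4 gate (reshaped to 2x2x2x2) with qubit axes q1, q2
            psi = np.tensordot(gate.reshape(2, 2, 2, 2), state.reshape([2] * n),
                               axes=([2, 3], [q1, q2]))
            return np.moveaxis(psi, [0, 1], [q1, q2]).reshape(-1)

        n = 20                       # 2^20 amplitudes ~ 16 MiB; 53 qubits would need ~128 PiB
        rng = np.random.default_rng(0)
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        CNOT = np.eye(4)
        CNOT[[2, 3]] = CNOT[[3, 2]]  # flip the target qubit when the control is 1
        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1.0

        for _ in range(8):           # a few layers of random 1-qubit gates plus CNOTs
            for q in range(n):
                Rz = np.diag([1.0, np.exp(1j * rng.uniform(0, 2 * np.pi))])
                state = apply_1q(state, Rz @ H, q, n)
            for q in range(0, n - 1, 2):
                state = apply_2q(state, CNOT, q, q + 1, n)

        print(f"state-vector memory: {state.nbytes / 2**20:.0f} MiB")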

    • Scott Locklin said, on September 22, 2019 at 6:04 pm

      I think this likely sums it up nicely:

      And yes I consider someone named “isis agora lovecruft” to be a more reliable narrator on this topic than any of the sources you listed.

      • Tony said, on September 22, 2019 at 11:19 pm

        It’s disappointing to see intellectuals in other fields, such as politics or economics, debate issues publicly, unlike in physics, where there are vast echo chambers; physics education research is another one.

        • George W said, on November 22, 2019 at 4:25 pm

          Thanks for this post. I used to be an aspiring PhD physicist, but changed majors to accounting after seeing the poor job prospects in physics. Quantum computing looked promising in principle, but it smelled wrong. The ‘quantum’ media hype especially reeked of bullshit. I did some research into it, and eventually stumbled onto this post, which confirmed my suspicions. QC is cool, but so is an interstellar spaceship; there are serious engineering limitations with these ‘cool’ things that make them almost impossible to implement in real life. With accounting, or maybe a degree in statistics/computer science, I won’t be constantly begging the government for more funding.

          I’m not Ivy League talent by any means, but then again, maybe I’m at least a little more talented than these ‘intellectuals’ basing their careers on a field that has no useful applications. Either way, it’s good to know I won’t be waiting tables in the future.

          • Scott Locklin said, on November 22, 2019 at 8:16 pm

            For what it’s worth, I have actually considered changing careers to CPA. It’s a nice moaty profession: the tax codes are absurdly complex, and most people don’t want to do this sort of work, preferring to feed their egos rather than their bank accounts.

          • Tony said, on November 23, 2019 at 2:00 am

            One does not go into physics for the monetary reward anyway. If you’re seeking a career with useful applications, you should have considered engineering. Most graduate students I know who did not end up in academia went into software development, R&D, finance, etc.

            The OP was focused on a single instance of a quantum computing graduate who ended up being a waiter. But that’s beside the point.

            • George W said, on December 10, 2019 at 4:37 pm

              I ended up switching to computer science with a minor in stats instead. I didn’t realize that the CPA requires both 150 credit hours and 2,000 hours of work experience. CS + stats is more enjoyable than tax codes, and it’s more lucrative.

              But why do people go into physics if not to teach? Nothing prevents someone from buying used textbooks and teaching themselves physics to a graduate level for the hell of it, if that were their goal.

              From what I can see, physics research is bottlenecked by money, engineering, and chemistry/materials science, not by the number of people available with a PhD. Incremental advancements in engineering and chemistry/materials science allow us to conduct new experiments in physics, which yield new data that can be analyzed to incrementally revise theories. This is oversimplified, and maybe flat-out wrong, but it’s essentially the reason I won’t go into a physics profession. From my view, there’s nothing I could add to this process.

              I used to think that people were still doing groundbreaking research with chalkboards and garages. But today I realize that research is incremental and occurs through coalitions of people with specialized tasks and millions of dollars of equipment. Quantum computing was something I thought would ‘break through’ in the garage of a bright physicist.

              It may be better to save ~$20M as a software engineer ($70k/yr invested at 8% for 40 years) and carefully donate some of it to the research cause of your choice (a rough check of that arithmetic is sketched below)…

              Like bio-computing, quantum computing’s older cousin, behaah!
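
              A rough check of that savings arithmetic (the $70k/yr, 8%, 40-year figures are the commenter’s; contributing at the start of each year is an assumption):

                  contribution = 70_000          # invested per year, USD
                  rate = 0.08                    # assumed annual return
                  years = 40
                  balance = 0.0
                  for _ in range(years):
                      balance = (balance + contribution) * (1 + rate)
                  print(f"${balance:,.0f}")      # roughly $19.6M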

  24. […] it was one of the biggest stories of 2019 and will probably continue to be discussed (but see this post saying it all might come to nothing). Something almost as mind-blowing is happening with in-memory […]

  25. arash said, on May 21, 2020 at 7:39 pm

    There’s something misread about quantum computation’s usefulness: logistics. The quest for optimisation is what is driving the markets towards this, and what is keeping D-Wave afloat (mysteriously, in your opinion). The current algos (Monte Carlo spin-offs) are at their limit, and human intuition is at play in the top managerial decision-making positions. RSA breaking is not the goal of quantum computing at the moment; optimisation is, and optimisation has already changed the semiconductor landscape, e.g. Fujitsu’s DAU.

  26. Yann Kervennic said, on June 25, 2020 at 3:22 pm

    Great reading. I was a postdoc for a while in a related subject, single-molecule electronics, in “famous” labs also working on quantum computing (and graphene, the new wonder material that Elon Musk will undoubtedly be using to travel to Mars), namely Leo Kouwenhoven’s lab in Delft and Per Delsing’s in Sweden.
    In the end I’ve ended up as a warehouseman lifting heavy loads at 6 o’clock in the morning for the bare subsistence minimum (and that’s a miracle, because I had been living for years with no job and no money).

    All this is complete crap on the job market, and all the recruiters I have met assume that you must be a complete wacko to waste your youth on such subjects. I guess they are right.

    But I’d say this is not particular to research. This industrial society, due to its development, is on the verge of collapse, and idiots are selected by this mass society to lead the way to extinction; I see the same in agriculture, now that I am rushing to change tracks. People are racing to kill pests and creating ever more pests. Idiots with huge loans and big machines always get bank and state money, whereas those who can actually grow with much less capital, using clever organic techniques, are left with little.

    But you have to keep faith: nature will finally tidy the mess. Covid is just the beginning and money won’t flow forever.

    • Scott Locklin said, on June 25, 2020 at 6:48 pm

      I know a lot of guys who got taken in by the shiny stuff around touted nonsense like nanotech and QC. Some ended up as software engineers, which at least pays well, even if it turns you into a Houellebecq in the long run.

      We share a taste in music, and I hope you drop in here once in a while to let me know if you manage to get a farm working. Medieval Frenchmen on the farm did pretty well as I understand things. I’m interested in this sort of thing for my own long term future plans.

    • egsy said, on March 25, 2022 at 3:06 pm

      > People are racing to kill pests and creating always more pests. Idiots with huge loans and big machines get always bank and state money, whereas those who can actually grow with much less capital using clever organic techniques are left with little.

      You also get labs promising to solve climate change by breeding crops to sequester more carbon or whatever, sucking public funds for free R&D and greenwashing for Bayer/Syngenta, pissing out GWAS after GWAS, throwing PhDs and computers into the bottomless high-throughput phenotyping pit. All performing the utmost concern: oh, we must do this noble work to feed/save the world; by the way, who wants to fly out to Minnesota next month to go drinki- er, discuss emerging technologies for precision agriculture.

      We would be better off dropping stacks of cash out of airliners at random. At least that might occasionally get money into the hands of a farmer who can put it to decent use.

  27. Gaw said, on June 27, 2020 at 10:24 am

    https://singularityhub.com/2020/06/22/a-new-startup-intends-to-build-the-worlds-first-large-scale-quantum-computer/

    A New Startup Intends to Build the World’s First Large-Scale Quantum Computer

  28. pleasehelp said, on November 3, 2020 at 7:56 pm

    What do I do for a career that pays well?
    I come from a low-income family and have had an interest in quantum computing for a while, and now that I’ve read this article I don’t know anymore. I don’t want to work casino jobs for many years like my parents have.

  29. Suhaib said, on December 5, 2020 at 7:37 am

    Hi Scott – This was published a couple of days ago:

    https://www.nature.com/articles/d41586-020-03434-7

    It reminded me of this post of yours, so I thought I’d get your opinion on it.

    • Scott Locklin said, on December 5, 2020 at 1:29 pm

      Great triumph; they should have no problem doing something useful with their machine that isn’t a physical implementation of boson sampling; say, taking the square root of a large number. Not holding my breath though!

  30. gmachine1729 said, on December 7, 2020 at 4:12 pm

    http://disq.us/p/2do42o3

    Seeing the comment linked above on the blog of high-energy and black-hole theorist Steve Hsu (who researches genes and intelligence, essentially an application of machine learning, on the side) reminded me of this blog post of yours. And yes, though I’m not actually qualified to judge, I do believe you. I learned quantum mechanics theory on my own after undergrad, and as an undergrad I interacted with some quantum computing theory people. Honestly, before one touches stuff like quantum computing, one should probably learn quantum mechanics as a physicist physicist would. In retrospect, it seemed like many of those quantum computing people were too lacking in real-world physics knowledge/context, which is associated with doing “science” in a very out-of-touch or too-idealistic way. Speaking of which, Steve Hsu’s wanting to do genomic prediction for intelligence is actually way more realistic than quantum computers. Yet a quantum computing theorist reacted to this with “sounds tough”.

    As for that Disqus comment, it pointed to some recent quantum computing “advancement” reported in Nature that I’m skeptical of. I much dislike the emphasis on bullshit marketing of science nowadays.

    https://www.nature.com/articles/d41586-020-03434-7

    Also, in an above comment, you mentioned how something similar happened to the humanities decades ago, with many PhDs in English knowing nothing about the history of the language and not having read much. I think with a subject like physics, which unlike mathematics is not absolute truth and requires much implicit context of a more ambiguous nature, knowing the historical development is essential. Otherwise, it’s very hard for a lot of physics to actually make sense, deeply. You will get nowhere serious in learning quantum computing if you don’t already know, for instance, the basics of scattering theory, which is more phenomena-based. There is also the fact that quantum mechanics textbooks are way too formal and mathematical. I remember as an undergrad, I could understand the math in them, but I couldn’t understand how it actually corresponded to reality. That problem was only solved when I did a ton of high-quality reading on the historical development of physics, which enabled me to understand where the bottlenecks were at the different stages of the field’s development.

    I also struggled to understand special relativity as an undergrad. I was so dumb then. But after more intellectual maturity of a more organic nature (by this I mean brain maturity, which on average ends around age 25), along with more background knowledge, it became easy, and I for the most part independently re-derived the Lorentz transform and E=mc^2 myself. There’s a lot of bullshit in how they teach that stuff that it’s better never to expose yourself to if you want to understand it. I realized in the process of learning the background E&M and its historical development that Einstein didn’t actually come up with it 100% himself. Lorentz, Larmor, and FitzGerald had also figured out the Lorentz factor and length contraction based on ad hoc hypotheses, which Einstein was aware of. The genius of Einstein was to come up with a systematic, first-principles explanation of it.

    Something else: my friend, who knows theoretical physics extremely well, doesn’t like the Feynman Lectures. He considers the tone too informal and redundant. I saw the part of the Feynman Lectures on sound waves, and I felt the same. On the other hand, Landau and Lifshitz is a much more serious and formal exposition of physics, substantial and concise at the same time. Maybe the author of this blog, Scott Locklin, feels the same. I’d be curious to know.

  31. Carlos Rivera said, on January 4, 2021 at 11:36 pm

    I’m researching quantum computing on my own as a hobby, in conjunction with classical computing (and building actual circuits on a breadboard) and admittedly I have not yet finished reading this article, but I could not wait to say that this article is incredibly insightful and well-written!

    You write about the mistakes that people with PhDs in quantum computing make. This makes it convenient to learn from other people’s mistakes! I must say, however, that even though an SR-71 was inconceivable in the 1760s, it was not impossible in principle. The problem with PhDs is that their contemplations jump beyond what is technologically possible. Instead of learning more about how to construct a quantum circuit by first learning how to construct a contemporary circuit from scratch, from making the initial specifications, to the actual implementation of digital components in integrated circuitry, to using an HDL such as VHDL to specify the behavior of the circuit, many of these PhDs appear to be outputting nothing but ideas. Has anyone decided what kind of programming language can be used to write quantum computer software?

    • Scott Locklin said, on January 5, 2021 at 1:37 pm

      There are dozens of QC programming languages. Perhaps hundreds by now; when I was originally studying this, back in the 90s when it looked like it might become something real, people were writing them. Seems premature! Fortran didn’t exist until some time after ordinary computers!
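
      For the curious, here is roughly what these languages look like today: a minimal Bell-pair circuit sketched in Qiskit, one of the many frameworks alluded to above (illustrative only; whether there is hardware worth running it on is another matter).

          from qiskit import QuantumCircuit

          qc = QuantumCircuit(2, 2)
          qc.h(0)                     # Hadamard on qubit 0
          qc.cx(0, 1)                 # CNOT: entangle qubits 0 and 1
          qc.measure([0, 1], [0, 1])  # read both qubits into classical bits
          print(qc.draw())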

  32. Gaw said, on February 17, 2021 at 9:25 am

    Was just reading this when notification of your reply hit my inbox. Totally off topic but feel sure you’ll enjoy it:
    https://www.thetimes.co.uk/article/brigadier-jack-thomas-obituary-39gpdc9tf

  33. Brian said, on January 6, 2022 at 11:29 pm

    Quite a long read; it looks like you covered all the bases. I really enjoyed it!! The anecdote from the 13th century is very fitting. Wow, it’s truly an amazing state of affairs. Then I wonder: are the Quantum Mechanics as clever as Nasreddin, or do they actually believe their own hype? Is it self-indoctrination / self-delusion, or are some quantum players as consciously aware of the nonsense as Nasreddin? I get the feeling it’s a Stonehearst Asylum situation, as the lunatics have managed to navigate themselves into the favourable position of being in charge of the asylum… or maybe it’s actually just me… 🤔

    Anyways, I’ll leave this one with you (currently under review with Elsevier), as I assume you’ll be interested in another article in support of your argument:

    https://arxiv.org/abs/1601.02569

  34. […] this point you’re probably thinking about quantum computing but my advice is to forget it. It’s been just around the corner for at least 25 […]

  35. PeterCottonPhd (@PhdCotton) said, on September 16, 2022 at 6:17 pm

    I admit I winced at “sperg-lord” (definitely offensive) but I understand it is part of the flourish of an immensely entertaining read.

    • Scott Locklin said, on September 16, 2022 at 8:26 pm

      Fake subjects are more offensive than saying “sperg lord.”

      BTW I assume you’ve seen the Perdition Burning and Flames book.

  36. Dodge said, on September 30, 2022 at 7:35 pm

    At least some physicists are catching on… https://archive.ph/PBHhe

    What do you think of this article?

  37. jeromepow said, on December 30, 2022 at 7:37 pm

    https://quantumfrontiers.com/2017/07/14/the-world-of-hackers-and-secrets/

    I think all you have to read is the above article. It shows the quality of academics in this field.

    • Scott Locklin said, on December 31, 2022 at 4:59 pm

      That article, the responses by the author, and the entire Caltech-associated website, with dopey girl-boss type dipshits yammering nonsense and poasting their vacation photos, are completely bonkers.

    • Tony said, on May 4, 2023 at 2:46 pm

      The quackery never ends.

      DR.SHIVA: WHY MODERNA-IBM GENERATIVE AI & QUANTUM COMPUTING WILL STILL CREATE UNSAFE VACCINES

      https://www.bitchute.com/video/MperStUSIRCu/

  38. pinidaes said, on April 18, 2024 at 9:02 pm

    Thought you might like this: https://www.tomshardware.com/tech-industry/quantum-computing/commodore-64-outperforms-ibms-quantum-systems-1-mhz-computer-said-to-be-faster-more-efficient-and-decently-accurate

    The paper is here: http://www.sigbovik.org/2024/proceedings.pdf

    “Quantum computers are just a bunch of atoms in a trench coat pretending to be a computer, except the trench coat is an electromagnetic field, and we’re the ones doing the pretending”

