Locklin on science

Myths of technological progress

Posted in Progress by Scott Locklin on September 1, 2009

I’ve been doing some writing for Taki’s magazine. Most of this writing will be irrelevant or annoying to people who are interested in Science, Technology and Finance. Taki is an old school conservative who is mostly interested in social and political issues. The article I wrote for him does have social and political implications, which is presumably why he was kind enough to publish it.

I think this particular article is also quite germane to this blog, as it’s about the disturbing slowdown in the rate of technological progress we’ve experienced in my lifetime. I realize this is going to be controversial, and it may seem insane since you’re reading about it on my personal little network broadcast system, but it is an important subject worthy of your consideration. Technological progress isn’t what it used to be. Oh sure, the direction remains generally positive (though not always), but the rate of new innovations and human power over nature seems to have slowed considerably. I don’t think I have any simple answers as to why this is so, but I want people to think and talk about it. As far as I know, there are no voices anywhere in the blogosphere or any other popular media saying this, excepting perhaps Charles Murray. Someone has to start the conversation, and that someone apparently has to be me. Sure, you could dismiss this idea as depressive rantings caused by economic apocalypse, but I have actually had this idea in mind since 2003 or so, when I was procrastinating on writing up my Ph.D. thesis in the LBL library. I spent a week or so reading the yearly journal, “Advances in Computers” -a wonderful experience that was probably more valuable than writing up my thesis. It was also humbling, in that there have been very few recent advances in computers which aren’t simply advances in lithography.

If you’re a scientist or technologist, obviously this should be an interesting subject for you. If you’re a financial type: this should be even more interesting for you, as economic progress is ultimately fueled by technology and improvements in human power. I consider it the most important subject of our time. Why aren’t we doing better?

I’m impressed and humbled by the caliber of people who have read and commented on my blog, so I hope some of you take the time to read and think about this short piece, and maybe leave me some useful comments for or against my thesis. Because I directed the article at a popular audience, I skipped over a lot of technical details. Obviously computers have made possible innovations like supply chain and operations research type things, which are a form of progress, but not a revolutionary one: things aren’t really much different. Keeping less product on shelves isn’t really progress to my mind. Inventing, say, double-entry bookkeeping: that made things incredibly different from how they worked before. Inventing trains and trucks to move product around: that’s real progress too. Blogs are … well, they’re a sort of democratization of technology: very positive, but not exactly progress. Some wiseacre I spoke to had the nerve to use Twitter or other social networking websites as an example of “progress.” If you are tempted to do so, I would assert that you don’t know what the word “progress” means. Anyhow, here is the article:




Edit add: a friend subsequently told me about this guy, who has some similar thoughts on the subject, though he is talking about the “singularity” specifically. Particularly useful is a PDF timeline charting progress in his and his grandmother’s lifetimes.


Edit Add again:
This article (Thanks John!) examines some reasons why people can make very rapid progress, and why we aren’t so much right now. It also advocates for the use of rapid prototyping tools in research problems: something I’ve dedicated a good fraction of the last two years of my life to:

Edit Add^2:
A friend recommended “Towards the Year 2018” by the Foreign Policy Association. I reviewed it on Amazon, and reproduce my comments below:

Hilarious view of our decline in technological progress

I recently wrote a magazine article on how the last 50 years of progress haven’t been particularly spectacular. A friend who has actually been around for the last 50 years, and been involved in the development of new technologies over that period, recommended I read this book for a view into how people were thinking in 1968. I guess it’s easy to laugh at predictions of the future, and there is a whole lot of hindsight bias in this sort of thing, but this book is too funny to pass up a good-natured chuckle at the whole thing.

This book gets an astounding amount of stuff right: they knew that communications technology would improve a lot more than it had. They knew that cheap international flights would change immigration and nationality forever. They knew that people would become more “open about their feelings” -though they had no idea that this would be largely a bad thing. They knew that nations might attack each other without identifying themselves -though they didn’t quite grasp the concept of non-state actors doing the same thing. They knew the United States (which was probably at around self-sufficiency at that point) would be out of oil by 2018. They knew microelectronics would improve tremendously. They knew nuclear proliferation would be an important international issue in the future. They also seemed to realize that fusion and solar power required huge technical breakthroughs to become practical sources of energy. Finally, they contradicted the widespread idea that overpopulation would cause mass starvation at some point. They were correct: this still hasn’t happened.

Here are some bold predictions which did not come true. One of the authors postulated amazing breakthroughs in physics that never occurred: energy storage mechanisms making possible “disintegrator guns,” anti-gravity technology, and robots fighting bloodless wars. I don’t know why this guy thought this kind of insanity might happen (and he did hedge by saying he saw no way these things might happen, but he seemed to think they would anyway). Presumably too much television. Others postulated hypersonic air travel. The picture phone was a fun one; while it was certainly possible by the date they estimated, I guess they underestimated human nature. The chapter on weather and climate control is hilarious. They did worry darkly that adding too much CO2 to the atmosphere might have some effect -but they seemed more interested in actually engineering climate and weather in those days. Nowadays, such talk seems like total madness. They also worried about a lot of other climate issues which, it seems, all get rolled into “global warming” nowadays -that sort of speculation gives one pause. Have we eliminated these things? Is carbon dioxide more important than dust bowls and ice ages? I don’t even know how to know this, but it bothers me that they asked such questions in 1968, and everything dealing with climate nowadays is deeply politicized. I guess they were right about the idea of weather becoming political, if not the ultimate way it happened. Self-repairing machines? Um, no; we don’t have these yet. Nor are we likely to any time soon. The population estimates proved much too high. As for widespread exploration and exploitation of undersea resources: this never happened either. We pretty much gave up on the deep oceans after the Bathyscaphe dives of the early 1960s; human beings haven’t been back to the ultimate deeps since.

I guess it’s wrong of me to lump all the predictions together, as they were made by different sets of experts per chapter, but since they’re all in the same book, I leave it up to the reader to sort the sheep from the goats. This book is really a remarkable document of how huge the technological changes were in the period from 1918 to 1968; they merely assumed the rate of change would remain unchanged. Well, as it happened, progress slowed down rather a lot.

46 Responses


  1. PJ said, on September 1, 2009 at 2:48 am

    The biggest advance I see coming is the 3D-printer, which promises to turn bits into atoms in a much more efficient and useful fashion than before.

    Also, it could maybe be argued that the GPL is somewhat revolutionary due to its social effects – though admittedly that may be due to its timing. Previously there was no use for such a thing because atoms mattered more than bits. Now that they’re on more of an even par, the GPL is an effective mechanism to help accumulate public work and stop the loss of intellectual property into the depths of failed corporations.

    • Scott Locklin said, on September 1, 2009 at 6:10 am

      Thanks for your reply. I think it’s a common conceit of our kind that “bits are more important than atoms.” I don’t think bits are important at all. Software will at some point be looked at like any other kind of idea: public domain, more or less. In Cardano’s day, they kept solutions to cubic equations secret. Anyway, I do not think the GPL is technology; it’s a legal innovation; one which some may argue is harmful (I prefer other kinds of open source license).
      Solid printing has gotten a lot of press. I mistrust things that get a lot of press without actually changing anything. Nobody talked about the internet until it was already here. I used both the internet and solid printing long before most people did: the internet was obviously going to change a lot of stuff. Solid printing won’t change things any more than CNC machines did.

  2. Brian C Potter said, on September 1, 2009 at 4:55 pm

    A few more recent developments:

    MRI (not to mention PET scan, CT scan)
    Li-ion batteries
    Genetic engineering (along with gene sequencing, PCR)
    Industrial Robots

    This isn’t considering things that were introduced before 1959 but didn’t have widespread effects until afterward (like the interstate highway system, or the shipping container). And it isn’t considering the advances in computing and communication, which really have made enormous changes (I think your article is fairly uncharitable in that regard).

    • Scott Locklin said, on September 1, 2009 at 7:53 pm

      Hey Brian; thanks for weighing in. I just got a CT scan for appendicitis. While CT is a wonder of signal processing and technology, the main effect on my treatment was the doctor didn’t have to stick his finger up my ass. I’m very grateful for this of course, but it’s not a life-changing thing.

      I agree with you that shipping containers were an important invention, and of course there were many other labor saving things which made shipping product a lot cheaper than in the past, but it doesn’t really change things. What I see as an earth shattering invention is something like the jet airplane or the telephone, or, say, antibiotics. Those changed things such that life before and after is unrecognizable. The laser is a neat thing (really invented in the 50s in microwave form -C.H. Townes says so, so it ain’t just me being argumentative), but it has not changed life beyond recognition. I mean, when I was a kid in the 70s, all the visions of the future featured lasers everywhere, but it didn’t work out that way, now did it?

      It may seem like I’m being dismissive of the last 50 years of technological development, but really, I’m just thinking about the previous 50 years, which were literally earth shattering. What invention between 1959 and 2009 is as insanely important as the atom bomb? Container ships don’t quite stack up.

      • Brian C Potter said, on September 1, 2009 at 10:47 pm

        A few concessions, a few rebuttals.

        Re: MRI – I think you’re brushing this one off a bit easily. The ability to look inside a living brain, along with the development of cognitive psychology (another post-50s creation), has taught us more about how the brain actually works in the past 40 years than we learned in probably the previous 400.

        Re: Lasers – They actually are everywhere, almost a billion are sold annually: http://en.wikipedia.org/wiki/Laser#Uses , though none of those uses by themselves is probably earth shattering.

        Re: Atom Bomb – You’re right about this, but it might be an outlier – it’s not often the laws of physics allow such an enormous jump in the energies we have access to. The only thing even comparable might be the invention of fire.

        In general, I’d say you’re right in that our ability to manipulate the physical world in the past 50 years hasn’t matched the previous 50. I can think of a few reasons for this:

        1) All the low hanging fruit is gone, only incremental improvements remain. (At least until we gain mastery/understanding of a lower level of physics)

        2) We won’t know how much things have really changed until history makes a determination – it’s hard to see change when you’re in the middle of it.

        3) The previous 50 years (1909-1959) were an outlier because of two world wars, which caused enormous amounts of research dollars to be used with the goal of being able to control the physical world (ie: blow stuff up) on a large scale.

        4) The advances of the information age have obviated the need for a lot of such manipulation. We don’t need new, superfast aircraft when we can have a fleet of predator drones (prop-powered, no less!) constantly hovering over the battlefield. We don’t need new, bigger bombs when we can guide small ones with perfect precision. We don’t need expensive space programs when we have advanced probes and high-resolution telescopes.

        These all seem plausible – the truth likely lies in some combination of all four.

        • Scott Locklin said, on September 2, 2009 at 1:41 am

          Medical imaging is certainly useful in trying to do science, but … color me unimpressed with neuroscience. Mapping the visual system is all very well: try modeling how a sea slug’s 4-5 neuron brain works. Just try it! Anyway, this is a discussion of technology, not science.

          I don’t think atom bombs are outliers: jet engines, antibiotics, controlled fission, radar, the tank … these are all comparable inventions in their impact on humanity which happened in the same 50 year time period. I also do not think 1909-1959 was an outlier. Consider 1859-1909: an even more insanely innovative period, as were the 50 years before that. I’m not really well acquainted with the technologies of 1759-1809, so I can’t comment on that era, but any 50 year period of the previous 150 years has been a lot more impressive in inventiveness and technological innovation than the last 50 years.

          I don’t agree with “information age” ideas making things unnecessary. That’s defeatist. I also don’t think things are appreciably different: we had remote control airplanes (they’re not really pilotless) a long time before the information age. Certainly we had them in common military use in 1958 -the F-106 is a great example of one. If we use more of them now, it’s because they’re cheaper, or it took us this long to figure out what they’re good for.

          I admit that history hasn’t shown what things are important: but it’s worth noticing that nothing important has happened lately. I also admit that things such as biotech could one day be very important in ways comparable to antibiotics or the atom bomb. But, remember, this technology or group of technologies has been with us a good 25-30 years already. Biotech was a huge speculative bubble in the 1980s. People keep telling me it will be important. It certainly provides jobs for friends of mine, so it’s doing something useful enough, but … is it really important? Is the type of thinking we do on such technologies actually fruitful? The fact that I can ask such questions means it’s not clear at all. In 50 years, we might find that a very mundane innovation, say, the peptide synthesizer came up with a lot more interesting drugs and helpful things than all of biotech combined. And that 50 years hence, perhaps not much will be different in medicine or whatever else biotech is good for. That sort of thing rather bothers me, which is why I’m making noise about it.

  3. DefunktOne said, on September 1, 2009 at 7:50 pm


    What if we reach a “singularity” of sorts where humans and computers become one or computers surpass humans? Theory has it that this group would seek to constantly expand their capabilities much like Moore’s Law with microprocessors, leaving the rest of us mortals in the dust.

    Progress would then take on forms unimagined by humans. Could the current lull be the quiet period before the storm?

    “In 1965, I. J. Good first wrote of an “intelligence explosion”, suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.”

    • Scott Locklin said, on September 1, 2009 at 8:12 pm

      Well, I sort of alluded to that as “secular religion for nerds.” Derbyshire described it as The Rapture for Nerds. (def check his article; lots of helpful links to an IEEE spectrum conference on the subject) Computers pretty much suck. I think it’s silly to expect them to even get as smart as a fruit fly any time soon. We could have some sort of AI revolution, but there is nothing brewing at all which indicates this is going to happen. AI which works can be accurately described as signal processing. For example, Boosting is a popular technique at present. It’s based on an idea which was implemented in missile hardware in the 50s (Stochastic computing).
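      To make the “signal processing” point concrete, the core of boosting fits in a page. The following is a from-scratch sketch of AdaBoost on one-dimensional decision stumps -toy data, invented names, purely illustrative; a real application would use a library implementation:

```python
# Minimal AdaBoost on decision stumps: adaptively re-weight samples,
# fit a weak threshold detector each round, and combine the detectors
# by weighted vote. All data and names here are invented for illustration.
import numpy as np

def fit_stump(x, y, w):
    """Find the (threshold, polarity) stump minimizing weighted error."""
    best = (np.inf, 0.0, 1)  # (error, threshold, polarity)
    for thr in np.unique(x):
        for pol in (1, -1):
            pred = pol * np.sign(x - thr + 1e-12)
            err = np.sum(w * (pred != y))
            if err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(x, y, rounds=10):
    n = len(x)
    w = np.full(n, 1.0 / n)        # start with uniform sample weights
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = fit_stump(x, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)  # this stump's vote weight
        pred = pol * np.sign(x - thr + 1e-12)
        w *= np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()
        ensemble.append((alpha, thr, pol))
    return ensemble

def predict(ensemble, x):
    score = sum(a * p * np.sign(x - t + 1e-12) for a, t, p in ensemble)
    return np.sign(score)

# Toy data: the positive class is an interval, which no single stump
# can fit, but a weighted vote of a few stumps can.
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([-1, -1, 1, 1, 1, 1, -1, -1], dtype=float)
model = adaboost(x, y, rounds=20)
print(np.all(predict(model, x) == y))  # → True
```

The whole thing really is re-weighting and thresholding -which is why it is fair to describe working machine learning of this kind as signal processing.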

      • DefunktOne said, on September 1, 2009 at 9:44 pm

        The only reason computers suck is because of the ways people have imagined to put them together. Blame that on the imaginators.

        Rarely does anyone create something new. Rather, they figure out a new way to put the same pieces together, making it significantly more useful or elegant. Music and art are great examples.

        Anyway, maybe the low hanging fruit has already been picked, thus, the lull in progress. If we would stop wasting billions on bailouts of crummy auto companies and insolvent financial companies, maybe we could fund a load of new ideas to get us out of our funk! 😉

        • Scott Locklin said, on September 2, 2009 at 1:46 am

          Well, I am blaming the imaginators! I can imaginate an era in which progress is much more, well, progressive. I remember when I was a boy, I imagined the year 2001 would see me in a hovercraft, watching people fly around Jupiter, and talking to my sentient computer. Instead it’s the same as when I was a boy, except I’ve got a TV-looking thing in my house which allows me to write letters to people who live far away.

          Maybe it’s true that “low hanging fruit” is picked. Maybe we’re just stupider and more cowardly than our forefathers. But, doesn’t it bother you that things don’t get appreciably better? I mean, great: blogs, online movies; I’m glad for such improvements. I just expected more innovations: I mean, we’ve got more Ph.D.’s and technologists alive now than in all of human history before -you’d think they’d come up with a few things comparable to atom bombs and antibiotics.

  4. Warren said, on September 2, 2009 at 4:47 am

    So I think we need to take a step up the technology development chain and look further upstream, and further long term. Any technology that’s eventually commercialized likely has a long and interconnected lineage of development, frequently originating in an R&D lab somewhere. We’re still reaping the benefits of the R&D dollars invested in the 50’s and 60’s, for crying out loud. The obsession with quarterly results has led to a culture where the time horizon for research payback is now measured in months rather than years or decades. Exhibit 1 – Bell Labs:


    The news isn’t all bad, of course. There are still a few ambitious, multi-decade research projects that hold great promise for unlocking generational advances in science and technology. The ITER fusion project and the Large Hadron Collider at CERN are good examples. I’d feel a lot more comfortable about our prospects for continued technological breakthroughs if this sort of ambition was more widespread.

    • Scott Locklin said, on September 2, 2009 at 7:44 pm

      Hey Warren. Yes, the loss of Bell Labs was a great loss for America. However, “Big Science” might actually be one of the causes of the problem. While some of the great innovations of the 40s came from Lincoln Labs and places like that, those were much, much smaller than the present national lab system. For example, the “Skunk Works” of Kelly Johnson was a fairly small group, and they did tremendous things for aerospace.

      Certainly science is bigger now than it ever was in the past, when humanity made its really huge strides forward. I myself worked for the original national lab, LBL, for almost 9 years. This is a very emotional topic for me, but I can attest to the fact that you become a cog in a vast unpleasant machine. Want to publish a paper outside your program tasks? Unless you are a political creature, you will be ruthlessly crushed. And, honestly, I’d say I had a fairly pleasant career there compared to most people. As such, I do not think things like ITER or the LHC are going to get us far. While this is another topic, I question the value of unification physics in our age (the reason we built the LHC). E + M got us the modern world, but EM + W … I daresay nothing will ever come of that. A huge fraction of physicists are working on string theory and LHC boondoggles. Why? Compared to something like mesoscopic physics or collective effects, these are tremendously boring areas of study; ones which lend themselves to bureaucratic sclerosis. The entire field of string theory seems to be 10,000 people dotting Ed Witten’s i’s. That’s absurd. It should be 100 people, or 10 people. That 10,000 people should be thinking about other kinds of problems.

      I don’t know why NASA in the early 1960s was so innovative, and why NASA from 1967 to the present has been a complete boondoggle. Examining the sociology of NASA in that era might be informative.

      Also: one of the horrors of national labs -there were these old guys who were there in the glory days. Usually they were uneducated men; working-class machinist types who cut their teeth in the Navy. They were capable of building just about anything; these guys were really incredible artists. Not only are these guys dying off without passing on their knowledge to new generations -nobody cares! When I was there, there was this guy Noel Kellog. I could ask Noel to build me a cryochamber with electrodes and X-ray windows in it for keeping a torr of helium in a UHV system, and he’d go away and build the freaking thing from a piece of aluminum; he’d come back to me a few days later, and it would work. There is no guy like that now. Now I have to go through a CAD designer, who will send it to a CAM shop, and the thing will come back with something missing, like no wires to hook up to the electrodes. Instead of employing one guy for a few days, the budget will get chopped up among a half dozen people, it will take months instead of days, and I won’t get what I want anyway. Sure, they can build things now that they couldn’t have before, but there is something missing in the food chain: the guy who can screw around in the machine shop and build something that works from a pencil sketch. There are guys who want to do that, who are capable of it, but that sort of career path isn’t available to them any more.

  5. Brent said, on September 8, 2009 at 6:03 pm

    It will be difficult to finish reading a post that makes a claim that no one, let alone the author, can make with any authority. Even if the definition of “progress” or just “technical progress” was objective and precise, which it isn’t, nor could it be, the scope of the assertion is beyond absurd. Hubris.

    • Scott Locklin said, on September 8, 2009 at 6:14 pm

      The scope of the assertion isn’t beyond absurd: people write books on the subject asserting the opposite all the time. If you disagree, you might try pointing out something which has improved in the last 50 years which is a bigger leap forward than, say, the atom bomb or antibiotics. I maintain that your inability to do so is good evidence that the last 50 years have been a massive technological disappointment.

  6. Gaw said, on September 9, 2009 at 3:47 am

    Thank you. Your blog is full of interesting ideas. This latest thought-provoking post
    also provoked a modest post on my part:


  7. Melvyn said, on September 26, 2009 at 4:44 am

    After reading this article, I realised you really didn’t read and/or understand Nassim Taleb’s Black Swan book and just wanted to be a twerpy attention-seeking pseudo-quant. Practically nobody in the academic universe believes in the CAPM model these days. The belief that returns are distributed normally is far more clownish than the belief that risk cannot and should not be quantified. Try doing some real research before you make a fool of yourself again.

    • Scott Locklin said, on September 26, 2009 at 5:25 am

      Thanks for your substanceless comment, ding-dong. Had you actually managed to say something pertinent or intelligent, I might actually be a little embarrassed. Cheesing off a turd flinging monkey: that’s just a job well done.

      • Melvyn said, on September 26, 2009 at 5:33 am

        Yep, just as I thought. Nothing to say about the normality assumptions. Some existentially-challenged dork looking to take jealousy-laden potshots at something he knows very little about. Next!

        • Scott Locklin said, on September 26, 2009 at 5:58 am

          Nothing to say about normality assumptions? Did you say something clever about normality assumptions? If you did, which you didn’t, why would you do so in a thread on the history of technological development? Why don’t you go look at the nice video, monkey boy?

  8. John F. McGowan said, on September 29, 2009 at 12:58 am

    There has been poor progress in power and propulsion technology since about 1970. This can be measured fairly objectively by metrics such as the top and average speed of transportation vehicles, the cost of transportation, and so forth.

    There was substantial and fairly steady progress in power and propulsion technology from about 1775 when the separate condenser steam engine was invented until about 1970. This involved not only incremental improvement of various technologies but a series of technological leaps including the development of the separate condenser steam engine, high pressure steam engines, internal combustion engines, jets, liquid propellant rocket engines, and nuclear power. The minimal progress in power and propulsion technologies is very important because this is one of the main drivers, if not the main driver, of the standard of living.
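    The speed metric above is easy to check against the absolute air speed record. The figures below are approximate, from memory, and meant only to illustrate the plateau being described, not as authoritative data:

```python
# Rough figures for the absolute air speed record (km/h). Treat the
# numbers as illustrative approximations, not authoritative data.
records = {
    1906: 41,    # Santos-Dumont 14-bis
    1913: 204,   # Deperdussin Monocoque
    1931: 656,   # Supermarine S.6B
    1945: 975,   # Gloster Meteor
    1965: 3332,  # YF-12A
    1976: 3530,  # SR-71; still unbroken as of this writing
}

years = sorted(records)
for a, b in zip(years, years[1:]):
    # compound annual growth rate of the record over each interval
    annual = (records[b] / records[a]) ** (1.0 / (b - a)) - 1
    print(f"{a}-{b}: {100 * annual:4.1f}% per year")
print("1976-2009: 0.0% per year (no new record)")
```

The growth rate collapses after the mid-1960s, and the 1976 record still stood when this was written -which is the plateau in a single number.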

    One probable reason for the slowdown is that there is a huge amount of trial and error in the development of new power and propulsion systems. Huge devices like tokamaks, modern nuclear reactors, full scale rockets, and so forth are enormously expensive to modify or build from scratch. So it is very difficult to replicate the thousands of trials that historical inventors like James Watt conducted on often tiny prototypes to research and develop new technologies. It simply costs too much, and it is extremely difficult to secure authorization or funding for the radical changes that are usually needed to overcome the fundamental problems that cause prolonged plateaus in performance. Indeed, fusion/tokamak research seems to have responded to each setback by building an even larger and more expensive tokamak.

    I have written more on this in my Space Review article “Cheap access to space: lessons from past breakthroughs”




  9. Scott Locklin said, on September 29, 2009 at 9:18 pm

    I may have actually read your essay in preparing my own article. I agree that cheap trials are extremely important in innovation. I think some of the other essays above talk about this also. Steam technology made it easy to try out new ideas; just hack up a bunch of brass on the lathe and see what happens. Same with high level programming languages, versus writing all your code in opcodes.
    This dovetails nicely with my assertion that “big science” is a part of the problem. When I worked at the Advanced Light Source, I was in a room with 200 smart people and countless bureaucrats, all dedicated to the “program task.” To actually do something creative, you had to claw your way to the political top of that 200 people, to say nothing of the user community. For myself, most of the interesting stuff I figured out in my time there was a result of my screwing around on things that had nothing to do with the program task. Of course, the political juice needed to pay for all those people had its own agenda, which wasn’t necessarily discovering new things. There was no rapid prototyping there. Everything involved at least a half dozen people one way or another, and most of the projects there were developed on a timescale of years. Not many people are willing to take a career risk on a multi-year time scale.

  10. Arthur said, on September 29, 2009 at 11:06 pm

    May I offer a few half-baked, but possible, reasons that scientific progress may have slowed:

    1. We experienced a rapid advancement of technology when we first learned to harness the power of naturally-occurring materials like coal, oil, conducting metals, and semiconducting materials. We have picked the “lower-hanging fruit,” and now advances will come more slowly, and will rely on new materials of our own creation. Because this time around we have to invent both the machines AND the materials, progress will be slower.

    2. The next step in advanced machines and gadgets would require so much energy, that they would pose unacceptable risks to society. For example, nuclear power in automobiles, or liquid nitrogen coolant in your cell phone.

    3. A scientist must first master the accumulated knowledge of his discipline before advancing beyond this, and innovating. The size of that accumulated knowledge is now immense. A scientist is likely 35 years old before he has gained this proficiency. However, I have read that a human brain is its most nimble and malleable at the ages of 20-25 years old. Many scientists from 100 years past made their greatest contributions at such an early age. Perhaps that is no longer possible.

    • Scott Locklin said, on September 29, 2009 at 11:20 pm

      The problem with 1) is that people always say this. Lord Kelvin said it a long time ago, right before they cooked up quantum mechanics, statistical physics and nuclear physics. I know that isn’t much of an argument, but, for example: why haven’t we been able to harness something simple, like quantum mechanical collective effects? All kinds of potential wonders there, yet things like quantum computing remain preposterously distant possibilities, rather than something I can go buy.

      2) … mmmm, maybe … maybe not. Sure, hypersonic jets would be incredibly dangerous things; as that v^2 term indicates, there is an awful lot of energy involved. We’re a bit limited by the energy densities of hydrocarbons, though.

      3) I think this is more a defect in our educational system than anything else. I knew, effectively, most of what I know now in my early 20s. I’m presently marginally better at stuff like programming computers, and I’ve stuffed more facts into my noggin, but I was capable of doing what I do now, then. Honestly, I’ve only come into my own as a creative thinker since I quit the gerbil wheel of industrial/academic life and done some thinking on my own. I think it was Cvitanovic who pointed out that physicists used to have lots of great ideas going for walks: I know I do. Why don’t physicists go for more walks? I think because they’re too busy doing tedious nonsense like editing crappy journals and sitting on impossible committees.
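      The v^2 point in 2) above is worth a back-of-the-envelope check: kinetic energy per kilogram of airframe, set against round figures for chemical energy densities (gasoline roughly 46 MJ/kg, TNT roughly 4.2 MJ/kg; the 340 m/s speed of sound is likewise an approximation of this sketch):

```python
# Kinetic energy per kilogram of vehicle at various speeds. The point:
# energy grows as v^2, so a hypersonic airframe carries energies
# approaching the chemical energy density of explosives. Round figures
# assumed: gasoline ~46 MJ/kg, TNT ~4.2 MJ/kg, speed of sound 340 m/s.
MACH = 340.0  # m/s, approximate sea-level speed of sound

for mach in (1, 3, 5, 10):
    v = mach * MACH           # speed in m/s
    ke = 0.5 * v ** 2 / 1e6   # kinetic energy, MJ per kilogram
    print(f"Mach {mach:2d}: {ke:5.2f} MJ/kg")
```

At Mach 10, every kilogram of vehicle carries more kinetic energy (about 5.8 MJ) than a kilogram of TNT releases chemically -one way of seeing why hypersonic flight is both hard and dangerous.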

  11. John F. McGowan said, on September 30, 2009 at 12:49 am

    The argument that earlier generations did not have to invent new materials and today we do is certainly not correct. Metals such as copper, iron, steel, and so forth are not naturally occurring materials, with rare exceptions (e.g. a few meteorites, which are mostly an iron-nickel alloy of variable properties). In fact, it took earlier generations a long time to develop the mining and metallurgy to produce these materials that we take for granted.

    Steam engines, for example, were known and probably built as far back as ancient Alexandria, but ancient metallurgy was generally quite poor. It took major advances in metallurgy to make the steam engine possible. In fact, the early steam engines of the 18th and 19th century routinely pressed the limits of existing metallurgy; many blew up just like rocket engines today.

    Similarly, gasoline, kerosene, and a number of other fuels are not naturally occurring. It required significant research and development to learn to make these fuels from naturally occurring crude oil. You can’t run your car or other internal combustion engines on crude oil.



    • Scott Locklin said, on September 30, 2009 at 1:27 am

      Hey John: thanks for the spirited disagreement! I really appreciate it. OK, I agree that material science is slow developing … at times. But consider my SR-71 example. An awful lot of titanium metallurgy was done in a couple of years at Lockheed. Why were they so damn productive? Why do material scientists so egregiously fail to deliver interesting things today? We’ve been reading about the miracle of carbon fiber composites for decades now, yet we still don’t make cars or airplanes out of such things (the Dreamliner maybe … if it ever flies). What’s up with that? There’s also stuff like maraging steels, which are probably useful everywhere, but because nobody orders them by the ton (except people trying to build atom bombs), they remain expensive and exotic.

      I suspect the ancients were both more and less clever than they are commonly given credit for. For example: nobody has any real idea (to my knowledge) where things like clockwork mechanisms came from. The Antikythera mechanism indicates that even very advanced clockwork, to the point of analog computing (which requires at least decent bronze metallurgy), is very damn old! Then you look at Heron’s steam deelie boppers, and it’s fairly obvious they were missing the basic concepts required to make a Newcomen engine. For what it is worth, hobby steam engine boilers blow up all the time even today. There are fairly easily met conditions which can cause even the greatest of modern super materials to explode spectacularly. I guess that plays back to your point about energy densities.

      Let’s say we exclude stuff like transportation, and things which can potentially explode in general, and concentrate on individual fields. Obviously telecom has done very well. Computer power per cubic inch has gone up dramatically. Silicon material science is fantastically better than in the past, as is material science pertaining to magnetic materials. All this is great: it provides us with a living, and makes this conversation possible. My problem is, I have a bigger imagination than, “gee, wouldn’t it be nice if we could make really little magnetic domains on a thin film, so I can store all my old girlie magazines in a few microns of stuff.” Other people have big imaginations too. Really strong stuff would be good. Really good conductors at room temperature would be awesome. Clever new ways of turning heat or other forms of exploitable energy into electricity would be swell. I see no progress in these important areas. If there is a way to make progress in these areas, it seems like someone should break it down into doable pieces, and bloody well do it. If there isn’t, maybe people should explain this, so people stop wasting their time on impossible stuff (much of “renewable energy,” alas, falls into this category: people wasting time on the impossible). Anyhow, maybe I’m wrong and something exciting is just around the corner: that would be great.

      • eminence_gris said, on December 1, 2009 at 11:00 pm

        An interesting addendum to the metallurgy issue is that steam engines need gaskets. Steam engine metallurgy got “good enough” quite rapidly, which moved the bottleneck elsewhere: to the gaskets. Leather was about the best gasket material available at the time, and it wasn’t really very good. But it was basically all they were going to get (and they did quite well with it!) until materials science advanced a LOT further.

        Materials science as a whole is a big part of our current technology. We basically know how to make stuff with whatever properties are physically possible given the combinations of atoms at our disposal. However, there are lots of properties which might be DESIRED, but which we have no way of coaxing actual bits of matter into exhibiting.

        Fusion, for instance, would be a lot easier if we had stuff which could handle the energy density required. But we don’t, nor do we really have any good ideas of anything to even try. Materials science is a mostly known field, with mostly known limits. There might be breakthroughs lurking in the corners, but the current state of the art might be best expressed by that classic scene from Apollo 13:

        “We’ve got to find a way to make THIS… fit into the hole for THIS… using nothing but THAT.”

        • Scott Locklin said, on December 1, 2009 at 11:19 pm

          Yeah, I kind of look at ideas like “nano” as a “wouldn’t it be nice” form of materials-science bullshitting.

          I recently bought a little unimat lathe/mill to fiddle with, and maybe make some little steam engines and clocks and stuff. I used to make a lot of stuff when I was doing physics; I figure the lathe will keep that part of my brain alive and functioning. Having a lathe, and having to buy and work with real materials, really, really makes you think about the development of material science.

          Making cheap steel was an amazing innovation that really changed everything. You can look at most of the late years of the industrial revolution as being “cheap steel.” Which is why, when presidents wanted to tamp down inflation, they’d do price fixing of steel, and browbeat labor unions into not clamoring for a raise.

          Other amazing things about materials: an incredible amount of “low” technology (like most of your engines) still uses cast iron, stuff that was used by prehistoric Celts and other primitive peoples. I find that amazing, and indicative of how trapped we are by the periodic table of elements. Also, aluminum is a horrible structural material, yet its lightness and cheapness makes it “high tech” compared to steel or iron.

          For myself, I think it’s very useful to go make stuff to think about what technology is. So many people (Kurzweil!) have merely studied some small field, and think they can extrapolate their field to others, like manufacturing. All those people talking about the coming revolution from “solid printers”: jayzus, STFU until you’ve gone into a machine shop and made something at least as complex as a metal scribe, or tapped a goddamned thread!

  12. John F. McGowan said, on September 30, 2009 at 5:39 pm

    I personally think there has been a slowdown in many fields. I think this is true even in computers and electronics, where for example progress in artificial intelligence and pattern recognition has been quite limited. I attribute the slowdown to a combination of several overlapping factors. These include:

    1. In some fields, such as power and propulsion, the per-trial cost has become extremely large. This may not be inevitable. I question the wisdom of pursuing concepts that involve billion dollar prototypes. It seems to take hundreds, thousands, even tens of thousands of trials to research and develop a major new technology. Hence a concept with a per-trial cost of a billion dollars is extremely unlikely to bear fruit. Other concepts with lower per-trial costs should be seriously considered.

    Specific examples where the size and cost of the prototypes has become very high are orbit-capable rockets, other space vehicles, tokamaks, inertial confinement fusion, nuclear reactors, etc.

    2. There seems to be excessive reliance on computers and computer simulations. Many fields where progress has slowed were actually early and heavy adopters of computer simulations, CAD tools, and so forth; the aerospace and automobile industries, for example, are among the major customers for MATLAB and many other computer tools. The slowdown in aviation and rocketry seems to coincide with the adoption of computer methods. The results with slide rules, curiously, were better.

    I think there is a large amount of conceptual analysis in major inventions and discoveries, usually expressed in words and pictures. Computers and computer programs are not capable of conceptual thought. They don’t have “intuition”. They are not capable of “philosophy”. The heavy reliance on computer methods tends to crowd out the conceptual thinking that has historically been used to make major inventions and discoveries (e.g. physicists taking long walks).

    3. The modern system of heavily and centrally funded “Big Science” research has great difficulty abandoning ideas that are not working out. Many major inventions and discoveries involved at least one blind alley that led the inventor or discoverer astray for years. It seems to take many trials and usually five or more years for an inventor or discoverer to decide that an idea is probably flawed and devise a superior alternative. It is hard to do this even as a single person or small team in a workshop or lab. To some degree, this is prudent behavior. One should not abandon a “good idea” without strong evidence. However, modern research programs exhibit extreme difficulty in abandoning or even questioning ideas that don’t seem to be working out. Often the idea is explicitly or implicitly part of the funding pitch to the government and public. Hence, questioning it can be risky. And modern research programs are usually very strongly committed to a single idea or “paradigm” (an overused term); there is very limited diversity of ideas.
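    The per-trial-cost arithmetic in point 1 above can be made concrete (a sketch only; the trial counts and dollar figures are the comment’s rough, illustrative numbers, not data):

```python
# If maturing a technology takes on the order of 1,000 trials,
# total program cost scales linearly with the cost of one prototype.

def program_cost(trials: int, cost_per_trial: float) -> float:
    """Total cost of an R&D program that needs `trials` attempts."""
    return trials * cost_per_trial

for per_trial in (1e6, 1e8, 1e9):   # $1M, $100M, $1B per prototype
    total = program_cost(1000, per_trial)
    print(f"${per_trial:>13,.0f} per trial -> ${total:,.0f} total")
```

    At a billion dollars per prototype, a thousand-trial development path implies a trillion-dollar program, which is the sense in which such concepts are "extremely unlikely to bear fruit."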



    • Scott Locklin said, on October 13, 2009 at 9:59 pm

      You know, your last post could have the makings of a good manifesto on this sort of thing.

  13. […] This is the post, which also links to an article that Dr. Locklin wrote for a magazine I have heretofore not heard of (but then again, I haven’t heard of almost everything, mathematically speaking (I have not taken a real analysis or measure theory course yet but I do grasp this concept from reading about it on wikipedia), considering how much content there is out there on the vast interwebs). […]

  14. Bruce Charlton said, on June 25, 2010 at 10:00 pm

    I saw this link on Mangan’s – where you commented on my piece. Nice article and some excellent points here.

    I see we were both very impressed by Charles Murray’s Human Accomplishment – it made a huge impact on me.

    Progress has indeed slowed, and overall I think has been in reverse for some time (certainly in medicine).

    But the reason this is not obvious is that in other ‘media’ areas ‘progress’ has been very rapid: public relations, marketing, advertising, design, hype, spin and dishonesty.

    • Scott Locklin said, on June 25, 2010 at 10:11 pm

      Medicine is something I know little about; I had sort of assumed from a distance that things had gotten better in some areas. But people don’t seem to live much longer or better.

      I agree to a certain extent that media is responsible for our blindness to the decline. I think it’s also one of those things which is baked into our self-conception as a civilization: progress is just assumed to be true. My more liberal pals think the democratization of technology is a not inconsiderable form of progress. I guess if I lived in the third world, I’d agree, but I don’t, and it doesn’t change the fact that peak civilization is not moving forward.

      I actually read Murray’s book after I wrote the article; a friend recommended it. It’s amazing to me that nobody ever wrote such a thing before. You might enjoy reading Kishore Mahbubani’s essays.

  15. Eric said, on August 9, 2010 at 4:23 pm

    One of the biggest reasons space travel hasn’t really taken off is the government-industrial complex. Jerry Pournelle has often written about the death of the X programs: small, tightly-focused, relatively cheap programs to explore ONE problem in detail. The last X program was the DC-X, and it was done under the sponsorship of a single person; when that sponsor died, the program was merged into the technological monstrosity known as the X-33, which required the simultaneous solution of five distinct problems, all of which had to succeed.

    Alan Kay has spoken extensively on the lack of innovation in computer science as well.

    • Scott Locklin said, on August 10, 2010 at 3:08 am

      I can’t argue much; having worked for a government lab, I know such places follow their own logic, which has little to do with achievement.
      NASA is arguably worse; it was envisioned by Johnson as a sort of baksheesh “southern strategy” to reindustrialize the (very) post-Civil War South. I don’t know how that’s working out, but I suppose the South did take some heavy industry from the North.

      • CF said, on August 25, 2010 at 6:30 am

        Hi Scott,

        Why is the lack of this sort of technological progress so lamentable? I’m not sure that the type of progress you’re looking for is even desirable.

        1.) Space. We don’t really need to go to space for much. Satellites get up there just fine with current tech. Wait a century or two, and living on a giant brown desert planet might then be an appealing alternative to Earth. We (thought we) needed to go to space because the Russians were beating us there at mid-century.

        2.) Energy. Oil isn’t expensive enough yet, and the alternatives aren’t likely to be an improvement. We’ll break even at some point with something, but I don’t expect much until there’s an imperative. For now current battery tech suffices for our needs, as well.

        3.) Climate engineering. See #2, we don’t need it yet, and we (politically, and maybe otherwise) can’t stop the Great CO2 Spew.

        4.) Computation. I would love to see AI improvements, but I also wonder whether we need them right now. There are some positive signs that more innovation here may come soon, e.g. in automation. There have been several recent and very successful attempts to have computers drive cars without intervention. I predict we will make real progress in AI only when it is cheap and easy to make machines with effectors and decent navigation. Right now it’s difficult to even say what AI should do, but when robots are moving around and doing stuff it might be more obvious.

        Some of the “technologies” being developed right now probably don’t look like technology to you. Disclosure: I’m a psychologist / cognitive neuroscientist, which probably instantly discredits me in your eyes, but hear me out. Imagine the following innovations:

        a.) Tech that helps people achieve happiness [without medicating the shit out of them]
        b.) Tech that helps people make better decisions. For example, make healthier food choices, choose a more appropriate spouse, or not commit a crime.
        c.) Tech that helps people learn faster, more, and at a deeper level.

        Psychology doesn’t have a lot to show currently for the above (although there is a lot of untapped potential in extant results), but I think we’re getting there. A huge assist is the development of the Internet and portable internet-connected devices. We’re now at least measuring happiness, for instance, and we can do it while people are not in the lab thanks to these devices.

        An aside on rapid prototyping: I am very passionate about it, and in psychology rapid progress is extremely possible. I have conceived of experiments, programmed them, and collected pilot data from undergraduates in less than a week.

        Another aside: in psychology / cognitive neuroscience, labs generally consist of one P.I. plus a handful of grad students and postdocs. There is no point to the giant institutional labs that you rightfully deride as stifling innovation. Psychologists also have relatively thin boundaries within the discipline: I jump regularly between studying visual perception and decision-making, and no one in power has done anything but encourage my erratic, ADHD impulses.

        The point is that what looks to you like incremental progress (the Internet and cheap computers) is actually opening up new worlds in disciplines that can’t hope to be as advanced as your physics within our lifetimes, but which are extremely free and open by comparison. You’re quite right that we can’t model the CNS of even the simplest slugs or flies, but there are “technologies” such as the quantification and observation of happiness that have incredible potential to change our lives, even though they don’t look the same as the invention of an engine or a transistor. In my view, that’s the most likely source of the next burst of innovation, and progress in this domain is invisible to outsiders because we are still figuring out how to do our science correctly, or at least productively. Even though, like you, I tend towards skepticism that borders on cynicism, I see a lot of progress around me, and thankfully we are not quite as encumbered by institutionalism as physics, for instance.

        There is a tremendous amount of bullshit spewing from my field (ever heard of “neuromarketing”?), but it is a very attractive field for a true innovator to be right now, for the above-cited reasons.

        Thanks for this fantastically interesting blog; you’ve caused me to stay up entirely too late!

        • Scott Locklin said, on August 25, 2010 at 8:43 am

          I hold a very dim view of psychology as a field. For example: your a) and b) were things that people used to naturally possess via tradition and religion. Most unhappy people are fat, stupid and unhappy because they’ve been sold a lifestyle which is incredibly stupid and unhappy-making. Those things very obviously worked better before psychologists and egalitarian social engineers started noodling with society. The way I look at it, you lot broke Western Civilization, and now you’re claiming you have some band-aids to make it look better. No thanks. I’ll stick to my working class remedies of “grin and bear it” and “don’t be a dummy.” They’re cheaper and work better.

          I don’t particularly believe you with c). Consider this: my mom went to an ordinary catholic high school. When she graduated, she knew Latin, French and Anglo-Saxon, as well as integral calculus. Apparently, them nuns who taught her knew something which people have now forgotten about teaching methodology. For what it is worth, she didn’t do much homework either, and her school was quite ordinary. And don’t try to tell me that learning these things is somehow irrelevant: it’s not. I feel like someone robbed me of my birthright, as should anyone else who speaks English but never read the Exeter Book. I’d tell you to go ask the nuns how they did that, but your lot bought them vibrators and taught them to be lesbians, so there aren’t any left to talk to.

          If you don’t understand what space flight is for, well, I can’t help you see what glory and the human spirit are. Maybe the Futurist Manifesto can help you. If not, red meat and testosterone shots. Your other examples … I mean, you don’t seem to like the idea of technological progress very much. I do. I think it’s bizarre and insane that I live in a time touted as “advanced” which isn’t making any observable advances. I think it’s even more bizarre and insane that I’m surrounded by people who don’t notice this.

          Anyhow, thanks for the praise. I’d like it if you were right and I am wrong, and we’re about to step into some great new age of human advancement, even if it’s only some nebulous kind of psychological advancement.

          • CF said, on August 25, 2010 at 4:38 pm

            You seem to have lumped into “my lot” a lot of people that I have nothing in common with, but OK. I have mixed feelings about convincing nuns to be vibrator-wielding lesbians, for instance, and I don’t think it would ever even occur to me to fight such a battle. We’re not all super-PC egalitarian tree-huggers who think everyone has the same potential and should be getting off all the time, you know. Even so, I don’t think “my lot” is responsible for making people fat and stupid and so forth. Actually, aren’t those things due more to rapid technological advancement in other fields? For example: the green revolution, big agriculture, cheap cars, cheap fuel, interstates, and the drug wars and civil rights struggles that pushed people to the suburbs. A lot of that doesn’t really fall into my domain, and I guess if “social sciences” is one big amorphous blob to you, I can understand why you might feel that we’re all part of the problem. But fields like urban planning have been slow to recognize the real consequences of these advances and to accommodate them, and there’s a lot more to that than just some evil psychologists hatching plans about how to make people happy.

            I agree with you that the type of education you describe is awesome, and I actually think “my lot” will help you recover that in time, but we’ll just have to wait and see. The pivotal thing is the ability to measure learning rates and improve them, and then to generate the tutoring content in those domains. I think a lot of good education is basic memorization, and that’s so easy to improve over what we currently have that it’s laughable.

            I think you misunderstood me regarding space flight and so on. Personally, I think it would be fantastic to have the sorts of advancements you describe. It makes my spirit soar, because I am a giant nerd. However, I think the advancements prior to the last 50-year period were due to need (real and perceived). They have sort of temporarily accomplished their goal: we are relatively war-free compared to any period in recorded human history; we are not yet at our population cap with current agriculture tech; communication has gotten about as good as it gets; and we are not yet running truly low on resources (though we may be close). I think once the needs start pressing in on us, advancements of those sorts will soar again.

            By the way, the Italian Futurists were batshit crazy weirdos whose only accomplishment seems to have been producing utterly terrible music. 🙂

            • Scott Locklin said, on August 25, 2010 at 5:49 pm

              Urban planners: only modern people could consider it a great breakthrough to create town squares and public space on human scales … you know, like every society has done for the last 6000 years. I don’t blame technology for the fact that people are fat: if it were true, French and Italian and Japanese people would be as fat as Americans. I blame lack of self control, which is something sold to humanity by social engineering types.

              OK, OK, it’s not your fault that society is dissolving into barbarism, but it is ideas like yours which have caused this, so I’m not a big fan. Tradition is there for a reason; how come we don’t see more psychologists and sociologists figuring out why people who follow tradition are happier and smarter, and live longer, than people who don’t? Seems like a dirt simple subject for research to me. I figure nobody has done it, because those fields are the enemies of tradition. Only recently have people guardedly begun to notice that, hey, in fact, people from normal intact traditional families do better than people who are from, um, “alternative” family structures. Even now, even though it’s obvious, saying that makes one a thought criminal.

              Indeed, them Futurists were crazy weirdos, but they did invent electronic music, and had some cool paintings and other art. Had they a government behind them, like von Braun did, they might have accomplished something more impressive.

              • CF said, on August 25, 2010 at 9:15 pm

                Hm, what do you mean about “lack of self control…sold to humanity by social engineering types”? I’m confused about who sold this, in what form, and why, but I don’t think it has anything to do with me and “my lot.”

                I’m not talking about top-down social engineering so much as bottom-up tools that someone can choose to use or not. I’m not interested in forcing people’s hands; they can decide for themselves whether they’ll be happier or better off using these tools.

                There is tons of interest in my field around religion and tradition, and their relationship to happiness and social structure. Even though I don’t personally study it, I know of certain findings, e.g., that religious people tend to be more generous than non-religious. Obviously, if there were a political aim behind that finding, it would be pro-religion. So I don’t know why you imagine that no one is studying that sort of thing, because everyone I know seems deeply interested in it. I can’t speak to sociology, but I don’t get the sense at all that the general bent of people in my field is to act as “an enemy to tradition.”

                The primary drive, rather, is to uncover truths about the world, just as in any other science.

                • Scott Locklin said, on August 25, 2010 at 9:37 pm

                  Self control was a part of religion and traditional culture for, like, forever… until around 1970. Even David Brooks noticed this in a recent editorial.
                  It’s pretty obvious who is to blame for the lack of these things in society: social engineers and marketers, springing from destructive social “scientist” groups like the Frankfurt school and Freudianism. There was a BBC documentary on the subject some years back, “The Century of the Self,” which, somewhat surprisingly, had an almost Paleoconservative point of view. Anyhow: there are fewer fat people in Canada also. Social decay isn’t quite as advanced there.
                  I don’t doubt that you, as an individual, are a decent person who is interested in scientific truth. The thing is, the paradigm your science is based on is something I consider destructive. There’s not much you can do to convince me otherwise, other than to give me back my space rockets and Anglo-Saxon teaching nuns.

                  • CF said, on August 25, 2010 at 9:57 pm

                    Dude, you’re talking about a lifetime ago, another world. The field has moved on; no respected psychologist spends time worrying over the Freudians, the Frankfurt school, or any of the rest of the psychoanalytic BS. Instead, the experimental tradition presaged by people like Wundt won out in the middle of the 20th century. Only idiot philosophers and literary types have anything to do with Freud etc. any more.

                    So, psychology as it exists now is very young. Fields like physics and chemistry have centuries of work behind them. We’re just finding our legs…

                    But I guess I’ll stop trying to talk you out of what I believe are seriously mistaken notions…I’d be delighted to give you your rocket ships and nuns if I could. I personally want nothing more than to become a cyborg and live for thousands of years, so I feel your pain of unfulfilled technological ambitions….slash…weird nun fantasies…

                    • Scott Locklin said, on August 25, 2010 at 10:06 pm

                      Yes, yes, I know: I’m worried about what fun stuff is being developed now. Some people worry about self replicating nanobots from physicists; my cross to bear is worrying about what fun insanity is going to come out of the psychological/psychiatric schools next. No respected psychologist worries about lots of things I worry about.
                      Anyway, I blog about this sort of cultural decline stuff elsewhere. This blog is on technological stasis, and possible eventual decline.

  16. Mikhail Glushenkov said, on August 26, 2010 at 8:52 pm

    > If you’re far from civilization, you will have no dial tone. If you were far
    > from civilization in 1959, you will have no pay phone.

    That’s what satellite phones are for.

  17. Technological decay « Jim’s Blog said, on March 19, 2012 at 1:36 am

    […] A number of posts have appeared by a number of people reporting slowing in technology, or actual decline in the level of technology: See Locklin for a summary and review. […]

  18. […] Locklin – Myths of Technological Progress […]

  19. Andreas Yankopolus said, on October 22, 2018 at 1:08 am

    Link to the IEEE article on archive.org: https://web.archive.org/web/20090905081154/http://www.spectrum.ieee.org:80/robotics/robotics-software/singular-simplicity/0
