Locklin on science

Computers reduce efficiency: Case Studies of the Solow Paradox

Posted in Progress by Scott Locklin on November 21, 2023

I’ve harped on this in my sidewinder and slide rule blergs, as well as older ones: very often, using a computard is a recipe for failure, and old fashioned techniques are more efficient and useful. The sidewinder guys at China Lake actually called this out in their accounts of the place going to seed: computers and carpets ruined the place’s productivity and most of all its innovation. I’ve mentioned my own anecdotes of CAD failures, and the general failings of computers in a design cycle. This was an early realization of mine, from even before the internet existed as a distraction. In my first serious programming projects I had a lot of good experiences literally writing out the Fortran on a giant brown paper trashbag cut up and stapled into a scroll-like object, compared to people who would laboriously use WATCOM or Emacs or whatever people were using as an IDE in those days, peering through the toilet paper tube of 640×480 90s era monitors. I attributed this to simply being able to look at the whole thing at a glance, but for all I know, writing it with a pencil and engaging my hand’s proprioceptors, or not looking at a hypnotically flickering screen, were the magic ingredients. I’m beginning to think the latter things are more important than people realize.

The RAND Corporation did a study of the failings of CAD in the design of British nuclear submarines. Before computard/CAD tools, people would use the old timey techniques of drafting in 2-d on a piece of paper, and building scale models to see how the pieces fit together. Literally laboriously using weird T-square tools and a pencil and building plaster models was faster than using advanced computard technology. Again, this is something I’ve actually experienced in building experimental physics gizmos. You can spend months on a design in Solidworks and make something impossible to fabricate which doesn’t line up with the rest of the system: I’ve seen it happen. A dude with a tape measure can fix it in a few hours if it’s a one-off; somehow these problems don’t come up with models and slide rule and paper design. This was admitted in Parliament in their investigations of the cost overruns on the Astute class submarines. It boggles my mind that people still don’t realize this is a real problem. We get mindless repetitions that “software is eating everything” like some kind of mantra, despite evidence to the contrary. Instead of studying the problem, it’s simply dismissed. Nobody trains in non-CAD drafting any more, so we can’t exactly go back to that.

Now the RAND study did sort of elide the core problem by stating that American expertise at Electric Boat (who had been using CAD and run into many of the problems before) helped unfuck the British program. They did not ask the basic question of whether or not CAD was mostly harmful; it may well be that its use reduces productivity overall, and that people would be better off only using it strategically. We’ll never actually know, because unless the Russians are still using old timey drafting methods, we don’t have a comparison class which isn’t time-censored (the Chinese would never think of this: using paper would be seen as losing face).

Another study looked at CAD design back in 1989. The author uses the example of printed circuit design, something that has long since been given over to CAD. Back in those days a lot of the designs had to be refactored by hand. He also notes the danger that future generations of designers might have atrophied skills which won’t enable them to do this. He notes that CAD didn’t eliminate the job of the draftsman or increase his output; he just does it on a computer now.

For another example, Richard H. Franke studied an early adopter of now widespread computer technologies: the financial services industry. This is wholly remarkable because if any field should show an increase in productivity due to computer adoption it would be financial services, but he pretty definitively proved that, up to 1987 anyway, the productivity of financial services went down due to the introduction of computers. Not by a little bit either: by a lot:

Note the traditional non-computard plot he made: it probably got published 2 years earlier because of this

Note in the same paper he found that the introduction of CAM in manufacturing was also associated with a similar productivity decline. You can sort of imagine why: computer equipment was expensive and people had to learn how to use it. But there are probably larger scale effects. I have small machine tools in my house; none of them are CAM tools. If I want a part, in most cases it’s dirt simple to get out the calipers, visualize it in my mind and make the damn thing. At most I need to fool around with a ruler and a piece of graph paper. I can’t make everything this way, and there are a number of doodads I’d have a hard time with, where a $50,000 CAM mill that fills my entire machine shop would be able to do it. The CAM thing would do the corner cases, but I’d spend months learning how to use the thing, spend tons of loot keeping it running (it’s much more complicated and prone to failure), and I’d spend all my time ministering to this monstrosity, learning to use whatever CAD tools are out there, and forgetting how to make precise cuts on my manual mill and lathe. The same story was probably true in FinServ. Their routine tasks were made more complicated by ritual obeisances to the computer gods.

Somewhat to my surprise there are enough examples of this that economists have actually come up with a name for it. It’s called the Solow Paradox. Robert Solow is a 99-year-old MIT emeritus professor of economics who quipped in 1987 that “You can see the computer age everywhere but in the productivity statistics.” I loathe economists as a pack of witch doctors with linear regression models, but the effect is large enough even they noticed. Everyone was relieved when GUIs and LANs came out in the mid 90s, and these technologies did seem to be associated with an increase in productivity in some sectors of the economy. This measurable increase basically stopped when people started wiring their computers up to the internet. It’s not like MS Word does anything different now that it didn’t do in 1995. It just requires more resources to run.

For brick and mortar retail one can understand how productivity increased in the 90s. It’s a hell of a lot easier using bar codes and a database back end to manage your inventory than whatever ad hoc filing cabinet systems people were using before. With inventory control you can optimize your supply chain and get further efficiencies. Buying it all from China also helped the firms involved in doing this (it didn’t do any good for the country’s manufacturing capabilities of course, but that’s out of scope for economists). This process was happening in the 80s, but computers were still running things like DOS with VAX and AS/400 backends, all of which required the ministrations of a large caste of IT professionals. Hooking everything up to a LAN with GUI front ends helped lower the IT head count, so the IT guys could go off and invent new businesses involving wasting time on the internet. Later you got some productivity growth from selling stuff online instead of in shops (which are a large cost center). BTW, this is my interpretation of why the Solow paradox came back in the 00s: the one-time gains from retail logistics and selling online ran their course. It’s also the most obvious interpretation.

You can see here that the “paradox” (computers aren’t helpful) is back

Earlier figures on productivity growth, not shown on the current-year BLS website

Just to tangentially remind people: this is the promise of computing and wistful fantasies like “AI” -you want to increase the productivity of a worker to the point where you can make do with fewer workers, outputting the same amount of product for cheaper. If you have a technology that doubles a worker’s productivity, but requires another worker to minister to the technology, you haven’t increased your company’s productivity: you’ve decreased it, because you have the same output per worker plus the cost of the technology itself. If you have a technology which marginally increases a worker’s productivity but still requires another worker to minister to the technology, you have made productivity significantly lower. It is entirely possible that you might lower a worker’s productivity with a technology outright, as we saw with the British attempt to cut submarine design costs using CAD instead of t-squares, pieces of graph paper and styrofoam models.
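
To put toy numbers on that arithmetic (a minimal sketch; the headcounts and rates are invented for illustration):

```python
# Output per worker when a technology needs dedicated minders.
# All numbers are invented for illustration.
def output_per_worker(units_per_operator, operators, minders):
    return units_per_operator * operators / (operators + minders)

print(output_per_worker(1.0, 10, 0))   # baseline: 1.0 units/worker
print(output_per_worker(2.0, 10, 10))  # output doubled, one minder per operator: 1.0
print(output_per_worker(1.2, 10, 10))  # marginal gain, same minders: 0.6
```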

The economists, naturally, are lowering their own productivity by arguing about this; some of them claim it’s not real and that the productivity increases show up at some nebulous later date in an unspecified way that apparently can’t be measured. Some of them simply flip over the tables and insist the computers are good and we should find a new way to measure productivity that involves fucking around on a computer. This despite the abundant evidence that productivity is slowing down or declining despite our pervasive computard technology. They fiddle around with linear regression models of varying degrees of sophistication. They argue on the internet. What they don’t do is look for situations where the data show differences, in an attempt to understand the problem well enough to provide guidance or solutions. This is a microcosm of everything else: rather than solving a problem, they’re looking busy by furiously typing on their computard. Economists in the pre-computard era were more than capable of this sort of thing: Burton Klein made a good stab at looking at productivity improvements using pencil and graph paper.

One of the things that introducing a new technology does do: it redistributes resources and what people do on a daily basis. I don’t know if Avon Ladies are still a thing, but now there are Instagram whores shilling things. Database vendors and DBAs get money instead of filing cabinet manufacturers and filing clerks. Instead of filling out paper forms, people fill out computer forms. For another example, computard made decimalization a thing, and opened up market making to people all over the world, instead of a couple of knuckle dragging former football players in Chicongo and NYC who went to the same couple of high schools. Any individual who buys stonks now (and a lot more people do) will get a better price. Lots more guys like me get paid. Still, the total productivity has gone down. Is it better to pay a couple of dumb incumbents more money, or more Ph.D. types less money, and spend the difference on computers and stratum 0 NTP servers instead of coke and hookers?

Robotic automation may remove jobs from blue collar workers and assign more jobs to white collar workers and the pyramid-scheme institutions which certify white collar workers. It would be hilarious if we automated all the manufacturing jobs with robotics and it lowered productivity: that actually seems to be the trend. The long term result of this is that lower IQ people have nothing remunerative to do, and higher IQ people in these jobs don’t reproduce, because they educated themselves out of their fertility windows. That’s another issue nobody in economics wants to think about, but an Amish farmer would probably notice.

Mind you, I think robotics is worth investing R&D dollars in. All these “AI” goons fooling around with LLMs or larpy autonomous vehicle nonsense should be working on workaday stuff like depth estimation, affordance discovery and scene understanding, or other open problems in robotics. It’s the mindless application of current year information technologies in areas where they are not suited or not helpful at all that I find disagreeable. We add computers to things not because it makes things better, but as a sort of religious ritual to propitiate technological gods. The gods are not pleased with our sacrifices. We do them anyway, like the cargo cult guy with coconut earphones trying a different variety of coconut in hopes of getting a different answer.

The persistent presence of the Solow “paradox” ought to give us pause over how we develop and innovate new technologies. If I visit a company claiming to innovate things, is there a computer on everyone’s desk? Does there need to be a computer there? What are people doing at their computers? Is it mission oriented or are they just fucking around with a computer? I suspect banning computers in R&D facilities, except where absolutely necessary, would pay dividends. Banish them to special compute rooms, and limit employee time there. Someone should try it; there’s nothing to lose: all R&D is a gamble, and at least you won’t waste time fiddling with computers.

76 Responses


  1. Ken Grimes said, on November 21, 2023 at 4:25 pm

    Interesting article, I wasn’t aware of the Solow paradox.

    I’m still skeptical though about the negative effect of computers/CAD, based on my work in semiconductor design. It’s now impossible to design a digital circuit without CAD software to place and route the millions of transistors involved. Analog circuit design without CAD would take ages, due to finding bugs only after the silicon comes back. Of course some people overdo it with the SPICE simulations, and some effects can really only be found in silicon, but overall I don’t think that nowadays one can design a competitive circuit with just pen and paper.

    One of the papers you quoted says in the abstract “new technologies do have a positive impact on the productivity of the sectors of adoption. The propagation of this effect to the whole regional economy, however, is mitigated by sectoral employment reallocation effects towards less productive sectors.” So maybe CAD does help in, for instance, semiconductor design; it’s the second-order effects in the economy that lead to the Solow paradox.

    • asciilifeform said, on November 21, 2023 at 5:05 pm

      > It’s now impossible to design a digital circuit without CAD software to place and route the million transistors

      This is not actually true: e.g. the 1801BM2 ( see https://archive.is/pEMWM ) was AFAIK hand-drawn. And worked.

      There’s also the unquestioned assumption here, that one actually needs “millions of transistors”, 3nm, etc. Reactors, weapons systems, avionics, etc. (and in particular, in the ex-Soviet world) are still running on PDP compatibles — baked with hand-drawn photo masks.

    • Scott Locklin said, on November 21, 2023 at 5:30 pm

      In the early days of CAD PCB design it was claimed it emphatically didn’t improve things, but people did it anyway. I believe it is more productive now (I’ve used them back around 2002; seemed fine), and like many computard things, enables us to do stunts we couldn’t do before. I guess like your overdone SPICE simulations (something I also learned in school and never actually needed to use) you have to ask whether or not you should actually be doing these things. People naturally sit at a desk with a computer on it and do computer things; whether or not they’re useful things is a different question.

      I bet you could still make something like a RISC-V chip using people and t-squares if you had to, and the head count wouldn’t be that different from people with laptops. Within a factor of 10. Might even be lower; who knows.

      • asciilifeform said, on November 22, 2023 at 10:59 pm

        > people and t-squares… …the head count wouldn’t be that different from people with laptops

        If you can actually get hold of the ~dozen people who remember how to do it, who are not yet in the ground.

        If the process has to be rediscovered from first principles (a la Asimov’s “Feeling of Power”) expect it to take… longer.

        • Robert said, on November 23, 2023 at 8:54 am

          would pay good money for an online class that teaches pre-computard fundamentals for everything from the slide rule to semiconductor design, for autist gen-alpha

          • Scott Locklin said, on November 23, 2023 at 9:49 am

            You don’t have to; it’s all on youtube.

            Hire a post-doc to answer questions if you get stuck; there you go, basically free collitch.

            • Robert said, on November 23, 2023 at 9:55 pm

              Immensely helpful. Thank you Scott.

            • JMcG said, on November 24, 2023 at 4:19 pm

              There was a BBC show back thirty years ago called The Secret Life of Machines. The creator, Tim Hunkin, has put all the episodes on YT. Additionally, he has created a whole new set of programs describing his process of fabrication.
              They are very useful for someone wishing to get his hands dirty.
              He also has an essay on his website, Technology is What Makes Us Human, that touches on many of the same ideas as you do.

      • CS said, on November 26, 2023 at 4:53 pm

        I’m in semiconductor design as well, and while I agree with Ken in principle, reducing computer usage would still greatly increase productivity in the industry. The amount of MS Office your average engineer engages with these days is unfathomable. People write their specs in Word, use Visio to do their diagrams, and present cliff notes in PPT.

        I train all my junior engineers to do all their digital design work in a graph paper notebook first. Shut off the screen, put your head down, and work. Write theory of operation to yourself, draw some block diagrams, relevant FSMs, waveforms, etc. Get it all working on the paper. When it all makes sense there, then you can take the couple of hours to convert it all to MS Office for the managers. But don’t start off by doodling in Visio!

  2. ian said, on November 21, 2023 at 5:16 pm

    This is a good one!

    I keep going back to Richard Hamming when he said “The purpose of computing is insight, not numbers.” As in, using it to help your head model, not as a black box to be used mindlessly. Mind you, this was in the 60s. What would he think now?

  3. toastedposts said, on November 21, 2023 at 5:17 pm

    > Nobody trains in non-CAD drafting any more so we can’t exactly go back to that.

    I did see a machinist on youtube who won’t pay for computer drafting tools (mostly because these days they’re all fanatically 3D monstrosities like Solidworks or CATIA). Instead he’ll fire up “autocad” (by which he means: go to a sheet of paper on a drawing table and pencil things in with protractors, T-squares and triangles).

    I’ve done it a little myself. A family member gave me a bunch of his old books (dogeared, covered in grease) on the geometric dimensioning and tolerancing language that drafters used to use (but which isn’t much in evidence on the drawings he gets anymore).

    The UI of modern CAD tools sucks in my opinion. There are ways to *do* the things I want to do, but they’re buried in 10 menus and not at all obvious. CATIA, when I worked for a company that used it, lagged horribly and wanted to phone the mothership to ping some centralized part repository. Lag makes software tools psychologically painful to use.

    A family member depends on a very old piece of 2d drafting software for his CAM work. It is designed around drawing contours in 2d, which is what he needs it to do. I dread the day the computer that can run it dies, and have tried keeping sufficiently old machines on standby. It depends on a security module that has to fit into a particular type of printer port that has to live on a particular motherboard bus for the software to load. It depends on running Windows XP or earlier.

    I’ve tried my hand at cracking the software, but I’m not that good with the tools required, and as far as I can tell, whoever wrote the security module was some kind of evil Russian genius. (It does something very weird where it loads the part of the program that does the security call by overwriting blocks of its own program memory.) File that under time wasted flailing at computers.

    • toastedposts said, on November 21, 2023 at 5:35 pm

      What might be going on is that the program is completing itself with something that lives on the security module, which I’m not going to touch because I don’t want to break it.

      Shortest path to solution here might be writing my own 2d CAD/CAM software. A project that’s been on the backburner for a while now.

      • asciilifeform said, on November 22, 2023 at 11:04 pm

        If you post the warez in question + a few sessions’ worth of signal captures from the parallel port “key”, chances are quite good that someone will hand you a working pill.

        Possibly even a photo / description of the key will suffice — many of these were in fact standard kits, used by various “golden toilet” vendors over the years, and were cracked long ago.

  4. Cleetus said, on November 21, 2023 at 5:24 pm

    I agree completely, Scott. I was a drafter for many years. The K+E drafting machines are way faster than CAD. Those are the bent-arm drafting machines that Keuffel & Esser made.

  5. Igor Bukanov said, on November 21, 2023 at 5:59 pm

    At the beginning of the eighties the Soviet KGB spent a lot of effort acquiring CAD software and computers from Western countries. A relative of a buddy of mine in Moscow, around 1985, once brought him to a closed/restricted research facility and showed him a room full of dudes working with CAD on Soviet clones of the IBM 370. People working on less restricted things were drafting on paper. So one can assume the Soviet military clearly saw a need for CAD.

    On the other hand, maybe this contributed to the downfall of the USSR. The country had started to rely on the GRU and KGB to bring in blueprints of Western computers and control systems. But then it turned out that blueprints were not enough. A story claims that Soviet manufacturers completely failed to clone the Intel 286 CPU, even after they were provided with not only blueprints but pieces of the machinery. One really needs people who have worked with things from the ground up to reproduce them. And the problem with software is that the connection with the basics is totally lost.

    • Scott Locklin said, on November 21, 2023 at 11:34 pm

      Imagine if the computard revolution was a head fake to get the Soviet Union to destroy itself with inefficiency, and they forgot to tell people that back in the US.

      • g.b. said, on November 22, 2023 at 9:10 am

        Supposedly the USSR remnants went back to mechanical typewriters after the Snowden revelations.

      • Igor Bukanov said, on November 22, 2023 at 12:57 pm

        I wish that were just a joke. There was a really strange development in the seventies in the USSR, when pretty much all development of home-grown computer systems was abandoned in favour of just copying IBM. Maybe the CIA indeed managed to sell Soviet generals on the importance of CAD in particular, and on blindly copying anything from the West in general.

        • Scott Locklin said, on November 22, 2023 at 1:28 pm

          Certainly having the fastest supercomputer has always been a great power flex. It was never clear what tangible good they did anybody. I know early computers were important for certain developments, but it doesn’t mean having a faster one gives proportionally more useful output.

          CAD sounds great in general: I have been paging through an older book (a 1990 Routledge encyclopedia on the history of technology) which talks about the impending FULLY AUTOMATED factory as if that were an inevitability. Draw picture of car, car comes out factory automatically. It’s almost like current year technological projections are entirely formed in someone’s marketing department. Marketing is commercial propaganda, intended to deceive. It is very obvious to me in current year marketing when it’s something I know about: but what if that’s always been a rare skill?

          • asciilifeform said, on November 23, 2023 at 2:53 am

            Who did the author believe would buy the goods made at the FULLY AUTOMATED (lol) factory, and with what moneys?

            The great American experiment of “we’ll all eat by paying each other to wipe one another’s arses” isn’t going well, last I knew.

          • Igor Bukanov said, on November 23, 2023 at 8:25 am

            So it was just marketing and sales at IBM being good at their jobs in the seventies. They sold mainframes to banks and CAD to manufacturers and Soviet generals. Then the bosses did not want to admit it was a mistake, and the rest is history.

            • Scott Locklin said, on November 23, 2023 at 9:59 am

              I mean, we should consider the possibility. I’ve certainly seen lots of million dollar analytics and machine learning projects which made organizations less efficient (because they needed a small team of Ph.D.s to make them work, and added little but psychological comfort to the C team).

              • Altitude Zero said, on November 28, 2023 at 9:50 pm

                CAD as doomsday machine/Berserker robot – the enemy it was designed to destroy is long vanquished, as is the society that built it, but it lives on, creating planetary mayhem…

  6. Sam Arak said, on November 21, 2023 at 6:48 pm

    As for the economists giving it a name, kudos. In business, we just call it bullshit. Maybe Occam’s razor if you want to get technical: “the simplest solution is often the most efficient” or even “the simplest tool is often the best”. Technologies complicate things. Ivy league MBA dipshits complicate things. Combining the two results in a clusterfuck.

    IMO, Windows, iOS, and Android are all nearly unusable from stock. Bloated up and saturated with nonsense notifications, startup programs and games. They all coerce you to upload everything to their cloudware and prompt you to buy more storage. It’s not terribly surprising that CAD has taken on bloatware-qualities. Even without the bloat, computers are intrinsically worse tools for the mind.

    For simple projects, I know firsthand that MS Paint is superior to civil engineering CAD. Scale up pixels to feet and draw away, no chance of crashing, never have to learn any new tools or systems, fires up immediately, no distractions, etc. It is still obvious that technology has improved productivity and efficiency in some sectors of the economy. Logistics and online shopping come to mind, but realistically Amazon is poised to become shitware.

    Wikipedia seems to ignore this on their Solow page, but if you accounted for cultural decline and motivation loss in the west, I think productivity gains would be much greater. Imagine an America that was NOT fat, drug addicted, media-obsessed and atheist (we can only dream)

  7. Александр said, on November 21, 2023 at 6:48 pm

    From my POV, active use of computers causes two effects:
    1. With computers, we are able to design much more complex systems than we could manually – and we can’t actually understand them completely.
    2. Most computer power is spent on completely useless and even harmful things.
    ———————
    Maybe the real reason for such bad results from computers is the idea of selling and copyrighting software without any responsibility/liability.

    If the producer of CAD software were liable for a failed design of a plane or car – caused by bugs in the CAD code – such complex software would never be built.

    We would still have computers, but with much smaller, less buggy and much simpler software.

    • polar vortex said, on November 21, 2023 at 9:12 pm

      There are electric toothbrushes with an Alexa (Amazon.com) feature. Now there’s a fine example of useless complexity.

  8. Edmund said, on November 21, 2023 at 9:57 pm

    I don’t think the Solow Paradox is much of a mystery. Google has a price/book ratio of 6.3. Maybe it’s a steep price to pay but we don’t call this a paradox; we simply recognize that the market values their intangible assets much more highly than the accountants do. Similarly, measured productivity growth relies on a set of accounting standards that were designed for an industrial economy and perform less well in an economy driven at the margin by the services sector.

    GDP only captures the value-add of economic activity mediated by monetary exchange, unless otherwise imputed. Ordering DoorDash is more GDP intensive than going to McDonalds, which is more GDP intensive than buying ground beef and making hamburgers. Similarly, if Jack pays Jill to watch his child while Jill pays Jack to mow her lawn, there’s some true productivity gain due to specialization. However, it is probably not nearly as large as the boost to GDP caused by shifting labor from the domestic to the market economy. To the financial services case, paying a hedge fund 2+20 is more “value-add” in the national accounts than paying Vanguard 10bps. It makes sense if you believe managers are paid proportionally to their productivity, but that’s empirically questionable.
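
    To put rough, invented numbers on the fee comparison (a sketch, not anyone’s actual fee schedule):

    ```python
    # Fees booked as financial-sector "value-add" on a $1B portfolio
    # returning 10% gross. All numbers invented for illustration.
    aum, gross_return = 1_000_000_000, 0.10

    hedge_fund = 0.02 * aum + 0.20 * (gross_return * aum)  # 2-and-20
    index_fund = 0.0010 * aum                              # 10 bps

    print(f"hedge fund: ${hedge_fund:,.0f}")  # $40,000,000
    print(f"index fund: ${index_fund:,.0f}")  # $1,000,000
    ```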

    I derive immense utility from the internet. For free, I can read your blog, countless scientific papers, and many books. I can also communicate with interesting people from all around the world. There is no reason to expect any of this to show up directly in measured productivity, unless people go forth and use that knowledge to produce things they can sell for more money than they would have otherwise.

    Few people did more than Solow to get economic theory on the wrong track IMO. A topic for a lengthier post, but just contrast the world of his eponymous growth model with that of Adam Smith’s pin factory. The founding father of economics would find it unrecognizable.

    • Scott Locklin said, on November 21, 2023 at 11:33 pm

      Solow was definitely a tosser, being a colleague of ultra-tosser Samuelson, but he wasn’t the only one to notice this; it’s easily observable on the micro and macro levels. It’s observable if you’ve ever done anything computer optional, like draw a sketch. You should still be able to measure productivity increases with technological investment. It doesn’t show up anywhere -it should if computers always or mostly increased efficiencies!

      • Chris said, on November 23, 2023 at 9:04 pm

        Your recurring disdain of Samuelson is amusing, but what exactly is your beef with him? You’ll have to excuse my ignorance, the entirety of my exposure to econ consists of three undergrad classes (thank god)…I thought he promoted free trade or something, but googling only shows he’s skeptical of unrestrained free trade (and he was an opponent of free-market Friedman)….I’d love to know why to hate him! Thanks

        • Scott Locklin said, on November 24, 2023 at 1:07 pm

          Samuelson: too stupid to understand optimizing geometric returns (a long beef with the Kelly Criterion culminated in a paper written with one syllable words which is gloriously retarded), yet he thought he could derive a sort of thermodynamics of economic systems: https://en.wikipedia.org/wiki/Foundations_of_Economic_Analysis -one of the most cement headed genuflections at useless abstraction I’ve ever seen, and a baleful influence on generations of economists. His other book is less retarded, but still retarded. He was wrong about Ricardo. He was wrong about the Soviet Union (he thought they would inevitably crush the US economically because they used ideas in his book, basically: said it every year until the Soviet Union fell). He was wrong about the stock market predicting recessions. He was wrong about monetary policy. He was wrong about Keynesianism. He was trivially wrong about “revealed preference.” He was wrong about Stolper-Samuelson (trade liberalization was supposed to float all boats; instead it drives economic inequality). The Balassa–Samuelson “effect” is a fiction which doesn’t fit the data (a dumb just-so story about international trade and exchange rates). There is very little he ever said or did which wasn’t complete bullshit. His students include such luminaries as Merton (who almost blew up the world), Modigliani (responsible for the ridiculous levels of debt US firms run at), Krugman, and Stiglitz (who helped trash European economies in 2008). His reputation was made by the thermodynamics book, and generations of economists have been corrupted by it, turning a reasonable subject into a bullshit one with more differential equations for impressing the rubes. I can’t think of a single novel thing he said or did which was anywhere in the ballpark of correctness or truth. If he had been run over by a bus while in grad school the world would have been better off.

          https://www.independent.org/publications/tir/article.asp?id=820

          • Chris said, on November 24, 2023 at 10:25 pm

            Thank you, that’s very informative and hilarious (not as hilarious as his Nobel)…If I may, what do you mean by Merton “almost blowing up the world”?

            • Another Idiomatic J Expression said, on November 25, 2023 at 12:37 am

              Apologies for jumping in. On a study break. Used to work at a three-letter Swiss Bank.

              Long-Term Capital Management (“LTCM”), of 1998 infamy, was founded by Merton et al. on the linked premises that (1) spreads across similar, yet not identical, issues of sovereign debt would converge to zero over time, and (2) this convergent sequence of spreads would be at best monotone or at worst Cauchy.

              To effect such an arbitrage (long the cheap, short the dear), enormous amounts of leverage (“gearing” for the European readers) were borrowed from large banks against LTCM’s tiny equity base to juice returns to LTCM’s investors.

              Perversely, the very introduction of so much capital into the sovereign convergence trade (again: long cheap, short dear) did push the spreads closer to zero, (a) making the founders look like geniuses, and (b) justifying even greater leverage to maintain constant equity returns.

              Technical note: narrower spreads generate lower absolute returns (in $’s) per dollar of capital at risk. Thus, you need to risk more capital to generate the same absolute amount of money.
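
              A toy version of that note, with invented numbers:

              ```python
              # Toy convergence-trade arithmetic; every number is invented.
              # Dollar P&L of a spread trade is roughly spread_change * notional.
              target_pnl = 10_000_000          # absolute dollars the fund wants to earn

              for spread_bps in (50, 25, 10):  # spread expected to converge to zero
                  notional = target_pnl / (spread_bps / 10_000)
                  print(f"{spread_bps} bps of spread -> ${notional:,.0f} notional at risk")
              # As the spread narrows, the notional (hence leverage) must grow to
              # keep the same dollar P&L: 50 bps -> $2B, 25 bps -> $4B, 10 bps -> $10B.
              ```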

              Which is what LTCM did.

              The intelligent behavior at that point would have been to reduce the size of trade and return some, even most, of LTCM’s equity to investors. Instead, LTCM went in the other direction.

              When, in the fullness of time, both premises (1) and (2) proved to be false and the markets for unwinding their sovereign bond trades became predictably illiquid, the Federal Reserve stepped in with a giant (for the time) bailout to protect LTCM’s equity holders and creditors (at the expense of everyone else).

              There were arguments about what would have happened in global credit markets if the Fed had not intervened when it did. Outcome wasn’t observable so I don’t have an opinion.

              What was done (the bailout), and the sequence of bailouts it started, was catastrophe enough. “Heads we win, tails everyone gets bailed out and keeps bonuses” corrupted, and continues to corrupt, any useful function of capital markets, and was a strong factor in the 2008 sequel.

              One man’s opinion. I have others.

              • JMcG said, on November 26, 2023 at 2:28 am

                Thank you.

  9. Rickey said, on November 22, 2023 at 2:13 am

    Besides manufacturing and design, another area where computers killed productivity is in administration and communications. Instead of writing a cogent paragraph or an executive summary describing a situation, it now has to be in several PowerPoint slides with bullet points, graphs and lots of purdy pictures. What should take ten minutes now takes two hours. If you have to present something to the military, you had better have the correct slide template, proper font, color scheme, column alignment, etc. The term PowerPoint Ranger is firmly established in the military lexicon.

    Verbally telling someone about a meeting or event or sending them an email is not good enough. The onus is on you to send it to their calendar software in appointment format.

    Don’t get me started about persons who “Reply to All” in emails to have a conversation or shotgun emails to an oversized distribution list because they are too lazy to determine who actually needs it. I spend way too much time filtering through crap emails that absolutely do not apply to me, but my department has a policy where everyone has to be carbon copied on emails for “situational awareness”. Computers have enabled morons and bored persons to generate clutter and noise to new levels.

    • g.b. said, on November 22, 2023 at 9:17 am

      I’m amazed you can still use email. Between the time-wasting humans you describe, there’s also the abuse of email by any and all websites and other software. My theory is that that’s one of the reasons humans took to Slack so readily, and now things send notifications to Slack.

      The only email defense I know of is to make a filter that takes all emails that don’t have you explicitly in the To: field and sends them to a “Maybe later” folder.
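
      A minimal sketch of such a filter over IMAP (the server, address, password, and folder name are all hypothetical, and the folder must already exist):

      ```python
      import imaplib

      ME = "me@example.com"                      # hypothetical address
      M = imaplib.IMAP4_SSL("imap.example.com")  # hypothetical server
      M.login(ME, "app-password")
      M.select("INBOX")

      # Messages that don't name me explicitly in To: get shunted aside.
      typ, data = M.search(None, "NOT", "TO", f'"{ME}"')
      for num in data[0].split():
          M.copy(num, '"Maybe later"')
          M.store(num, "+FLAGS", "\\Deleted")
      M.expunge()
      M.logout()
      ```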

    • Scott Locklin said, on November 22, 2023 at 9:47 am

      I used to give my presentations using transparencies and magic markers, because writing and sketching shit is a lot faster than powerpoint equation editors and IDL haikus. Towards the end of that part of my career one of the manager-bureaucrats used to get buttmad at me, claiming that using Powerpoint was “more respectful.” I’m pretty sure old timey overhead projectors are no longer allowed in my old institution, and now everyone spends 20h composing powerpoint instead of 20m making transparencies.

  10. pindash91 said, on November 22, 2023 at 3:01 am

    Must mention the first law of Systemantics, a fun read even if not quite right:

    anergy is conserved

    That’s the explanation for the Solow Paradox.

    Anergy – Any state or condition of the Universe, or any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures.

  11. Chris said, on November 22, 2023 at 3:25 am

    If you believe in “shadow inflation”, as elucidated by John Williams, then the Solow Paradox is much more severe. Essentially, since the 80s the government has changed the measure of inflation to greatly underestimate the true value, by using a moving basket of goods instead of a fixed one. This means productivity growth is actually much lower than the official statistics say, since real GDP indexed to whatever base year is lower. So it’s not that we have reduced productivity growth since computers; it’s that we have an outright productivity decline.
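
    A minimal sketch of that adjustment, with invented rates:

    ```python
    # If true inflation runs above the official measure, measured real
    # growth overstates the true figure by the gap. Rates are invented.
    nominal_growth = 0.04   # 4%/yr nominal output growth
    official_infl = 0.02    # official deflator
    shadow_gap = 0.01       # hypothesized understatement of inflation

    official_real = nominal_growth - official_infl                 # 2%/yr
    shadow_real = nominal_growth - (official_infl + shadow_gap)    # 1%/yr
    print(f"official: {official_real:.1%}, shadow-adjusted: {shadow_real:.1%}")
    ```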

  12. SalammboSalami said, on November 22, 2023 at 5:26 am

    Do you think a 3D printer is a good investment for someone who wants to get involved in manufacturing things but lives in a college dorm, or is a mini-lathe more worthwhile? Looking at the Prusa i3 MK3S+

    • Scott Locklin said, on November 22, 2023 at 9:33 am

      You’ll give yourself lung cancer with the 3d printer. The mini lathe will just create sharp metal chips on the floor. Why don’t you join a makerspace or wait until you can afford a house with a garage?

      • SalammboSalami said, on November 22, 2023 at 3:27 pm

        >You’ll give yourself lung cancer with the 3d printer
        You can purchase one with an enclosure or just build the enclosure yourself. Besides, I have access to an attic that I was planning to print in.
        >Why don’t you join a makerspace
        I already did, but I want access to my own printer without any of the time and other limitations that come with it
        >wait until you can afford a house with a garage
        This is far away in the future

      • toastedposts said, on November 22, 2023 at 7:02 pm

        I’ve used 3d printers before to make things like weird grommets, circuitboard standoffs, front panel plastic, knurled knobs for threads and things. Test tube stands.

        I’ve been experimenting with RC aircraft frame stuff printed out of plastic.

        They’re versatile for certain applications, but plastic sucks as a material for other things. You won’t get precision out of plastic parts. You can tap holes in plastic to hold screws (even fairly tiny screws). Might need a drill press to clean up holes and get them to proper sizes.

        I use the Ender3v2 – it’s inexpensive, and does what any of them will do.

        Actual machine tools will cost more. Mini lathes look like they’re ~$500ish or more. You’ll also need tools and indicators and things. You’ll need various kinds of lubricants and tapping fluid. But the advantage there is you can work in metal, and you can make actual parts that take loads out of metal.

        You need PPE for machining too – goggles or glasses always. But also wear a respirator when grinding.

        I used a 3d printer in my apartment during grad school, but haven’t thought about how I would keep metal shavings contained from a mini-lathe, running it set up on a kitchen table or something. Build a cloth tent around it maybe?

        • Scott Locklin said, on November 22, 2023 at 9:04 pm

          I had a Unimat 3 in my efficiency apartment in Berkeley. It’s actually fine for steel if you sweep up and then double sweep up with a magnet. The magnet doesn’t work on brass though, and brass chips are sharp.

          I had thought of getting a solid printer for things like telescope rings or afocal adapters for night vision. Too easy to make that from a piece of nylon tho.

          Anyone who wears a respirator when grinding 1/4″ steel needs to get his testosterone checked.

          • toastedposts said, on November 22, 2023 at 9:35 pm

            I suppose I’m thinking more in terms of my surface grinder. It produces a lot of dust. A dremel on a toolholder is a different story.

      • toastedposts said, on November 22, 2023 at 7:07 pm

        I hadn’t thought too much about the fumes.

        I did almost gas myself twice a year or two ago doing inadvisable things with organic compounds. Once while hot-wire cutting styrofoam (wear a respirator!) – almost went into anaphylaxis after an unprotected lungful of that.

        Once when attempting to take 3d printer plastic and “cast” it into a mold. Don’t do that. I ruined some cookware and all the grass died where I tried to dispose of the mess.

        A respirator is a nice bit of PPE to have.

  13. Wm Arthurs said, on November 22, 2023 at 8:08 am

    Microsoft Word hides the cost of indecision from the indecisive, tinkering writer. “Hides” in the Emperor’s New Clothes sense, that is.

  14. Scott Locklin said, on November 22, 2023 at 12:45 pm

    Fun example courtesy Eugyppius about traffic light camera operators:
    https://www.eugyppius.com/p/once-more-on-the-managerial-menace

  15. WMBriggs said, on November 22, 2023 at 1:16 pm

    Just as a tiny example: I saw some lunatic company that invented some kind of electronic trigger lock for your pistol which must be controlled by your “smart” phone, both of which must be kept charged and updated. And you have to trust the company to keep your information secure.

    Utterly useless, and it exists for the sole reason that it can be put on a computer.

    Which will make this excrescence attractive to “regulators” somewhere.

    • asciilifeform said, on November 23, 2023 at 2:56 am

      These IIRC are known as “regulatory startups” — rent-seekers in search of a captive audience — and the given example is unfortunately merely the tip of a vast shit-iceberg. (As I understand, the US “AI industry” presently consists almost entirely of these, sometimes thinly disguised as actual businesses, sometimes not.)

  16. asciilifeform said, on November 22, 2023 at 5:23 pm

    “Job-creating technologies” (rubbish click-and-drag CAD hell, MS-Win, Java, Ruby, etc.; see e.g. my ancient piece re subj: http://www.loper-os.org/?p=388) proliferate via natural selection.

    The elementary fact is that the people who are hypothetically able to create e.g. better CAD, have no incentive to do so — as they will not be the ones to capture the value. A modern-day worker who proposes a serious process improvement is inescapably “buying his cross.”

    On top of this, “productivity” is not an automatic win in itself. Productivity at what? What would you or I gain if e.g. massively-improved CAD allowed the next iDildo to be designed 20% faster? No one will have a shorter work day or larger house (land isn’t a manufactured good); the most that will happen is that Jobs^H^H^HCook will upgrade to a larger yacht.

    And if you’d like “nightmare fuel”, picture if e.g. the F-35 people were suddenly expected — on account of “actually more productive” tooling — to produce 1000 of them per month, rather than fiddling with a pile of C++ liquishit for imaginary “improvements” to the firmware. Very soon the grocery shelves will look precisely like “stagnation era” Soviet shelves: Mexican tomato pickers will be busy instead refining titanium, hand-polishing the titanium toilets on the planes, etc.

    • Ian said, on November 22, 2023 at 9:33 pm

      Hey, I like your blog. Are you still working with Ada? I’m curious about your experience.

      • asciilifeform said, on November 22, 2023 at 10:44 pm

        Virtually all I’ve been doing since 2021 is commercial work, none of which was in Ada.
        If you’re thinking of the FFA series ( http://www.loper-os.org/?p=1913 ), I had the code for Ch. 22-24 written, but the proofs & discussion are to this day not done, and I’ve no idea when (or whether I’ll live long enough) to even attempt to finish these.

        • Ian said, on November 22, 2023 at 11:13 pm

          Yes. I followed these. Do you still recommend it? (I work in embedded controls so I’ve always been interested. Maybe I should go bug you over there instead)

          • asciilifeform said, on November 23, 2023 at 2:47 am

            I most certainly do recommend Ada — but strictly the language subset and style featured in my articles; and concretely excluding the floridly insane pile of OOP bloat and safety overrides which most published Ada programs seem to make generous use of.

            • ian said, on November 23, 2023 at 3:57 am

              Got it. Good to know. Thanks

            • Jujup said, on November 26, 2023 at 10:15 pm

              That second chart of BLS productivity stats is interesting, showing numerically what many of us feel, ie that the ’90s-00’s productivity increases have petered out. Of course, that is using the incredibly flawed GDP stats, which Edmund and Chris already skewered above.

              My explanation is that like all tech revolutions of the past, it comes in two phases: first, you redo existing systems with the new tech, then you create new systems that are native to the new tech.

              The classic example would be how the internet was used to take old-line credit cards from a technology used in larger urban stores to one that any online site or small kiosk could accept. The new system would be crypto-currencies replacing the fiat dollar itself, which are currently still working out the kinks, to put it charitably.

              Across the board, this is the problem I see with the introduction of computing and the internet: way too much emphasis on driving efficiencies in pre-existing systems, way too little time spent creating new systems that really take advantage of the new tech, eg rather than come up with some new online publishing format, we just slap the hoary book online as the “ebook.”

              This is understandable, as that discovery process takes time, but there is still way too little of it going on. You could argue the blog post with comments was an attempt at innovating on that online publishing format, but that has ossified too.

              It is tempting to say people have just gotten dumber and less creative, but history shows such lulls in past tech revolutions too. I expect new systems based on computing and the internet will push productivity up again eventually.

              • Jujup said, on November 26, 2023 at 10:18 pm

                I have no idea why my comment was placed here, when I clearly clicked reply at the bottom of the page. Super shitty WordPress strikes again.

              • Scott Locklin said, on November 27, 2023 at 11:54 am

                > I expect new systems based on computing and the internet will push productivity up again eventually.

                Yeah, except people have been saying that since the 1950s and it apparently never happened.

                • Jujup said, on November 27, 2023 at 6:51 pm

                  For one, your own embedded data shows a rise in the ’90s and ’00s, cresting higher than the postwar average that every dumb boomer is always going on about, as though that competition-free period when Europe and Asia were still digging themselves out of the postwar rubble was in any way repeatable.

                  Second, almost nobody had computers in the ’50s, 30 years before the PC revolution. That would be like bitching that the internal combustion engine was no big whoop at the turn of the century, when there were only a handful of ICE car brands (https://en.m.wikipedia.org/wiki/Timeline_of_motor_vehicle_brands).

                  Anyway, I don’t care about those bogus productivity stats, which Edmund already fisked above, but there has been a clear disconnect between the possibility of this new tech and its reality.

                  The biggest problem I see is the mindset: just as the mostly agrarian population more than a century ago was not ready for the abundance of goods available through mechanization (nor the pollution), most people today are not ready for the information age, or its attendant information pollution.

                  I see it all the time: we live in a time dominated by theory and ideas, yet most don’t even know how the most basic ideas undergirding our society work, and I include most “experts” in that ignorant class. Well, that ability to understand how our current systems work and how they can be remade with this new tech is crucial if you’re going to build new systems, yet nobody out there seems capable.

                  We’re still waiting for the Henry Ford of the internet, someone who really understands it and rebuilds society from first principles using it. Griping that it hasn’t delivered yet is premature until then.

                  • Scott Locklin said, on November 28, 2023 at 10:38 am

                    The 2001-2007 productivity bounce could be attributed to any number of things and reverted back to half historical levels; if computers actually added anything it shouldn’t have stopped! And you’d see it in other countries, which you don’t.

                    Your faith in “progress” as a result of computard use in defiance of all data is sort of charming in a way. Like people who believe the earth is flat because they think it says so in the bible. Weren’t you predicting Tesla’s robot would be some kind of revolution a year or two ago? How’s that working out for you?

                    • Jujup said, on November 28, 2023 at 6:48 pm

                      > The 2001-2007 productivity bounce could be attributed to any number of things

                      I actually agree with this (note my skepticism of such stats above), but I thought you yourself attributed it to “you got some productivity growth from selling stuff online instead of in shops” above. I’d have to see disaggregated numbers to decide if that was mostly just the real estate bubble or if the new tech contributed significantly. I suspect it did, but was not the full bounce.

                      > and reverted back to half historical levels; if computers actually added anything it shouldn’t have stopped!

                      Why wouldn’t it? Computers aren’t a magic bucket out of which you can fish productivity increases forever; all tech has a ceiling on how much it can improve past processes.

                      > And you’d see it in other countries, which you don’t.

                      Have you been to other countries? Most are just starting to integrate online retail or ride-hailing now, years after they were getting popular in the US. Even today, online retail is a minority of sales in the US.

                      > Your faith in “progress” as a result of computard use in defiance of all data is sort of charming in a way. Like people who believe the earth is flat because they think it says so in the bible.

                      I have never used that silly word, and your constant bashing of current tech is boring. We agree that it is currently overhyped, though that’s not because of “wasting time on the internet” but because the tech bureaucrats have used the few genuine advances of this new tech to sell hiring a bunch of coders who just fill space, because they are all too dumb to come up with new systems, ie try to actually innovate, as Satoshi tried and failed.

                      In other words, we agree on the overall problem, but you blame the tech and I blame the people.

                      I don’t need to have “faith” or read some magic book, because I actually know what the tech is capable of and can figure out what’s coming, advantages that you apparently don’t have.

                      > Weren’t you predicting Tesla’s robot would be some kind of revolution a year or two ago? How’s that working out for you?

                      You must be thinking of someone else, as I have never hyped Tesla in my life, instead buttonholing the Elon fans who preach about him to me on how he’s probably the biggest con man ever.

                      Btw, this new commenting widget is horrible, how do we go back to the old HTML form?

                    • Scott Locklin said, on November 28, 2023 at 8:07 pm

                      >Btw, this new commenting widget is horrible

                      One thing we agree on at least.

    • Anon said, on January 20, 2024 at 9:47 am

      >A modern-day worker who proposes a serious process improvement is inescapably “buying his cross.”

      Ah, this reminds me of when I was a computard greenhorn and came across Paul Graham’s article here: https://paulgraham.com/ambitious.html

      I read this shortly after it was published, when I came across his website while searching for Lisp materials. I honestly believed in this great steaming heap of bullshit for a while, until I finally realized none of the Shitticon Valley “entrepreneurs” got their fame and wealth from being good at anything except for family connections to old money and corporate propaganda.

      • Scott Locklin said, on January 20, 2024 at 2:42 pm

        I dunno, some of the things he mentioned came to pass: streaming video, apple watches that monitor heart health, lots of people using IRC derivatives instead of email (awful choice), and google search so ridiculously bad you could only continue to use it via inertia.

        Graham’s articles, of course, are mostly of benefit to Graham. His Lisp stuff is basically true in that small teams can be very productive, but misleading (big ones probably can’t, and Lisp is sort of like free coffee and snacks for nerds in that it makes work a little more fun).

  17. Wanderghost said, on November 22, 2023 at 8:47 pm

    Any opinions on (IEEE) floating point, ye engineers and simulators? All the ways to fail or output garbage (pref. quietly) seem to me like quite the collection of anti-patterns.

    • toastedposts said, on November 22, 2023 at 9:25 pm

      I dunno – numerical stability is more important. Having an ability to set something to infinity or not-a-number is handy for indicating (for example) when a surface is present or absent in a z-buffer, or when some calculation *has* failed for a particular case.

      The absence of a NaN setting won’t make an unstable algorithm stable.
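
      A small illustration of both points, quiet failure and NaN as a deliberate sentinel (a sketch; the z-buffer is a toy):

      ```python
      import math

      # IEEE 754 fails quietly: invalid operations yield NaN, which then
      # propagates through later arithmetic instead of raising an error.
      x = float("inf") - float("inf")   # nan
      y = 0.0 * float("inf")            # nan
      print(x + 1.0, y * 2.0)           # nan nan -- garbage, silently

      # NaN compares unequal to everything, including itself.
      print(x == x, math.isnan(x))      # False True

      # Used deliberately, NaN is a handy sentinel, e.g. "no surface
      # here" in a z-buffer cell:
      zbuf = [0.5, float("nan"), 2.0]
      visible = [z for z in zbuf if not math.isnan(z)]
      print(visible)                    # [0.5, 2.0]
      ```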

  18. Dawit Habtemariam said, on November 23, 2023 at 4:37 pm

    Do you think computers have reduced the quality of journalism and writing in general?

    • Scott Locklin said, on November 24, 2023 at 12:23 pm

      Google wiped out advertising for newspapers, so, yes.

    • William O. B'Livion said, on November 24, 2023 at 4:36 pm

      I attended a top ranked journalism school in the late 80s, well before the internet was a big thing.

      No, the internet didn’t reduce the quality of journalism. The Gramscian long march did.

      It’s part and parcel of the rest of education.

  19. Filip Finodeyev said, on November 27, 2023 at 8:52 pm

    CAD is a very complex tool, so it not only requires many new skills, it requires many of the old drafting skills too. After doing it professionally for 20 years, I think building a good CAD model of a complex system like a car takes veteran professionals architecting the model and directing junior CAD specialists/engineers. Unfortunately corporate America doesn’t like CAD jockeys much and wants the engineers to do it. So it’s a tool that is used badly, just like CNC mills are very useful in the hands of an expert organization but much less so for a hobbyist.

    Complex tools only increase productivity if used properly, which is complex….

  20. senexada said, on November 27, 2023 at 11:00 pm

    My fave example:

    Boeing 747: 75k paper drawings and plywood models, twice as large as any prev passenger plane, factory was being built at same time as the plane, carried paying passengers <5 years after earliest contemplation, <4 years after first contract signed.

    Boeing 787: CAD era, flies fewer passengers at same speed, does get better mileage. Carried paying passengers 8+ years after earliest conception, 7+ years after first contract signed.

    • Scott Locklin said, on November 28, 2023 at 10:36 am

      Yep, I used that example in my original article on the topic for Takimag.

  21. Derrick said, on November 29, 2023 at 5:29 am

    I say this as someone who’s completely ignorant of how any of this works but who is skeptical by nature: does this paradox relate to the burgeoning “AI Revolution?” Image generation seems impressive, though most of it is quite low quality and cranked out in large amounts, and the ability of an LLM to create something passable as human text is more impressive than I thought it would be. How would these technologies absorb any resulting productivity gains? They certainly seem like they could get rid of a lot of low-level customer support and commercial artist jobs.

    • Scott Locklin said, on November 29, 2023 at 1:43 pm

      I have a commercial art pal who uses these tools. His work output seems about the same, except instead of looking like his unique style it has the stupid lens-flare look of AI generated art. Same with some guy I know who claims it’s the best thing ever because it only took him a half hour to set up his website (Wix takes like 0 minutes).

  22. Igor Bukanov said, on January 16, 2024 at 2:27 pm

    One more of those failed experiments with computers is self-checkout in shops: https://www.bbc.com/worklife/article/20240111-it-hasnt-delivered-the-spectacular-failure-of-self-checkout-technology

    I thought shop owners would not be blind to the cost of supporting the tech. But they were.

    • Scott Locklin said, on January 16, 2024 at 8:30 pm

      Great example; it’s funny to me they continue installing these disasters. I’d always rather wait in line for a human being.

