Locklin on science

Ruins of forgotten empires: APL languages

Posted in Design, J, Lush by Scott Locklin on July 28, 2013

One of the problems with modern computer technology: programmers don’t learn from the great masters. There is such a thing as a Beethoven or Mozart of software design. Modern programmers seem more familiar with Lady Gaga. It’s not just a matter of taste and an appreciation for genius. It’s a matter of forgetting important things.

talk to the hand that made APL

There is a reason I use “old” languages like J or Lush. It’s not a retro affectation; I save that for my suits. These languages are designed better than modern ones. There is some survivor bias here; nobody slings PL/1 or Cobol willingly, but modern language and package designers don’t seem to learn much from the masters. Modern code monkeys don’t even recognize mastery; mastery is measured in dollars or number of users, which is a poor substitute for distinguishing between what is good and what is dumb.  Lady Gaga made more money than Beethoven, but, like, so what?

Comparing, say, Kx Systems’ Q/KDB (80s technology which still sells for upwards of $100k a CPU, and is worth every penny) to Hive or Redis is an exercise in high comedy. Q does what Hive does. It does what Redis does. It does both, plus several other impressive things modern “big data” types haven’t thought of yet, and it does them better, using only a few pages of tight C code and a few more pages of tight K code.

This man’s software is superior to yours

APL languages were developed a long time ago, when memory was tiny compared to the modern day and disks were much slower. They use memory wisely. Arrays are the basic data type, and most APL language primitives are designed to deal with arrays. Unlike the situation in many languages, APL arrays are just a tiny header specifying their rank and shape, plus a big pool of memory. Figuring out what to do with the array happens when the verb/function reads the first couple of bytes of the header. No mess, no fuss, and no mucking about with pointless loops. Code can be confusing if you don’t drink the APL Kool-Aid, but the concept of rank makes it very reusable. It also relegates idiotic looping constructs to the wastebin of history. How many more for() loops do you want to write in your lifetime? I, personally, would prefer to never write another one. Apply() is the right way for grown-assed men do things. Bonus: if you can write an apply(), you can often parallelize things. With for(), you have to make too many assumptions.

Roger Hui, also constructed of awesomeness

One of the great tricks of the APL languages: using mmap instead of scanf. Imagine you have some big chunk of data. The dreary way most languages do things, you vacuum the data in with scanf, grab what is useful, and, if you’re smart, throw away the useless bits.
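As an aside, the array-header-plus-flat-memory model described above is exactly what survived into NumPy. A minimal sketch of the loop-free style (my own illustration, not the author’s code):

```python
import numpy as np

# An APL-style array: a small header (rank, shape, type) over one flat pool of memory.
prices = np.arange(12, dtype=np.float64).reshape(3, 4)

# Loop-free: one verb applied along an axis; no for() in sight.
row_means = prices.mean(axis=1)   # array([1.5, 5.5, 9.5])

# Rank makes the code reusable: the same verb works after any reshape,
# because only the header changed, not the memory pool.
pair_means = prices.reshape(6, 2).mean(axis=1)
```

The mmap trick, next, is what lets those same headers point at files as easily as at RAM.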
If you’re dealing with data which is bigger than core, you have to do some complex conga dance: splitting it up into manageable chunks, processing, writing it out somewhere, then vacuuming the result back in again. With mmap, you just point to the data you want. If it’s bigger than memory … so what? You can get at it as quickly as the file system gets it to you. If it’s an array, you can run regressions on big data without changing any code. That’s how the bigmemory package in R works. Why wasn’t this built into native R from the start? Because programmers don’t learn from the masters. Thanks a lot, Bell Labs!

Fred Brooks, Larry Breed, Joey Tuttle, Arthur Whitney, Eugene McDonnell, Paul Berry: none of these men can be held responsible for inflicting the horrors of S+ on the world

This also makes timeseries databases simple. Mmap each column to a file; selects and joins are done along pointed indexes. Use a file per column to save memory when you read the columns; usually you only need one or a couple of them. Most databases force you to read all the columns. When you get your data and close the files, the data image is still there. Fast, simple, and with a little bit of socket work, infinitely scalable. Sure, it’s not concurrent, and it’s not an RDBMS (though both can be added relatively simply). So what? Big data problems are almost all inherently columnar and non-concurrent; RDBMS features and concurrency should be an afterthought when dealing with data which is actually big, and, frankly, in general. “Advanced” databases such as Amazon’s Redshift (which is pretty good shit for something which came out a few months ago) are only catching onto these 80s-era ideas now. Crap like Hive spends half its time reading the damn data in, using some godforsaken text format that is not an mmaped file. Hive wastefully writes intermediate files, and doesn’t use a column approach, forcing giant unnecessary disk reads.
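The column-per-file trick isn’t K-specific; it works anywhere mmap does. Here is a toy sketch in Python using numpy.memmap (which wraps mmap(2) underneath); the file names and column contents are my own invention for illustration:

```python
import numpy as np

# Write one file per column, the way a kdb-style splayed table is stored.
n = 1000
np.arange(n, dtype=np.float64).tofile("price.bin")
np.ones(n, dtype=np.float64).tofile("size.bin")

# "Reading" the table is just pointing at it: no scanf, no parsing loop.
# Only the column you touch gets paged in, even if the file dwarfs RAM.
price = np.memmap("price.bin", dtype=np.float64, mode="r")

# Queries run directly against the mapped pages.
first_ten = price[:10].sum()   # 45.0
```

Close the map and the on-disk image is still there, ready for the next query, which is the persistence story the post describes.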
Hive also spends its time dealing with multithreaded locking horse shit. APL uses one thread per CPU, which is how sane people do things. Why have multiple threads tripping all over each other when a query is inherently one job? If you’re querying 1, 10 or 100 terabytes, do you really want to load new data into the schema while you’re doing this? No, you don’t. If you have new data streaming in, save it somewhere else, and do that save in its own CPU and process if it is important. Upload to the main store later, when you’re not querying the data. The way Q does it.

The APL family also has a near-perfect level of abstraction for data science. Function composition is trivial, and powerful paradigms and function modifications via adverbs are available to make code terse. You can afflict yourself with for loops if that makes you feel better, but the terse code will run faster. APL languages are also interactive and interpreted: mandatory for dealing with data. Because APL languages are designed to fit data problems, and because they were among the first interpreters, there is little overhead to slow them down. As a result, J or Q code is not only interactive: it’s also really damn fast.

It seems bizarre that all of this has been forgotten, except by a few old guys, deep-pocketed quants, and historical spelunkers such as myself. People painfully recreate the past, and occasionally, agonizingly, come to solutions established 40 years ago. I suppose one of the reasons things might have happened this way is the old masters didn’t leave behind obvious clues, beyond, “here’s my code.” They left behind technical papers and software, but people often don’t understand the whys of the software until they run into similar problems. Some of these guys are still around. You can actually have a conversation with mighty pioneers like Roger Hui, Allen Rose or Rohan J (maybe in the comments) if you are so inclined. They’re nice people, and they’re willing to show you the way.
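A footnote on those adverbs: one concrete trace survives in NumPy, whose ufuncs carry .reduce and .accumulate methods, essentially APL’s / (reduce) and \ (scan) applied to an arbitrary verb. A small illustration (again my example, not the author’s):

```python
import numpy as np

x = np.array([1, 2, 3, 4])

# APL's  +/x  (plus-reduce): the adverb / folds the verb + over the array.
total = np.add.reduce(x)          # 10

# APL's  +\x  (plus-scan): a running fold, still with no explicit loop.
running = np.add.accumulate(x)    # array([ 1,  3,  6, 10])

# The adverb composes with any verb, e.g. a running maximum:
peaks = np.maximum.accumulate(np.array([3, 1, 4, 1, 5]))  # array([3, 3, 4, 4, 5])
```

The point of an adverb is exactly this separation: the verb says what to do, the adverb says how to sweep it over the data, and neither needs a for loop.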
Data science types and programmers wanting to improve their craft and increase the power of their creations should examine the works of these masters. You’re going to learn more from studying a language such as J than you will studying the latest Hadoop design atrocity. I’m not the only one who thinks so; Wes McKinney of Pandas fame is studying J and Q for guidance on his latest invention. If you know J or Q, he might hire you. He’s not the only one. If “big data” lives up to its promise, you’re going to have a real edge knowing about the masters.

Start here for more information on the wonders of J:
http://conceptualorigami.blogspot.com/2010/12/vector-processing-languages-future-of.html

231 Responses

1. Rohan Jayasekera said, on July 28, 2013 at 5:22 am

Sigh…reading this makes me wish I were still working in one of those languages, instead of in something that I have to put up with. (I also have to put up with hearing Lady Gaga songs occasionally, but at least those are over quickly, while currently popular languages torture me with how long it takes to get things done.)

• Scott Locklin said, on July 28, 2013 at 6:57 am

I figured you were sipping mint juleps while maidens wave palm fronds at you. I see now that you’re doing some heavy lifting. Want an introduction to Wes? He’s a friendly guy, and working on some cool stuff. I think he could use an APL Rabbi, and I’m not up to the task. Also, if you ever have an urge to work on or give guidance on a J project, let me know. I’m no good at it, but I’m learning.

2. Brian said, on July 28, 2013 at 5:42 am

Lady Gaga isn’t a go-to target for what’s wrong with modern music. She’s actually talented. She’s no Mozart, but I’d like to see Britney or Kanye do this: http://www.youtube.com/watch?v=oP8SrlbpJ5A&list=FLcP4ag6fCJqnFnfDOPEi1JA&index=3 They can’t, because the f’ers can’t read music or play an instrument. Sorry I’m off topic, but I don’t know a F’ing thing about coding.
Btw, can you do an article on space-time warping? The NASA project in the news lately? Is it a publicity stunt?

• Scott Locklin said, on July 28, 2013 at 6:46 am

I picked Lady Gaga on purpose. She’s classically trained. So’s James Gosling and Guy Steele. That means neither of them has an excuse.

• Brian said, on July 28, 2013 at 10:03 am

Got ya. Sorry, I took it the wrong way. Yep, Gaga went to Juilliard. They don’t let hacks into that place… or at least they didn’t let me in.

• Scott Locklin said, on July 28, 2013 at 10:32 am

Right. You’ll have to link me on the NASA thing. It certainly sounds like bullshit. On my “to do for this blog” list is Robert Forward’s antigravity machine, once I can track down the original papers. I think on initial inspection it’s not bullshit (if you don’t mind swinging mini black holes around), but my relativity needs a tune-up.

• Brian said, on July 28, 2013 at 10:49 am

A Mexican physicist, Miguel Alcubierre, came up with the FTL machine concept in the 90’s, I think. They’re talking about it again lately because NASA is tinkering with it. http://www.nytimes.com/2013/07/23/science/faster-than-the-speed-of-light.html?src=me&_r=0

• Scott Locklin said, on July 28, 2013 at 9:36 pm

Alcubierre’s idea is essentially the same as one of Robert Forward’s. Forward had this idea in the 60s; he also had several other ideas which are probably worth exploring. I’m pretty sure the relativistic mechanics is legit. I’m saying this, not because I’ve thought through the papers, but because they got published and nobody laughs them to scorn. The problem is, well, you can’t really build such a thing due to the large amounts of mass required. Googling around on the article, White appears to have a workaround. He also appears to have a very simple “ground zero” experiment: table-top physics you could do for a few tens of thousands of dollars of hardware.
If his gizmo sees something, it would be an important breakthrough on many levels (measuring General Relativistic effects due to “exotic matter” using a laser would be a big deal), even if you can’t really build a warp drive. Mind you, even in the best case scenario, if everything works right, you’re talking *controlled* mass-energy numbers equivalent to the yield of 1000 6-megaton hydrogen bombs; nobody knows how to do that (nobody even knows how to make an uncontrolled 6000-megaton bomb, thank bobo), but the experiment would measure whether this is even possible in principle. It is a high-risk, cheap experiment, and it should be done. Kudos to NASA for punting on this one. https://en.wikipedia.org/wiki/White%E2%80%93Juday_warp-field_interferometer

• Picky said, on July 31, 2013 at 12:54 pm

Actually she was accepted to Juilliard but never enrolled. She went on to Catholic school, then the Tisch School of the Arts.

• Rohan Jayasekera said, on July 28, 2013 at 3:37 pm

In defence of Guy Steele, in the 1970s and 1980s there were a bunch of attempts to combine APL and Lisp, and his was the only one I saw that was really good. Unlike the other creators, he went to the core principles (and therefore strengths) of both languages, instead of getting stuck on surface aspects. Also, he took some Common Lisp stuff straight from APL, because APL had already nailed it, and as recently as 2007 he gave a talk at an APL conference (I’ll post a separate comment about that because I think it’s significant on its own).

• cstacy said, on July 31, 2013 at 4:30 am

Yes, Common Lisp took a lot of the arithmetic rules (e.g. branch cuts) from APL, and also introduced the REDUCE operator to Lisp. Another MIT guy, Dick Waters, had an abstract series expression language embedded in Common Lisp at that time, also. More recently, have a look at the lazy / series orientation in Clojure (which is a close relative of both Common Lisp and Scheme, executing on the JVM).
• Scott Locklin said, on July 31, 2013 at 5:42 am

As a bit of trivia: Lush2’s IDX array system is starting to look an awful lot like APL. I never asked Ralf or Yann if that’s what they were thinking of, but found it amusing it ended up that way.

• raito said, on July 31, 2013 at 4:28 pm

Guy Steele obviously has no excuse, moving from Lisp to Java.

• John Cowan said, on August 3, 2016 at 8:21 pm

Steele knows. He wrote a programming language for programmers, that’s Scheme, and it lost. Then he wrote a programming language for wannabe programmers, that’s Java, and it gives employment to millions. Which is the greater achievement? It depends.

• Scott Locklin said, on August 3, 2016 at 10:19 pm

• Larry said, on July 29, 2013 at 2:00 pm

I do realize how the discussion on this blog about Lady Gaga proceeded to acknowledging her superior musical training and talent. I just want to chime in to say that Lady Ada Lovelace really pales in comparison. Though she holds the honor of being the modern world’s first computer programmer, her contributions were overhyped by her mentor Charles Babbage. Historians cite her as having been mad as a hatter and a passionate gambler who wagered herself into deep debt. I do admit that 33 years ago I was enchanted by the idea that DoD chose Ada (Mil-Std-1815) as its standard programming language, but in hindsight I see that it is just as well that it fizzled and was displaced by better designed object-oriented languages.

• Scott Locklin said, on July 29, 2013 at 11:30 pm

I’ve heard that before, but have no opinion on that bit of history. It’s silly to be talking about code for a machine which was not run at the time anyway. Grace Hopper was certainly important, though, well, COBOL. I used to know a woman at LBL who was involved in writing a bunch of important bits of System V; she was a good person to know when the Solaris machines acted up.

3. bingo said, on July 28, 2013 at 7:11 am

> Apply() is the right way for grown-assed men do things.
I know that you were probably trying to write lightheartedly. But this sort of casual sexism is one of the things from the history of CS that really does deserve to be forgotten.

• Scott Locklin said, on July 28, 2013 at 7:24 am

Computer science worked better historically in part because humorless totalitarian nincompoopery hadn’t been invented yet. People were more concerned with solving actual problems than paying attention to idiots who feel a need to police productive people’s language for feminist ideological correctness. You may now go fuck yourself with a carrot scraper in whatever gender-free orifice you have available. Use a for loop while you’re at it.

• RR said, on July 28, 2013 at 9:43 am

Excellent insult; the “Use a for loop…” was the icing on the cake – I’m going to have to remember that. Also, thanks for the above articles – it’s good to see the occasional “those who do not remember the past…” article now and then. It reminds me exactly how much of a crap-fest I have to deal with on a day-to-day basis (at work, sometimes at play). Keep it up!

• John Croisant said, on July 28, 2013 at 5:50 pm

Scott, I enjoyed your original post, and I was considering sharing it with my peers, but your reply to bingo made me lose all respect for you as an intellectual and as a person. Not only was your comment a major overreaction to bingo’s valid point, but it was also inappropriate and hateful, and revealed you to be, frankly, an utter asshole. Your words and attitude dishonor the brilliant men you claim to admire, and embarrass every decent person involved in computer science and technology.

• Scott Locklin said, on July 28, 2013 at 8:43 pm

I am the type of asshole who doesn’t want oversensitive swine reading my blog, using my software, or getting anything out of me other than impotent fist-clenchy, foot-stampy rage. So, thanks for not inviting any more.

• Daniel said, on July 29, 2013 at 2:17 pm

Awesome! You are now permanently bookmarked!
“Humorless totalitarian nincompoopery” – simply the best comment of the week, and it’s only Monday!!!!

• Adam Williamson said, on July 31, 2013 at 4:13 am

Word to the wise: Sam Varghese is not any sane person’s idea of ‘good company’.

• Scott Locklin said, on July 31, 2013 at 5:44 am

I don’t know who that is, other than the author of that news article. Nor do I know who Sarah Sharp is. However, Linus is known to me, and if such hamsters as posted the comment above are also bothering him about his “tone,” I am, in fact, in good company.

• John Cowan said, on July 4, 2019 at 10:58 pm

Even Linus finally became concerned with Linus’s tone.

• Scott Locklin said, on July 4, 2019 at 11:36 pm

Go fuck yourself.

• bob van wagner said, on July 31, 2013 at 1:10 pm

Scott, you have just encountered what looks to be children of the “Bubble Generation”; the interchange they (Bingo, Croisant) had with you is a typical example of how they interact with the world. These are the young raised in extreme political correctness, communicating excessively with age peers alone via texting and IMing, now Facebook and Twitter. You violated what they consider appropriate PC-ness, and immediately the horde bubble-communicates and reaches a consensus that you are outside the fence. They drop all further interaction with you, no matter what you have to offer in all other regards. The “Bubble” refers to that fierce group-awareness they operate with, and it accepts no input from those outside of it, except from group-approved sources.

• Scott Locklin said, on July 31, 2013 at 10:11 pm

People like that are why I write for Takimag. I seriously do not want such people to read my blog, use my evil sexist racist software, or interact with me in any way for fear of causing them moral distress. I also want them to stop using the transistor (OMG, Shockley!), highways (Hitler!), fast food (homophobes!), computers (Von Neumann … he was a Zionist, right?), the internet (DARPAnet!)
and jets (moooore Nazis!). Yet, I have something productive for them to do: I am told the nation needs a tribe of people to pick the potatoes.

• Matt said, on July 31, 2013 at 6:28 pm

Like Daniel, I now have your blog bookmarked because of your responses here. Thank you for being funny and not standing for bullshit! I’m totally on board with the point of your post as well. I find so many better ideas looking through older languages and frameworks than “modern” crap. (WebSockets make me sick, for example.) Keep up the good work.

• Rob said, on July 29, 2013 at 4:12 pm

Agree. Everyone is entitled to be a curmudgeon, and even be proud of it. Doesn’t mean everyone has a right to be indulged. Enjoy “your house”.

• John C. Randolph said, on July 31, 2013 at 3:24 am

Scott’s response to “bingo” was entirely appropriate. And fuck you, too. -jcr

• gufo said, on August 10, 2013 at 11:57 am

Masterful troll we have here: complains about a guy being offensive to others, then calls the same guy an “utter asshole”. And yes, I am being sarcastic about the trolling level. Yes, it is a pity some people choose expressions that hurt the sensibilities of others. But the forceful preservation of SOME sensibilities is how fascism prefers to start. First rightfully shield gays, atheists, or females; tomorrow protect pedos, rich rapists, promoters of perverted religions, corrupt politicians, which was the real object of the whole operation from the start. Think about it: reputation and political correctness are present in corrupt systems because they are additional means to control/manipulate others.

• Wisnoskij said, on July 31, 2013 at 3:35 am

“Awesome! You are now permanently bookmarked [RSSed]!” Seconded.

• Fried PBJ said, on August 9, 2013 at 8:29 pm

This is the whiny-assed titty babiest thing I’ve read all month.

• bob van wagner said, on August 9, 2013 at 10:29 pm

“Babiest”? Ageism has no place here in the future.
• gngl said, on July 28, 2013 at 10:11 am

Where do you see “casual sexism” in that?

• max said, on July 28, 2013 at 8:42 pm

• anonymous said, on July 31, 2013 at 5:20 am

haha what a bitch mangina.. go pussybeg somewhere else..

4. Chris said, on July 28, 2013 at 8:05 am

Your photo is the wrong Ken Iverson – see http://en.wikipedia.org/wiki/Ken_Iverson

• Scott Locklin said, on July 28, 2013 at 8:17 am

Oh shucks; thank you kindly for pointing that out: fixed. That other Ken Iverson looked more like a golfer than a savant; I should have known.

5. Jacques Le Roux said, on July 28, 2013 at 8:22 am

APL changed my life. I had two great intellectual experiences in my life: when I learned to read, and when I learned APL. And I still remember the aha moment in the late 80s when, on an IBM 360 machine, I first read from a DB2 database into an APL2 session’s memory. So I earned my living for 20 years using APL and was still enjoying it; then I had to turn to Java because contracts (I’m a freelancer) became too rare. This said, times have changed, and so have machines. I recently read the article on “The LMAX Architecture” by Martin Fowler (http://martinfowler.com/articles/lmax.html); it’s worth reading. There is also the concept of mechanical sympathy, which is kinda used here. Thanks for this article!

• Scott Locklin said, on July 28, 2013 at 8:33 am

There was a neat discussion in the comments on this blog a year or two ago; LMAX came up. It’s kind of neat that they’ve rediscovered this powerful pattern. Either that or they looked at how kdb+tick works. I’m a contractor too. I’d actually consider a day job where I could use something like J all day. Meanwhile, J has occasionally found utility in grinding data down to something R can deal with. And it works great on my personal stuff.

• bob van wagner said, on July 29, 2013 at 3:45 pm

So true about the comparison of learning to read and learning APL.
I had programmed in calculator and plotter machine-like languages before APL, and many since, but none freed the mind to explore data at mind speed like APL. Learning to talk is not as amazing an adventure as learning to read, because talking and listening are slow. The speed of reading is unlimited, really, and APL flew beyond the reach of other ways of expressing symbols that produce action in both the real and data-modeled world. Much closer to the speed of thought, and, abstractly, the operator and reduction algebras of APL improve thought.

6. Thanks said, on July 28, 2013 at 9:18 am

Genuinely interested in APL languages: do you know and recommend applications/frameworks written in such a language to do heavy 3D rendering? Any other resources on applications or libraries you would like to share? Thanks.

7. Jacques Le Roux said, on July 28, 2013 at 12:13 pm

It’s actually Dyalog APL, not Dylog 😉

• Scott Locklin said, on July 28, 2013 at 9:48 pm

J is so much easier to spell…

• Petro said, on July 31, 2013 at 4:48 am

Laziness is supposed to be a virtue, but isn’t that taking it a little far?

8. Stuart Brorson said, on July 28, 2013 at 12:33 pm

You didn’t mention it, but one of APL’s most important features (IMO) was that it provided an interactive environment (a “shell”) which allowed you to fiddle around with your data in real time. Back in the day, no other programming language allowed for interactivity, except Basic. (At least as I recall.) And trying to do real math in Basic would be an exercise in frustration. APL was amazing in that it allowed you to experiment with your computation (and your program), trying different operations until you got the result you wanted — in real time. Trying to do the same thing in any other language involved the usual edit -> compile -> run -> debug cycle. Also, I don’t remember how it worked, but I do recall using APL to make plots easily, another thing you couldn’t do with any other environment.
Nowadays, we are fortunate to have Matlab, R, NumPy, Julia, and so on, all of which provide a convenient shell and graphics capabilities. But APL was the first, as far as I know.

• Scott Locklin said, on July 28, 2013 at 8:48 pm

Well, I did say “interactive,” which I think is mandatory for data science type problems. I didn’t emphasize it, though I know it was an important feature in the old days. FWIIW, J has awesome plots. R quality. Easier to use as well.

• Gzorgumplatz said, on August 9, 2013 at 1:49 pm

apl\360 had the “3 plotformat” workspace, at least that’s what it was called on my system. It was able to do (at the time, 1975) very nice printer graphics. I.P. Sharp and probably other timesharing companies had a facility ([]arbout) which could send the necessary ASCII characters to drive a daisywheel printer in graphics mode or a flatbed plotter.

• maggieleber said, on August 9, 2013 at 3:30 pm

At Unicoll, we had a Tektronix 4015 storage tube graphics terminal with an APL keyboard…. I still have some of the doodles I did on it, printed out on thermal paper….

9. Ross Judson said, on July 28, 2013 at 1:53 pm

What I like the most about K is the list of verbs and adverbs. If LISP is a distillation of the concepts of functional programming, K’s masterful set of operators and their combinations represents a distillation of mathematical operations.

10. Pete said, on July 28, 2013 at 2:37 pm

Last April I stumbled across this fun 56-minute YouTube video, “The Origins of APL – 1974”, that (if I read correctly) was filmed one morning during the 6th APL conference in Anaheim. In the video there are about two minutes of montage before a moderated panel made up of Philip Abrams, Larry Breed, Adin Falkoff, Kenneth Iverson, and Roger Moore digs in. From the description: “[John R. Clarke]…sneaked off and went to the IBM Scientific Center in Philadelphia to meet with Iverson and Falkoff. He then pitched the idea of cutting a tape and using it at the banquet.
Iverson was not impressed, but when Falkoff asked why it should be done, John replied that he would bet 90% of the people using FORTRAN did not know who John Backus was and how FORTRAN was developed. Falkoff went to see Iverson, and it was agreed…tape was shot in real time the morning before the banquet in the evening.” I hope others enjoy it as I did.

• MaggieL said, on July 28, 2013 at 8:36 pm

The building I learned APL in was originally the IBM Scientific Center. By the time I got there it housed Unicoll Corporation…

• Brian Burns said, on July 29, 2013 at 2:02 am

“Once you begin to understand what you’ve been doing, you realize it’s practically all been done before; that in one sense, APL is a simplification and extension of vector algebra, which is something that’s been around in mathematics for a long time.” (7:10) Aye, he did it also.

• Don Rueter said, on July 30, 2013 at 2:57 am

I had learned APL at the end of the 60’s, pretty much on my own, while a grad student at USC. After hours, I was allowed access to an IBM 1130. I’d bring along a disk cartridge (about 16″ in diameter), a Selectric type ball with the APL symbol set, and a single prepunched card to boot the machine. This knowledge paid off as I searched for a paying job and found Orange Coast College in Costa Mesa, CA. They had an IBM 360 with 20 IBM 2741 Selectric terminals attached, all running only APL. They also had John R. Clark, a forward-thinking professor and APL champion. (He is mentioned, but misspelled, in the above comment.) The OCC Computing Center is named for John. I presented a paper at that May 1974 conference in Anaheim, sharing my work in speech synthesis using APL. Still have the proceedings sitting on my bookshelf.

• Scott Locklin said, on July 30, 2013 at 3:02 am

Interesting times, to be sure. If you ever did work on voice recognition (APL or otherwise), I’d like to subscribe to your newsletter.
• Don Rueter said, on July 30, 2013 at 3:58 am

Never voice recognition – just synthesis, using an RS-232 device (Votrax) connected in parallel with the terminal. (See http://en.wikipedia.org/wiki/Votrax.) I retired from OCC in 2004 after 34 years there, teaching and programming VB.NET applications in the latter years. APL gave me a foot in the door.

• bob van wagner said, on July 30, 2013 at 5:20 pm

Votrax! A long time since I’ve heard that term. PWM, right?

• Devon McCormick said, on August 6, 2013 at 4:50 pm

I remember programming the Votrax in APL at OCC. I did the soundtrack for a short student (Jr. High) film: the Walt Kelly (of “Pogo” fame) poem “Hail to the Hippety Push-button World”.

• Don said, on August 7, 2013 at 1:01 am

We only had one at each campus, as I recall. You may have used my APL library for your work. I remember one time when we were doing a show in San Diego using a remote connection to the 360 or 370 mainframe back in Orange County (acoustic telephone coupler at 300 or 1200 baud). My colleague was already addressing the audience while I was hustling around trying to get the system to work. It required a quick phone call back to the computer operator to reboot the mainframe. The audience never knew.

• Pete said, on July 30, 2013 at 3:16 am

Don, my apologies to you and John R. Clark for the misspelling. One shouldn’t just copy text from YouTube and paste it in. Also, thanks for posting your comment; I’m still reading about Votrax:
http://dl.acm.org/citation.cfm?id=810855&dl=ACM&coll=DL&CFID=236624058&CFTOKEN=11689681
http://en.wikipedia.org/wiki/Votrax

• Don Rueter said, on July 30, 2013 at 3:18 am

Hadn’t seen that video in years. The intro was shot in our computing center which, by 1974, featured the IBM 370. The two operators (Paul & Mickey) were buddies of mine. Memory Lane, indeed!

• Jerel Brandon H. Crosland said, on August 1, 2013 at 12:49 am

APL was the third computer language I learned, and I did it at Orange Coast College in about 1973.
I was a sophomore at Newport Harbor High, where I taught myself BASIC on a PDP-8, and then discovered OCC offered “public access” (virtually 24×7 – I know because I spent many long nights there!) for free in order to get some funding from somewhere. I went out there and taught myself Fortran on their 360, and then discovered the room with all of the dedicated APL Selectric terminals, and thought I had died and gone to heaven. What an amazing language! I’ve been a programmer most of my life because of that experience, but never did I have the privilege of working for hire in APL. Ah, if only! I envy those who were able to use these languages professionally. I ended up doing a lot of work in various assembler languages, and on up the “generations”. I have felt the curmudgeon for so long, observing vociferously the similarity of what’s “new” to what has gone before. There is so much reinvention of the wheel, only it ends up being square. Then, BIG IMPROVEMENT! They make it triangular and crow that they’ve gotten rid of one bump! But it’s “new” because it has a different name or was the brainchild of some new wiz. Thank you, Don, for reminding me of my time at OCC, and thank you, Scott, for a great article.

• Don Rueter said, on August 1, 2013 at 1:46 am

Were you my student? I was teaching APL for quite a while.

• Jerel Brandon H. Crosland said, on August 3, 2013 at 2:56 am

Hi Don, no, I was very into teaching myself everything at the time. I bought the textbook for the class, which I think was the only book on APL at the time, and taught myself. But while I have no recollection of it, I’m sure we met. I spent a lot of time out there! There was another guy out there… Ron Clark? That name comes to mind, but I haven’t thought of it at all since the 70’s, so I’m a little fuzzy. I think we, being bratty high school kids, were probably barely tolerated there. If you recall any such kids infecting the APL room, I’m sure I was one of them.
• Don Rueter said, on August 3, 2013 at 3:18 am

I remember we had a bunch of guys who were too old for toys and too young for girls who spent a lot of time in the terminal room. Actually, we learned a lot from the high schoolers. You’re probably thinking of John R. Clark, for whom the Computing Center is now named. He was a classic.

• Devon McCormick said, on August 6, 2013 at 5:10 pm

I was one of the Junior High “squirrels” who infested the computer room on weekends – not many of the college kids were there then.

• Don said, on August 7, 2013 at 1:03 am

I was lab supervisor in the 1970-71 year and saw a lot of you and your friends.

• Don said, on August 7, 2013 at 1:05 am

I just downloaded a 30-day trial of APLX to run on my iMac. Brings back a few memories.

• Paul Robinson said, on August 11, 2013 at 12:17 am

I went to OCC back around 1979, and got to see one of the best people I’ve ever known to use APL, who worked as a teacher there: John Clark. He taught the Pascal class there. I knew that no matter how good I got as a programmer in any language, I’d never be able to touch what Mr. Clark could do with APL.

APL is probably one of the most powerful programming languages ever developed, and I think the problem is that it’s so terse that it’s impossible to do maintenance; and whether we like it or not, programs have to change over time to fit new needs and circumstances. Being hard on maintenance is a death-knell for any language, and being hard to understand was the final nail in its coffin. (I’d like to say the standard joke was that you could write a program in one line of APL that would take a page of code in some other language, except it was no joke: you could do it, but you’d never be able to update it, and you’d probably have to rewrite it from scratch to do so.) Which is why, as bad as COBOL is, it’s great for maintainability and fairly easy to read, which is why it stuck so well.
Right now, August 2013, go out and look; there are probably still as many job postings looking for people who know COBOL as C or Java. Despite all the hits on it, it’s arguable that more code was written in COBOL than in possibly any other language. If APL could have had its power with that kind of readability and maintainability, we might still be using it in non-esoteric applications.

There’s an existing open source version of APL called NARS2000, and it’s really not too bad. It needs work, though; it lacks access to databases. But the implementation is solid: it duplicates the results of operators on data from other vendors’ APLs and examples flawlessly.

• Don Rueter said, on August 11, 2013 at 4:18 am

You are so right about JRC! Too bad that NARS2000 doesn’t run on my Macs like the non-free APLX does.

11. boriel said, on July 28, 2013 at 4:24 pm

This post has deeply resonated with me, not only with APL-like languages but, in general, with the state of the art in Computer Science. Other than hype and ego, I often see much “rebranding” and little innovation. Of course, there is innovation, but not as much as they say.

• Scott Locklin said, on July 28, 2013 at 9:55 pm

The history of machine learning is kind of sad and amusing to contemplate as well.

12. […] the design of the APL programming languages still holds up, even after all these years […]

13. Rohan Jayasekera said, on July 28, 2013 at 4:28 pm

While doing a web search just now for Guy Steele’s 1980s paper combining APL and Lisp, I learned that at the APL2007 conference (I haven’t been to an APL conference since the 80s) he gave a talk entitled “What APL Can Teach the World (and vice versa)”.
His abstract remarks that “lately it has fallen on hard times, which is a pity, because there are certain important lessons from APL that the rest of the world has still not learned”, and suggests that the APL community could once again become influential by abandoning its unfortunate insularity: “In this talk I make some specific suggestions for how the APL community might use its unique skills to tackle some of tomorrow’s important computing challenges”. Which was a brilliant suggestion, but it would have been lost on most or all of the audience, now that most APL enthusiasts have given up (myself included), leaving those who are willing to be isolated in a ghetto. Fortunately there is still the one bright light of the Arthur Whitney camp, which has a foot in both the APL and non-APL worlds. But even they may not be inclined to act on his suggestion.

Back when I was an APL evangelist, I learned that the vast majority of people who were already programming in other languages found it too weird to even consider using (those who had never programmed in anything were unaware that it was weird, so had no problems with it). If I ever meet Paul Graham, I’m going to tell him that at Viaweb they need not have bothered keeping their use of Lisp a secret, as the prospect of a huge productivity increase is not sufficient to get people to switch to a strange language. When I was at I.P. Sharp Associates (where Ken Iverson eventually worked as language architect), we ended up using APL as Viaweb used Lisp: as a tool to create applications that could beat the competition. For example, we built a global risk control application for banks (I was one of the two authors of the core system) and it took half the market; Deutsche Bank used it for some 15 years. Arthur Whitney’s company Kx Systems continues with that approach, selling not a language but software written in it.

• Scott Locklin said, on July 28, 2013 at 9:01 pm

I’m pretty sure Paul Graham saved Lisp from the grave.
We now have things like Clojure, which has achieved some mainstream respectability, and relatives like ML backed by no less than Microsoft. I remember in 2004 or so, when I was struck by the Graham thunderbolt, the idea that you could use “all those parentheses” to do anything was considered absolutely crazy. So, things can change. While APL certainly fell into disrepute, I think Kx and Morgan Stanley kept it more alive than Lisp was in those days (J was “pay for” until fairly recently as well; I think a few prop shops were using it).

I should probably focus on making my pile of dough using J, rather than writing about why it and its cousins are awesome. I’d be happy for some company, though. I’m pretty sure I could get J to do Kdb-style things using zeroMQ and a few other doodads, for example, but it’s going to happen a lot faster if some visionary decides he wants to take a risk and make it happen, or pitch in and help out.

• Fred Lakin said, on July 31, 2013 at 11:18 pm

Speaking of “Guy Steele’s 1980s paper combining APL and Lisp”: did you ever find it? I remember seeing the article on paper way back when, and asked Steele about it. He grinned, a little embarrassed (we were at the Stanford AI lab at the time), but cheerfully talked about it. Besides hacking LISP I was also learning APL at the time, and enjoyed it, and also the chat with Guy.

• maggieleber said, on July 31, 2013 at 11:34 pm

Well, there *is* this: http://groups.csail.mit.edu/mac/users/gjs/6.945/readings/MITApril2009Steele.pdf Clearly APL inspired… and yet at the same time the middle of the slide deck looks like the first few weeks of Odersky’s Coursera “Functional Programming Principles in Scala”: https://www.coursera.org/course/progfun

• Rohan Jayasekera said, on August 1, 2013 at 6:12 am

It’s driving me crazy that I can’t find it. If I recall correctly, I was at the conference where he presented it, at MIT.
I think it may have been the Second ACM Conference on Functional Programming and Computer Architecture — but since that was in the dark ages before the Web, I haven’t even been able to determine via Google whether that conference was held at MIT, let alone find a list of papers! I’m sure I still have those conference proceedings — but packed in one of many boxes.

14. […] Programming News: Ruins of forgotten empires: APL languages … Read full story => ScottLocklin […]

15. MaggieL said, on July 28, 2013 at 5:04 pm

I remember my APL experiences fondly (and FORTH YOU LOVE IF HONK THEN as well), and the things I learned then stand me in good stead now that I’m working with Scala.

• Scott Locklin said, on July 28, 2013 at 10:49 pm

Out of curiosity, how does Scala deal with array types? I didn’t feel real good about Clojure’s array types; Colt or JVM native, so I ended up screwing around with JBLAS before despairing of the whole thing.

• MaggieL said, on July 28, 2013 at 11:10 pm

Erm, yes. Thereby hangs a tale… or several.
It’s a complicated story, perhaps best introduced by words from Dr. Odersky back in 2009: http://docs.scala-lang.org/sips/completed/scala-2-8-arrays.html He confirms the dissonance you felt between wanting to have a good array design and preserving Java compatibility. The solution chosen exploits the richness of the Scala collections library (ever so much nicer designed-in than bolted-on after the fact, as it was in Java) and leverages the (admittedly somewhat scary for those of us scarred by exposure to PL/I) powerful implicits capability of Scala. Some more useful talk on the subject here: http://docs.scala-lang.org/overviews/collections/arrays.html

A clue to how rich the Scala collections framework is comes from the type hierarchy diagrams for the immutable and mutable flavors (scala.collection.immutable and scala.collection.mutable; the diagrams don’t reproduce here). In short, in current Scala, collections are as central to the goings-on as arrays are in APL… but Scala arrays are just one member of a much larger family of types, a family that shares many operations through inheritance.

• Scott Locklin said, on July 28, 2013 at 11:41 pm

I guess collections are pretty helpful (and an improvement on the Java ones is always welcome), but as a numerics guy… argh. What if I want to do SVD or calculate eigenvalues on a collection? I’d probably have to shove it into a new type. I mean, having a natural way to make a collection into a priority queue is nice, but for my kind of problem, I like it better when my arrays of floats are assumed to be a matrix, like in APL. Then again, I kind of wish J had dictionaries; I guess it’s impossible to make me happy without spending $100k a CPU.

• MaggieL said, on July 28, 2013 at 11:54 pm

For that sort of thing you might find there’s already a library close to your needs…

And we’ve come a long way from the days when I sat myself down and copied stuff by hand from “The APL Handbook of Techniques” or “The APL Idiom List”. Now you just clone from GitHub or Google Code and off you go…

• Mark C (@scalaguy) said, on August 11, 2013 at 10:18 pm

I have enjoyed reading this article and the thread of comments, so I want to put my two cents in: working in Scala DOES have the feel of the old days in APL. And the beautiful part of it (well, one of the many) is that you can sneak it into a larger Java project very easily!

• MaggieL said, on August 12, 2013 at 10:46 am

Quite agree. If there’s some part of APL that Scala doesn’t have, it’d be fairly straightforward to add it in a library.

Oh… look what just hit news.ycombinator.com: http://ngn.github.io/apl/web/ “APL in JavaScript”

16. Jane said, on July 28, 2013 at 6:19 pm

I’d go upstream from this. Conway’s Law states that organizations produce systems which are copies of the communication structures of those organizations. So when you find that industry programmers are ignoring the classics, often as not, it means that their organizations are ignoring the organizational masters. Who has time to do things right when your company is just going to screw it up later, anyway? Even if you (had a time machine and) could hire Leonardo to paint for you, if you hired him to do the artwork for a cheap softcover novel in the supermarket checkout line, he’d spend a lot less effort on it than the Mona Lisa.

I’ve been looking for years, and I’ve never found any company who learned from the successes (and mistakes) of the organizational masters. For example, “Peopleware” (1987) presented loads of evidence in favor of quiet workplaces for knowledge workers. Today, I can name only 2 software companies in the world that actually give private offices to programmers. “Mythical Man-Month” (1975) talked about how to organize a software team, and hard limits on team size. This one seems even less followed.

As the Wikipedia entry for the K language points out, the syntax for these is not the easiest to grasp. There are huge payoffs to be had, but there are also significant investments to learn them. When your “ScrumMaster” can (and does) move you to a new project any week by just dragging a box in your company JIRA, there’s no incentive to spend any time learning, or even documenting. The only incentive you’re left with is to make it look superficially simple (Python and Rails are both amazing as demoware) for the next poor guy who has to maintain it.

• Scott Locklin said, on July 28, 2013 at 9:09 pm

I can’t disagree with you much. Though I can mention that Fred Brooks of “Mythical Man-Month” fame is an APL pioneer, and his photograph appears above as one of the great software ninjas.
J and K aren’t bad if you’ve studied a functional language. If you’re pretty good at C, you’ll rapidly discover the bones of the thing are all clever bits of C pasted together. It is different though. I built a machine learning gizmo in J which would have taken me a few days in R. If I hadn’t got stuck on a bit of syntax, it would have been done in *minutes* in J. Even with falling flat on my face on the syntax, it only took me one day’s work.

17. MaggieL said, on July 28, 2013 at 10:08 pm

I just wanna say: we grown-assed women use apply() too.

In case anybody wasn’t clear on that. 🙂

• Scott Locklin said, on July 28, 2013 at 10:44 pm

Word.

18. bob van wagner said, on July 29, 2013 at 2:00 pm

I learned APL in 1972; it was the first high-level language I learned. This was for a class called “Non-numerical problems in computer programming”, taught at Rutgers by an IBM fellow on leave. All was from the Iverson book. A compiler didn’t exist yet. I was a freshman, but became the maven of the class, which was a 300-level course. Halfway or so through the spring semester a compiler did come out, and I sometimes would get “expression too deep” parse errors, because I loved the depth of one line. A universe can be created in one line of APL. In that same semester I took a numerical methods course which used Fortran (Watfor, iirc). I could do in two lines of APL what would take 200+ lines of Fortran — which was fine, but the numerical methods professor would not accept APL. Fortran was so frustrating just because of the comparison with the amazing ease by which huge hordes of data and numbers could be managed succinctly in APL! Iota heaven!

But who used APL? After college I got jobs using Fortran, but no one used APL. Or so I thought until a dozen years later when I moved into a neighborhood where a research scientist at Mobil was using APL to model reactions in cracking towers, he loved it too — I sold him a few of my APL books. But the refinery business was then in a severe downturn so Mobil was no market for my by then very rusty APL talent.

C gives a taste of APL, a mere taste! Forth (PostScript is a variant) gives a sort of parallel taste of the new type of conceptualization necessary, although I never liked Forth.

I have learned many languages, and mastered more than a few, but NONE comes close to the power and ready ease of APL. APL was truly a way of THINKING about problems. The structures of other languages forestall human conceptualization; the OPERATORS of APL enable it, and do so radically powerfully. Yet, as Scott Locklin says, the mind of an APL person has to be able to “drink the APL Kool-Aid”; it is a fierce new way of thinking, and awesomely potent. Yet why do so few use it? Is it too scary?
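(Editorial note: the shift bob describes, from looping to operator thinking, can be sketched even in Python, which is used here purely for accessibility and is not APL; the composed form below is the moral equivalent of APL's +/ over a squared vector.)

```python
from functools import reduce
from operator import add

xs = [1, 2, 3, 4, 5]

# loop thinking: spell out HOW, one step at a time
total = 0
for x in xs:
    total += x * x

# operator thinking (the APL habit): state WHAT, as composed whole-array verbs
total2 = reduce(add, map(lambda x: x * x, xs))  # the spirit of APL's +/
```

Both compute 55; the second form names the operations (square, then sum-reduce) instead of narrating the iteration.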

• Scott Locklin said, on July 29, 2013 at 11:27 pm

Certainly the differentness of the language scares the hell out of most people. Most people want their for loops and their explicit temporary variables. Also, most programmers don’t know any linear algebra, making it much less useful to them. Plus, technologists usually make the mistake that “newer is better.” It was true once; it’s not always true now. Finally, one I can relate to: people need jobs, and knowing Java is more marketable than knowing, say, A+ or J. The upside to obscurity is you’re going to work with smarter people if you can cadge a J or Q job.
Always wondered at them Brazilian guys who could hack raw postscript. Also, at the time, I was running Watfor on my HP100LX pocket computer!
Check out J if you want to dip a toe in again. Scary, but awesome. The wide variety of libraries (SQL for example) also makes it handy as a data slicer; beats mucking around in Awk.

• Rudolf Winestock said, on July 30, 2013 at 1:04 am

I studied PostScript a little while ago, on a lark. I didn’t know that raw PostScript had a cachet among Brazilians.

• Scott Locklin said, on July 30, 2013 at 1:19 am

It did among Brazilian nuclear physics grad students in the 90s anyway. I’m assuming in hindsight they all studied Forth or something, but the way they told it, they were so poor in Brazil, they’d just write postscript documents. I just shrugged at the time, but I mean, LaTeX is pretty much free.

• bob van wagner said, on July 30, 2013 at 3:29 pm

Thanks for the advice. I downloaded R a month+ ago but haven’t gotten around to using it — the son of a friend wrote a book on it recently, I was curious. You prefer J to R, eh?

The hardware hacker of old Don Lancaster is a big PostScript fan; see for example: http://www.youtube.com/watch?v=W6kD_SwKGlY

• Scott Locklin said, on July 30, 2013 at 9:21 pm

R is a horror of a programming language, but it has almost every statistical doodad under the sun written for it, and enough extensions (like bigmemory) to work around its limitations. It also has a very clever FFI and packaging system, which is one of the reasons it is so useful.
J has almost no statistical and machine learning doodads written for it (other than the ones I’ve managed to write myself), but it is an excellent language, and a powerful data manipulation tool.
I use R for my “day job” of finding meaning in data, and J to build the bones of things which can make me money when I sleep (and the occasional pay-for data task).

• bob van wagner said, on July 30, 2013 at 4:11 pm

Book APL, as opposed to the interpretive APL I used back in the 1970’s, had more fundamental data structure types than scalars and multidimensional vectors. One I used a lot was trees. I was quite upset that they weren’t implemented.

• Jacques Le Roux said, on July 30, 2013 at 8:33 pm

Interesting; I guess it was because it was hard to implement then. After all, if you push the reasoning to its extreme, APL is nothing more than a macro assembler… With its reentrance (I mean the execute function), and also reflection and introspection, you can’t implement a complete compiler. If you doubt it, before trying ;), read https://en.wikipedia.org/wiki/APL_(programming_language)#Compilers

Apart from trees, which other data structures were in the book and missing in interpreters?

• Scott Locklin said, on July 30, 2013 at 9:31 pm

I certainly miss associative arrays in J. You can implement them using namespaces, but it should be a first class type like it is in K.
It would be nice to have a tree structure in J, but instead I import the ones I need via the FFI. I’m probably too dumb to write a good one like libANN or libFLANN anyway. You might be able to make one using boxes (structs, basically) and an array, but it is best to steal from people who know what they are doing:
https://github.com/locklin/j-nearest-neighbor

I think though, that if you’re doing compiler design or whatever else you need trees for, you should use another language; ML or Lisp. APL is about data and math; stuff that fits into arrays.
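(Editorial note: Scott’s aside that you might build a tree “using boxes (structs, basically) and an array” can be made concrete with a minimal sketch, written here in Python rather than J for readability; the parallel-array layout and the insert/contains helpers are hypothetical illustrations, not code from any J library.)

```python
# A binary search tree kept in flat parallel arrays: value, left child
# index, right child index, with -1 meaning "no child". This is the
# array-friendly layout an array language would favor over pointer chasing.
val, left, right = [], [], []

def insert(root, x):
    """Insert x into the tree rooted at index `root`; return the root index."""
    if root == -1:
        val.append(x); left.append(-1); right.append(-1)
        return len(val) - 1
    if x < val[root]:
        left[root] = insert(left[root], x)
    elif x > val[root]:
        right[root] = insert(right[root], x)
    return root

def contains(root, x):
    """Walk the index arrays iteratively, as an array language would."""
    while root != -1:
        if x == val[root]:
            return True
        root = left[root] if x < val[root] else right[root]
    return False

root = -1
for x in [5, 2, 8, 1, 9]:
    root = insert(root, x)
```

The tree is just three flat vectors, so it serializes, copies, and maps to boxed arrays trivially; the price is managing indices by hand.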

• Gzorgumplatz said, on August 13, 2013 at 8:30 am

Several APL compilers (APEX, Causeway) are written in APL, I suppose picking a suitable language is a matter of personal taste and mastery. APL is about amplifying your thinking and getting the right answer fast with fewer keystrokes and less tedium.

• bob van wagner said, on July 30, 2013 at 9:54 pm

I want to say linked lists, but that may have been part of how trees were generalized, or not at all — memory is dim. I also wonder if the trees were more generally graphs. I was learning graph theory at the same time in some physics or math or engineering course, so there’s a memory mush. It’s most likely that they were not generalized maps, but I think they were n-branched trees and not just binary. Alas, I sold my Iverson APL book years ago, despairing of ever using the language again.

I remember that Iverson’s introduction explained that APL was developed as a design tool for the IBM 360 series, a micro-programmed architecture. Years later, working with bit-slice processors, I saw how my understanding of APL helped me understand how to use those well.

Outside of dropping trees, the biggest lament I had with interpretive APL versus book APL was fixing the index base. In my book programs I switched between 1 and 0 index bases quite a bit. There was some reason I preferred the one that was not implemented, so I had some trouble at the end of the semester trying to get all these hand-written programs in book APL to run on the interpreter. The error was subtle, and not flagged, and my code was dense on a line. Write-once stuff. I knew it when I wrote it.
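(Editorial note: the subtle, unflagged error bob describes is the classic index-origin trap. A minimal Python sketch shows how a formula written for 1-origin indexing misbehaves silently when carried verbatim into a 0-origin setting; the “middle element” formula is a made-up example, not from his programs.)

```python
xs = [10, 20, 30, 40]
n = len(xs)

# A 1-origin "book" formula for the middle element: x[ceil(n/2)]
mid_1origin = (n + 1) // 2      # = 2, meaning "the 2nd element", i.e. 20

# Carried over unchanged to a 0-origin language, it quietly picks the
# 3rd element instead; no error is raised, the answer is just wrong.
wrong = xs[mid_1origin]         # 30
fixed = xs[mid_1origin - 1]     # 20, the intended element
```

Exactly as bob says, nothing is flagged: the program runs and returns a plausible but wrong value.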

• Jacques Le Roux said, on July 31, 2013 at 2:26 pm

The index shift (QuadIO) existed in all the APL languages I used (from 1985 to 2005: APL+ on both mainframes and PCs, APL2 on mainframes, and Dyalog). And yes, I can understand the appeal of natively implemented tree structures. A lesson I learned with Apache OFBiz (my pet project) is that there is nothing better than open graphs in structured data (I mean that trees are closed). But it was hard for me to get it: it seemed too free at first glance and open to drawbacks. Finally, you can always close when you need it, so better to be open first.

Just one last comment, about bits of information I collected on APL when I was trying to help people move from APL+ to Dyalog on PCs in the 90s. To borrow from NASA, the IBM 360 effort was the Apollo program of IBM (and maybe of all IT efforts, in terms of cost). They had issues with their OS, and one of the reasons APL was implemented was to debug the issues they were running into. Successfully, it seems…

I can’t stop without referring to Dijkstra. If someone was well aware of what some call “mechanical sympathy”, it’s him. Also, I believe he is the kind of man you want to listen to for advice, don’t you? So I’ll let you read this article: http://www.zdnet.com/blog/murphy/apl-cobol-and-dijkstra/568

Also, did you know that Bill Gates (pushed by IBM then, IIRC) wondered about replacing his beloved Basic with APL on the first delivered PCs? This would have changed a lot of things. I guess again APL’s symbols blocked the turnaround :/. And I guess that’s why Iverson turned to J, also because he was a scientist, not a marketer. I heard that IBM was pushing APL in the early 80s because it was a way for them to sell more costly mainframe memory 😀

• bob van wagner said, on July 31, 2013 at 8:30 pm

Beautiful reply. Did Gates wonder or worry? When Dijkstra’s “Go To Considered Harmful” letter came out, I had just taken a position in a company where the lead insisted on flow charts! YUCH! He was also a huge fan of gotos to reduce the code. Before that I came from a rare company where, in addition to my college exposure to APL, I developed on my own object-oriented techniques of method tables, callbacks and late bindings; the whole place was oriented to intense reinvention of the wheel, and parallelism — but no heaven outlasts men’s vices for long.

I would NEVER do a flow chart. Nor do I like many of the madnesses that computer science professors, excepting the rare ones, afflict the world with on a sort of regular basis.

There is a saying ascribed to Lao Tse: “Slaughter the talented!” That’s a functional programming statement. It’s not an imperative; it’s a statement of human development.

Why are no artists today up to the level of the Dutch Masters, or musicians to the level of a Bach or Mozart?

There is no home for APL mavens and Perl masters. They are slaughtered by hordes of hundreds of thousands of Sharepoint- and Java-trained troops. Hung Gar for everyone! Shaolin arts are not desired; the practitioners, like great boulders reduced to hand-size river stones by the floods, are set in Zen gardens.

• Gzorgumplatz said, on August 6, 2013 at 9:21 am

Discorporate IT is no help – from what I have seen, Java and .Net thrive for the following reasons:
– Developers should be like vacuum tubes (remember those?): when one burns out, replace it
– Damage control: a single developer can only muck so much up
– Protection: a single developer cannot hold the company hostage
– When the price is right, the entire kettle of vacuum tubes can be shipped off to Bangalore
– More developers look good on the org chart at the right time, and look even better on the balance sheet when outsourced

APL required some individuality and genius, things long discouraged by Discorporate IT.

• Jacques Le Roux said, on August 7, 2013 at 8:12 am

I think it began with COBOL…

• Gzorgumplatz said, on August 9, 2013 at 1:53 pm

Sorry, “valves”.

• Erik Haliewicz said, on April 25, 2015 at 1:20 am

Is there any reason why compiling APL is any more difficult than, for example, Lisp?

• Scott Locklin said, on May 24, 2015 at 9:04 pm

The odd thing about APL is there is generally no reason for compilation. People certainly have written compilers for it.

19. Michael Main said, on July 29, 2013 at 3:40 pm

What a trip down memory lane. You will all find this hard to believe, but APL was the very first language I ever learned. My grandfather, Martin Heitzmann, used APL when he worked for the FDA. When I was about 11 years old, he took me under his wing and taught me the basics of electronics, chemistry, and physics, and around the end of middle school, he taught me APL. It was amazing. Even though I wasn’t a good programmer at that age, my unwillingness to give up allowed me to figure it out and write several programs that APL wasn’t even designed for, a couple of games included (granted, they used ASCII characters). One of the games was the CS classic “Life”, and the other was an RPG-style game with a map and everything. From there I moved on to C/C++, and today I use mostly Java and Groovy (as I write a lot of web apps). One of these days I need to relearn APL, though.

Thanks for the memories!

• Scott Locklin said, on July 29, 2013 at 11:21 pm

APL

You shoot yourself in the foot and then spend all day figuring out how to do it in fewer characters.
You hear a gunshot and there’s a hole in your foot, but you don’t remember enough linear algebra to understand what happened.
@#&^$%&%^ foot

LaTeX

compy$ more foot_shooting.tex

\documentclass[12pt]{article}
\usepackage{latexgun,latexshoot}
\begin{document}
See how easy it is to shoot yourself in the foot? \\
\gun[leftfoot]{shoot} \\
\pain
\end{document}

compy$latex foot_shooting line 6: undefined control sequence \pain 20. michaeleriksson said, on July 29, 2013 at 4:17 pm It is indeed often the case that the lessons of old are not duly considered when making something new (not restricted to software, let alone language design). PHP springs to mind as an almost incomprehensibly poorly thought through language. However, we must not forget that there often are other reasons than just neglicence, ignorance, or poor research behind these decisions. For instance, modern language are built to satisfy different criteria than the languages of old. Consider the increase of hardware resources, the growth in size and complexity of the average piece of (at least commercial) software, and the need to have code written, understood, and maintained by duller and duller programmers. Another major source of problems is “design by committee”, where simplicity of design is almost doomed to fail, inconsistent features are added according to political swing, etc. Indeed, this is a point were a language can suffer because too many people try to incorporate too many (!) lessons from the past… Or consider that the language it self is slowly becoming less important than the libraries provided with or available to the language. • Scott Locklin said, on July 29, 2013 at 11:17 pm Of course there are different criteria and different pressures now. But, I figure if you’re going to try to solve the problem of “what to do when your data is way bigger than the memory on your computer,” you should look at how the Great Old Ones solved the problem. Or even at how modern people who know what they’re doing (Art Whitney) solved the problem. I figure stuff like Hadoop and friends was done under ridiculous deadlines under extreme pressure. That’s no excuse for everyone who is not under ridiculous deadlines and extreme pressure. 21. 
Philip Thrift said, on July 29, 2013 at 5:09 pm

Haskell (based on combinatory logic) is also worth looking at: http://www.haskell.org/tutorial/arrays.html

• Scott Locklin said, on July 29, 2013 at 11:09 pm

People keep telling me that. I have noticed an uptick in interest and libraries written in Haskell. Other people say it’s too pure/functional to actually use productively. I’m happy with the collection of languages I have now, at least until I see some interesting problems solved in my domain in Haskell.

• Corentin Roux (@corentinroux) said, on September 8, 2013 at 3:19 am

Haskell is wonderful to use productively, but like J (and most functional languages), it has a steep learning curve which filters a lot of people out. It is practically impossible for someone to write Haskell without thinking about it, which some see as a bug rather than a feature.

I will write it up properly as a case study some day, but our story is basically that because of a constrained budget, I had to find high-power programmers instead of the usual Java farm, and ended up hiring only Haskellers. Prototypes are built in hours, production code has, so far, not needed any maintenance, and the code can run up to 10x faster than the equivalent in pandas in the rare cases where we transferred from one to the other. It’s also very, very readable (unlike tacit J), so you don’t need much documentation so long as the person reading it is relatively experienced. We solve the more complex problems of the business, like recommendation engines and so-called big data tasks (+1 for Redshift, which we chose mainly because it was cheap and abstracted away the need for spending time on IT tasks). Despite GHC’s reputation for being funny with dependencies, a lot of the libraries are actually quite old, because they were done right the first time.
You should also check out: http://research.microsoft.com/en-us/um/people/simonpj/papers/ndp/haskell-beats-C.pdf

The interesting thing is that despite its reputation as an academic language, in our team of 4, one guy has a CS MSc (thesis written in OCaml), but the other two do not have a degree, and I’m a (hardware) engineer by training with no CS qualifications whatsoever. The main reason Haskell is not used in industry is that people are not allowed to use it, whereas I am guessing that you could slip Clojure through the corporate QA process by pretending it’s an extension of Java…

• Scott Locklin said, on September 8, 2013 at 6:39 pm

Somehow I hadn’t realized, until your post inspired a bit of googling, that Haskell is of the ML family. I screwed around in OCaML for a while, and thought it a powerful way of doing things. In hindsight, I probably should have stuck with it, but I fell in love with Lush. The self-documenting nature of ML languages is very helpful indeed, as is the type safety; at least for production code. I’m not surprised that it runs faster than Pandas, but I suspect for “screwing around with the data interactively” purposes, Pandas would be better. I don’t think Haskell solves any pressing problems for me, but if I were to look at it, I take it GHC is the way to go?

• Corentin Roux (@corentinroux) said, on September 9, 2013 at 4:54 am

Yes, if you work on linux (which I assume you do).
The best (most idiomatic) intro is http://learnyouahaskell.com/chapters although I also really like Graham Hutton's book. The quickest way in (and you might appreciate it as a Lisper): http://en.wikibooks.org/wiki/Write_Yourself_a_Scheme_in_48_Hours/ and if you want to quickly search for how to do something in the real world: http://book.realworldhaskell.org/read/ and here's an incentive for you to try it: http://www.haskell.org/haskellwiki/Parallel

I think Haskell can be faster to mess around with if you think like a Haskeller (this might be why so many mathematicians and physicists feel comfortable there); pandas' advantage, for us at least, is there only because you are initially more proficient with the R mindset. Also, though this is very much anecdotal and due to the stuff we have built so far, we have found it always worth putting in 20% extra effort and building directly in Haskell instead of prototyping first in a messier language.

Historically, Haskell is based off Miranda, which indeed inherits many ideas from ML, but the process by which it was created is one of the few cases where committees are better than individuals (a well-defined engineering task force refining one very good answer to a hard problem). If you ever do give it a serious shot, as you did with J, please do write about it…

• Scott Locklin said, on September 9, 2013 at 5:50 am

Thanks for the roadmap. I don't have a presently viable systems programming language that I care for, though I do not generally need such things. I can do most of what I need to do for my personal projects in J and duct tape, but having something like Haskell in reserve might come in handy one day. My mostly happy experiences with Lisps made me inclined to screw around in PLT, Chicken or Stalin for such an animal, but I probably haven't given the ML family a fair shake yet. Hindley-Milner is potent juju. One thing I am intensely curious about: a high-level language for fooling about in things like Parallella or GPUs.
I don't often have a need for 1000 CPUs, but when I do, I'd usually rather just wait 1000x longer for the answer, or buy a bigger computer, than screw around for weeks in C++. I think something like J or K could do very well at this (Haskell too, but calling an FFI is not interesting), but I don't know of any projects which have done so.

22. Wayne Froese said, on July 29, 2013 at 10:09 pm

This is my first encounter of APL being discussed without mentioning actuaries.

23. Jack Rusher (@jackrusher) said, on July 30, 2013 at 5:46 pm

I was taught Numerical Methods in APL at university by a fantastic professor: http://archive.vector.org.uk/art10007880 … it really is a shame that more people aren't exposed to that way of thinking about computation, especially if they do *any* linear algebra.

• bob van wagner said, on July 31, 2013 at 12:57 pm

Thinking in APL is very liberating, empowering. You simply cannot think this way in most other languages. We don't think "for, while"; we think "reduce," where reduction means application of an operator over all members of the set to produce a reduced set. Perl and regexes are other APL-tasting languages, but still far removed from APL's powers. It's sad that the implementation dropped the tree operators, for then people focus on matrix inversion and such, which are great, but like taking the engine out of a Benz and dropping it in a forklift.

24. Dave Stallard said, on July 30, 2013 at 7:34 pm

I found this blog post very intriguing, sufficiently so that I wished it went into enough detail to persuade me. I'm certainly open to persuasion. MapReduce is so simple in conception, and yet so ugly, awkward, and verbose as implemented in Hadoop. I do remember reading about APL long ago, and even then it was considered arcane, belonging mainly to the past and to a small community. It's fascinating that it might be reborn in this way.
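The "reduce" way of thinking described above is easy to see even outside APL. A minimal Python sketch of the contrast (a stand-in here; APL would write the sum as simply +/data):

```python
from functools import reduce

data = [3, 1, 4, 1, 5, 9]

# Loop thinking: accumulate step by step, managing state yourself.
total = 0
for x in data:
    total += x

# Array thinking: apply the operator over the whole collection at once,
# the way APL's +/ "inserts" + between all elements of an array.
total_reduced = reduce(lambda a, b: a + b, data)

assert total == total_reduced == 23
```

The second form states *what* is computed, not *how* to iterate, which is also why it is easier to parallelize.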
25. truthspew said, on July 31, 2013 at 2:36 am

I learned both PL/I and COBOL back in the day and use neither in practice. Believe it or not, the languages I use most are split between VB and PHP. Go figure.

• maggieleber said, on July 31, 2013 at 11:29 am

Was wondering why I was having deja vu about this, even allowing for the Iverson nostalgia (my APL Vade Mecum sadly disintegrated long ago; would that I had another). Then I remembered: me, writing in 2004: http://margaretleber.sys-con.com/node/45512

26. Wilfred Verkley said, on July 31, 2013 at 4:03 am

Loved this post, and agree with a lot of it. I think the problem is mostly industry, not people, though. Universities are often very good about giving a solid history and theoretical background in computer science (mine was), and individually, people may have their own passions to explore different things. However, our industry as a whole likes shifting to new products often, and in that process things get left behind, and not many opportunities are ever created to "master" anything, especially for a whole group of people. It's hardly a problem unique to IT, either… it happens in many industries which face change and progress. That said, while we may decry the negative aspects of progress and the gems we leave behind, it's easy to forget the horrors of the past (e.g. COBOL) and ignore some of the amazing things being created today….

• Scott Locklin said, on July 31, 2013 at 4:20 am

Many, many things are more awesome than they used to be. For example, I can run Unix on my laptop without any more difficulty than winders (supposedly there were better OSes in the past, but I never used one). Data compression is routine and very effective. BitSync and BitTorrent make my life better in lots of ways. ZeroMQ is also one of the coolest libraries I've ever looked at. But then … we have 8 million serialization protocols when we already have XDR, or tardgasms like CORBA when we already have perfectly good shit like RPC.
All I'm asking is that people just look to see if the problem has ever been solved or attempted in the past, and maybe, you know, try to improve on that if it worked OK, rather than thinking nothing important happened before Java and Netscape.

27. Romeo said, on July 31, 2013 at 4:38 am

I piss on the APL-like languages. Sure, they are damn fast at what they do, as long as the data they need is in memory. Have you looked at the I/O performance of your beloved Q? It's god-awful the minute you need to go to disk to retrieve data. Also, the syntax of the language sucks big time. How the hell do you maintain such an ugly code base?!!! My point is that languages evolve, and we don't necessarily need to spend time with horse shit like Q. I do concede that some of those old techniques can certainly be used. However, if you have to use shit like Q, J, or K to illustrate those techniques, then you've already lost 99.9% of the battle. Those languages belong in the stone age.

• Scott Locklin said, on July 31, 2013 at 5:38 am

I have only noodled with Q, as nobody will give me a fully working copy of the thing. J seems plenty fast to me. What are you comparing it to? I also find it relatively easy to maintain: use namespaces, just like you would in any other language.

28. Sarek said, on July 31, 2013 at 6:33 am

The biggest problem we all have is that there is too much to learn. It's easier to reinvent the wheel than to learn, build, and test various "good idea" languages or frameworks or even paradigms. In the last 10 years, I've read about 10's of languages that people are adopting and creating with. They come in like a fad and sometimes disappear without a trace. Sometimes they stick around. There is rarely one great summary that explains why the language was created or even what it does best. You can pick it up by reading (say) 100 blog entries all over the web. But who has time, except those people who want to see something new? So, I stick with C, Java, C#.
My platforms are Windows command line, ASP.NET, and Android. I don't want to spend 6 months learning Ruby on Rails unless you can show me how it will double my salary. PS: I used APL back in the day. Great language and loads of fun, and I would be happy to get back into it if the money was there.

• bob van wagner said, on July 31, 2013 at 1:16 pm

Only 10's? You underestimate our modern version of Babel! That the web site Ravelry was done in Ruby is enough for me to accept it as a valuable language and platform. One man, one great web site. That's Texas Ranger style coding.

29. MJR said, on July 31, 2013 at 12:39 pm

APL was my first programming language, so I have a great deal of affection for it, but I just don't see it being a language the average programmer uses. The joke was always, "it's a write-only language." It's just really, really hard to pick up someone else's APL code and debug it. I have programmed in everything from parallel microcode to SQL and have noticed some of the same problems, but attribute it to a different cause: lack of parallel programming skill. Specifically, lack of data-parallel programming skill. SQL is a data-parallel language with all of the necessary features to write any program. Yet time and time again people will insist that PL/SQL is necessary. The truth is just the opposite: not only is PL/SQL not necessary, it hasn't even been particularly good. I've often converted PL/SQL code, with no obvious flaws, into pure SQL and seen a 5X performance improvement. Yet most coders find it almost impossible to understand what was done without a great deal of help. I did participate in the first golden age of parallel processing (80s & 90s), so I had quite a bit of time to practice my craft on great old machines like the DAP and the Connection Machine. It is my contention that teaching data-parallel programming would help to focus the next generation of language designers.
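The PL/SQL-to-SQL speedups described above come from replacing row-at-a-time loops with set-at-a-time operations. The same shift can be sketched in Python with made-up data (the tables and numbers here are hypothetical, purely for illustration):

```python
# Row-at-a-time ("procedural PL/SQL style"): loop over one table,
# probing the other row by row, like a hand-rolled nested-loop join.
orders = [(1, 250), (2, 75), (3, 310)]   # (customer_id, amount)
vips = {1, 3}                            # customer_ids flagged as VIP

loop_total = 0
for cust, amount in orders:
    for v in vips:
        if cust == v:
            loop_total += amount

# Set-at-a-time ("pure SQL style"): state the whole filter-and-aggregate
# as one expression, leaving the engine free to execute it in parallel.
set_total = sum(amount for cust, amount in orders if cust in vips)

assert loop_total == set_total == 560
```

The declarative form is what a query planner can reorder, index, and parallelize; the explicit loop pins down an execution order the engine must honor.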
• Scott Locklin said, on July 31, 2013 at 10:21 pm

APL would make a dandy tool for programming the various video and vector processors that have come about. I don't know if parallel actually does have a future; it's almost impossible to write a compiler that saves the user from himself. But I, personally, could do neat things with APL on a vector machine.

• Gzorgumplatz said, on August 9, 2013 at 2:06 pm

Why wait? You can do neat things with APL right now on a single-processor machine. The beauty of the language is that your code need not change when the implementation finally supports parallel hardware, be it a vector instruction set, the latest Nvidia graphics board, shiploads of processors, or all of the above. The same should be true for all of the array languages: J, K, Q, even Matlab.

• Scott Locklin said, on August 9, 2013 at 4:32 pm

Well, I thought it was pretty obvious I was doing neat things in J. I'd just like to do other such things in J on a bunch of processors without writing the code to do this. Kona has something parallel now, though it's not exactly what I had in mind: https://www.youtube.com/watch?v=WBXsCeW9qfc

• Gzorgumplatz said, on August 13, 2013 at 1:17 pm

True, IP communications isn't hardwired into the session in any APL I have recently used (presumably also not in J) as it is in K or Kona, where a remote session would be listening and would execute whatever comes out of the socket. But that code is not at all difficult. Sort of reminds me of "shared variables," where in mainframe APL you could connect to some other processor (DB2, GDDM, etc.), send it commands, and it would return a result or do something. I.P. Sharp had S-Tasks, same idea: have a dialogue with a remote system.

• michaeleriksson said, on August 1, 2013 at 5:39 pm

It is indeed possible to do far more in plain SQL (at least with Oracle) than most people believe. However, your claim seems unrealistic. Are you possibly speaking of SQL*Plus scripts rather than just SQL?
(And even for these, I suspect that problems can be found that require PL/SQL.)

• jack said, on August 5, 2013 at 1:52 pm

Indeed, you'd be utterly amazed by what you can do with the right operators in SQL. You need to go outside the simple query operators and into the data-mapping ones usually reserved for ETL jobs to get the full parallel power of the database engine. It is quite possible to get order-of-magnitude speed improvements over plain subqueries.

• Scott Locklin said, on August 5, 2013 at 8:46 pm

I'd love to see a presentation on this.

30. khinsen hinsen said, on July 31, 2013 at 1:25 pm

Thanks for this article. I am another APL old-timer (I picked it up in high school, on an IBM mainframe, which after TRS-80 BASIC was a real eye-opener) who more or less left the APL universe because of its isolation from mainstream computing. After using APL.68000 on an Atari ST for many years, it was painful to discover that all my code had become unreadable when I moved to Linux: no way to convert my workspaces to any APL on Linux. J is a lot more portable and works with standard text files, but I still haven't found a good way to make it read the data I want to work on, which resides in netCDF and HDF5 files. BTW, the latest issue of "Computing in Science and Engineering" has an article on J: http://doi.ieeecomputersociety.org/10.1109/MCSE.2013.64

• Scott Locklin said, on July 31, 2013 at 10:26 pm

Well, J's FFI is pretty easy to use. I recall HDF5 being a mess to call via FFI, but netCDF wasn't so bad. Kind of a corner case, as J folks tend to use J to store the data. Maybe I'll have a look at netCDF again…

• khinsen hinsen said, on August 1, 2013 at 9:08 am

Of course it's easiest to use J to store data processed in J, but if you work in a collaboration network where most people use mainstream tools, you don't have that choice. That's one reason why J has remained an island by itself. The HDF5 API is indeed very complex. Someone (but who?)
should write a high-level C API layer that trades some performance for simplicity.

31. Maury Markowitz said, on July 31, 2013 at 1:47 pm

Meh, same sort of elitist "real men" BS that infects the computing world: it's only good if it's fast and hard to use, and if you say that it's hard to use, you're a loser. The goal, IMHO, should be to bring programming to more people, so they can better automate their world. This could improve the performance of the global brain trust. Improving the performance of a matrix operator in a program that will run five times in all history? Not so much. As such, I think HyperCard has a lot more to teach the planet than APL ever did.

• Scott Locklin said, on July 31, 2013 at 10:27 pm

Democratization of programming is a completely separate problem from using good tools for data problems. Though, FWIIW, Wes' project is, actually, a democratization of programming, and it uses a lot of the bones of APL under the hood. APL is also used to teach schoolchildren; it's not hard.

32. shawnhcorey said, on July 31, 2013 at 2:15 pm

If you don't program in Perl, you have no idea what a language is capable of. Unlike other languages, Perl has many paradigms, and you can mix them together in a single program. And no, there is no steep learning curve; Perl can be used by any novice. Like chess, it's easy to learn but difficult to master. And even its masters are always learning new things about it.

• Bozon said, on July 31, 2013 at 9:40 pm

Since learning curves are measures of amount learned in a given amount of time, with the Y axis being the amount learned and the X axis being the amount of time, a steep learning curve is actually a good thing. 🙂

• Scott Locklin said, on July 31, 2013 at 10:29 pm

APL is a bit like Perl for data, except it is designed better. I liked Perl back in the day, but I converted to Python, and, eventually, Sed and Awk for doing Perl-like things.
33. Mark Mullin (@Arete) said, on July 31, 2013 at 6:21 pm

I vaguely remember a red paperback that occupied my life for six months as I made a port of APL on the Z8000. I even managed to get hold of an old Tek terminal with APL keying. The one flaw of APL, to my mind, was its conciseness: you could accomplish in one screen of APL what took C pages and PL/1 volumes. However… when you came back to the app six months later intending only the smallest change, you paid for that earlier convenience with the mammoth effort of trying (often vainly) to recollect how the hell the program worked 🙂 As for LISP, I've used it several times a decade to add programmability to custom systems; beats the hell out of TCL/Lua/whatever, as long as you aren't crowdsourcing other content.

• Bozon said, on July 31, 2013 at 9:36 pm

I completely agree. I still remember my Programming Languages final at Georgia Tech in the mid 80's. APL was one of the languages on the test. I coded the solution for the given problem in APL, then went on to finish the rest of the problems in Lisp, Pascal, Fortran, etc. When I was checking my work (the final testing period was only 3 hours long, so it couldn't have been more than an hour later), I was stumped by the APL program: I looked at my code and had no idea what I had done or how it worked. I started to panic, because I really wanted to ace the test. Finally I had a thought: I covered my original solution with my scratch piece of paper, recoded the solution in APL, and compared the programs. I then stepped through each solution while the last one was fresh in my mind, and was able to understand the original program well enough to verify and correct it. I literally couldn't understand the APL I had written an hour earlier. You have to be very careful with APL. I can laugh about it now, but it taught me the importance of readability.
• Scott Locklin said, on July 31, 2013 at 10:31 pm

I'm a big proponent of writing lots of tiny little elementary particles wherever possible. That sort of idiom works very well for J (which can encode a page of Java in a line of code), though it also works well for R.

• bob van wagner said, on August 1, 2013 at 9:39 pm

There's a philosophy that says all programs should be write-once. It is a more powerful elixir than "write once, run everywhere". Perhaps most web pages adhere to it.

• maggieleber said, on July 31, 2013 at 11:21 pm

Your red paperback is likely one edition of Gilman and Rose's "APL: An Interactive Approach". The first edition was black; the second and third had red covers. This one's seen some hard use: http://www.flickr.com/photos/psd/16307787/ Mine's in better shape: http://www.librarything.com/work/610030

• Scott Locklin said, on July 31, 2013 at 11:26 pm

It's an all-time classic for certain. There are some rumblings in the J community about an update for J, but Allen Rose at least is retired.

• Don Rueter said, on August 1, 2013 at 12:35 am

I don't have my copies any more, but check the acknowledgements for the second edition and you may find my name. I read the manuscript for the publisher.

• Mark Mullin (@Arete) said, on August 1, 2013 at 12:46 pm

It was indeed. Wore out the first copy; the second is still in the library, right next to Peter Henderson's Functional Programming. That one was a horror story: great book, but I was missing the errata pages, and fixing up the system at the end was tough.

34. Bozon said, on July 31, 2013 at 9:23 pm

"Infinitely scalable"? Infinite is, like, really big. At some point won't the earth cave in from the weight of the infinite number of servers? Or won't we run out of silicon to make chips, or run out of electricity to power an infinite number of servers? It is an interesting thought experiment to determine what we will run out of first.
You sound very intelligent, but "infinitely scalable" isn't a very accurate or useful term.

• bob van wagner said, on August 1, 2013 at 12:41 am

Maybe he meant countably scalable.

• Bozon said, on August 1, 2013 at 12:56 pm

I once tricked my cousin, who has a PhD in mathematics, with the question of whether the sands on the beach are countably infinite or uncountably infinite (I had of course prefaced the question with a bunch of questions about infinity to obscure it). He answered, of course, countably infinite. I giggled and said neither: the sands on the beach are finite. He groaned, and said of course. An interesting article: http://en.wikipedia.org/wiki/Graham%27s_number

• bob van wagner said, on August 1, 2013 at 1:21 pm

Nice expanding mind trap question! Is countably scalable the same as countably infinite? Does countably scalable include all such things like a Graham number, but not as much as a countably infinite set? Is Graham's number countable? And right now how does that help me write some CSS?

• Bozon said, on August 1, 2013 at 9:16 pm

Countably infinite means simply that a one-to-one mapping can be found between the set and the counting numbers. So the whole numbers are countable: map the negative numbers to the odd counting numbers, N = -2n + 1, and the positive numbers to the even counting numbers, N = 2n. Since there is no overlap and you always have another counting number available, you can always map them. The real numbers are of course not countable, because you can always find another number between two numbers that you haven't mapped. Is Graham's number countable? It depends on how fast you count. 😉 I don't even think you can express it with the atoms in the universe if each atom represented a digit in the number. That is how big it is. Is countably scalable the same as countably infinite? That is a great question. So, what is a really good way to say that a system scales nicely, in a very linear way?
"Infinitely scalable" is an awful term, but "linearly scalable" doesn't have zing.

35. maggieleber said, on July 31, 2013 at 11:23 pm

Nobody's mentioned the IBM 5100 yet. I never even touched one, but it did inspire lust in me.

• bob van wagner said, on August 1, 2013 at 12:39 am

Same here. Was that the first PC? September 1975. Almost! The Altair 8800 got out in May, and the IMSAI 8080 in kit form in December, but they did not have a built-in keyboard and monitor. So maybe the IBM 5100 was the first true PC. $20,000 in 1975 US dollars: more than a BMW 7 series, scaled to today's dollars.

• maggieleber said, on August 1, 2013 at 12:47 am

A PC in some very limited sense… the first Intel-based IBM PC was given the model number 5150 as an homage, but it owed more to the Displaywriter for its hardware heritage.

• Gilles said, on August 1, 2013 at 4:16 am

The first programmable personal computer built with a microprocessor was the MCM/70 machine that supported only APL. I saw a demo by the vendor rep in the winter or spring of 1974 at the Faculty of Forestry of Université Laval. It shipped in the fall. MCM was an innovative company founded by Mers Kutt in Kingston, Ontario, Canada.
The IBM 5100 was introduced in late summer 1975.

• bob van wagner said, on August 1, 2013 at 2:21 pm

Thanks for the MCM/70 reference. Another predecessor is Don Lancaster’s TV Typewriter (kit plans only) 1973: http://en.wikipedia.org/wiki/TV_Typewriter The MCM and TVT look a bit alike. All the TV Typewriter did was make text on a TV screen via a simple RF modulation circuit. Still even that was a dreamy thing at the time.

Memory was VERY expensive at the time. Tiny amounts of RAM, very expensive, so expensive that hobbyist magazines shared ways of using bad RAM chips (with missing rows for example) using external discrete logic — two bad chips make one good one. 10 MB hard disks then looked and sounded like washing machines. Some commercial systems still produced and sold hand wired core memory. Punch cards — as program and data storage devices, huge boxes full of them — were the low cost long term storage. Backing up a deck of punch cards was an important role in the Computer Department. Yes, there was tape too, but it was not as reliable as punch cards.

• maggieleber said, on August 1, 2013 at 3:19 pm

Tape not as reliable as punched cards? Well, you could rely on them to jam…

• bob van wagner said, on August 1, 2013 at 3:41 pm

Tape technology has improved. In 1972 tapes also were sensitive to stretching, binding, warping. Generally the data or code lost in a card jam could be at least identified as to records (one, two, not usually more) by operators, and often recovered by the operators — the card was torn, not machine readable, but human readable. Many cards also carried a printout of their content. The data lost on a tape? Well which data was lost?

• maggieleber said, on August 1, 2013 at 3:47 pm

I was a mainframe operator in 1972. Ever see a card saw?

• bob van wagner said, on August 1, 2013 at 4:58 pm

I operated an IBM 370/20(?) at a big school Physics research department during the summers, and an NCR Century system in a refrigerator factory (accounting and payroll, manufacturing and order tracking), nights, in that time frame. No, I never saw a card saw. What is it?

• maggieleber said, on August 1, 2013 at 5:38 pm

A hacksaw-blade-like tool used to clear the papier-mache resulting from the nastiest of jams that could occur at the throats designed to pass exactly one card in a card-handling machine.

If you needed one, you would probably not be recovering much data from the cards involved.

I also recall running a S/360 assembler over the source code for the IBM On-Line Teller application as modified for use at Germantown Savings Bank. It was on about a dozen drawers worth of punched cards, and the assembler was multi-pass. As patches were added to the code over the years, not all of them had sequence numbers added, so a dropped drawer could have had disastrous consequences…a back-up copy of the cards was kept, but of course it was not absolutely current…and making the back-up copy itself was a hazard to the original…

There was no 370/20. There was a 360/20, kind of a little brother to the rest of the 360 line, often used as a remote data terminal to bigger mainframes. There was also the 360/22 that was a stunted version of the 360/30. The OLT program mentioned above ran on a 360/30, and made very good use of all 64k of memory.

• bob van wagner said, on August 1, 2013 at 9:44 pm

Impressive. At the factory we switched over to mag tape only when the old-school IT manager went out on six months' leave after a heart attack. He would never allow it otherwise. His cards were his comfort.

• Rohan Jayasekera said, on August 1, 2013 at 5:30 am

I was lucky enough to attend the launch tour of the IBM 5100 (Ken Iverson himself did the APL part of the demo!) and to use one for a week shortly after that. And now I have a 5100 that I bought on eBay some years ago — it still works! An amazing machine for its time. But not as amazing as the MCM/70, which was released in 1974 and has one of the best claims to being the first true personal computer.

36. Konrad Hinsen said, on August 1, 2013 at 10:37 am

Watched this video this morning: http://vimeo.com/71278954 (Bret Victor on “The Future of Programming”). Makes you really wonder what progress we have made in the last 40 years.

37. Visto nel Web – 90 | Ok, panico said, on August 4, 2013 at 6:51 am

[…] Ruins of forgotten empires: APL languages One of the problems with modern computer technology: programmers don’t learn from the great masters. There is such a thing as a Beethoven or Mozart of software design. Modern programmers seem more familiar with Lady Gaga. but, well… panic! ::: Locklin on science […]

38. Roger Bigod said, on August 4, 2013 at 10:42 pm

There’s an article http://www.wired.com/wiredscience/2013/07/genetics-of-iq/ in a recent Wired about a huge project on the population genetics of intelligence. Apparently a major language in this field is Perl.

It isn't clear to me how much creative programming there will be in genomics analysis. The algorithms seem straightforward, and all depend on the elementary task of scanning two arrays for matching subsequences. When the price comes down to a few hundred dollars, everyone will get sequenced, but only once, and the result will be scanned for informative subsequences at a few sites. It sounds like a lot of brute-force processing power. Would APL or J offer any advantages?
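The "scanning two arrays for matching subsequences" task mentioned above is essentially the longest-common-substring problem. A minimal Python sketch over toy nucleotide strings (illustrative only; real genomics tools use far more elaborate indexed algorithms):

```python
def longest_common_substring(a, b):
    """Classic O(len(a)*len(b)) dynamic program: cur[j] holds the length
    of the longest common suffix of a[:i] and b[:j]."""
    best_len, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return a[best_end - best_len:best_end]

print(longest_common_substring("GATTACAT", "TACATTAG"))  # prints TACAT
```

The inner loop is exactly the kind of uniform array sweep that array languages (and vector hardware) handle well, which is where an APL or J formulation might plausibly pay off.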

• Scott Locklin said, on August 4, 2013 at 10:58 pm

Presently, probably not. Perl has a lot of specific libraries for dealing with this sort of thing. Genes are encoded as text files, and they’re not particularly array like as I understand things. I don’t really have a deep understanding of the analytics behind gene expression though.

• Roger Bigod said, on August 5, 2013 at 4:31 pm

From the fundamental nerdology standpoint, text can be viewed as an array of characters, and a DNA sequence as an array of codes for single nucleotides. At this level there are only a few types of mutations: deletions, insertions, block rearrangements, and single-nucleotide substitutions. From the IT standpoint, the programs spend most of their time churning through variations of grep and diff.

The essentials of genetics are straightforward, compared with even 19th Century physics. The hassle is that it’s like a set of languages that never had a standards committee, with random hacks and kludges now frozen in place. E.g. even the genetic code isn’t universal — there are some minor variants in the assignments of codons to amino acids that occur in specific places. The effect of these exceptions and provisos has been described as “baroque complexity”.

• maggieleber said, on August 5, 2013 at 5:31 pm

“The hassle is that it’s like a set of languages that never had a standards committee, with random hacks and kludges now frozen in place.”

• Jacques Le Roux said, on August 5, 2013 at 6:14 pm

I think it was more a comment about DNA sequences than Perl 😉

• maggieleber said, on August 5, 2013 at 6:26 pm

Shhhhh…. you’ll spoil the joke…..

• Roger Bigod said, on August 5, 2013 at 7:34 pm

When nature produces an acceptably functional system with a lot of ugly duct tape, we call it "evolution". When people do it, we have Perl (or Common Lisp).

39. Lee Courtney said, on August 7, 2013 at 3:11 pm

1. More historic APL data here: http://www.softwarepreservation.org/projects/apl

2. I also started my career with APL, in 1969 on an IBM 360/50 (while in the 7th grade).

3. If one looks at our capabilities with silicon, micro-architectures, and design tools there is an opportunity for a new golden age around array oriented languages.

40. Paul Robinson said, on August 11, 2013 at 1:12 am

RCA got slaughtered trying to compete in the mainframe business against IBM; they lost half a billion 1970s dollars doing so. They sold most of their computer assets to Sperry, which continued RCA's IBM-compatible line, the Spectra series. When Sperry became Univac, the Spectra 70 became the 90/60, probably one of the best IBM 360 workalikes. The operating system, TSOS, went two places: Univac kept it and renamed it VS/9; Siemens also got it, and it eventually became BS2000, which Fujitsu still sells today.

I look at what APL was, and I realize the same thing could have happened there. If APL had been more accessible to "ordinary mortals" and "normal people," instead of something so esoteric it's almost impossible to use without a graduate degree, it could have done so many things for people who use computers, and given them improvements in productivity measured not in percentages, but in orders of magnitude.

41. Gzorgumplatz said, on August 13, 2013 at 8:20 am

There was certainly a richness/reach problem with APL. APL was certainly very rich, in a way that Java isn't, but well out of reach until comparatively recently. IBM wasn't any help, as APL was used as a catalyst for mainframe sales into the 1990s. Very significant contributions to APL development were made by timesharing companies such as I.P. Sharp; however, this was at a time when computer time was scarce and valuable, and high profit margins made it interesting to partake in such developments. APL was, and still very much is, a vendor-dominated language.

The situation is somewhat better today: nearly all the vendors (Dyalog, APLX, IBM) supply free or nearly free evaluation or personal editions, and there are a couple of free experimental APLs (NARS2000, PLJsAPL). Certainly J has the right idea. Hopefully it's not too late.

I started using APL long before I had a high school diploma. What I liked about it then, and what I like about it now, is that it helped reduce clutter in that otherwise undersized and unreliable hard drive between my ears. It was really easy.

42. Roy Sykes said, on September 3, 2013 at 2:58 am

One nice thing about being an APLer today is the high compensation, mainly due to two factors: one is the relative paucity of APL programmers, and the other is their extraordinary productivity (equivalent to 2-5 C# programmers). (A third reason is that we can actually read APL code!)

43. Chris Wright said, on November 3, 2013 at 3:27 am

Great article – reminded me of some of Alan Kay’s talks about Doug Engelbart’s stuff and the Mother of all Demos…

Again, as you say, Lisp/Forth/APL/Smalltalk – all designed by Really Clever People, not committee!

44. fogus: The best things and stuff of 2013 said, on December 28, 2013 at 2:49 pm

[…] Ruins of Forgotten Empires — an article about the APL family of programming languages […]

45. matt said, on December 30, 2013 at 9:14 am

nice article, shame your ego can’t take criticism regarding your sexism, you bring shame on us all

46. Andrew scarfacedeb said, on January 9, 2014 at 3:35 pm

I’m 23 and I’ve been a web developer for 3 years, but I’ve never heard of APL before. I guess it proves your point.

But I'm familiar with Dennis Ritchie, Ken Thompson, Alan Kay, Donald Knuth, Frederick Brooks, and Douglas Crockford, to name a few.
I vaguely remember who invented the transistor and the microprocessor.

Usually, I read about the origins of computer languages and concepts that I'm aware of. That's why I know about Alan Kay (Smalltalk) or Douglas Crockford (JSON).
What would be the best topics to choose if I want to read about the people you mentioned? APL and LISP?

• rfc1394 said, on January 13, 2014 at 10:11 am

APL was invented by Kenneth Iverson. I remember that last name because it was Doris Day's character's last name in “With Six You Get Eggroll.” Iverson used APL as a mathematical notation to define the operations of the System/360 mainframe.

• rfc1394 said, on January 13, 2014 at 10:15 am

Off the top of my head, I know Shockley and Bardeen invented the transistor while at Bell Labs. They and one other person who worked with them got the Nobel prize for it. A quick check on Wikipedia says “John Bardeen, Walter Brattain, and William Shockley” invented it in 1947 and got the Nobel in 1956, so that's not too bad a batting average!

• Roger Bigod said, on January 14, 2014 at 5:15 am

If you're interested in the history of Lisp, the Wikipedia entry is a place to start. Also the entries for John McCarthy and Guy Steele.

I don’t think your education is deficient if you don’t know about any of this stuff, but there’s some cultural history connected with Lisp and AI. There was a period of excitement centered on MIT in the 60’s and early 70’s. Then a period of triumphalism, including a couple of Lisp machines with dedicated chips. Rule-based “expert systems” were going to take over the world. The euphoria ended in “AI winter” in the 80’s. Then DARPA decided to make the Lisp community adopt a single language standard, resulting in Common Lisp. Opinions differ on the merits of this product.

The historical comments in Norvig’s text on AI cover a lot of this. The collections of MIT hacker slang are fun reading.

47. Visto nel Web – 114 | Ok, panico said, on January 19, 2014 at 8:13 am

[…] Ruins of forgotten empires: APL languages ::: Locklin on science […]

48. Travis Oliphant (@teoliphant) said, on March 28, 2014 at 12:45 am

The power of Array-oriented computing is at the heart of NumPy which was inspired by J and APL and allows you to mmap and write code exactly as you describe it should be done (Pandas is based on it). Here are some slides that compare NumPy code to APL for the Game of Life: Slide 27 of http://www.slideshare.net/pycontw/largescale-arrayoriented-computing-with-python
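For readers who haven't seen the style, here is a minimal whole-array Game of Life step in NumPy, in the spirit of the slides linked above (a sketch of the technique, not the code from those slides):

```python
import numpy as np

def life_step(board):
    """One Game of Life generation as whole-array operations: no loops
    over cells, in the APL/J spirit (toroidal edges via np.roll)."""
    # Sum the eight neighbours by shifting the entire board.
    neighbours = sum(
        np.roll(np.roll(board, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return ((neighbours == 3) | ((board == 1) & (neighbours == 2))).astype(int)

# A glider: its population (5 live cells) is preserved every generation.
glider = np.zeros((8, 8), dtype=int)
glider[[0, 1, 2, 2, 2], [1, 2, 0, 1, 2]] = 1
print(life_step(glider).sum())  # prints 5
```

The entire update rule is three array expressions; the `for` loops that remain range over eight shift offsets, not over cells.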

• Devon McCormick said, on April 1, 2014 at 3:05 pm

NumPy may be inspired by APL, but slide 59, with its nested loops, only reinforces Locklin's original point; handling arrays as scalars inside a loop is a far cry from what APL or J or K does.

The more important point behind APL, that a language is a tool of thought, isn't even on the radar of 99% of coders: most programmers still see a language solely as an obstacle to be overcome rather than as a useful conceptual tool.

• Scott Locklin said, on April 6, 2014 at 1:25 am

Hey Travis; NumPy is good juju, and thank you for your work. Back when Paul Dubois was first writing it (whatever it was called in those days; Numeric I think), I knew it would end up replacing a lot of Matlab type stuff. I rewrote some legacy code in it when I was at LBNL myself. Noodled around with it for a project today.
On the other hand, it’s a little more like my (aborted, though others have done the same thing) attempt to graft lapack onto Clojure than it is like Q, J and the other APLs.

I actually came across Lush shortly after my Python salad days. It also has an interesting array type which is probably worth a look. Yann and Ralf are really smart guys. Nowadays, when I have the time, it's J for me.

I've considered using Python instead of R for machine learning projects, but Python's lousy options for packaging things make it a less attractive choice. R is a rotten language, but for hacking out a data problem where you might have to try a half dozen ideas to get a decent solution, it has no equal.

49. mario alejandro said, on April 24, 2015 at 8:43 pm

Hi, I'm in the process of building a toy language, and have read about K, and think it fits my idea (I wish to build a relational language, with arrays + relations (a.k.a. tables) as the main structures, and group-based operations instead of scalar ones), but the main blocker for me is the lack of a less cryptic explanation of the interpreter.

I'm looking for a minimal KDB interpreter (like I asked for on https://codegolf.stackexchange.com/questions/48178/implement-a-minimal-j-kdb-interpreter-reverse-golf), as happened with Scheme/Lisp, where minimal interpreters exist to understand the language from the inside.

Also, I'm looking for individuals interested in the idea of having a “KDB/relational-like” language with a more mainstream syntax.

50. Akiel said, on April 27, 2015 at 10:21 am

From my perspective as a young-ish guy (in my 30's) who recently started learning J: my interest in J arose almost as a visceral reaction to a disturbing trend I observed online: that of “software methodologists” attempting to publicly solve algorithmic problems (ranging from simple to modestly complex) using questionable methodologies (read: “test-driven development”) that reward ineptitude and encourage waffling about haphazardly in the hope of eventually stumbling into the solution, rather than succinct thinking and methodical problem solving.

There’s an “agile” expert who couldn’t program a Sudoku solver even after several weeks of applying his software development methodology (revealing his disconcertingly slipshod style of thinking in the process). And then there are these thesis-length blog posts written by programmers attempting coding “katas” (exercises) on trivial problems whose solutions could be expressed as a short one-liner in J or APL. (Google for the “diamond kata” as an example.)

After observing this trend for a while, it dawned upon me that a well-coded solution in a language such as J lays bare the true complexity of a problem, similar to how the entropy of a message tells us about its information content/compressibility.

And while I don’t claim to be a more skillful programmer than the category of people I just dissed, I know enough to realise that if I want to become better at it, I should strive to emulate and learn from the true masters of the craft.

51. Richard said, on June 18, 2015 at 2:16 pm

I coded APL for food for about a quarter century. I started in that sweet spot in the '80s that let me work progressively on 3×0 through z/OS platforms as well as distributed and desktop Windows boxen. The evolution from line input to FSM (VSAPL) to GDDM (APL2, Sharp APL) to full GUI (APL*Plus II, Dyalog APL) was an increasingly fantastic ride, both for the virtues of UI enhancements and for new language features.

Thanks for this paean to true genius(es) and your entirely appropriate and entertaining curmudgeonliness towards the pointlessly distractably weak-minded among your readership (they’ll prob’ly end up in management).

52. Vess Irvine said, on January 27, 2016 at 1:22 am

I disagree with everyone here who says APL is hard to read. IMHO it is ridiculously easy to read if you have a running copy of the function in question. Just run it in stop-vector (debug) mode, where the system suspends execution before every line (or selected lines) of code. Use cut in the debug window to parse the statement into its component parts (reading right to left in all cases) and paste each snippet into the immediate-execution window. You see the intermediate answers instantaneously. The system is suspended in the exact state it would be in if it were running in production.

Most APL systems have a DISPLAY function that gives you a quick visual look at the nested structure of each sub-expression or intermediate variable, vector, or n-dimensional array in a statement. Mixed-type variables (part numeric, part character, etc.) are easily comprehended with DISPLAY.

Use small data sets or arrays in this testing process.

In this manner it is literally child’s play to get to the heart of any APL function. You may still decide to rewrite the function from scratch if the original author was having a bad day.

Since APL functions are so concise, the whole thing sits within eye-view of one window while you are doing the cutting, pasting, snipping and executing of the parsed statements. The snippet of code you are pasting is never longer than one screen line; often it is just a few characters.

I believe people stating that APL is hard to read must only have experience viewing the code on a piece of paper. No serious APLer even tries to do that when studying legacy code. If you use all the tools that are available, especially the “live code” characteristics of an interpreter, APL is a cakewalk in both development and maintenance modes.

The debug mode is also terrific for introducing corrupted data into the execution flow. Just change an internal variable, while the system is suspended, to a known faulty value, then watch how error trapping is performed live. In this way I can exercise every line of code in a function, even the error-handling lines that are only used once in a blue moon.

I can walk through a 40 year old APL function with ease since the core language has had no reason to change from the get-go; it was so perfect back then.

Exception to the rule: APL is used to solve some very difficult mathematical problems. That degree of mathematical complexity adds difficulty to reading the code, but that is not APL's fault. Hopefully you have the math chops to understand what the program is trying to accomplish.

• Scott Locklin said, on January 27, 2016 at 9:09 pm

Certainly the longer I use J, the more it becomes a readable and powerful notation. The problem is getting people to adopt the notation. A nice one-line J verb can be readable to me, but most people prefer 10 pages of Java spooge, because that's what they're used to.
I'm pretty sure Iverson was looking to build a sort of MathCAD with a notation that is easier to parse and more oriented toward general computer-science problems. But ALGOL and ASCII won, and keyboard overlays become a fairly high barrier to entry.

• Bob Van Wagner said, on January 27, 2016 at 9:22 pm

I thought Iverson was just describing the 360 series and future IBM architectures in a convenient algebraic way, rather than with words, diagrams and flowcharts. Still, that doesn't explain where the Trees went.

• Scott Locklin said, on February 2, 2016 at 8:54 pm

Well, that was the original use case, but when APL turned into a language, it was certainly my impression that it became a method for expressing mathematical concepts. Mathy people don't like adopting radically different notations; hence the success of MathCAD among non-programmers who need computer-aided math (and the lesser success of things like Mathematica, which at least translates Mathematica expressions into something that looks like ordinary math notation). A revolution at the time, no doubt, and a continued source of inspiration to folks like me.

• Vess Irvine said, on February 3, 2016 at 8:12 am

Bingo. My two math tools are APL for heavy lifting and MathCAD, to remind me about calculus courses taken in college. When I am in the mood, I rework old Schaum’s Outline problems in MathCAD, for the heck of it. The last time I worked those problems was with a pencil, blue book and slide rule, 50+ years ago. The beauty of MathCAD is in the end you have fine documentation of how a problem was solved, which anyone can read and understand.

53. Alhadis said, on March 1, 2016 at 6:24 am

I’ll be 29 this year, and I absolutely love the crisp directness and conciseness of APL. The beauty of APL will not go unnoticed by all future generations, believe me.

54. Paul Robinson said, on March 1, 2016 at 4:02 pm

I tried posting this earlier but it didn't seem to take, so I'll try again.

While I have always liked APL and felt it had undiscovered power, its problem is that it never had a large base to support it. I shudder when I hear how the GCC compiler requires 20 F*#@%ing megabytes of memory to compile C++ code, even on, say, an IBM z/System mainframe. This is obscene. APL, as complicated as it was, could run on minicomputers in less than 100K and do productive work. Again, the two problems are the extremely steep learning curve and the complexity of the language. And that is why I fear a major capacity for software development, as Alhadis put it, “the crisp directness and conciseness of APL,” will be lost: it's not accessible to ordinary people, or to the journeyman programmers who could probably do fantastic things with it.

Java is a very sexy language because of its use on Android, and even today COBOL pays the bills and processes the transactions at probably 100% of the largest corporations (let's say all of the Fortune 100) and banks in the country, if not the world. But until we find a way to make APL a viable choice for programmers to use in real-life applications, it's going to remain one of those minor niche languages.

The answer is to get a decent interpreter, with libraries that support scripting tasks better than what is out there, making it easier to pick APL for serious applications. Think APL with built-in support for MPI, so a task could automatically be split across multiple machines as it grows more complex, to the extent it is serializable. Perhaps workspace libraries that duplicate the entire PHP library and provide all of its functionality. An APL with better programmer tools and libraries could do a lot to bring it back as a serious choice for application development.

But unless it gains better usability, accessibility and functionality it will remain an almost unknown niche language, and its improvements in productivity will be lost to us.

• Scott Locklin said, on March 1, 2016 at 6:28 pm

Pure APL has the issue of fonts and keyboards (I should probably make an APL keyboard). Buying actual hardware to write in a language is a tough sell unless you're already committed. I mean, I am motivated, yet I haven't put a dent in “classic” APL because of this.

Then there is J, which appears to be pretty close to classic APL as a language, but which is extremely dense for people coming from classical programming languages.

My partner and I are working on Kerf; it's not a general solution for now (it's aimed at certain data-processing problems), and we made it more approachable by making it look more like ordinary programming languages.

I reached out to Mike Jenkins of Nial; apparently he's partnered with John Gibbons to bring back Nial. They have it running on GPUs, which is a considerable achievement and a useful thing. Nial is written in something like a conversational-English version of APL notation. It may also get built out to work in a Spark-like way.

• Woodley said, on July 9, 2016 at 7:27 pm

The APL “mainframe” is alive and well, in the form of cloud computing. http://APLcloud.com is a step in the right direction. Also, APL is available on a variety of operating systems (from Windows to Unix, Linux, Apple iOS, and even the Raspberry Pi).

Dyalog APL is a full-featured language that has its roots in APL but extends the language to interface with the world of “modern computing.” It also offers a web-server “engine” written in APL, called MiServer (open source on GitHub).

A demo of APL through the Web Server can be seen here: http://tryapl.org/

Unicode solved half of the problem of the specialized APL character set: APL code can now easily be copied, pasted, emailed, stored in databases, and edited using any number of Unicode-aware tools.

The other half of the problem is typing in those specialized APL symbols. For that, you will still need to download and install an APL keyboard layout. However, this is fully integrated into the OS and acts like an alternative keyboard that you can “toggle” on and off, like any language pack.
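The solved half is easy to verify: APL glyphs are ordinary Unicode code points, which a few lines of Python can show (the expression below is just an illustrative APL one-liner):

```python
# APL glyphs are plain Unicode now, so APL source is just text that
# any Unicode-aware tool can store, search, and transmit.
expr = "+/⍳10"  # APL for "sum of the integers 1..10"
for ch in expr:
    # ⍳ is U+2373, APL FUNCTIONAL SYMBOL IOTA
    print(ch, f"U+{ord(ch):04X}")
```

Rendering and typing those glyphs is the remaining (keyboard-layout) half of the problem; storing and exchanging them is not.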

It is interesting to see that overall interest in APL programming is increasing: after a long slow decline, Google Trends shows interest in APL rising over the past five years.

Cheers.

• Scott Locklin said, on July 14, 2016 at 4:30 am

I’ve heard tales of the wonders of Dyalog; in fact, the guy who pointed out J to me in the first place has been using Dyalog since half past forever. Thanks for pointing out their cloud solution; that’s pretty interesting.

55. […] can find an article in Scott Locklin’s Blog about APL and other almost forgotten languages and the potential loss […]

57. […] This journey led me to appreciate the selection and use of computer-based tools for quantitative analysis.  Today, the two leading platforms in “Data Science 101” are Python (a general purpose language with statistical libraries), and the R Project for Statistical Computing (a specialized package for data analysis and visualization).  Both are open source projects, and free to download and use on personal computers.  I tried both.  R is a higher level programming language more similar to the APL programming language that gets work done more quickly.  For statistical work, I recommend R over Python (although APL is a theoretically better implementation). […]

58. aquarianpowerllc said, on May 19, 2018 at 11:50 am

I loved APL and would like to develop a next-generation APL compiler with new math and physics symbols and constants, to work toward some kind of Universal Field Theory. My LLC may have the money and brainpower to do just that. Anyone interested?

• Roy Sykes said, on May 19, 2018 at 2:56 pm

I suggest you look at Bob Smith’s NARS2000 project (www.nars2000.org)
before embarking on your effort. Bob has implemented many features I

• Robert Collins said, on April 8, 2020 at 7:42 pm

Good to see you Roy. I hope all is well.

• Roy Sykes said, on April 9, 2020 at 3:24 am

Howdy Robert. E-mail me at roy@roysykes.com and we'll reconnect.
I may be the oldest guy writing APL continuously for fun and profit since 1969!

59. George3d6 said, on July 29, 2018 at 7:32 am

It’s like you wrote an article entitled:

Why sports cars today are worse than sports cars in the 80s

And started with the comparison:

“A Kia Rio doesn’t stand up to a Bentley Mulsanne”

Redshift, based on benchmarks I ran on multiple occasions against MonetDB, Druid, MariaDB ColumnStore and ClickHouse, doesn't stand up as a column-oriented DBMS. It's a shit database, plain and simple: it will give you < 15% of the performance of any of the above at the same hardware cost (keep in mind it's a hosted service, and I'm talking performance on simple aggregate queries here: some SUMs, COUNTs, DISTINCTs and GROUP BYs).
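Whatever one makes of the benchmark claims, the query shapes named here (SUM, COUNT, GROUP BY) are exactly the ones a column store reduces to whole-array primitives, which connects back to the article's APL theme. A toy NumPy illustration, purely schematic and not how any of the databases above is implemented:

```python
import numpy as np

# Toy columnar layout: each column is a contiguous array, so an
# aggregate touches only the columns it needs.
region = np.array([0, 1, 0, 2, 1, 0])         # group key, dictionary-encoded
sales  = np.array([10., 5., 7., 3., 8., 2.])  # measure column

# SELECT region, SUM(sales) ... GROUP BY region, as one vector operation:
totals = np.bincount(region, weights=sales)
print(totals)  # per-region sums: 19.0, 13.0, 3.0
```

The GROUP BY collapses into a single pass over two flat arrays; no row objects, no per-row dispatch.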

You want to shit on modern technology ?

Fine, feel free to.

But bring benchmarks and LEARN what modern technology is to begin with.

Coming out of your old troll cave trying to appeal to authority, without having a semblance of understanding of the subject you are discussing, is not helpful. It's annoying to some and misleading to others.

• Scott Locklin said, on July 29, 2018 at 3:14 pm

I actually think ClickHouse is OK and have said so. I'm pretty sure you did something wrong with Redshift, though.
FWIW, you do realize I have written things like this, right?

60. charlesleecourtney said, on March 29, 2019 at 10:31 pm

About every 5 years I stumble across this cornucopia of (computer science) wisdom.