The Library of Imaginary Machines
May 03, 2006
Both JRuby and Jython have experienced stop-start development. Why is that?
Maybe language development isn't sexy anymore. For a lot of developers now, I think programming languages just aren't all that interesting as a target abstraction, or for playing around with. My generation (graduating in the late nineties) might be the last that was actually interested in programming languages as an end in themselves, and even then platforms like J2EE, CORBA, and the Web were much more interesting to many. Yes, we argue a lot today about which language is better, but that's mostly trash talk about The Man and what we use to get our work done. I'm talking about what we play with outside of work for fun - it tends to be frameworks and platforms, not programming languages, which are now a means to an end, something you put up with. Time was I would play around with Antlr, or use RelaxNG to verify other languages, or hack an HTTP subset. For fun. Now I fool around with Rails and Django and Web protocols, which are different classes of machine to a language machine.
Perhaps the relative decline of the programming language is because developing programming languages is difficult, something we might classify these days under systems programming. It's more difficult than building a framework or a library. There's less tolerance for inconsistency in a language than in a framework. Internal illogic will be found out. Yes, someone can put something together quickly and it can be useful and fun, but really finishing it out, making it performant, the last 20% - that's a big ask.
In software there's always some machine you're dealing with that is made from another machine. There are Turing Machines, Von Neumann Machines, Register Machines, State Machines, Automata, Virtual Machines, Pools, Caches, Timers, Schedulers - even languages are machines. Sun has landgrabbed the term Virtual Machine for Java, but really there have always been virtual machines.
As a consequence, using the current machine or language to implement a more relevant machine or language is a powerful technique. Since new machines can take time to figure out and are hard to estimate, they make line-of-business managers nervous - they smack of over-design, over-engineering, and over budget. As Terence Parr put it, "Why program by hand in five days what you can spend five years of your life automating?" But when you have one, it's like going to battle with an AK-47 instead of a musket.
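As a toy illustration of building one machine on top of another, here is a hypothetical stack machine implemented in Ruby. Everything here - the `StackMachine` class, the `:push`/`:add`/`:mul` ops, the program encoding - is invented for this sketch; the point is only that the host language becomes the substrate for a new, more task-shaped machine.

```ruby
# A toy stack machine implemented in Ruby: one machine built on another.
# All names and opcodes here are made up for illustration.
class StackMachine
  def initialize
    @stack = []
  end

  # Run a program given as a list of [op, arg] pairs.
  def run(program)
    program.each do |op, arg|
      case op
      when :push then @stack.push(arg)
      when :add  then @stack.push(@stack.pop + @stack.pop)
      when :mul  then @stack.push(@stack.pop * @stack.pop)
      else raise "unknown op #{op}"
      end
    end
    @stack.last
  end
end

# (2 + 3) * 4, expressed as a program for the new machine:
result = StackMachine.new.run(
  [[:push, 2], [:push, 3], [:add], [:push, 4], [:mul]]
)
# result => 20
```

Trivial as it is, programs for this machine are now data - they can be generated, optimized, or persisted - which is exactly the force multiplier the paragraph above is getting at.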
The gun analogy is crude, but apt. There is no force multiplier in software development like a new machine. Once you start writing software this way, in terms of better machines, it's very seductive. New machines are silver bullets. Power to get things done is what gets people fired up about Rails as a DSL, or MDA, or Code Generation. I suspect current heady interest in Domain Specific Languages (DSLs) is driven by the "power tool" notion.
When Joel Spolsky talks about Leaky Abstractions, he's talking about the situation where an underlying machine, one which you should ideally pay no attention to, pops through your present machine into your coding reality, Lovecraftian style. The word we use to describe this hiding of machines is "transparency", which is one of those technical terms that is correct, but bone-headed. Most people take the word "transparent" to mean you can see through to something, not that it's hidden. What we really mean here is "opacity", the ability of one machine to hide the fact that it is implemented in terms of another machine. It's when the underlying machine (or machines) seeps through that there *is* transparency. That's when the leaky abstraction occurs and problems begin.
I like the "machine" term better for this kind of discussion - it has less psychological baggage and consequent politicking than "language". For starters you don't have to buy into Sapir-Whorf theory to appreciate the words of people like Steve Yegge or Paul Graham or Sean McGrath on this subject. It also keeps the discussion focused on operational and utilitarian matters. "My machine is better than yours" has a fighting chance of a useful outcome. "My language is better than yours" is a Godwin attractor.
Recently a Rails request ran on the JVM. That's the kind of thing that can drive interest. JRuby becomes the cost of doing business for that VM; an underlying machine to support the Rails machine/DSL. JRuby itself is of limited interest - whereas a clone of Rails on Java turns heads. Hence JRuby gets done.
Support for dynamic languages on the JVM will probably work itself out. Instead it might be concurrent programming that results in migration away from the JVM. "Nonsense", you say - "the Java threading model clearly rocks". Well, yes and no. Some have claimed that Java, with its shared-memory concurrency model, does threading right. I'd have to agree - of the set of credible commercial languages, Java is the winner if you are going to do threads. But it's a stretch to say its shared-memory model is the right approach to concurrent programming altogether. Languages like Erlang and Mozart are arguably better, albeit currently 'impractical for commercial use'. The only Erlang book I can find is from the mid-nineties and currently costs £47 - I've been trying not to buy it for a few months.
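To make the contrast concrete, here is a minimal sketch of the two styles in Ruby (chosen only because the post discusses JRuby; the code is mine, not the author's): shared memory guarded by a lock, versus Erlang-flavoured message passing through a mailbox with no shared mutable state between the workers.

```ruby
# Shared-memory style: several threads mutate one counter behind a Mutex.
counter = 0
lock = Mutex.new
4.times.map {
  Thread.new { 1_000.times { lock.synchronize { counter += 1 } } }
}.each(&:join)

# Message-passing style: workers share nothing and communicate by posting
# messages to a Queue; a single reader tallies them after the workers finish.
mailbox = Queue.new
4.times.map {
  Thread.new { 1_000.times { mailbox << 1 } }
}.each(&:join)
total = 0
total += mailbox.pop until mailbox.empty?
```

Both versions end with a count of 4000, but the second has no shared mutable state to lock at all - which is the property that makes the Erlang approach attractive as core counts climb.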
Multi-core is the default production configuration for CPUs now. I trialed a tiny laptop a few weeks ago, and the most surprising thing about it was that it had a dual-core CPU. I'm no metal head, but given that multiple cores are turning up in machines optimized for road warriors and office apps, and if people are right about the end of Moore's Law (bye-law?), then the next major machine/language argument could be over concurrency models, with shared memory playing the incumbent, just as static typing and OO do today in language design. Productivity and expressive power might be recast in terms of your ability to work with the concurrent architectures being imposed on you.