Monday, May 12, 2008

Logical Fallacies Considered Harmful: You Should Learn New Languages

Gustavo Duarte recently posted an article about the folly of dabbling in new languages.

The post is an earnest one, and thought-provoking, but I realized this evening that it is a masterpiece of logical fallacy.

Problem #1: Begging the Question

Gustavo argues that "dabbling" in new languages yields superficial results for your career. To dabble is to "work at something in a superficial manner". So the argument is that a superficial investment in something yields superficial results. Hmmm... this seems like begging the question to me.

Note that I don't mean the popular usage of "that begs the question: yada yada", but rather the classical sense of assuming, in your premise, the very thing you set out to prove.

Problem #2: The Strawman Argument

Gustavo condemns the Pragmatic Programmers' advice to "learn a new language every year". But he goes on to set up that advice as though they are saying "Open an editor, screw around with some syntax, and wait for the enlightenment to hit you". Then he knocks that strawman down, with ease.

I just checked the Prag Prog book: it is relatively terse on this point. I may be guilty of my own "Ironman Argument" here but, having seen Dave Thomas speak on several occasions, I doubt that they intended for us to take a full year to merely mess around with some core language syntax. My guess is that they would agree with the examples in the next item.

Problem #3: False Alternative

Gustavo seems to set up two choices: either noodle around aimlessly with a new language, or learn something specific with your current language, such as algorithms, math, or AI.

Wow. We have hit the trifecta now with the fallacy of false alternative.

There are more than two choices. Let's assume that we only know Java. How about these examples:

  • Write an implementation of a Red-Black tree in Scala.
  • Compute Pi to a million decimal places in jRuby. Then check an online source to see if you are right.
  • Develop a simulator for Tic-Tac-Toe that uses genetic algorithms to evolve a winning strategy. Use Groovy.
Sounds fun to me! Personally, I would learn (or dust off) something from all of these examples. And we would learn a new language along the way.
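
If anyone is curious what the first exercise might look like, here is a rough, untested sketch in the Okasaki functional style. All of the names (Color, Tree, Node, RedBlack, balance) are my own, invented for illustration; nothing here comes from a library:

  // Illustrative sketch (my own names): Okasaki-style red-black insertion
  sealed trait Color
  case object Red extends Color
  case object Black extends Color

  sealed trait Tree[+A]
  case object Leaf extends Tree[Nothing]
  case class Node[A](color: Color, left: Tree[A], value: A, right: Tree[A]) extends Tree[A]

  object RedBlack {
    // Insert x, then repaint the root black (a red-black invariant)
    def insert[A](x: A, t: Tree[A])(implicit ord: Ordering[A]): Tree[A] = {
      def ins(t: Tree[A]): Tree[A] = t match {
        case Leaf => Node(Red, Leaf, x, Leaf)
        case n @ Node(c, l, v, r) =>
          if (ord.lt(x, v)) balance(c, ins(l), v, r)
          else if (ord.gt(x, v)) balance(c, l, v, ins(r))
          else n
      }
      ins(t) match {
        case Node(_, l, v, r) => Node(Black, l, v, r)
        case Leaf             => Leaf
      }
    }

    // Okasaki's four rebalancing cases: fix any red-red violation
    // introduced by insertion, pushing the red node up the tree.
    private def balance[A](c: Color, l: Tree[A], v: A, r: Tree[A]): Tree[A] =
      (c, l, v, r) match {
        case (Black, Node(Red, Node(Red, a, x, b), y, cc), z, d) =>
          Node(Red, Node(Black, a, x, b), y, Node(Black, cc, z, d))
        case (Black, Node(Red, a, x, Node(Red, b, y, cc)), z, d) =>
          Node(Red, Node(Black, a, x, b), y, Node(Black, cc, z, d))
        case (Black, a, x, Node(Red, Node(Red, b, y, cc), z, d)) =>
          Node(Red, Node(Black, a, x, b), y, Node(Black, cc, z, d))
        case (Black, a, x, Node(Red, b, y, Node(Red, cc, z, d))) =>
          Node(Red, Node(Black, a, x, b), y, Node(Black, cc, z, d))
        case _ => Node(c, l, v, r)
      }
  }

The point isn't the algorithm per se: writing it pushes you into case classes and pattern matching, which is exactly the kind of learning-the-language's-mindset I have in mind.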

Problem #4: When timeless arguments aren't timeless

This isn't a formal logical fallacy, but Gustavo hints at some classic, timeless elements such as Lisp and the "Goto Considered Harmful" article.

One might think that his arguments are timeless: nope. He asserts that:

In reality learning a new language is a gritty business in which most of the effort is spent on low-value tasks with poor return on time invested. You need to get to know the libraries, struggle with the environment, find or tune a text editor, look for tools, and so on.

This was true in the past, but with languages on the JVM and the CLR, we are in a golden age. As proof, consider the three examples above: each of those languages is a simple jar-file drop away, runs on the JVM, and lets you use the familiar Java libraries. (Ted Neward has given entire keynotes on this idea.)
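
Here is a quick sketch of what I mean, in Scala (my own toy example, not from Gustavo's post or anyone's keynote): the stock java.util and java.math classes are right there, with no bridging code at all.

  // Toy example: plain Scala calling the standard Java libraries directly
  import java.util.{ArrayList, Collections}
  import java.math.{BigDecimal, MathContext}

  object JavaInterop {
    def main(args: Array[String]): Unit = {
      // The very same collections API you already know from Java
      val names = new ArrayList[String]()
      names.add("Groovy"); names.add("Scala"); names.add("JRuby")
      Collections.sort(names)
      println(names)       // [Groovy, JRuby, Scala]

      // Arbitrary-precision arithmetic, straight from java.math
      val third = BigDecimal.ONE.divide(new BigDecimal(3), new MathContext(50))
      println(third)
    }
  }

The same trick works from Groovy or jRuby; only the syntax changes.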

I will grant that the debugging/IDE facilities won't match Java's, but this argument is completely bogus.

Problem #5: Confusing Momentum with Inertia

Again, this isn't a fallacy per se, but I take issue with this:

There’s another pernicious effect to language hopping: it hurts your momentum. Every time you rely on your current languages you get a little better.

Last year, I blogged about using Groovy to interrogate an OODB. My team still uses that tool. Anyone on the team could have written it in Java, but in the span of 5 years, no one had done so.

That's not momentum. That, my friends, is inertia. My little Groovy program is one of the biggest contributions I've made to that team: people ask me for it all the time.

The Upshot

Gustavo ends by recommending that we learn languages that are orthogonal to each other. I agree completely. And, for the record, I agree that merely dabbling is dangerous: I have probably been guilty of it myself.

But I heartily recommend learning a new language. And learning as much as you can about logical fallacies. They are everywhere (in this post?).

12 comments:

Unknown said...

Hi Michael,

Here are my replies, following roughly the numbering of the problems you identify:

#1

While by the dictionary's definition "dabbling" might be implicitly bad, dabbling in programming languages is viewed positively by nearly all programmers. In fact, under point #3 you suggest dabbling in a positive light! If I write an implementation of red/black in Scala, _that is dabbling_. You learn just enough of the language to write X, have a ton of fun doing it, and move on.

Fun? Sure, it's definitely fun as you point out. An efficient way to learn? No.

So even as you say something is implicit in my premise, you contradict it. So no, this is not begging the question.

#2: Again, the "one language a year" thing is more reinforcement for this idea that toying around with different languages is a good way to become a better programmer. I think there are better ways, hence my post.

And if you're learning a new language a year you're:

a) Only dabbling, or
b) Not learning much else, or
c) unemployed

All of which are bad, except perhaps c.

So yes, I think that's bad advice. I also don't "set up" the argument in the way you describe. I am arguing the general case that learning languages casually is an inefficient way to grow as a programmer. I don't think it's fair to say I'm attacking a straw man.

#3: I also give the alternative of learning an orthogonal set of languages and getting really good with them, AND I suggest that there are other more important things to learn instead of toying around with languages (examples in the post: security, business, architecture, etc).

Concerning "false alternative", it is only false if your time is infinite. My whole argument revolves around trade offs. If you learn red black in Scala, you're spending time learning Scala that you could spend in something else. Would that something else be better? Maybe. I think it could often be.

It is clearly an alternative: how to best spend the limited time you have to learn?

So again, not a fallacy.

#4: If you're only learning a language enough to compile and run it in the JVM or the CLR, you're almost surely only dabbling.

Ruby, Python, Lisp, Perl: each of these languages has _its own_ way of doing things, which you must learn if you're even remotely serious about the language. And libraries go way beyond the bare essentials we get in Java or .NET. Same for tooling: each language comes with its own set of tools. And of course high-productivity things like the perfect emacs mode or Vim scripts or ReSharper templates and so on don't come automatically either.

So no, the argument is far from totally bogus. The syntax of a language is the tip of the iceberg.

#5: That is a personal choice. I am constantly trying to evolve and become better in the languages I have in my set.

I try to read high quality source code in the languages I know, I improve my tools, I try to improve my environment, etc. I'm constantly getting better at them and the end result is that I can develop _extremely_ fast in the languages in my set. I feel this approach has worked very well for me and I see it as momentum, not inertia.

cheers,
Gustavo

Weavejester said...

A year is plenty of time to learn a language, Gustavo, even if you're employed. This is especially true if you're already familiar with several programming languages.

I think you vastly overestimate the time taken to learn a new programming language, and greatly underestimate the amount of overlap between programming languages.

How many programming languages are you familiar with? If you don't learn a programming language every year, how can you say how long it takes? If you don't know at least a dozen languages, how can you say for sure that it doesn't get a lot easier, the more you know?

I'd guess most of your critics have plowed their way through a fair number of languages, and presumably therefore know more about it than you. Am I wrong?

Anonymous said...

Well, I would take a middle ground between "learn a new language every year" and "learning new languages is useless".

Learning new languages IS useful, but learning new languages that are very similar to your current language is not of much value. Also, once you learn languages of 4-5 different varieties, learning new languages is not useful enough to justify spending the limited amount of learning time we have every day.

A programmer who "knows" 10 languages but hasn't written anything major in any of them, or who doesn't know, let's say, how compilers or filesystems work, is kidding herself.

Unknown said...

@Rahul: that's _exactly_ what I think.

@Weavejester: I have programmed substantially (i.e., thousands of LOC or more) in Ruby, C#, JavaScript, SQL, Perl, C, C++, Lisp, Mathematica, Matlab.

I've done a fair amount of programming in a number of other languages, some of them full-blown and others scripting languages like bash, PowerShell, awk, etc.

Regarding the time needed to learn a programming language, absolutely, you can get hello world going in no time at all. I actually have an easy time with languages, and can usually jump straight from the spec (or the grammar with some discussion) to programming. What takes _a lot_ of time is proficiency and productivity.

The initial phase is fun; my point is just that I don't know that it really is all that beneficial, compared to other things like reading high-quality source code (to improve your programming) or learning from other domains that I think are very important for programmers.

But anyway, I've been rehashing this argument for a while now, so I'll let it rest.

cheers,
Gustavo

Weavejester said...

Fair enough, Gustavo. I honestly haven't noticed any problems in productivity when swapping to a new language, except with Haskell, and that was a bit of a paradigm shift.

Frankly, learning Rails took far more time than learning Ruby, for me. I find the language and standard library take only a small proportion of the time to learn, in comparison to the time spent on a project.

Michael Easter said...

Thanks for your comments everyone.

@Gustavo

To really hash this out, we'd have to have formal definitions, as we disagree on a fundamental level.

I invite other readers to read all of this and decide for themselves, with their own subjective sense of the various terms.

I will say that in my world, dabbling is learning operator precedence and other syntactic basics. Writing a sophisticated data structure in a given language's mindset (e.g. in the Scala 'style') is non-trivial and _not_ dabbling. True, it is not a full-blown app but it has real value. Is it inefficient? That's subjective. Is it wasteful? (Per your title) No.

With respect to the JVM and libraries, Scala and Groovy only exist on the JVM! So I maintain that it is possible to learn a truly distinct language (from Java) and still use the well-known libraries.

Michael Easter said...

P.S. re: the title. I have taken some heat for going with the clichéd riff off "Goto Considered Harmful".

Duly noted: I tend to overuse that lick. However, this time it was actually intended as a play on Gustavo's title (which itself echoed the classic article).

Unknown said...

@Michael: agreed.

I have a lot of math background so I relate well to your point about formal definitions, and taking things one step at a time until we either find an assumption we don't share or a logic mistake in either of our points.

This is also hair-splitting to some degree. There are many things that are important to our learning as programmers, and surely learning languages is one of them. When I said wasteful, I didn't mean it as in "watching pr0n" wasteful, but as in "there may be a better return elsewhere". I think that because learning languages is fun, it sometimes gets overdone by talented programmers, to the detriment of other things.

But alas, as you say we'd need to step back and be more precise if we wanted to fully hash it out. Who knows, maybe a conference some day...

cheers,
Gustavo

Anonymous said...

Michael Easter said: "I invite other readers to read all of this and decide for themselves, with their own subjective sense of the various terms."

OK. I fear I have to side with Gustavo. Over the years I have tried (in my own time) to learn the following languages: Ada, x86 assembly, C#, Erlang, Java, Ocaml, Python, C++, Haskell, Perl, Ruby, C, Lua, Prolog and Scheme. Only in Java, C and Haskell can I say that I have done things that might perhaps be interesting for others besides myself. The rest varies from reinventing basic data structures and algorithms (in Ada that was sort of a must, since there were almost none in the Ada95 standard library) to practically nothing.

But I have also learned some language-independent tools. From my private learning-projects directory, I count Alloy, Bison, R, HOL, LaTeX, and maybe CSS as tools, and even though I've only dabbled there as well, I feel it has been more worthwhile: although I can't do anything very sophisticated in R, I now know what it is and roughly what it can do, and I can return to it if I see an opportunity to use it. To some degree this also applies to "real" languages: if I need to write an app with an embedded scripting language, I know that Lua and Scheme may be worth considering. If I need to write something real-time and critical (hope it never comes to that...), I can tell people that there exist these things called Ada and SPARK.

Although I haven't tried Gustavo's approach so much, I feel like I've indeed wasted a lot of time that could have been better spent elsewhere. From a general learning perspective, I probably should have stuck with C, Java and Haskell.

Michael Easter said...

@Harald.

Really? You mention sticking with C, Java, and Haskell. You didn't get anything from Scheme and (Python or Ruby) that would add them to your orthogonal set?

I find that hard to believe, frankly.

Michael Easter said...

@rahul

There is one other argument for learning the 4th or 5th language: to have the vocabulary so that one can join the Great Conversation.

By that, I mean that any given language is probably having a debate about where it is going: a Great Conversation. IMO, it is worth picking up enough of the language so that we can track the arguments.

For example, Python and Ruby are quite similar at an abstract level. However the terms and idioms are different. One might argue that it isn't worth learning one if you know the other (the orthogonality argument), but if one wants to participate in the Great Conversation, of say Python 3000, then it is worth it to pick it up.

Now, on this point, I do concede Gustavo's point that our time is not infinite. One may pass on this idea due to priorities and finite resources, but it is still valid.

Andreas Krey said...

Gustavo: Fun? Sure, it's definitely fun as you point out.

Well, that's the point! Why are you programming when you're not having fun doing it?

An efficient way to learn? No.

Efficient way to learn what? Going downhill is not efficient when you want to get to higher ground, but it is needed to get out of a local maximum, and even to see whether there are others.

Honing your hammer isn't helpful when you don't recognize screws.

I think you made it less than perfectly clear that you are arguing from an already-diverse set of known languages.

For example, I use Ruby mainly as the new Perl (which I never got into, for some reason), and I learned Erlang by accident on a job, where I also, more or less by accident, made some not-quite-trivial modifications to existing modules. I wouldn't consider myself quite fit for designing new Erlang software, though.

I also think that libraries are overrated. Somehow I always end up in corners where I mainly have to deal with locally-grown stuff.

And finally, to me the point of learning new languages is to see that there are occasionally new ways of handling the basics of coding: type inference, closures, quasiquotations and generally foreign parsers; things that one can merrily ignore by staying in a fixed (small) set of languages.