A recent post has been deleted from this space. It was a sincere, if boring, piece, by a guest writer, with an underlying message on the importance of logging out of all browsers on shared computers, even when at home on holidays.
In its stead, I offer this: my thanks to you, the CodeToJoy nation, for another fun year, and best wishes ahead in 2010. If you are travelling, may you be safe and on time. Keep laughing (often, that's all you can do.)
I've been meaning to write for a while; this post celebrates analogies and provides a few of my current favourites.
Background
I've been making analogies for a long time. My favourite teachers made them, and as an undergrad, tutoring others in calculus, they came naturally to me. To this day, I'm always trying to frame problems in some other context, whether for non-IT people or domain experts. I'm convinced that there is a biochemical reward for nailing one, e.g. when another techie says "that's exactly right". (The reward is probably a cousin to that of making an excellent pun.)
To be fair, Michael talks about romantic analogies to software engineering, whereas many of my analogies have the granularity of problems in software engineering. But no matter: be it an analogy, metaphor, or allegory, you can count me in. This is especially true for keynote presentations, which should be thoughtful and imaginative. If a tech talk is a pop song, a keynote speech is a symphony. (There I go again, but hey this blog was started on a musical analogy.)
Now, I happily grant that analogies have their place. If you're working at a start-up, frenetically trying to hit some deadline, then no one wants to hear how your design was inspired by a Chopin nocturne. There's a time for straight-talk. I'm with Michael there.
With that background, here are some ideas that have lit my fire lately. They may or may not be pure analogies, but they have sparked my imagination.
Mozart's K 522 - A Musical Joke
Wolfgang wrote a piece, A Musical Joke (K. 522), that satirizes clumsy composition. It intentionally uses common "mistakes" made by inept composers.
Ever since the vending machine shoot-out at the Lambda Lounge, I've wanted to write a spoof version in Java: my own modest K. 522. Its ostensible goal is a functional-style implementation of the vending machine spec: no side-effects or mutable state.
However, in my spoof, the poor author goes off the tracks, because all computation is done by throwing checked exceptions. The program will be absolutely gorgeous in its wretchedness!
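For a sense of the gag, here is a hypothetical sketch (not the actual project) of what the poor author might produce: the code proudly avoids mutable state, yet the only way any value ever escapes is by being thrown as a checked exception.

```java
// Wretched by design: all computation is performed via checked exceptions.
class ResultException extends Exception {
    final int value;
    ResultException(int value) { this.value = value; }
}

public class MusicalJoke {
    // "Returns" the sum of a and b... by throwing it.
    static void add(int a, int b) throws ResultException {
        throw new ResultException(a + b);
    }

    static int compute(int a, int b) {
        try {
            add(a, b);
        } catch (ResultException e) {
            return e.value;  // the only way a value ever escapes
        }
        throw new IllegalStateException("unreachable");
    }

    public static void main(String[] args) {
        System.out.println(compute(2, 3));  // prints 5
    }
}
```

Gorgeous in its wretchedness, as promised: every arithmetic step demands a try-catch.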
I've started this project, but unfortunately it quickly became so painful that I couldn't bear it. However, I hope one day to have an entire theme of K. 522 projects over on github.
Mondrian and De Stijl
I'm no connoisseur of fine art, but I've taken a class or two and enjoyed learning about its history. My major discovery (blogged elsewhere) is that much of art occurs in reaction to a prior context. It dawned on me that at a gallery, a new exhibit can often be a major "screw you" to the establishment of the time. What appears to be a simple painting, when viewed in context, can be startlingly rebellious or profound.
As a quick primer on De Stijl: it's a Dutch art movement founded circa 1918, intended as an intellectual response to the chaos of war. As noted in their manifesto (!), the artists sought to find inner harmony within themselves and universal laws of simple geometry.
I have a few thoughts on this. The obvious one is that apparently everyone needs a manifesto, from artists in 1918 to software artisans in the 21st century (exhibits A, B, and C).
The second reaction, from a software standpoint, is REST. Agreed, it's clearly insane to compare WS-*, SOAP, etc with the horrors of World War I, but consider how simple this URL is:
http://esite.com/product/show/1
Call me crazy, but I genuinely sense a kind of abstract connection to the orthogonal lines in Mondrian's work, especially against the busy, noisy chaos of Web Services. Imagine standing outside a giant, enterprise-y WS conference with that URL on a large placard. Reaction to a prior context.
Finally, recall that the REST movement similarly came from an explicit intellectual genesis, famously being a PhD thesis.
Note that, above, I said abstract connection: this is not equating art to software engineering. They just rhyme for me, in some weird way.
Kandel and Aplysia
I'm currently reading an excellent book, In Search of Memory, by Eric Kandel, a Nobel laureate for his work in neuroscience. Part biography and part history of his scientific journey, it is wonderful.
In a key chapter of his career, Kandel sought to understand the biological mechanism behind short-term memory. In the early 1960s, the conventional wisdom favoured mammals over invertebrates for research. The thinking was to stay as close to the goal (the human brain) as possible.
Kandel went in the other direction. He wanted a reductionist approach that explained short-term memory in a minimalist setting. He ultimately chose a sea slug, Aplysia californica. The species matched his instinctive desire to keep things simple: Aplysia has a small number of neural cells, and they are quite large (i.e. easier to study). It also has a simple reflex (withdrawing its siphon, and inking) which ultimately proved to respond to forms of learning (e.g. habituation and sensitization).
It was a wise choice. By 1969-1970, Kandel and his colleagues had discovered several major principles of the cell biology of learning and memory. In essence, they built an entire conceptual framework for learning and memory, and were able to verify it in the "laboratory" of the Aplysia. The humble sea slug was a gold mine for his career and for science.
The work is glorious in its own right, but my take for us humble software developers is this: Kandel not only discovered terrific scientific ideas, he also provides an object lesson in how to do research. This is the absolute embodiment of KISS.
For example, imagine that you are struggling with a concurrency concept in a large project. Or the precise mechanism of transaction propagation in Spring. Or the dreaded GridBagLayout in Swing. You may be resistant to starting a tiny new project -- your very own Aplysia -- for the sake of isolating the exact issue of concern.
"Who has that kind of time?", you may cry, as I often have. Well, if you want the real answer, make the time. Think of Kandel. Reduce, reduce, reduce: KISS.
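To make that concrete, here is a minimal "Aplysia" for the concurrency case: a hypothetical sketch (not from any real project) that isolates the lost-update problem of an unsynchronized counter, with nothing else around it.

```java
// The smallest program that exhibits the question at hand:
// two threads incrementing a shared counter without synchronization.
public class RaceAplysia {
    static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable bump = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;  // read-modify-write: not atomic
            }
        };
        Thread t1 = new Thread(bump);
        Thread t2 = new Thread(bump);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Usually less than 200000: the lost-update problem, isolated.
        System.out.println(counter);
    }
}
```

A few dozen lines, and the mystery from the big project is now on the lab bench. Reduce, reduce, reduce.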
The Upshot
Though I can enjoy art and music as pure pleasure, I love finding parallels between disparate subjects. Often, the best way to convey these psychic fingerprints is through analogy. So, sign me up.
As mentioned above, I realize these are abstract connections. I don't fancy myself as a composer, artist, or neuroscientist. I don't equate my career to such enterprises to make it more glamorous. Thankfully, I already find my career to be glamorous, as it grants me elements of both art and science.
Dean Wampler wrote an excellent article about the importance of cultivating your career, and provides some ways to do so. A thoughtful piece with some great ideas.
In classic piggyback style, I thought I might add some suggestions. I'll keep this brief, as I fear an audience mismatch: those who might benefit most from these kinds of blog posts are the people who don't read blogs. This means, sadly, we may well be yammering to ourselves in a cyber echo chamber.
Developing Experience
Dean suggests trying new technologies at work (testing is a great opportunity), or an open-source project.
One other option is a public repository such as GitHub. In my experience, posting a project on a public server really forces you to dot the i's: you want a good build process, unit tests, clean/idiomatic code and so on. The public nature of the effort removes the laziness that can happen on homegrown "Sunday night" projects.
Another option is to build a website. If you want to learn Rails, then really use it. Hosting is cheap, as a career investment.
You might protest: What would I build? All the great ideas are taken! Well, first remember that you aren't trying to get rich; you're trying to get experience. That said, it will take some creativity. All too often we concentrate on creative technical solutions but never apply that creativity to our careers. In this instance, there are lots of ideas: consider a single-service site, help a volunteer organization, etc.
If you play it right, you might learn something and make a name for yourself in the process. A friend of mine wrote Online Task List: he learned a ton about web development and now has hundreds of real-life users (including me).
Use Technology
Everyone talks about social media, but there are many other technologies such as screencasting and the mighty YouTube.
As an aside, I am stunned that it is 2009 and I see 6-page résumés: each job has the same lengthy details, no matter if it is a senior position in 2008 or an intern level gig in 1999. There is a painful list of technologies that includes things like log4j. I realize résumés are tuned for search engines, but no one cares that you know log4j!
What I'd like to see is someone on YouTube standing at a white-board, taking 5-10 minutes to explain the inverse=true concept in Hibernate. Or your definition of unit tests versus integration tests. What is your favourite data structure? Anything! If you're good, you will shine through.
If I received a minimalist résumé with contact info and a URL to your YouTube vid, I guarantee it would have my attention. (Here, I'm assuming a thoughtful, tailored cover letter as part of the offering.)
The Upshot
In both solving problems and cultivating careers, don't discount creativity.
All of this may or may not matter to you: "A Longer Post" is an anagram of "Strange Loop". I live in St Louis, MO. Some speakers and the organizer are friends of mine. I work for a sponsor of the recent Strange Loop conference.
This is a continuation of some thoughts on the conference. See many more resources here.
Hamlet gave a splendid talk on the power of manipulating Groovy's AST. As a warm-up to Groovy, he showed a quine: absolutely perfect for this conference. Looking back, I'm surprised they weren't all over the place.
Hamlet's main example was to introduce code into the AST during one of Groovy's compilation stages. I was reminded of a comment on Java Posse where someone said that AOP had to be invented in Java to solve a particular problem, and that the problem simply didn't exist in dynamic languages. This talk exemplified this in spades.
As an aside, Hamlet was a real trouper with the microphone. It was an awkward set-up, but he handled it gracefully. (I tend to get rattled under such conditions, so big props...)
Matt Taylor: jQuery, the Javascript Library of the Future
This talk wins the prize for making me want to buy a book on the subject. If you haven't seen Javascript in a while, good news: the libraries are fantastic, with jQuery arguably leading the way.
Matt's presentation and slides (featuring a hall-of-fame Twittch comic) were right on target, but the money maker is the demo. Seriously: check it out now.
It may be simple jQuery in a browser, but it is really a clever layout and a testament to jQuery. Matt showed some snazzy selectors and hinted that you can do more if you know CSS. I maintain that if you don't know CSS, you could use something like this to explore and learn more about it. The demo is a bit like an IDE for the browser. Also, if you are using a giant template system (ahem), then jQuery might be useful to introspect pieces of the HTML fractal with which you must deal.
I'm sold. I hate CSS and a lot of web design but this library looks great.
Michael gave a classic, spot-on talk on 2 major platforms: iPhone and Android. I say 'classic' in the sense that the trade-offs were presented in a balanced and honest manner. This is one talk where I wish there had been more time for questions. There were many.
After all, there is a big fork in the road for mobile development, and you can't take both paths. Choosing one is a big decision. I'd especially have liked to hear more about development as a potential side-venture, rather than within an enterprise, and the necessary resources (e.g. accountant, attorney, trademark, etc). Not very techie, but mobile is the new gold-rush.
I respected that Michael didn't proselytize which path to choose: he just laid out the options. I have high respect for people who can present both sides of a topic without tipping their hand (even if they are passionate in one direction).
Alex Payne: Keynote on 'Minimalism in Software'
I've been somewhat scooped by Michael Galpin (above) on this one: I also give Alex high marks and found his keynote to be really thought-provoking, even where I disagreed. I may write a critique in a later post. However, unlike Michael G, I do like artistic/musical analogies. If anything, I wonder if Alex went far enough with his analogy.
More later, after the web has a chance to see Alex's talk. The nano-gist: after an introduction to Minimalism (versus minimalism), Alex listed some methods to achieve it in technology (see the slides or this recap by Weiqi Gao).
Regrets and 2nd Chances
I saw some other talks but want to focus on regrets -- actually, second chances -- as the content will be online. I'm looking forward to the Strange Passion sessions, talks by the senseis Jeff Brown and Ken Sipe, and definitely James Williams' talk on Griffon.
The Upshot
Yes, yes, Alex Miller likes nachos. But I once read that he also likes building things, including events, and bringing people together. Strange Loop really was a dandy, and we are all better for it. Congrats! And thanks...
Full Discloseth:
I work for a sponsor of the recent Strange Loop conference. Some speakers and the organizer are friends of mine. I live in St Louis, MO. "Lagoon Strep" is an anagram of "Strange Loop". All of this may or may not matter to you. Go Steal Porn.
As always, all opinions are solely mine, and genuine.
A Random Walk
I'm not a journalist and won't try to report on the conference. Chances are, it would be a futile endeavour. I use the "random walk" title as signal that this post is an Impressionist, personal experience.
The Vibe
I do want to set the scene: the event was held at the Tivoli in University City. "The Tiv" is a 1920s era movie theater with lots of unabashedly glitzy character. It was a marvelous choice and worked out really well. About 300 people attended, cramming the lobby and the facilities: the geek vibe was strong. (There was much more room in the theatres; i.e. during the talks.)
There were also cameras! The talks will be on video, thanks to DZone. I'll post them here along with any links to slides.
One measure of a good movie is how long it stays with you afterwards. Mario's talk passes that same test. In a lyrical style with captivating slides, he combined ideas from Zen philosophy and a 'warrior code' to forge parallels to agile teams in software development.
Among my revelations:
The struggle of meditation is to quiet the inner voice. Pure TDD is similar, as the inner voice always wants to write 'the real code' first. Testing first is meditation. Perhaps that's why it takes focus.
Team culture is more than the sum of its parts. Like a warrior clan, there is a sense of something larger. Good teams have a sense of the 'common good' (aka convention) and the discipline to stay with it.
I once read about an Allied WW2 bombing squadron that suffered terrible losses over Europe. Despite being decimated, the remaining planes returned to Britain in formation. I think of this often when I work, alone, in a team war room on a weekend.
Mario mentioned Corey Haines, who lives a nomadic existence as a programmer. This reminded several of us about Paul Erdős, a rockstar mathematician who lived the same lifestyle.
Mark gave an excellent overview: the pros and cons of lock-based concurrency versus using Software Transactional Memory (STM). I especially liked the open question of whether its time has come: after all, garbage collection took many years to become mainstream. The unvarnished truth is that we don't know, but things certainly seem to be brewing.
Mark examined the details of STM in Clojure, using diagrams to give a sense of the internal representation. It is hard to recreate here, but I left with a better sense of Rich Hickey's position that the time dimension is vital to concurrency (see Hickey's slides here).
I don't know Ruby, but I couldn't pass up a chance to see Charles. He's a class act in the Ruby community and obviously a major force. Not knowing Ruby, I was definitely a stranger in a strange land -- in fact some goons at the door frisked me, finding a Grails book and some Python code in an inside pocket. Not necessary, gang! (also: not true)
Charles mentioned "Java Next" and his criteria for choosing a Java successor. I loved that some popular JVM languages -- claimed by others as Java Next -- did not meet his criteria. He respected said languages but stated, matter-of-fact, that they didn't meet his aesthetic. This is a clear sign that the JVM community is healthy.
He went on to examine two Ruby mutants: Duby and Surinx. There are some compelling slides that compare and contrast these two 'unfortunately named' mutants to Ruby itself. I'm struggling here to capture the essence of this talk, but do check it out: I thought it was fantastic stuff and an object lesson as a presentation, in terms of pace, tone, and code samples.
There is one aspect of Bob's keynote that I found especially noteworthy, and I'm dedicating this section to it. It was an excellent talk, with lots of interesting material, but this really resonated.
A friend of mine once kept a log, for years, about bugs that he found. Over time, he compiled evidence about software development in a given language. Based on this record, he developed a philosophy towards his coding conventions. (More to come in a subsequent post about evidence-based software practices: it ain't gonna be easy.)
I was impressed at the time, and impressed again by Bob, when he argued for ARM blocks. He began with some Java puzzlers, to show the difficulty of correctly using IO and try-catch-finally blocks. All well and good. But then he reported examinations of large codebases, including the JDK itself: there are plenty of instances where the code does not behave in a strictly-correct manner.
More than Java 7 features, this is the big take-away: when presenting a case to an audience (be it a keynote, or your team), do the research and present evidence. Compelling.
Strange Passions
Sadly, I didn't make it to the Strange Passions track, or the party at Blueberry Hill. However, there was a lot of buzz about the track idea (which is fantastic) and the individual talks. Sounds like it was a huge hit. I hope the passion talks are on video.
More to come, re: Day #2.
Full Discloseth:
All of this may or may not matter to you: "go steal porn" is an anagram of "strange loop". I live in St Louis, MO. Some speakers and the organizer are friends of mine. I work for a sponsor of the recent Strange Loop conference.
Like many, I have loved Calvin and Hobbes for a long time. Not that this blog compares, but the spoofs on here are influenced by Calvin's weird, wonderful world, where we are bounded only by our imaginations.
I recently found a post that contains my all-time favourite C&H. I probably shouldn't encourage a likely violation of copyright, but I'm weak. I've tried, in vain, to describe this one to people dozens of times. I laugh out loud every time (especially at the visual of frame 5, "I don't want to know about it"), as I have since I first saw it in the early 1990s.
This comic is such biting satire toward software development that I no longer hang it up at work gigs, lest it is interpreted as some kind of protest.
The genius is that it has no ties to IT: a friend once commented that her mother, a judge, had it laminated and placed prominently on her refrigerator. Truly philosophical, it is timeless and universal.
I love Dilbert too, but if I had to choose: you can keep it. Give me Calvin and Hobbes, please. I only wish that Watterson would come out of retirement and do a few more, whenever his muse strikes.
I've been a fan of Joel Spolsky's for a long time. He has written some excellent, influential stuff. Unfortunately, his latest blog entry, The Duct Tape Programmer, falls short of the mark, and has sparked some debate throughout the web.
I wasn't going to respond until I saw supportive tweets such as 'thoughtful, provoking essay'. (Damn you, Twitter, now you've drawn me into this.)
Joel's piece isn't a thoughtful, provoking essay. This one is, but Duct Tape Programmer is just a quick rant. Like I said, I'm a fan of Joel, but let's not be sycophants here!
The Gist
I freely admit that I haven't read the book, Coders At Work, but here is Joel's article. Here's my summary of his points:
Keep things as simple as possible, but no simpler.
'Simple as possible' is context dependent.
It's the features, stupid.
The first item is a favourite quote of mine, often attributed to Einstein (though I have not verified that). An example in software: your webapp may not need GWT on a full-blown J2EE stack, but you'll probably need a database. The first case is not as simple as possible; yet without a DB, it is too simple.
His other point is that the definition of 'simple as possible' depends on circumstances: the context of a start-up company is much different than a mature app at a large enterprise. Naturally, the start-up will have a much more stringent definition of 'simple as possible'.
Finally, Joel admires those that concentrate on features, and who ship code.
The Surprise
I agree with Joel. His thesis isn't particularly original, but as I see it, it is virtually indisputable.
However, I take issue with some of the details. My point is that this just isn't a thoughtful post.
At Issue: What is Duct Tape?
Joel gives lots of examples of complex technologies: multi-threading, COM, and CORBA. Hard stuff, no doubt. He goes on to write:
... any kind of coding technique that’s even slightly complicated is going to doom your project.
However, against the backdrop of extremely complicated technologies, he doesn't define 'slightly complicated'. From what I can tell, there isn't even an example of duct tape!
Is OO slightly complicated? AOP? Functional programming? Transactions? Languages without garbage collection? with garbage collection? The notion of 'simple' is much more nuanced than Joel implies.
This rhetoric reminds me of the straw-man argument, and is surely the logical fallacy of the false alternative. Joel assured me on Twitter that the COM example is real, and not a straw-man. I'm sure that's true (I didn't think Joel was being deceitful), but berating the most extreme case with no comment on the middle ground does not make a thoughtful article.
At Issue: Design Patterns
Quick -- what is the most commonly used design pattern, using the vocabulary of the seminal work, Design Patterns?
I don't know, but I'd wager that it is the Iterator. In fact, if you work with Java, it is so common that it may not 'count' in your mind. And yet, there it is: a freaking commodity, no less.
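Just how commoditized is it? Java's enhanced for-loop is sugar over the Iterator pattern; the two loops in this sketch are effectively the same code.

```java
import java.util.Iterator;
import java.util.List;

public class IteratorDemo {
    public static void main(String[] args) {
        List<String> names = List.of("Mozart", "Mondrian", "Kandel");

        // The enhanced for-loop: the Iterator is invisible...
        for (String name : names) {
            System.out.println(name);
        }

        // ...but under the hood it desugars to roughly this:
        Iterator<String> it = names.iterator();
        while (it.hasNext()) {
            System.out.println(it.next());
        }
    }
}
```

A Gang of Four design pattern so pervasive that the language hides it from you.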
Joel takes a shot at a 'Design Patterns meet-up'. It's true that people can go crazy with esoterica, but again, nuances are lost with broad strokes. It is easy to deride the architecture astronauts!
There may well be times when a design pattern is the right fit, and it is our job as professionals to be prepared. True -- we have to be intellectually honest and disciplined -- but that doesn't mean we shouldn't be informed.
At Issue: Unit Tests
Hoo-boy, Joel fired a shot across the bow of the agile ship. A brave man.
I'm a big fan of unit testing and am convinced that it helps us make better software. However, if the context is a start-up in an ultra-competitive space, and we are racing for the '50% good' mark, then I agree that unit tests would slow things down. If I were in that environment, I would shower every 2 hours just to get the smell off me.
The issue here is that most of us are not in that context, and the post implies that unit tests are used for 'endless polishing' to get to the '99% sparkling' mark. That's just bogus.
Unit tests are the first client of any software. They find bugs. They highlight problems in an API. They serve as executable documentation. They get us to the X% mark faster, where X is way higher than 50 and not as obsessive as 99. Most projects are shooting for X.
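As a tiny illustration, here is a framework-free sketch (the PriceCalculator is hypothetical) of a unit test acting as the first client of an API and doubling as executable documentation of its contract.

```java
// No framework needed to see the idea: the test exercises the API
// exactly as a client would, and documents its contract as it goes.
public class PriceCalculatorTest {
    // Hypothetical API under test: total price in cents.
    static int totalCents(int unitCents, int quantity) {
        return unitCents * quantity;
    }

    public static void main(String[] args) {
        // The test documents the happy path...
        if (totalCents(250, 4) != 1000) {
            throw new AssertionError("4 items at $2.50 should be $10.00");
        }
        // ...and the edge case the API must honor.
        if (totalCents(250, 0) != 0) {
            throw new AssertionError("zero quantity should be free");
        }
        System.out.println("all tests pass");
    }
}
```

Nothing 'endlessly polished' here: a few minutes of work, and the API's behaviour is pinned down in writing.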
The Upshot
We all agree that simplicity and features are of paramount importance. We all agree that we shouldn't listen to architecture astronauts with high-falutin', ego-driven schemes that are not appropriate for the situation. As usual, the devil is in the definitions (what is appropriate?).
But to the Twitterverse: just because an excellent writer wrote a piece, it doesn't make the piece excellent. Call writers out when the prose isn't there, and ask for a true 'duct tape post' instead of broad, incendiary brush strokes.
The article first places Software Transactional Memory (STM) within the milieu of other concurrency techniques. I especially like the emphasis on transactional, as I found that as a stumbling block in the past. (It does not necessarily mean a database!)
The article goes on to work though some serious study of STM in Clojure. Great stuff....
In a previous post, I explained a modest example of the Vending Machine in Groovy. I've since taken the code and tried my hand at writing an internal DSL.
I'm a relative newbie to internal DSLs, but I've pushed my example up to GitHub (here). (This post is light on explaining the details, since the code is available.)
Observation #1
The best discovery is the magic of disappearing code. In this example, the main program simply evaporates and becomes this:
// load the DSL engine/rules
def dslEngine = new File("${args[0]}").text

// load the command input
def input = new File("${args[1]}").text.toLowerCase()

// dslEngine creates 'machine', which accepts the input:
def dslScript = " $dslEngine ; machine.accept { $input } "

// let Groovy do the rest!
new GroovyShell().evaluate(dslScript)
Amazing! There are 2 input arguments to the program. One is the DSL "engine" or context. It looks something like this:
class Machine {
    def machineState = new MachineState()

    def service(def coinList, def inventoryMap) {
        machineState.availableChange = new MoneyState(coinList)
        machineState.inventoryState = new InventoryState(inventoryMap)
    }
Note that the SERVICE command looks like a method call, with parentheses and a comma. That's because it is a method call. Similarly, VERIFY is as well, though no parentheses are necessary for the single string argument.
The other commands are simpler:
N ; D ; Q ; a$ ; COIN_RETURN
These are direct method/property calls as well (e.g. machine.getN()).
Observation #2
Articles on internal DSLs often talk about the contortions that one must go through to simplify the syntax for the end-user. Often, one uses techniques that would otherwise be considered poor style. (Venkat Subramaniam jokes that "designing the DSL" is "finding the right tricks").
I discovered that as well. In Groovy, it is relatively easy to have a decent DSL, but there is a never-ending desire to improve upon it. In the current version, I ran into a wall for using the dollar-sign as a token (see the compromise above: quoting that character in this post is giving Blogspot fits). Along with the parentheses on SERVICE, this pains me. I literally think about it while running on a treadmill.
Observation #3
In the first example, I adopted a file based approach for the input, over an interactive command-line. This has paid off in spades, because my suite of input files act as acceptance tests. Morphing the Java-esque example into a DSL was considerably easier with the existence of those files.
The upshot
With the right support and on the right scale, internal DSLs are terrific. For example, parts of (or, all of?) Grails and Gant are internal DSLs and I love it.
On a smaller scale, I'm not so sure. I'm still disturbed about the issue of Domain Specific Error messages. That is: can a domain expert (without development skills) really handle the power of an internal DSL (including the errors)?
Either way, internal DSLs are undeniably a fun exercise and a great way to learn more about a language.
In April 2008, I blogged about the WIDE: web-enabled IDE. I wondered if web tools might augment some of the standard stuff available in the IDEs.
(I realize many have 'scratchpads' but I wanted something more. Also, by environment, I don't mean a single app).
In particular, I've thought about a website dedicated to string and regex utilities. Whenever I find myself writing a little script to parse a data string, or to toy with a regex, I always think "there must be a better way".
Well, we're one huge step closer: check out the Groovy Web Console, by Guillaume Laforge.
It laughs at basic string utilities:
def s = "does this string fit into a 32-char column?"
println s.size()
And provides a test-bed for regular expressions:
(Note, this is a Java-Groovy hybrid. It's only somewhat 'Groovy'. Making it Groovier is left to the reader)
import java.util.regex.Pattern
def s = "1 - 314-867- 5309"
def p = Pattern.compile(/.*1.*(\d\d\d).*(\d\d\d).*(\d\d\d\d).*/)
def m = p.matcher(s)
if (m.matches()) {
    println "area code = ${m[0][1]}"
    println "exchange = ${m[0][2]}"
    println "digits = ${m[0][3]}"
}
Count me in... I'm definitely going to have this site at hand in my environment.
Last winter, I saw a great talk by Scott Davis called "Lizard Brain Web Design". The theme was to apply ideas from psychology and neuroscience to web sites, and to explain why simplicity and good design really work. For example, we want the site to stay "out of the way" so that users stay in a primal, "lizard" mode of consciousness with respect to the site. In this way, they can concentrate on what matters.
During the talk, I remember thinking that all of the principles discussed apply to more than surfing content on the web. They also apply to surfing code in an IDE. That is, topics such as:
Whitespace is a critical aspect of design
Group related items (locality of reference)
Our minds can only stack N items (N = 7 ?)
Principle of least surprise
apply just as well to our APIs, our code organization, and coding conventions.
For months now, I've wondered if there were studies that applied neuroscience to developers.
Subitizing
Let's play a game: as you add parameters to a method, what count triggers your sense of "this is too many -- I need to refactor this"?
Seriously, go ahead: think of a number, N, for your threshold.
You probably said N = 3 or 4. True, that's what everyone says, but here is one reason why. The delightful book Mind Hacks discusses subitizing (item #35): given a set of N objects, where N is 4 or less, we process counting in a much faster way. The book claims 250 ms for the first 4 items and a full second (!) for every 4 items after that.
There is debate as to how this works (see Mind Hacks for academic references), but one conjecture is that when N <= 4, the "counting" is a side effect of visual processing: i.e. it is done by the lizard, reptilian level of the brain. When N goes past 4, we have to do some work.
Now, let's be clear: the book talks about counting shapes. Stars, circles, beads on an abacus. I have no idea if this applies to Java or C# parameters.
But I'm willing to bet money that it does.
Eye-Tracking and Variable Names
This article came across the transom recently, and dovetailed with the above ideas. The gist is that the researchers used scientific techniques (e.g. eye-tracking) to evaluate productivity of programming styles.
The claim in the paper is that the Scala style of using comprehensions is more productive than Java's iterative loops. Also, for small code blocks, well-named intermediate variables may not matter.
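I can't reproduce the paper's Scala here, but the contrast between the two styles can be sketched within Java itself: an explicit loop versus the stream pipeline that plays the comprehension's role. (This is a toy example of mine, not the paper's.)

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class TwoStyles {
    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3, 4);

        // Iterative style: the reader tracks an accumulator and a loop variable.
        List<Integer> doubledLoop = new ArrayList<>();
        for (Integer n : nums) {
            doubledLoop.add(n * 2);
        }

        // Comprehension-like style: the intent is a single expression,
        // with no intermediate mutable state to name.
        List<Integer> doubledStream = nums.stream()
                                          .map(n -> n * 2)
                                          .collect(Collectors.toList());

        System.out.println(doubledLoop.equals(doubledStream)); // prints "true"
    }
}
```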
I didn't read the paper, and I have no idea of the validity of the science. However, I find the approach fascinating. I'm sure scientific methods have long been applied to lines of code and productivity, but I wonder if neuroscience will have an impact on future language design.
It would be fascinating to see if researchers start hooking up developers to functional MRI machines, to see how the brain works while coding. (I know that my amygdala lights up when I see an 80-line method!)
The Upshot
Imagine a geek conference where a new language is unveiled: instead of its design being driven by a sense of tradition or aesthetic, what if its design was modeled on hard evidence from a neuro lab?
The idea is simply to build a list of Composer objects from a list of strings. As noted in the comment, the closure passed to collect uses the implicit return to great effect. At one time, I would have taken 3-4 lines to express that idea.
But now, this code is not merely concise: it is truly elegant. Very reminiscent of other languages such as Python.
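The Groovy snippet under discussion isn't reproduced in this excerpt, so as a stand-in, here is the same idea sketched with Java streams and a hypothetical Composer class. The expression lambda has the same implicit-return flavor as the Groovy closure: its value is the result, with no return keyword in sight.

```java
import java.util.List;
import java.util.stream.Collectors;

public class Composers {
    // Hypothetical domain class, standing in for the one in the post.
    record Composer(String name) {}

    public static void main(String[] args) {
        List<String> names = List.of("Mozart", "Chopin", "Bartok");

        // name -> new Composer(name) is an expression lambda: its value is
        // returned implicitly, much like the last expression of a Groovy closure.
        List<Composer> composers = names.stream()
                                        .map(name -> new Composer(name))
                                        .collect(Collectors.toList());

        System.out.println(composers.size()); // prints "3"
    }
}
```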
The Take-Home Message
There are many lists of excellent Groovy features. However we rarely see 'implicit return' listed. I'm a fan: it greases the wheels for other features.
I realize it is available in many languages: I just haven't used it in Groovy. I suspect in a few weeks I won't be able to live without it.
David Jacobs, a friend and colleague, posted a comment in response to other comments on a previous post. The comments focused on scaffolding in Grails.
I thought it was worthy of a full-blown post, and he has kindly granted me permission. David's thesis is that there is much more to Grails than just scaffolding and quick demos.
By David Jacobs:
I've now been using Grails on my primary project for about 16 months (not just toying with it--this is at my "real job" on a large project for one of the 50 largest companies in America). My appreciation of it has continued to grow, and I have complete confidence in it as an appropriate choice for 95%+ of business applications.
Scaffolding is NOT a primary benefit of Grails. It enables cool 5 minute demos, but the real-world benefits of Grails are much deeper. There are simplified (auto-wired) conventions, built-in tools, or plugins for nearly every aspect of web development. This yields a massive increase in productivity by encouraging clean solutions that don't re-invent the wheel. Perhaps even more importantly, it keeps the code lean, which is very important for long-term maintenance, refactoring, and preventing "sleeper defects" that tend to hide in bloated code.
I needed to add caching for some web service calls to improve performance: 11 lines of code (entirely declarative).
To meet a versioning requirement (lock, copy as new revision), I needed to implement deep cloning of a persistent domain tree with child associations (mostly collections) five levels deep, with multiple branches, touching about 30 persistent objects. Some needed to be copied by reference (for example, a createdByUser association), and some cloned as new (recursively deep cloned). My implementation is completely generic and won't require changes if the domain model is changed. Pass the method a domain object instance, it hands you back a copy with all of the children copied and properly attached to their new parents. Then you call .save(), and all of the new ids are generated and your database has been updated. This one is a testament to the power of Grails metaprogramming with GORM and Groovy: 22 lines of code.
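The GORM metaprogramming itself isn't shown in this excerpt, but the copy semantics David describes -- deep-clone the children, share references like createdByUser -- can be sketched in plain Java with hypothetical Node and User classes:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical classes; the real Grails/GORM version works generically
// across the domain model via metaprogramming.
class User {
    final String name;
    User(String name) { this.name = name; }
}

class Node {
    String value;
    User createdBy;                            // copied by reference
    List<Node> children = new ArrayList<>();   // deep-cloned, recursively

    Node(String value, User createdBy) {
        this.value = value;
        this.createdBy = createdBy;
    }

    // Recursive deep copy: children are cloned and attached to the new
    // parent; createdBy is shared between original and copy.
    Node deepCopy() {
        Node copy = new Node(value, createdBy);
        for (Node child : children) {
            copy.children.add(child.deepCopy());
        }
        return copy;
    }
}

public class DeepCopyDemo {
    public static void main(String[] args) {
        User u = new User("mary");
        Node root = new Node("v1", u);
        root.children.add(new Node("leaf", u));

        Node copy = root.deepCopy();
        System.out.println(copy != root);                              // prints "true"
        System.out.println(copy.children.get(0) != root.children.get(0)); // prints "true"
        System.out.println(copy.createdBy == u);                       // prints "true"
    }
}
```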
I needed to create a utility application for mocking a third-party SSO module to enable logins on staging/demo servers. It validates user inputs, makes an AJAX request to the target app with modified request headers to initiate a browser session, and launches it with the intended user now logged in. It uses a proper MVC architecture, the layout is configured with SiteMesh, it has configuration/build management and versioning, URL mapping/redirection to keep it friendly, and log4j integrated and configured. From creating the project to building the WAR file, it took 4 hours.
I wasn't ready for the Language Shootout at the Lambda Lounge, but I've completed an example of the vending machine, and placed it up at GitHub. (The Language Shootout used a small specification to illustrate languages ranging from Haskell to Fan, and everything in between.)
Highlights
As Lambda Loungers know, these examples are not intended to be submissions for the Turing Award. My example is a modest program, but offers this to Groovy newbies:
The project uses Gant, has test cases, and is fairly complete.
The program accepts an input file, and supports a VERIFY action to "assert" the expected state of the machine. In some ways, the file could be an acceptance test (reminiscent of FIT).
One interesting idea (IMHO) is the execution of data as code. More below.
The example has some internal uses of Expando, and many uses of the elegant closure-based iterations.
Evaluating Data
Consider a command like so:
// N = Name, P = Price, C = Count
SERVICE [5,5,5,5] [[N:'A', P:'65', C:'10'],[N:'B', P:'100', C:'10']]
The pattern here is: Action Coins Inventory. In this example, Coins is evaluated as a Groovy list; Inventory is a Groovy map. This not only simplifies parsing, but affects the architecture of the example. This is hardly new (hello, Lisp!), but a powerful tool.
Java-esque
As with many languages, Groovy supports a wide-range of styles. I've dubbed this example as "Java-esque". Here's why:
Actions (like SERVICE) are objects. This is a nod to Java's style.
It eschews some common Groovyisms (e.g. using an expression as the return value, without specifying the return keyword).
Known Issues
There is no REPL or interaction with the user: only file input. Also, it emits no output per se, and does not acknowledge corner cases, such as requiring exact change. It simply allows verification of expected state.
Lessons Learned
I love evaluation of data as code. I always have.
Tests are essential with Groovy, and they become inextricably tied to your experience as a developer. This is hard to explain, but because of the dynamic types, the tests cement themselves into your dev cycle in a way that is much stronger than Java. If you cheat with a large method that does not have a test, you'll probably pay for it.
It is important to read the output when a test fails. Often, I just scan it and blithely assume I know where the problem is. This usually leads to frustration, until I realize that the test was trying to help me all along.
As an aside, this is my first project in Git. It is excellent and definitely worth studying.
The Gist
I hope this example helps someone. I hope to write some others, including a full-on Groovy MOP version.
Your friend and mine, Mario Aquino, has an excellent post on material for code reviews. In classic blogger tradition, why leave a comment when I can write a post?
I agree with his points, and will add my own below. Don't leave a comment saying 'what about unit tests?'. Mario has already covered a lot, so consider his post.
I once wrote about the Universal Issues that affect a software project. They influence this list.
Architectural Harmony
Mario writes about harmony within a source-file. One can expand that to the level of architecture: is this code in the proper place, with respect to the architecture? Is it in the right package/module?
Client-Side Specifics
Does the code follow known conventions for a GUI? e.g. In a Swing app, does it use the event-dispatch thread appropriately? What about i18n?
Server-Side Boundaries
Similar to the architectural harmony above, is the code correct with respect to transactions? concurrency? In a framework situation, these are often handled at an outer scope, but it is important to place ourselves in the appropriate context.
Concurrency is a fun topic because developers have an involuntary reflex: we look upwards to the ceiling whenever asked about multi-threading. Try it, and see for yourself.
Data Is Immortal
As I've said before, versioning is the toughest problem in software engineering, especially given that data lives forever. I've been on projects where data must persist across versions, and in this case, it is vital to give consideration to this aspect during a code review. The gotchas are downright ghostly and ghastly: i.e. hard to see, and expensive.
A Thought On Delivery
It is dangerous to riff through such a list in every code review. It will frustrate the developer and eventually turn people off from asking for code reviews. Though always wise to keep these things in mind, it is important to know when to ask out loud. That is a true art, and can only be gained by experience and a sensitivity to the working conditions.
Alex Miller is organizing a terrific conference in St Louis (October 2009). Check out all the details over at the website.
If you are in the midwest USA, you won't find a better value. Hell, if you are on another continent, you won't find a better value. Come on over.
You may think I'm biased, but check out the session list, and see for yourself. Lots of interesting, off-the-beaten-path stuff by first class speakers, with even more to be announced: I believe the 'Strange Passions' track is still open.
Alex envisioned an esoteric conference that he would like to attend -- and the dream is forming right before our eyes. Check it out!
This month's StL JUG features Matt Taylor speaking on TDD with Groovy (tonight!).
Like last month's post on Clojure, I'll play law professor, and try my hand at making some arguments that Groovy is the JVM-based language you should learn (if you can only pick one). Like last time, I'm treating this as an exercise, though this is easy, as I am a Groovy fan.
Grails
Long ago, I would say that the first reason for Groovy was its learning curve. That is still important (see below), but my new first reason is: Grails.
I imagine most readers are familiar with Grails: a Rails-influenced web framework that stands on the shoulders of giants -- namely, Hibernate and Spring. The reason I put this item up front and center is that you can make money with it. There are real jobs to be had here, folks. This is a powerful framework with plenty of momentum.
Griffon, GORM, Gradle, Gant, ...
In addition to Grails, there are several other Groovy projects intimately tied to the Java platform. e.g. Griffon brings Grails-like conventions to Swing apps. Straight from Grails, GORM is a DSL for Hibernate mappings. Gradle is a next-gen build system that stands on top of Ant and Maven, without all that nasty XML. Gant, which Grails uses, is somewhat similar.
The upshot is that the Groovy community is vibrant and doing many different things: surely something will be useful to your project.
The Language
Up to this point, I haven't written about the language itself. IMO, Groovy feels familiar to Java developers, but with all of the ceremony stripped away. Tremendous amounts of boilerplate are removed: getters/setters are gone; access modifiers have reasonable defaults; regular expressions are trivially easy; and so on.
But there's more: dynamic typing, closures, the Meta-Object Protocol, and other features provide a rich feature-set that is quite different from a merely trimmed Java.
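To make the "ceremony" concrete, here is the hand-written Java a two-property class requires. In Groovy, the same class collapses to roughly class Person { String name; int age }, with the accessors generated for you.

```java
// The boilerplate Groovy removes: properties, getters, and setters,
// all written by hand.
public class Person {
    private String name;
    private int age;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    public static void main(String[] args) {
        Person p = new Person();
        p.setName("Wolfgang");
        System.out.println(p.getName()); // prints "Wolfgang"
    }
}
```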
Tools and Testing
Scott Davis has written an excellent book with the subtitle "Greasing the Wheels of Java". That is spot-on. For XML, web services, scripting Java libraries, etc, Groovy is amazingly useful and effective.
One of these aspects is testing (this is where Matt's talk applies). Groovy has tremendous support for unit-tests, including mocks and stubs. The dynamic nature of the language offers a lot of options. Come on out to the talk to see more on this item.
A Gentle Learning Curve, With Growth
If you know Java, Groovy is extremely easy to learn: in fact, almost all Java will run as Groovy. The curve actually goes downhill (in a good, effortless way).
Now, you may dismiss this. As a reader of this and other fine blogs, you may be confident that you can learn any language, thanks. I'm sure you can. The usefulness of the learning curve is that your work (e.g. a utility) is far more likely to be used by other Java developers. In a team environment, that's important. Groovy is unmatched in this regard: it promotes social computing.
That is, if you want it to do so. If you want to do something wild, Groovy allows you to grow into that as well. In this spirit, it is reminiscent of Python: the curve is always reasonable but seems to climb forever. As mentioned, the dynamic nature of the language will definitely, and quickly, take you to places you haven't been with Java alone.
The Upshot
I realize there are no code samples in this post. I apologize. Again, this is a quick list of 'arguments' for learning Groovy as your next language on the JVM. There is a ton of material out there, both in free documentation and in books.
(Full disclosure, some of the following authors are friends.)
A quick reminder that tonight's Lambda Lounge features talks on Haskell and MacRuby. Both speakers are friends: they really know their stuff, so it should be a fantastic evening.
For more details, check out this post over on the LL site.
In first year of university, I took a class that used the wonderful book Becoming A Master Student. The book had many stories that have stayed with me over the years.
One is the story of mumpsimus. You can read more at the link, but the gist is that a monk used a Latin word, mumpsimus, for decades before discovering it was bogus. Upon the revelation, the monk replied that he didn't care: he had been using it for 40 years, and so it would remain.
In my freshman year, I was an uncomfortable chemistry major. Little did I know that I would be writing about that story many years later, vis-a-vis computer science. (The story itself may go back centuries!)
I mentioned mumpsimus in the comments of the last post. I had speculated on using protected methods over private methods. The feedback was unified in its rejection of the idea, yet I mused that I would probably continue my style. Ouch. That is mumpsimus indeed: after seeking opinions, I launched a heroic denial of the responses and continued on my merry way. Nice.
Another example is the defeat of mumpsimus. Years ago, in C++, I would define class members like so:
class Person {
    int m_id;
    string m_name;
    int m_age;
};
When I turned to Java, I held on to that style -- for about 1 day. When I saw what IDEs could do for automatically generating getters and setters, it became obvious that the prefix had to go. Thankfully, logic carried the day over mumpsimus.
The Upshot
The point here is that we have a name for a particular mindset, and a reminder that it is important to re-evaluate ideas with an honest understanding of our biases.
In code reviews (including my code), when I see a Java method marked as private, I ask if there are any unit tests for it.
Since tests are written in other files in parallel packages, the answer, of course, is no.
I've recently realized that I've acquired the following habit: my methods are either public or protected. I have no use for default or private. I don't like default because I feel compelled to write a comment saying that it is package-level access; I don't like private because of the lack of testing options.
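As a minimal sketch of why the modifier matters for tests (class names are hypothetical): a test class in the same package -- typically in a parallel source tree -- can call a protected or package-level method directly, but not a private one.

```java
// Hypothetical example. In a real project, AccountService would live in
// src/main and AccountServiceTest in src/test, in the same package.
class AccountService {
    // protected: reachable from a test in the parallel package.
    protected int applyFee(int balance) {
        return balance - 5;
    }

    // private: a test cannot call this directly; it can only be
    // exercised indirectly through a public method.
    private int secretRound(int balance) {
        return (balance / 10) * 10;
    }

    public int settle(int balance) {
        return secretRound(applyFee(balance));
    }
}

public class AccountServiceTest {
    public static void main(String[] args) {
        AccountService svc = new AccountService();
        // Direct call compiles because we share the package.
        System.out.println(svc.applyFee(100)); // prints "95"
        // svc.secretRound(100) would NOT compile here.
        System.out.println(svc.settle(100));   // prints "90"
    }
}
```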
Before you start hammering in comments, here are some thoughts:
This is not for a formal API to another team or other 3rd party. In that case, I would be more careful and rigid. You may argue there is no distinction, in which case: commence hammering.
I realize that protected is not default and that it is leaking encapsulation somewhat. I can cheerily say that I don't care. I like scanning a file and seeing either public or protected.
I'm not trumpeting that I write more unit-tests than others on my team. Just because a method is protected doesn't mean that I've written the tests!
What do you think? Has your style changed over the years, with respect to access modifiers on methods?
At NFJS shows, a common question is "I only have time to learn one new language on the JVM: which should I pick?".
The easy answer is true: it doesn't matter. Just pick one already!
However, I think people can reasonably ask for more information.
Even though my personal favourite is Groovy, I fancy that, like a good law professor, I could argue a decent case for any of them.
In this post, I'll write a bit on Clojure. To be honest, the title is deceiving: I don't know Clojure, and this isn't a full-blown legal case. I'm really inviting you to some resources (see below).
However, I know enough to see the benefit. This is an earnest post.
Mark Volkmann is giving a Clojure talk on Thursday at the St Louis JUG. Quoting from his intro, Clojure is a dynamically-typed, functional programming language that runs on the JVM.
From past talks, and from Mark's intro, here are some reasons I think Clojure is worthy of your consideration:
If you liked Lisp back in university, welcome back: Clojure has a Lisp-like syntax and style. Both the syntax and functional programming seem new again, in part thanks to Stuart Halloway's book.
Here, 'new' means a shot-in-the-arm to the FP community and a welcoming online scene. Let's face it: it can be lonely when studying older languages.
Got Lisp?
If you don't know Lisp or functional programming, adding Clojure to your repertoire is almost like right-brain thinking. It is profoundly different from Java, C#, and other imperative languages. This is important in order to truly grow. To borrow from natural languages: it's cool if you know French and Spanish, but it's cooler if you know French and Chinese (or Spanish and Chinese).
Bonus: though not unique to FP, Clojure can execute data as code. This is powerful, and quite underrated, IMHO. This goes all the way back to grand-daddy Lisp, so you'll be learning from one of the undisputed giants. What's more, both Lisp and Clojure have a minimal, consistent syntax. Though it is, er, mind-expanding at first, many people become true fans of the philosophy.
Software Transactional Memory
Concurrency is the new memory management. Memory management was a beast until garbage collection evolved, over decades, to be a shining sword.
Now, on the JVM, we've tamed the memory beast and wrestle with concurrency. Consider books like this: JCiP is an outstanding book, but it is a tough go, in part because the developer must coordinate lock-level tools. Or, worse, a team of people must coordinate lock-level tools.
What if we had something really different -- a pseudo-intelligent agent like a garbage collector, except aimed at concurrency? That's what Clojure offers: a unique alternative that might (and that's all we can say: might) slay the beast of our time.
The Upshot
Come on out to the JUG on Thursday, or if you aren't in St Louis, check out Mark's excellent article on Clojure. It is quite thorough, and has received kudos from the Clojure community.
Gmail has been recommending "vending machines" ads to me for some weeks now.
Here's why: the Lambda Lounge has issued the Vending Machine exercise as a way to showcase the idioms of different languages (and there have been good discussions on the mailing list, hence the Google ads).
If you are in St Louis, drop by on Thursday night (details here), to see some live demos! There are a wide variety of languages on display.
For best results, take a shot with a language of your choice. Though somewhat behind, I've been working on a Groovy version.
Stay tuned to the website (or the mailing list) for links to resources and examples.
CodeToJoy's model has been, vaguely, Calvin and Hobbes, with posts that range from the earnest to the absurd. Long-time readers can stay with me, and adjust easily as the sincerity scale changes.
However, I realize that it's difficult for newbies to pin down the vibe. The title of this post is a comment that was left on Reddit or DZone, months ago. For CodeToJoy, the answer is "it depends on the post".
Though my writing certainly doesn't compare to the genius of C&H or The Onion, I've decided to branch out. Patently False will be all absurdity, all the time -- a resounding yes to this post's title. To combat ambiguity, I put a disclaimer right in the title.
The Organization
Also, I've decided to go ultra-lightweight: headlines only, on Twitter. On Google Reader, I've noticed that The Onion can make me laugh with a good headline. Sometimes, the stories seem like a forced, obligatory exercise.
There is a blog site, with an official introduction and explanation of the interrobang, but for now that is mostly a holding area.
The Upshot
Subscribe today.... I reserve the right to do longer, full spoof pieces on CodeToJoy: this blog won't change. After all, the nation of The Joyous must be served!
Many people know (or would know, if they attended Alex's talk) that the seminal book, Design Patterns, was heavily influenced by books on architecture by Christopher Alexander. In Design Patterns, the now-famous Gang of Four certainly discuss Alexander, and list patterns-based literature of the era, vis-a-vis software architecture -- but there isn't much on the semantic gap between architecture and computer science. How did we discover Alexander in the first place?
On the podcast, a woman points out that Peopleware is one of the first known books on software to reference Alexander's works (though note that the context is organizing office space).
Tom deMarco acknowledges the comment, but states that Edward Yourdon was a major factor in bringing the book into consciousness of IT (in the early 1970s). Though he can only comment for himself (and not the Gang of Four), deMarco goes on to say that he owes "a personal debt" to Yourdon.
This post is a long-delayed follow-up to a talk I gave at the Lambda Lounge. It will use a simple code example to show some intermediate ideas of monads in Haskell.
The best part is that the code can be pasted into Codepad (no affiliation) and tinkered with. No download necessary!
(Note that a recent review of RWH calls Haskell "like Klingon, but with math". Hence the subtitle.)
Where is Monads 101?
There is no Monads 101 post on this blog. This is part of the zen of monads, as explained in Monads are Burritos. I did my best at the Lambda Lounge, and have some fun ideas for the future, but for now, this post is intended for readers who have some background in Haskell.
Ok, ok, as a super brief recap, check out this photo. Recall that
1. m is a type constructor (i.e. it is a generic type that wraps another type)
2. a -> m a is a function that injects/wraps something of type a into something of type m. In Haskell, this is called return
3. m a -> (a -> m b) -> m b is a function that takes a monad m a and another function, a -> m b. It returns m b. Loosely speaking, it breaks the inner type out of m and applies the function, resulting in a computed value back inside another m. The 'breaking out of' part is unspecified and unique to each monad. This whole doozie is called bind and uses this symbol: >>=
It is vital to understand that a, b, and m above are types (e.g. Integer, String). Haskell has a gorgeous fluidity between data variables and type variables that can be confusing at first.
A monad is a type that supports all three of the above. I warned you that this was Monads 102. If you aren't comfortable at this point, that's fine. This is non-trivial stuff.
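For Java readers, the recap above maps loosely onto Optional: Optional.of plays the role of return, and flatMap plays the role of bind. This is only an analogy (I'm not claiming Optional is a law-abiding monad in every corner), but it may anchor the three signatures:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Optional;

public class BindSketch {
    public static void main(String[] args) {
        // return: inject a plain value into the "monad".
        Optional<Integer> mx = Optional.of(10);

        // bind (>>=): flatMap takes a function a -> m b and handles the
        // unwrapping of the inner value for you.
        Optional<SimpleEntry<Integer, Integer>> tuple =
            mx.flatMap(x -> Optional.of(new SimpleEntry<>(x, x + 1)));

        System.out.println(tuple.get()); // prints "10=11"

        // The Nothing-like case: flatMap never calls the lambda.
        Optional<Integer> nothing = Optional.empty();
        System.out.println(
            nothing.flatMap(x -> Optional.of(x + 1)).isPresent()); // prints "false"
    }
}
```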
The Goals
We'll take a code example that defines a trivial function called tuple. This example won't change but we'll send in some different monadic values and see what happens.
Example: Maybe (with Integer)
Here is the full example... Paste this into Codepad:
-- see comments below
tuple mx = mx >>= \x -> return (x, x + 1)
mx = Just 10
main = (print (tuple mx))
Here is a version with comments:
-- In English, tuple accepts a mx, a monad.
-- The monad pulls x out, and builds a simple tuple
-- which is returned in a monad of the same type.
-- Here mx is a variable name. It is of type 'm a'
-- where m and a are types
--
-- 1. tuple takes m a and returns m (a, a)
-- 2. note that \x -> ... is a lambda expression
--    with a parameter 'x'. This expression is
--    the function a -> m b that is passed to bind.
-- 3. >>= is called 'bind' because it binds the
--    value of x
-- NOTE: mx defines the way >>= behaves!!!
tuple mx = mx >>= \x -> return (x, x + 1 )
-- mx is of type Maybe Integer
mx = Just 10
-- main just prints the result
main = (print (tuple mx))
The output should be:
Just (10,11)
Try some other values for mx. Experimentation here will be worth a zillion words and comments.
Example: Maybe (with Double)
In the Codepad editor, change the value of mx to:
mx = Just 3.14
and run again. Since Haskell is Klingon-esque about types, this is a big deal. The tuple function works with Maybe Integer as well as Maybe Double. In fact, it should work with any m a where a supports addition.
Example: Maybe (with Nothing)
In the Codepad editor, change the value of mx to:
mx = Nothing
and run again.
The output should be: Nothing. What is happening? Recall that the monad supplies the bind function by 'breaking out' the inner type: each monad can define that behaviour for itself.
In the case of Maybe, that behaviour is defined in part as: if the value is Nothing, then don't even bother calling the supplied function! Hence, the result is Nothing.
Example: List
Here's where things get fun... Let's wish Maybe a fond farewell and use another monad: List. Remember that tuple isn't going to change here.
In the Codepad editor, change mx to:
mx = [1, 2, 3, 4]
Try and guess what the output should be, then run it.
You should see:
[(1,2),(2,3),(3,4),(4,5)]
The reason for this is that the List monad uses a different definition for 'breaking out' when applying >>= / bind. Clearly, the List definition is to apply the provided function to each element in the list.
Conclusion
The upshot here is that tuple isn't changing. The monads are changing. (Or for you Zen types, your mind is changing. For Klingons, the semantics of the syntax is bending to your will.)
It is important to note that tuple is indeed a lame function with no utility. The types Maybe and List are useful; as monads, they are very basic. If you were to describe the 'breaking out' in pseudocode, they seem trivial:
Given m >>= f where m is m a and f is a -> m b
When m is Maybe: if it has something, it applies f; else it does nothing.
When m is List: it applies f for each element in the list.
Don't be fooled! There are other monads in Haskell that are much more sophisticated (an intense example is the STM monad for software transactional memory).
The important thing is to understand that the power is in the 'breaking out', which is individual to the monad. Yet against that flexibility, we have seen with tuple that monadic code remains constant.
That's monads in a nutshell: rigidity and flexibility in a powerful combination.
(The title of this post comes from a quote by Dick Wall, professing his love for Groovy in the face of charges to the contrary.)
A while back (on Twitter), I made a comment about the podcast Deep Fried Bytes. There was a brief exchange with the guys at DFB: it was friendly, but I felt like we weren't communicating. I allow that to happen 2-3 times before I bail and use email or the blog.
(Random tangent: It is mystifying why others try to debate religion or politics in 140 chars. It comes across as an intellectual boxing match, except with Nerf gloves. Or an obscure debate between two Zen masters.)
I first heard of the podcast during a talk by Ken Sipe (on F#). I loaded up on episodes for F#, C#, and some other Microsoft technologies. I'm not proud of this, but I'm not familiar with the dotNet space; I haven't used Visual Studio in years.
I loved the podcasts because they were geeky, but also because the topics are a new world to me. I felt like a spy listening in on a secure line. This is not a criticism: the 'casts are a great way to catch up on what is happening over there.
I have since learned that the podcast isn't solely Microsoft: e.g. they have a great episode on architecture with a guy from Digg, one with the Rails Rumble champs, and so on. Also, the April 1 podcast is simply brilliant.
So, I invite you to join me and sit on the porch with these guys and have some iced tea. They have asked me about some topics of interest but I refuse to offer any: I want them to surprise me and stretch my boundaries.
Recently, I read a post by James Duncan Davidson called Dear Speakers. He tweeted criticisms about speakers (no names used) and later blogged his thoughts. The tweets were not mean-spirited but also not inside jokes among friends.
I believe that James is offering earnest advice, but the post really irritates me. Here's why:
Critiquing a speaker during a presentation, even without using names, is both gutless and rude. I wonder if James offered any advice, in person, to the speakers afterwards. New technology doesn't excuse us from acting like civilized adults. True, I'm the guy that does this (yes, juvenile). But I asked first and looked people straight in the eye.
James provides some random, tactical details as advice. They are fine tips but they strike me as being mere trees in the forest. I have wanted to write about the forest for some time now, so here I am.
However, the post really irks me because I attended a technical talk by JDD on the No Fluff Just Stuff tour, circa 2002. I won't comment on it here, but: if we had the technology back then, how would JDD feel if I tweeted, even without mentioning his name?
What are my qualifications to talk on this?
Frankly, I'm no more qualified to talk about this than anyone else.
FWIW, I have emceed a couple of weddings, and have given some technical talks, all with widely mixed results. In my team's war room, I'm not at all shy about launching into an impromptu lecture on whatever I find interesting. I have taken the venerable Dale Carnegie course on public speaking (highly recommended). All of this may or may not impress you.
I fully concede I have broken many of the following rules. Sometimes, it has haunted me for weeks afterwards.
So I'm a modest presenter. However, I have seen dozens of talks: tech talks, conference sessions, keynote addresses, etc. I've attended my local JUG and NFJS for years, and am lucky to see terrific speakers on a regular basis via my employer.
It is very hard to describe what works, but I know when I see it. The best analogy is music: I can't tell you why I admire certain guitar players. There is no formula, and it is highly subjective, yet there seem to be common elements across my favourites.
Writing about this is like describing a dream: it's impossible to articulate the elements of my favourite guitar players, or my favourite speakers. But spurred to action by James' post, here are 10 things to consider.
1. Take a class
Before you can give a tech presentation, you should be able to give a presentation. JDD's post, and the comments, concentrate on things like pacing, pause words ("um", "so"), eye contact, etc. A lot of advice is written as "just keep these 1000 things in mind the next time you are feeling the adrenalin rush of the fight-or-flight syndrome while in front of a crowd".
Gee, thanks. Here's some real advice: if you want to learn to be a better speaker, with a chance to receive genuinely constructive criticism, take a class. There are classes at your local college. There are higher-end options like Dale Carnegie and Toastmasters. Or take an improv class.
It doesn't matter: just pick one and get out there.
(I have no affiliation with Dale Carnegie, but a quick plug. When I was 13, I was so shy that I had to steel my resolve to call a store and ask about their hours of operation. I took Dale C at age 22 and have never looked back. No one describes me as shy now.)
2. Know your audience
I learned this one the hard way. Above, I mentioned several types of talks: tech talks, keynotes, etc. Be sure to think about your gig, and match your preparation to it.
For example, at a brown bag tech lunch, you have about 3 minutes to show some code. These people are voyeurs, and code is their porn. If you show up with 10 slides about cargo cults and the history of computing, they aren't going to be happy. Similarly, if you are up for a keynote, and don't have some kind of polish, things are going to be rough as well.
Also, you need to understand the technical level of your audience. This should be fairly obvious, as I'm sure you suffered through mismatches as an attendee.
3. Know your audience, seriously
I'm repeating this one because of the hidden audiences.
At NFJS, Scott Davis recently joked about those long, gorgeous Flash intros on artsy/marketing websites. He said, "who are those for? Everyone clicks Skip Intro".
My first thought was: they are for other people who write Flash intros. It is an arms race among a small elite to impress each other. This is an example of a hidden audience.
As an example, a fancy Keynote presentation can be very slick and alluring, but if you have 3D dancing slide transitions that emit pyro-lasers onto the ceiling, are you trying to impress the audience, or are you trying to impress other speakers? or other Keynote users? or your own ego?
That is to say, who is your real audience?
Always, always, always keep the real audience as priority #1. Be slick, be funny, be wacky, but only insofar as it advances your message.
4. Steal
If you don't play guitar, you might think that each solo, each lick, is its own creative snowflake, a sonic fingerprint that is unique in the universe.
This is just one of many lies you've been led to believe.
Guitarists copy, steal, and nick from each other all the time, and always have. The reason you may not be able to tell is that the good ones are clever about it: they take the essence of an idea, and make it their own.
With respect to speaking, I'm not talking about stealing content. I'm talking about style. Once you've identified your style (see below) think about who you like as a speaker, and why. Then, pattern your talk using similar elements.
A great device for keynote addresses is referencing a topic far removed from the ostensible subject, and then tying it in. A fantastic example is Dave Thomas' talks and writings on cargo cults.
5. Be true to yourself
This item is in a delicious tension with the previous one.
Some speakers are animated and theatric. Some are dry and yet genuinely funny. Some are no-nonsense and try to maximize the amount of content provided to you. This is all well and good.
No one should label or box themselves in, but it is wise to think about the speaker you want to be. If you have a naturally dry sense of humor, then it may be futile to try to speak as a different character. Public speaking is inherently outside our comfort zone, so there is no need to double that by pretending to be someone you're not.
That said, it can be electrifying to go on stage. Many entertainers have alter egos that appear out of nowhere when the lights go up. If that happens, great, but it isn't necessary.
The upshot: take risks but follow your intuition.
6. Have a message
Everyone knows the old saw, "tell 'em what you're gonna tell 'em, then tell 'em, then tell 'em what you told 'em". That's good stuff.
The key point: have something to tell them.
I'm old-fashioned but I was taught that an essay should have a thesis statement. A movie should have a story. A novel should have a narrative, and so on.
In the same way, I think that a talk should have an essential message that can be condensed into a short outline or a simple phrase. If someone asks "what was your talk about?", you should be able to answer, coherently, in 30 seconds.
This may seem obvious for an expansive keynote address, but I think it applies even to the humble brown-bag tech lunch. My goal for such a lunch is to present a topic to the audience so that they can decide if they want to pursue it further. Consequently, the message is invariably along the lines of "This tool offers A, B, and C, but suffers from X. If you value X, then you may want to wait but if, like me, you value A above the others, then check this out".
The good news is that thinking about this up front will focus your preparation. As well, a creative challenge is to express your message without actually saying it, but this can be tricky (see the last item).
7. Prepare
This one is cheap and easy, but I am compelled to write it.
Prepare your talk. Practice, rehearse, check your time. Remember that time can evaporate on stage, especially if there are questions.
More than this, though, take every opportunity to prepare the equipment. If possible, go to the venue days beforehand. On the day of the talk, get there very early, and remember to test your equipment! Just showing up isn't enough!
I once had a golden opportunity to rehearse with some equipment, on the day before an event, and passed up the chance. It was a major error. The mic was hard to use and I didn't find out until "go time", despite having ample opportunity to prepare. Shameful.
8. Respect questioners, but keep it moving
Assume that a questioner is at the right technical level and earnestly trying to advance the cause of the talk on behalf of you and the audience.
If they are, no problem: be polite and answer the question.
However, that assumption might not hold, or it may fall apart after a couple of questions. For example:
The person might not be at the right technical level for the group (e.g. if someone asks 'does CSS support aspect-oriented monads?' or 'what is a database?').
The person might have their own hidden audience and start to grand-stand to impress others or themselves.
The person might be a comic. Humor often sets a warm atmosphere, but a joker can go out of control after scoring some laughs.
(The unvarnished truth is that I'm guilty of all of these, as an audience member. Hopefully not too often!)
I defer to your intuition on how to be graceful, but it is important in these instances to acknowledge the person, be respectful, and then move on. The goal is to convey your message to the group.
Squash the impasse with the venerable "let's go offline". If you really follow up later in an earnest manner, it is better for everyone.
9. Learn from criticism
I'm paraphrasing the master, Dale Carnegie, on this one, as he said it best.
There are two ways to handle criticism: if it's accurate, learn and adapt from it; if it isn't accurate, be a duck and let that water just roll off your back.
The trick is to identify accuracy. This is difficult but clearly it requires objective reflection. And a keen sense for the difference between fact and opinion.
I think there is an asymptotic effect here: if you give a sufficient number of talks (see item #1!), and adapt, earnestly and honestly, to enough criticism, the curve will invariably tend towards you being an excellent speaker.
10. Break the damn rules
Just like in music, the most creative and wonderful things come when we break the rules. (Note that here I mean tactical rules like "use slides", and not themes, e.g. "respect your audience".)
However, it isn't just a matter of ignoring the rules like a bull in a china shop. To truly break the rules, one should first understand them.
In this way, one becomes a master. Note that the road is not easy: for every brilliant, rule-breaking, game-changing creation, there are countless disasters lying in the ditch. Talk is cheap: you must be prepared for failure if you try something crazy.
Again, it comes back to intuition: if you feel you're ready and can accept the consequences, go for it.
Rule-breaker? A Masterful (and High-Risk) Example
I'm bummed that someone, in the comments on JDD's post, already pointed out the video below. A friend and I saw Clifford Stoll at SD West circa 2000. It was the single best talk I have ever seen, and represents my own Platonic ideal as a speaker. I've searched for it but no luck. (I tried to capture its spirit in Beethoven didn't use Powerpoint).
Talking about it is like trying to describe a dream. Or for someone to describe seeing Stevie Ray Vaughan play live.
The version below is similar in nature. It doesn't (can't!) compare to the dream I saw, but it is great stuff.
I include it also for the duality: on one hand, this breaks all of the little tactical rules ('make eye-contact'); on the other hand, it preserves -- even illustrates -- the core principles behind a great presentation.
Though again, you had better be careful about running around your conference room like a mad scientist. I conclude with these questions:
Did he think about his thesis?
Did he know his audience? (a slide-rule!?)
Do you think he was earnest in conveying his message?