It’s really hard to answer the question: Was a release successful? In the jQuery project we try to look at a number of criteria.
- Are users pleased with the release?
- Are users adopting the release?
- Are we meeting the needs of those who don’t use jQuery?
It’s hard to put exact numbers on those points (we listen very closely to the response on our blog, on Twitter, on the mailing list, and elsewhere – and thus far it’s been very positive), but we do have a couple of tools that help simplify that process: Google Analytics and Google Trends.
Above is a Google Analytics comparison of Jan 2009 (blue) to Dec 2008 (green) for jquery.com.
We’re currently seeing roughly 30% more visitors, day over day, than in December. I’m really pleased to see that the 1.3 release was "sticky" (users liked what they saw and stuck around to keep using it – the daily numbers aren’t dropping back to their pre-release levels).
Note the increase in bounce rate and the decrease in pages/visit and avg. time on site – all of these shifts were linked to the 1.3 and 1.3.1 releases, when people came to check out the release and then left again.
jQuery 1.3 was released on January 14th (we had a lot of traffic that week – we hit Ajaxian, Reddit, Hacker News, and a number of blogs). We hit Digg during the week of the 26th and saw no appreciable gain in traffic.
I trimmed out the Christmas–New Year’s time frame since traffic was very low (and doesn’t make for a good comparison).
None of this data includes jQuery UI or static files which are tracked separately.
Google Trends has helped us learn some things about the use of the library, the biggest being the "Christmas slump."
jQuery users largely appear to use it during their day jobs (note the weekend slumps in the analytics stats). Every year around the December holiday season (Dec 23rd to Jan 3rd) we see a major drop-off in traffic – and we can see a direct correlation in the Google Trends stats as well.
One thing I’ve learned while managing jQuery is that there’s a huge potential to lose users in between projects. A developer ends a project and then re-evaluates their tool chain to see if any improvements can be made. Every time a user finishes a project there’s a possibility that they’ll leave for another tool – it’s our job to make sure that we consistently provide the best tool and experience possible so that the need doesn’t arise (better documentation, better code, frequent releases, etc.).
A very similar problem occurs over the holiday break. Users are away from their code for one to two weeks, and when they come back they have a chance to choose another tool, pick up where they left off, or become re-engaged and continue strong.
The question now becomes: How well can we retain (and hopefully grow) the userbase over this slump?
If we look at the slump from 2006 to 2007 we see an immediate pick-up again after the users return from their breaks. The reason? jQuery 1.1 was released.
But look at 2007 to 2008 – there was almost no pick-up, and it took almost half a year before growth resumed. Incidentally, there was no significant release in January.
We fixed that this time around – we released jQuery 1.3. Note how we instantly picked up our users and even grew our share during that time period.
From a growth perspective I’m very pleased with the 1.3 release – I think we’re setting ourselves up for an outstanding 2009. It’s likely that we’ll be pushing another follow-up release (1.3.2) this week to address one last 1.3 regression – but other than that, it looks like we’re in the clear to head toward some solid new features and fixes in 1.3.3 and beyond.