There is no such thing as a JavaScript plugin, contends James Coglan. I completely agree that there are no specific techniques within the JavaScript language that make “plugins” possible (such as the ability to namespace code and import it, or some such).
HOWEVER – I will contend that such a thing as a plugin exists, and is logically distinct from “random JavaScript code that manipulates other JavaScript code,” as long as the following points are met:
- There have to be explicit points upon which a plugin can attach. James notes the most common one in jQuery (jQuery.fn) but we have tons more – events, animations, selectors – all over the board for developers to snap into.
- Even more importantly: Those points have to be documented or, at the very least, be under some sort of agreement that they will be treated like a normal piece of the user-facing API. In jQuery we treat all plugin extension points as “user-facing API” and only ever change them in major releases (if at all) and always provide an alternative for authors to use.
- Finally, there has to be some sort of repository for navigating these plugins. This is a huge differentiator. Simply referring to “code in the wild” as plugins doesn’t really cut it if there’s no commitment to hosting them and keeping their documentation and examples alive.
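The extension-point pattern described above can be sketched in a few lines. This is an illustrative stand-in, not jQuery's actual implementation: `lib` and the `shout` plugin are hypothetical names, with `lib.fn` playing the role of a documented attachment point like `jQuery.fn`.

```javascript
// A library exposes a documented, stable object that plugins attach to
// (here `lib.fn` is a stand-in for jQuery.fn).
var lib = { fn: {} };

// A "plugin" is then just a property added to that extension point:
lib.fn.shout = function (text) {
  return text.toUpperCase() + "!";
};

// Because the attachment point is treated as user-facing API, callers
// can rely on plugin methods being there:
console.log(lib.fn.shout("hello")); // HELLO!
```

The technique itself is trivial (it is just property assignment); what makes it a *plugin system* is the documented, stable attachment point and the surrounding repository and documentation commitments listed above.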
We take our plugin architecture very seriously in the jQuery project and are constantly looking for ways to improve (looking at plugins, reading their code, seeing what we can provide to make their lives easier).
Alex Russell of Dojo recently built a sleek 6kb version of Dojo – presumably for use on mobile platforms. He states in his post that:
Even so-called “lightweight” libraries like jQuery are far too heavy for some environments…not because they (like Dojo) pull in all the code needed to use them, but because they do it all up-front. Often the best time to pay the expense of loading, parsing, and executing JavaScript code is when the user takes an action that needs the enhancement to run.
The way it’s worded you would assume that you were paying a large, up-front cost to using jQuery when, in fact, there is very little overhead. jQuery has been shown to be the fastest-loading JavaScript library for non-cached code and quite fast for cached code.
Arguably a mini-Dojo would be able to provide an extra edge in this respect – however any gains that you would make up-front (which would be minimal – mini-Dojo is only about half the size of jQuery, as it stands) would have to contend with any future overhead incurred by loading additional components at a later time.
I frequently queue up long pages to read on my iPhone while I travel the subway system here in Boston and I think I’d be quite upset if I got halfway through a page, clicked a hide/show link, and found out that the action wasn’t able to work since the requisite functionality hadn’t been loaded yet.
There is a cost to loading “all” of jQuery up front, absolutely – however there are numerous benefits: it’s highly cacheable; you never have to worry about what you do or don’t have loaded; and the API, documentation, tutorials, and examples are all dramatically simpler, since you never have to worry about having extra components or making sure that they’re being included correctly.
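The trade-off being argued over here can be made concrete with a tiny memoized loader. This is a sketch of the *shape* of lazy loading, under the assumption that `loadFn` stands in for an actual `<script>` injection or XHR fetch: the first use of a feature pays the load cost, and every later use is free.

```javascript
// Wraps a (hypothetical) module-loading function so the load only
// happens once, the first time the feature is actually used.
function makeLazy(loadFn) {
  var loaded = null;
  return function () {
    if (loaded === null) {
      loaded = loadFn(); // first call pays the load cost
    }
    return loaded;       // later calls hit the "cache"
  };
}

var loads = 0;
var getModule = makeLazy(function () {
  loads++;                    // count how many times we really loaded
  return { name: "fx" };      // stand-in for a loaded module
});

getModule();
getModule();
console.log(loads); // 1 – the module was only fetched once
```

The subway scenario above is exactly the failure mode of this pattern: if `loadFn` needs the network and the network is gone by the time of the first call, the feature silently can’t run.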
And, as always, if you’re particularly excited about breaking jQuery down into little chunks you can grab the individual pieces from SVN and build a custom copy.
I was out of town when it happened but the release of the Google Ajax Library CDN (which includes the current release of jQuery) was incredibly cool. I’ve had a few requests from users wondering how this release came about. While I can’t speak for the other projects, I can, at least, speak for what happened with jQuery.
Dion and I had been discussing solutions for providing better hosting to JavaScript libraries for a long time (at least a year or two). Progress kept stalling at different points, but Dion persevered and got the release up at Google. I’m really glad to see this come about and I’m sure that jQuery users will certainly appreciate it.
I have a couple points of concern with the release, namely:
- How do we push a new release out? Currently we have to contact the guys at Google to get it pushed through – a way to automate this and do it programmatically would be greatly appreciated (we could integrate it right into our release scripts).
- How do new pieces of code get added? There’s no way for other projects to get added to the repository – some sort of process for joining would be ideal.
- SSL? Having an SSL-based CDN would be very useful, as well. However I suspect that if a site is going so far as to have SSL on their pages then they’re probably not pulling their source code from an external site.
Other than that, though, I’m quite pleased with the release. The more that people pull from there the faster it’ll be for everyone who uses JavaScript libraries (cutting away that initial download time).
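For readers wondering what using the CDN looked like in practice, there were two ways to pull jQuery from it: a direct versioned `<script>` tag, or the `google.load()` loader from the AJAX Libraries API. The version number below is the one current at the time of this post; the exact paths and supported versions are defined by Google’s documentation.

```html
<!-- Option 1: reference a specific version directly -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>

<!-- Option 2: use the google.load() loader -->
<script src="http://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1.2.6");
</script>
```

The direct tag (option 1) is what gives the cross-site caching win discussed below: every site that references the same versioned URL shares one cached copy in the user’s browser.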
Dion Almaer (July 2, 2008 at 3:39 pm)
Hi John,
Good points all around as usual.
With respect to the CDN:
– We do eventually want to make it a lot easier to get new versions out than the manual way we do so now. We only want stable versions, though, so we want to make sure that the system allows you, the author, to push only a solid version (e.g. 1.2.6 vs. 1.2.5).
– We are seeing the real value come from popular open source libraries such as jQuery. If a library is popular enough, THAT is when you get the network effect where you go to a.com and b.com, and if they both use jQuery you have a no-op. This is one reason why we aren’t going to open this up to just anything. There are good solutions for that problem, though, such as hosting open source libraries on Google Code and linking directly, or putting scripts up on Google Pages – and those are just the Google projects that I know about.
– SSL: Looking into that. Would love to do it, just want to make sure it makes sense.
Cheers,
Dion
John Resig (July 2, 2008 at 3:55 pm)
@Dion: Agreed on all points – thanks for making this happen!
James Coglan (July 2, 2008 at 4:14 pm)
Hi John,
Just to be sure — I really didn’t want to give the impression I was bashing jQuery, and to be honest I should have come up with a better title. I was simply hoping to expose how trivial JavaScript makes it to build such extension points into your code, in that you’re free to monkey-patch wherever you see fit, so people can extend programs in ways the original authors never anticipated.
I’m a library author myself (I write Ojay and JS.Class and a bunch of Ruby stuff) and so I completely agree on the importance of stable documented APIs, and on the importance of providing easy extension points for common use cases.
By the way, jQuery is what got me into JavaScript in the first place so I guess I owe you a drink…
John Resig (July 2, 2008 at 4:25 pm)
@James: It’s perfectly ok – I didn’t get that impression at all – I just saw this as an opportunity to sort of expound on what I feel a “plugin” is. I definitely agree – there is nothing particularly tricky about how the implementation works (just adding properties to a function prototype) and I appreciate the time you took to write about that.
Alex Russell (July 2, 2008 at 4:26 pm)
John:
You’re being rather disingenuous. I hate to say this is a pattern coming from you…but it’s a pattern. Please stop. I didn’t impugn JQuery nor suggest that the default build strategy for either Dojo or JQuery is somehow “wrong”. dojo.js is packaged as a single file for nearly all applications for exactly the same reasons that JQuery is. What I was suggesting is that neither Dojo nor JQuery’s small initial footprint are appropriate in some situations, and I presented an alternative solution which has advantages over custom builds of either dojo.js or jquery.js.
To wit, you can’t at once crow about JQuery’s demonstrated speed advantage WRT initialization time while at the same time dismissing perceived performance improvements of this mini version of Dojo. To get the same features as dojo.js provides, jquery already needs several plugins for handling things like JSON encoding (including form serialization), color animations, and module loading. Until recently, you also needed another plugin (i.e., another expensive up-front HTTP request) to even get node positioning and location. In the stubbed Dojo, only the modules which are ever used are pulled in, and if something is used commonly, well the stub build can be augmented to include it up-front. The win here isn’t that this is 6K or that it’s 24K (necessarily), it’s that it gives developers the ability to do what’s right for their app.
Regards
tontechniker (July 2, 2008 at 5:04 pm)
Can anybody tell me why I should load JavaScript libraries (outdated ones, at that – MooTools 1.2 is officially out already) from Google? When developing a web application I normally only need one library for the whole app. One library – one JavaScript include; I don’t need more.
The only use case I can imagine is something like Sparkle (“Application Podcasting”) for JavaScript libraries, so I don’t miss important updates. But given the API changes between versions, even this case is pretty much useless.
John Resig (July 2, 2008 at 5:07 pm)
@Alex: [We discussed many of the concerns offline.]
John Resig (July 2, 2008 at 5:11 pm)
@tontechniker: I’m not sure about the MooTools update – I suspect that maybe the MooTools team hasn’t sent an updated version to Google yet? I’m really not sure.
The point isn’t really to “make sure you don’t miss updates” – it’s to make sure that the library’s code will be served fast (fast servers, large bandwidth) and be cached like the dickens. Not to mention the fact that if you point to that file and someone else does as well, your users will have a greater chance of coming to your site with the code pre-cached – a huge win.
Eran (July 2, 2008 at 6:37 pm)
@alex, @john:
What you guys are discussing regarding framework delivery sizes and loading performance is basically irrelevant for 99% of all web sites. With sites becoming more and more media-rich, the difference between a 15kb library and a 24kb or 6kb one is relatively minor (at the desktop level, at least). Most sites don’t receive the amount of traffic needed to worry about requests and bandwidth, and those that do will probably need to worry about several more optimization techniques, with minifying and gzipping JavaScript just another tool in the arsenal.
That being said, I think the major differentiators between the different libraries are community, documentation, and extensibility, which is where, in my opinion, jQuery is the clear winner. It is the pluggable nature of the library, along with the excellent documentation, that created such a fantastic community – a community that constantly provides more features in the form of plugins, some of which even get integrated into the library eventually (like the dimensions plugin).
Jörn Zaefferer (July 3, 2008 at 3:11 am)
The discussion about the Google Ajax library API mentioned that eventually browsers could cache compiled versions of the JavaScript code, instead of only the file content. That way even the overhead of parsing the file would vanish.
How realistic is that? Is the potential performance gain worth the trouble?
bugrain (July 3, 2008 at 4:51 am)
Interesting point about queuing up pages on the iPhone – my experience (with an iPod Touch) is that doing this makes the older pages need to re-load because of the (relative to desktop) poor caching performance of Safari (I’m basing this purely on http://www.niallkennedy.com/blog/2008/02/iphone-cache-performance.html at the moment; I only happened across this problem yesterday).
So, in this scenario, it doesn’t just matter how small your cacheable objects are, but how many you have (and how many pages you have open).
Dinoboff (July 3, 2008 at 7:19 am)
According to the iPhone cache performance article, the mobile version of the Dojo library makes sense, since any file over 25kb (once uncompressed) won’t be cached. No JS library is under this limit (YUI is, if you add each component separately, but it might be slower to load); even the packed version of jQuery (30kb) is over the limit.
bugrain (July 3, 2008 at 8:22 am)
@Dinoboff – indeed, but the number of cacheable components is also limited to 19.
I’m not arguing for/against Dojo or jQuery – I’m more wondering (in the case of the iPhone) where the benefit lies at this point in time. If I load a page with one JS file and 19 other components, isn’t my JS file lost from the cache, so that if I hit the back button it needs to be retrieved from the server again? At least, this is my experience with the iPod Touch.
h3 (July 3, 2008 at 8:54 am)
I really like Google Ajax Library CDN and started using it on my sites as soon as it came out. It’s pretty slick to have the possibility to easily switch between minified/unminified version too.
Still, I wish jQuery UI were there too :|
Dipesh (July 3, 2008 at 10:20 am)
Alex, John and Eran:
I feel like quoting Rodney King…Sorry!
Not sure where the 99% came from, but trimming down these libraries certainly helps when you have a page-load requirement of less than 1.5–2 seconds. So the option of downloading only the required modules sounds compelling to me. Also, if I need only XHR on my page I can use Dojo (or jQuery, I think) without having to write a single line of code for this functionality. I realize XHR is a simple example to write, but the point is that I don’t have to worry about code redundancy, inconsistencies in coding APIs, support, or maintenance. And I get everything I need, from the simplest web page to the richest.
In terms of documentation I would have to agree with you, BUT it’s much, much better and quickly improving compared to the days of Dojo 0.4. If you haven’t yet, read the new Dojo book from O’Reilly. It’s much better than the online docs.
Josh Rehman (July 3, 2008 at 1:55 pm)
Hi John, I liked this post – it was packed with information. I don’t use Dojo currently, but I really like the idea of a minimal bootstrap version, not just because it’s smaller, but because it’s convenient for the developer. I *like* the idea of not having to worry about including all the right libraries for my work to, er, work.
I do however have concerns over how the runtime dependency resolution actually works, for a couple of reasons. First, are there any lexical gotchas? (There must be!) How can problems be addressed if they occur? And what does the runtime execution profile look like of such an app – for example, will users be waiting for script downloads at inopportune times?
I think that Alex makes a good point in his comment that basically goes to the developer convenience argument: we should do what’s right for the app, and not worry so much about libraries. I feel that, currently, there is a small but definite barrier-to-entry when adding a plugin to jQuery. You have to download the plugin, put it somewhere, add a reference to your HTML. Now you have to wonder about combining, minifying, gzipping that special combination. The dojo method avoids all that, which is great.
Scott McMillin (July 3, 2008 at 4:59 pm)
@Alex said: ‘Even so-called “lightweight” libraries like JQuery’
You see what you’ve done here, though. You’ve subtly implied that jQuery is not lightweight by using the phrase “so-called” and by putting “lightweight” in quotation marks. In fact, you could have made your point without even a comparison to another library, since you continue by saying “not because they (like Dojo).” Indeed you could have just stated:
“Dojo, at its full size, pulls in all the code it needs up-front.” etc
So I think crying foul in this case is a bit disingenuous on your part, as you did take a swipe at jQuery when it wasn’t really necessary to make your point.
Alex Russell (July 3, 2008 at 10:47 pm)
Scott:
I was simply pointing to the oft-(mis)cited assumption that some library other than Dojo will get you out of the weeds should you have the constraints outlined in the post. Had I added MooTools to the list, the point would have been just as clear (although I might not have run afoul of your particular partisan persuasions). All of these libraries (including Dojo) have made tradeoffs which are *generally* good, and can be *specifically* sub-optimal. Saying that something is “lightweight” (as many libraries do, not just JQuery) doesn’t magically let it hit specific performance targets. Nor does pointing out that marketing and reality differ in a specific case count as a “swipe”.
Regards
tontechniker (July 6, 2008 at 3:34 pm)
@John: I don’t think that this little improvement is worth a request to Google – it exposes my complete site traffic to a company that isn’t transparent to me. Even if I trust Google, what about my users? Second, the browser only loads the library once per visit (or until the browser cache is cleared), so in fact I have no problems with speed or traffic. Third, this system only works properly if many sites use it. All in all I don’t see anything that really speeds up a real page – it doesn’t speed up Twitter and it doesn’t speed up any other real site – simply because it’s not the bottleneck.
Pete (July 7, 2008 at 7:18 am)
C’mon, stop bashing each other’s libraries, guys. You all did a great job!!! Have a drink ;-)
Greets from Bavaria,
Pete
Guzmán Brasó (October 19, 2008 at 12:38 pm)
Good article. I just wrote one about a simple but effective way to load plugins only when the user actually clicks on a given feature, using the blockUI plugin to block the interface while loading.
Thought it would be nice to promote it here :)
Article URL: http://guzman.braso.info/2008/10/18/how-to-avoid-long-page-load-when-using-jquery-plugins/en/
Best regards from Montevideo