Follow-up: Speeding Up JavaScript

Diving more deeply into the issue of speeding up JavaScript and the load balancing question, Scott Conroy points out:

The single URL strategy has a major downside, though it is certainly cleaner than having to deal with many URLs. Since HTTP 1.1 says that user agents and servers SHOULD have only two concurrent connections, requests for multiple resources can easily develop into blocking operations. If, say, I have a page that includes twenty images to download, my browser (in its default config) will only download 2 images at a time. If I put those images on multiple "servers" (e.g. a.images.example.com, b.images.example.com, c.images.example.com) then I'm able to download two images from *each* of the servers. Accomplishing this without a content management system - or some fancy HTTP response rewriting - could be onerous, but it's likely worth it for some media-rich sites/applications.
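
To make the mechanics concrete, here is a minimal sketch of that sharding idea in TypeScript. The hostnames follow Scott's example (a.images.example.com and friends); the hash function, shard count, and the shardedUrl helper are illustrative assumptions, not a description of any particular implementation.

```typescript
// Minimal sketch of "domain sharding": deterministically map each resource
// path to one of several image hosts so the browser can open its per-host
// connection limit against each of them. Hostnames follow Scott's example;
// everything else here is an illustrative assumption.
const SHARD_HOSTS = [
  "a.images.example.com",
  "b.images.example.com",
  "c.images.example.com",
];

// Simple stable string hash; any stable hash would do. Stability matters:
// the same path must always map to the same host, or the browser cache
// is defeated by the same image arriving from different hostnames.
function hashPath(path: string): number {
  let h = 0;
  for (let i = 0; i < path.length; i++) {
    h = (h * 31 + path.charCodeAt(i)) >>> 0;
  }
  return h;
}

// Resolve a site-relative path like "/images/logo.png" to a sharded URL.
function shardedUrl(path: string): string {
  const host = SHARD_HOSTS[hashPath(path) % SHARD_HOSTS.length];
  return `http://${host}${path}`;
}

// Usage: twenty images spread across three hosts, so a browser limited to
// two connections per host can fetch up to six at a time instead of two.
for (let i = 1; i <= 20; i++) {
  console.log(shardedUrl(`/images/photo-${i}.jpg`));
}
```

The stable hash is the important design choice here: the same path must always resolve to the same host, or repeat requests land on different hostnames and the browser cache does you no good.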

I took umbrage at the statement "This [a single URL] is not easy to scale" because I was looking at it purely from a server-side viewpoint. My bad. Scott points out, correctly, that the ability to scale on the server side doesn't do a darn thing for the client side. If user agents correctly implement HTTP 1.1, then requests for multiple resources via a single host can indeed easily develop into blocking operations.

I do think a multiple-host option is a good solution to this issue in general, but I'm not thrilled about implementing it by hardcoding those hosts/URIs inside the applications being delivered. Scott appears to agree, pointing out that without help - he suggests a content management system or fancy HTTP response rewriting - this solution to the client-side scaling issue might be "onerous" to implement. Certainly as sites grow larger and more complex, or as changes are made, this architectural solution will take more time to deploy and maintain and, because you're modifying code, will increase the possibility of introducing delivery errors through malformed URLs.
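
For illustration, here is one hypothetical shape that "fancy HTTP response rewriting" might take, assuming the shardedUrl() helper from the sketch above is in scope. The naive regex is purely for brevity; a production rewriter would need to parse the HTML properly rather than pattern-match it.

```typescript
// Hypothetical sketch of response rewriting: image references are rewritten
// on the way out, so the application itself never hardcodes shard hostnames.
// Assumes shardedUrl() from the previous sketch is in scope. The regex is a
// deliberate simplification; real markup demands a real HTML parser.
function rewriteImageHosts(html: string): string {
  return html.replace(
    /src="(\/images\/[^"]+)"/g,
    (_match, path: string) => `src="${shardedUrl(path)}"`
  );
}

// Usage: the application emits single-host markup; the rewrite layer shards it.
const page = '<img src="/images/header.png"><img src="/images/footer.png">';
console.log(rewriteImageHosts(page));
// Each src now points at one of the shard hosts instead of the origin.
```

The appeal of doing this in a rewrite layer rather than in the application is exactly the maintenance point above: the shard hosts live in one place, so adding or renaming a host doesn't mean touching application code.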

WebAccelerator has long included a feature set called Intelligent Browser Referencing that, among other optimization and acceleration options, provides this very functionality: using multiple hosts to increase the number of connections between the client and the server and thus decrease load time. It does this transparently and scales well, which eliminates several of the problems inherent in implementing this type of solution manually - particularly the possibility of serving up malformed URLs.

The problem with this solution (multiple hosts) in general is that many folks today use hosting providers, and those providers don't always allow you to create additional host names willy-nilly unless you're lucky enough to be hosting your entire domain with them. Even then, you may find that adding host names is an onerous process in itself, before you ever get to modifying applications to use them. This is particularly true for the use case that first introduced this topic - blogs.

Imbibing: Coffee

Published Aug 27, 2007