Monday, November 14, 2011

Realtime Web

I started work on a whimsical presentation I will soon give to the Biodiversity Informatics Group at the Marine Biological Laboratory about the Realtime Web and came up with the following kooky slide. Felt the urge to share.

Sunday, November 13, 2011

Amazing Web Site Optimizations: Priceless

Quite literally, priceless. As in costs nothing.

I have been obsessed with web site optimization these past few weeks, trying to trim off every bit of fat from page render times. As we all know, if a page takes longer than approx. 3-4 seconds to render, you can expect to lose your audience. Expectations for speed vary with the end-user's geographic location, but a fast experience for a user in Beijing is just as important as one for a user in California. Server hardware typically isn't the bottleneck; put another way, remarkable boosts in performance can be had on crap hardware. So, this post presents the tools I used to measure web site performance and describes the simple techniques I employed to trim the excess fat.

My drug of choice for measuring the effect of every little (or major) tweak has been WebPagetest, a truly invaluable service because I can quickly see where in the world, and why, my web page suffered. Knowing that it took 'x' ms to download and render a JavaScript file or 'y' ms to do the same for a css file meant I could see with precision what a bit of js or css cleansing does to a user's perception of my web site. I also used Firebug and Yahoo's YSlow, both as Firefox plug-ins. Google Chrome also has a Page Speed extension that I used to produce a few optimized versions of graphics files.

Some tricks I employed to great effect, in order from most to least important:

  1. Make css sprites. The easiest tool I found was the CSS Sprite Generator. Upload a zipped folder of icons and it spits out a download and a css file. Could it be any easier? Making a css sprite eliminates a ton of unnecessary HTTP requests and is by far the most important technique to slash load times (see the first sketch after this list).
  2. Minify JavaScript and css. For the longest time, I was using the facile JavaScript Compressor, but the cut/paste workflow became too much of a pain. So, I elected to use some server-side code to do the same: jsmin-php and CssMin. When my page is first rendered, the composite js and css files are made in memory then saved to disk. Upon re-rendering (by anyone), the minified versions are served. Here's the PHP class I wrote that does this for me (a stripped-down sketch also follows this list). Whenever I deploy new code, the cached files are deleted then recreated with a new MD5 hash as file names.
  3. Properly configure the web server. This is especially important for a user's second, third+ visit. You'd be crazy not to take advantage of the fact that a client's browser can cache! I use Apache and here's what I have:

    <Directory "/var/www/SimpleMappr">
    Options -Indexes +FollowSymlinks +ExecCGI
    AllowOverride None
    Order allow,deny
    Allow from all
    DirectoryIndex index.php
    FileETag MTime Size
    <IfModule mod_expires.c>
    <FilesMatch "\.(jpe?g|png|gif|js|css|ico|php|htm|html)$">
    ExpiresActive On
    ExpiresDefault "access plus 1 week"
    </FilesMatch>
    </IfModule>
    </Directory>

    Notice that I use the mod_expires module. I also set FileETag to MTime Size, though this was only marginally effective.
  4. Include ALL JavaScript files just before the closing body tag. This boosts the potential for parallelism and the page can begin rendering before all the JavaScript has finished downloading.
  5. Serve JavaScript libraries from a Content Delivery Network (CDN). I use jQuery and serve it from Google. Be wary, though: on average, it is best to draw content from no more than three or four external domains, including any static-content subdomains of your own site. Beyond that, DNS look-up times outweigh the benefit of parallelism, especially for older versions of Internet Explorer. Modern browsers are capable of more simultaneous connections, but we cannot (yet) ignore IE. I once served jQueryUI via the Google CDN, but because this was yet another HTTP request, it was slower than serving it from my own server. So, I now pull jQuery from the Google CDN and include jQueryUI with my own JavaScript in a single minified file served from my server.
  6. Put the whole site behind a Content Delivery Network. I use CloudFlare because it's free, was configured in 5 minutes and, within a day, there was noticeable global improvement in web page speed as measured via WebPagetest. Because I regularly push new code, I use the CloudFlare API to flush their caches whenever I deploy (see the last sketch after this list). However, this is largely unnecessary because they do not cache HTML and, as mentioned earlier, I use an MD5 hash in my js and css file names.
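To illustrate item 1: once the icons are combined into a single sprite, each element selects its icon by shifting the background position. A minimal css sketch (the file name, sizes and offsets here are made up):

    /* One HTTP request covers all icons: icons.png is the generated sprite */
    .icon {
      background: url(/images/icons.png) no-repeat;
      width: 16px;
      height: 16px;
    }
    /* Each icon is picked out by its x/y offset within the sprite */
    .icon-save  { background-position: 0 0; }
    .icon-print { background-position: -16px 0; }
    .icon-map   { background-position: -32px 0; }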
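For item 2, here is a stripped-down sketch of the minify-and-cache idea. This is not my actual class, just the gist: the function name and cache path are made up, and it assumes jsmin-php's JSMin::minify() (CssMin works much the same way for css):

    <?php
    require_once 'jsmin.php'; // jsmin-php

    function cached_js(array $files, $cache_dir = 'cache/') {
      // Concatenate the source files, then minify the whole lot once
      $js = '';
      foreach ($files as $file) {
        $js .= file_get_contents($file) . "\n";
      }
      $minified = JSMin::minify($js);

      // The MD5 hash of the content becomes the file name, so a fresh
      // deployment produces a new URL and stale browser caches are bypassed
      $name = md5($minified) . '.js';
      if (!file_exists($cache_dir . $name)) {
        file_put_contents($cache_dir . $name, $minified);
      }
      return $cache_dir . $name; // drop this into the script tag's src
    }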
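And for item 6, flushing CloudFlare's cache on deploy amounts to a single authenticated API call. A sketch, assuming the client API's fpurge_ts action; the endpoint and parameter names are from their v1 interface, so verify against CloudFlare's current documentation:

    <?php
    // Purge everything CloudFlare has cached for a zone
    function cloudflare_purge($email, $api_token, $zone) {
      $ch = curl_init('https://www.cloudflare.com/api_json.html');
      curl_setopt($ch, CURLOPT_POST, true);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_POSTFIELDS, array(
        'a'     => 'fpurge_ts', // action: purge the entire zone cache
        'tkn'   => $api_token,
        'email' => $email,
        'z'     => $zone,       // e.g. the apex domain of the site
        'v'     => 1,
      ));
      $response = curl_exec($ch);
      curl_close($ch);
      return json_decode($response, true);
    }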
So there you have it: I trimmed 4-6 seconds from a very JavaScript-heavy web site, and page re-renders are typically sub-second from most parts of the world. Because much of the content is proxied through CloudFlare, my little server barely breaks a sweat.

Did I mention that none of the above cost me anything?

Sunday, June 26, 2011

SimpleMappr Embedded

I never had high hopes for SimpleMappr. There are plenty of desktop applications that produce publication-quality point maps. But it turns out, users find these hard to use or too rich for their pocketbooks. As a result, my little toy and its API are getting a fair amount of use. I find this greatly encouraging, so I occasionally clean up the code and add a few logical, unobtrusive options.

A number of users appear to want output they can copy-paste into web pages rather than into manuscripts, so I just wrote an extension to permit embedding.

Here's one such example, using the URL http://www.simplemappr.net/?map=643&width=500&height=250 (the width and height parameters set the dimensions of the embedded map).
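The markup to drop that into a page is then a one-liner. A minimal sketch, assuming the URL above returns the rendered map image itself; if it instead returns a page, an iframe with the same src would do the job:

    <img src="http://www.simplemappr.net/?map=643&width=500&height=250"
         width="500" height="250" alt="SimpleMappr map" />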



Happy mapping...