Innovating with HTTP 2.0 Server Push

HTTP 2.0 enables the server to send multiple responses (in parallel) for a single client request - aka, server push. Except, why would we ever want to do such a thing? Well, an average page requires dozens of additional assets, such as JavaScript, CSS, and images, and references to all of these assets are embedded in the very HTML that the server is producing! Hence, instead of waiting for the client to discover references to these resources, what if the server just sent all of them immediately? Server push can eliminate entire roundtrips of unnecessary network latency.

In fact, if you have ever inlined a resource (CSS, JS, or an image), you've been "simulating" server push: an inlined resource is "pushed" as part of the parent document. The only difference is that HTTP 2.0 makes this pattern more efficient and far more powerful!
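To make the comparison concrete, here is roughly what the inlining workaround looks like on the server (a minimal sketch that reuses the node-spdy setup from the examples below; the script body is purely illustrative):

spdy.createServer(options, function(req, res) {
  // the asset is embedded directly in the parent document: it arrives
  // with the HTML, but cannot be cached or reused independently
  res.end('Hello World! <script>alert("hello from inlined script!")</script>');
}).listen(443);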

Hands-on with HTTP 2.0 server push

An inlined resource, by definition, is part of the parent document. Hence, it cannot be cached independently and may need to be duplicated across many different pages - this is inefficient. By contrast, pushed resources are cached individually by the browser and can be reused across many pages. An example is in order:

var spdy = require('spdy');

// options carries the TLS key and certificate required for SPDY
spdy.createServer(options, function(req, res) {
  // push JavaScript asset (/main.js) to the client
  res.push('/main.js', {'content-type': 'application/javascript'}, function(err, stream) {
    stream.end('alert("hello from push stream!")');
  });

  // write main response body and terminate stream
  res.end('Hello World! <script src="/main.js"></script>');
}).listen(443);

What we have above is a minimal SPDY server implemented with the help of the node-spdy module, which responds to all inbound requests by writing out a "Hello World!" string, followed by a script tag. Except, we are also doing something clever: we push the main.js file to the client, the body of which triggers a JavaScript alert.

As a result, by the time the browser discovers the script tag in the HTML response, the main.js file is already in its cache, and no extra network roundtrips are incurred! HTTP 2.0 server push obsoletes inlining. Best of all, server push is already supported by all SPDY-capable browsers (Firefox, Opera, and Chrome).

What else can we push?

Replacing inlined resources with server push is the canonical example. However, why stop there? What else could we push? Well, any HTTP response is fair game. Could we push a redirect? Yep, that's a simple task:

spdy.createServer(options, function(req, res) {
  //push JavaScript asset (/newasset.js) to the client
  res.push('/newasset.js', {'content-type': 'application/javascript'}, function(err, stream) {
    stream.end('alert("hello from (redirected) push stream!")');
  });

  // push 301 redirect: /asset.js -> /newasset.js
  res.push('/asset.js', {':status': 301, 'Location': '/newasset.js'}, function(err, stream) {
    stream.end('301 Redirect');
  });

  // write main response body and terminate stream
  res.end('<script src="/asset.js"></script>');
}).listen(443);

Same example as above, except we've changed the asset.js resource to be a 301 redirect to the newasset.js file. The browser caches both the redirect and the asset and executes the script without any extra network roundtrips. Applications? I'll leave it as an exercise for the reader.

Could we take this even further? What about pushing cache invalidations, and revalidations, to the client? We can mark any existing resource in the client's cache as stale (i.e. push a stale timestamp), or conversely, update its lifetime by pushing a 304 with a future timestamp. In short, the server could actively manage the client's cache!
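As a rough sketch of what that could look like (hypothetical: /app.css and the header values are illustrative, and, as the note below points out, current browsers do not act on pushed revalidations), the server could reuse the same res.push API to refresh a resource the client already holds:

spdy.createServer(options, function(req, res) {
  // hypothetical: extend the lifetime of /app.css in the client's cache
  // by pushing a 304 revalidation with a fresh expiration timestamp
  res.push('/app.css', {
    ':status': 304,
    'cache-control': 'max-age=86400',
    'last-modified': new Date().toUTCString()
  }, function(err, stream) {
    stream.end();
  });

  // conversely, the resource could be marked stale by pushing headers with
  // an already-expired timestamp, e.g. {'expires': new Date(0).toUTCString()}

  // write main response body and terminate stream
  res.end('Hello World!');
}).listen(443);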

If the server is too aggressive or misbehaved, then the client can limit the number of pushed streams, or cancel individual streams as it desires. Finding the right strategy and balance on both sides will be key to getting the best performance out of server push - not an easy challenge, but one that could yield high returns.

Note: pushing cache revalidations is not supported by current browsers. Should it be?

Client-notifications for server push

HTTP 2.0 server push is not a replacement for technologies such as Server-Sent Events (SSE) or WebSocket. Resources delivered via HTTP 2.0 server push are processed by the browser but do not bubble up to the application code - there is no JavaScript API to get notifications for these events. However, the solution to this dilemma is very simple, because we can combine an SSE channel with server push and get the best of both:

spdy.createServer(options, function(req, res) {
  // set content type for SSE stream
  res.setHeader('Content-Type', 'text/event-stream');

  var messageId = 1;
  setInterval(function(){
    // push a simple JSON message into client's cache
    var msg = JSON.stringify({'msg': messageId});
    var resourcePath = '/resource/'+messageId;
    res.push(resourcePath, {}, function(err, stream) { stream.end(msg); });

    // notify client that resource is available in cache
    res.write('data:'+resourcePath+'\n\n');
    messageId+=1;
  }, 2000);
}).listen(443);

Admittedly, a silly example, but one that illustrates the point: the server generates a new message every two seconds, pushes it into the client's cache, and then sends an SSE notification to the client. In turn, the client can subscribe to these events in application code and execute its own logic to process each event:

<script>
  var source = new EventSource('/');

  source.onmessage = function(e) {
    document.body.innerHTML += "SSE notification: " + e.data + '<br />';

    // fetch resource via XHR... from cache!
    var xhr = new XMLHttpRequest();
    xhr.open('GET', e.data);
    xhr.onload = function() {
      document.body.innerHTML += "Message: " + this.response + '<br />';
    };

    xhr.send();
  };
</script>

The long-lived SSE stream, the pushed resources, and all other HTTP 2.0 streams are efficiently multiplexed over a single TCP connection - there is no extra connection overhead or unnecessary roundtrips. By combining SSE and server push we can deliver arbitrary resources (binary or text) to the client, get the benefit of leveraging the native browser cache, and execute the appropriate application logic! Best of all, this works today.

Innovating with server push

Server push opens up an entire world of new optimization opportunities. The examples we have covered above only scratch the surface of what's possible, and there are many additional questions to consider:

  • Which resources should be pushed and when? Are they in cache already?
  • Can the server automatically infer which resources should be pushed?
  • Should advanced applications such as active cache management be supported?
  • How can we design our applications to get the most out of server push?
  • ...

With HTTP 2.0, servers have the opportunity to become much, much smarter, both in how they optimize the delivery of individual assets and, even more importantly, the delivery of the entire application. Similarly, browsers could expose additional APIs and capabilities to make this process more efficient. In fact, in the long run, server push may well prove to be the killer feature for HTTP 2.0!

P.S. Curious to learn more about HTTP 2.0? Check out the free preview of my O'Reilly book!

Ilya Grigorik is a web ecosystem engineer, author of High Performance Browser Networking (O'Reilly), and Principal Engineer at Shopify — follow on Twitter.