Extensible Web Resource Loading Manifesto

A modern web page is assembled from dozens of assets with different content-types. The requests for these assets are initiated in one of three ways: by the document parser discovering a tag with a resource URL (e.g. img[src]), by a dynamic request initiated from Javascript (e.g. XHR, a DOM-injected element), or by a CSS declaration.

There are no specified rules for how or why certain resource fetches should be prioritized or deferred. That said, all browser vendors have implemented some form of heuristic and predictive fetch prioritization to accelerate page loading - e.g. HTML, CSS, and Javascript are considered critical; images and other asset types are less so; some browsers limit the number of concurrent image downloads; resources declared in CSS are lazyloaded by default; and so on. The introduction of preload parsers has further codified these content-type heuristics into de-facto fetching rules.

These content-type defaults are reasonable for the majority of cases, but they are insufficient for delivering an extensible and perf-friendly platform. In fact, the lack of app-level control over these fetching decisions is the root cause of the vast majority of resource loading performance problems that developers fight with today - delayed fetches, unwanted lazyloading, eager fetching of non-critical resources, and so on.

Extensible resource fetching

To deliver on the extensible web promise, each resource fetch must provide the following features:

  1. The browser may initialize fetch defaults for each resource, just as it does today.
  2. The developer must be able to:
    • Modify default fetch settings of all requests initiated via JS, CSS, and HTML. 1
    • Define the preloader policy for any resource declared in CSS and HTML. 2
    • Define the fetch dispatch policy for any resource declared in CSS and HTML. 3
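None of the syntax below exists today - it is a purely hypothetical sketch of what feature-equivalent fetch settings could look like across the three surfaces (the attribute name, the url() parameter, and the option key are all invented for illustration):

```
<!-- HTML: hypothetical attribute controlling fetch priority -->
<img src="/hero.jpg" fetch-priority="high">

/* CSS: hypothetical parameter alongside url() */
.hero { background-image: url("/hero.jpg") fetch-priority("high"); }

// Javascript: hypothetical option passed to the Fetch API
fetch("/hero.jpg", { priority: "high" });
```

The point is not the particular spelling, but that the same setting is expressible wherever a fetch can be initiated.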

Every element, function (CSS or JS), or other means of initiating a resource fetch must meet these criteria to provide a consistent, extensible, and perf-friendly platform. If you are defining a new platform feature that initiates a resource fetch, you should be required to explain how the above is possible; each relevant working group needs to tackle the question of how to retrofit the above requirements onto existing mechanisms.

  1. We need a consistent and feature-equivalent interface in HTML, CSS, and Javascript. E.g...
  2. It should be possible to extend the preload scanner with custom logic. E.g...
    • opt-in a new custom element, or opt-out any existing HTML element.
    • opt-in a CSS resource for preloader processing, and so on.
  3. Opt-out from CSS lazyloading and browser-specific heuristics that may delay fetch dispatch.
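To illustrate (2) and (3), a hypothetical - i.e. entirely made up - opt-in/opt-out syntax might look like:

```
<!-- Hypothetical: opt a custom element into preload scanning -->
<x-gallery src="/photos.json" preload-scan></x-gallery>

<!-- Hypothetical: opt an existing element out of preload scanning -->
<img src="/below-the-fold.jpg" no-preload-scan>

/* Hypothetical: opt a CSS-declared font out of lazyloading */
@font-face {
  font-family: "Brand";
  src: url("/fonts/brand.woff2") eager;
}
```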

A few broken examples...

  • Image fetching
    • There is no way to modify default image fetch settings
    • Image CSS declarations have no opt-out strategy from lazyloading
    • Image fetch may be delayed in some browsers due to page structure heuristics
  • Font fetching
    • There is no way to modify default font fetch settings
    • Font CSS declarations have no opt-out from CSS lazyloading
    • Font CSS declarations have no opt-in for preloader processing
  • <X> type fetching (e.g. JSON payload)
    • Must use Javascript, missing declarative fetch mechanism
    • Javascript execution requirement precludes preloader processing
    • There is no way to modify default XHR/Fetch settings
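To make the font case concrete: in today's browsers the @font-face declaration below does not trigger a fetch by itself. The download is dispatched only once some rendered element actually matches the declared family, and there is no declarative way to override that behavior (file names are made up):

```html
<style>
  /* Declaring the face does not fetch the font... */
  @font-face {
    font-family: "Brand";
    src: url("/fonts/brand.woff2") format("woff2");
  }
  /* ...the fetch is dispatched only once this rule matches rendered content. */
  h1 { font-family: "Brand", sans-serif; }
</style>
<h1>Only now does the font download begin</h1>
```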

Some font resources are guaranteed to be used, and for them lazyloading is nothing but a performance bottleneck. Some images can be as critical as scripts and stylesheets. A Javascript execution requirement precludes early resource dispatch, which causes further prioritization and performance problems, and so on. This is not an exhaustive list - not even close. Rather, it's an illustration of the kinds of problems that many developers and applications encounter on a daily basis.

At best, inflexible defaults and heuristics limit developers' ability to improve and adapt the default resource loading strategy to fit their needs. At worst, these defaults backfire and hurt the performance of certain pages and applications. One size does not fit all.

Note that the upcoming ServiceWorker API does not address any of the above issues. A ServiceWorker can influence how requests are dispatched, but only after it receives the request from the browser - by then it is already too late to affect when or whether the fetch is initiated.
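A sketch of why (browser-only code, running in a service worker context): the fetch handler fires only after the parser or preloader has already discovered, prioritized, and dispatched the request, so the worker can rewrite or satisfy it, but cannot cause it to be discovered any earlier.

```
// Runs inside a service worker context; illustration only.
self.addEventListener('fetch', (event) => {
  // By the time this handler fires, the browser has already applied its
  // default discovery and prioritization logic to event.request - the
  // worker can only react to the request, not accelerate its discovery.
  event.respondWith(fetch(event.request));
});
```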

Laying out the work ahead

A detailed discussion on how to implement all of the above is outside the scope of this post and will require a lot of collective thought and discussion from site developers, spec wranglers, and browser engineers. That said, the high-level requirements and the key pieces of the required infrastructure are roughly as follows:

  • Fetch API attempts to unify and explain resource fetching in the web platform.
    • Fetch needs a mechanism to control transport-layer resource allocation.
    • Fetch settings initialization must be exposed in JS, CSS, and HTML.
    • Fetch API needs to be implemented and exposed to developers.
  • Web platform needs a declarative (preloader friendly) mechanism to match the Javascript Fetch API.
    • rel=preload attempts to address this problem and needs to be implemented and exposed to developers.
  • Web platform needs to define an API for interfacing with the preload scanner.
    • Preload scanner should be extensible - e.g. can handle custom elements.
    • Preload scanner should be smart enough to process HTML and CSS.
    • Preload scanner logic needs to be spec'd and consistently implemented.
  • Web platform needs to provide control over resource dispatch policies.
    • CSS needs to provide lazyload opt-out mechanisms - e.g. as parameters to url(), or some such.
    • Browser vendors need to provide an opt-out mechanism from custom heuristic-driven dispatch policies.
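For example, the markup below follows the shape of the rel=preload proposal - a declarative hint that lets the preload scanner dispatch an early fetch for a resource it could not otherwise discover (the paths are made up, and the exact attribute set is subject to the draft spec; crossorigin is required for font fetches):

```html
<!-- Early, declarative fetch for late-discovered resources -->
<link rel="preload" href="/fonts/brand.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/data/catalog.json" as="fetch" crossorigin>
```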

With the above in place, we would finally have a solid foundation for an extensible resource loading model. From there, layer on the Streams API, ServiceWorker, and other goodies - the future is bright.

Ilya Grigorik is a web performance engineer at Google, co-chair of the W3C Web Performance Working Group, and author of High Performance Browser Networking (O'Reilly).