ECMAScript 6 at Fluent Conf

Last week, I attended Fluent Conf, a web conference in San Francisco. It was a really good conference overall, much better than I was expecting. It's not as good as JSConf, but it was still very much worth my time. This post, and two later posts, summarize everything I learned at the conference.

I've been following the development of ECMAScript 6 for some time and am really excited for many of the new features. Getting a new language created, implemented, and adopted is no easy feat. Until now, I felt like most of the talk was focused on "won't it be cool when we can do this some magical day in the future." Not so at Fluent though. I feel like there is starting to be real, practical interest in ECMAScript 6, as if we are just starting to crest the wave from academic to practical.

Practical Workflows for ES6 Modules

Guy Bedford @guybedford

ECMAScript 6 includes a new module specification and loader specification.

The module spec specifies the syntax for defining modules in JavaScript. Importing is pretty straightforward, and looks kinda sorta Python-ish:

import { encrypt, decrypt } from "crypto";  

These imports may look synchronous, but they are actually asynchronous. Imports must go at the top of the file, so you can think of them as syntactic sugar around AMD define() calls.
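To make the AMD comparison concrete, here is roughly what that import means in AMD terms. This is illustrative only: the stub define below is included just so the snippet is self-contained, and the stub "crypto" module is invented for the example; a real AMD loader would fetch the dependency asynchronously before invoking the factory.

```javascript
// Stub module registry and define(), standing in for a real AMD loader.
var modules = {
  crypto: { encrypt: function (s) { return "enc:" + s; },
            decrypt: function (s) { return s.slice(4); } }
};
var lastExport;
function define(deps, factory) {
  // A real loader resolves deps asynchronously; this stub looks them up directly.
  lastExport = factory.apply(null, deps.map(function (d) { return modules[d]; }));
}

// import { encrypt, decrypt } from "crypto";  ...roughly becomes:
define(["crypto"], function (crypto) {
  var encrypt = crypto.encrypt, decrypt = crypto.decrypt;
  return decrypt(encrypt("hello"));
});
console.log(lastExport); // "hello"
```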

Exporting is pretty CommonJS-ish:

module "foo" {  
    export let x = 42;
}

The loader spec is in charge of figuring out how to load the modules. The most awesome part is that it's extensible. The spec isn't very formal yet, mostly just a description of Mozilla's prototype implementation. The loader spec allows you to specify your own mechanisms for resolving names, fetching the code, etc. You can, for example, extend it to use SPDY/HTTP2 or your own custom bundling tech if you want.

Traceur has support for modules, and SystemJS has support for the loader, if you want to play around with it today.
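As a toy sketch of the loader's extensibility idea, imagine a pipeline of overridable hooks. The hook names below (resolve, fetch, instantiate) only loosely mirror the draft spec; this is not the real Loader API, which was still informal at the time.

```javascript
// Toy loader: each stage is a user-replaceable hook.
function Loader(hooks) {
  this.hooks = hooks;
}
Loader.prototype.import = function (name) {
  var hooks = this.hooks;
  return Promise.resolve(hooks.resolve(name))                       // name -> address
    .then(function (address) { return hooks.fetch(address); })      // address -> source
    .then(function (source) { return hooks.instantiate(source); }); // source -> module
};

// Overriding the hooks lets you plug in custom transports or bundling:
var loader = new Loader({
  resolve: function (name) { return "/bundles/" + name + ".json"; },
  fetch: function (address) { return "42"; }, // pretend network response
  instantiate: function (source) { return { default: JSON.parse(source) }; }
});

loader.import("answer").then(function (mod) {
  console.log(mod.default); // 42
});
```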

technical and social progress toward ECMAScript 6 at facebook

Ben Newman, @benjamn

Ben started his talk with a bit of a story about the title (notice the capitalization). He mentioned a fun site called Title Capitalization and used it as an example of how technology can make old problems obsolete (cue foreshadowing).

Programming languages are notoriously difficult to “fix forward”, and we need look no further than the (largely failed thus far) Python 2 to Python 3 transition. Python 3 is 5 years old, but hardly anyone uses it, instead preferring to stick to Python 2. The key question for us in the JavaScript community is: how do we avoid the Python 3 trap with ECMAScript 6 and later?

Ben's proposed solution to this problem is to incrementally start adding the most useful features into existing code and JavaScript runtimes. To achieve this, Ben implemented a tool called recast that makes it really easy to rewrite JavaScript ASTs (recast is based on Esprima). As an example:

[3, 1, 10, 28].sort((a, b) => a - b);

is transformed into

[3, 1, 10, 28].sort(function (a, b) {
  return a - b;
}.bind(this));
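The .bind(this) in that translation matters: arrow functions capture the enclosing `this` lexically, so a faithful ES5 version must bind it explicitly. A small example (my own, not from the talk) where the binding is observable:

```javascript
// Without .bind(this), `this` inside the callback would not be `counter`,
// which is exactly the behavior an arrow function would have given us.
var counter = {
  count: 0,
  tally: function (arr) {
    arr.forEach(function (x) {
      this.count += x; // relies on the bind below
    }.bind(this));
    return this.count;
  }
};
console.log(counter.tally([1, 2, 3])); // 6
```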

Ben's original plan for the Facebook codebase was to implement a set of jslint rules that would warn developers when they tried to do things the "old" way and give them a link to a wiki explaining how to write things the ES6 way. He ended up not taking this approach because:

  • jslint tells you code might be bad after it’s written
  • No one reads wikis
  • Ben didn’t want to make educating thousands of Facebook engineers his full-time job
  • As long as old style code dominated the codebase, it would continue to dominate

Ben decided that it was better to write a set of tools that would automatically convert the old ES5 style of code to ES6 code. He came up with the following requirements for the tool:

  • Support multiple idioms
    • Facebook has a few ways of creating web components, and they all needed to be supported
  • The old, custom libraries for creating classes were converted to the new ES6 class syntax
  • The output code must be human readable
  • The output code must also be human reviewable
  • The output code must ultimately look like it was written by a human
    • This means maintaining comments, etc

Ben ran his script on the Facebook codebase with the following results:

  • changed 1647 files
  • ~75000 insertions and deletions
  • 1658 classes converted
    • About 3000 classes exist today
    • This means about half were written by hand after the switch

Lessons learned:

  • Output must be human readable
  • Tool should be very conservative
    • If the tool doesn’t know what to do, fail hard, don't guess
  • Iterate rapidly, handling more and more rare cases
  • Hand-fixing edge cases must be done carefully, so that subsequent runs of the tool don't overwrite the manual diffs
  • The tool must be idempotent
  • Use GNU parallel to parallelize script
  • Humans have right of way
    • The tool can be rerun easily, diffs created by hand not so much
  • Identify the stake holders, and convert whole functional units of code at once
  • Set intermediate milestones
  • New code mimics existing code
  • The future is longer than the past
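One of those lessons, "the tool must be idempotent", is mechanically checkable: running the transform on its own output must be a no-op. A toy harness sketching that check (the transform here is a hypothetical example of mine, not one of Ben's actual codemods):

```javascript
// Run the transform twice and require a fixed point.
function assertIdempotent(transform, source) {
  var once = transform(source);
  var twice = transform(once);
  if (once !== twice) {
    throw new Error("transform is not idempotent for this input");
  }
  return once;
}

// Hypothetical toy transform: normalize double quotes to single quotes.
function quotesToSingle(src) {
  return src.replace(/"/g, "'");
}

console.log(assertIdempotent(quotesToSingle, 'var s = "hi";')); // var s = 'hi';
```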

Callable Entities in ECMAScript 6

Dr. Axel Rauschmayer

Unfortunately I encountered a sync error with Evernote and lost my notes for this talk. It wasn't a very good talk, so I'm not too disappointed.

There was one good point from it though: the overloaded use of functions in JavaScript. Functions serve triple duty in JavaScript (as a function, as a method, and as a constructor), and it's because functions are trying to do three things at once that we run into problems (calling a constructor as a function, issues with this, etc).

ECMAScript 6 addresses most of these issues with the new support for true classes and arrow functions.
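For example, in engines that support ES6 classes, a class constructor refuses to be called as a plain function, eliminating one of the three overloaded roles (the class name and fields here are my own illustration):

```javascript
"use strict";
class Point {
  constructor(x) { this.x = x; }
}

var threw = false;
try {
  Point(1); // TypeError: class constructors must be invoked with `new`
} catch (e) {
  threw = true;
}
console.log(threw);          // true
console.log(new Point(5).x); // 5
```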

Everything is a Polyfill: Automate Deleting Code for Frontend Performance

Rachel Myers @rachelmyers and Emily Nakashima @eanakashima

At its core, the primary thesis of this talk is that polyfills should be created with an expiration date.

To set the stage, let's consider an all-too-common scenario:

Alice: "We need to use Array.indexOf but IE8 doesn't support it. What should we do?"
Bob: "Create a polyfill by modifying the Array prototype of course"
Alice: "OK, I wrote the polyfill and added it to our codebase. Let's make sure we remember to remove it once we drop IE8 support"
Bob: "There's no way I would forget, it will be downright cathartic removing all the IE8 code"

A year goes by and IE8 support is dropped

Alice: "No more IE8, yay!"
Bob: "Good riddance!"

Another year goes by

Bob: "Hey Alice, what's this indexOf polyfill doing here?"
Alice: "I vaguely remember something about it from, like, 2 years ago."
Bob: "Well, can we remove it?"
Alice: "I dunno. We'll have to do an investigation into our codebase to see where it's being used and how and cross-reference it with what all of our supported browsers provide"
Bob: "That sounds like a lot of work. Let's go get beer instead"
Alice: "Beer it is!"

To avoid this common trap, a more methodical approach to polyfills is necessary. How do we do that though?

  • Add comments to the code explaining the nature of the polyfill and when it's safe to delete it
    • This requires that developers comment their code reliably...yeah, about that...
  • Use Modernizr or a custom mechanism based on Require.js
    • Better, but causes unnecessary overhead
    • You either have to Load All the Polyfills!, or do detection and request individual polyfills from the server
    • Both techniques have a network cost

Instead, the authors recommend creating a hard coded set of polyfills that are closely tied to an expiration system integrated with your continuous integration setup.

Browser support is defined in a machine parseable format with expiration dates:

{
    "IE": {
        "8": 1404254609391,
        "9": 1417477409391,
        ...
    },
    ...
}

Polyfills can then be defined like:

var polyfills = [{  
    nativeSupport: {
        IE: 9,
        ...
    },
    code: function () {
        if (!Array.prototype.indexOf) {
            ...
        }
    }
}]

These polyfills can be passed into a polyfill loader that executes them. What's advantageous about this setup is that browser support is explicitly codified and completely unambiguous. Even better, it can be integrated into your unit testing infrastructure so that polyfills are brought to everyone's attention as soon as they expire.
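A sketch of what such a loader could look like. Only the data shapes come from the examples above; the function names and the "first version with native support" interpretation of nativeSupport are my assumptions about the design.

```javascript
// Expiration dates per browser version, as in the example above.
var browserSupport = {
  IE: { "8": 1404254609391, "9": 1417477409391 }
};

function isExpired(browser, version, now) {
  var expires = browserSupport[browser] && browserSupport[browser][version];
  return expires !== undefined && now > expires;
}

function loadPolyfills(polyfills, now) {
  var active = [], expired = [];
  polyfills.forEach(function (p) {
    // A polyfill is still needed if any supported (unexpired) browser version
    // is older than the first version with native support.
    var stillNeeded = Object.keys(p.nativeSupport).some(function (browser) {
      var firstNative = p.nativeSupport[browser];
      return Object.keys(browserSupport[browser] || {}).some(function (v) {
        return Number(v) < firstNative && !isExpired(browser, v, now);
      });
    });
    if (stillNeeded) {
      p.code();          // install the polyfill
      active.push(p);
    } else {
      expired.push(p);   // CI can fail the build: this polyfill is deletable
    }
  });
  return { active: active, expired: expired };
}

var indexOfPolyfill = { nativeSupport: { IE: 9 }, code: function () {} };
// Before IE8's expiration date the polyfill runs; after it, it's flagged:
console.log(loadPolyfills([indexOfPolyfill], 1400000000000).active.length);  // 1
console.log(loadPolyfills([indexOfPolyfill], 1500000000000).expired.length); // 1
```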

In addition to the technical advantages, there is also a major non-technical advantage: it forces devs to work with product management to determine exactly what the project's supported browsers are. It also forces devs and product management to keep this list up to date.

This type of polyfill infrastructure can be extended to HTML and CSS as well, using code analysis scripts that, for example, automatically detect vendor-prefixed CSS rules and compare them against the supported browsers and their expiration dates.
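A hypothetical sketch of that CSS analysis: scan stylesheet text for vendor-prefixed properties so they can be cross-referenced with the browser expiration dates. The prefix-to-browser mapping here is illustrative, not from the talk.

```javascript
// Map each vendor prefix to the browser family it targets.
var prefixOwners = { "-webkit-": "WebKit", "-moz-": "Firefox", "-ms-": "IE" };

function findPrefixedProperties(css) {
  var re = /(-webkit-|-moz-|-ms-)[\w-]+/g;
  var found = [], m;
  while ((m = re.exec(css)) !== null) {
    found.push({ property: m[0], browser: prefixOwners[m[1]] });
  }
  return found;
}

var hits = findPrefixedProperties(".box { -ms-flex: 1; -webkit-transform: none; }");
console.log(hits.length); // 2
```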