Why choose dynamic analysis over static analysis?

Static analysis of JavaScript code is the norm. UglifyJS and Closure Compiler use static analysis to minify code. JSLint and JSHint use static analysis to find flaws in code. Titanium itself currently does static analysis to determine which APIs an application uses. To the best of my knowledge, no one does dynamic JavaScript analysis at compile time. So why did I choose dynamic analysis for the Titanium Code Processor, given that it hasn't really been done before? Am I insane?

Ahem, moving on...

First, some formal definitions, since I feel that these two terms can be conflated a bit. Static analysis is any form of code analysis that is performed without executing the code. The classic example of static analysis is compilation itself, since it performs various operations on the code but doesn't execute it. Dynamic analysis, on the other hand, is any type of analysis performed while executing the code. The classic example of dynamic analysis is performance profiling. Most profilers work by inserting little hooks into the code that fire during program execution.
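
As a rough illustration of that hook idea, here's a minimal sketch in JavaScript; the profile wrapper and its logging are hypothetical, not any real profiler's API:

// A toy profiling hook: wrap a function so that timing code fires on every call
// while the program runs. Real profilers do something similar under the hood.
function profile(name, fn) {
    return function () {
        var start = Date.now();
        var result = fn.apply(this, arguments);
        console.log(name + ' took ' + (Date.now() - start) + 'ms');
        return result;
    };
}

var slowSquare = profile('slowSquare', function (n) {
    for (var i = 0; i < 1e7; i++) {} // busy work so the timing is visible
    return n * n;
});
slowSquare(4); // logs something like "slowSquare took 15ms"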

The reason I chose dynamic analysis is due to the dynamic nature of JavaScript itself. Consider the following code snippet:

var x = {
    requireFoo: function() {
        require(this.foo + '_file');
    }
};
Object.defineProperty(x, 'foo', {
    value: 'bar',
    writable: false,
    configurable: false,
    enumerable: false
});
x.requireFoo();

Now suppose we want to write a code analyzer that finds all files that are required. Creating a static analyzer that properly recognizes that bar_file is required is incredibly difficult, if not impossible, in the general case. It's not that hard to write static analysis code that handles this exact situation, but what about the tens, if not hundreds, of thousands of other edge cases that could throw off a static analyzer? The problem quickly becomes intractable. Static analysis is really only good for working with "well-formed" JavaScript, for some predetermined definition of "well-formed."
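
To see why actually running the code sidesteps this, here's a minimal sketch of the dynamic approach, assuming an environment where require can be shadowed by our own recording function (the requiredFiles name is mine, not part of any real tool):

// Shadow require with a recording stub, then just execute the snippet.
// Whatever string the program actually computes gets captured, edge cases and all.
var requiredFiles = [];
function require(id) {
    requiredFiles.push(id);
    return {}; // stub module; good enough for analysis purposes
}

var x = {
    requireFoo: function() {
        require(this.foo + '_file');
    }
};
Object.defineProperty(x, 'foo', { value: 'bar' });
x.requireFoo();

console.log(requiredFiles); // -> ['bar_file']

This isn't necessarily how the Titanium Code Processor does it, but the principle is the same: the answer falls out of executing the code rather than inspecting it.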

Dynamic analysis doesn't suffer death by a thousand paper cuts the way static analysis does, because edge cases are taken care of by the runtime environment. Basically, you get all of this behavior for free. Of course this raises the question: if dynamic analysis is so much more accurate than static analysis, then why bother with static analysis at all? The first reason is speed: static analysis runtime is directly correlated with file size, while dynamic analysis runtime is determined by the complexity of the code itself (a loop body is analyzed on every iteration, not just once), so it will rarely run faster than static analysis. The other reason is difficulty: dynamic analysis introduces its own hard problems. Focusing on the use case mentioned above, where we want to find all require statements at "compile time," consider the following snippet:

var x = new Date().getFullYear();
if (x < 2012) {
    require('1');
} else {
    require('2');
}

Which file is required? It's not possible to know at analysis time because we don't know when the code will be run. How do we handle these situations? It's a difficult problem, but I have a plan. Tune in next time to find out how I solve it.