Matlab has high-level language constructs that make it easy to express linear algebra computations in a terse, intuitive syntax. Its flexibility as a scripting language, however, means that its performance is underwhelming in some circumstances. While Matlab’s just-in-time (JIT) compiler helps quite a bit, research has shown that there is considerable room for performance improvement. In particular, the McFor project, from the CS department here at McGill, parses Matlab and generates Fortran, which is then compiled, and claims speedups of 3-10x in many different scenarios. It is not yet available to the public. These speedups are comparable to what you might get by writing mex files in C; however, the effort involved is quite a bit lower.
After the lexing and parsing stages of McFor, a series of analyses and transformations are performed on the code: array type inference, shape inference, bounds checking, flow analysis, and so on. Such deep static analyses have the potential to offer appreciable speed increases over a JIT strategy. In a certain sense, what the compiler does is infer the code’s semantics.
Such semantic inference is made needlessly difficult by Matlab’s lack of expressiveness. You could imagine a Matlab-like language where it is possible to specify that a matrix has a fixed size, cannot be resized, or is tridiagonal, circulant, and so forth. That would allow a compiler to emit more specialized and efficient code.
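To make the payoff concrete, here is a small sketch of the kind of specialization such annotations would enable. If the compiler knows a matrix is tridiagonal, it can solve a linear system with the Thomas algorithm in O(n) time and O(n) memory, instead of the general O(n³) factorization it must fall back on when all it sees is "some matrix". The function name and calling convention below are my own illustration, not part of any of the systems discussed:

```python
import numpy as np

def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system Ax = d in O(n) via the Thomas algorithm.

    a: sub-diagonal (length n-1), b: main diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n).
    A general-purpose solver, knowing nothing of the structure,
    would cost O(n^3) and store the full n-by-n matrix.
    """
    n = len(b)
    cp = np.empty(n - 1)  # modified super-diagonal (forward sweep)
    dp = np.empty(n)      # modified right-hand side (forward sweep)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The point is not this particular algorithm but that the dispatch to it can only happen if the structural fact "tridiagonal" is expressible in the language and visible to the compiler.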
The Theano library for Python, from the CS department over the hill at Université de Montréal, takes this approach. It uses the NumPy library, a Matlab-like Python extension, as its numeric layer. It offers specialized datatypes which include more semantics than Matlab or NumPy. This information, combined with a deep static analysis, allows it to compile highly efficient numeric C code targeted either for CPUs or GPUs, which is callable transparently and just-in-time in Python.
Perhaps even more exciting, its understanding of code semantics allows it to perform operations typically associated with symbolic algebra systems like Maple and Mathematica. This includes symbolic simplification of expressions, automatic creation of reused temporaries, and automatic differentiation. The documentation, for example, shows how you can simply write an error function for a classification problem (logistic regression), and the software will take care of computing dE/dw not by finite differences but by symbolically deriving the gradient of the error. Deep wizardry indeed.
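To see what is being automated, here is a plain-NumPy sketch of the logistic regression case: the closed-form gradient dE/dw that Theano would derive symbolically, checked against the finite-difference approximation it lets you avoid. The function names are my own; this is an illustration of the mathematics, not Theano's API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, X, y):
    """Cross-entropy error E(w) of logistic regression."""
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def grad_analytic(w, X, y):
    """Closed-form dE/dw = X^T (p - y) / n -- the kind of expression
    an autodiff system derives from the definition of loss() above."""
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y)

def grad_finite_diff(w, X, y, eps=1e-6):
    """Central finite differences: one loss evaluation pair per weight,
    and only approximate -- this is what symbolic gradients replace."""
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e, X, y) - loss(w - e, X, y)) / (2 * eps)
    return g
```

In Theano you would write only the equivalent of `loss()` on symbolic variables and ask for its gradient; the system produces (and simplifies, and compiles) the analytic expression for you.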