### NO MAGIC BULLET

Computability theory shows that it will always be possible to invent new optimizing transformations.

Let us say that a *fully optimizing compiler* is one that transforms each program *P* into a program **Opt**(*P*) that is the *smallest* program with the same input/output behavior as *P*. (We could also imagine optimizing for speed instead of program size, but let us choose size to simplify the discussion.) For any program *Q* that produces no output and never halts, **Opt**(*Q*) is short and easily recognized:

*L*_{1} : **goto** *L*_{1}

Therefore, if we had a fully optimizing compiler, we could use it to solve the halting problem: to see whether there exists an input on which *P* halts, just see whether **Opt**(*P*) is the one-line infinite loop. But we know that no computable algorithm can always tell whether programs halt, so a fully optimizing compiler cannot be written either.

Since we can't make a *fully* optimizing compiler, we must build *optimizing compilers* instead. An optimizing compiler transforms *P* into a program *P*′ that always has the same input/output behavior as *P*, and might be smaller or faster. We hope that *P*′ runs faster than the optimized programs produced by our competitors' compilers.

No matter what optimizing compiler we consider, there must always exist another (usually bigger) optimizing compiler that does a better job. For example, suppose we have an optimizing compiler *A*. There must be some program *P*_{x} which does not halt, such that *A*(*P*_{x}) ≠ **Opt**(*P*_{x}); if this were not the case, then *A* would be a fully optimizing compiler, which cannot exist. Therefore, there exists a better compiler *B*:

*B*(*P*) = **if** *P* = *P*_{x} **then** [*L* : **goto** *L*] **else** *A*(*P*)

Although we don't know what *P*_{x} is, it is certainly just a string of source code, and given that string we could trivially construct *B*.
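The construction of *B* from *A* can be sketched concretely. In this toy illustration (all names are hypothetical, not from any real compiler), programs are just source strings, the baseline optimizer is any function from programs to programs, and a wrapper special-cases one program known never to halt:

```python
# Toy sketch of building a better optimizer B from a given optimizer A.
# Programs are plain source strings; "optimizing" means returning a
# smaller program with the same input/output behavior.

INFINITE_LOOP = "L: goto L"  # the shortest program that never halts

def make_better_optimizer(A, P_x):
    """Given an optimizer A and a program P_x known never to halt
    (so Opt(P_x) is the one-line infinite loop), return an optimizer B
    that beats A on at least the input P_x."""
    def B(P):
        if P == P_x:
            return INFINITE_LOOP   # B gets P_x exactly right
        return A(P)                # everywhere else, defer to A
    return B

# Example: A is the trivial optimizer that changes nothing.
A = lambda P: P
P_x = "L1: x = x + 1\ngoto L1"     # clearly never halts, never outputs
B = make_better_optimizer(A, P_x)

assert B(P_x) == INFINITE_LOOP      # strictly smaller than A(P_x)
assert B("print 42") == "print 42"  # unchanged on every other program
```

The point of the sketch is that *B* is trivially computable once the string *P*_{x} is in hand; the difficulty lies only in discovering such a *P*_{x}, not in using it.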
The optimizing compiler *B* isn't very useful; it's not worth handling special cases like *P*_{x} one at a time. In real life, we improve *A* by finding some reasonably general program transformation (such as the ones listed at the beginning of the chapter) that improves the performance of many programs. We add this transformation to the optimizer's "bag of tricks" and we get a more competent compiler. When our compiler knows enough tricks, we deem it *mature*.
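The "bag of tricks" idea can be sketched as a list of general rewrite passes applied in sequence. In this hypothetical sketch (the instruction format and pass names are invented for illustration), programs are lists of three-address tuples, and each trick is a transformation that improves many programs rather than one special case:

```python
# Hypothetical sketch of an optimizer as a "bag of tricks": a list of
# general rewrites, each applied wherever it matches. Programs here are
# lists of (dest, op, args) tuples in a made-up three-address form.

def constant_fold(instrs):
    """Replace additions of integer constants with a folded constant."""
    out = []
    for dest, op, args in instrs:
        if op == '+' and all(isinstance(a, int) for a in args):
            out.append((dest, 'const', (sum(args),)))
        else:
            out.append((dest, op, args))
    return out

def dead_code_eliminate(instrs):
    """Drop assignments to names that no later instruction uses
    (keeping the final instruction as the program's result)."""
    used = {a for _, _, args in instrs for a in args if isinstance(a, str)}
    return [i for i in instrs if i[0] in used or i is instrs[-1]]

TRICKS = [constant_fold, dead_code_eliminate]  # grows as the compiler matures

def optimize(instrs):
    for trick in TRICKS:
        instrs = trick(instrs)
    return instrs

prog = [('t1', '+', (2, 3)),       # foldable to a constant
        ('t2', '+', ('t1', 1)),    # t2 is never used: dead
        ('r',  '+', ('t1', 4))]    # the program's result
print(optimize(prog))
# → [('t1', 'const', (5,)), ('r', '+', ('t1', 4))]
```

Adding another pass to `TRICKS` is exactly the "more competent compiler" step in the text: each trick is general, so it pays for itself across many input programs.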

This theorem, that for any optimizing compiler there exists a better one, is known as the *full employment theorem for compiler writers*.
