Interpreter: remove LOCAL_VAR macro
Ancient versions of gcc could not optimize short-lived local variables
very well and would instead allocate lots of stack space for them. The
interpreter function would then sometimes trigger crashes because the
stack frame got too big. These days, making variables short-lived is
beneficial for the generated code. It also makes the code simpler.