To Optimize or Not to Optimize?

So with the CLI, all compilers suddenly have a great code-generator.  However, the code-generator is generally where most, if not all, optimizations occur.  Now, some of those optimizations should only happen in the code-generator because they are machine-specific.  An argument could also be made that some of the non-machine-specific optimizations should not be performed because they make the analysis of the IL harder for the code-generator.  That still leaves a fairly large class of optimizations that could be performed.  Unfortunately, right now the only ones the C# compiler performs are basic dead code elimination and branch optimization (eliminating branches-to-branches and branches-to-next).
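
In case those two optimizations sound abstract, here's a rough sketch of the kind of thing they catch.  The method and class names are made up purely for illustration, and the compiler only does this with optimizations turned on:

    using System;

    class OptimizationSketch
    {
        static int Example(bool flag)
        {
            // Dead code elimination: the condition is a compile-time constant,
            // so the compiler emits no IL at all for the body of this 'if'.
            if (false)
            {
                Console.WriteLine("never reached");
            }

            // Branch optimization: the 'goto' inside the 'if' would compile to
            // a branch whose target is the very next IL instruction, so the
            // compiler can simply drop it (branch-to-next).  Similarly, a branch
            // whose target is itself an unconditional branch gets retargeted
            // straight to the final destination (branch-to-branch).
            if (flag)
            {
                goto Done;
            }
        Done:
            return 0;
        }
    }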

Now there's lots of research, both inside and outside of Microsoft, on improving the performance of managed code.  I think there's definitely a lot of room to grow here, but I'm not sure people are looking in the right places.  So far none of the research I've seen even looks at the compiler; it's all been either source-code transformations or runtime changes.  The C# compiler is relatively fast because it doesn't have to do code-generation, so it seems like a natural idea to do at least some of the optimizations inside the compiler.  For the rest, the compiler could at least do the time-consuming static analysis up front so that the JIT can quickly consume the results and generate better code faster.  So am I the only person to think about this, or is it just an unpopular idea that gets no attention?
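
To make that hand-off idea a little more concrete, here's one purely hypothetical way it could look: the compiler proves a fact during its slow analysis pass and records it in metadata, and the JIT just trusts the annotation instead of rediscovering it.  The attribute below is invented for this sketch; nothing like it exists in the CLI or the class libraries today:

    using System;

    // Hypothetical attribute: just an illustration of how pre-computed
    // analysis results could ride along in the metadata.
    [AttributeUsage(AttributeTargets.Method)]
    sealed class ProvenPureAttribute : Attribute { }

    class AnalysisHandoffSketch
    {
        // Suppose the compiler's expensive analysis proves this method has no
        // side effects and always returns the same value for the same input,
        // so it records that fact on the method...
        [ProvenPure]
        static int Square(int x) { return x * x; }

        static int Caller()
        {
            // ...and the JIT, trusting the annotation, could fold the two
            // calls into one (or into a constant) without redoing the
            // analysis itself.
            return Square(21) + Square(21);
        }
    }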

--Grant