I've found myself adding a timer to each of the unit test frameworks I use so I can see how long my tests take. I am not trying to do anything complex. I just want to be aware of the perf. The same way pass and fail are visible, 1ms vs 90ms is visible.
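The post doesn't show any code, but the idea is easy to sketch. Here's a minimal, hypothetical version using Python's unittest: a base class that stamps each test with its wall-clock time, so the duration shows up alongside pass/fail.

```python
import time
import unittest

class TimedTestCase(unittest.TestCase):
    """Base class that prints each test's wall-clock time next to its result.

    This is an illustrative sketch, not code from the post; any real test
    framework's timer hook would do the same job.
    """

    def setUp(self):
        self._start = time.perf_counter()

    def tearDown(self):
        elapsed_ms = (time.perf_counter() - self._start) * 1000
        # Make the duration as visible as the pass/fail result.
        print(f"{self.id()}: {elapsed_ms:.1f} ms")

class MyTests(TimedTestCase):
    def test_fast(self):
        self.assertEqual(sum(range(100)), 4950)
```

Run it with `python -m unittest` and each test reports its time; nothing fails on a slow test yet, you just get to see the number.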
Premature optimization rightly has a bad rep. Creating complex code where performance doesn't matter is a bad tradeoff. That's not what I'm doing here. I am not trying to code for perf; I'm trying to be perf-aware. If the test is fast, I'm certainly not going to use that as an excuse to start changing code. If it's slow, I know about it and I won't check in something that kills perf. Once you have a number, the real question is: how do you judge whether it is fast or slow? The answer is a perf budget.
Rico Mariani has a great Designing for Performance post where he talks about using perf budgets.
In the absence of a perf budget, spend your time getting a perf budget rather than mucking about with code. Once you have perf budgets, being aware of resource consumption makes it easier to see problems earlier. A perf problem is a bug. Fixing a bug before it gets into the source tree is always a good idea.