Whenever economic conditions change, the IT department has to adapt. In many shops, IT is a recurring cost carried by the entire business: the IT department is expected to service the whole organization, no matter how much it grows or shrinks.
But that model has issues, as any model does. The issue here is that IT doesn't always have visibility across the entire organization, so it is simply assumed to have enough resources to do the job. On the other side of the coin, it's difficult to demonstrate the true value of enterprise-wide IT, so the business squeezes resources to save money when times get tough. The result is a situation set up to fail: on one side, ever more automation and IT magic is needed; on the other, a big pot of money is being spent on something whose immediate value can't be tracked by the person evaluating the money side.
So from time to time you'll see companies implement chargeback systems. These systems attempt to assign a dollar value to a department or group's use of the computing resources, and then literally charge the department for that amount. That way each department has to "pony up" if they want more resources.
But this raises even more questions. First, what exactly do you charge for? Power, computers, drives, memory, admins and developers? IT managers? Network wiring? And then, of course, how do you track it? With middle-tier applications, packets get mixed and matched quite a bit, so how do you track whose work belongs to whom? Do you charge IT itself for monitoring everyone's systems, or do you try to charge that back, too?
There may be a middle ground. Perhaps the best approach is a one-time (or periodic) review of general IT resource use, followed by amortizing the entire cost of IT across the various business units once per fiscal year. This would allow IT to get proper funding, as long as the list of consumers is carefully thought out.
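The amortization idea above can be sketched in a few lines. This is a minimal, hypothetical illustration: the unit names, usage figures, and budget are invented, and the "usage" metric stands in for whatever the periodic review actually measures (CPU-hours, storage, headcount supported, and so on).

```python
# Hypothetical sketch: amortizing a yearly IT budget across business units
# in proportion to a periodic resource-use review. All names and figures
# below are illustrative assumptions, not real data.

def amortize_it_cost(total_cost, usage_by_unit):
    """Split total_cost among units in proportion to their measured usage."""
    total_usage = sum(usage_by_unit.values())
    return {
        unit: round(total_cost * usage / total_usage, 2)
        for unit, usage in usage_by_unit.items()
    }

# Results of a (hypothetical) annual review, in any consistent unit.
usage = {"Sales": 1200, "Finance": 800, "Engineering": 2000}

charges = amortize_it_cost(1_000_000, usage)
print(charges)
# → {'Sales': 300000.0, 'Finance': 200000.0, 'Engineering': 500000.0}
```

The key design point is that the split happens once per fiscal year from a single review, rather than from continuous per-packet metering, which sidesteps most of the tracking problems described above.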