Clearly I'm missing something obvious here, but last night's episode of Dragons' Den, in which everyone wanted to save the planet with a standby saver, has me troubled...
The Standby Saver (or whatever it was called) claims to save the (significant amount of) electricity your TV / VCR / DigiBox etc. uses while on standby. This is a subject close to my heart, as our Samsung TV has no off button - just a standby button - and I have a conscience, so I worry about it. The demo showed the four-way bar socket into which the unit is integrated consuming about 70W while the TV was running, about 15W while the TV was on standby, and 0W when the "Standby Saver" was enabled.
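To put some rough numbers on that 15W (these are mine, not the programme's - I'm guessing both the hours on standby and the tariff), it adds up surprisingly fast:

```python
# Back-of-envelope cost of leaving the TV on standby. The 15 W is the
# demo's figure; the hours-per-day and the tariff are my own assumptions.
STANDBY_WATTS = 15           # measured at the wall in the demo
STANDBY_HOURS_PER_DAY = 20   # assume ~4 hours of actual viewing a day
PENCE_PER_KWH = 10           # rough guess at a UK domestic tariff

kwh_per_year = STANDBY_WATTS * STANDBY_HOURS_PER_DAY * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year, about £{kwh_per_year * PENCE_PER_KWH / 100:.2f}")
# -> 110 kWh/year, about £10.95
```

So the problem is real enough; my gripe is with the proposed cure.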
As I understood the explanation, the unit uses rechargeable batteries that charge from the mains while the TV is running, plus a small microprocessor that learns the standby code from your remote control and kicks in when you switch the TV to standby, cutting the power off 100% (and then, one assumes, continuing to run while it waits for you to switch the TV on again). All makes sense so far. But does it make good environmental sense?
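For what it's worth, here's my mental model of what the box is doing - purely a guess at the control loop, since they showed no internals:

```python
# My guess at the unit's control loop - purely illustrative, nothing from
# the programme. It learns one IR code, then toggles a mains relay on it.
class StandbySaver:
    def __init__(self, learned_standby_code: int):
        self.learned_standby_code = learned_standby_code  # captured in "learn" mode
        self.relay_closed = True  # mains flowing to the 4-way

    def on_ir_code(self, code: int) -> None:
        if code != self.learned_standby_code:
            return  # volume, channel, etc. - ignore
        # The standby button both sleeps and wakes the TV, so toggle the relay:
        self.relay_closed = not self.relay_closed
```

And that listening loop is exactly the bit that has to stay powered somehow, which is where my doubts start.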
Well, hold on - what about the current required to keep those batteries charged? How does that compare with the usage in standby? And I notice the demo nicely avoided this issue altogether, because we don't know what the TV's power usage is without the batteries "in the loop". Call me cynical if you like, but I am suspicious. I do see you could get a benefit if you're driving four units from the 4-way (but then you have to be happy that they all go off when your TV gets switched off, I assume). But everyone got very excited when they talked about building this functionality directly into the TV itself.
"Hold on a minute," I cry. Is this a perpetual motion machine I see before me? Other than placing an old-fashioned on/off switch on the TV, which would do the job perfectly adequately (but has the downside that you have to get off your *rse to switch on the TV), you need to draw some current while sitting waiting for that signal from the remote that says "I want Pop Idol and I want it now, and I want it without having to get off my *rse!". So we've just moved that current draw from a direct draw off the mains PSU to some rechargeable batteries (and, I assume, some significant inefficiency in the charging process).
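Here's the sum I'd like to see done. Even allowing for charging losses (the 70% efficiency and the 50mW listener draw below are pure guesses on my part; only the 15W is from the demo), the batteries only win if the listening circuit needs far less power than the TV's standby circuitry:

```python
# The energy balance I'm questioning. Only the 15 W comes from the demo;
# the listener draw and the charging efficiency are assumptions of mine.
LISTENER_WATTS = 0.05     # assume the IR-listening circuit needs ~50 mW
CHARGE_EFFICIENCY = 0.7   # assume ~70% of charging energy reaches the battery
TV_STANDBY_WATTS = 15     # the demo's standby figure

# Mains energy needed (while the TV is on) to bank the listening power,
# versus drawing it straight from the mains on standby:
via_batteries = LISTENER_WATTS / CHARGE_EFFICIENCY
print(f"~{via_batteries:.3f} W-equivalent via batteries vs {TV_STANDBY_WATTS} W direct")
# -> ~0.071 W-equivalent via batteries vs 15 W direct
```

If those guesses are anywhere near right, the charging losses are a rounding error - which only sharpens the question of why the TV needs 15W to do the same listening job.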
So could someone who understands this better than me confirm (as this is the only explanation I can think of) that the benefit is that the mains PSU is extremely inefficient at supplying the tiny current required to keep the TV in standby, and that the rechargeable battery approach is therefore a better one? Because right now this just looks like the emperor's new clothes to me (I do sincerely hope it isn't). What astounds me (if this is the case) is that Sony / Samsung / Toshiba / (insert big name TV manufacturer here) didn't come up with this solution years ago if it really does make the sort of difference that was being claimed.
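If that hypothesis is right, the implied numbers are startling (again, only the 15W is from the demo; the 0.5W is my guess at what the standby logic itself actually needs):

```python
# What PSU efficiency would the demo's figures imply? The 0.5 W guess at
# the standby circuit's real needs is mine; only the 15 W is from the demo.
STANDBY_LOGIC_WATTS = 0.5   # assumed real consumption of the standby circuit
WALL_STANDBY_WATTS = 15     # what the demo measured at the socket

print(f"implied PSU efficiency on standby: {STANDBY_LOGIC_WATTS / WALL_STANDBY_WATTS:.0%}")
# -> 3%: the main PSU would be burning ~14.5 W just staying energised

# The same 0.5 W fed from batteries charged at my assumed ~70% efficiency
# costs the mains well under a watt's worth of energy:
print(f"via batteries: ~{STANDBY_LOGIC_WATTS / 0.7:.2f} W-equivalent")
# -> ~0.71 W-equivalent
```

A supply that's 3% efficient sounds absurd, but if the main PSU has to stay partially energised just to keep a few milliwatts of logic alive, maybe that really is where the 15W goes - which would make the gadget plausible after all.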
PS Why they talked about VCRs I don't know, because unless the thing is clever enough to know that I set it to record Pop Idol, it'll cut the power and take my recording with it...