Quote:
See, the thing is, those numbers are very misleading. A 1000W supply does not mean you're constantly drawing 1000W; the draw scales with what your PC is actually using. I put a 350W supply in my media center and it runs just fine, for example.
I actually calculated how much my media center costs a year, and it's only about 20 dollars even if I leave it on day and night, 365 days, and that's calculated at peak usage. I turn it off when I sleep and when I'm out of the house, so the real figure is probably not even a third of that.
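The math behind that yearly figure is simple: average watts times hours times your electricity rate. Here's a minimal sketch; the ~20W average draw and the $0.12/kWh rate are assumptions for illustration, not measured values from the post.

```python
# Yearly running cost from average draw and electricity rate.
# The 20 W draw and $0.12/kWh rate below are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost_usd(avg_watts: float, usd_per_kwh: float) -> float:
    """Cost of running a device 24/7 for a year at a given average draw."""
    kwh_per_year = avg_watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * usd_per_kwh

# ~20 W average at $0.12/kWh comes out to roughly $21 a year,
# the same ballpark as the figure above.
print(round(annual_cost_usd(20, 0.12), 2))  # -> 21.02
```

Plug in your own average draw and local rate; the cost scales linearly with both.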
Hard disks use something like 10W and they spin down when you're not using them, so they hardly use anything at all. Your CPU also underclocks when idle, drawing far less power. Yours has a 125W TDP, but the average will be much, much lower. You can get a newer one pretty cheaply that only uses half that at max load and will easily handle 1080p and even H.265 with the better GPU baked into it. Your motherboard probably uses well under 100W, and RAM maybe 15W.

So honestly, the only power-hungry thing in there is the GPU, and a newer CPU would completely replace it. That one will also underclock when idle and use far, far less; yours has a 150W TDP.

All in all, your power supply is most likely enormous overkill for that PC, and I doubt you'll go over 40 dollars a year leaving it on. It's honestly a non-issue unless you live in a country where electricity is SUPER EXPENSIVE. Keep in mind that buying a new power-friendly PC will cost you a pretty penny as well. Let's say it's 400 dollars, which is on the low end but should be plenty for a media center (mine was around 450 and plays _everything_). That investment alone would let you run the old PC for 10 years non-stop! And don't forget the new one will use electricity too, adding another 20 dollars a year. The return on investment, in other words, is absolutely horrible, and not worth it if you're doing it to save money.