Indeed - it's the 4060 and 4070 cards that will interest most people, I imagine. It's not that long ago a "70"-series card drew ~140W...
I'm a lot more upset about the idea their "4070" could be a 300W card.
I'll put it another way ... 800W is almost a 1-bar space heater's output - that heat has to be dissipated into the room the PC is in. Sure, it might be nice in the winter to have a heating effect, but less so in the summer when it's 37C outside (and no, installing AC isn't an option).
800W is 6.67 Amps ... Assuming I bought the card and used it around 400 hours a year (8 hours a week), my power cost would be $22.40 a year ($0.07/kWh power) ... I really can't understand, when people are talking about GPUs that will cost several thousand dollars, why $20 of power a year is a concern ... Agonizing over 100W ($2.80 a year for me) seems absurd.
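For reference, the arithmetic behind that estimate is easy to reproduce. Here is a minimal sketch; the 800 W, ~400 hours/year, and $0.07/kWh figures are the ones quoted in the post above, while the 120 V mains voltage (implied by the 6.67 A figure) is an assumption:

```python
# Back-of-the-envelope annual running cost, reproducing the figures quoted above.
# The 800 W draw, ~400 hours/year, and $0.07/kWh rate come from the post;
# the 120 V mains figure (for the amps check) is my assumption.

def annual_cost(watts: float, hours_per_year: float, rate_per_kwh: float) -> float:
    """Yearly electricity cost, in whatever currency rate_per_kwh is given in."""
    kwh_per_year = watts / 1000 * hours_per_year
    return kwh_per_year * rate_per_kwh

watts = 800
hours = 400            # roughly 8 hours a week
rate = 0.07            # $/kWh

print(f"Current draw at 120 V: {watts / 120:.2f} A")                     # ~6.67 A
print(f"Energy used per year:  {watts / 1000 * hours:.0f} kWh")          # 320 kWh
print(f"Annual cost:           ${annual_cost(watts, hours, rate):.2f}")  # $22.40
```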
What state is that? Checking the rates here (the table defaults to Business; you have to manually switch it to Residential), the cheapest in the nation is 10.03 cents/kWh... and that's not counting delivery.
It's also not counting the extra heat pouring into your room, and the extra air-conditioning that you need to run to compensate for it.
AND... some of us actually do care to minimize the damage we're doing, and also think efficiency is important in technological development. It's not all about the money.
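To put a rough number on the heat/cooling point: essentially every watt a PC draws ends up as heat in the room, and an air conditioner then spends extra electricity pumping it back out. A back-of-the-envelope sketch, assuming a typical AC coefficient of performance of about 3 (the 800 W and 400 hours are carried over from the estimate above):

```python
# Extra air-conditioning energy needed to remove the heat an 800 W card dumps
# into the room. The COP of 3 is an assumed figure for a typical AC unit.

gpu_watts = 800          # essentially all of this becomes heat in the room
hours = 400              # hours per year, matching the earlier estimate
ac_cop = 3.0             # units of heat moved per unit of electricity (assumption)

heat_kwh = gpu_watts / 1000 * hours      # 320 kWh of heat per year
ac_kwh = heat_kwh / ac_cop               # ~107 kWh of extra AC electricity

print(f"Heat dumped into the room: {heat_kwh:.0f} kWh/year")
print(f"Extra AC energy to remove it: {ac_kwh:.0f} kWh/year "
      f"(about {ac_kwh / heat_kwh:.0%} on top of the card's own usage)")
```

Under that assumed COP, anyone who does run AC pays roughly a third again on top of the card's own electricity during the cooling season.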
Indeed - it's the 4060 and 4070 cards that will interest most people, I imagine. It's not that long ago a "70"-series card drew ~140W...
I'll put it another way ... 800W is almost a 1-bar space heater's output - that heat has to be dissipated into the room the PC is in. Sure, it might be nice in the winter to have a heating effect, but less so in the summer when it's 37C outside (and no, installing AC isn't an option).
As for the cost of power ... in the winter it's likely to exceed £0.45/kWh ($0.54/kWh) here, so your $22.40 becomes ~$170, which will add up over the lifetime of the card.
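Spelling that arithmetic out: the same ~320 kWh a year at $0.54/kWh comes to roughly $173, which is where the ~$170 figure comes from.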
So it seems like now nvidia is trying to make the XX90 the new XX80 by severely undercutting the XX80 series on core count and launching it late. So what should have been a $600 to $700 mainstream flagship will now cost $1,700.
remember 20 years ago when the 600 number was top of the line and now that is the bottom basement card for most people?
No, I don't. I had a Voodoo 2 SLI setup in the late '90s. Each card was $300, plus you needed a 2D card. I had a Matrox Millennium II for that, which was another $300 or so. That's $900 pre-inflation in the '90s. Adjusted for inflation, that's about $1,660. Right about where we are now. The more things change, the more they stay the same.
So let's make an assumption and then discredit an entire post on a technicality, eh? You are assuming I live in the US?
I live in Alberta. Epcor currently offers a long-term fixed rate at 9.19 CAD cents per kWh, which is about 7.12 US cents. But I locked in when it was 7.00 CAD cents, so it's actually about 5.42 US cents per kWh. It isn't even the cheapest power in Canada.
Also of course you don't care about "minimizing the damage you do." If that was true wouldn't you not buy a GPU at all? Or not post on forums a bunch? Aren't you really just justifying your own power use that you arbitrarily choose for the most comfort and entertainment you can get while demonizing anybody that chooses a slightly higher arbitrary rate?
Basically if you are an American with the means to even consider affording one of these and the tech background, you are somewhere in the top 3% of energy users that have ever lived, so it's pretty ironic to discuss how you strive to "minimize the damage you do." You literally started throwing stones while living in a glass house. Lol.
Fair enough.
Wrong. I am working via a laptop for work, and for most of my basic PC usage, I'm relying on an Athlon 200GE, which was also how I used to connect to work before they went to the "everyone connects with the company-provided laptop" policy.
Do I also have a gaming system? Sure. And, while I haven't had the opportunity to game in a while, it's chugging along on an older card that consumes 180W if maxed out.
Wrong again. Presumably, you know something about GPUs. Presumably, you also realize that the top tier GPUs tend to push their architecture to the limits, and have a very nasty tendency to manage a small gain in performance via a disproportionately hefty increase in power consumption, relative to the next GPU down.
So, it's not arbitrary to say that I don't see the need to waste a lot of extra power for bragging rights.
Correct. And, part of the reason I am of those means is because I do not overspend for the purpose of impressing others, with little actual gain or benefit from that extra expenditure.
Also correct.
Hardly. I mean, "that have ever lived" is pretty vague; I guess it's true if you're considering the entirety of human existence once Homo sapiens sapiens and Homo sapiens neanderthalensis emerged. But you're basically making a calculation that you can't possibly have the data to calculate, aren't you?
And, would that also put YOU in that top 3%?
Dead wrong. Well, if you want to be literal, my house does have windows, and of course, the solar panels have what appears to be glass.
And that gas-guzzling Prius of mine... yeah, that's a real problem, too, isn't it?
I see you were unable to stick to the original point I was making, and just decided to go ad hominem. You're new here, so, you might not know that that sort of thing tends to ultimately bring the ire of the moderators.
You know, maybe, instead of assuming I was making a personal attack, you should've just read what I was saying. I'd recommend going back to the post you took such offense to, and re-reading it.
I asked about the power rates you were paying, and, while I assumed you were in the US, I also wondered if some of the states did, in fact, have some sort of lock-in of previous rates, and if that's how you got the rate you did. Which, other than you being in Canada instead of the US, is what you subsequently mentioned with Epcor.
I also pointed out that the extra power draw does result in extra heat, and thus extra energy for cooling. You didn't touch this at all.
And, finally, what you took to be an insult was me speaking the truth. I'm not the only one who considers getting a card that performs how we want it to, rather than going for top-of-the-line bragging rights. And those of us who think that way will, if two cards perform closely but one uses notably less power, go for the one that draws less.
Maybe don't be so defensive.
That's just the "energy charge" though. When you include the distribution and transmission charges, the effective dollar amount paid per kWh is a fair bit higher (probably double or more).
Edit: looks like the local access fee is also based on usage.
That's just the "energy charge" though. When you include the distribution and transmission charges, the effective dollar amount paid per kWh is a fair bit higher (probably double or more).
Edit: looks like the local access fee is also based on usage.
That's just the "energy charge" though. When you include the distribution and transmission charges, the effective dollar amount paid per kWh is a fair bit higher (probably double or more).
Edit: looks like the local access fee is also based on usage.
5.42 US cents per kWh is my variable cost of power. They do charge me a fixed transmission cost, but I pay that even if I don't use any power. That is my variable power cost; my fixed cost portion (transmission and what not) is something I already pay whether I use a GPU or not, so it shouldn't be included in my cost estimate. The GPU only adds $22 a year in cost.
Transmission charge, distribution charge, and local access fee all vary based on usage. They're not fixed.
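The disagreement in this exchange is really about which line items on the bill scale with consumption. Here is a small sketch of the two readings; only the 5.42-cent energy charge comes from the thread, and the distribution, transmission, and local access figures below are made-up placeholders, not actual Epcor rates:

```python
# Marginal cost of the GPU's extra consumption under the two readings of the bill.
# Only the 5.42-cent energy charge is from the thread; the other per-kWh figures
# are hypothetical placeholders.

energy = 0.0542          # $/kWh, locked-in energy charge quoted above
distribution = 0.03      # $/kWh (placeholder)
transmission = 0.02      # $/kWh (placeholder)
local_access = 0.01      # $/kWh (placeholder)

extra_kwh = 320          # the card's ~320 kWh/year from the earlier estimate

# Reading 1: only the energy charge varies with usage.
cost_energy_only = energy * extra_kwh

# Reading 2: the usage-based riders scale with every extra kWh as well.
cost_all_riders = (energy + distribution + transmission + local_access) * extra_kwh

print(f"Energy-charge-only reading: ${cost_energy_only:.2f}/year")
print(f"All-usage-charges reading:  ${cost_all_riders:.2f}/year")
```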
Sigh. I always marvel at these types of comments. I'm sure when GPUs were $700 somebody was saying they should cost no more than $300.
At the end of the day you can happily upgrade to a better GPU than you currently have for the $700 you are willing to spend. That's a win for you. Why would it bother you in the least if they make extra-special cards at $1,700? Should we outlaw cards above a certain price because you say so? What if somebody else decided there should be no cards above $100? Would everybody having equally terrible cards make you feel better?
Also you have to realize that your view of "value" is highly personalized and is usually tied back to hourly earning power. If you make $10 an hour, that $700 card looks like 70 hours of work. Meanwhile, another individual might increase their net worth by $300 for every hour worked last year, meaning that $1,700 card looks like about 6 hours of work. So while for you the $700 seems like a stretch, for the other person $1,700 seems quite reasonable.
Do you work for nvidia marketing? Cause even they wouldn't defend nvidia that much.
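For what it's worth, the "hours of work" comparison above is just a division; a quick sketch using the figures from that post:

```python
# Card price expressed as hours of work, using the example figures from the post.

def hours_of_work(price: float, hourly_rate: float) -> float:
    return price / hourly_rate

print(f"$700 card at $10/hour:    {hours_of_work(700, 10):.0f} hours")    # 70 hours
print(f"$1,700 card at $300/hour: {hours_of_work(1700, 300):.1f} hours")  # ~5.7 hours
```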
Ok, 1st off my comments weren’t ad hominem at all. I indicated (with support) why I found the talk (including the ridiculous claim it needs “commercial power”) about its consumption to be ridiculous.
You went ad hominem claiming "some of us are doing everything to minimize…"
Also of course you don't care about "minimizing the damage you do." If that was true wouldn't you not buy a GPU at all? Or not post on forums a bunch? Aren't you really just justifying your own power use that you arbitrarily choose for the most comfort and entertainment you can get while demonizing anybody that chooses a slightly higher arbitrary rate?
That is an ad hominem attack.
Basically if you are an American with the means to even consider affording one of these and the tech background, you are somewhere in the top 3% of energy users that have ever lived, so it's pretty ironic to discuss how you strive to "minimize the damage you do." You literally started throwing stones while living in a glass house. Lol.
That is also an ad hominem attack.
Here is a thought that it seems you are completely overlooking:
Are you also factoring in the other costs related to this furnace of a card? The main one is the temperature of the room this card would be in during the summer.
I did bring that up. He chose to ignore it.
Figures, maybe because it throws his whole argument out the window? (Which is where the heat from this card should also go?)