
NVIDIA GeForce 4XX Series Discussion


Deleted member 67555

Guest
I can't wait for the GTX300 launch, even though I'll never buy an Nvidia product again. I hope they crush ATI's performance and dominate. And why would I want to see a terrific Nvidia product I'll never buy? Competition, of course. If they crush the 5870, prices will fall, then ATI will release an even better card, then Nvidia will crush that, and so on. I love competition; it's so much better when companies compete rather than collude with one another. I can't wait to see video cards with more power than I need at a sub-$250 price point. YAY for all of us.
 
Joined
Jul 2, 2008
Messages
3,638 (0.63/day)
Location
California
The GTX280 is as fast as, or just a little slower than, the 9800GX2 in most games.

And I'm expecting the GTX300 series to do the same, which means a single-GPU card would be as fast as (or faster than) the GTX295.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.98/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
:laugh:

No he's not, most of his crap is completely baseless BS. He's usually completely wrong.

Well, I wouldn't say he's baseless at all. While his ranting style can get wearisome, he's usually (but not always) quite accurate.

For example, he did us all a service by exposing the bumpgate crap that nvidia was trying to hide and blame on everyone else - he wrote many, many articles on it and made sure nvidia couldn't sweep the problem under the carpet until everyone forgot about it. That article on The Inquirer where he exposed the dodgy video chips by cutting up a brand-spanking-new Apple notebook and looking at the chip under an electron microscope was an awesome bit of journalism - someone had to dig deep into their pockets to buy a laptop to cut up and hire the microscope. I haven't seen an article like that anywhere else.

nvidia had flatly denied the problem on the Macs until he exposed it, and exposed nvidia as liars. Kudos, Charlie. :)
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Right now the 8800GTX might be about twice as fast as an X1950XTX, but was it at launch? No. In some games it was close, but in most, no.

I had both, and it wasn't; then overclocking the GTX killed it.

It was twice as fast at launch, but you needed a Core2 to unleash all of its potential. Here's a pre-release review (it got even better after launch with new drivers):

http://techreport.com/articles.x/11211/14

At 1600x1200 with 4xAA, a common resolution at the time, it was twice as fast in Oblivion, GRAW and Quake 4, the most demanding games of the day, 80% faster in HL2: Episode One, and 50% faster in FEAR. A very different picture from what we see with the HD5870. The gap was even more notable once you factored in price: the $400 8800 GTS was significantly faster than the $500 X1950XTX. The somewhat cheaper XT wasn't a match in perf/price either, and neither were Nvidia's own 79xx cards. Nothing could touch the 8800 in price/performance while offering the same class of performance at the time. Now the GTX275/HD4890/GTX260/HD4870 all literally destroy the HD58xx cards in perf/price. The 8800 only had to compete with $300+ cards; the HD58xx has to compete even with $150 cards. There's no contest between the 8800 and the HD58xx in that regard.

EDIT: It's also important to note that the 8800 review used a stock-clocked Core2, so there was still room for improvement, while W1zzard uses a heavily overclocked i7; you can't find anything faster right now, nor will you for quite some time.
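Just to make the perf/price point concrete, here's a quick back-of-the-envelope sketch in Python. The normalized performance figures and launch prices are assumptions pulled from my reading of the reviews discussed above, purely for illustration:

Code:
# Rough perf/price comparison. Performance is normalized to the
# X1950XTX = 1.0; prices are approximate launch prices. All figures
# are illustrative assumptions, not measured data.
cards = {
    "X1950XTX": (1.00, 500),
    "8800 GTS": (1.50, 400),  # assumed ~50% faster than the X1950XTX
    "8800 GTX": (2.00, 600),  # "twice as fast"; launch price assumed
}

for name, (perf, price) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} performance per $1000")

On those assumed numbers the 8800 GTS lands around 3.75 performance per $1000 against 2.00 for the X1950XTX, which is exactly the "nothing could touch it" gap I mean.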

Well, I wouldn't say he's baseless at all. While his ranting style can get wearisome, he's usually (but not always) quite accurate.

For example, he did us all a service by exposing the bumpgate crap that nvidia was trying to hide and blame on everyone else - he wrote many, many articles on it and made sure nvidia couldn't sweep the problem under the carpet until everyone forgot about it. That article on The Inquirer where he exposed the dodgy video chips by cutting up a brand-spanking-new Apple notebook and looking at the chip under an electron microscope was an awesome bit of journalism - someone had to dig deep into their pockets to buy a laptop to cut up and hire the microscope. I haven't seen an article like that anywhere else.

nvidia had flatly denied the problem on the Macs until he exposed it, and exposed nvidia as liars. Kudos, Charlie. :)

Yeah, and that's the only thing he has ever been right about - and even then it wasn't happening the way he said it was. He found something that was genuinely wrong and then made up the rest, as he usually does, to portray something that wasn't true at all as if it were the end of the world. He and every person who believes him bring up that one "article" to claim he's right most of the time. One "article" doesn't make you right "almost always"; it makes you right once.

Some of his contributions:

- less than 30% yields for GT200 --- FALSE
- GT200 will be late --- FALSE
- Nvidia can't make GTX cards below $300 --- FALSE
- GT200 will probably be slower than RV770 --- FALSE
- Nvidia can't make dual GT200 card even at 55nm --- FALSE

Wait, I think the tape is finished, turn it over.

- less than 20% yields for GT300 --- FALSE
- 2% yields for GT300 --- FALSE
- GT300 will be a flop because it's MIMD --- ?
- GT300 will not be able to compete with Evergreen --- ?
 
Last edited:

wahdangun

Guest
So where is this rumored "monster GPU"? Still no information?

All I see in this thread is a bunch of fanboys defending their beloved company, and still no actual information has surfaced.

And Nvidia said they wanted to steal ATI's thunder by releasing benchmarks of an ES card, but so far there's been no word from Nvidia - just BS talk about how useless DX11 is.

I want the GT300 to arrive fast, so ATI will lower their prices and I can grab an HD 5870 1GB. No need for more expensive hardware; that one is already overkill.
 

qubit

Overclocked quantum bit
Yeah, and that's the only thing he has ever been right about - and even then it wasn't happening the way he said it was. He found something that was genuinely wrong and then made up the rest, as he usually does, to portray something that wasn't true at all as if it were the end of the world. He and every person who believes him bring up that one "article" to claim he's right most of the time. One "article" doesn't make you right "almost always"; it makes you right once.

True. However, an article like that lends a lot of credence to his other work, too. Now, I can't remember all the stories he's written over the years, but they seemed generally right to me.

He was like a dog with a bone over the chip-renaming fiasco, for example - which nvidia is still pulling - and I think not letting go of something like that is a good thing.

If you can find an article where he was actually wrong - significantly wrong - I'd be very interested. I don't think there is one. Please reply about Charlie in a new thread; see below.

And finally, everyone: this thread is supposed to be about nvidia's next-gen graphics chips - please keep the discussion to that.

I know details are thin on the ground at the moment and we all want to know, but please let's not allow this to become a flamefest over Charlie's articles, which are controversial and off-topic. If you want to discuss them, please start another thread.
 

Benetanegia

New Member
True. However, an article like that lends a lot of credence to his other work, too. Now, I can't remember all the stories he's written over the years, but they seemed generally right to me.

He was like a dog with a bone over the chip-renaming fiasco, for example - which nvidia is still pulling - and I think not letting go of something like that is a good thing.

If you can find an article where he was actually wrong - significantly wrong - I'd be very interested. I don't think there is one. Please reply about Charlie in a new thread; see below.

And finally, everyone: this thread is supposed to be about nvidia's next-gen graphics chips - please keep the discussion to that.

I know details are thin on the ground at the moment and we all want to know, but please let's not allow this to become a flamefest over Charlie's articles, which are controversial and off-topic. If you want to discuss them, please start another thread.

You have a list in my last post. I could have included what he said about the G92 too, but really, there's no point - just swap GT200 for G92 and you're good to go.

IMO, from the very moment Demerjian is cited in relation to GT300, which is on-topic, discrediting him as he deserves becomes just as on-topic.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
@newtekie1

It's clear that the HD5870 is a lot more energy efficient, because 90% of the time that you use your computer it's idle, and the HD5870 only consumes about 20W at idle.

I have my computer on most of the time - restarting is a PITA - and I'm here at TechPowerUp most of the time when I'm not gaming, so it's pretty obvious that idle power consumption is really important.

I don't turn off my computer and then get out my laptop to type these posts. I play games 2-3 hours a day, sometimes not at all, and otherwise I'm just surfing the web or doing homework, research, and reports.

Idle power consumption applies to everyone, except the peeps running quad-SLI or quad CrossFire or something :rolleyes:. And that's 0.1% (1 in every thousand).

Not really. I put my computer to sleep when it idles, as do 90% of the people who own computers. Modern operating systems are great at handling this.

Besides that, the GTX295 is only what, 55W at idle? Not a whole lot more. And remember, we are talking about CrossFire here, so double the 20W to get 40W. That 15W difference is not going to kill you; in fact, you aren't even going to notice it on your electric bill. And the fact of the matter is that people buying these high-end cards are NEVER worried about idle power consumption. They don't care. Period. You are talking about people putting out a minimum of $450 in graphics cards - do you really think they're worried about the $10 extra a year one will cost over the other due to slightly higher idle power consumption? No, they aren't.

However, power consumption under load is far more important to the people buying these cards. Why? Because it determines what power supply can be used. The difference between a GTX295 and CF HD5870s might mean the difference between an 850W power supply and a 750W power supply.
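To show what I mean about load power driving the PSU choice, here's a minimal sizing sketch in Python. The board-power figures and the 80%-loading rule of thumb are assumptions for illustration only, not measured numbers:

Code:
# Back-of-the-envelope PSU sizing. All wattages are assumed,
# illustrative figures; check real load measurements before buying.
def recommended_psu_watts(cpu_w, gpu_w, rest_w, max_load_fraction=0.8):
    """Size the PSU so sustained draw stays at or below ~80% of its rating."""
    return (cpu_w + gpu_w + rest_w) / max_load_fraction

# GTX295 (~290W board power assumed) vs. CF HD5870 (~190W each assumed),
# with an assumed 130W CPU and 100W for the rest of the system.
print(recommended_psu_watts(130, 290, 100))      # 650.0 -> a 750W unit fits
print(recommended_psu_watts(130, 2 * 190, 100))  # 762.5 -> you want 850W

That's the 750W-vs-850W difference I'm talking about, and it's driven entirely by load power, not idle power.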

He was like a dog with a bone over the chip-renaming fiasco, for example - which nvidia is still pulling - and I think not letting go of something like that is a good thing.

When you can explain to me why the renaming was bad for the consumer, I'll call it a "fiasco". Renaming SKUs so they fall better in line with the current performance-based numbering scheme only helps the consumer. There was no way a consumer would come out worse off by renaming the older SKUs to fill the mid- and low-end market. And if you really examine the situation, the consumers he claims it affected would probably have been worse off had nVidia not renamed the G92 SKUs.
 
Last edited:

qubit

Overclocked quantum bit
Please use the Charlie D thread to post about Charlie

As I've said in big bold red letters above, this thread isn't about Charlie Demerjian; discussing him here is off-topic, derails the thread, and will only attract unwanted moderator attention.

Therefore, I won't reply to anything said about him in this thread.

So please direct all your replies (and flames! It's OK :) ) about him to my new thread in General Nonsense, The Charlie Demerjian flame thread.

Now, I can't say fairer than that!
 

qubit

Overclocked quantum bit
Benetanegia & newtekie1, I've replied to you there. ;)
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,753 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I'm hoping GT300 packs a huge wallop - more than GTX295-level average performance from a single GPU would be... woah. I had hopes a single 5870 could do it... I guess there's still time.

MARS-level or higher performance... seems bold, but didn't we recently have predictions (by nvidia) of GPU power increasing a few hundred times over the next several years? At that rate you'd hope 2x GTX285 performance isn't too much of a stretch... I can dream, right? :D

As for what's best to buy now (like 5870 CF vs GTX295), my money is on 2x GTX 260s or 275s (if the price is right).

260s should be about on par with a 295 on average, and 275s will be faster - both combinations way cheaper than 5870s (and a 295). Heck, 2x 260s might even come in around the cost of 1x 5870, with better overall performance.
 

qubit

Overclocked quantum bit
+1

I'm hoping GT300 packs a huge wallop - more than GTX295-level average performance from a single GPU would be... woah. I had hopes a single 5870 could do it... I guess there's still time.

MARS-level or higher performance... seems bold, but didn't we recently have predictions (by nvidia) of GPU power increasing a few hundred times over the next several years? At that rate you'd hope 2x GTX285 performance isn't too much of a stretch... I can dream, right? :D

As for what's best to buy now (like 5870 CF vs GTX295), my money is on 2x GTX 260s or 275s (if the price is right).

260s should be about on par with a 295 on average, and 275s will be faster - both combinations way cheaper than 5870s (and a 295). Heck, 2x 260s might even come in around the cost of 1x 5870, with better overall performance.

+1 there, Wolf. I was disappointed that the 5870 didn't beat the x2 cards too.

However, as it's not all that far behind, and nvidia are totally obsessed with the performance crown, I expect nvidia's latest and greatest to comprehensively beat it - and therefore beat all the x2s as well.

I will then upgrade to it several months down the line, when the revised version comes out (a 280-to-285 style revision) and prices have dropped to almost reasonable.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
If you want nvidia's spin unspun and their renaming tricks exposed in gory detail, then I can recommend http://www.semiaccurate.com/

The site is run by Charlie Demerjian; nvidia is his pet hate, and he's usually right, too. His anti-nvidia rants are legendary.

Usually right? Pfft. Mindless editorial ranting without sources makes YouTube look like an encyclopedia.

Awful sources. It's spam, to say the least.
 

qubit

Overclocked quantum bit
Usually right? Pfft. Mindless editorial ranting without sources makes YouTube look like an encyclopedia.

Awful sources. It's spam, to say the least.

Please see posts 56 & 59.
 

wolf

Performance Enthusiast
+1 there, Wolf. I was disappointed that the 5870 didn't beat the x2 cards too.

However, as it's not all that far behind, and nvidia are totally obsessed with the performance crown, I expect nvidia's latest and greatest to comprehensively beat it - and therefore beat all the x2s as well.

I will then upgrade to it several months down the line, when the revised version comes out (a 280-to-285 style revision) and prices have dropped to almost reasonable.

Cheers - any single GPU that can consistently beat a GTX295 has my name written all over it. I've had a 295 before (I run 260 SLI now) and I wouldn't replace my graphics subsystem with any single GPU that can't at least match what I get now.

I really want to see how a 5870 does with its memory clocked at 5.5+ GHz and maybe a higher core clock, but I feel memory bandwidth is the limiter on that card.

Somehow I suspect that, like most of Nvidia's big hitters, GT300 will pack A LOT of shaders and A LOT of memory bandwidth.
 
Joined
Apr 2, 2009
Messages
582 (0.11/day)
System Name Flow
Processor AMD Phenom II 955 BE
Motherboard MSI 790fx GD70
Cooling Water
Memory 8gb Crucial Ballistix Tracer Blue ddr3 1600 c8
Video Card(s) 2 x XFX 6850 - Yet to go under water.
Storage Corsair s128
Display(s) HP 24" 1920x1200
Case Custom Lian-Li V2110b
Audio Device(s) Auzentech X-Fi Forte 7.1
Power Supply Corsair 850HX
Besides that, the GTX295 is only what, 55W at idle? Not a whole lot more. And remember, we are talking about CrossFire here, so double the 20W to get 40W. That 15W difference is not going to kill you; in fact, you aren't even going to notice it on your electric bill. And the fact of the matter is that people buying these high-end cards are NEVER worried about idle power consumption. They don't care. Period. You are talking about people putting out a minimum of $450 in graphics cards - do you really think they're worried about the $10 extra a year one will cost over the other due to slightly higher idle power consumption? No, they aren't.

However, power consumption under load is far more important to the people buying these cards. Why? Because it determines what power supply can be used. The difference between a GTX295 and CF HD5870s might mean the difference between an 850W power supply and a 750W power supply.

Don't dismiss power savings as no big difference... that's just being ignorant of the facts of innovation. If you leave your computer on 24/7, idle power usage matters a LOT.

Let's run the numbers on this chart, using the assumed 20W idle for the 5870 to put the base system at 150W idle - there you can see the 295 uses 89W at idle... yeah, that's fricken terrible (compared to the 5870 ;)), and even 2x 5870 is less than half that power consumption.
http://images.hardwarecanucks.com/image//skymtl/GPU/HD5870/HD5870-1000.jpg

And don't tell me small numbers are meaningless, because half is half no matter how you look at it, and those small numbers add up BIG TIME over six months of 24/7 use.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.22/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
So we're looking not only at GT300's performance but also at its power and heat. That's important to remember, but in the end it doesn't tell us which is the better card unless power consumption is wayyyyy out of proportion to the performance.
 
System Name Cosmos F1000
We won't know how ATi vs. Nvidia will go this round because Nvidia has not entered the ring yet.
 

newtekie1

Semi-Retired Folder
Don't dismiss power savings as no big difference... that's just being ignorant of the facts of innovation. If you leave your computer on 24/7, idle power usage matters a LOT.

Let's run the numbers on this chart, using the assumed 20W idle for the 5870 to put the base system at 150W idle - there you can see the 295 uses 89W at idle... yeah, that's fricken terrible (compared to the 5870 ;)), and even 2x 5870 is less than half that power consumption.
http://images.hardwarecanucks.com/image//skymtl/GPU/HD5870/HD5870-1000.jpg

And don't tell me small numbers are meaningless, because half is half no matter how you look at it, and those small numbers add up BIG TIME over six months of 24/7 use.

Yes, but we are talking about CrossFire here! That means twice the power usage of a single card! How is that hard to understand?

A single HD5870 uses 20W at idle - look at W1z's review if you don't believe me. The card alone, regardless of the rest of the system, uses 20W. Double that and you get 40W. The GTX295 uses 55W. 40W is nowhere near half of 55W, so where are you pulling these numbers from?

The 15W difference is next to nothing. You will never even notice it. We are talking maybe $1.50 a month if you leave the computer on 24/7. These are $450+ cards, at least, so in the big picture idle power consumption really doesn't matter.

Just to give you an idea:
The current national average electricity cost is $0.1012 per kWh.
The difference is 15W, or 0.015kW.
That's 0.36kWh per day.
That's 131.4kWh over an entire year of 24/7 idle use.
That's $13.30 over an entire year of 24/7 idle use.

So yes, for a $450-600+ graphics card setup, $13.30 a year means nothing and can safely be ignored.
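If anyone wants to rerun those numbers themselves, here's the whole calculation in a few lines of Python (same inputs as above; the electricity rate is the assumed national average):

Code:
# Annual cost of a 15W idle-power difference at 24/7 uptime.
RATE_PER_KWH = 0.1012   # assumed US national average, $/kWh
delta_kw = 15 / 1000    # 15W difference expressed in kW

kwh_per_day = delta_kw * 24        # 0.36 kWh/day
kwh_per_year = kwh_per_day * 365   # 131.4 kWh/year
print(f"${kwh_per_year * RATE_PER_KWH:.2f} per year")  # -> $13.30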
 
Last edited:
System Name Flow
According to Canucks, the GTX 295 uses 89W, not 55W.

And I was talking about CF because 40W (2x 5870) < ½ × 89W (GTX295).
 

Benetanegia

New Member
According to Canucks, the GTX 295 uses 89W, not 55W.

And I was talking about CF because 40W (2x 5870) < ½ × 89W (GTX295).

But it makes more sense to trust W1zzard's numbers, for three reasons:

1- He measures the power consumption of the card itself, not the whole system, and he does so with a more precise method and tools.

2- His reviews are the best ones around.

3- You are posting on TPU. :)
 

wahdangun

Guest
I think SNiiPE_DoGG is quite correct (although, as newtekie said, it's not really important), and the new HD 5870 uses a new method in CF: the secondary card shuts down to sub-20 watts.
Here's the FAQ bit I quoted from Tom's Hardware:

"Speaking of CrossFire, when you have two 5870s running concurrently at idle, ATI says that secondary board will drop into an ultra-low power state (purportedly sub-20W)."

And here are the results:
http://media.bestofmicro.com/1/3/224247/original/Power Consumption.png

So it's rather inaccurate to calculate CF HD 5870 power consumption by taking one card's idle figure and doubling it, and I think Wizz should do it like Tom's Hardware, because we don't know how future cards will behave in CF or SLI (or how power management works with two or more cards in one system), or what features they may have to reduce the second card's power.
 

newtekie1

Semi-Retired Folder
OK, so even if we assume the second card uses 0W at idle (I doubt it will be that low, but let's assume):

The power savings is still only about $31 over the course of a year - again, nothing anyone buying these cards is going to be concerned with.
 

wahdangun

Guest
^ Yeah, I know - whoever buys CF HD 5870s will never think about power consumption. But hey, a little efficiency isn't going to hurt you, and it's a step in the right direction.

Cheers :toast:
 

Benetanegia

New Member
I think SNiiPE_DoGG is quite correct (although, as newtekie said, it's not really important), and the new HD 5870 uses a new method in CF: the secondary card shuts down to sub-20 watts.
Here's the FAQ bit I quoted from Tom's Hardware:

"Speaking of CrossFire, when you have two 5870s running concurrently at idle, ATI says that secondary board will drop into an ultra-low power state (purportedly sub-20W)."

And here are the results:
http://media.bestofmicro.com/1/3/224247/original/Power Consumption.png

So it's rather inaccurate to calculate CF HD 5870 power consumption by taking one card's idle figure and doubling it, and I think Wizz should do it like Tom's Hardware, because we don't know how future cards will behave in CF or SLI (or how power management works with two or more cards in one system), or what features they may have to reduce the second card's power.

Did you realise that the CrossFire setup is consuming 25W more than a single HD5870 in that graph? That's measured at the AC source, so the second card is drawing about 25W, despite what AMD says.
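One caveat: those Tom's numbers are taken at the wall, so they include PSU losses, and a wall-side delta slightly overstates what the card itself draws. Here's a minimal sketch of the conversion in Python; the efficiency figure is an assumption (real PSU curves vary, especially at light load):

Code:
# Estimate card-level (DC) draw from a wall-measured (AC) power delta.
def dc_from_ac_delta(ac_delta_w, psu_efficiency=0.85):
    """Scale the AC delta by an assumed ~85% PSU efficiency."""
    return ac_delta_w * psu_efficiency

print(dc_from_ac_delta(25))  # 21.25 -> roughly 21W at the board

On that estimate the second card would be drawing roughly 21W at the board - still above AMD's sub-20W claim, just less dramatically so.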
 