
Nvidia cheats on 3DMark with 177.39 drivers

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Pff Inquirer BS. LOL. Calling PhysX support cheating... :shadedshu
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
So what? They enabled Physx. I don't see the issue here.
 

InnocentCriminal

Resident Grammar Amender
Joined
Feb 21, 2005
Messages
6,477 (0.93/day)
System Name BeeR 6
Processor Intel Core i7 3770K*
Motherboard ASUS Maximus V Gene (1155/Z77)
Cooling Corsair H100i
Memory 16GB Samsung Green 1600MHz DDR3**
Video Card(s) 4GB MSI Gaming X RX480
Storage 256GB Samsung 840 Pro SSD
Display(s) 27" Samsung C27F591FDU
Case Fractal Design Arc Mini
Power Supply Corsair HX750W
Software 64bit Microsoft Windows 10 Pro
Benchmark Scores *@ 4.6GHz **@ 2133MHz
That's the Inq for you, the crazy buggers starting forum 'debates' as always.

^^
 
Joined
May 19, 2007
Messages
4,520 (0.73/day)
Location
Perth AU
Processor Intel Core i9 10980XE @ 4.7Ghz 1.2v
Motherboard ASUS Rampage VI Extreme Omega
Cooling EK-Velocity D-RGB, EK-CoolStream PE 360, XSPC TX240 Ultrathin, EK X-RES 140 Revo D5 RGB PWM
Memory G.Skill Trident Z RGB F4-3000C14D 64GB
Video Card(s) Asus ROG Strix GeForce RTX 4090 OC WC
Storage M.2 990 Pro 1TB / 10TB WD RED Helium / 3x 860 2TB Evos
Display(s) Samsung Odyssey G7 28"
Case Corsair Obsidian 500D SE Modded
Power Supply Cooler Master V Series 1300W
Software Windows 11
I don't see any problem; the new Nvidia cards use PhysX anyway. Too bad if an ATI fanboy is pissed off lol, no need to bag Nvidia.
 

InnocentCriminal

Resident Grammar Amender
Joined
Feb 21, 2005
Messages
6,477 (0.93/day)
System Name BeeR 6
Processor Intel Core i7 3770K*
Motherboard ASUS Maximus V Gene (1155/Z77)
Cooling Corsair H100i
Memory 16GB Samsung Green 1600MHz DDR3**
Video Card(s) 4GB MSI Gaming X RX480
Storage 256GB Samsung 840 Pro SSD
Display(s) 27" Samsung C27F591FDU
Case Fractal Design Arc Mini
Power Supply Corsair HX750W
Software 64bit Microsoft Windows 10 Pro
Benchmark Scores *@ 4.6GHz **@ 2133MHz
I can't help but feel The Inq are biased against nVIDIA, the opposite way to how Guru3D are.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,031 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
There is no more cheating in 3DMark... just buy the Futuremark seal of approval for $1M. Count the days until FM changes their policy re: PhysX drivers.
 

InnocentCriminal

Resident Grammar Amender
Joined
Feb 21, 2005
Messages
6,477 (0.93/day)
System Name BeeR 6
Processor Intel Core i7 3770K*
Motherboard ASUS Maximus V Gene (1155/Z77)
Cooling Corsair H100i
Memory 16GB Samsung Green 1600MHz DDR3**
Video Card(s) 4GB MSI Gaming X RX480
Storage 256GB Samsung 840 Pro SSD
Display(s) 27" Samsung C27F591FDU
Case Fractal Design Arc Mini
Power Supply Corsair HX750W
Software 64bit Microsoft Windows 10 Pro
Benchmark Scores *@ 4.6GHz **@ 2133MHz

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
:roll:

Look at how The Inquirer placed a "Flame author" button below the article, so you could flame the author :laugh:
---
There's one way Futuremark can resolve this:

Physics testing should be done while stressing the GPU. That way, people with the Ageia PhysX card get a genuine advantage over those using GPU-based physics acceleration. How? PhysX processing on a GeForce comes at the expense of graphics performance. When an ATI user benches for physics and happens to have a PhysX card installed, he gets a score boost. Same with NV users.

When there's a dedicated physics test, NV users end up with an unfair advantage since it's neither the PPU nor the CPU but the GPU doing the calculations, so they needn't even own a PhysX card and can get away with scores on par with ATI + PhysX card benches. This makes for a flawed bench, since in a real-world scenario, say when playing UT3, graphics performance suffers anyway because physics is also handled by the GPU. A synthetic bench should never differ from a game. Physics tests should be done while stressing the GPU.
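
To put that in rough numbers, a toy sketch (every throughput figure and the 80/20 in-game split below are made up for illustration; they are not measured values or Futuremark's scoring):

```python
# Toy model of the point above (all numbers hypothetical): a dedicated
# physics test lets a GPU run PhysX at full tilt, while in a real game
# the same GPU has to split its time with rendering.

def physics_rate(gpu_physics_peak, share_free_for_physics):
    """Physics throughput the GPU can deliver given the fraction of it
    not busy rendering (1.0 = idle graphics, as in a dedicated test)."""
    return gpu_physics_peak * share_free_for_physics

# Hypothetical throughputs, arbitrary units
CPU_PHYSICS = 100    # quad-core CPU running PhysX in software
PPU_PHYSICS = 400    # dedicated Ageia PPU, unaffected by GPU load
GPU_PHYSICS = 1000   # GeForce running PhysX with nothing else to do

# Dedicated physics test: the GPU is nearly idle on graphics.
print("physics test, NV GPU:", physics_rate(GPU_PHYSICS, 1.0))  # 1000
# Real game (e.g. UT3): say 80% of the GPU is busy rendering.
print("in-game, NV GPU:", physics_rate(GPU_PHYSICS, 0.2))       # 200
print("in-game, ATI GPU + PPU:", PPU_PHYSICS)                   # 400
print("in-game, CPU only:", CPU_PHYSICS)                        # 100
```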
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
:roll:

Look at how The Inquirer placed a "Flame author" button below the article, so you could flame the author :laugh:
---
There's one way Futuremark can resolve this:

Physics testing should be done while stressing the GPU. That way, people with the Ageia PhysX card get a genuine advantage over those using GPU-based physics acceleration. How? PhysX processing on a GeForce comes at the expense of graphics performance. When an ATI user benches for physics and happens to have a PhysX card installed, he gets a score boost. Same with NV users.

When there's a dedicated physics test, NV users end up with an unfair advantage since it's neither the PPU nor the CPU but the GPU doing the calculations, so they needn't even own a PhysX card and can get away with scores on par with ATI + PhysX card benches. This makes for a flawed bench, since in a real-world scenario, say when playing UT3, graphics performance suffers anyway because physics is also handled by the GPU. A synthetic bench should never differ from a game. Physics tests should be done while stressing the GPU.

In a way that's true, but also bear in mind that when you stress the CPU with physics calculations, framerates DO SUFFER a lot as well, so you need light GPU usage for it not to affect the CPU physics calculations, and vice versa. I think that by doing what you say, you would only make CPU benchmarks look worse in comparison. Only an Ati card + PPU would benefit from that. In the end I only see one way of doing this right: have separate scores.

Anyway, 3DMark is not and never was a benchmark for comparing GPUs; it's people (I was going to say reviewers at first, but that's just not true, as most call it pointless even though they benchmark with it) who have used it for that purpose. 3DMark is a tool to test the whole system with regard to how well it would play future games. In that sense I do believe Nvidia hardware should have that advantage in the test: it may not be better at graphics, but right now it does look better as a gaming platform. Since AMD has decided to go Havok AND is not going to do GPU (hardware) physics just yet, I think they are lagging behind badly in this department. I'm counting the fact that most additions made to GT200 are aimed at better CUDA performance and almost none at graphics (besides using more of the same units). GT200 DOES have those extra 30 FP64 ALUs with their registers and cache that can't be used for graphics (they are only there for CUDA) and in theory could run the physics without affecting graphics performance. This puts GT200 at a disadvantage when you take physics out of the equation, IMO. In fact, much of the extra silicon that makes the chip so big and uncompetitive in graphics benchmarks is there because of these additions.

So what's the best way of resolving all this? I think separating the scores is a good start, and then EDUCATION: people need to be properly educated on how to interpret those results. I can't see any other solution.

EDIT: Also, maybe I'm wrong and things could have changed, but in theory you can use an Ati GPU for graphics and an Nvidia GPU for physics, just as you do Ati + PPU, and in that case I think the way they do it now in Vantage, stressing the card for physics "only", is legitimate. Again, they just have to separate the scores IMO.
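
A minimal sketch of what "separate scores" could look like in practice (field names, numbers, and the report layout are invented for illustration; this is not Vantage's actual result format):

```python
# Sketch of the "separate the scores" idea: report graphics, CPU physics,
# and hardware-accelerated physics side by side, with no combined number.

from dataclasses import dataclass
from typing import Optional

@dataclass
class BenchResult:
    graphics: float              # pure rendering score
    cpu_physics: float           # physics run on the CPU alone
    hw_physics: Optional[float]  # physics on a PPU or GPU; None if unavailable

    def report(self) -> str:
        hw = f"{self.hw_physics:.0f}" if self.hw_physics is not None else "n/a"
        # Deliberately no single combined number: readers weigh the parts
        # according to the games they care about.
        return (f"graphics {self.graphics:.0f} | "
                f"cpu physics {self.cpu_physics:.0f} | "
                f"hw physics {hw}")

print(BenchResult(graphics=5200, cpu_physics=900, hw_physics=None).report())
print(BenchResult(graphics=4800, cpu_physics=900, hw_physics=3100).report())
```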
 
Last edited:

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
In a way that's true, but also bear in mind that when you stress the CPU with physics calculations, framerates DO SUFFER a lot as well

I said "stress the GPU" so a scenario such as playing UT3 (where an NV GPU has to handle both graphics and physics) is created.

ATI GPU + Physics card = genuine performance advantage when doing both graphics and physics.

NVidia GPU + Physics card = genuine performance advantage when doing both graphics and physics.

but

NVidia GPU made to do only physics to add to a score = foul. Why? Because in real world, if you have just a single NV GPU, the GPU does both graphics and physics when playing a game that uses Ageia PhysX.

So people with just a NV GPU end up getting a higher 3DMark score just because their GPU switched hats, which just can't happen in a real scenario where it's made to do both graphics and physics.


So, I say stress the GPU. Make it do anything, make it render Pamela's boobs with heavy textures, shading and bump-mapping, just stress it. Ideally stressing the GPU shouldn't affect a PhysX card when it's made to do the PhysX, but when the GPU is stressed, the GPU won't be able to compute physics just as well, or none at all. So you don't have a 8800 GT pwning a HD4850 in 3DMark Vantage just because it rocks in the physics test.

And if you're talking about the 3D scene that the GPU is made to render during the physics test, and it being affected if the GPU is stressed, simple solution is that you also take into account the GPU score when rendering the 3D scene that's part of the physics test instead of just stressing it. Normally, during the physics test, the GPU's performance isn't evaluated.
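
As a rough illustration of that combined test, a sketch where both the frame rate under physics load and the physics step rate feed one number (the 0.7/0.3 weighting and all inputs are assumptions for illustration, not Futuremark's formula):

```python
# Sketch of a combined graphics+physics test: a GPU moonlighting as a PPU
# pays the rendering cost it would pay in a real game. Weights hypothetical.

def combined_score(fps_under_physics, physics_steps_per_s,
                   w_gfx=0.7, w_phys=0.3):
    # Multiplicative weighting so a collapse in either component drags
    # the whole score down; the 0.7/0.3 split is an assumption.
    return (fps_under_physics ** w_gfx) * (physics_steps_per_s ** w_phys)

# Single NV GPU doing both jobs: rendering suffers while physics runs.
print(combined_score(fps_under_physics=35, physics_steps_per_s=120))
# ATI GPU + PPU: rendering barely affected, physics on the dedicated card.
print(combined_score(fps_under_physics=55, physics_steps_per_s=90))
```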
 

InnocentCriminal

Resident Grammar Amender
Joined
Feb 21, 2005
Messages
6,477 (0.93/day)
System Name BeeR 6
Processor Intel Core i7 3770K*
Motherboard ASUS Maximus V Gene (1155/Z77)
Cooling Corsair H100i
Memory 16GB Samsung Green 1600MHz DDR3**
Video Card(s) 4GB MSI Gaming X RX480
Storage 256GB Samsung 840 Pro SSD
Display(s) 27" Samsung C27F591FDU
Case Fractal Design Arc Mini
Power Supply Corsair HX750W
Software 64bit Microsoft Windows 10 Pro
Benchmark Scores *@ 4.6GHz **@ 2133MHz
Well said btarunr!
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I said "stress the GPU" so a scenario such as playing UT3 (where an NV GPU has to handle both graphics and physics) is created.

ATI GPU + Physics card = genuine performance advantage when doing both graphics and physics.

NVidia GPU + Physics card = genuine performance advantage when doing both graphics and physics.

but

NVidia GPU made to do only physics to add to a score = foul. Why? Because in real world, if you have just a single NV GPU, the GPU does both graphics and physics when playing a game that uses Ageia PhysX.

So people with just a NV GPU end up getting a higher 3DMark score just because their GPU switched hats, which just can't happen in a real scenario where it's made to do both graphics and physics.


So, I say stress the GPU. Make it do anything, make it render Pamela's boobs with heavy textures, shading and bump-mapping, just stress it. Ideally stressing the GPU shouldn't affect a PhysX card when it's made to do the PhysX, but when the GPU is stressed, the GPU won't be able to compute physics just as well, or none at all. So you don't have a 8800 GT pwning a HD4850 in 3DMark Vantage just because it rocks in the physics test.

Yeah, but I will repeat myself.

1- That would only remedy the Ati/Nvidia card + PPU advantage over a lone Nvidia GPU. By stressing the GPU you would stress the CPU too, and it wouldn't be able to perform as many physics calculations.

2- You are again treating 3DMark as a graphics card benchmark, which it isn't. By stressing the GPU you would negate the advantage of Nvidia cards at physics, and the final score wouldn't reflect the real advantage of this feature. Nvidia cards are going to be two things, a graphics card and a PPU, and could be used for both purposes; by stressing the GPU, you won't reflect their real power at physics.

The problem is this: what would be the right mix of graphics load and physics load? For example, many people might prefer having twice as many physics objects or interactions rather than more graphical greatness. I say separate the results and let people decide which is the best solution for them. As I said above, the Ageia PPU doesn't exist anymore AFAIK. They are supporting them, but they are not selling them anymore. You need an Nvidia GPU for that, used as a PPU if you will. <-- Then you need a way to know its real advantage over CPU physics, don't you?

EDIT: Since I don't think you have understood this point, I'm going to explain it: I'm not saying the scores are OK as they are now. But I don't think what you say would be legitimate either. 3DMark IS NOT A GPU benchmark. Never forget this!!
 
V

v-zero

Guest
Of course it's cheating, I thought people already knew this?! As if it's fair to use a GPU's full power in a physics benchmark when in a game (virtually) all that power would go to graphics processing. Now if there were another graphics card in the system, not in SLI, that could be used for physics, like an 8600 GTS and a 9800 GTX side by side... Of course nVidia would have to make the drivers...
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
IMO it's about as much cheating (0.1% on a 100% cheating scale, lol) as SM3.0 was when it was first implemented.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Yeah, but I will repeat myself.

1- That would only remedy the Ati/Nvidia card + PPU advantage over a lone Nvidia GPU. By stressing the GPU you would stress the CPU too, and it wouldn't be able to perform as many physics calculations.

2- You are again treating 3DMark as a graphics card benchmark, which it isn't. By stressing the GPU you would negate the advantage of Nvidia cards at physics, and the final score wouldn't reflect the real advantage of this feature. Nvidia cards are going to be two things, a graphics card and a PPU, and could be used for both purposes; by stressing the GPU, you won't reflect their real power at physics.

The problem is this: what would be the right mix of graphics load and physics load? For example, many people might prefer having twice as many physics objects or interactions rather than more graphical greatness. I say separate the results and let people decide which is the best solution for them. As I said above, the Ageia PPU doesn't exist anymore AFAIK. They are supporting them, but they are not selling them anymore. You need an Nvidia GPU for that, used as a PPU if you will. <-- Then you need a way to know its real advantage over CPU physics, don't you?

EDIT: Since I don't think you have understood this point, I'm going to explain it: I'm not saying the scores are OK as they are now. But I don't think what you say would be legitimate either. 3DMark IS NOT A GPU benchmark. Never forget this!!

I know 3DMark is not just a GPU benchmark, but it should emulate a real-world game scenario, because its credibility as a "gamers' benchmark" is at stake. You wouldn't want an 8800 GT to own an HD4850 just because in one particular test the physics processing abilities are tested with the GPU doing the work (and the GPU does it better than a CPU), while in that same test the graphics performance isn't taken into account. Hence the solution I see is a new game test composed of full-on graphics (SM 4.0, HDR) plus PhysX thrown in, so that a setup comparable to playing UT3 or any of the modern PhysX-enabled games is created. This game test should be supplied as a free downloadable patch for all registered users, and the "physics test" should be scrapped from Vantage.

And oh, it's not that AMD 'adopted Havok and is sticking to it'; Havok runs on any CPU with SSE2. It's just that the Havok process is multi-threaded, and AMD worked closely with the Havok team to make sure that on a Phenom one core is dedicated to the Havok process. Intel did the same way back in early 2007.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
I am running Nvidia PhysX support with the 177.39 GeForce drivers and it speeds up gameplay in Crysis by a lot, 7+ frames with my 9800GX2. It's not cheating, because it affects my performance in games, not just benches. I haven't benched in ages, simply because I have more interest in how it affects my games than anything else.


I think it helps multi-GPU setups to scale better, from what I have seen.
 
Joined
Jan 24, 2008
Messages
888 (0.15/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
1. 3DMark is not a game, and all the people who use 3DMark to defend their statements are stupid.
2. PhysX support is not cheating; that's complete BS. Nvidia bought the company for a good reason.
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
Nvidia puts the GPU physics score in the CPU score?
Isn't that cheating? Vantage was made for CPU physics, NOT GPU.
The thing is, NVIDIA owns the API, they own the driver, they own the card.
It makes no sense. The NVIDIA 3DMark Vantage scores aren't even official scores; heck, they have to use approved drivers to get an official score!
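
For what the complaint amounts to in numbers, a toy sketch (the linear 50/50 mix and all figures are invented for illustration, not Vantage's real scoring formula):

```python
# Toy arithmetic (weights and numbers invented): if a CPU score mixes an
# AI-style test with a physics test, a PhysX driver that shifts the physics
# test onto the GPU inflates the "CPU" score without the CPU changing.

def cpu_score(ai_ops, physics_ops, w_ai=0.5, w_phys=0.5):
    return w_ai * ai_ops + w_phys * physics_ops  # hypothetical linear mix

print(cpu_score(ai_ops=1000, physics_ops=800))   # CPU does its own physics
print(cpu_score(ai_ops=1000, physics_ops=6000))  # GPU quietly does physics
```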
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Unless there's 100% fair play, where 3DMark only evaluates features that both ATI and NV support, 3DMark is a flawed benchmark, since it's not evaluating the physics processing abilities of a computer in a scenario identical to games.
 
Joined
Jan 24, 2008
Messages
888 (0.15/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
Nvidia puts the GPU physics score in the CPU score?
Isn't that cheating? Vantage was made for CPU physics, NOT GPU.
The thing is, NVIDIA owns the API, they own the driver, they own the card.
It makes no sense. The NVIDIA 3DMark Vantage scores aren't even official scores; heck, they have to use approved drivers to get an official score!

Every future Nvidia driver will have PhysX support. Then every future driver will not be approved. That's the end of Nvidia benchmarks in 3DMark, and hopefully the end of 3DMark altogether. Wonderful, if you put it like that.
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
Every future Nvidia driver will have PhysX support. Then every future driver will not be approved. That's the end of Nvidia benchmarks in 3DMark, and hopefully the end of 3DMark altogether. Wonderful, if you put it like that.
Then Futuremark might just remove PhysX altogether from 3DMark Vantage, end of story.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I know 3DMark is not just a GPU benchmark, but it should emulate a real-world game scenario, because its credibility as a "gamers' benchmark" is at stake. You wouldn't want an 8800 GT to own an HD4850 just because in one particular test the physics processing abilities are tested with the GPU doing the work (and the GPU does it better than a CPU), while in that same test the graphics performance isn't taken into account. Hence the solution I see is a new game test composed of full-on graphics (SM 4.0, HDR) plus PhysX thrown in, so that a setup comparable to playing UT3 or any of the modern PhysX-enabled games is created. This game test should be supplied as a free downloadable patch for all registered users, and the "physics test" should be scrapped from Vantage.

You are still seeing 3DMark as a graphics benchmark. IT IS NOT. And I'll tell you why an 8800 GT could be better for gaming, so that it would be legitimate for it to have a better score. I'll use one particular example, with the 8800 GT vs. HD4850 scenario you mentioned. Imagine a future game that wants to rely a lot on physics instead of graphics (because it could enhance gameplay a lot) and thus, on the 8800 GT, uses 50% of the card for physics and 50% for graphics. That's in theory a lot more physics horsepower than what the Ageia PPU could deliver. That gives the game UT3-like graphics and lots of physics. Here comes the first problem: with only one HD4850, playing that game (with that level of physics) is already impossible! So in this case the 8800 GT has infinitely better value on its own than the Radeon. Is it legitimate that a GAMING (NOT GRAPHICS) BENCHMARK shows the 8800 GT as the better piece of hardware? In this case, ABSOLUTELY YES. From that perspective, current scores could be well oriented. I myself think it's not legitimate to presuppose that as a valid OVERALL score when we don't know if future games are going to be like that.

Bottom line: you can't say those overall scores are legitimate, but you can't mix GPU and PPU* capabilities in the physics test either, because you would be crippling any PPU advantage whenever the card is slow at rendering. GPU and PPU capabilities need to be separated, and not as they are doing it now: give GPU and CPU scores, but keep physics scores apart, whether done on the CPU, PPU, or GPU.

*When I say PPU, I mean any hardware doing the calculations, be it the Ageia card or an Nvidia GPU.
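
To make the 50/50 example concrete, a toy sketch (all values hypothetical; the "physics budget" framing is an illustration of the argument, not data):

```python
# Toy numbers for the 50/50 example above: a game that demands a fixed
# physics budget the card must pay before rendering anything.

GAME_PHYSICS_NEED = 500   # required physics throughput (arbitrary units)
GT8800_TOTAL = 1000       # 8800 GT splits 50/50: 500 physics, 500 graphics
HD4850_TOTAL = 1200       # faster at pure graphics, but no PhysX path here

def graphics_left(total, physics_capable):
    """Throughput left for rendering once the physics budget is paid;
    a card with no hardware physics path simply can't run the game."""
    return total - GAME_PHYSICS_NEED if physics_capable else None

print(graphics_left(GT8800_TOTAL, True))    # 500 -> playable
print(graphics_left(HD4850_TOTAL, False))   # None -> can't run it at all
```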
 

panchoman

Sold my stars!
Joined
Jul 16, 2007
Messages
9,595 (1.57/day)
Processor Amd Athlon X2 4600+ Windsor(90nm) EE(65W) @2.9-3.0 @1.45
Motherboard Biostar Tforce [Nvidia] 550
Cooling Thermaltake Blue Orb-- bunch of other fans here and there....
Memory 2 gigs (2x1gb) of patriot ddr2 800 @ 4-4-4-12-2t
Video Card(s) Sapphire X1950pro Pci-E x16 @stock@stock on stock
Storage Seagate 7200.11 250gb Drive, WD raptors (30/40) in Raid 0
Display(s) ANCIENT 15" sony lcd, bought it when it was like 500 bucks
Case Apevia X-plorer blue/black
Audio Device(s) Onboard- Why get an sound card when you can hum??
Power Supply Antec NeoHe 550-manufactured by seasonic -replacement to the discontinued smart power series
Software Windows XP pro SP2 -- vista is still crap
This is cheating, open your eyes, people... the GPU should have NOTHING to do with the CPU score.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
Yeah, we all know the 2900s had drivers optimized for benches rather than games. I don't sit around and bench all day; that is not what a gaming rig is for. :laugh:

I just see it as competition between the companies. A higher score doesn't mean a better card; we all know that. ;) Those benches just give you a vague idea of what to expect in games.

Remember how CPU-dependent 3DMark 06 is. It doesn't reflect what you see in games very well. We don't call quads cheating, though. :)
 