
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Joined
Nov 4, 2005
Messages
11,683 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400 MHz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores It's fast. Enough.
Oh please, now you are just making shit up. Get your facts straight: nVidia has equaled or bettered ATi in price-to-performance for at least the last two generations, in every performance segment the two competed in.

At least you admit that the rest is true. Knowing you have a problem is the first step.

BTW, you hadn't even posted when I was writing that post, and you jumped right into it, as if some of us could tell what you were going to say and do...
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
"Running just fine" means it has good performance. Simply running with shitty performance would be far from "fine" would you say?

So you can imagine a game with hardware PhysX running better on ATi hardware than on its competitor from nVidia?
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
+1

Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.

You are seriously overestimating the power required to run PhysX; any current ATi hardware would have been able to absolutely kill it in PhysX performance. Remember, the original hardware the PhysX API ran on was 128MB PCI cards...


Yeah, I also remember the reports of PhysX running quite well on AMD with a driver wrapper. This whole restriction is due to politics/business and nothing else.

Apparently, PhysX was offered to AMD but they didn't want it, because it was "nVidia" stuff, but don't quote me on that.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
At least you admit that the rest is true. Knowing you have a problem is the first step.

No, I've already addressed the rest previously, and don't feel like repeating myself.

So you can imagine a game with hardware PhysX running better on ATi hardware than on its competitor from nVidia?

With PhysX there really isn't a "better"; it pretty much either works or it doesn't. Obviously, there are some slight variations, largely dependent on the amount of physics being calculated through PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX, and why Batman actually has different PhysX levels that demand different levels of performance from the PhysX card.

However, at the time the hacked drivers surfaced, it either just worked or it didn't. The performance issues came down to rendering the actual graphics more than to PhysX. So we never really got to test whether the higher PhysX levels in modern games like Batman would really be hindered on an ATi card.

Even if development had continued once the hacked drivers were released, I don't think we would ever see PhysX running better on ATi than on nVidia, simply because development on the ATi side was severely delayed. However, I do believe that similar cards from the two would have performed similarly. Not identically, but similarly.
 
Joined
Oct 19, 2007
Messages
8,195 (1.36/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RGB fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RGB fans
Memory Corsair Vengeance RGB 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB, 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RGB Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
I like how only the Ati ppl are bitching about this.
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
With PhysX there really isn't a "better"; it pretty much either works or it doesn't. Obviously, there are some slight variations, largely dependent on the amount of physics being calculated through PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX, and why Batman actually has different PhysX levels that demand different levels of performance from the PhysX card.

However, at the time the hacked drivers surfaced, it either just worked or it didn't. The performance issues came down to rendering the actual graphics more than to PhysX. So we never really got to test whether the higher PhysX levels in modern games like Batman would really be hindered on an ATi card.

Even if development had continued once the hacked drivers were released, I don't think we would ever see PhysX running better on ATi than on nVidia, simply because development on the ATi side was severely delayed. However, I do believe that similar cards from the two would have performed similarly. Not identically, but similarly.


Ah yes, and a bit later, when PhysX has become an industry standard because ATi accepted nVidia's "generous" offer, the user looks at the graphs, and every game that uses hardware PhysX shows more FPS on nVidia hardware. It would work on ATi hardware too, just slower.

Development on the ATi side? What development? nVidia would develop PhysX for ATi hardware if ATi accepted it? LOL, you're very naive.
 

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
nVidia disabling PhysX when an ATi card is present was a dick move. I'll be the first to say that. However, I can understand their frustration and the reasons behind it. You are too quick to forget that nVidia actually wanted to get PhysX running natively on ATi hardware (no nVidia hardware required), and it was ATi that blocked the effort in any way possible. The fact is that nVidia was trying to be very helpful in getting PhysX/CUDA running on ATi hardware. Was the reasoning that it would benefit nVidia in the fight against Intel, rather than just the goodness of their hearts? Probably, but who cares? The point is that nVidia was trying to get PhysX/CUDA working on ATi hardware. And after ATi blocked them at every turn, even going as far as not providing review samples to the review site that was responsible for the original hacked drivers...
HAHAHA, are you crazy, man? Do you take us for some very stupid people?
It's a given that Nvidia would ask for major licensing money if AMD/ATI wanted PhysX, and it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology Nvidia bought from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and you insult our intelligence. Also, AMD/ATI said they were never contacted by Nvidia about PhysX.
 
Joined
Jun 17, 2007
Messages
7,335 (1.19/day)
Location
C:\Program Files (x86)\Aphexdreamer\
System Name Unknown
Processor AMD Bulldozer FX8320 @ 4.4Ghz
Motherboard Asus Crosshair V
Cooling XSPC Raystorm 750 EX240 for CPU
Memory 8 GB CORSAIR Vengeance Red DDR3 RAM 1922MHz (10-11-9-27)
Video Card(s) XFX R9 290
Storage Samsung SSD 254GB and Western Digital Caviar Black 1TB 64MB Cache SATA 6.0Gb/s
Display(s) AOC 23" @ 1920x1080 + Asus 27" 1440p
Case HAF X
Audio Device(s) X Fi Titanium 5.1 Surround Sound
Power Supply 750 Watt PP&C Silencer Black
Software Windows 8.1 Pro 64-bit
Joined
Mar 11, 2009
Messages
865 (0.16/day)
Location
Dawn
I like how only the Ati ppl are bitching about this.
It makes sense that it would work that way. :p However, I've been an nVidia user for a while now, and I'm "bitching" about it too. :nutkick: I'm pretty sick of their shenanigans, and I'm jumping ship... Ironically, that makes me a potential ATI user, so based on that, you can lump me in. :D
 
Joined
Jul 19, 2006
Messages
43,587 (6.72/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000 MHz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Yes, and the sole reason I stopped using Nvidia was their business practices. Stupid me, I went and bought a GTX 260 anyway. Great card, PhysX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.

I like how only the Ati ppl are bitching about this.

What are Ati people? Meh, I could just go buy an nVidia card, but I have too many reasons not to want one.

*I just realized, I'm in the wrong thread. :eek:
 
Joined
Oct 19, 2007
Messages
8,195 (1.36/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RGB fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RGB fans
Memory Corsair Vengeance RGB 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB, 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RGB Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
That's a generalization; I personally couldn't care less. This is nothing new.
I couldn't care less either.
It makes sense that it would work that way. :p However, I've been an nVidia user for a while now, and I'm "bitching" about it too. :nutkick: I'm pretty sick of their shenanigans, and I'm jumping ship... Ironically, that makes me a potential ATI user, so based on that, you can lump me in. :D
I couldn't care less what they do. I mainly use their cards because they are usually the best by the time I need a new card, and they use a black PCB. I hate red PCBs and thus refuse to buy ATI cards. That's really the only reason I never used ATI. Ridiculous, I know, but I like things to match. But now that XFX is making ATI cards with black PCBs, ATI is a possibility for me now. :)

And if you're an nVIDIA user and you're tired of their "shenanigans", then jump ship so they can stop benefiting you.
Yes, and the sole reason I stopped using Nvidia was their business practices. Stupid me, I went and bought a GTX 260 anyway. Great card, PhysX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.

What are Ati people? Meh, I could just go buy an nVidia card, but I have too many reasons not to want one.
And I'm sure many others have many reasons not to buy an ATI card. It's user preference.
 
Joined
Aug 22, 2008
Messages
2,304 (0.40/day)
Location
Edmonton, Alberta
System Name AMD | Intel | Chumpy
Processor PHII 955BE Stock | i7 920 D0 4.01 GHz | i7 920 D0 4.01 GHz
Motherboard MSI 790FX-GD70 | EX58 - UD5 | E760 4 Way SLI
Cooling Zalman 9700 CNPS | Water Loop | Water Loop
Memory 4 GB XMS3 1600 MHz | 6 GB Dominators 1600 MHz | 6 GB Dominators 1866 MHz
Video Card(s) 3 x 9600GSO, GTX260 216 | 2 x GTX 260 216 | GTX 260 216, 9600 GSO
Storage WD 640GB | Couple o' 5400RPMs | WD 1TB
Case Cosmos S | Lancool K62 Dragonlord | Lian Li PC-P80 Armor
Power Supply TX850 | HX 1000 | HX 1000
Software Win 7 Home Premium | Win 7 Ultimate | Vista Home Premium
I'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned already, I apologize.

Unreal Engine 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI by removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA as well. This isn't underhanded.
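For what it's worth, the mechanism everyone is arguing about is tiny. Below is a hypothetical C++/Direct3D 9 sketch of how a title could gate in-game MSAA on the GPU vendor ID; it is NOT Batman's actual code (which isn't public), just an illustration of the kind of check reported at the time. 0x10DE is NVIDIA's PCI vendor ID and 0x1002 is ATI/AMD's:

    #include <d3d9.h>

    // Hypothetical vendor gate -- an illustration, not the game's real code.
    bool InGameMsaaAllowed(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 ident = {};
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
            return false;                 // can't identify the GPU, so don't offer MSAA
        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD. A check like this
        // is also why spoofing the vendor ID re-enables the AA option on ATi cards.
        return ident.VendorId == 0x10DE;
    }

The point being that such a check costs one driver query, so the real argument is about intent (who paid for the feature and its QA), not about any technical barrier.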
 

BigBruser13

New Member
Joined
Apr 30, 2008
Messages
36 (0.01/day)
Location
Portland OR
System Name V64PC
Processor Q9550
Motherboard Rampage Extreme
Cooling water
Memory 2x2gb XP3 mushkin
Video Card(s) Ati 4870 x2
Storage 2x 150gb raptor
Display(s) acer 1920x1200
Case cooler master 830
Audio Device(s) n/a
Power Supply x3 ultra
Software Vista x64 ultimate
Benchmark Scores 3dmark 06 19899
What would you do?

If nVidia comes out with a monster GPU, would you buy it even if (imagine it) they did many more unethical things? Or would you buy ATi because they are good, not great, but honest?

I honestly am on the fence at this point, leaning towards the honest one.
 
Joined
Oct 19, 2007
Messages
8,195 (1.36/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RGB fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RGB fans
Memory Corsair Vengeance RGB 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB, 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RGB Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
I'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned already, I apologize.

Unreal Engine 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI by removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA as well. This isn't underhanded.

I don't think ATi has any say in the development of a game that's under the TWIMTBP program. But I could be wrong.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
I'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned already, I apologize.

Unreal Engine 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI by removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA as well. This isn't underhanded.

Just for the record, and since you didn't read the 160+ replies (which is completely understandable): Batman: AA runs on Unreal Engine 3.5.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Ah yes, and a bit later, when PhysX has become an industry standard because ATi accepted nVidia's "generous" offer, the user looks at the graphs, and every game that uses hardware PhysX shows more FPS on nVidia hardware. It would work on ATi hardware too, just slower.

Development on the ATi side? What development? nVidia would develop PhysX for ATi hardware if ATi accepted it? LOL, you're very naive.

There is really nothing to show that PhysX would run any worse on ATi hardware. While it might run worse in the future, I doubt future hardware will actually struggle.

If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today. Or even something from the HD4800 series.

HAHAHA, are you crazy, man? Do you take us for some very stupid people?
It's a given that Nvidia would ask for major licensing money if AMD/ATI wanted PhysX, and it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology Nvidia bought from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and you insult our intelligence. Also, AMD/ATI said they were never contacted by Nvidia about PhysX.

Do a little research. There would have been no licensing fee for ATi; the only thing ATi would have had to do was support the development. The PhysX API, engine, and SDK are provided free of charge by nVidia to anyone that wants to use them. The hardware developer just has to provide drivers that support it.

Again, nVidia was more than willing to help the developer get PhysX/CUDA running on ATi hardware, no licensing or fees involved at all. They were not going to do it themselves, but they were willing to help the developer that wanted to do it. The problem was that ATi refused to help in any way.
 
Joined
Aug 22, 2008
Messages
2,304 (0.40/day)
Location
Edmonton, Alberta
System Name AMD | Intel | Chumpy
Processor PHII 955BE Stock | i7 920 D0 4.01 GHz | i7 920 D0 4.01 GHz
Motherboard MSI 790FX-GD70 | EX58 - UD5 | E760 4 Way SLI
Cooling Zalman 9700 CNPS | Water Loop | Water Loop
Memory 4 GB XMS3 1600 MHz | 6 GB Dominators 1600 MHz | 6 GB Dominators 1866 MHz
Video Card(s) 3 x 9600GSO, GTX260 216 | 2 x GTX 260 216 | GTX 260 216, 9600 GSO
Storage WD 640GB | Couple o' 5400RPMs | WD 1TB
Case Cosmos S | Lancool K62 Dragonlord | Lian Li PC-P80 Armor
Power Supply TX850 | HX 1000 | HX 1000
Software Win 7 Home Premium | Win 7 Ultimate | Vista Home Premium
Just for the record, and since you didn't read the 160+ replies (which is completely understandable): Batman: AA runs on Unreal Engine 3.5.

Right, and through all my searching I can't find where it says AA is natively supported in 3 or 3.5. If it's added in by the developers and it's co-developed by NVIDIA, then there is no problem. Does anyone have a spec sheet that says UE3.5 has native AA in the engine?
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
There is really nothing to show that PhysX would run any worse on ATi hardware. While it might run worse in the future, I doubt future hardware will actually struggle.

If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today. Or even something from the HD4800 series.

What I'm talking about is not a hardware issue. If a hardware PhysX title ran slower on ATi hardware, it wouldn't be because the hardware is actually weaker. It would be because the source code is in nVidia's hands, and they wouldn't let ATi win. It's perfectly logical.
If I own a technology, then I own the tools to be the best under any circumstances.

I really can't explain myself better, my English isn't so good. But if you're right, and nVidia is as generous as you describe them, then it's time for them to port PhysX from CUDA to OpenCL and make the PhysX source code free for everyone.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
What I'm talking about is not a hardware issue. If a hardware PhysX title ran slower on ATi hardware, it wouldn't be because the hardware is actually weaker. It would be because the source code is in nVidia's hands, and they wouldn't let ATi win. It's perfectly logical.
If I own a technology, then I own the tools to be the best under any circumstances.

I really can't explain myself better, my English isn't so good. But if you're right, and nVidia is as generous as you describe them, then it's time for them to port PhysX from CUDA to OpenCL and make the PhysX source code free for everyone.

I get what you are saying, but what I'm saying is that there is really no way for nVidia to do this. PhysX takes so little GPU power to run that it wouldn't be feasible.

There are several things you have to consider. An outside developer was the one doing the developing; he was just being assisted by nVidia after his initial breakthrough. They were essentially providing him with whatever documentation and development tools he needed.

Also, CUDA is designed by its nature to be hardware independent. Once the hardware vendor writes the driver to support CUDA, it will work. There really isn't a whole lot nVidia can do to make it perform worse on one vendor's hardware than the other's, and if they did, it would immediately send up red flags because the difference would be drastic.
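To illustrate that claim (and it is a claim, not an established fact), here is a minimal C++ sketch against the CUDA driver API: the application only ever talks to the abstract "CUDA device" the driver exposes, and nothing in the code names a vendor. In principle, any vendor shipping a driver that implements this API could service the same calls:

    #include <cstdio>
    #include <cuda.h>

    // Sketch: application code targets the CUDA driver API abstraction.
    // Whichever vendor implements the driver underneath services these calls.
    int main()
    {
        if (cuInit(0) != CUDA_SUCCESS) {
            std::printf("No CUDA-capable driver present.\n");
            return 1;
        }
        int count = 0;
        cuDeviceGetCount(&count);
        for (int i = 0; i < count; ++i) {
            CUdevice dev;
            char name[256];
            cuDeviceGet(&dev, i);
            cuDeviceGetName(name, (int)sizeof(name), dev);
            std::printf("CUDA device %d: %s\n", i, name); // a device name, not a vendor check
        }
        return 0;
    }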
 
Joined
Aug 22, 2008
Messages
2,304 (0.40/day)
Location
Edmonton, Alberta
System Name AMD | Intel | Chumpy
Processor PHII 955BE Stock | i7 920 D0 4.01 GHz | i7 920 D0 4.01 GHz
Motherboard MSI 790FX-GD70 | EX58 - UD5 | E760 4 Way SLI
Cooling Zalman 9700 CNPS | Water Loop | Water Loop
Memory 4 GB XMS3 1600 MHz | 6 GB Dominators 1600 MHz | 6 GB Dominators 1866 MHz
Video Card(s) 3 x 9600GSO, GTX260 216 | 2 x GTX 260 216 | GTX 260 216, 9600 GSO
Storage WD 640GB | Couple o' 5400RPMs | WD 1TB
Case Cosmos S | Lancool K62 Dragonlord | Lian Li PC-P80 Armor
Power Supply TX850 | HX 1000 | HX 1000
Software Win 7 Home Premium | Win 7 Ultimate | Vista Home Premium
I like how all the nvidia users are defending nvidia to the death :rolleyes: :laugh:


Just like ATI users defend ATI against baseless claims.

I've looked, and I can't find anywhere that says anti-aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless it's forced through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to give it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market; that doesn't mean they've been 'foul played'.

I find it odd that this Ian McNaughton guy is putting forward this half-truth, and if I'm correct, I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed the use of AA for ATI instead of adding AA for their own buyers), then I'll retract my claims.

Until then, it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for one if I were them, though.
 
Joined
Oct 21, 2004
Messages
203 (0.03/day)
System Name Gaming Rig
Processor Phenom II 940BE @ 3.7ghz
Motherboard ASUS M3A78-T
Cooling Coolermaster V8
Memory 4 x 2gb DDR2 800mhz
Video Card(s) Sapphire 5870 (Asus bios)
Storage 2TB SEAGATE SATA2
Display(s) Samsung T240 24" Widescreen
Case Coolermaster Cosmos S
Audio Device(s) Creative X-Fi extreme music
Power Supply Corsair TX 850W
Software Windows 7 Ultimate 64bit
Just wasted an hour reading this thread. :rolleyes:
One thing is for sure: nVidia is always involved in these shady tactics.
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Just like ATI users defend ATI against baseless claims.

I've looked, and I can't find anywhere that says anti-aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless it's forced through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to give it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market; that doesn't mean they've been 'foul played'.

I find it odd that this Ian McNaughton guy is putting forward this half-truth, and if I'm correct, I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed the use of AA for ATI instead of adding AA for their own buyers), then I'll retract my claims.

Until then, it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for one if I were them, though.

I agree, but it's even worse IMO. From what I read, they discovered all this after the game had launched!!! That means they had no contact with the developer at all! I mean, if you are a GPU maker, don't you contact developers and try to optimize before launch, or at least start working on optimizing the full game before it launches? Don't you ask for a copy? IMO, if they cared so little about the game that they didn't even contact them, AMD deserves every bit of unoptimized code they get. Especially when it comes from a feature that was never there before and was developed for Nvidia, at their request, paid for with their money. The fact that the optimization works on Ati cards as well changes nothing IMO. If I were the developer, I would have done the same.
 