With nVidia:
* Open nVidia control panel.
* Set the HDTV as the output device.
With ATi:
* Open CCC.
* Set the HDTV as the output device.
But wait...the image has huge black borders around it, despite being at the proper resolution. Where is that stupid option that I only happen to know about because I've seen this issue here before...
* Hunt through CCC and finally find the under/overscan option under Scaling Options.
* Set it to 0%.
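To see why that slider matters: CCC applies underscan to HDTV output by default, shrinking the picture and filling the rest of the panel with black. A quick sketch of the arithmetic (the 10% figure below is an assumed default for illustration; the actual default varies by driver version):

```python
# Sketch of how a driver underscan setting shrinks the picture,
# producing the black borders described above.
def underscanned_size(width, height, underscan_pct):
    """Visible image size after the driver applies underscan."""
    scale = 1 - underscan_pct / 100
    return round(width * scale), round(height * scale)

# With a (hypothetical) 10% default underscan, a 1080p signal only
# fills 1728x972 of the panel; at 0% it fills the whole thing.
print(underscanned_size(1920, 1080, 10))  # -> (1728, 972)
print(underscanned_size(1920, 1080, 0))   # -> (1920, 1080)
```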
__________________________________________________________________________________________________________________________________________________________
With nVidia:
* Connect an S/PDIF cable from the sound card to the graphics card.
* Select S/PDIF as the default sound output device.
With ATi:
* Install another driver for the sound card on the graphics card.
* Select Digital Audio (HDMI) as the default sound output device.
Wait...I'm still not getting sound...Well, there is another playback device called ATi DP Output, but it says it is disconnected...hmmm...
* Swap out HDMI adaptor.
* Swap out HDMI adaptor again.
Hey...that ATi DP Output says it is connected now...
* Select ATi DP Output as the default sound output device.
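The lesson from the steps above — trust the connection state, not the label — can be sketched like this (the device names and the tuple layout are illustrative only, not an actual audio API):

```python
# Pick a playback device by connection state rather than by label,
# since the device labelled "HDMI" here is not the one that works.
def pick_playback_device(devices):
    """devices: list of (name, is_connected) pairs; return the first connected name."""
    for name, is_connected in devices:
        if is_connected:
            return name
    return None

# Mirrors the situation in the post: the HDMI-labelled device stays
# disconnected, while "ATi DP Output" is the live one.
devices = [("Digital Audio (HDMI)", False), ("ATi DP Output", True)]
print(pick_playback_device(devices))  # -> ATi DP Output
```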
__________________________________________________________________________________________________________________________________________________________
I do have a problem with ATi's naming of those devices under Playback devices, as what is labelled HDMI is not actually the one you select.
__________________________________________________________________________________________________________________________________________________________
The simple fact of the matter is that nVidia does not have the scaling issue, they do not have idiotic backward-ass names for their audio outputs (though the new ones might; I'll have to check when I get my GTX 470), and they do not require a special HDMI adaptor.
__________________________________________________________________________________________________________________________________________________________
That is the thing, I don't see how it is better, or why anyone would say so. No one can give me a decent reason. I feel like the people saying it is better are probably the same ones who would have you believe Macs are better...they give the same BS reasoning: "It just works". Well, it might for some, but it doesn't for all, and the nVidia solution does (or at least did). There is no over/under scan issue with nVidia, not on any of my TVs or any of the cards I've used. Overscan is a feature left over from CRT TVs that should never have been implemented on anything LCD-related.
__________________________________________________________________________________________________________________________________________________________
No, I prefer connecting a standard S/PDIF cable over using a non-standard HDMI-to-DVI adaptor. And since I use a 6 ft standard DVI-to-HDMI cable on my 60" TV in the living room to connect my HTPC to the TV, I would have to pull the entire TV out to get at the back and change that cable if I wanted to use an ATi card, because they don't use a standard adaptor and the cable doesn't work...it works fine with my nVidia card, though.
The only thing "superior" about ATi's method is possibly the sound quality, but I (and probably 90% of the people using this) am connecting it to a stereo TV, so it doesn't really matter.