
RealTemp General Discussion

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
I've been working hard to finalize 3.60 and get it uploaded here at TPU, but I don't have access to any new hardware, so it takes me forever to test new features and get any meaningful feedback from users. It's been like this for the last year. When I have the time and am motivated to do some programming, it takes too long to find out which new features work and which ones don't. The result is that I am endlessly waiting, end up working on other projects, and lose interest in the RealTemp project.

If you are a dedicated tester like my friend burebista, and you have access to some new Core i hardware, desktop or mobile, then send me a PM and let me know that you would like to help.
 

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
RealTemp 3.60
http://www.techpowerup.com/downloads/1872/Real_Temp_3.60.html

It took a long time to finish it but it's finally official.

The most recent addition is the ability to control the turbo multiplier and the turbo TDP/TDC values in the newer Core i Extreme and K series CPUs. Intel plans more K series CPUs in the near future, so hopefully this will work with them too.
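
For the technically curious, this kind of control boils down to writing a couple of model-specific registers. Below is a rough sketch of the idea, not the actual RealTemp code: it assumes a helper kernel driver (something like WinRing0) exposing rdmsr/wrmsr to user mode, with Rdmsr/Wrmsr as hypothetical wrappers, and uses the MSR layout documented for Nehalem/Westmere parts.

Code:
// Illustration only: turbo control MSRs on Nehalem/Westmere era Core i CPUs.
// Rdmsr/Wrmsr are hypothetical stand-ins for whatever a helper driver exposes.
#include <cstdint>

bool Rdmsr(uint32_t index, uint64_t* value);   // hypothetical driver wrapper
bool Wrmsr(uint32_t index, uint64_t value);    // hypothetical driver wrapper

const uint32_t MSR_TURBO_RATIO_LIMIT         = 0x1AD; // per-active-core turbo ratios
const uint32_t MSR_TURBO_POWER_CURRENT_LIMIT = 0x1AC; // turbo TDP/TDC limits

// Set the 1-core and 2-core active turbo ratios (unlocked Extreme/K series
// parts only; locked CPUs ignore requests above their fused limits).
void SetTurboRatios(uint8_t oneCore, uint8_t twoCore)
{
    uint64_t v;
    if (!Rdmsr(MSR_TURBO_RATIO_LIMIT, &v)) return;
    v = (v & ~0xFFFFull) | oneCore | (uint64_t(twoCore) << 8);
    Wrmsr(MSR_TURBO_RATIO_LIMIT, v);
}

// Raise the turbo TDP/TDC limits; both fields are in 1/8 W and 1/8 A units.
void SetTurboPowerLimits(double tdpWatts, double tdcAmps)
{
    uint64_t tdp = uint64_t(tdpWatts * 8.0) & 0x7FFF;        // bits 14:0
    uint64_t tdc = (uint64_t(tdcAmps * 8.0) & 0x7FFF) << 16; // bits 30:16
    Wrmsr(MSR_TURBO_POWER_CURRENT_LIMIT,
          tdp | tdc | (1ull << 15) | (1ull << 31));          // override-enable bits
}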
 
Joined
Mar 21, 2005
Messages
1,580 (0.23/day)
Location
Maribor, Slovenia, EU
System Name Core i9 rig / Lenovo laptop
Processor Core i9 10900X / Core i5 8350U
Motherboard Asus Prime X299 Edition 30 / Lenovo motherboard
Cooling Corsair H115i PRO RGB / stock cooler
Memory Gskill 4x8GB 3600mhz / 16GB 2400mhz
Video Card(s) Asus ROG Strix RTX 2080 Super / UHD 620
Storage Samsung SSD 970 PRO 1TB / Samsung OEM 256GB NVMe
Display(s) Dell UltraSharp UP3017 / Full HD IPS touch
Case Coolermaster mastercase H500M
Audio Device(s) Onboard sound
Power Supply Enermax Platimax 1700 watt / Lenovo 65watt power adapter
Mouse Logitech M500s
Keyboard Cherry
Software Windows 11 Pro / Windows 11 Pro
Looks like I've found a bug:



RealTemp and CPU-Z don't report the same frequencies. BTW, Intel's power saving is disabled.
 

Super Sarge

New Member
Joined
Aug 16, 2009
Messages
170 (0.03/day)
Location
Jordan MN
System Name Play Toy
Processor I7 920 OC 21*166 , Voltage CPU 1.11875 Stepping is D0
Motherboard ASUS P6T Deluxe V2
Cooling Arctic Cooling Freezer Xtreme REV 2
Memory 12 Gig Mushkin Red-Lines @1664 MHz, Timings 7 9 7 25 1 N, QPI 1.35 Dram 1.66
Video Card(s) GeForce GTX 260
Storage (2) 750 Seagates and (2) 1.5 Seagates All Sata 4 external HD's
Display(s) 24 inch Wide Screen LCD by LG
Case Antec 1200
Audio Device(s) Sound Max on MB
Power Supply Thermaltake 750
Software W7 Pro 64 bit
I am happy with 3.40; it works fine on my i7 920 machine. I tried the beta version of 3.60 and it caused BSODs, so I had to go back to 3.40. Like the man said, if it ain't broke, do not fix it.
 

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
Looks like I've found a bug:

You're right, you have found a bug but the bug is with CPU-Z. At idle CPU-Z may not report the correct multiplier. I guess it does this so users can do a validation at a high MHz.

If you have C1E enabled, try turning that off and see what the two programs report. With your core voltage down at 0.968, it looks like you have some power saving going on regardless of what you might have selected in the bios.

RealTemp follows the method recommended by Intel in their November 2008 Turbo White Paper. CPU-Z does not. Core Temp also follows the correct method to determine the multiplier so you might want to try comparing to that.

Super Sarge: Thanks for the bug report but can you tell me a few more details like what operating system you are using, what CPU, etc. and what happens when you try to run the program. Does it start up and then crash or does it not start up at all? I can't fix a problem if I don't have any idea what the problem is.
 

Super Sarge

New Member
Joined
Aug 16, 2009
Messages
170 (0.03/day)
Location
Jordan MN
System Name Play Toy
Processor I7 920 OC 21*166 , Voltage CPU 1.11875 Stepping is D0
Motherboard ASUS P6T Deluxe V2
Cooling Arctic Cooling Freezer Xtreme REV 2
Memory 12 Gig Mushkin Red-Lines @1664 MHz, Timings 7 9 7 25 1 N, QPI 1.35 Dram 1.66
Video Card(s) GeForce GTX 260
Storage (2) 750 Seagates and (2) 1.5 Seagates All Sata 4 external HD's
Display(s) 24 inch Wide Screen LCD by LG
Case Antec 1200
Audio Device(s) Sound Max on MB
Power Supply Thermaltake 750
Software W7 Pro 64 bit
I really do not know; it was quite a while ago. I think it had something to do with C1E and/or Turbo, both of which I use at times (I use Turbo all the time). My OS is W7 64-bit Pro, Intel 920 CPU D0, 12 gig Mushkin triple-channel 1600 MHz Red-Lines. The program ran, but sometime during the day or night I would get a crash. I re-installed 3.40 and the problem never happened again.

I just loaded 3.60 and will try it again. I unchecked EIST, as I do not use it, nor can I find any such setting in my BIOS (version 1003 for an ASUS P6T Deluxe V2, which is the latest).
 
Last edited:
Joined
Mar 21, 2005
Messages
1,580 (0.23/day)
Location
Maribor, Slovenia, EU
System Name Core i9 rig / Lenovo laptop
Processor Core i9 10900X / Core i5 8350U
Motherboard Asus Prime X299 Edition 30 / Lenovo motherboard
Cooling Corsair H115i PRO RGB / stock cooler
Memory Gskill 4x8GB 3600mhz / 16GB 2400mhz
Video Card(s) Asus ROG Strix RTX 2080 Super / UHD 620
Storage Samsung SSD 970 PRO 1TB / Samsung OEM 256GB NVMe
Display(s) Dell UltraSharp UP3017 / Full HD IPS touch
Case Coolermaster mastercase H500M
Audio Device(s) Onboard sound
Power Supply Enermax Platimax 1700 watt / Lenovo 65watt power adapter
Mouse Logitech M500s
Keyboard Cherry
Software Windows 11 Pro / Windows 11 Pro
unclewebb said:
You're right, you have found a bug but the bug is with CPU-Z. At idle CPU-Z may not report the correct multiplier. [...]

Frequencies are now the same after I disabled C1E in the BIOS. Looks like Core Temp uses the same principle, as it reported the same as RealTemp did.
 
Joined
Dec 27, 2007
Messages
8,518 (1.44/day)
Location
Kansas City
System Name The Dove Box Rev 3.0
Processor i7 8700k @ 4.7GHz
Motherboard Asus Maximus X APEX
Cooling Custom water loop
Memory 16GB 3600 MHz DDR4
Video Card(s) 2x MSI 780 Ti's in SLI
Storage 500GB Samsung 850 PCIe SSD, 4TB
Display(s) 27" Asus 144Hz
Case Enermax Fulmo GT
Audio Device(s) ON BOARD FTW
Power Supply Corsair 1200W
Keyboard Logitech G510
Software Win 10 64x

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
Thank you mlee49 for your support of RealTemp where it counts the most. $$$$$

It doesn't take a lot of money to motivate me but even a handful of donations can make the difference between carrying on with this project or walking away from it.

I was able to add some very useful features lately and I also got the 6 core version of RealTemp GT updated too. Being able to adjust the multipliers in the Core i Extreme and K series CPUs is a great new feature for RealTemp and is going to be even more useful in the new year when Intel decides to start releasing more K series CPUs for enthusiasts.

Now for the big announcement. Finally I won't have to constantly defend myself from people that are always asking, "How come RealTemp is not the same as CPU-Z?" On the XS RealTemp forum today, the programmer of CPU-Z, in his own words, finally decided to come clean.

Originally Posted by cpuz
Of course I admit that CPU-Z is not accurate anymore at idle on latest Intel generations, that is why TMonitor was developed.

I also showed him why I don't believe that TMonitor is any more accurate than CPU-Z is at idle but that's still a discussion in progress.

Here's an example of what TMonitor tells me for my T8100.

http://img833.imageshack.us/img833/3240/tmonitor.png
This CPU presently has EIST disabled. When you disable EIST in a Core 2 based CPU, the CPU gets locked at a fixed frequency. The multiplier reported in MSR 0x198 never changes from idle to full load.

Using Intel's recommended method to determine the multiplier, RealTemp and ThrottleStop correctly show that the CPU is locked at the 11.5 multiplier.

TMonitor is telling me that at idle the multiplier is at 6.0, and when I apply a load to the CPU, the multiplier goes up and down. That's wrong. The multiplier does not change when EIST is disabled. It can't. If you want to argue, that's great, but you need to argue with Intel. TMonitor is just as inaccurate when run on Core i CPUs. It draws a nice graph, but the information it is graphing is fundamentally wrong, so it's pointless. TMonitor would be a very useful tool if it followed Intel's methods, but there's no point in telling users that their CPU is doing something that it isn't.
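
For reference, the gist of that Intel-recommended method is to sample the IA32_MPERF (0xE7) and IA32_APERF (0xE8) counters twice and scale the maximum non-turbo ratio by the delta ratio. Here is a rough sketch of the idea, not the actual RealTemp source; Rdmsr is a hypothetical wrapper around a kernel driver call:

Code:
// Sketch of the APERF/MPERF method from Intel's November 2008 turbo white paper.
#include <cstdint>
#include <windows.h>

bool Rdmsr(uint32_t index, uint64_t* value);  // hypothetical driver wrapper

const uint32_t IA32_MPERF = 0xE7;  // ticks at the max non-turbo frequency in C0
const uint32_t IA32_APERF = 0xE8;  // ticks at the actual frequency in C0

// Returns the effective multiplier averaged over a short sampling window.
double EffectiveMultiplier(double maxNonTurboRatio)
{
    uint64_t m0, a0, m1, a1;
    Rdmsr(IA32_MPERF, &m0);
    Rdmsr(IA32_APERF, &a0);
    Sleep(50);                                // sampling window
    Rdmsr(IA32_MPERF, &m1);
    Rdmsr(IA32_APERF, &a1);
    // Both counters only tick while the core is in C0, so their ratio is the
    // average actual-to-non-turbo frequency ratio while executing.
    return maxNonTurboRatio * double(a1 - a0) / double(m1 - m0);
}

With EIST disabled on this T8100, a method like this reads a constant 11.5, matching what MSR 0x198 reports.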
------------------------------------------------------------------------------------------------
Thanks Super Sarge for giving RealTemp 3.60 a fresh try.
 
Joined
Aug 7, 2008
Messages
5,739 (1.01/day)
Location
Wakefield, UK
Used to love RealTemp on my Q9550. Was the ONLY temp monitor I used.

Pity my 1055T isn't supported :(

Keep up the good work matey :)
 
Joined
Dec 27, 2007
Messages
8,518 (1.44/day)
Location
Kansas City
System Name The Dove Box Rev 3.0
Processor i7 8700k @ 4.7GHz
Motherboard Asus Maximus X APEX
Cooling Custom water loop
Memory 16GB 3600 MHz DDR4
Video Card(s) 2x MSI 780 Ti's in SLI
Storage 500GB Samsung 850 PCIe SSD, 4TB
Display(s) 27" Asus 144Hz
Case Enermax Fulmo GT
Audio Device(s) ON BOARD FTW
Power Supply Corsair 1200W
Keyboard Logitech G510
Software Win 10 64x
unclewebb said:
Now for the big announcement. Finally I won't have to constantly defend myself from people that are always asking, "How come RealTemp is not the same as CPU-Z?" [...]

Ha, seems like you're the only one who is right these days.
 

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
Maybe. I don't have access to any Sandy Bridge hardware, and everyone I've approached is afraid to share any information with me for fear that it will break their NDA with Intel and get them in trouble.

I've made a few minor adjustments for version 3.62 so it can extract the new four-digit model numbers. If anyone has some SB hardware and wants to do some testing, send me a PM. I can keep a secret. :)
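
For context, the model number comes out of the CPUID brand string, roughly like the sketch below. This is an illustration rather than the actual RealTemp code, and the brand string shown in the comment is just an example:

Code:
// Sketch: read the CPUID brand string and pull out the model number.
#include <cstdio>
#include <cstring>
#include <intrin.h>   // __cpuid (MSVC)

int main()
{
    char brand[49] = {0};
    int regs[4];
    // Leaves 0x80000002..0x80000004 each return 16 bytes of the brand string.
    for (int i = 0; i < 3; ++i) {
        __cpuid(regs, 0x80000002 + i);
        memcpy(brand + 16 * i, regs, sizeof(regs));
    }
    // e.g. "Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz": model follows the '-'.
    if (const char* dash = strchr(brand, '-')) {
        char model[16] = {0};
        for (int n = 0; dash[1 + n] && dash[1 + n] != ' ' && n < 15; ++n)
            model[n] = dash[1 + n];
        printf("Model: %s\n", model);   // "2600K"
    }
    return 0;
}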
 

Deleted member 74752

Guest
Is there any way to get RealTemp to report correctly for an Atom?

I would be happy to donate if I knew where to go...Lord knows I have used it enough. :laugh:
 

Attachments

  • atom.jpg
    atom.jpg
    195.1 KB · Views: 422
Last edited by a moderator:

Deleted member 74752

Guest
Now it's working! You did something, didn't you? ;) Now, where is that donate button... :toast:

This is going to be running 24/7 for my MagicJack and I just wanted to know how the temps would be. Thanks!
 

Attachments

  • atom_fixed.jpg
    atom_fixed.jpg
    197.4 KB · Views: 412
  • DSC00594.jpg
    DSC00594.jpg
    191 KB · Views: 414
  • DSC00595.jpg
    DSC00595.jpg
    195 KB · Views: 421
Last edited by a moderator:

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
I'm as surprised as you are to see it working on your Atom CPU. :)

According to Intel, RealTemp is using the correct TJMax value for your CPU too.

http://ark.intel.com/Product.aspx?id=43098&code=Intel®+Atom™+Processor+D510+(1M+Cache,+1.66+GHz)

The sensors that Intel uses have never been very accurate at reporting low temperatures. It looks like even with the right TJMax value, your sensors are out to lunch unless you live at the North Pole. That's not unusual.
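
For anyone wondering where the numbers come from: the reported temperature is TJMax minus the digital readout in each core's IA32_THERM_STATUS register, which counts degrees below TJMax. A rough sketch of the idea, not the actual RealTemp source; Rdmsr is again a hypothetical driver wrapper:

Code:
// Sketch: core temperature = TJMax - DTS readout (IA32_THERM_STATUS, MSR 0x19C).
#include <cstdint>

bool Rdmsr(uint32_t index, uint64_t* value);  // hypothetical driver wrapper

const uint32_t IA32_THERM_STATUS = 0x19C;

// Returns the core temperature in degrees C, or -1 if the reading is invalid.
int CoreTemperature(int tjMax)
{
    uint64_t v;
    if (!Rdmsr(IA32_THERM_STATUS, &v)) return -1;
    if (!(v & (1ull << 31))) return -1;       // bit 31: Reading Valid
    int readout = int((v >> 16) & 0x7F);      // bits 22:16: degrees below TJMax
    return tjMax - readout;
}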

If the load meter is not working correctly, try clicking on the TM Load box in the Settings window. Most of the Atom CPUs are missing some internal timers that RealTemp depends on, so the TM Load option should give you a Load value similar to what the Task Manager will show you.
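
(TM Load works along the lines of the Task Manager calculation; here is a rough sketch under that assumption, not the actual code:)

Code:
// Sketch of a Task Manager style load calculation using GetSystemTimes.
#include <windows.h>
#include <cstdint>
#include <cstdio>

static uint64_t ToU64(const FILETIME& ft)
{
    return (uint64_t(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
}

// Percent CPU load over roughly one second.
double CpuLoadPercent()
{
    FILETIME idle0, kern0, user0, idle1, kern1, user1;
    GetSystemTimes(&idle0, &kern0, &user0);
    Sleep(1000);
    GetSystemTimes(&idle1, &kern1, &user1);
    uint64_t idle  = ToU64(idle1) - ToU64(idle0);
    // Kernel time already includes idle time, so busy = total - idle.
    uint64_t total = (ToU64(kern1) - ToU64(kern0)) + (ToU64(user1) - ToU64(user0));
    return total ? 100.0 * double(total - idle) / double(total) : 0.0;
}

int main()
{
    printf("CPU load: %.1f%%\n", CpuLoadPercent());
    return 0;
}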

Turn on a system tray temperature icon, right click on it and then select the About... option. Hiding in there should be a Donate button. Thanks for supporting free software.
 

Deleted member 74752

Guest
I tried those workarounds but no go. The BIOS reports it at 27C... that sounds about right. Guess we will just see how long it holds up. ;)
 

id0l

New Member
Joined
Dec 27, 2010
Messages
7 (0.00/day)
System Name Gaming Rig/HTPC ((100% STABLE))
Processor Core 2 Quad Q9400 @ 3.4ghz, 8 x 425mhz FSB @ 1.192v
Motherboard ASUS P5Q-E
Cooling XigmaTek Dark Knight, push-pull w/Enermax Magma 120mm (x2)
Memory 2x2GB A-DATA DDR2-800 @ 850mhz 4-4-4-12
Video Card(s) ATI Radeon 4890 1GB @ 975mhz/1125mhz cooled by Thermalright T-Rad2 w/Xilence Redwing 92mm (x2)
Storage Seagate Barracuda 7200.10 250GB (x2), Seagate Barracuda 7200.11 500GB
Display(s) Acer X241W 24" 16:10 WUXGA, Samsung LN40A630 40" 1080p LCDTV
Case Antec P180 Silver @ < 30dB
Audio Device(s) Sony 1000w 5.1 DD/DTS/PL2 system via optical S/PDIF on Sound Blaster Audigy 2 ZS Platinum
Power Supply Antec TruPower Trio TP3-650
Software Windows 7 Ultimate x64
Benchmark Scores Max CPU temp @ full load (17hrs Y-Cruncher) - 56c // Max GPU temp @ full load (2hrs FurMark) - 64c
I like the new version (it fixed the reset-button crash bug for me) and am glad to see ATI GPU monitoring support, but... it seems there is an issue with it causing the video card to cycle between 2D and 3D modes constantly. See the attached screenshots from GPU-Z.

I had to switch back to 3.40 to avoid this. If I hadn't flashed my 4890's BIOS to maintain the 1125 MHz memory speed regardless of 'power mode', my screen probably would have been a flicker-fest. :eek:

This side effect is present regardless of turning off GPU monitoring, increasing/decreasing the poll rate, or disabling ATI support.

Also, it generally makes my video card idle hotter.

I really like RealTemp and have used it for a long time, but I registered here because this bug really annoys the hell out of me. :)

Really liking the custom font support. :p
 

Attachments

  • realtemp_340.gif
    realtemp_340.gif
    16.2 KB · Views: 487
  • realtemp_360.gif
    realtemp_360.gif
    16.3 KB · Views: 428
Last edited:

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
The problem you are having doesn't make much sense. If you disable ATI GPU monitoring in RealTemp, it can't be causing your GPU to cycle between 2D and 3D. The ATI code in RealTemp will be completely bypassed and not used at all if you disable ATI monitoring in the Settings window. If RealTemp is not accessing your GPU then I don't understand how RealTemp can be causing the problem you are having.

Did you change your 2D and 3D settings with a BIOS editor? If you used something like RBE, then maybe upload your BIOS somewhere so I can have a look at what settings you are using.

How about when you are not running GPU-Z or RealTemp and are just monitoring with the Catalyst Control Center? Is it steady in Windows in 2D? Try to think of a few more tests to try to isolate this problem. I have a 5770 card for testing, and in Windows with RealTemp running, the GPU is steady in 2D mode. It might occasionally go into 3D mode when needed if I start a desktop game, but when not needed, it drops back to 2D and stays there. Let me know if you can figure this out.
 

id0l

New Member
Joined
Dec 27, 2010
Messages
7 (0.00/day)
System Name Gaming Rig/HTPC ((100% STABLE))
Processor Core 2 Quad Q9400 @ 3.4ghz, 8 x 425mhz FSB @ 1.192v
Motherboard ASUS P5Q-E
Cooling XigmaTek Dark Knight, push-pull w/Enermax Magma 120mm (x2)
Memory 2x2GB A-DATA DDR2-800 @ 850mhz 4-4-4-12
Video Card(s) ATI Radeon 4890 1GB @ 975mhz/1125mhz cooled by Thermalright T-Rad2 w/Xilence Redwing 92mm (x2)
Storage Seagate Barracuda 7200.10 250GB (x2), Seagate Barracuda 7200.11 500GB
Display(s) Acer X241W 24" 16:10 WUXGA, Samsung LN40A630 40" 1080p LCDTV
Case Antec P180 Silver @ < 30dB
Audio Device(s) Sony 1000w 5.1 DD/DTS/PL2 system via optical S/PDIF on Sound Blaster Audigy 2 ZS Platinum
Power Supply Antec TruPower Trio TP3-650
Software Windows 7 Ultimate x64
Benchmark Scores Max CPU temp @ full load (17hrs Y-Cruncher) - 56c // Max GPU temp @ full load (2hrs FurMark) - 64c
Here's the link for my 4890 .ROM file.

I use RBE v1.25 to make clock speed adjustments in the BIOS - all voltages are stock. All I have changed is the [overclocked] memory speed, set to remain constant at 1125 MHz across all power states (raising/lowering it on the fly makes the screen flicker, a common issue on these cards), and I increased the 3D GPU clock speed to 975 MHz.

Using CCC shows the GPU clock sitting at 240 MHz. But... I don't really trust any readings from CCC :rolleyes: and thus I don't use it. Even still, my GPU temperature idles 3-4C higher using RealTemp v3.60 - shouldn't that be an indicator that something is wrong? I have verified that GPU monitoring is disabled by unchecking the box next to 'ATI' on the RealTemp settings page. My card idles at a higher temperature under RealTemp v3.60 regardless of whether GPU-Z is open or not.

I can open v3.40 and watch the GPU clock speed in GPU-Z stay at 240 MHz all day (and temps are lower at idle)... but when I open v3.60 it all starts going crazy. :confused:
 

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
If something is causing your GPU to rapidly cycle between 2D and 3D, that would explain why it is idling 3C or 4C higher. What I can't understand is how: when you disable GPU monitoring in RealTemp, the ATI code that RealTemp uses is completely bypassed and does not run at all. After ATI is unchecked, RealTemp does not interact with your GPU in any way.

I'm not trying to disagree with you or make RealTemp out to be innocent. I'm just trying the best I can to troubleshoot this problem you are having.

Are your GPU overclock settings stable while running Furmark?

It's bedtime here. I'll have a look at your rom file settings tomorrow in RBE. This is the first report of a problem like this so I'm very interested in trying to figure it out.
 

id0l

New Member
Joined
Dec 27, 2010
Messages
7 (0.00/day)
System Name Gaming Rig/HTPC ((100% STABLE))
Processor Core 2 Quad Q9400 @ 3.4ghz, 8 x 425mhz FSB @ 1.192v
Motherboard ASUS P5Q-E
Cooling XigmaTek Dark Knight, push-pull w/Enermax Magma 120mm (x2)
Memory 2x2GB A-DATA DDR2-800 @ 850mhz 4-4-4-12
Video Card(s) ATI Radeon 4890 1GB @ 975mhz/1125mhz cooled by Thermalright T-Rad2 w/Xilence Redwing 92mm (x2)
Storage Seagate Barracuda 7200.10 250GB (x2), Seagate Barracuda 7200.11 500GB
Display(s) Acer X241W 24" 16:10 WUXGA, Samsung LN40A630 40" 1080p LCDTV
Case Antec P180 Silver @ < 30dB
Audio Device(s) Sony 1000w 5.1 DD/DTS/PL2 system via optical S/PDIF on Sound Blaster Audigy 2 ZS Platinum
Power Supply Antec TruPower Trio TP3-650
Software Windows 7 Ultimate x64
Benchmark Scores Max CPU temp @ full load (17hrs Y-Cruncher) - 56c // Max GPU temp @ full load (2hrs FurMark) - 64c
Hey, don't get me wrong, Unc, I'm just as confused as you are. :) I wasn't trying to point fingers. I would assume that RealTemp wouldn't touch my GPU if I disabled monitoring, but from what I can tell it's still doing something odd with it. I mentioned that it happened regardless of GPU-Z being open because I thought it might have been some kind of 'conflict' between the two programs causing the issue, but I don't think that's the case.

FurMark is 100% stable after a 2-hour run, with the max GPU temp coming in at 64C (I have aftermarket VGA cooling).

Take a look at my BIOS file when you get the chance and let me know if you see anything weird. The whole reason I had to use RBE was to lock the memory at a constant frequency. When Adobe Flash Player was updated a while back, it enabled video hardware acceleration, and every time a Flash movie would play (e.g. YouTube) my screen would flicker at the beginning. I tracked this down to the memory speed jumping from the low-power settings to the high-power settings. Apparently that causes screen flickering on the 4890s (the GPU clock increasing/decreasing does not cause screen flickering; only the memory clock does). Locking the memory speed to 1125 MHz across the different power states solved this issue (and I don't have to use CCC anymore for overclocking, HOORAY!). :cool:

Seriously though, I have been using RealTemp since I got my E6600 (perhaps before). That was probably 4-5 years ago. If I didn't love this little app I wouldn't use it religiously. :) Heck, I think I finally figured out a decent way to 'read' my CPU cores and tune their TJmax and Idle Calibration points to where the reported temps are somewhat accurate (at least, they seem that way - they all read very close under full load).
 
Last edited:

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
I found the Radeon BIOS that yours is based on and I didn't see anything too unusual. The only thing I noticed is that the T min hysteresis is 4 in the original BIOS and you have set it to 0. You also checked 'PWM ramp on' while it is unchecked in the original. Switching to the lookup table to get your fan to run constantly at 100% looks like it should work. All of your bumped-up clock settings seem fine, so nothing is jumping out at me.

When testing, try opening up the Task Manager and see if there is anything Adobe-related running in the background. Adobe has a habit of sliding things into your startup sequence that run in the background without most users knowing about it. I use Autoruns to pick through all the places that startup items can be hidden on a Windows PC.

http://technet.microsoft.com/en-us/sysinternals/bb963902

Try using CCC to monitor. Does it show the GPU switching back and forth from 2D to 3D with the clocks jumping up and down? Does running or exiting RealTemp change what CCC reports? Can you kill any Adobe-related tasks in the Task Manager and see if that changes anything?

After a good sleep I will have a thorough look at my code to see if I can see anything unusual and try to think of anything else that you can test to find out what's going on.
 

id0l

New Member
Joined
Dec 27, 2010
Messages
7 (0.00/day)
System Name Gaming Rig/HTPC ((100% STABLE))
Processor Core 2 Quad Q9400 @ 3.4ghz, 8 x 425mhz FSB @ 1.192v
Motherboard ASUS P5Q-E
Cooling XigmaTek Dark Knight, push-pull w/Enermax Magma 120mm (x2)
Memory 2x2GB A-DATA DDR2-800 @ 850mhz 4-4-4-12
Video Card(s) ATI Radeon 4890 1GB @ 975mhz/1125mhz cooled by Thermalright T-Rad2 w/Xilence Redwing 92mm (x2)
Storage Seagate Barracuda 7200.10 250GB (x2), Seagate Barracuda 7200.11 500GB
Display(s) Acer X241W 24" 16:10 WUXGA, Samsung LN40A630 40" 1080p LCDTV
Case Antec P180 Silver @ < 30dB
Audio Device(s) Sony 1000w 5.1 DD/DTS/PL2 system via optical S/PDIF on Sound Blaster Audigy 2 ZS Platinum
Power Supply Antec TruPower Trio TP3-650
Software Windows 7 Ultimate x64
Benchmark Scores Max CPU temp @ full load (17hrs Y-Cruncher) - 56c // Max GPU temp @ full load (2hrs FurMark) - 64c
I did set the lookup table to 100% across the board, but that's pretty much a moot point because my video card fans run at 100% anyway (the T-Rad2 heatsink has 2x 92mm fans attached, both connected to motherboard fan headers). I probably did edit the T min hysteresis, as it seems vaguely familiar, but I believe all of those settings have to do with fan speed, if I'm not mistaken.

Do you have a stock RV790 BIOS I can look at? Really though, I'd just like to have one as a backup, as mine "somehow" got deleted. I know, I'm terrible. :p I deleted it by accident.

There is nothing Adobe-related running in the background on my system, as I make sure to disable any extraneous programs through either msconfig or services.msc... I also have Autoruns. :)

Like I said, CCC reports the 2D clock just sitting at 240 MHz (idle) like it should. But considering the higher idle temps and what GPU-Z is reporting (which I trust more), I still think something is off.

Definitely let me know what you find!
 
Last edited:

unclewebb

ThrottleStop & RealTemp Author
Joined
Jun 1, 2008
Messages
7,249 (1.26/day)
TechPowerUp is a godsend when you forget to save a BIOS before tearing into it with RBE. :)

http://www.techpowerup.com/vgabios/

RealTemp gets its GPU temperature data and clock data directly from CCC, so if CCC is screwed up then RealTemp will be screwed up too. The default sensor reading interval for CCC is usually 5 seconds. If you turn off GPU-Z and CCC and just use RealTemp for GPU monitoring with the interval set to 1 second, does it still show this fluctuating GPU core speed?
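
(In practice, "directly from CCC" means going through AMD's ADL library that ships with the driver. The sketch below shows the kind of Overdrive5 calls involved; it's an illustration rather than RealTemp's actual code, assumes adapter index 0, and omits error handling and cleanup:)

Code:
// Sketch: GPU temperature and clocks via AMD's ADL runtime (atiadlxx.dll).
#include <windows.h>
#include <cstdio>
#include <cstdlib>

struct ADLTemperature { int iSize; int iTemperature; };  // millidegrees C
struct ADLPMActivity  { int iSize; int iEngineClock; int iMemoryClock; int iVddc;
                        int iActivityPercent; int iCurrentPerformanceLevel;
                        int iCurrentBusSpeed; int iCurrentBusLanes;
                        int iMaximumBusLanes; int iReserved; };  // clocks in 10 kHz

typedef void* (__stdcall *ADL_MALLOC)(int);
static void* __stdcall AdlAlloc(int size) { return malloc(size); }

int main()
{
    HMODULE adl = LoadLibraryA("atiadlxx.dll");   // 32-bit process: atiadlxy.dll
    if (!adl) return 1;

    auto Create  = (int(*)(ADL_MALLOC, int))GetProcAddress(adl, "ADL_Main_Control_Create");
    auto GetTemp = (int(*)(int, int, ADLTemperature*))GetProcAddress(adl, "ADL_Overdrive5_Temperature_Get");
    auto GetAct  = (int(*)(int, ADLPMActivity*))GetProcAddress(adl, "ADL_Overdrive5_CurrentActivity_Get");
    if (!Create || !GetTemp || !GetAct) return 1;

    Create(AdlAlloc, 1);                          // 1 = connected adapters only

    ADLTemperature t = { sizeof(t) };
    ADLPMActivity  a = { sizeof(a) };
    if (GetTemp(0, 0, &t) == 0 && GetAct(0, &a) == 0)  // 0 == ADL_OK
        printf("GPU: %d C, core %d MHz, memory %d MHz\n",
               t.iTemperature / 1000, a.iEngineClock / 100, a.iMemoryClock / 100);
    return 0;
}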

I tend to trust GPU-Z more than CCC as well. Are you using the 10.12 driver?

The other test would be if RealTemp and GPU-Z are both running and GPU-Z is reporting this varying GPU speed, does GPU-Z immediately flat line in 2D mode at 240 MHz after exiting RealTemp? I see another sleepless night tonight. :)
 

id0l

New Member
Joined
Dec 27, 2010
Messages
7 (0.00/day)
System Name Gaming Rig/HTPC ((100% STABLE))
Processor Core 2 Quad Q9400 @ 3.4ghz, 8 x 425mhz FSB @ 1.192v
Motherboard ASUS P5Q-E
Cooling XigmaTek Dark Knight, push-pull w/Enermax Magma 120mm (x2)
Memory 2x2GB A-DATA DDR2-800 @ 850mhz 4-4-4-12
Video Card(s) ATI Radeon 4890 1GB @ 975mhz/1125mhz cooled by Thermalright T-Rad2 w/Xilence Redwing 92mm (x2)
Storage Seagate Barracuda 7200.10 250GB (x2), Seagate Barracuda 7200.11 500GB
Display(s) Acer X241W 24" 16:10 WUXGA, Samsung LN40A630 40" 1080p LCDTV
Case Antec P180 Silver @ < 30dB
Audio Device(s) Sony 1000w 5.1 DD/DTS/PL2 system via optical S/PDIF on Sound Blaster Audigy 2 ZS Platinum
Power Supply Antec TruPower Trio TP3-650
Software Windows 7 Ultimate x64
Benchmark Scores Max CPU temp @ full load (17hrs Y-Cruncher) - 56c // Max GPU temp @ full load (2hrs FurMark) - 64c
unclewebb said:
If you turn off GPU-Z and CCC and just use RealTemp for GPU monitoring with the interval set to 1 second, does it still show this fluctuating GPU core speed?

No! It doesn't... it seems to be solid at 240 MHz! However, as soon as I start up GPU-Z the funkiness returns. So I closed RealTemp and tried opening just CCC and GPU-Z, thinking from what you said that it might have something to do with CCC itself, and guess what I found: running CCC + GPU-Z without RealTemp shows the same sporadic 'jumping' in the GPU clock (see attached pic). :confused: Now I'm starting to think this issue lies more with GPU-Z, or perhaps CCC itself, or the way GPU-Z is polling CCC/the video card (yeah, I don't really know how it works :)).

unclewebb said:
I tend to trust GPU-Z more than CCC as well. Are you using the 10.12 driver?

I am currently using 10.7. Hmmm... perhaps I will update tomorrow; I didn't know there was a new driver out. Though I am curious: there are now several different versions of CCC:

1. AMD Catalyst 10.12 Preview for Windows 7 - Featuring the new Catalyst Control Center (110MB)
2. Catalyst Software Suite (64 bit) English Only (72.6MB)
3. AMD Catalyst™ Accelerated Parallel Processing (APP) Technology Edition (88.9MB)

I have no clue what #1 and #3 contain or whether I should get one of those versions. I usually grab #2.

unclewebb said:
The other test would be if RealTemp and GPU-Z are both running and GPU-Z is reporting this varying GPU speed, does GPU-Z immediately flat line in 2D mode at 240 MHz after exiting RealTemp? I see another sleepless night tonight. :)
Yes! And therein lies the rub. :p

So, in summary, I now realize:
RealTemp v3.60 running by itself = no problem. GPU-Z running by itself = no problem. Running CCC by itself = why would I do that (lol)? Running GPU-Z with either RealTemp or CCC = GPU clock goes crazy. :)

Thanks for the link to the BIOS files. I see two for the ATI Radeon 4890 1GB but don't know which one is correct (perhaps they are the same?).
 

Attachments

  • ccc.gif
    ccc.gif
    15.4 KB · Views: 470
Last edited: