
Got 8800 GTX


rainwood


Yep, they're just like the Omega drivers for ATI: they are the standard manufacturer's drivers with modified settings (usually for image quality, performance, etc.)

 

If the manufacturer's driver package is missing something (i.e. the OpenGL 1.3 drivers in this case), then the modified driver package is going to be missing it too.

 

VJ, do you have any ideas about the OpenGL driver in the newer nVidia GPU driver packages?

 

No, and I didn't even know it was missing, since I don't play any OGL 1.3 games in XP with an 8800

(Vista is SOOOOOOOO screwed up I doubt OGL will ever work right in it) lol.

 

You wrote this earlier: "Nvidia's release notes it states that a proper openGL driver will be released later". I tried to find that in the notes for the 97.92 XP and 100.65 (WHQL) Vista drivers and came up dry.

 

Viper


Got some magic numbers.

 

GPU Clock / GPU Memory: 634 / 1031 (2062)
[Source]: "GeForce 8800 series round-up"

GPU Clock / GPU Memory: 650 / 1060 (2120)
[Source]: "Reviews: Grafikkarten: XFX NVIDIA 8800 GTX XXX Review auf Technic3D"

GPU Clock / GPU Memory: 660 / 1070 (2140)
[Source]: "XFX GeForce 8800 GTX XXX Edition Review"

GPU Clock / GPU Memory: 654 / 1010 (2020)
[Source]: "Releasing the Beasts – Overclocking the GeForce 8800's"

GPU Clock / GPU Memory:
650 / 1100 (2200)
683 / 1100 (2200)
692 / 1110 (2220)
692 / 1114 (2228)
704 / 1100 (2200)
[Source]: "GeForce 8800 GTX overclocking preview"

 

How I tested:

The GPU Clock (Geometric domain), GPU Clock (Shader domain) & GPU Memory Clock defined in the BIOS are 630 MHz, 1450 MHz & 1000 MHz, respectively.

With the GPU Clock (Shader domain) fixed, the GPU Clock (Geometric domain) & GPU Memory were adjusted.

The graphics driver was ForceWare Release 95 (version 97.92).

 

1. The GPU Clock & GPU Memory frequencies were adjusted using nTune (3D Settings > Performance > Manual Tuning > Adjust GPU Settings).

[Screenshot: nTune "Adjust GPU Settings" panel]

 

2. Tested the adjusted settings using the "Test" function of "Adjust GPU Settings".

3. If the nTune test passed, ran the 3DMark06 test with its default settings.

4. If the 3DMark06 test completed successfully, ran the F.E.A.R. performance test & played the game for three hours (Computer settings – Maximum, Graphics card settings – custom). A rough code sketch of this pass/fail ladder follows below.
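
For anyone who wants to track the same ladder programmatically, here is a rough Python sketch of the pass/fail logic. The three check functions are stand-ins for the manual steps above (nTune's "Test" button, a default 3DMark06 run, the F.E.A.R. benchmark plus a play session), not real APIs; fill in the observed results per candidate:

```python
# Sketch of the stability ladder described in steps 1-4 above.
# The three checks are placeholders for manual observations, not real APIs.

def ntune_test(core, mem):        # step 2: did nTune's "Test" pass?
    return True

def run_3dmark06(core, mem):      # step 3: did 3DMark06 complete at defaults?
    return True

def fear_burn_in(core, mem):      # step 4: did the 3-hour F.E.A.R. session hold up?
    return True

# Core (geometric-domain) / memory clock pairs collected from the reviews above.
CANDIDATES = [
    (634, 1031), (650, 1060), (660, 1070), (654, 1010),
    (650, 1100), (683, 1100), (692, 1110), (692, 1114), (704, 1100),
]

def verdict(core, mem):
    # Each stage only runs if the previous one passed.
    if not ntune_test(core, mem):
        return "failed the nTune test"
    if not run_3dmark06(core, mem):
        return "failed 3DMark06"
    if not fear_burn_in(core, mem):
        return "failed the F.E.A.R. burn-in"
    return "stable"

for core, mem in CANDIDATES:
    print(f"{core} / {mem}: {verdict(core, mem)}")
```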

 

Results:

GPU Clock / GPU Memory: 634 / 1031 (2062):

Passed the nTune test. 3DMark06 completed successfully, but the scores were lower than those from my default BIOS settings. The F.E.A.R. performance-test scores also dropped compared to the default BIOS settings. I noticed sluggishness during gameplay, and I hit a BSOD less than an hour after I started playing.

 

GPU Clock / GPU Memory: 650 / 1060 (2120), 660 / 1070 (2140):

Passed the nTune test. The 3DMark06 test failed: test 1 froze as soon as it started, and I had to shut down my system.

 

GPU Clock / GPU Memory: 654 / 1010 (2020):

The nTune test failed.

 

GPU Clock / GPU Memory: 650 / 1100 (2200):

Passed the nTune test. The 3DMark06 test failed: test 1 froze as soon as it started, and I had to shut down my system.

 

GPU Clock / GPU Memory: 683 / 1100 (2200):

Passed the nTune test. The 3DMark06 test failed: test 1 froze as soon as it started and, within a second or two, led to a BSOD: "STOP 0x000000EA THREAD_STUCK_IN_DEVICE_DRIVER". According to Microsoft, "This issue might occur if the display driver is caught in an infinite loop while it waits for the video hardware to become idle. This issue typically indicates a problem with the video hardware or that the display driver cannot program the hardware correctly."

 

GPU Clock / GPU Memory: 692 / 1110 (2220):

Passed the nTune test. The 3DMark06 test failed: the monitor screen blacked out right after test 1 started, and I had to shut down my system.

 

GPU Clock / GPU Memory: 692 / 1114 (2228), 704 / 1100 (2200):

The nTune test did not even complete: the OS froze during the test, and I had to shut down my system.

 

…so where is the magic?

 

None of the numbers worked out. Is this because I used a driver version different from the ones used when these sites tested their numbers? Should I have used the default BIOS settings, not the overclocked BIOS, when testing the numbers?

 

Anyway, at least I learned that the default XFX XXX GPU Clock (Geometric domain) & GPU Memory Clock frequencies, combined with the overclocked GPU Clock (Shader domain), work best in my system in terms of performance & stability…

 

Do we have someone out there who wants to try these numbers?


Do we have someone out there who wants to try these numbers?

 

You may be able to get a higher overclock if you enable "LinkBoost" in the BIOS or manually set the PCI-E frequency to 125 MHz or higher. Not sure how those guys got those clock speeds; perhaps they were using water cooling.

 

I'll give some of those numbers a go when I've got this new m/b running properly and will post the results.


You may be able to get a higher overclock if you enable "LinkBoost" in the BIOS or manually set the PCI-E frequency to 125 MHz or higher.
…I've had "LINK BOOST" enabled in the BIOS since day one... It'd be interesting to try changing the PCI-E bus frequency, but with "LINK BOOST" enabled, NVMonitor already gives me a warning that the PCI-E voltage is too high…

 

I'll give some of those numbers a go when I've got this new m/b running properly and will post the results.
I'm looking forward to seeing the results… (BTW, how's the Striker? Does it run hotter than the EVGA 680i? I mean the mainboard temperature.)

 

While testing clock speeds you might want to take a look at the post I put up in this section a little bit ago.
It is a good read. Thank you, Praz.

 

I will download RivaTuner and try the following settings:

  • GPU Clock (Geometric domain): 648 (635 --> 661 range)
  • GPU Clock (Shader domain): 1566 (657=1566)
  • GPU Memory Clock: 1008 (1004 --> 1116)

In his article, Tony tried the 648/1566/1080 (GPU Clock (Geometric domain) / GPU Clock (Shader domain) / GPU Memory Clock) settings. I will watch whether the actual GPU Memory Clock within the (1004 --> 1116) range ends up at 1008 or 1080.

 

BTW, I recall that there are "how-to" threads about RivaTuner in this forum. I'd better read them first (this is my first time trying RivaTuner)…


The 768 MB of memory is arranged in banks of 256 MB, so there is a clock generator for each bank. This is something else you can see while using RivaTuner: in the advanced section of the program you can display all three clock generators. With the drivers released so far, only one clock generator increases in frequency; the other two stay at the BIOS boot-up default of 400 MHz.
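
Rough code form of that layout (a minimal sketch; the per-generator readings in the example are made up for illustration, and in practice would come from RivaTuner's monitoring, e.g. the GPUProbe.dll plugin):

```python
# Sketch of the G80 memory arrangement described above: 768 MB in three
# 256 MB banks, one memory clock generator per bank, and (with current
# ForceWare releases) only the first generator actually reprogrammed.
TOTAL_MB, BANK_MB = 768, 256
BOOT_DEFAULT_MHZ = 400

num_generators = TOTAL_MB // BANK_MB   # -> 3
print(f"{TOTAL_MB} MB / {BANK_MB} MB per bank -> {num_generators} clock generators")

def report(requested_mhz, readings_mhz):
    # readings_mhz: one clock reading per generator (illustrative values here).
    for i, clock in enumerate(readings_mhz, start=1):
        if clock == BOOT_DEFAULT_MHZ:
            print(f"generator {i}: still at the BIOS boot default ({clock} MHz)")
        else:
            print(f"generator {i}: {clock} MHz (requested {requested_mhz} MHz)")

# Example matching the behaviour described: only generator 1 follows the driver.
report(1000, [1000, 400, 400])
```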


…And a quote from “RivaTuner Version 2.0 Final Release: What's new”:

“Added workaround for G80 memory clock frequency generation bug of the ForceWare 97.28 and newer drivers. Unfortunately a lot of gamers with GeForce 8800 graphics adapters seem to be unable to understand the principles of multiple hardware clocking and aggressively react on any attempts to explain it. Due to that reason now by default RivaTuner displays memory clock of the only memory clock frequency generator programmed by all versions of the ForceWare drivers. The clocks of the rest two generators, which are erroneously left by the driver in BIOS default 400MHz state, are now hidden from eyes of beginners. The power users still can monitor independent clocks of all 3 memory clock frequency generators on G80 via GPUProbe.dll plugin. And the gamers unable to get technical details may relax and continue "fellas, my new 8800 is so cool" related discussions in the forums.”

 

Yeah, that’s me…


The idle temperature of both 8800 GTX cards in my box is 58 – 59 °C. The temperature after playing the game for 3 – 4 hours is 67 – 68 °C (it will be higher than that during the game, but I have not measured it yet), and it drops back to the idle temperature within 5 – 10 minutes.

The idle CPU temperature is 27 – 29 °C, and 32 – 34 °C at load.

The mainboard temperature is 35 – 38 °C.

The hard drive temperature is 28 – 33 °C.

I have not looked at the DIMM and chipset temperatures yet, but I ordered the MCP fan from EVGA (though I do not know if there is any Win32 app that can read & report the MCP temperature; if you know, please let me know).

 

Among the components in my box, the graphics cards are the ‘hottest’ ones.

 

Here’s what I’d like to know: will the graphics cards die, or will their life be shortened, if I leave my box running 24/7?

 

Since I built my box to be a gaming box, I have no intention of using it as a 24/7 workstation, but I just wonder whether so-called high-end PC components are not designed for 24/7 use.


I ordered the MCP fan from EVGA (though I do not know if there is any Win32 app that can read & report the MCP temperature; if you know, please let me know).

 

Your board should already have an MCP fan with it. You have to install it when you unpack the motherboard. Mine was in a small black box; didn't you get one with your motherboard?

 

As for the MCP temp, there isn't a Windows app that can read this temp yet. I read somewhere that the next release of nTune will display this temp in NVIDIA Monitor.


Your board should already have an MCP fan with it.
…actually the fan that comes with the mainboard package is for the SPP. EVGA has recently started ‘selling’ the MCP fan for their 680i mainboard (EVGA's message board: "All Forums > EVGA Product Offerings > Motherboards > MCP (Southbridge) Fan Available Now!").

 

As for the MCP temp, there isn't a Windows app that can read this temp yet. I read somewhere that the next release of nTune will display this temp in NVIDIA Monitor.
Thanks for the info. NVMonitor… I don't usually use that tool to check the temps of my system components. I wonder how many app vendors are willing to sign an NDA with NVIDIA just to read the MCP temp…


…actually the fan that comes with the mainboard package is for the SPP. EVGA has recently started ‘selling’ the MCP fan for their 680i mainboard (EVGA's message board: "All Forums > EVGA Product Offerings > Motherboards > MCP (Southbridge) Fan Available Now!").

 

Thanks for the info. NVMonitor… I don't usually use that tool to check the temps of my system components. I wonder how many app vendors are willing to sign an NDA with NVIDIA just to read the MCP temp…

 

Ok cool. What fan do you plan to use with the N/B chip?

