Overclocking 9800 Gx2


Ramm

I need help overclocking my 9800 GX2. I've looked through the NVIDIA Control Panel and found nothing.

 

I'm just getting into this overclocking thing (last night I overclocked my E8500 to 3.86 GHz).

 

Figured this would be a good place to ask, seeing as I found the help to overclock my CPU here as well.

 

Thanks in advance!


In addition to RivaTuner, I recommend downloading and installing ATiTool, then making an SLI profile in:

 

NVidia Control Panel > 3D Settings > Manage 3D Settings > Add > (browse to C:\Program Files\ATITool\ATITool.exe)

Feature: SLI Performance Mode = Setting: Force alternate frame rendering 2

 

Download and extract RTHDRIBL to C:\Program Files\RTHDRIBL, create a shortcut on the desktop and/or Start Menu, and create an SLI profile for it in the same way as for ATITool.

 

Now you have two 3D applications that you can run on the desktop while adjusting the GPU/Shader/GDDR speeds with RivaTuner.

 

With ATiTool, you can activate the Scan For Artifacts mode and increase the speed of the GPUs (the speeds of the two cards are adjusted together) in 5 MHz increments until you get yellow dots and errors reported in the 3D window. Leave the Shader linked to the GPU for now; you can try adjusting its speed separately afterwards. Then take the GPU speed back down to stock and repeat the incremental testing for the GDDR.
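The stepping procedure above is essentially a linear search for the highest artifact-free clock. Here is a minimal sketch of that logic; ATiTool itself isn't scriptable, so `scan_for_artifacts` is a hypothetical stand-in for watching the 3D window by hand:

```python
# Illustrative only: find the highest clock that shows no artifacts,
# raising the speed in fixed MHz increments as described in the post.
def find_max_stable(stock_mhz, scan_for_artifacts, step=5, limit=200):
    """Raise the clock in `step` MHz increments until artifacts appear,
    then return the last artifact-free speed."""
    clock = stock_mhz
    while clock + step <= stock_mhz + limit:
        if scan_for_artifacts(clock + step):  # True = artifacts seen
            break
        clock += step
    return clock

# Example with a pretend card that artifacts above 660 MHz:
print(find_max_stable(600, lambda mhz: mhz > 660))  # -> 660
```

You'd run the same search once for the GPU clock and once for the GDDR, as the post describes.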

 

Once you have found the maximum GPU and GDDR OCs separately without errors, bring both speeds up to near their maximums and see what the combined maximum stable is.

 

Return the speeds to stock. Close ATiTool, run RTHDRIBL, and press 8, b, o, o to activate 16x AA, bump-mapping, and the skull model. Increase the speeds to the maximums you found with ATiTool and observe the 3D animation for any strange artifacting (sudden flashes of triangles, yellow dots, etc.).

 

Download and install all the Futuremark 3DMark benchmarks (01SE, 03, 05, 06). If you have Vista, 3DMark Vantage is more advanced and a good 3D stability test for new cards like yours. Aquamark 3 is another benchmark worth downloading.

 

You'll probably find that the maximum OC you found will lock up in one or more of these benchmarks, so take the GDDR speed back down to stock, reduce the GPU speed slightly, and run all the 3D benchmarks to make sure you can complete them all. Then bring the GPU back up towards the maximum found earlier in small steps until you can't pass one of the benchmarks (I found 06 easier to pass than 05, so try 05 first if anything). Then take the GPU back down to stock and repeat the process for the GDDR: near-maximum first, then small steps back up until every test completes.
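The back-off-and-step-up routine above can be sketched the same way. The benchmark names and the `run_benchmark` helper here are hypothetical stand-ins for launching each 3DMark run by hand:

```python
# Illustrative only: confirm every benchmark passes at a clock, then
# creep up from stock toward the ATiTool maximum in small steps.
BENCHMARKS = ["3DMark01SE", "3DMark03", "3DMark05", "3DMark06"]

def passes_all(clock_mhz, run_benchmark):
    # Try the fussiest benchmark (05, per the post) first to fail fast.
    ordered = ["3DMark05"] + [b for b in BENCHMARKS if b != "3DMark05"]
    return all(run_benchmark(name, clock_mhz) for name in ordered)

def max_benchmark_stable(stock, atitool_max, run_benchmark, step=5):
    """Highest clock, stepping up from stock in `step` MHz increments,
    at which every benchmark still completes (capped at the ATiTool max)."""
    clock = stock
    while clock + step <= atitool_max:
        if not passes_all(clock + step, run_benchmark):
            break
        clock += step
    return clock
```

The design point is simply that the benchmark-stable maximum is searched *below* the artifact-scan maximum, since benchmarks stress the card differently than the artifact scanner.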

 

Finally, do the combined GPU and GDDR OCs for all tests.

 

Once you've done that, it's pretty much guaranteed stable in everything, but the best test of all is extended play in modern graphics-heavy games like Crysis, COD4, etc.



Thanks a lot guys!

 

Can't wait to get home now and mess around with this...


So, can you not open the 3D/Artifact windows in ATiTool? I can't use ATiTool to OC my cards in Vista, but I use RivaTuner anyway and only use ATiTool for the basic 3D testing.


Hmm... I'm on the 175.19 drivers, though I just upgraded from 169.xx; I didn't try it with those.

 

::edit:: Did a bit of Googling and found out it's a Vista compatibility issue, but there is a beta (well, it was a beta in '07, don't think it ever finished being a beta lol) which solves the problem. Got it running now, thanks :)

 

How high should I expect to be able to go with this card? I see your cards have some pretty awesome overclocks on them, but mine isn't water cooled and it's the A2 revision, which I believe is not as overclockable. Stock speeds are:

core 513

mem 792

shader 1188

 

Atm I have got to 640/792/1446 without ATiTool picking up any artifacts, but I think when doing my 3DMark attempts I was not able to get the core past around 610 (with the shaders linked at about 1400). Is ATiTool not very hard on the GPU or something? (getting like 900 fps lol)

 

::edit2:: At 651/792/1507, no errors for 5 mins in ATiTool.
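A quick sanity check of the numbers in that edit: core and shader ended up overclocked by almost exactly the same percentage, which is what you'd expect when the shader speed is kept near its stock ratio to the core:

```python
# Headroom implied by the figures above (stock 513/792/1188,
# edit2 result 651/792/1507).
def oc_percent(stock, oc):
    return (oc - stock) / stock * 100

print(round(oc_percent(513, 651), 1))    # core:   ~26.9 %
print(round(oc_percent(1188, 1507), 1))  # shader: ~26.9 %
print(round(1188 / 513, 2))              # stock shader:core ratio, ~2.32
```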

Edited by Comp Dude2


2 weeks later...

ATiTool is a great app for testing - I find it is super sensitive to even the most remotely unstable card clocks, and shows up errors instantly.

 

To ramm's original question: I would stay away from too much GX2 overclocking, as the card tends to throttle quite aggressively if you apply even remotely unstable settings.

 

For example, my card will reach 720/1800/1050, 100% stable in ATiTool and through multiple passes of 3DMark06, games, etc. However, my actual 3DMark score - as well as my FPS in-game - actually gets worse at any speed beyond 675/1687/1050. I can increase my shader speed to about 1800 with the core and memory still at 675/1050, but the performance increases are negligible: 1-2 FPS in game and maybe 100 3DMarks overall.

 

Ironically, this might be why none of the retail GX2s are clocked beyond 675/2100. An inherent shortcoming of the card's hardware perhaps?
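For anyone puzzled by "1050" in one post and "2100" in the next: GDDR3 is double data rate, so the effective figure quoted in retail specs is twice the real memory clock:

```python
# 1050 MHz real GDDR3 clock == 2100 MHz effective (double data rate).
def effective_ddr(real_clock_mhz, multiplier=2):
    return real_clock_mhz * multiplier

print(effective_ddr(1050))  # -> 2100, i.e. the retail 675/2100 spec
```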

Edited by politbureau


720/1100 is about the best I can get from the ones I have toyed with.
What kind of 3DMark06 scores were you getting with the GX2 on the Abit? Mine scores around 20k without much effort, but my mate with the same mobo as you is having trouble breaking 16k, and he's got an E8400 @ 3.8.

 

I'm actually not sure if it makes much of a difference, but I'm using Vista x64 and he's still on 32-bit. Thoughts?

