SLI - Detect optimal Frequencies?


SSmoker

Recommended Posts

Quick question. I am running (2) 6800GTs in SLI. Stock speeds are 350 MHz / 1.0 GHz. When I hit the "Detect optimal frequencies" button in advanced setup, it tells me 430 MHz / 1.15 GHz. My question: is this a good, stable setting? I'm a bit confused, because just last night I adjusted these frequencies to 357/1.05 and got quite a bit of tearing and artifacting in BF2. In-game, all settings are maxed. If those lower, slightly overclocked settings caused tearing, won't this "optimal" setting make it worse? Thanks for helping an SLI newbie out. :confused:

Tearing is caused by the FPS running out of sync with the refresh rate of the monitor. You can eliminate it by turning on v-sync; however, v-sync caps the FPS at your refresh rate (usually 60) and drops it to 30 whenever it falls below that, so I don't use v-sync. I prefer to turn up the graphics until the FPS stays below 60.
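That 60-to-30 jump is just double-buffered v-sync holding each frame until the next vertical blank, so frame times snap up to a multiple of the refresh period. A rough model of it (the numbers are illustrative, not measurements):

```python
# Rough model of double-buffered v-sync: a frame that isn't ready by a
# vblank must wait for the next one, so frame time rounds UP to a whole
# number of refresh periods.
import math

def vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective FPS when frames are held until the next vblank."""
    period = 1.0 / refresh_hz                  # seconds between vblanks
    render_time = 1.0 / render_fps             # raw GPU frame time
    vblanks = math.ceil(render_time / period)  # vblanks consumed per frame
    return refresh_hz / vblanks

print(vsync_fps(90))   # faster than refresh -> locked to 60.0
print(vsync_fps(55))   # just misses a vblank -> snaps down to 30.0
```

This is why a card that renders at 55 fps displays 30 with v-sync on: each frame misses one vblank and has to wait out a whole second refresh period.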

 

As for optimal frequencies, they are usually all over the place. If the settings work and cause no artifacts, go ahead and use them, but in my experience they are never truly "optimal".


The optimal frequency detection is neither thorough nor accurate enough, IMO. The test is way too short to really determine the true maximum core/memory clocks of the cards.

 

The best way to find the optimal core/memory frequency is to raise the clocks in small 5 MHz increments and test (by playing games and such) for at least 4 hours each step. Repeat until you start experiencing problems; once you do, drop the frequency back 5 MHz and there you have it.
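The search loop above is simple enough to sketch. Here `is_stable` is a hypothetical stand-in for your tuning tool plus the 4-hour gaming session; only the step-and-back-off logic itself is shown:

```python
# Sketch of the stepwise overclock search described above.
# is_stable(mhz) is a hypothetical callback: set the clock, game for
# hours, and report whether any artifacts/tearing appeared.
def find_max_stable(start_mhz: int, step_mhz: int, is_stable) -> int:
    """Raise the clock in step_mhz increments until is_stable() fails,
    then report the last clock that passed."""
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example with a fake stability limit of 432 MHz:
print(find_max_stable(350, 5, lambda mhz: mhz <= 432))  # -> 430
```

The back-off is implicit: the loop stops before committing to the first failing step, so the returned clock is the highest one that tested clean.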


I just tried this feature out.

I attained 450/1189 manually (which was stable), and the auto-detect set it to 469/1130, so it seems to favour a higher core clock and a lower RAM frequency.

 

I'll do some benchmarks to see how it compares, and whether this detected config is stable.


I prefer to turn up the graphics until the FPS stays below 60.

 

How are you doing this with BF2?! NVTray and elevated settings? My overclocked 7800GT runs everything at all-high settings with EAX enabled, "High Quality" and "Clamp LOD bias" set, Hardware T&L manually cranked up through RivaTuner, supersampling on, etc. I can't find anything else to turn up, yet I average ~90 fps and rarely even see a number in the 60s, except during heavy arty drops.


The optimal frequency detection is neither thorough nor accurate enough, IMO. The test is way too short to really determine the true maximum core/memory clocks of the cards.

 

The best way to find the optimal core/memory frequency is to raise the clocks in small 5 MHz increments and test (by playing games and such) for at least 4 hours each step. Repeat until you start experiencing problems; once you do, drop the frequency back 5 MHz and there you have it.

 

If this is true, I basically can't OC my cards without getting tearing/artifacts in BF2... THIS SUCKS. I really have a hard time believing these cards can't be bumped up from stock and still run clean. If anyone who plays BF2 has some suggestions, it would be greatly appreciated.

 

PS: How do you see FPS in BF2? Thanks, all.


If this is true, I basically can't OC my cards without getting tearing/artifacts in BF2... THIS SUCKS. I really have a hard time believing these cards can't be bumped up from stock and still run clean. If anyone who plays BF2 has some suggestions, it would be greatly appreciated.

 

PS: How do you see FPS in BF2? Thanks, all.

 

You can use the command console ("~" key) to enter a command that shows in-game FPS; lemme see if I can dig it up for you.

 

Edit: Found it. Hit the "~" key during play; when the white command console comes up, type in "renderer.drawfps 1" and hit "~" again to hide the console.

 

I just use Fraps to view FPS in the corner of the screen. Best is to run Fraps, start the game, hit F11, then hit F11 again just before you stop gameplay. It will give you a spreadsheet report of min, max, and average FPS. Do it a few times on a few different servers/maps to get a good idea of what you are up against.
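The min/max/avg report is nothing mysterious, just summary stats over the logged FPS samples. A minimal sketch of the same calculation (the sample values here are made up, not real Fraps output):

```python
# What a min/max/avg FPS report boils down to, given one FPS sample
# per second of benchmarked gameplay. The numbers below are fake
# illustrative readings, not actual Fraps data.
def fps_summary(samples: list) -> tuple:
    """Return (min, max, average) over a list of FPS samples."""
    return min(samples), max(samples), sum(samples) / len(samples)

samples = [88, 92, 95, 61, 90, 114]     # fake per-second readings
lo, hi, avg = fps_summary(samples)
print(f"Min {lo}  Max {hi}  Avg {avg:.1f}")  # Min 61  Max 114  Avg 90.0
```

The minimum is usually the number that matters for overclock testing: a clean average can hide the dips where artifacts and stutter actually show up.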


Also, BTW, you really need to get another 1 GB of RAM if you want to run BF2 textures on high. I swear that going from 1 GB to 2 GB on my 6600GT made a bigger difference than going from the 6600GT to the 7800GT: on the 6600GT my pings dropped from ~80 ms to ~40 ms, even with the higher settings.

 

To get a better idea of whether the artifacting is heat-related (it might not be), run rthdribl in as large a window as your VRAM will allow (probably fullscreen). Let it run for an hour or two while keeping an eye on temps. rthdribl will heat your GPUs several degrees hotter than any game will, and is kind of the ultimate GPU stability test, IMHO. For instance, with rthdribl my 7800GT loads to 63C, while it never exceeds 58C in BF2 and hits about 60C in COD2.

 

This may be as simple as adding more or better exhaust fans to your case.


Quick question. I am running (2) 6800GTs in SLI. Stock speeds are 350 MHz / 1.0 GHz. When I hit the "Detect optimal frequencies" button in advanced setup, it tells me 430 MHz / 1.15 GHz. My question: is this a good, stable setting? I'm a bit confused, because just last night I adjusted these frequencies to 357/1.05 and got quite a bit of tearing and artifacting in BF2. In-game, all settings are maxed. If those lower, slightly overclocked settings caused tearing, won't this "optimal" setting make it worse? Thanks for helping an SLI newbie out. :confused:

Maybe this is a heat problem, and the Detect Optimal Frequencies button tells you 430/1150 MHz is OK because the system is cool at that moment.

 

Try this: play at stock frequencies for a while (any game will do) to let the system reach a higher temperature. Then try the DOF button again and see if it reports the same frequencies or lower ones this time. You could also adjust the two frequencies (3D GPU and RAM) manually, push the Test button, and see if they pass that way.

 

If temperature changes the way your cards behave, then you need to increase the airflow around them.

 

I have watercooling; the button tells me 440/1195, and I run them at 445/1200 MHz all the time without a single problem, artifact, or tear.


I have watercooling; the button tells me 440/1195, and I run them at 445/1200 MHz all the time without a single problem, artifact, or tear.

 

I hadn't run "Detect Optimal Frequencies" in a long time, so I just gave it a shot...

 

It came back with 518/1190. I actually game at 530/1351 and bench at 550/1389, so this thing is just wildly inaccurate.

