
Nvidia 8800GTX overclocking, does the card overclock like you think it does?


Praz


This is a writeup Tony did on what the real clock speed is compared to the speed you set when overclocking the 8800 series cards. It applies to most, if not all, cards using the G80 core.

 

http://www.thetechrepository.com/showthread.php?t=133
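
For anyone who doesn't want to read the whole writeup, the short version is that the card runs at the nearest clock it can actually generate rather than exactly what you set, so the number you type in is often not the number you get. Here's a minimal Python sketch of that idea, using the 9MHz memory step reported for the GTS later in this thread; this is only an illustration of the snapping behavior, not NVIDIA's actual clock-generator logic, and the core/shader domains step in coarser, less regular increments.

def real_clock(set_mhz, step_mhz):
    # Snap a requested clock down to the nearest supported step.
    return (set_mhz // step_mhz) * step_mhz

# GTS memory examples (9MHz steps): setting 908 actually runs at 900,
# and anything from 792 to 800 runs at 792.
print(real_clock(908, 9))  # 900
print(real_clock(795, 9))  # 792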


Confirmed on my 8800GTS, except the shader intervals are different.

Now I realize that it was actually my shader clocks that were limiting my max OC (anything above 1500MHz gives the RivaTuner warning and causes artifacts).

I'm also wondering if the "Test" feature in Riva and Coolbits works properly with these cards.


  • GPU Clock (geometric domain): set speeds from 635 to 661 all run at 648.
  • GPU Clock (shader domain): 1566 (a setting of 657 gives 1566).
  • GPU Memory Clock: set speeds from 1004 to 1116 all run at 1080.

I downloaded the “XFX 8800GTX (XXX) PCIe 768MB BIOS (Rev 03, Version 60.80.0E.00.10)”, modified the GPU clocks and GPU memory clock to 648/1566/1080, and flashed my current BIOS to the modified one.

 

Yes, the hardware monitor in RivaTuner shows exactly the same values as defined in the BIOS for the GPU clocks and memory clock (though I noticed that the warning icon showed up in the shader clock graph window…)

 

The 3DMark06 test at its default settings froze shortly after the first test started.

 

OK, I don't fully understand what the findings mean: they show the actual running MHz for the MHz defined in the BIOS, but defining those actual running values in the BIOS does not guarantee a stable performance gain from the card…

 

So is it still trial and error to find stable overclocked settings for the XFX 8800GTX XXX?


3 weeks later...

This is a great thread that should get more attention...

 

Been trying to get my XFX 8800GTS XXX to behave nicely, and was using the information over at XS as a guide, until I realized, when pulling up the clocks on RivaTuner, that XFX overclocks the shaders to ~1512MHz out of the box...

 

So I did some testing and put together this handy dandy chart. Bear in mind the GPU clock/shader clock info is only valid for the XFX XXX model, while the memory clocks *should* be good for all 8800GTS models.

 

 

GPU Speed (XXX Model ONLY)

Set Speed (MHz) | Real (MHz) | Shader (MHz)
572 - 594 | 576 | 1512 - XXX Stock Speed
595 - 615 | 594 | 1566
618 - 641 | 612 | 1620
642 - 664 | 621 | 1674 - My Max Stable Overclock
665 - 687 | 648 | 1728
688 - 711 | 675 | 1782

501 - 524 | 513 | 1188 - nVidia GTS Reference (?)

Memory Speed (all GTS Models)

Set Speed (MHz) | Real Speed (MHz)
792 - 800 | 792
801 - 809 | 801 - Nvidia Reference
810 - 818 | 810
819 - 827 | 819
828 - 836 | 828
837 - 845 | 837
846 - 854 | 846
855 - 863 | 855
864 - 872 | 864
873 - 881 | 873
882 - 890 | 882
891 - 899 | 891
900 - 908 | 900 - XXX Stock Speed
909 - 917 | 909
918 - 926 | 918
927 - 935 | 927
936 - 944 | 936
945 - 953 | 945
954 - 962 | 954
963 - 971 | 963
972 - 980 | 972
981 - 989 | 981
990 - 998 | 990
999 - 1007 | 999
1008 - 1016 | 1008 - My Max Stable Overclock
1017 - 1025 | 1017
1026 - 1035 | 1026
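
If you'd rather not eyeball the chart every time, here's a small Python lookup built straight from the intervals above. It's just the chart in code form (the XXX core/shader steps plus the 9MHz memory step), so it only reflects the measurements above and says nothing about what will actually be stable on any given card.

# Intervals taken directly from the chart above (XFX 8800GTS XXX only).
XXX_CORE_STEPS = [
    # (lowest set speed, real core MHz, shader MHz)
    (572, 576, 1512),
    (595, 594, 1566),
    (618, 612, 1620),
    (642, 621, 1674),
    (665, 648, 1728),
    (688, 675, 1782),
]

def real_core_and_shader(set_mhz):
    # Return the (real core, shader) pair the card will actually run at.
    result = None
    for lowest_set, real_core, shader in XXX_CORE_STEPS:
        if set_mhz >= lowest_set:
            result = (real_core, shader)
    return result

def real_memory(set_mhz):
    # GTS memory clocks snap down in 9MHz steps.
    return (set_mhz // 9) * 9

print(real_core_and_shader(650))  # (621, 1674) - max stable OC per the chart
print(real_memory(1010))          # 1008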

All in all, with stock volts and cooling, I could push:

GPU | Shader | Memory (real)

621 | 1674 | 1008

To my knowledge, these speeds are *very* unusual; in general, most cards crash when the shader clock pushes past the 1620MHz mark. With reference GTS cards, this happens at 648MHz core (real). On the XXX, this occurs at a lower core clock (612MHz), since XFX has raised the base shader clock reference.

Ironically, in making the fastest out-of-the-box GTS card on the market, I think they've limited the overall overclockability of the card... :rolleyes:

Peace

