
LYNCHIN

Members
  • Posts

    21
  • Joined

  • Last visited

About LYNCHIN

  • Birthday 08/18/1975

Profile Information

  • Gender
    Male
  • Location
    Canada
  • Interests
    Father of Two. Big time Gamer. P.C. Overclocker

OCC

  • Computer Specs
    C2D XEON [email protected], EVGA GTX260, G-Skill 4*1GB sticks F2-6400CL5D-2GBNQ, ASUS P5B Vanilla BIOS 1604,
    Zalman CNPS 9500 LED, CoolerMaster 600W PSU, Logitech G5/keyboard, Acer 24-inch widescreen, Win 7 Ultimate 64-bit, huge black case

LYNCHIN's Achievements

Newbie (1/14)

  1. Today I ran the benchmark with PhysX on, and again with PhysX turned off. My scores are in my earlier post. PhysX on: 7872 (GPU=9291, CPU=5399). PhysX off: 7882 (GPU=9303, CPU=5405). So I'm not sure what is happening, but it is confusing. I haven't changed anything except my GPU drivers: I am now running the 196.75s, last week it was the 196.21s. Anyone else have a suggestion?
  2. I just re-ran the Vantage test with my CPU overclocked to 3.0GHz instead of the default 2.6GHz, PhysX off in the control panel. Score: 7882 (GPU=9303, CPU=5405). Then I ran it with PhysX on: 7872 (GPU=9291, CPU=5399). So what is happening here?
  3. I was running the Vantage benchmark today, testing out my GPU's new overclock, and got a score of 7559 (GPU=9745, CPU=4518). I then returned my GPU to stock speeds to make sure it wasn't throttling: 7290 (GPU=9206, CPU=4489). Losing a few hundred points after returning the GPU to stock is understandable. But here is the main issue: about a week ago I ran Vantage and got a score of 11048 (GPU=9188, CPU=28125)! How is that possible? I lost almost 4000 points, and I have not changed my CPU's speeds at all. The only thing I have changed is installing the latest Nvidia drivers, but as you can see my GPU scores are still very close; I just don't get how my CPU score can change so much.

     And BTW, my CPU is at its default speeds too; ever since I installed Win 7 Ultimate 64 I have left it at default until I get everything sorted out driver- and software-wise. I am fairly new to the Vantage test, as I just got a DX10 OS, but I did have PhysX turned off in the Nvidia control panel, picked 'Let the 3D application decide' for all the other control panel settings, and selected the 'Performance' preset in Vantage to leave everything at default. I really need to sort this out; now I am wondering if this is why my games have not been running at consistent speeds. What is happening, people? Cheers
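For what it's worth, the arithmetic alone points at that 28125 CPU sub-score. Assuming Vantage's Performance preset combines the sub-scores as the weighted harmonic mean described in FutureMark's whitepaper (GPU weighted 0.75, CPU weighted 0.25), every overall score in this thread is reproduced almost exactly; a minimal check in Python:

```python
# Recombine Vantage GPU/CPU sub-scores into the overall score, assuming the
# Performance preset's documented weighting: a weighted harmonic mean with
# GPU = 0.75 and CPU = 0.25.

def vantage_overall(gpu: float, cpu: float) -> float:
    return 1.0 / (0.75 / gpu + 0.25 / cpu)

runs = [
    ("GPU OC, this week", 9745,  4518,  7559),
    ("GPU stock",         9206,  4489,  7290),
    ("a week ago",        9188, 28125, 11048),
]
for label, gpu, cpu, reported in runs:
    print(f"{label:18s} computed {vantage_overall(gpu, cpu):6.0f}  reported {reported}")
```

All three computed totals land within a point of the reported scores, so the 4000-point swing is entirely the CPU sub-score. A ~28k CPU result on a Core 2 Duo is the usual signature of GPU-accelerated PhysX taking over Vantage's CPU Test 2; with the test back on the CPU (whether due to the driver change or the control-panel setting), toggling PhysX barely moves the numbers, which matches the two later posts.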
  4. Additional Information: X-Fire Username: No Information. Steam Username: No Information. Computer Specs: C2D XEON [email protected], EVGA GTX260, G-Skill 4*1GB sticks F2-6400CL5D-2GBNQ, ASUS P5B Vanilla BIOS 1604, Zalman CNPS 9500 LED, CoolerMaster 600W PSU, Logitech G5/keyboard, Acer 24-inch widescreen, Win 7 Ultimate 64-bit, huge black case. I found this under the info for my name; those are my specs.
  5. I was looking under the plugins section and noticed the VT1103.dll sensor. When I have that plugin enabled under RivaTuner's hardware monitoring, the temps are different from the ones EVGA Precision is displaying. Right now, while running an artifact scan, EVGA Precision shows a temp of 70C, while under RivaTuner I get: VRM phase 1 temperature = 55C, VRM phase 2 temperature = 56C, VRM phase 3 temperature = 57C. So which of these temps is the best one to go by? I am trying to find a new high overclock after installing Win 7 Ultimate 64, and I really need an accurate temp for my GPU. I will also be adding voltage via the EVGA Voltage Tuner, so it is critical that I have valid temps. Can someone please help me understand which temp to go by, or is there a better way to monitor my GPU temps?

     Here is the description of the plugin in question from RivaTuner; it clearly states that it is for the GTX 200 series: "This plugin provides Voltage regulator output, Voltage regulator temperature and Voltage regulator current hardware monitoring data sources for display adapters with Volterra VT1103/VT1105/VT1165 voltage regulators. Tip: VT1103/VT1105/VT1165 voltage regulators are used on reference design ATI RADEON X1800, X1900, HD 3800, HD 4800 and NVIDIA GeForce GTX 200 series display adapters."
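Both readings can be correct at once: Precision reports the GPU core diode, while the VT1103 plugin reads the Volterra regulator's per-phase temperatures, a physically separate part of the board. The core diode is the number overclocking headroom is normally judged against; the VRM temps become important once extra voltage goes in. For a second, independent log of the core temp, a sketch like this works if the driver's nvidia-smi build supports --query-gpu (an assumption; GTX 200-era drivers may not expose it):

```python
# Log the GPU core-diode temperature once a second during an artifact scan.
# Assumes an nvidia-smi recent enough for --query-gpu. The VRM phase temps in
# RivaTuner come from a different sensor and will not match this number.
import subprocess
import time

def core_temp_c() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

for _ in range(600):          # roughly ten minutes of samples
    print(f"core: {core_temp_c()} C")
    time.sleep(1)
```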
  6. I have a GTX260 (192 SP) and am wondering what to upgrade to. I know I don't have the Core 216 model, but I can get a decent 700MHz overclock on my core. So I am wondering, if I make the jump to an ATI 5850, how much gaming FPS gain will I really see? http://www.hardwarecanucks.com/forum/hardw...k-review-3.html I was reading this review, and I know they were getting 680MHz on the core as their super overclock, and it was compared to the GTX 275 in most cases. I can hit 700MHz on my core, so would that even things out and put my card comparable to a GTX 275 too? Or is my CPU too much of a bottleneck to power the 5850 properly? I know there is a 5870, but it is a bit out of my price range right now. Can someone please give me some real answers here: will I really see a big performance gain, or should I just keep saving till I can get a new CPU/GPU together? My rig is almost at its end, but it would be nice to just drop a 5850 in and keep on gaming for now :)
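One sanity check before comparing clock numbers across cards: MHz only translates to FPS within the same chip, and even then only as an upper bound. A back-of-envelope sketch, assuming FPS scales at best linearly with core clock:

```python
# Upper-bound estimate of what a core overclock buys on the same chip. Real
# games scale less (memory bandwidth, shader clock, and CPU limits cap it),
# and a 5850's 680 MHz is a different architecture entirely, so it cannot be
# compared to a GTX 260's 700 MHz by clock alone.
gtx260_stock, gtx260_oc = 576, 700      # reference core clock vs. the OC, MHz
print(f"best-case gain from the OC: {gtx260_oc / gtx260_stock - 1:.1%}")  # ~21.5%
```

That ~21% ceiling is roughly why a well-clocked 192-SP card can flirt with GTX 275 numbers, but it says nothing about where a 5850 would land.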
  7. Sparkle FSP550-60PLG: that is the PSU installed in my rig at the moment. I am wondering if it will run an ATI 5870? I only have two 750GB hard drives and one DVD burner installed, so it is not overloaded with items drawing power. I know the Sparkle PSU is a bit old, but I am wondering if it will be able to run a new ATI GPU. I would like to stick with Nvidia's new GPUs, but then I would definitely have to get a new PSU as well. I know there is another ATI GPU just under the 5870; I think it is the 5850 or something. Am I pushing things too much trying to run a 5870? Thanx
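As a rough sanity check: AMD specs the HD 5870 at about 188 W board power and recommends a 500 W supply, so the label wattage is probably adequate; what actually matters is how much of the Sparkle's 550 W is available on +12V, which should be read off the unit's own rail sticker. A sketch with the +12V fraction and the rest-of-system draw as loudly assumed numbers:

```python
# Rough +12V budget check. The 70% usable-on-12V fraction and the 150 W
# rest-of-system estimate are assumptions, not specs for this exact unit;
# verify against the PSU label before buying anything.
psu_12v_w = 550 * 0.70      # assumed usable +12V capacity of the FSP550-60PLG
gpu_w     = 188             # HD 5870 board power per AMD's spec
rest_w    = 150             # guess: C2D system, two HDDs, DVD burner, fans
print(f"headroom: {psu_12v_w - gpu_w - rest_w:.0f} W")   # ~47 W if the guesses hold
```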
  8. Can no one help me out? Will adding voltage via the EVGA Voltage Tuner help with artifacts? I used to have a nice 720MHz core overclock; now I get artifacts trying to push 700MHz. So will adding voltage help achieve this? Thanx
  9. My EVGA GTX260 is the 192-SP model :( But it does do a decent OC. Right now the only game I can't completely max out and still get my acceptable 35-45 FPS is the new Stalker COP! But I know there are some nice new DX11 games coming down the pipe soon, and the longer I wait, the less I will get for my GPU. So it's either sell now and get something for it, or wait a few more months for Nvidia's next line of GPUs to push mine out of the top tier, where it will then be worth next to nothing. Decisions! I thought ATI had a GPU just under the 5850 with the same number of SPs or something, but a slower clock? What is ATI's new budget overclocking king?
  10. Well, I know my GTX 260 is starting to show its age, but compare it to ATI's new 5830: even after reading this review from this site, http://www.overclockersclub.com/reviews/sa...re_hd5830/8.htm , at all resolutions under Crysis Warhead only a few FPS separate ATI's new 5830 from a GTX260. And I know I can overclock mine to at least 700MHz on the core, which puts it very close to the performance of a GTX 275 ;) So, aside from the DX11 support, what would be a good GPU to upgrade to? I am looking for the new budget overclock king. I really haven't had the time to read up on all of ATI's offerings or what Nvidia is really offering, and I don't have a lot of cash these days; being a father of two little kids takes almost all of my money. Who holds the new budget overclock gaming performance crown? Thanx for your help. I plan on getting what I can for my GTX260, then put up the rest of the cash to get what I can. So please give good advice; links to support your views or thoughts would really help. Cheers
  11. Hi all, haven't been around here in a while. Ever since I installed Win 7 Ultimate 64-bit I have been on a mission to max out my overclocks one more time and squeeze every last FPS I can out of my rig. But I am running into some issues. While trying to OC my GPU via EVGA Precision and using ATITool's artifact scanner, I was getting some very frustrating results. Before I upgraded my OS, I was able to OC my GPU's core to at least 720MHz without getting any artifacts (while leaving my shader clock at 1452MHz and memory at 1100MHz). And I knew I could push it more before the new OS install, but there really wasn't a need to, because I could play my games at max settings and acceptable FPS (35-60).

     Now, with new games coming out like Stalker COP, to name one, I am seeing that I need to push my GPU and CPU to levels higher than before if I want to keep the eye candy on while maintaining decent FPS. So I just tried a mild OC of 680MHz on my core while leaving the shader at 1452MHz and memory at 1100MHz. I ran the artifact scanner, and two minutes later I got some very small dots on the screen. So I aborted the test, and here I am wondering what has happened to my once-decent overclock. I even backed my memory down to its stock speed, and I was able to hit 700MHz on the core before I got little artifacts, but this is nowhere near what I was able to hit before.

     So I am wondering: if I run the EVGA Voltage Tuner and add some voltage to my GPU, will I be able to get my overclock higher without getting artifacts? And BTW, my temps during these little tests were very good; my GPU never went higher than 70C. I guess what I am asking is whether my core-clock artifacting is voltage-limited. Will adding voltage help my GPU not artifact, or will it just make things worse? Also BTW, when I was hitting my previous higher overclocks I hardly ever used them; I usually kept my GPU at stock clocks unless I was playing Crysis or benching, and even then not for very long periods, so my GPU was not abused, just to clear the air. What is happening, guys? Why is my GPU now unable to hit 700MHz without artifacting? Cheers
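Whatever turns out to be the cause, re-finding the ceiling goes much faster as a bisection than as a slow walk upward. Here is a sketch of the procedure; set_core_clock() and artifact_scan_passes() are hypothetical stand-ins for the manual steps in EVGA Precision and ATITool, since neither tool exposes a scripting API:

```python
# Bisect for the highest artifact-free core clock between a known-good low
# clock and a known-failing high one. The two helpers below just prompt for
# the manual Precision/ATITool steps; they are placeholders, not real APIs.

def set_core_clock(mhz: int) -> None:
    input(f"Set the core to {mhz} MHz in EVGA Precision, then press Enter...")

def artifact_scan_passes(minutes: int = 10) -> bool:
    ans = input(f"Run ATITool's artifact scan for {minutes} min. Artifacts? [y/N] ")
    return ans.strip().lower() != "y"

lo, hi = 576, 720            # stock clock (passes) .. old ceiling (now fails)
while hi - lo > 5:           # stop once the window is about one 5 MHz step
    mid = (lo + hi) // 2
    set_core_clock(mid)
    lo, hi = (mid, hi) if artifact_scan_passes() else (lo, mid)
print(f"highest clock that passed: {lo} MHz")
```

The same loop works for retesting after each voltage bump in the EVGA Voltage Tuner; just keep the runs short and watch temps.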
  12. So are you saying it is OK to push on past 3.2GHz? I want to try to hit 3.4GHz or 3.5GHz if possible. What do you guys think the max temps under full load should be? I have seen people say 70C should be the max temp you want to stay at. I am sure I will have to set Vcore in the BIOS to somewhere around 1.5 volts! But my board does have a big V-droop; look at my first post and see the difference from BIOS to CPU-Z under load. That is a big V-droop, isn't it? I have seen people warn others of the dangers of V-droop, saying that when your rig comes out of full load, the spike in Vcore could cause damage. What are your thoughts on that?
  13. C2D E6400, EVGA 8800GTS SC, ASUS P5B Vanilla BIOS 1604, four 1GB sticks of G-Skill DDR2-800 at 5-5-5-12 (can run 760MHz at 4-4-4-10 @ 1.9V), Zalman CNPS 9500 LED, 17-inch Foxconn case. Those are the basic specs of my rig.

     My main problem is I keep hearing different ideas as to what a max temp should be. Some say 70C is fine for running Orthos; some say 60C is the max. So which is it? I really can't afford to damage any components, as having two little kids doesn't leave much in the bank for spare C2Ds, LOL.

     Another question I have is this: before I got this G-Skill, I had some Kingston DDR2-667 RAM in the same config, four 1GB sticks, and I couldn't budge my CPU past 2.8GHz. People told me then that my memory controller was being pushed too hard. Shouldn't my memory controller be stressed even more running the same amount of RAM at faster speeds? I don't quite understand this concept of a memory controller being stressed. And if I want to push this RAM even further past 800MHz speeds, wouldn't that overheat my northbridge heatsink? Any way of helping cool down this area without major upgrades? Would taking the side panel off my case help at all, or would that just disrupt the case airflow more? Thanx for any and all replies, people. I am still learning this overclocking trade.
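On this platform the "memory controller" is the P965 northbridge, so what stresses it is the combination of FSB and divider; the arithmetic is worth writing out. A sketch (the 1:1 and 4:5 dividers are examples of common options; actual BIOS choices vary by board):

```python
# FSB/DRAM arithmetic for a C2D E6400 (8x multiplier) on a P965 board, where
# the memory controller lives in the northbridge. DDR2's effective rate is
# twice the memory bus clock.
MULT = 8
for core_ghz in (2.13, 2.8, 3.2):
    fsb = core_ghz * 1000 / MULT                 # bus clock, MHz
    for a, b in ((1, 1), (4, 5)):                # FSB:DRAM divider
        ddr2 = fsb * b / a * 2                   # DDR2 effective rate
        print(f"{core_ghz} GHz -> FSB {fsb:.0f} MHz, {a}:{b} -> DDR2-{ddr2:.0f}")
```

Note what this says about the old Kingston kit: at 2.8 GHz the bus is already at 350 MHz, so even a 1:1 divider had the DDR2-667 sticks running at an out-of-spec DDR2-700, which would explain the wall far better than the controller itself. The DDR2-800 G-Skill simply gives that headroom back.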
  14. So, to get my chip to run at 3.2GHz, I have to bump the Vcore to 1.425V (1.328V in CPU-Z under load). So basically my Vcore still has some headroom if CPU-Z is reading 1.328V under load, right? And the temps are at 59-60C after one hour of Prime's max-heat CPU test. I have heard that you can't overheat these chips, or that it is very hard to, but I have seen in lots of articles that 60C under load is a bit high for prolonged use. I have also seen people use 1.5 or 1.55 Vcore to get their CPU where they wanted it.

     I would like to run my CPU higher than 3.2GHz and keep the RAM at 4-4-4-10 for gaming if possible. If I have to, I don't mind loosening the RAM timings to hit higher CPU speeds; I have found that higher CPU speed trumps tighter timings in gaming apps. I am satisfied with its performance in every area except Crysis; I just want to squeeze every bit of juice out of my rig for that game. I don't plan on going too much higher than 3.2GHz, I just need some advice on whether 60-65C is safe for prolonged use. I would love to hit 3.4 or 3.5GHz, but I am sure I will have to set the BIOS Vcore to around 1.5V to get 3.5 stable. What are the max CPU temps that are safe for a 2-3 hour gaming session?

     BTW, I use Core Temp to monitor my CPU temps. What are the best apps to use to monitor C2D temps? As you can see, my mobo has quite a V-droop: 1.425V in the BIOS is only 1.328V under full Orthos load, according to CPU-Z and Everest. And on a side note, I won a (lapped) Thermaltake Big Typhoon aftermarket cooler. Do you guys think I will get lower temps using the Thermaltake over my Zalman CNPS 9500? It should be here in a few days. Thanx for any and all help, people :)
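The droop in this post is easy to put a number on, and it also predicts roughly what BIOS setting a given under-load voltage costs. A quick sketch, assuming the droop stays proportional as the set voltage rises (approximately true for a resistive loadline, but worth verifying in CPU-Z):

```python
# Droop math for the numbers above: 1.425 V set in BIOS, 1.328 V under load.
v_set, v_load = 1.425, 1.328
droop_frac = (v_set - v_load) / v_set
print(f"droop: {(v_set - v_load) * 1000:.0f} mV ({droop_frac:.1%})")

# If droop scales proportionally (an assumption), holding ~1.40 V under an
# Orthos load would take roughly this BIOS setting:
target_load = 1.40
print(f"BIOS setting for {target_load} V loaded: {target_load / (1 - droop_frac):.3f} V")
```

That comes out near 1.50 V in the BIOS for 1.40 V loaded, which lines up with the "around 1.5 V" figure mentioned above for a 3.5 GHz attempt.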