LYNCHIN Posted March 4, 2010

I was looking under the plugins section and noticed the VT1103.dll sensor. When I have the VT1103.dll plugin enabled in RivaTuner's hardware monitoring, the temps are different from the ones EVGA Precision displays. Right now, while running an artifact scan, EVGA Precision shows 70 C, while RivaTuner reports:

VRM phase 1 temperature, C = 55
VRM phase 2 temperature, C = 56
VRM phase 3 temperature, C = 57

So which of these temps would be the best one to go by? I'm trying to find a new high OC after installing Win 7 Ultimate 64, and I really need an accurate temp of my GPU. I will also be adding voltage via EVGA Voltage Tuner, so it is critical that I have valid temps. Can someone please help me understand which temp to go by? Or is there a better way to monitor my GPU temps?

Here is the description of the plugin in RivaTuner; it clearly states that it is for the GTX 200 series:

"This plugin provides Voltage regulator output, Voltage regulator temperature and Voltage regulator current hardware monitoring data sources for display adapters with Volterra VT1103/VT1105/VT1165 voltage regulators. Tip: VT1103/VT1105/VT1165 voltage regulators are used on reference design ATI RADEON X1800, X1900, HD 3800, HD 4800 and NVIDIA GeForce GTX 200 series display adapters."
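For anyone comparing the two readings: the core sensor and the VRM phase sensors measure different components, so both matter when overvolting. Below is a minimal sketch of logging each reading against its own ceiling, using the numbers from the post above. The `check_temps` helper and both threshold values are illustrative assumptions, not vendor specifications; check your card's actual limits before relying on anything like this.

```python
# Hypothetical helper: compare each sensor reading against its own ceiling.
# CORE_LIMIT_C and VRM_LIMIT_C are assumed values for illustration only,
# not NVIDIA or Volterra specifications.

CORE_LIMIT_C = 90    # assumed ceiling for the GPU core sensor
VRM_LIMIT_C = 100    # assumed ceiling for the Volterra VRM phases

def check_temps(core_c, vrm_phases_c):
    """Return (sensor name, reading in C, within-limit flag) per sensor."""
    results = [("GPU core", core_c, core_c < CORE_LIMIT_C)]
    for i, t in enumerate(vrm_phases_c, start=1):
        results.append((f"VRM phase {i}", t, t < VRM_LIMIT_C))
    return results

# Readings from the post: core 70 C, VRM phases 55/56/57 C.
for sensor, temp, ok in check_temps(70, [55, 56, 57]):
    print(f"{sensor}: {temp} C {'OK' if ok else 'TOO HOT'}")
```

The point of tracking the sensors separately is that a safe core temp does not guarantee safe VRM temps (or vice versa), which is exactly why the two tools disagree here without either being wrong.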