TDP vs thermal output


bschmidt71


Can anyone explain how to work out the thermal output of a device if its TDP is known? It can't simply be equal to the TDP (can it??), because my 183 W TDP GPUs never get hotter (with watercooling) than my 130 W TDP CPU (see sig). Under full load (OCCT for the GPUs, Prime95 for the CPU) my GPUs run near 61°C and my CPU runs near 70°C, both on the same water loop with the water temperature locked at 35°C (the Boreas software can set a temperature at which the TECs kick in; I have mine set to 35°C so that the CPU never goes above 70°C).

Note: under less stressful conditions (e.g. video encoding), where the CPU is still running at 100% but not Prime95, the CPU only reaches 65°C (+30°C over the water rather than +35°C with Prime95). Both the CPU and the GPUs are overclocked to their respective maximums, but the GPUs cannot be overvolted whereas the CPU can (it's running at 4.1 GHz @ 1.4 V). I *think* my GPUs are both very close to 1 V, +/- 2%.

I know various things can affect how much heat a part produces (as I said before: all "loads" are not created equal), and that the effective TDP changes (?) with overvolting. I also know that the maximum thermal load for a PCIe card is 300 W, but an overclocked/overvolted Nvidia GTX 480 pushing a max TDP of 300 W with watercooling will still (probably) not get hotter than an i7 D0-stepping CPU with HT enabled, even running close to the same voltage. That's close to the same process tech as well (40 nm vs 45 nm), with 1.5B transistors vs 1.4B (?). All things considered, the GPU **should** be MUCH hotter than the CPU???? Why isn't it???
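One thing worth pinning down first: at steady state, essentially all the electrical power a chip draws leaves it as heat, so the heat output tracks the *actual* power draw, not the rated TDP. Dynamic CMOS power scales roughly as P ∝ C·V²·f, which is why overvolting matters so much. A minimal sketch of that scaling in Python; the stock voltage and clock used below are hypothetical placeholders, not figures from this post:

```python
# Rough estimate of heat output under overclock/overvolt, using the
# dynamic-power approximation P ~ C * V^2 * f.
# ASSUMPTION: v_stock and f_stock below are placeholder values for
# illustration, not the actual stock settings of this CPU.

def scaled_power(tdp_w, v_stock, f_stock, v_oc, f_oc):
    """Scale rated TDP by (V_oc/V_stock)^2 * (f_oc/f_stock)."""
    return tdp_w * (v_oc / v_stock) ** 2 * (f_oc / f_stock)

# 130 W TDP CPU pushed to 4.1 GHz @ 1.4 V (stock values assumed)
p_oc = scaled_power(130, v_stock=1.20, f_stock=2.66, v_oc=1.40, f_oc=4.1)
print(f"estimated heat output: {p_oc:.0f} W")  # far above the rated 130 W
```

This is only the dynamic-power term (leakage also rises with voltage and temperature), but it shows why an overvolted 130 W TDP CPU can easily out-dissipate a stock-volted GPU with a higher rated TDP.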

 

My questions are as follows:

 

1. What is my maximum thermal load/heat output?

2. How much heat can my system dissipate?

3. How does one calculate these values?
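For question 3, one way to attack it: at steady state the temperature rise above the coolant is roughly ΔT = P × R_th, where R_th is the effective die-to-water thermal resistance. Rearranging and plugging in the numbers quoted above gives each part's R_th; a minimal sketch, assuming each part is actually dissipating its rated TDP under these loads (it may not be):

```python
# Effective die-to-water thermal resistance: R_th = (T_die - T_water) / P
# Temperatures and TDPs are the ones quoted in the post; assumes the
# parts dissipate their full rated TDP under load, which is approximate.

def thermal_resistance(t_die_c, t_water_c, power_w):
    """Effective thermal resistance in degrees C per watt."""
    return (t_die_c - t_water_c) / power_w

r_cpu = thermal_resistance(70, 35, 130)   # Prime95 load
r_gpu = thermal_resistance(61, 35, 183)   # OCCT load, per GPU

print(f"CPU: {r_cpu:.2f} C/W, GPU: {r_gpu:.2f} C/W")
```

The GPU comes out with a noticeably lower effective R_th, which is consistent with its much larger die: the same watts spread over more area means lower heat-flux density and an easier job for the cooler. That, rather than TDP alone, is a big part of why the GPU runs cooler.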
