
MSI sent me the wrong card....



Maybe I didn't make it clear that I tested the cards one at a time, not in SLI. When in SLI, temps do jump by about 10C, as expected. The reason I started testing is that I was getting funny effects and artifacts in game. So the first thing I did was run FurMark and watch one card climb straight toward 100C, and it was the lower card in the SLI setup. I then pulled the cards and tested them one at a time.

 

Now, testing outside of SLI, the card I RMA'd was hitting 88C with fans on manual at 100% and really good case airflow.

 

Why FurMark?

MSI bundles Afterburner with Kombustor (a.k.a. FurMark) with the cards. It's the stability program they include with their own cards for testing. So when a card fails to meet their advertised specs using their own supplied stability program, I take a screenshot and send it along with the card.

 

The new Ti card at 70C isn't that bad. It just didn't meet their advertised specs is all. I'm sure I could replace the TIM and get it under 70 real easily. That wasn't the real issue here, though; it was more like strike three.

 

Also, my other 560 runs solid at 60C in FurMark: not in SLI, fans all turned up, 99% load, stock clocks and volts. So I had one card running at 60 and the other at 88. Does a 28C difference seem right to you guys? I don't think so.
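If anyone wants to repeat this kind of one-card-at-a-time comparison, here's a rough sketch of a temperature logger. It assumes an NVIDIA driver new enough to expose nvidia-smi's --query-gpu interface, and the 10-minute duration and 5-second interval are just placeholder values: start FurMark/Kombustor (or a game) separately and let this poll alongside it.

```python
# Rough sketch (assumes nvidia-smi with --query-gpu support): poll GPU
# temperatures while a stress test or game runs in another window, then
# report the peak temperature seen per GPU so two cards can be compared.
import subprocess
import time

def read_temps():
    """Return {gpu_index: temperature_C} as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    temps = {}
    for line in out.strip().splitlines():
        idx, temp = line.split(",")
        temps[int(idx)] = int(temp)
    return temps

def monitor(duration_s=600, interval_s=5):
    """Sample for duration_s seconds (placeholder values) and keep the peaks."""
    peaks = {}
    deadline = time.time() + duration_s
    while time.time() < deadline:
        for idx, temp in read_temps().items():
            peaks[idx] = max(peaks.get(idx, 0), temp)
        time.sleep(interval_s)
    return peaks

if __name__ == "__main__":
    for idx, peak in sorted(monitor().items()):
        print(f"GPU {idx}: peak {peak} C")
```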

That explains your situation better; BTW, that should have been your first post. I would just contact MSI again and see what they can do. Maybe you can get something for free.



 

I have a call in to them. Called this morning and got an answering machine. :erm:

 

Sell both MSI cards and buy another brand?

 

Or, if you're adventurous, pull the OEM cooling assembly off the card you just got back from RMA and use a better thermal grease?

I am seriously considering just selling them both and doing something different. I was really trying to hold out for the 7-series cards, though. If I can SLI these two together then I think I can manage, but if I need to sell and buy stuff then I'll just make the jump to something better now, I think.

 

I'll definitely slap on some good TIM.


 

Erm, you're complaining about ~70C temps with an SLI setup?

 

And why FurMark? Try running a game for a change. I never got why people run FurMark to see how hot a card gets; it's bullshit.

After testing, you never run FurMark again, and games never make a card as hot as FurMark or any other stress program does (not even benchmarks).

 

70C loaded is a REALLY good temp, especially in SLI.

Do your research next time to see what temps other people are getting with an SLI setup.

My guess is that they run the same temps as you do right now.

 

Some people can still amaze me :/

Well, it's good to test for stability if you're overclocking. Games are good for that, sure, but different games stress the card differently depending on settings. At least with a stress program you're minimizing the margin of error.

 

 

 

 

FurMark is no longer a really viable tool for testing GPU loads.


FurMark isn't just no longer viable for testing GPU load; it has never been useful. In fact, most so-called burn-in programs for video cards are flawed in that they push the card well beyond anything a gaming scenario is going to put it through. Now, I suppose a claim can be made for burn-in software and, say, folding or Bitcoin mining, but for gaming, the major use, they are meaningless. Run your favorite game for an hour and then compare it to the burn-in program for 10 minutes; the game will always be a lot lower in temperature. Stress testing puts the card under unrealistic loads.

Now, the argument, and a compelling one, is that if it passes the burn-in then it should be fine for gaming, and this is true. However, the cards were not designed for burn-in-style use over long periods. People make the mistake of starting these stress tests and then walking away from the system.

 

As for the temp: for a high-end card, ANY card staying under 70C under load is doing well; they are usually rated for close to 100C. In SLI or CrossFire, where airflow is more restricted, getting 70C is doing well.


 

 

 


 

 

 

Can't say I agree with that statement.


 

 

 


 

 

You don't have to agree, as it is fact. I have been testing cards for some time now, and the loads FurMark imposes will never be seen in any game. NVIDIA and AMD have both put hardware monitors on their cards and in their drivers to limit the impact of running these tests on the cards. Can they be defeated? Sure they can, but the reality is that there is not a game out there that puts that kind of load on the GPU. I use mine for F@H and impose a significant load that you will never see while gaming, and I still put less of a load on the VRM circuits and core than FurMark does. You can use it if you want, but as a means of testing a card it's a waste. It is an exercise in seeing how fast you can overheat the GPU and PCB. Cards now throttle clock speeds, much like CPUs do, when they overheat. It's no longer a realistic, viable test unless you are looking to cook the card.


 

 

 

 


 

 

 

 

Yes, it's true that FurMark is on a whole other level compared to what a gaming session will do to a video card, load-wise.

But the reason I don't agree with ditching FurMark (and this is just my opinion) is that I treat it as just one of many phases I like to put a card through.

 

And at the same time, I don't abuse it either. Have you read some of the stories about people messing up their cards and/or PSUs after running intensive stress tests? I bet you they ran it for 24 hours or more.

 

Where's the common sense, though? Knowing that FurMark is A LOT more intense than a gaming session, why on earth would you run it for the same amount of time (or more) as an "ideal" gaming session or multiple loops of 3DMark tests?

 

What I usually do is run it for about 10-15 minutes, just to gauge the temps and to make sure the OC settings I have decided upon are standing strong, stability-wise. After that I'll move along to more fine-tuning and in-depth testing.

 

Like I said, I'm minimizing the margin of error. Clock by clock, minute by minute, one test after another.

 

A little too much? Maybe, but my sh** is stable...24/7.


 

 

 

 

 


 

FurMark proves nothing about the stability of the overclock. I can get the same result using a variety of game tests (Metro, Batman: AC, Dirt 3, Crysis 3, Heaven, and 3DMark) that will fail with a bad overclock in usually about the same time frame as FurMark, or less. In fact, I stopped using FurMark years ago because what was deemed "FurMark stable" was not "game stable." Most cards now use a boost clock and will throttle down, negating the impact of the overclock when using FurMark as a tool to assess it. As far as burning up PSUs and cards, I fortunately have not done either in my testing. The only PSU I have had fail was the one used in my day-to-day system that ran 24/7 for 6 years under a full F@H load. The system is fine; the PSU just decided it had had enough one day, a well-earned trip to the scrap yard.

 

Is the software they're bundling actually rebranded FurMark?

 

If so, then I agree with the OP; they should stand by their temp claims.

Andrew, Kombustor is based on FurMark. MSI's claims are based on typical gaming loads against a reference card, not on running FurMark derivatives. Fermi cards are hot, period, and there is no way around it. With the Fermi architecture, NVIDIA started putting hardware-based controls on board to limit the impact of tests such as OCCT and FurMark. When they sense that load, they are supposed to clamp down on the power and clocks to prevent a thermal overrun.
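A rough way to watch that clamping for yourself, assuming a driver with nvidia-smi --query-gpu support: log temperature, SM clock, and utilization together while the stress program runs, and watch for the clock dropping while the load stays pegged. The sample count, interval, and 10% drop threshold below are placeholder assumptions, not anything from this thread.

```python
# Rough sketch (assumes nvidia-smi with --query-gpu support): log temperature,
# SM clock, and utilization together while a stress test runs. If the clock
# drops noticeably while the card is still pegged, the power/thermal limiter
# described above has likely kicked in. Thresholds are placeholder guesses.
import subprocess
import time

def sample(gpu_id=0):
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_id}",
         "--query-gpu=temperature.gpu,clocks.sm,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    temp, clock, util = (int(x) for x in out.strip().split(","))
    return temp, clock, util

if __name__ == "__main__":
    baseline_clock = None
    for _ in range(120):              # ~10 minutes at 5-second intervals
        temp, clock, util = sample()
        if util > 90 and baseline_clock is None:
            baseline_clock = clock    # first clock reading seen under full load
        throttled = baseline_clock is not None and clock < 0.9 * baseline_clock
        print(f"{temp} C  {clock} MHz  {util}% load"
              + ("  <-- possible throttling" if throttled else ""))
        time.sleep(5)
```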

