
NVIDIA GeForce GTX Titan X Reviewed


Bosco


 

Nice review, loving that we have the OC'ed card charts again - it makes it easy for me to compare my OC'ed 980 to the Titan X.
 
Here's a rough summary for 1920x1080:

 

Metro LL
980 OC 1.3% slower than stock X
OC X 21.5% faster than OC 980
 
Bioshock Infinite
980 OC 0.2% slower than stock X
OC X 16.4% faster than OC 980
 
Crysis 3
980 OC 11% slower than stock X
OC X 38% faster than OC 980
 
Far Cry 4
980 OC 11.3% slower than stock X
OC X 32% faster than OC 980
 
BF4
980 OC 11.3% slower than stock X
OC X 25.4% faster than OC 980
 
ACU 
980 OC 9% slower than stock X
OC X 29.5% faster than OC 980
 
So an OC'ed GTX980 is on average 7.35% slower than a stock X at 1080p. I know we're comparing apples to oranges with stock vs OC, but it's interesting that it means for $559, with a bit of tweaking in Afterburner, you can get close to stock Titan X performance that would otherwise cost $999.
And an OC'ed X is on average 27% faster than an OC'ed GTX980 at 1080p, while the price is roughly 78.7% higher.
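
If you want to double-check those figures, here's a minimal Python sketch of the averages (all numbers copied straight from the per-game list above):

```python
# Per-game deltas from the summary above, at 1920x1080.
oc980_vs_stock_x = [1.3, 0.2, 11.0, 11.3, 11.3, 9.0]     # % a 980 OC trails a stock Titan X
ocx_vs_oc980     = [21.5, 16.4, 38.0, 32.0, 25.4, 29.5]  # % a Titan X OC leads a 980 OC

print(sum(oc980_vs_stock_x) / len(oc980_vs_stock_x))  # ~7.35%
print(sum(ocx_vs_oc980) / len(ocx_vs_oc980))          # ~27.1%
print((999 / 559 - 1) * 100)                          # ~78.7% price premium
```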
 
The Titan series used to be for those looking for improved double-precision (FP64) performance, but now it seems like a way to take a gaming card, throw on extra VRAM and price it out of the reach of most gamers.
 
Hoping to see a 980 Ti with similar specs and 6GB of VRAM arrive around the time the HBM-equipped AMD Fiji card comes out, so we can see a battle between the brands for performance and value for money in the segment between the $559 GTX980 and the $999 Titan X.

 

 

 

That is kind of why we overclock in the first place. The idea is to make slightly cheaper hardware as fast as, or faster than, the more expensive hardware. Or, if you have deep pockets, OC the crap out of the very best and have fun doing it.

 

Now I do see your point there. I wonder if I may upgrade by accident here soon if the 980 card prices go down.


I did some digging into FP64 prior to the Titan X, back when I had two Titans, and more digging now. I found it isn't important to 99.9% of consumers. This is because, other than a few professional apps, everything still uses FP32. So unless you are NASA, Big Oil or crunching heavy math, you shouldn't even factor FP64 into the reason for buying any video card.

 

Adobe only uses FP32

Autodesk Iray uses FP32

V-Ray uses FP32

MATLAB uses FP64

Folding@Home uses both, but FP32 gives a higher return because FP64 downclocks the Titan.

 

The new Quadro M6000 is identical (as far as I can tell) to the Titan X, with the same 12GB of VRAM and the same 1/32 FP64 rate. That means the Quadro drivers are really the only major difference. I predict the cut-down GM200, which is most likely being called the GTX 980 Ti, will arrive with 6GB of RAM later this year. Next year or after, when the new series comes out, Nvidia will release the full GM200 as a GTX 980 SE and stop production of the Titan X. So unless you are dying to play in 4K on a single card, it's best to wait and get the same performance at a reduced price.

 

So in short, do not judge the card on its FP64 performance, since it makes no difference in anything you (the consumer) will use.


Just for kicks, I will be running an F@H comparison over 24 hours between the R9 290X and the Titan X to see what the difference in folding production is when running at stock clocks. I may be able to pull average power consumption as well, but will check the Kill A Watt first.

 

When I made this comparison with the GTX 980, it showed that the GTX 980 was a folding beast when running the Core 17 work units but choked on the Core 18 units.

 

 

Starting now!  


Double precision matters to some. That said, I'm just surprised they changed what the lines represented.

 

Also - does the X have ECC? The M6000 certainly does.



I haven't read anything that says the X has ECC RAM.

 

Also, the few to whom DP matters most likely have a Tesla or whatever AMD's equivalent is.


You would have to be quite foolish to believe that BFG was making a comeback. So foolish you got a little excited and looked around for BFG's product page.

I would never be that foolish. :/



 

So you followed the link, didn't you!


The first round of F@H shows the Core i7 4770K @ 4.5GHz and the R9 290X delivering 247,744 points in 24 hours. The projection for the R9 290X was in the 219K PPD range and it came pretty close, while the CPU was projected to deliver around 30K a day.
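
As a quick Python sanity check (numbers taken straight from that run), the combined projection works out to roughly what the rig actually delivered:

```python
# 24-hour F@H run: combined projection vs. actual points.
projected_gpu_ppd = 219000   # R9 290X projection
projected_cpu_ppd = 30000    # i7 4770K @ 4.5GHz projection
actual_points = 247744

projected_total = projected_gpu_ppd + projected_cpu_ppd          # 249,000
shortfall = (projected_total - actual_points) / projected_total  # ~0.5% under projection
print(projected_total, round(shortfall * 100, 1))
```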

 

GTX Titan X shows a prediction of 420K a day...........

 

Now let's see if the Titan X can deliver on that projection.   


Thanks for the F@H comparison tests Frank,..   :thumbsup:

 

I plan on upgrading my folding rig hardware soon, and a 300K PPD range with a single GPU would be sweet,..

 

To the best of my knowledge the core 18 projects have been temporarily restricted to pre-Maxwell GPUs only,.. F@H servers recognize the installed GPU before assigning a WU,.. and once a Maxwell GPU is detected only core 15 and core 17 will be assigned till the bug is identified and a driver comes out that fixes it.

 

So for now, if you're running Maxwell and flying the advanced flag, you will receive mostly core 17 projects,.. as the core 15 projects do not have the QRB bonus.
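
Just to illustrate how I understand that assignment rule, here is a tiny Python sketch of the behaviour described above,.. it is only an illustration, not actual F@H server code, and the function name is made up:

```python
# Illustrative only: the work-unit assignment rule as described above.
# Maxwell cards are temporarily held to core 15/17 until the core 18 bug is fixed.
def assignable_cores(gpu_is_maxwell):
    if gpu_is_maxwell:
        return ["core 15", "core 17"]            # core 18 withheld from Maxwell GPUs
    return ["core 15", "core 17", "core 18"]     # pre-Maxwell cards can still get core 18

print(assignable_cores(gpu_is_maxwell=True))     # ['core 15', 'core 17']
```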

Edited by Braegnok

