bp9801 Posted September 6, 2014

Tonight Frank takes a look at the new PowerColor R9 285 Turbo Duo video card. See what he thinks of it and how it compares to the other cards in its price range in the review below: http://www.overclockersclub.com/reviews/powercolor_r9_285_turbo_duo/
IVIYTH0S Posted September 6, 2014

Ok, they really need a new naming structure; all I can think of is the GeForce 285 lol. Thanks for the review, Sir Francis!
Disparaitre Posted September 6, 2014 (edited)

...

Edited September 20, 2014 by WARDOZERX
ccokeman Posted September 6, 2014

What's with the BF4 results? No 285 to be found. Not sure what went on there!
l1il Posted September 6, 2014

War, I reread a couple of times to make sure I hadn't overlooked anything, but seeing your post comforts me. AMD's naming scheme is a mess these days. R7... R9... 2x0, 2x5. Is it only a feature set that changes the last digit from 0 to 5? Faster chips with more shaders usually go from 280 to 280X. People complained that NVIDIA's naming was silly, but AMD has brought it to a whole new level. No mention of TrueAudio; does this chip support it? Also, AMD talking up FreeSync is idiotic at this point, since there is no hardware available yet.
Disparaitre Posted September 6, 2014 (edited)

...

Edited September 20, 2014 by WARDOZERX
l1il Posted September 6, 2014 Posted September 6, 2014 (edited) My main problem is that the 2xx series of AMD is a cluster**** of generations under the same naming. For ball's sake, The first number used to be the GCN generation related. Now, you have Hawaii (and it's handicapped brother, Tonga). You have GCN 2.0 and 3.0 under the SAME NAME, where half of them are bebadges, Some have lower clock speeds (the 7970 GHz was at 1050 MHz, while stock 280x are at 1000), the 7950 had a boosted version, where the 280 I truly have no clue about it's speed (mostly because I do not care much). just look at the amount XFX has of NCIX. Furthermore, the 285 is priced close to the 280X, the latter performing better (I'm sad it was not included in this set of results) on benchmarks, and has 3Gb of RAM instead of 2. So unless you NEED TrueAudio or Mantle (is there new instructions in GCN 3 offloading the CPU further?), I don't get the point. Then you have Gigabyte, that comes and name their GPU's GHz edition instead of OC edition, because it looks cooler. Cooler than a glacier under a polar bear's sack. /End of rant. I've been following nerdly computer stuff for 6 years that I know practically every model and it's equivalent counterpart (AMD vs the world lol!), but this is getting out of hand. -Edit- Furthermolore doesn't exist (typo) Edited September 6, 2014 by StefenHeif Share this post Link to post Share on other sites More sharing options...
Waco Posted September 6, 2014

To be fair, NVIDIA is worse at rebranding. They still sell Fermi GPUs under the 700 series and 800M series...
Disparaitre Posted September 6, 2014 (edited)

...

Edited September 20, 2014 by WARDOZERX
l1il Posted September 6, 2014 Posted September 6, 2014 Yea, and the 800m series is there only to make people think that they have a better version than the 700m series, while being very similar. But you don't wonder if the 770 is stronger than the 760, you know by nomenclature and it applies. But it doesn't make sense that a 280x is superior to a 285. The x is just to troll, what is that for? Unlocked multiplier LOL!? Larger cache? Superior thermals? Speaking of ixes, how about the 295x2. Nice one. the 6990 was a dual 6970, so the 295 would be expected to be the 290x x2 (love the x) and then you have the x2 after that, so quad 290x (x everywhere) 290x2 looks too mainstream. They could go 580x, that would be shorter and implied 290x. Maybe AMD sucked at algebra (don't know if it necessary to program drivers, would actually explain a lot of things). /End of delirium The 260 is the 7790, then you got the 265 and 270, which are both supposed to replace the 7850. Trolling people at it's finest. Then add a GHz or Boost edition in there while were at it. Remember when there was a 5830, 5850 and 5870? Then you go with less power? 5770 and 5750 and so on. Maybe it's a Geek Squad request, someone in there has a fetish on numbers where you got to know each specific version because the numbers by themselves mean jack... Share this post Link to post Share on other sites More sharing options...
ClayMeow Posted September 6, 2014

I'm glad I'm not the only one that thinks AMD's naming scheme is idiotic. I was talking to Troy about it before this review went live.... I'm convinced that AMD and Nintendo share a naming department. It's the only reason I can think of as to why both companies are absolutely horrible at it. There is absolutely no reason why something named 285 should be worse than something named 280 within the same brand. NO REASON. No wonder AMD is playing second fiddle in both its major markets.
Waco Posted September 6, 2014

"I'm glad I'm not the only one that thinks AMD's naming scheme is idiotic."

Both companies are intentionally confusing, it seems. The 285 has interesting frame buffer compression that makes better use of memory bandwidth. I imagine we'll see it in APUs shortly, as well as in new "big" GPUs.