Speedway Posted December 16, 2010 (edited)

Crysis - you see small differences in fps here going from 1.0 to 2.0? You want lower res? Here ya go.
mattyamdfanboi Posted December 16, 2010

The board on my Crossfire setup was an 890GPA-UD3. It ran x8, and if it were true that I'd see higher performance, I'm pretty sure I wouldn't have done as well as I have. When you reach the limit of a slot, overclocking has no effect, and I was never anywhere near the limit, as every MHz I OC'd saw a good result. Hence why I had the fastest 5850 on our site, hahah, by a long shot too.
Speedway Posted December 16, 2010 Posted December 16, 2010 the board on my xfires was a 890gpa-ud3 it ran x8 and if it was true that id see higher performance.. im pretty sure i wouldnt have done as well as i have.. and when u reach the limit of a slot.. overclocking has no effect.. and i was never anywhere near the limit yet.. as every mhz i oc'd saw a good result hence why i had the fastest 5850 on our site hahah by a long shot too Didn't know Gigabyte made the 890gpa-ud3 with PCIe 1.0 video card slots Share this post Link to post Share on other sites More sharing options...
mattyamdfanboi Posted December 16, 2010 Posted December 16, 2010 (edited) i said it ran x8 slots... :S i said the pcie 1 was some 630 nforce thing.. can u read at all? and just to throw this in.. a pcie 2 slot running at x8 is the same as a pcie 1 slot running at x16 Edited December 16, 2010 by mattyamdfanboi Share this post Link to post Share on other sites More sharing options...
Speedway Posted December 16, 2010 Posted December 16, 2010 i said it ran x8 slots... :S i said the pcie 1 was some 630 nforce thing.. can u read at all? and just to throw this in.. a pcie 2 slot running at x8 is the same as a pcie 1 slot running at x16 You say you used your GTX 480 in a PCIe 1.0 board, not long ago, but you can't rem what board it was, or you have no proof of getting the scores your talking about. You also keep talking about your 5850's and the "highest scores on OCC" in a NEW freaking mobo! Old school gpu's would see no difference in a 1.0 or 2.0 slot, as they were not maxing the bandwidth of the 1.0 slots. The gpu's out today, especially a higher end card like a GTX 480, this would not be true! This would especially not be true of multi-card setups. matt, your gigabyte board was a 2.0 slot @ x8, I guess it was hard to read my sarcasm about you talking about a very new mobo and a x8 slot like it was some old 1.0 slot mobo! We have hijacked this thread long enough! But, today's gpu's will see noticible gains in a 2.0 slot vs an old school 1.0 mobo! They use alot more bandwidth! Share this post Link to post Share on other sites More sharing options...
mattyamdfanboi Posted December 16, 2010 Posted December 16, 2010 the proof is in the benchmark score and ill go look at the mobo now.. brb GA-M61PM-S2 thats where i had to run my 480gtx as a trail to see if my mobo was the problem or the vid card.. the original mobo was a GA-890GPA-UD3 thats where my xfire lived.. again.. in the benchmark section.. if u find a 5850 thats faster.. be my guest.. till then.. stfu also.. old school gpus or not, nothing has really surpassed the x8 abilities yet.. i reckon even my 580's scores wouldnt change too much.. like i said and u seem to not be able to read.. its been proven already that x8 and x16 differences are minimal.. heres ur "multi gpu setup" proof that ur wrong http://www.hardocp.com/article/2010/08/23/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x8x8/ like i said.. ive tried this myself.. and gotten the same results... x8 vs x16 = . all right now.. even with our current crop of gpus Share this post Link to post Share on other sites More sharing options...
medbor Posted December 16, 2010

@OP: go for the 6950, since it overclocks to a 6970 and will not be as loud.

@PCIe discussion: PCIe 1.0 x16 = PCIe 2.0 x8 (same bandwidth by specification). Dual-card setups on PCIe 2.0 x16 vs. PCIe 2.0 x8 show a marginal difference, ergo PCIe 2.0 x16 should show a marginal difference to PCIe 1.0 x16 for dual-card setups, ergo PCIe 3.0 is not needed at the moment (since we have about a 90% performance buffer). Talking about PCIe 1.0 x8 vs. anything better is another story, and big gains are to be expected.
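For reference, the equivalence above follows straight from the spec numbers: PCIe 1.0 runs 2.5 GT/s per lane and PCIe 2.0 runs 5 GT/s, both with 8b/10b encoding (PCIe 3.0 moves to 8 GT/s with 128b/130b). A quick sketch to check the arithmetic (the function name and layout here are just for illustration, not from any particular tool):

```python
# Per-lane signaling rate in gigatransfers/sec and encoding efficiency,
# taken from the PCIe 1.0/2.0/3.0 specs.
GT_PER_LANE = {"1.0": 2.5, "2.0": 5.0, "3.0": 8.0}
ENCODING = {"1.0": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}

def bandwidth_mb_s(gen: str, lanes: int) -> float:
    """Usable one-direction bandwidth in MB/s for a given generation and lane count."""
    bits_per_sec = GT_PER_LANE[gen] * 1e9 * ENCODING[gen]  # usable bits/s per lane
    return bits_per_sec / 8 / 1e6 * lanes                  # bits -> bytes -> MB, all lanes

print(bandwidth_mb_s("1.0", 16))  # 4000.0 MB/s
print(bandwidth_mb_s("2.0", 8))   # 4000.0 MB/s -- same as PCIe 1.0 x16
print(bandwidth_mb_s("2.0", 16))  # 8000.0 MB/s
```

The same arithmetic also backs the point about x4: a PCIe 2.0 slot dropped to x4 lands at 2000 MB/s, half of what PCIe 1.0 x16 offered.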
mattyamdfanboi Posted December 16, 2010 Posted December 16, 2010 Definitely as that's x4 in pcie 2 and that really slows things down hahah Share this post Link to post Share on other sites More sharing options...
mattwalter85 Posted December 16, 2010 Posted December 16, 2010 That "1960's" board you are talking about is PCI 2.8 BECAUSE its 16x and with it being AM2 socket its not that old maybe a couple years. Where PCI 3.0 is really going to shine will be for the dual-gpu cards........ I don't think we'll see a single gpu card needing that kind of bandwidth for a while..... BUT if u decide to throw 2 6990's or 595's (if nvidia actually produces that card) on a pci 2.0 mobo and run some bench's then put them on a pci 3.0 mobo after they're released i guarantee you will see a difference in performance. A perfect example is comparing a 5970(hemlock) 2gb vs a 4gb..... the only thing difference is the amount of memory on the card BUT there is a big diff. in performance because of that reason. In the world of computing MORE is always better. This post has been completely hi-jacked and is way off topic. Share this post Link to post Share on other sites More sharing options...
mattyamdfanboi Posted December 16, 2010 Posted December 16, 2010 Maybe with a dual gpu card we might, but given the power limits of the slot I doubt we could supply enough power to need more bandwidth hahaha It would be awesome to see though :-) Share this post Link to post Share on other sites More sharing options...
mattwalter85 Posted December 16, 2010 Posted December 16, 2010 Thats why all the cards you see today or at least most of them have power connectors on them. I haven't seen a card that draws ANY power through the slot in a long time. Its simply not needed anymore..... Share this post Link to post Share on other sites More sharing options...
mattyamdfanboi Posted December 16, 2010 Posted December 16, 2010 well considering that most cards are over 200watt.. they have to draw power from the slot.. connectors only give u 150 with 2 of em every card draws power from the slot, thats their main supply, the connectors are there because its simply not enough and they need more POWAHHHHH Share this post Link to post Share on other sites More sharing options...