Got 8800 GTX


rainwood


Wow, those things are huge. I haven't had a card in my computer that long since the first hardware DVD decoders years ago.

 

Damn.

 

P.S. Looking at the pics was highly annoying... I have the same motherboard, the same RAM, and the same CPU cooler.

 

I almost opened my case to see if someone snuck a couple of 10.5 inch presents in there in the last 15 minutes since I put the side panel back on.

 

Happy gaming with those, Rainwood.


Wow! I did not expect so many posts in this thread! Thanks a bunch!!!

 

Yeah, be sure to let us know if there is a difference in performance. I've read that in a number of places (including the link I posted), so it's likely true.

I took one of the SLI bridge connectors out and ran 3DMark at the default setting (1280x1024).

[screenshot: 3DMark result at 1280x1024 with a single SLI bridge]

…no improvement in performance… but thanks to Lord AnthraX, I learned that the 3DMark score on my system was OK when using the FX-62 CPU.

Glad to see that an OCZ GameXtreme can handle SLI 8800 GTXs...

 

http://www.theinquirer.net/default.aspx?article=35604

 

That's pretty impressive (and somewhat disturbing) that it's so CPU-dependent; I'd think an FX-62 wouldn't lag it that much at all.

 

Edit:

 

Wait, are those scores for SLI? Shamino has found a pair of stock GTXs to be CPU-dependent even on a 4.5GHz Conroe, so I guess it makes sense that a stock FX-62 won't hold a candle to that. Even still, you should be getting closer to 15k, I suspect...

Reading 8800 GTX reviews on the web, I noticed that most of them used an Intel Core 2 CPU for testing. Apparently, Intel’s high-performance Core 2 chips (like yours and the X6800) and the quad-core QX6700 with nForce 680i SLI can pull the power of the 8800 better than the FX-62 can… DFI, please release the 680i SLI board soon!!!

Got an 8800 GTS? Sorry to hear about the 64-bit OS support for the card. I’ve never used the WinXP 64-bit version. How’s the gaming experience on the 64-bit OS?

 

For me it's not about 3DMark scores or max fps. I haven't even run any 3DMark benches. Whatever the game is, I set in-game settings to max, enable AA and AF at 16x, and play. No other card comes close to this type of image quality and smoothness.

Totally agreed (though I cannot help longing for the highest-performance PC components just for the sake of a better gaming experience).

 

Very impressive. I've read that you need true x16 PCI-E slots for SLI. Is this true? Have you had any problems?

Regards, Andrew.

If I am not mistaken, it’s hard to see a difference in performance between SLI cards running at x8 and x16 speeds…
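A rough back-of-the-envelope for why, assuming PCI-E 1.x's roughly 250 MB/s of usable bandwidth per lane per direction (that per-lane figure is my assumption, not something from this thread):

```python
# Hypothetical sanity check: PCI-E 1.x link bandwidth at x8 vs x16.
PCIE1_MB_PER_LANE = 250  # approx. usable MB/s per lane, per direction

for lanes in (8, 16):
    gb_s = lanes * PCIE1_MB_PER_LANE / 1000
    print(f"x{lanes}: ~{gb_s:.0f} GB/s per direction")  # x8 -> ~2 GB/s, x16 -> ~4 GB/s
```

Once textures are resident in video memory, a game rarely saturates even the x8 link, which is one plausible reason the difference is so hard to measure.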

You might not even need an SLI mainboard to do SLI!? Check this out: Can You Run Nvidia Dual Graphics Without SLI?

 

Expect to see some kind of 8850/8900 variant. They will get GDDR4 and a die shrink for sure. Gonna keep my rig going as long as I can till K8L drops and I weigh the options. Got my spacer kit and will be slapping a couple of VF900s on there.

Why did you not let me know the news before I got the 8800!? I hope that the weight of the next 88-series cards will not be heavier than that of the 8800 GTX…

Can’t wait…

 

I almost opened my case to see if someone snuck a couple of 10.5 inch presents in there in the last 15 minutes since I put the side panel back on.

Though the case is deep enough to swallow the 8800 GTX, there are drawbacks, too, such as needing more airflow to cool the components down…

 

Well, I’ve been busy toying with the case fans in my system (I piggybacked on the thread: Aircooling - Need a 120mm Fan. Your input is very much welcome!), and have not found time to play hard yet… As technodanvan already did, I’d like to try overclocking the 8800 GTX this weekend!

 

P.S.

I'm gunna come down to Rochester and rob you! Muahahahahah! Nice scores!

Let me know as soon as you hit the road down to Rochester. I will head right up to Duluth to get your gorgeous rig!


Man, I finally got a semi-credible article on the R600 PCB, and I'm sure f-in glad I wasn't ready to leap onto the 8800GTX yet (due to $$ and need). I'm not knocking G80... it's amazingly badass, and for at LEAST two months ATI won't have anything to compete with it, but... Supposedly R600 will be amazing... more than amazing... stupidly amazing. I'm sure that, by Jan 30th, the 8800 will be well into its refresh, but I don't know if it'll be refreshed enough to make up for what ATI's supposedly doing.

 

http://theinquirer.net/default.aspx?article=35708 :eek2:

 

This will pretty much leave the Geforce 8800 series in the dust, at least as far as marketing is concerned. Of course, 86GB/s sounds pretty much like nothing when compared to 140GB/s - expect to see that writ large on the retail boxes, at least.

 

The memory chips are arranged in a manner similar to the G80's, but each memory chip has its own 32-bit-wide physical connection to the chip's RingBus memory interface. Memory bandwidth will therefore range anywhere between 115GB/s (GDDR3 at 8800GTX-style 900MHz in DDR mode - 1.8GHz) and 140.1GB/s (GDDR4 at 1.1GHz DDR, or 2.2GHz in marketingspeak).
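Those figures are easy to sanity-check: peak bandwidth is just bus width (in bytes) times the effective data rate. A minimal sketch using the article's rumoured clocks (the 512-bit R600 bus is the article's claim, not a confirmed spec):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in GHz.
def bandwidth_gb_s(bus_bits: int, data_rate_ghz: float) -> float:
    return bus_bits / 8 * data_rate_ghz

print(bandwidth_gb_s(384, 1.8))  # 8800 GTX: 384-bit GDDR3 @ 900MHz DDR -> 86.4 GB/s
print(bandwidth_gb_s(512, 1.8))  # rumoured R600: GDDR3 @ 900MHz DDR   -> 115.2 GB/s
print(bandwidth_gb_s(512, 2.2))  # rumoured R600: GDDR4 @ 1.1GHz DDR   -> 140.8 GB/s
```

The last case works out to 140.8 GB/s rather than the article's 140.1, so the article is presumably rounding from a slightly different clock.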

 

For starters, the rumour about this 80nm chip eating around 300W is far from the truth. The thermal budget is around 200-220 Watts, and the board should not consume more power than a Geforce 8800GTX. Our own Fudo was right on one detail - the R600 cooler is designed to dissipate 250 Watts. This was necessary to have a cooling headroom of at least 15 per cent. You can expect the R680 to use the same cooler as well and still be able to work at over 1GHz. This PCB is also the base for R700, but from what we are hearing, R700 will be a monster of a different kind.
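Checking that headroom claim against the article's own numbers (a rough sketch; reading "headroom" as spare cooler capacity relative to board draw is my assumption):

```python
# Cooling headroom = (cooler capacity - board draw) / board draw.
COOLER_WATTS = 250

for board_watts in (200, 220):
    headroom_pct = (COOLER_WATTS - board_watts) / board_watts * 100
    print(f"{board_watts}W board: ~{headroom_pct:.0f}% headroom")  # ~25% and ~14%
```

So "at least 15 per cent" holds at the low end of the 200-220W budget but falls slightly short at the top end.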

 

As far as the Crossfire edition of the board goes, we can only say: goodbye and good riddance.

 

Just like the RV570, the X1900GT board, the R600 features a new dual-bridge connector for Crossfire capability. This also ends the nightmares of reviewers and partners, because reviewing Crossfire used to be such a pain, caused by the rarity of the Crossfire edition cards.

 

[image from the article: the R600 board]


The monster is just around the corner… Will it end the days of NVIDIA SLI?

Man, to tell you the truth, I was planning to dump the FX-62 and NF590 board and get the Intel QX6700 & nForce 680i board for the 8800 GTX.

Thanks for your info; I’ve made up my mind – I will wait. In less than three months (January 31, ’07, right?), the monster will show up. Probably AMD will release a new CPU (and chipset) to push the card’s power to the limit (DFI, pretty please, prepare for it). I cannot help but bring them home!!! Until then, I will play as hard as I can to improve my gaming skills.

Now, excuse me, I’ve got some enemies to kill (is anyone playing F.E.A.R. Extraction Point now?).


Doesn't look like it'll be stable. Runs fine for a little bit then locks up. 650/1900 seems fine though, for the time being.

 

I'll leave it at that until new drivers come out, or until I get a board that's mine and I'm serious about playing with it.

 

And unfortunately you're probably right Travis.

 

That's it, Techno! I ordered my EVGA 8800 GTS today! We'll see how that thing is going to work! :drool:


Guest DaddyD302

Hey rainwood, nice pictures of your mobo with the two GTXs. I was wondering if the 2nd card is blocking the 2nd-to-last PCI slot. I currently have the X-Fi in the last spot and my Ethernet card in the 2nd-to-last one; will the 2nd GTX block the Ethernet card?


Two things to note from posts that I'm too lazy to find again and quote:

 

1. The GTS is considerably shorter than the GTX because it doesn't require the same amount of power. It also only needs one PCI-E power connector.

 

2. The two bridge connectors are so three cards can be used in the future - kinda daisy-chaining them, or so the rumor goes. Same thing ATI is doing. The advantage of the unified shader architecture on these cards is that they can do general-purpose computation FAST with branching. So let's take even more work off the processor: load physics and AI onto the third graphics card (a rough sketch of that kind of work is below).
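To illustrate the kind of data-parallel, branchy per-particle work meant here, a minimal sketch of one physics integration step (shown on the CPU with NumPy purely for illustration; on a GPU the same update would run as one thread per particle):

```python
import numpy as np

def physics_step(pos, vel, dt=0.01):
    """One Euler integration step for a toy particle system."""
    vel[:, 1] -= 9.81 * dt       # gravity on every particle, in parallel
    pos += vel * dt              # move
    below = pos[:, 1] < 0.0      # the branchy part: which particles hit the floor?
    vel[below, 1] *= -0.5        # damped bounce
    pos[below, 1] = 0.0
    return pos, vel

pos = np.random.rand(10_000, 3)  # 10k particles in a unit cube
vel = np.zeros((10_000, 3))
pos, vel = physics_step(pos, vel)
```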

