About custom_coms

  1. I love my Acer AL2216BW. I've also done about 5 hours of gaming on a buddy's Samsung 226BW, and I prefer the screen on my Acer. The Samsung has a glossy display, which I'm not a big fan of, and it costs more than the Acer. The only real problem with the Acer is that the stand is a bit wobbly. The Samsung has a better stand, fewer backlight-bleed issues (which basically disappeared after one week on my Acer), and a brighter panel (which is IMO not a factor, because I still use about 50% brightness on my Acer and 40% on the Samsung). Yes, the Samsung warrants the extra $80; a better stand is practically worth that alone. At the same time, $80 is $80, and both panels are IMO equivalent when it comes to gaming, watching movies, and general web browsing. I chose to pocket the $80.
  2. I believe angrygames has used one of these on a rig with an 8800GTS before with no problems: http://www.newegg.com/Product/Product.aspx...N82E16817189005 I currently have one in the family PC (AMD64 3000+, EPoX S939 SLI board, 6600GT, one hard drive, one optical drive); the fan on the thing hardly ever turns on. One of my buddies also has one in a rig I spec'd (7600GT, Q6600; it's for coding/compiling and some light gaming). I have as much faith in that PSU as in my Corsair HX520, which I paid twice as much for.
  3. Honestly, GPU overclocking with RivaTuner and ATITool is as easy as it gets (GPU BIOS overclocking is another story). However, you are seriously bandwidth-limited down at 2.4GHz on your 4800+ as it is. I'm CPU-limited at 2.7GHz on my Opteron 165 (and even more so at the 2.6 it's currently at). I'm seriously considering getting better cooling and pushing for 3GHz+ to get the most out of this CPU before it's time to call it quits and get a new 45nm quad core from Intel that hits 4GHz on air (FSB headroom permitting, since the quads are multiplier-locked).
  4. This card WILL block the number 1 and 2 nVidia SATA II ports on your SLI-DR, meaning you will have to resort to the slower Silicon Image controller to regain those ports. I'm unsure at the moment whether right-angle SATA connectors alleviate this problem. As far as the case goes, it fits in my el cheapo deluxe mid-tower Raidmax (I would have to spend double the money to get better case cooling than the 4 intake and 2 exhaust fans my Raidmax gives me), but barely: 1" of clearance behind the hard drive cage.
  5. Yeap, and score a 9500 in 3DMark06!!! On Windows Vista, at a "mild" overclock of 2.6GHz...
  6. ^^^What he said. Your GPU isn't the limit when you are on a platform that is approaching four years in age, unfortunately. Time to pick up one of those shiny new Intel processors and join the dark side. A Q6600 + board + RAM should put about a $500 dent in the wallet, and it is a serious improvement over your platform (arguably ~6 times the speed in applications that can take advantage of 4 cores).
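The "~6 times the speed" ballpark only holds when a workload actually scales across all 4 cores. A minimal sketch of that math, using Amdahl's law with hypothetical numbers (the ~1.5x per-core gain is my assumption, not a benchmarked figure):

```python
# Amdahl's law sketch: overall speedup from a per-core throughput gain
# plus extra cores, limited by the fraction of work that parallelizes.
# The 1.5x per-core figure below is a hypothetical illustration.
def speedup(per_core_gain, cores, parallel_fraction):
    serial = 1 - parallel_fraction
    return per_core_gain / (serial + parallel_fraction / cores)

# Fully parallel workload: 1.5 * 4 cores = 6x, matching the ballpark above.
print(speedup(1.5, 4, 1.0))  # -> 6.0
# A workload that is only 80% parallel gains much less:
print(speedup(1.5, 4, 0.8))  # -> 3.75
```

The second case is why the "6x" claim is hedged with "applications that can take advantage of 4 cores": single-threaded games of that era would see only the per-core gain.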
  7. Very possible. What resolution and what detail settings?
  8. I'm going to run my 8800GT through some benchmarks; it's in my x8 slot, since it blocks 2 of my SATA ports in the top (x16) slot. I know of a pretty good average 3DMark score for my CPU/RAM/GPU combo of around 9000 in an x16 slot, so I expect around there (probably a bit lower due to Vista). I will also benchmark Crysis. I feel that the ATI cards are probably OK in x8 slots and above (the X1900XTs were), but the 8800-series cards are probably hitting bandwidth limits (according to this, umm, unreliable but only source I could find: http://www.tomshardware.com/2007/03/27/pci...aling_analysis/).
  9. The artifact scanner in ATITool comes in very handy as well (even though ATITool itself is completely incompatible with my 8800GT; it was compatible with my 8800GTS, though, so it should serve 99% of users). It manages to heat my card up MORE than Crysis does, which is saying something.
  10. My card arrived and works fine in the lower slot. I didn't run it in the upper slot because it covers 2 of my 4 SATA ports (I have three plugged in; technically I could unplug one because it has NOTHING on it). I haven't run tests to see if x8 vs. x16 makes a difference, but I believe it does, considering this is an x16 PCIe 2.0 card to begin with.
  11. Just installed my new 8800GT from eVGA last night. It came stock-clocked out of the box; I immediately overclocked it, and it's sitting at 700 core/1674 shaders/950 memory, with a fan speed of 74% set in RivaTuner (any higher and you get that annoying whine associated with small fans), 30 minutes stable in ATITool and 2+ hours in Crysis.
  Temps: 48-49C at idle (compared with 54 at idle with my GTS at 100% fan speed); 70C max under ATITool load (compared with 74 load on my GTS overclocked to 612 core/1500 shaders/900 memory), 68 in Crysis. My CPU is also idling 2 degrees cooler, and load temps on the CPU are down 2 degrees as well. My opinion is that this is a relatively cool-running card if you can move enough air over it; cooler even than my crappy 6600GT.
  Gaming: Still playing Crysis on medium settings like my 8800GTS 320MB, but I'm further into the game (where the graphics get even more ridiculous), and she is noticeably smoother. I would virtually guarantee the 8800GTS has to drop to low outside with the aliens (this is also confirmed by a HardOCP review that basically states that even the 8800GT has to drop some settings to low at the final levels to keep frames playable).
  Installation woes: My only issue is this card IS longer than a GTS; it covers 2 of 4 SATA ports on my Ultra-D if installed in the top slot, so I have it installed in the bottom slot for the time being. The catch is I am only getting an x8 link, and the limited information I have suggests an x16 link is needed for full bandwidth and performance on this card (the 8800-series cards, and probably the HD3800-series cards, are the only ones out that appear to use every ounce of bandwidth you can feed them in certain applications). Looking into getting some 90-degree SATA cables to maybe snake in under there (TIGHT fit, but this card is a tight fit to begin with).
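For anyone wondering how much bandwidth the x8 link actually gives up, the back-of-envelope math is simple: PCIe 1.x runs 2.5 GT/s per lane and PCIe 2.0 runs 5 GT/s, with 8b/10b encoding eating 20% on the wire. A quick sketch:

```python
# Back-of-envelope PCIe bandwidth per direction.
# PCIe 1.x: 2.5 GT/s per lane; PCIe 2.0: 5.0 GT/s per lane.
# 8b/10b encoding carries 8 data bits per 10 bits on the wire.
def pcie_bw_gb_per_s(lanes, gen):
    gt_per_s = {1: 2.5, 2: 5.0}[gen]      # transfers/s per lane, in G
    bits_per_s = lanes * gt_per_s * 8 / 10  # usable Gb/s after encoding
    return bits_per_s / 8                   # GB/s

print(pcie_bw_gb_per_s(8, 1))   # x8  PCIe 1.0 -> 2.0 GB/s (my bottom slot)
print(pcie_bw_gb_per_s(16, 1))  # x16 PCIe 1.0 -> 4.0 GB/s (the blocked top slot)
print(pcie_bw_gb_per_s(16, 2))  # x16 PCIe 2.0 -> 8.0 GB/s (what the card is built for)
```

So the bottom slot offers a quarter of the bandwidth the card was designed around, which is why I suspect x8 vs. x16 matters on this particular card.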
  12. I'll post an update when I get my new card (probably next week). As for a DFI P35 board, here you go: Lanparty: http://www.newegg.com/Product/Product.aspx...N82E16813136039 Blood Iron: http://www.newegg.com/Product/Product.aspx...N82E16813136038 IMO, P35 belongs to Gigabyte/Asus/Abit as far as stability, compatibility, and price are concerned (the Gigabyte C2D boards impress me the most in these regards). What do I know, though; I'm playing the waiting game for at least another year or so in regards to building a completely new rig...
  13. I looked into this issue when I stepped up my 320MB 8800GTS to the new 8800GT, which is PCIe 2.0. It appears that PCIe 2.0 is compatible with PCIe 1.1 and MOST PCIe 1.0 boards (the DFI nForce4 Ultra-D is PCIe 1.0). As far as I'm aware, everyone who has installed a PCIe 2.0 GPU into a DFI Lanparty and posted online hasn't had an issue (there are issues with cheap boards, but DFI ain't no cheap board).
  14. I'm 100% sure you can. You can "step up" to any card in eVGA's lineup as long as you pay the difference. If the new card is cheaper, you don't pay any difference and just cover shipping. This is well documented, and I am in the step-up queue right now for the 8800GT.
  15. I agree that upgrading from the 640MB version of the 8800GTS is NOT worth it. However, the difference in fps at resolutions over 1600x1200 in new games (aka Crysis) between a stock 320MB 8800GTS and a stock 512MB 8800GT is more than 50% in some cases. This is literally the difference between being able to play a game smoothly and enjoyably vs. not being able to play it at all. As games mature, they are quickly filling up VRAM, and the 320MB card just doesn't have enough. That is why in certain examples the 640MB 8800GTS is over twice as fast as the 320MB card (and the 8800GT is faster than the 640MB GTS). While the 320MB GTS has more available memory bandwidth than the GT, it's pointless once the textures have filled the available memory; being able to pump more textures more quickly is useless if you don't have the memory to hold them. This is also why overclocking the RAM/memory bus on the GTS doesn't do much to increase the card's performance (plus the bus is so fast as is that raising its clocks doesn't really help). Add in the new fabrication process, the extra shaders, a significant core and shader clock speed increase, and VP2 offload capability, and the 8800GT is a serious card that sells for less than its predecessors. All of the reasons above have prompted me to initiate a step-up on my eVGA 8800GTS SC (overclocked over factory to 612 core/1512 shaders/900 memory) to a stock 8800GT. I am currently pulling 30fps on medium settings at 1680x1050 in the Crysis *demo* (the real game should have a big performance increase) with the GTS and the rig in my sig; I will update after the exchange.
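To give a feel for why 320MB fills up fast, here is a simplified framebuffer-only estimate at 1680x1050. The byte counts and buffer layout are illustrative assumptions (4 bytes per pixel for color and depth, multisampling multiplying both), and real games add textures, geometry, and shader buffers on top:

```python
# Rough render-target memory at a given resolution (illustrative only).
# Assumes 4 bytes/pixel color and 4 bytes/pixel depth; MSAA multiplies
# both by the sample count; 'buffers' counts color buffers (front+back).
def framebuffer_mb(width, height, aa_samples=1, buffers=2):
    color = width * height * 4 * aa_samples * buffers
    depth = width * height * 4 * aa_samples
    return (color + depth) / 1024**2

print(round(framebuffer_mb(1680, 1050), 1))                 # no AA: ~20 MB
print(round(framebuffer_mb(1680, 1050, aa_samples=4), 1))   # 4x MSAA: ~81 MB
```

Even this crude estimate shows 4x AA at that resolution eating a quarter of a 320MB card before a single texture is loaded, which is consistent with the GTS 320MB falling off a cliff in Crysis.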