My Ugly PC

Members
  • Content count: 377
  • Joined

  • Last visited

About My Ugly PC

  • Rank: Member
  1. Would it fill the space? TIM is not intended to bridge any gap, only to fill the imperfections between two surfaces that are already touching. Mind you, if you give it any real height, heat will soften it and likely spill it over the sides of the IC leaving no contact at all, or gravity will pull the bulk of the now heat-softened TIM down to the HS and leave a very bad, spotty contact that would be decidedly worse than a thermal pad. There is a reason those pads are used, you know. But if you say the two surfaces touch (which, if your HS is flat, they will not; if you bust out the stock KO HS and look under the pads, there are raised areas at each point where a mem chip would sit, so the difference in height is that of the pad plus a few mm of extra copper at the mem points, but maybe they do touch on your particular aftermarket HS), then Ceramique would be the absolute safest possible thing. Of course, even then there are one-in-a-million scenarios in which a semi-solid TIM will absorb particles from the airflow and become capable of a short. Keep in mind that any potential problems we discuss only apply if the TIM spills over the sides of the IC; neither TIM will have any ill effect if it stays between the HS and the mem chip, so apply the TIM conservatively. Ugly
  2. OK, this thread is confusing me a bit.. like people are talking over each other's heads. With the stock KO heatsink or any other one that is flat and covers the GPU and mem chips, you can't remove the pads, because they are needed to fill the space between the HS and the mem ICs; the use of pads implies that there is a difference in height between the GPU and the mem chips. If you remove the pads you are removing the HS's only contact with the mem chips, and TIM will not cover this difference in height. Use a different HS or live with the mem pads. If you are using a different HS than described, then see below. RE the conductivity/capacitance of AS5, read this thread; of course there are some useless posts in it, but there is a lot of substance in it as well. [Read the whole thing, it's only like 40-50 short posts, worth the 5 min it takes to read.] http://www.overclock.net/amd-air-cooling/1...conductive.html Ugly
  3. My Ugly PC

    Will the video card work with my PSU?

     1. By the numbers, yes. Real world, most likely yes; you'll soon find out. 2. & 3. I'm not 100% sure what you are referring to. Sometimes a VC manufacturer says a free 4-pin Molex (HDD power 'dongle') should be available, but that is only because some PSUs do not have a 6-pin PCI-E connector; yours does, so this is not an issue. Look at the bottom of this page: there is a picture of all the connectors on your PSU, and the one that says "PCI Express" is the one you will use. Unless this connector is malfunctioning or you cut it off or something, you won't be using any 4-pin 'HDD power dongle'. 4. One cannot compare MHz to MHz between any two different processing units; would you rather have an Opty 165 @ 2200 MHz or an E6600 @ 2200 MHz? The 7900GS is better from a GPU architecture standpoint, and this link will show you that even a 7900GS @ 450 MHz will trounce a 6600GT @ 500 MHz (the two cards are highlighted in blue on the bottom half of the list). And the amount of VRAM can be an important factor, especially when it is 128 MB. Games as old as Doom 3 have settings that need 512 MB to run really well, and Oblivion, which IMO is a good gauge of a VC's ability to handle the near future, has settings that will not be kind to a 128 MB VC at all. AFAIK texture resolution/quality/size is what can really eat the lion's share of VRAM. See this thread for ideas about which midrange-price VC option is the best. Good luck, Ugly
  4. My Ugly PC

    Best card for ~$150?

     Use this guide, be exact about the resolution you are using, and browse through each game's results. The guide isn't perfect, but it's quick and very good for direct comparison [I featured the two cards mentioned in this thread, they are highlighted in blue]. That said, I would probably wait for the plummeting VC prices to bottom out before committing, not to mention the suspected launch of the 8600 line, which will likely shake up your price range. It's the classic PC question: buy or wait.. Keep an eye on the FS forums here and on hardforum (HUGE FS section), that is usually where the best deals lie. But if I had to buy retail today on that budget I would get the X1950 PRO, and I'm not a big ATI fan. In a word: no. To generalize, SLI is a very inconsistent offering; its use or misuse by a game engine varies drastically. You will more than likely get more consistently good performance across a variety of games by using one more powerful card instead of two lesser cards, especially in the example you stated. Ugly
  5. Nice choice. Personally I think 19" widescreens are too small, and bigger ones look great, but then you have to keep a system strong enough to run 1680x1050 all the time (I don't like gaming outside of native resolution), not to mention the issues some games have with them. Then of course there are the price concerns for a good 22" widescreen. I made the same tough decision as you some months ago, and I ended up with a 5:4 1280x1024 like you, and I'm very happy. Widescreen is just not a slam dunk for everyone. Hopefully new flat-panel technology will be out in the coming years and we will have a good excuse to go 20"+ wide; by that time it'll be the norm and the mainstream will have forced resolutions [pun intended] to the drawbacks. Good luck breaking the LCD cherry, Ugly
  6. RE warranty: I recently RMA'd a 940BF with Samsung. They sent a brand new replacement and paid shipping both ways. And IDK what Newegg's problem is; I was told explicitly by a Samsung rep that the warranty is 3 years, and it says so here on Samsung's site too under "TFT-LCD Monitor". They make some companies' products look very unattractive when they muddle the facts, as they appear to have done here. If you are at all unsure you should email Samsung and inquire personally about their warranty. I am not a big fan of Hanns-G, as I've seen 3 different models with inexcusable viewing angles. They are cheap monitors, they look and feel cheap to me, and I have one in the house that I see nearly every day. Maybe if I didn't have better LCDs to contrast it with it might be acceptable.. but if you can afford it, get a better one. Of the 3 I would go with the 931c. Samsung's customer service gets a thumbs up from me too; they took my abuse quite well. Good luck with your purchase, Ugly
  7. Had a teacher once tell me how important the electoral college is in modern America.. put down the Kool-Aid, butthole. Ugly
  8. Here's rooting for XSOS. Seems too good to be true.. but seriously, MS started when some college dropout had the balls to do it himself, so damn.. ain't impossible.. Ugly
  9. I think the Expert is worth the few bucks extra over the Ultra-D.. http://www.chiefvalue.com/product/productd...herboards%20AMD Yes, for the PCI-E spacing, but for the improved power regulation as well. Over the course of the NF4 Ultra-D/SLI-XX's life there have been some pretty ugly multimeter readouts off of CPU voltage under load. And anyway, $125 shipped IMO is worth it. Venus.. whatever, it just depends on whether you get a good deal or have the money to burn. Ugly EDIT: DAMMIT! I forgot to say clearly and exactly that you will NOT be getting the problems or bad performance that you prolly got from either of DFI's ATI NB and/or third-party SB offerings, unless it's for clear RMA-worthy manufacturing defects. Don't get me wrong, problems can occur in any rig, but there are no real born-in problems with the NF4 chipset that I am aware of, certainly not at the level of the crappy DFI+ATI/ULI whatever. And Nvidia stays on top of their drivers. NF4 is a damn fine chipset, really just rock solid in my personal experience. Ugly
  10. My Ugly PC

    Excellent site - Giveawayoftheday.com

     This place is legit, I've been going there daily for a while now. Everyone should. There is a free software thread stickied in the "Applications & Other Software" forum; PM the OP of that thread. He should put a link to this site in there (prolly at the very top of his thread). P.S. Today's giveaway is a simple DVD/movie file player. For anyone who plays videos on their PC there is no reason not to DL this for free (you must install it today for it to stay free).
  11. My Ugly PC

    Evercool VC-RE on gpu's?

     I agree verbatim. There are much better VC heatsinks. It's funny that these were actually designed for GPUs; if not for the fact that it fits between the VC and the chipset on a DFI NF4 mobo, they might have been discontinued by now. From the Jab-Tech description page: Compatible with: nVidia: GeForce 3 Ti, GeForce 4 MX; ATI: Radeon Series; SIS: Xabre Series. If you've got a GPU at or below the level of those beasts, rock and roll. Ugly
  12. My Ugly PC

    CPU-Z v.1.38 has incorrect ram speed calculations!!!

     Alright, damn snow.. Anyway, I must admit that I didn't even look at any of the calculations in this thread; I was speaking to the many times CPU-Z has just been odd about certain dividers in my experience. But like I said, now that I look at the formulas being used, they are incorrect. First and foremost, RAM MHz goes off of the FSB or HTT or whatever you feel like calling it (hereafter referred to as FSB); the only relation to the CPU clock is that they both multiply or divide off of the same FSB, and that in SDRAM the cycles are synchronous with the CPU. CPU clock speed will never be part of the equation for working out the actual RAM frequency from a divider-enhanced setting. It goes like this (we will use the OP's settings as the example, so we will find the real RAM frequency for a 230 FSB with the 183 MHz, i.e. 183/200, divider):
     1. Divide the BIOS divider by the default FSB. For the 183 divider that is 183/200 = 0.915. This number will always be less than 1, because we are using a divider, not a multiplier; if you end up with 1.xxxx you inverted 183/200 into 200/183.
     2. Multiply the decimal equivalent of the divider (see step 1) by the actual FSB. We came up with a decimal equivalent of 0.915 for the 183 MHz (183/200) divider, so we multiply that by the current FSB for the real RAM frequency: 0.915 (the 183 MHz divider in decimal form) x 230 (example FSB) = 210.45 MHz actual RAM speed. Given that the FSB can be an inexact number (for example 230.6 MHz) we may be off by a few MHz, but if our app tells us within, IDK, 2-4 MHz I'd say that is close enough, because mind you we can't fine-tune the RAM MHz off of a divider, only get it as high as possible without bringing down the CPU MHz.
     Full equation: 183/200 x 230 = 210.45, or: BIOS-listed divider / default FSB x actual FSB. Note: a different BIOS may list a divider as 330/400 or 333 MHz or otherwise in that format; in that case just halve that number to get, more or less, the 166 MHz or 166/200 divider. Now you know where you are. I wish I could tell you if CPU-Z is as inaccurate as I remember, but as it stands you can now tell yourself. Ugly P.S. Please someone tell me if my equation is off; it'd be damn funny if I messed it up trying to correct someone else. A scenario not at all above my spastic nature, lol.
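     For anyone who'd rather let the computer do the arithmetic, here is a minimal Python sketch of the same equation; the function name and the 200 MHz default-FSB argument are just my illustrative assumptions, not anything CPU-Z itself exposes.

         # Real RAM frequency = (BIOS divider / default FSB) * actual FSB
         def actual_ram_mhz(bios_divider_mhz, actual_fsb_mhz, default_fsb_mhz=200.0):
             ratio = bios_divider_mhz / default_fsb_mhz   # e.g. 183 / 200 = 0.915
             return ratio * actual_fsb_mhz                # e.g. 0.915 * 230 = 210.45

         print(actual_ram_mhz(183, 230))   # -> 210.45, matching the worked example above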
  13. My Ugly PC

    CPU-Z v.1.38 has incorrect ram speed calculations!!!

     CPU-Z has always, as far as I can recall, had issues reporting incorrect values for certain dividers. Try using no divider, or a 'simple' one like 100 MHz or 150 MHz, and you'll see that it can be perfectly accurate. To gauge whether CPU-Z deserves a public calling out, compare its readings on the dividers that don't work to another program's readings, say Everest's. No software, and I mean absolutely none, can be trusted for voltage readings; I've not even met a BIOS reading that I would really call accurate. It's either a calibrated multimeter or someone else's best educated guess. Those are the only options. In my humble opinion CPU-Z is a fine program that, at the very least, is a way to show an absolute idiot which BIOS he has, or what mem he has, or what the main timings should be at the rated frequency. In those features and many more I've never seen an inaccuracy, and what fixable issues there are (and every single piece of software will have some, let alone a free one) are pretty rapidly fixed with a new version release. The fact that the divider issue has not been fixed leads me to believe the issue is all about OC'er mobos offering us many tweener dividers that, save for the enthusiast market, would never exist, and that somehow mess up the process. Ugly EDIT: My simple point is that it won't (that I've seen) tell someone a falsehood about stock speeds, i.e. no divider. So the only ones affected are those people (OC'ers) who should know exactly what MHz every component is at before ever even booting Windows. We're all aware enough to know, so some dude isn't going to think that his RAM is maxing out at 213 MHz or whatever when at the last lower divider it was at 240 MHz. I wonder if this issue is acknowledged in any CPU-Z release notes.. maybe I'll check after I shovel some snow, it's a freaking blizzard in Chicago.
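     If you want a quick sanity check along those lines, here is a small hypothetical helper built on the same divider math from the previous post; the function, its name, and the "few MHz" tolerance are my own illustrative choices, not anything from CPU-Z or Everest.

         # Flag a reported RAM frequency that drifts more than a few MHz
         # from what the divider math says it should be.
         def reading_looks_off(reported_mhz, bios_divider_mhz, actual_fsb_mhz,
                               default_fsb_mhz=200.0, tolerance_mhz=4.0):
             expected = bios_divider_mhz / default_fsb_mhz * actual_fsb_mhz
             return abs(reported_mhz - expected) > tolerance_mhz

         # A 183/200 divider at 230 FSB should read about 210 MHz:
         print(reading_looks_off(213, 183, 230))   # False -- within tolerance
         print(reading_looks_off(240, 183, 230))   # True  -- something is off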
  14. My Ugly PC

    PRIME Don't Place Much Faith In It?

     @faheyd Whoa, resurrected thread. I took the liberty of rereading it from the top, and I definitely gotta undo the tendencies of persuasive writing that I half learned in school (damn unfinished education). A post isn't an essay, and persuasive writing as taught nearly always makes one come off cocky. Stupid Ugly. I have read your post a few times over, and I respect your opinion and your ability to make it without flaming. And I'm sorry I can't complete my point without multiple paragraphs. But you just made my point for me, you explained it perfectly: that's the guy I was struggling to describe, the guy who suggests or outright states that not priming a PC for an extended time will damage a computer. How? I'm nowhere near as smart as most of the folks here, but that question is the best my brain can come up with. My mentioning of voltage was merely to prove a point, an example that what we all accept as a very basic part of OC'ing is more dangerous than not priming for 8 hours. Again I want to stress that I'm not one of these guys around here who is an engineering expert, but seriously, how is it dangerous to my hardware to prime for 2 hours instead of 8? Software corruption, be it app, driver, or OS, yes, that is what I risk, and it's recoverable; worst case scenario I reinstall the OS. So I have to factually, technically disagree with you there. Final time I repeat: I hope someone will jump on me if I am technically wrong; neither of us are experts, though we may make the mistake of speaking in absolute terms. Moving away from our mere disagreement about the former issue, I regret the above comments and personally feel that none of them relate to me or anything I posted. Seriously, "not just 'winging' it. Wild butt guesses, unproven results, dangerous practices... Wanna have fun and not pay attention to detail, buy a dell and join AOL." I mean come on, except for "unproven results" in certain contexts, what relevance is any of that and how does it relate to my opinion or posts? I get it if you just fell into a pitfall trying to articulate a point, but I maintain that my being different on occasion (you'll find from my first post here that I do believe in priming to an extent, just not to a static, absolute extent) does not mean I am devoid of skill and method. I know a little bit, jeez :tooth: Ugly P.S. Again, as this was overlooked last time, I hope prospective respondents understand that I've said all along..
  15. My Ugly PC

    Audio Output Buzzing, PC --> Stereo

     Screwed into the case's expansion slot, nothing more. The metal is powder-coated as you suggested, but the inside of the hole the screw goes into is bare, with no coating of any kind. So it's the metal on the card's I/O panel --> screw --> bare metal inside the screw hole. I'll test it with a definite ground after I test the stereo more thoroughly as shatteredsteel suggested. I just borrowed a PS2 from a friend for this purpose; I'll use that to test the exact same stereo inputs independent of the PC. Thanks for the suggestions guys, I've got a few more things to try. Ugly