Angry_Games

The Angry Phenom (or: How I Love-Hate AMD!)


Another long one from Angry. If you are a fanboy, you maybe don't want to read. If you know how to read that is. Some of you can't read more than 3 sentences before clicking your Favorites button to get to a porn site. Some of you though actually do read them all the way to the end...and end up sending me hatemail or praisemail.

 

I really dig praisemail! MORE!

 

I guess I really dig the hatemail as well, because at least SOMEONE is listening. Opinions are important, but no one single opinion is the rule of law. Everyone has a differing opinion on just about everything.

 

There is no right or wrong opinion.

 

There is your voice, and while maybe only fellow dorklings here at the Street might read it, at least someone read it, so at least someone knows how you are feeling, just like you know how they are feeling after reading their spew of bs lol.

 

I really love AMD. And I really hate them. Like a high school sweetheart you married and she cheated on you.

 

Let the hatemail begin!

 

 

 

 

AMD/ATI/Intel/nVidia,

 

I always choose based on the best performance per buck I spend. I've got an X1900XT card that has served me very well. Great card. But it is gathering dust because my 8800GTS simply outperforms it.

 

I've had (and built) a lot of great rigs based on Intel/AMD or ATi/nVidia hardware. But every one of them always goes right back to the methodology in my first paragraph.

 

No one else owns my soul (or my checkbook). Of course, some customers specify one or the other, but whenever they will listen I try to present the pros and cons of each particular platform and try to help them make an educated decision.

I'm with you wev, I still love my X1900XT 512MB. I own an 8800GTS as well, but I find myself hanging out and gaming more on my 3800+ X2 + X1900XT than I do on my 3Ghz Core2 E6600 + 8800GTS 320MB.

 

My Core2 rig is definitely better, but honestly, the 3800+ X2 and X1900XT rig isn't really that much different when it comes to gaming. 1280x1024 and 1440x900 are the resolutions both play at, and both pretty much do the same job as far as the 'feel' goes.

 

However

 

When I got the X1900XT, it was the best card on the market bar none. When I got the 8800GTS 320MB, it was the best bang for the buck card on the market (couldn't afford a 640MB or GTX but I at least got me a dang 8800GTS!).

 

So when the day comes (and soon for the technology changing, probably not so soon for the financial ability to buy anything lol) that I am ready to get something new, I'm sticking to my bang for the buck. I don't need a 2900XT or 8800GTX. Sure I'd love one, but do I need one? No, not when the 8800GT or 3870 ATI can do what I want since I don't play 1920x1200. Don't care to get a 24" to play in that resolution either anymore honestly. I'm quite happy at 1280x1024. But that's getting off my rambling point.

 

I agree with nghisus in the sense that people tend to look purely at benchmark results instead of taking in the entire equation: dollars spent vs performance gained or lost vs power consumption + noise or the need for a new psu etc. If the 8800GT is $279 and gets 10% more fps in all the games I like (or tend to like, or know are great-looking game engines), and the 3870 is $229 and gets 10% less fps, I'm OK with spending less. (Even more so now with what looks like proper ATI Crossfire drivers, so it might actually compete with SLI, which means I can drop another 3870 in, or 3 or 4 more, who knows! And not have to hem and haw over whether or not to spend $400 for a single card that can do what two $200 cards can do. I can pick one now, and add later.)
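That bang-for-the-buck equation boils down to cost per frame. A quick sketch, using the $279/$229 prices and the ±10% fps gap from the paragraph above (the actual fps numbers here are made up purely for illustration):

```python
# Rough price/performance comparison: dollars per average fps.
# All figures are illustrative, taken from the hypothetical example above.
cards = {
    "8800GT": {"price": 279, "avg_fps": 55},  # ~10% more fps (assumed number)
    "3870":   {"price": 229, "avg_fps": 50},  # ~10% fewer fps, $50 cheaper
}

def cost_per_frame(card):
    """Dollars spent per average frame per second."""
    return card["price"] / card["avg_fps"]

# Rank the cards by value, cheapest-per-frame first.
for name, card in sorted(cards.items(), key=lambda kv: cost_per_frame(kv[1])):
    print(f"{name}: ${cost_per_frame(card):.2f} per fps")
```

By this (admittedly crude) metric the cheaper card wins, about $4.58/fps vs $5.07/fps, which is the whole argument in one number; it leaves out power, noise, and the psu, which the full equation above also counts.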

 

SLI gets a little tougher because it only works on Nvidia chipsets. Right now Intel cpu's are top dog, which means Intel chipsets are top dog, which means no SLI for me. And if I go Phenom, I don't want an Nvidia chipset; I'm going to want a 790 ATI/AMD chipset more than likely, so again SLI might be limited.

 

However Nvidia usually puts out a great chipset for AMD.

 

However again lol, AMD is FINALLY doing what I almost screamed at the AMD guys to do 4 years ago at the DFI office in Hayward: MAKE YOUR OWN CHIPSETS.

 

I tried explaining to these guys sitting around the conference table why Intel was always going to kick their butt in the end...because no one can make a better chipset for a cpu than the ones who make the cpu.

 

The AMD guys were...I guess brainwashed to not hear such things, or maybe they acted like they didn't hear it, even though one of the dudes actually admitted to me that Intel would always have the upper hand as long as AMD didn't make their own chipsets. The other guys spewed on in typical corporate bull crap-this-guy-until-we-believe-it-ourselves nonsense about how AMD wanted to focus solely on cpu's and wanted competition in the market, like AMD was some big xxxxing huggy-bear looking after everyone's best interests by giving other companies something to do (make chipsets).

 

Total and utter bull crap is all it was. Before anyone says that AMD CAN'T make better chipsets than 3rd party mfg's can, I tell you that no one makes a better server chipset than AMD for their own Opteron cpu's. Not VIA, not SiS, not Nvidia. Hell, I don't know if any of those 3rd party mfg's even still make chipsets for enterprise Opteron stuff. But no one did it better than AMD.

 

I guess this is where I defend AMD and say they wanted that enterprise server market BADLY, and they did what had to be done, and they got a pretty big chunk of it away from Intel. And just like NASA gives us most of our killer technologies as they trickle from NASA to the military to civilian life, the Opteron technology did indeed trickle down to us in the form of socket 754 Athlon64's, and from then on you all know the rest of that Paul Harvey story.

 

But let's go back to me yelling at AMD guys. I've read plenty of sites lately (Kyle Bennett, Anand, Ryan at PCPER) all saying the same thing: they tried to tell AMD for a few years what the market was looking for, and AMD wouldn't listen, because AMD thinks it knows better than anyone else how to beat Intel.

 

Which translates to: We had so much success with the original AthlonXP and the explosive success of Opteron and from there consumer Athlon64 that we didn't have to do anything else except just keep extending the Athlon64 line through new technologies like DDR1 to DDR2.

 

 

Now, re-read that but from Intel's perspective about 3 years ago: We've handily beat everyone and anyone, especially AMD, for so many damn years that we'll shove Pentium4 down everyone's throat and just extend the P4 line as long as possible since we have no real competition, and so therefore no need to keep innovating.

 

 

OMG lol. WTF? It's like those cryptograph things and you put one statement on top of the other and blammo all the letters and words match up ;)

 

So back to yelling at AMD. During that same meeting, I was practically pleading with them to listen to my dire warnings about Intel. My bosses and our marketing lady (Vivian) were practically shoving me out the door as I probably sounded like some crazy street preacher preaching the end of days to all who walk by as I tried to let them know that the whole not needing their own chipsets would eventually make them weak and worse, Intel was already starting to rumble after more than a few years of continual marketshare loss to AMD.

 

You know, when Intel is firing guys and making big internal changes and sending out memos to their people saying in essence WE WILL xxxxING DESTROY AMD OR WE WILL BURN THE WORLD DOWN TRYING!!!!, that's a pretty healthy sign that the machine is being primed and getting ready to be started, and Intel is the big 800lb machine that rolls over everyone and anyone like a steamroller over dandelions.

 

ie the little dog can yap and nip and dodge for a while, but at some point Spike is going to end up biting that dog in half or worse.

 

But, like everyone else you read on the internet, EVERYONE thinks they tried to tell AMD how to do things and AMD never listened. I'd have to say that's probably true to a large extent. My voice was just one in probably tens of thousands of enthusiasts and website owners and tech monkeys who cried out to AMD not to let their foot off the gas pedal.

 

It was so awesome to see AMD not only catch up to Intel, but blow past them and leave them in the dust for a few years. Man that . was freaking AWESOME. I hated Intel with the best of you. But how sad and ironic is it that AMD didn't even learn from their own victory over Intel...which has ALWAYS been the trojan horse for the victors of ANY struggle:

 

winners tend to forget what it took to win because they are too busy celebrating the win, while losers tend to dwell heavily on how they lost because by god losing sucks and it's better to die on your feet than live on your knees. Winners feel invincible while losers now know their own weakness and know how to defend against it and even use it to their advantage.

 

The problem in this day and age in the Intel vs AMD struggle is just another of the same problems all throughout history in these types of David vs Goliath situations:

 

who has the deeper resources to call on?

 

A small upstart might win some battles and make some gains, but the big nation-states usually have half of Europe out to Asia Minor to draw on, and eventually they can defeat the upstarts through a war of attrition (ok, for the non-history monkeys: imagine your favorite RTS game, and how your opponent might attack you quickly and surprise you and gain some land, but you control all the gold and food on the map and can eventually strangle him dry).

 

Intel has HUGE pockets. Insanely huge pockets. MEGA huge pockets. AMD is no slouch, but I think Intel spends more on R&D in a year than the entire company of AMD was worth around the time Intel was figuring out how to turn those mobile Pentium3-descended cpu's (the Pentium-M on the DFI 855 motherboard that cost about $400, where a Pentium-M clocked to 2.4Ghz blew away even the FX-55 Athlon64) into the awesomeness we know as Core2 today.

 

Or as all the rednecks around here like to say in their best stupid-Idaho-redneck-drawl: you mess with the bull, you get the horns boy!

 

 

 

 

The biggest question I see all over the tech sites now is "how could AMD let this happen?". The second biggest thing is speculation from all the readers about how buying ATI caused all of this.

 

 

Well, to the first thing...I don't know either, other than what I said earlier about when you are #1, you are too busy enjoying the spoils of war to care about anything else, because you are invincible.

 

To the second thing, I don't think I agree with that much. Buying ATI was a HUGE risk, and yeah, it probably caused a dropoff in a few projects while AMD tried to figure out wtf they were doing now. But I think they looked TOO FAR into the future at the prospects of owning ATI and what it would do for them: at some future point, once the gears were oiled and turning smoothly, they'd pretty much be able to own an entire market segment all by themselves, because they could integrate cpu and gpu either on the same board, or the same chipset, or hell, all on one single chip (die).

 

And that actually will happen at some point if AMD doesn't fold (they won't), because FINALLY they are making their own chipsets. Acquiring ATI gave them that, even though they've been able to make their own chipsets since the beginning. They just chose not to. Intel chose otherwise, and had the lead for 20+ years.

 

That leaves Intel at a precarious juncture: not being able to compete in the same market sector as AMD in things like gaming laptops, budget gaming boards/boxes, etc. We all know Intel graphics are less preferable than having herpes with never-ending open sores oozing pus 24/7. Yeah, that's how awful Intel graphics are.

 

Nvidia, they don't want to play ball. Intel could buy them easily, but Nvidia isn't having any of that (yet). Nvidia thinks they can survive, and right now they can because they are the only real graphics company left that can put out a winning product every time, easily beating anything ATI can offer.

 

Well, maybe until now. Now we'll see the squeeze put on Nvidia. ATI's graphics cards will always be excellent, and while they may never top Nvidia permanently, or even often, they will perform within a few percentage points of them. At the same time, ATI doesn't need Nvidia's chipsets to fuel AMD cpu sales. ATI not only makes gpu's; they've now got all the key pieces in a single company to make everything and lock out Intel AND Nvidia, their two biggest competitors.

 

Intel might always have the better cpu, but it won't be THAT much better (look at today, those of us with socket 939 dual-core AMD64 and 2GB of RAM and a decent 8800 video card are enjoying all the same games as those who rushed out and bought quad Intel cpu's with 8800GTX Ultras...the Intel/Ultra setup might give you some better performance, but if Doom9 and Unreal16 still play within 10% or so on an AMD cpu system, and costs 10% less, who's really enjoying more than the other guy?).

 

Nvidia might always make better performing gpu's, but they won't be THAT much better.

 

Intel and Nvidia might make better performing chipsets, but they won't be THAT much better.

 

Remember, we can't be ruled by the 1% who rush out and buy QX9999's that have 8 cores at 4.5Ghz each and 9999GTX's with 5GB video ram who say that anything else sucks and if you buy it you probably are the type who loves humans of the same gender as yourself.

 

Remember, we went down that path during the DFI days and it only made us spend a lot of money. Those of us who bought the best learned that by the time your parts arrived, someone else was already far beyond your best, and your choice was to feel envious or spend MORE money trying to get far beyond the jerk who just went far beyond you.

 

Remember, we don't sit on our computers and run benchmarks all day. We use Word, we browse the web, we kill some aliens in Crysis, we watch xvid and wmv porno, we steal music and listen to it all day long while browsing porno and checking email and writing documents in word...we DO THINGS.

 

If an Intel Quad can open a 200MB zip file 7 seconds faster than a Phenom, is your ego so fragile that you just can't be THAT GUY who is 7 seconds slower? Will your wife or your friends think your penis is 19" long because you encoded that AVI into Xvid 2 minutes and 24 seconds faster on your Intel? No, they will be bored listening to your bull crap and change the subject asking when you are going to mow the lawn or when you are going to stfu so we can go play Guitar Hero or Halo3 or go to a movie or get the LAN going for some RS Vegas.

 

If an 8800GTX can net you 7fps more in Crysis than your measly 43fps you get with your ATI card, is that going to be a deal-breaker? Will you have to wear the shame sign around your neck and walk the streets for 48 hours alerting everyone in your town that you are a lamer who doesn't get 50fps? (as well as alerting all the gangs and bullies to kick your butt for being a retard and wearing a dumb sign around your neck saying you are a lamer for some nerd Dungeons and Dragons nonsense since gangs and bullies aren't well-versed in the differences between D&D nerds and computer nerds?)

 

 

What if guys like me don't have $329 for that 9200GT and $279 for that QX6600 and $199 for that X38 motherboard, but we DO have $249 for a 3950 ATI and $219 for a Phenom 9700 and $129 for a 790 motherboard with killer auto-overclocking software that actually works unlike Ntune from Nvidia, the biggest piece of . software ever invented to totally xxxxing hose your computer to where it never boots again?

 

Are guys like me lamers? Would you beat me up on the side of the road? Probably not because I'm not an idiot who would announce to everyone how big of a dork I really am.

 

 

 

 

Whoa...maybe AMD HAS something going on here eh? Maybe they really are smarter than all of us and have foreseen this happening and everything is going according to plan...

 

Or maybe me and anyone else who thinks all the same things I just ranted about are the smart ones and somehow WE are making sense for AMD's actions when AMD themselves really are a bunch of idiots and somehow have stumbled into such a thing accidentally and only need to have it pointed out by all of us fanboys?

 

Ok yeah that's a bit of egomania.

 

But isn't it scary to now ponder that either/or might be right? Does it seem to you like AMD really is just stumbling along and accidentally having some good luck once in a while? Or worse they are hanging on by a thread and unable to top both Nvidia in gpu's and Intel in cpu's?

 

Or maybe they ARE hanging by that thread and just trying to stay alive long enough to finish some secret project that really will be an Intel-Nvidia killer and give them that old Athlon glory...?

 

 

My love for AMD runs deep, just like the South has for the rebel Dixie and the North has for...whatever the hell the North stood for. AMD gave me some of the most glorious days of my life in computing. All the new, exciting things that would come down the pipe for us to play with while we thumbed our noses at Intel and Intel's fangirl base, who refused to accept that the little guy trumped the big guy, even if only temporarily.

 

Boy am I bitter these days, like AMD was my wife who not only cheated on me, but did so with an entire bar full of men AND women AND taped it and put it on the internet as a dedication to me. Me, I only want to love AMD.

 

Problem is, I've grown up a bit. All those glory days are now like your father and his friends telling you how great they were in high school on the football team. I'm now a modestly responsible adult who has better things to do than run 143 continuous benchmarks of 3dmark2005 to try and beat those 4 guys ahead of me at the XS forums. I don't have all that disposable cash these days to buy a faster cpu the day before the fast cpu I just bought even arrives, just because Hangman932 at the forums already got his faster cpu and is making mine look like hairy girl armpits.

 

I want to be able to play the new interesting games that come out, but I already own an Xbox360, Wii, and PS2, so I really have no interest in dropping a grand for new vid cards every 6 months that will only give me 19% more performance at resolutions I don't even play at than I already have with my not-quite-top-of-the-line-but-not-even-close-to-midrange gear I just dropped a grand on.

 

I'm sorta thinking that the majority of you who make it this far into this diatribe will say "yeah man, I'm totally with you on that" because you got chicks you need to spend money on, kids who need clothes, cars that need repairing, etc, and gaming, while fun and a good way to burn through disposable income, eventually becomes just a regular hobby and at some point you just can't continue to bleed cash for no real good reason.

 

I mean, some of you are rich enough that it's no object to you. That's cool. You guys are already getting the ego boost for being the top 1%. But the other 99% of us, we eventually cool down and just want to enjoy our gear without having to shell out every 6 months for something new. Just like you did at some point when you decided every few months putting some new engine part on your hot rod wasn't really worth the money anymore, you could already do 16.1 in a 1/4 mile and now all you wanted to do was actually drive the damn hot rod to impress chicks instead of having to repeat the same lame bull crap about how "oh I got this KILLER 1971 Super Bee in the shop but it's getting a new ".

 

 

 

I guess that makes me a voice of reason?

 

Or just some old windbag that never shuts the %#@$ up?

 

:tooth:

 

 

Either way, as the pirate guy in Family Guy would say, I guess the gist of my story is that while I'm extremely disappointed that the Phenom and 38xx stuff isn't able to take the top spot, I'm realistic enough to know that these products are going to be where my bang for the buck is, unless Intel plays super-hardball and chokes the life out of AMD with rock-bottom cpu prices, and Nvidia does the same with gpu prices.

 

I've got no problem owning a Phenom 9700 cpu, 790X motherboard, 2GB DDR3-1066, and 3870 video card. I've already seen enough benchmarks to know that while it isn't Intel and Nvidia, I didn't spend too much to get too little advantage that no one but me gives a serious . about anyway.

 

Like my girlfriend Carly (Momma here at the forums) said to me tonight before she went to bed, when I told her about the new Phenom and 38xx stuff:

 

Yep. That's what she said...or didn't say. She don't care as long as she can play Jade Empire, Titan Quest, Hellgate (when we can buy it haha). She gives me the same answer when I show her a killer new 50" DLP 1080p HDTV. She gives me the same answer when I make her listen to a new 800w digital audio 7.1 home theater system. She rolls her eyes and says "it looks/sounds/plays just like what we have".

 

Being a man, I know this is pure bs, because I can tell the difference. Being a woman, she doesn't give a damn as long as its decent and it works properly.

 

If we lived by man-rules, we'd be dead broke but have the fastest xxxxing computer on the planet and the biggest, sharpest tv that would make our neighbors behind us go blind from the clarity, the loudest xxxxing stereo system this side of a Metallica concert, and a car so goddamned fast it would force the feces out of your bowels and into your jeans when you hit the gas pedal.

 

Since we live by woman-rules now, we have a house, broadband internet, a LOT of computers that are nice, a bunch of gaming consoles, a nice tv that isn't the best but still does 1080i, cars that are reliable and get better than 9 gallons to the mile, and an audio system that makes her complain daily that its TOO #@[email protected]# LOUD!! TURN THAT . DOWN!!!

 

 

I got to say, a part of me would rather be dead broke and full of man-rule spoils.

 

But I'd rather have a warm body at night to curl up to in my house with electricity because if I can't sleep, I can at least go play some Call of Duty 4 or encode some porno or open the fridge door and find food inside ;)

 

Guest r3d c0m3t

Man, always leave it to you to set other people straight.

 

Am I guilty of some of the things mentioned in your rant? Quite possibly, but I would sincerely doubt any of it developed from the fact that the Q6600 (or any current Core 2 processor) could encode a video faster, or get higher benchmark scores, because I couldn't care less about any of that. I don't encode/decode videos; yes, I do benchmark, but not often enough to compare every little gain or every little loss. Both of which are rather pointless to me. I jumped from my 939 board to LGA775 based on exactly what I've always gone by: price/performance.

 

For $279 the Q6600 was arguably the best quad-core CPU around, it was a mad overclocker, and the TDP was extremely low for what I was buying. Hell, I didn't even care about its multi-tasking abilities because I don't multi-task that often, if at all. It reminds me of the Pentium D 805, Intel's sleeper: if overclocked to 4GHz+, it could contend with the reigning king, the FX-60 (albeit being pwnd by the less-expensive 4400+, and by the 4800+ if overclocked). The true king(s) of the Socket 939 days were the Opterons. Massive potential, ludicrous overclockers, and they didn't even put out a lot of heat...

 

...My one true reason for going with LGA775 is that I'd never had an Intel platform before, and as unfortunate as it was, when my system died, that alone necessitated the move. I love AMD, and when I got into computers AMD was king. I guess you can say I grew up with AMD more or less, but I never agreed with them buying ATI. Yes, it's good for both of them, because in the long run it definitely shows promise; that much is evident. What I smirked about was: why would AMD buy a company that, like themselves, has trouble meeting deadlines? Not only that, but why would they purchase a company when the money could have been spent on constructing new fabrication plants? God knows they need them; last time I checked they only possessed two, and they're about to sell one. Compared to Intel's many fabrication sites, that only makes the matter worse. Intel can seemingly meet every deadline, and even get the product out the door before the deadline is even in sight. AMD has to work harder, yet they're still unable to meet 75% of their deadlines. Which of course results in delays, hissy-fits, complaints, and contusions.

 

I'm all for AMD, but I wish they were as enthusiastic about their success as I am. I loved my 4400+ and the CFX3200-DR that accompanied it. The K8 architecture was just the CPU everyone had to own, and if you didn't own it, some of us would tell you that you wasted your money on Intel's P4, with its gazillion pipeline stages, that awful Netburst architecture, and the high-wattage TDPs. The 955 and 965...nothing but a heatwave full of disaster. Intel CPUs were always better at video encoding, so that's just a natural follow-up. However, that fact alone failed to make the Pentium 4 any better than the AthlonXP and Athlon64/FX, and it definitely failed to be better than the Opteron 1-series CPUs.

 

I'd assume the same logic applies to Core 2: it was just the CPU to have. It did everything better, and it was a lot cheaper than a lot of the new stuff AMD was dishing out. I loathed Quad FX (4x4); it was doomed from the start, having a combined TDP of some 230W, and the idea of native quad-core wasn't as attractive as AMD hoped it would be, simply because of heat and the need for extensive cooling. At that point, passive cooling was simply out of the question, plain and simple. Intel knew this, and that illustrates their reasoning behind sandwiching two dual-core dies into the same package (CPU), which equated to less heat exerted and the possibility to maximize MHz.

 

Again, I love AMD, but until they start taking themselves seriously again (Phenom shows worthwhile potential) Intel is just the platform of choice, for the time being.


Angry,

 

I love your attitude, especially the manrules, LOL.....kinda feel a bit sad for AMD, but they got a bit lost in their own fog, no one to blame but themselves.....

 

laterzzzz......

Guest Modulok

Well, I'll give you a quick quote from Martin Brodeur, goaltender for the NJ Devils. It applies to anything and everything.

I'm not 100% sure of the wording, but it's close enough:

"It's easy to get to the top; it's hard to stay there."


Angry,

 

I have read all of your "tirades", and up to now the one on memory, dividers, and overclocking was the classic and best. It has been greatly surpassed by this one.

 

This is pure poetry. DFI really f****d up by letting you slip through their hands. You have way too much common sense. We need to clone you 536 times (100 US senators + 435 US representatives + 1 US President) and march every one of them off to Washington.

 

I have this week off and have been reading all the endless and stupid Phenom bashing on the various websites in the evenings. Yes Core 2 is faster at games, but not a single site ran any hard core floating point benchmarks, not even SciMark.

 

If I go look at the Fluent site (a CFD code), I see a 2 GHz Barcelona beating a 3 GHz Core 2 Xeon in a heavy fp multi-tasking code. Not everyone needs this performance, but then again, not everyone needs the faster integer performance either. Like you, I use my home computer to do real work and some fun things on the side. Being faster at games and porn viewing does not do the average office worker (especially an engineer) much good, since those are the two quickest ways to get fired at most companies....

 

Both are damn good chips, and for those of us who have other expenses (*&^%# Jeep exhaust manifold), I will also take the 10-15% performance hit to save 25-30% in cost.

 

PS: Every socket 939 user should read Angry's memory timings and overclocking thread. It opened my eyes up to the real world vs the benchmark world.

Guest Neezer

Angry, I really appreciate these "rants" of yours. They always seem to be loaded with common sense for someone who doesn't care to spend 58000 dollars to have the uber fastest rig on the planet. They also offer insight into the world of computing that you don't get at other sites.

 

Thanks!!!


whew...I made it all the way thru...but had to break for the favorites tab a few times....

 

Nice post...

 

feels good to get that out huh...:)


damn. i know if i had checked the length first i'd have hit the back button without bothering to read. and now i didn't, and read everything. and i hate to say it, since you said i'd say it, but you are right. it's kinda funny how you post this on a forum that is pretty much largely the hardware enthusiast elite. how many people are there that know how many PCI lanes their chipset has, and also know the 3 chipsets before and after the one they own? but although i love the newest hardware, and i always like to have it, reading all this kinda reflects my own thoughts. is it really worth it?

 

and i gotta say, i own a C2D rig. i like it a lot. but it can't compete against my old 939 system. it's 3 times as fast as the 1st 939 dual-core rig i had, but it'll never be as brilliant.


Well, I apologize for the incorrect data about the 2 GHz Barcelona beating a 3 GHz Xeon. My bad memory from this link:

 

www.cisl.ucar.edu/dir/CAS2K7/Presentations/torricelli.pdf

 

It is comparing a 2 GHz Barcelona vs a 2.33 GHz Xeon 5345. The Barcy is about 30-60% faster in some of the Fluent runs, and in one of the LS-DYNA four-core runs it is about 80% faster. It is of interest that when all four cores are being used, the gap widens considerably between the two in some of the benchmarks. It must be the effect of the integrated memory controller vs the FSB scheme.

 

So it looks like a Xeon at 3 GHz would be close to, or maybe even exceed, a 2 GHz Barcelona in fp performance.
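That extrapolation can be sanity-checked with a quick clock-scaling estimate. A sketch, assuming performance scales linearly with clock speed (which FSB-based systems in particular often don't achieve) and taking 30% as the low end of the Fluent range quoted above:

```python
# Back-of-the-envelope clock scaling for the figures quoted above.
# Assumption: performance scales linearly with clock speed.
barcelona_ghz = 2.0
xeon_ghz = 2.33
barcelona_speedup = 1.30  # Barcelona ~30% faster (low end of the Fluent range)

# Per-GHz throughput, taking the 2.33 GHz Xeon's total throughput as 1.0:
xeon_per_ghz = 1.0 / xeon_ghz
# Projected throughput of a hypothetical 3 GHz Xeon:
xeon_3ghz = xeon_per_ghz * 3.0

print(f"2 GHz Barcelona:        {barcelona_speedup:.2f}x")
print(f"3 GHz Xeon (projected): {xeon_3ghz:.2f}x")
```

Linear scaling puts the projected 3 GHz Xeon at about 1.29x, roughly even with the 2 GHz Barcelona at the 30% end of the range, which matches the "close or maybe even exceed" guess; at the 60-80% end the Barcelona would still win comfortably.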

 

I recall reading an analysis on Scientia's blog, in which he did a comparative analysis of Core 2 Duo versus K10. He concluded that K10 would be slightly behind Core 2 Duo in integer and quite a bit better in floating point, due to the larger number of FPUs. He kept trying to tell all the fanboys that K10 was going to be a catch-up to Core 2, but most of them, including this one, did not listen. LOL

 

However, if AMD can just get those speeds up a couple of notches, they will have a killer floating point machine.

