
The Angry Phenom: Part 2


Recommended Posts

Give me some time to build up the rant, but this ties in DIRECTLY with a lot of what I said in the original Angry Phenom Thread.

 

This is EXACTLY the direction I said AMD would end up going in...not so much the "Hybrid" Crossfire, but more so the 85%, or regular/budget market that makes up...well....85% of all computer users =/.

 

Take a read from HardOCP and tell me if I'm wrong :tooth:.

 

Rant developing, but not a bad/angry rant (yet).

 

More as I read some more stuff while eating Cup O' Soup


Give me some time to build up the rant, but this ties in DIRECTLY with a lot of what I said in the original Angry Phenom Thread... ......

...Take a read from HardOCP and tell me if I'm wrong :tooth:...

 

Your HardOCP link goes to your Angry Phenom rant... I saw the Black Edition overclocking article on HardOCP; is that what you were linking to?


...Hybrid CrossFire is achieved by utilizing one of AMD’s new integrated chipset GPUs (IGP) and one of their new video cards. That’s right; Hybrid CrossFire utilizes one video card plugged into your PCI-Express slot and is able to join forces with the GPU built into the motherboard chipset...

 

...with Hybrid CrossFire enabled we were able to actually tweak out a Crysis configuration that would let you play the game at 1024x768 with most of the visual settings on “Medium.” Running the canned timedemo benchmarks supplied in the game, we saw Hybrid CrossFire give us 50% better framerates than with the single RV620 video card alone...

 

...a Hybrid CrossFire sub-$500 computer could show up on your Uncle Bob’s doorstep with Hybrid CrossFire enabled actually allowing him the ability to play some real 3D games and plug in two monitors without ever having to switch the Hybrid CrossFire mode on or off...

 

Pretty cool stuff. It's not like I have anything ATI, but it's still cool nonetheless. One of the reasons I never went for SLI or CrossFire is that I use dual monitors 95% of the time.


No, not so much "wasted" hardware. That isn't the point at all. You've got to stop thinking in terms of what "YOU" (i.e. us enthusiasts) think about things, and get with the program of thinking about how the "MASSES" or the "85%" think (or how companies think about/for them).

 

Almost all computers have onboard video. You can cringe and cry out in pain all you want, but how many computers have you found that are like yours? Hardly any compared to the millions upon millions of Dell, HP, Gateway, etc. computers that do NOT have separate graphics cards, because a separate graphics card = getting away from that xxxxing Wal-Mart mentality of having to buy a $499 computer with monitor, speakers, and printer included.

 

So millions upon millions of computers in the near future from all these OEMs will have onboard video, just like they do now. But now you can simply buy a $49-$99 card and get instant video game action where you can't with just the crappy little onboard GPU.

 

Anything over $100 for a vid card add-on would make it unnecessary to CrossFire the onboard GPU with it, as any $100+ vid card would be able to outperform any CrossFired integrated GPU + cheapo ($49-$99) video card combo.
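
Rough back-of-the-envelope on why (the relative numbers below are made up for illustration; the only figure echoing the HardOCP piece is the ~50% gain over the cheap card alone):

[code]
# Made-up illustration of the price/performance crossover.
# Performance units are arbitrary; only the ~1.5x hybrid scaling echoes
# HardOCP's measured ~50% gain over the RV620 card alone.
igp_alone     = 0.3                 # assumed: onboard GPU by itself
cheap_card    = 1.0                 # assumed: a $49-$99 add-in card by itself
hybrid        = cheap_card * 1.5    # roughly +50% from pairing it with the IGP
midrange_card = 2.0                 # assumed: a single $100+ card

print("onboard alone:", igp_alone)
print("hybrid (cheap card + onboard):", hybrid)
print("single $100+ card:", midrange_card)  # already ahead of the hybrid combo
[/code]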

 

But again, this is aimed at the 85% majority who decide, after buying these piece of crap $499 computers, that they want to play Deer Hunter or Redneck Truck Wrestle-Racing or some crap like that. They call up Dell and piss and moan that it won't play their favorite piece of crap game, Dell tells them it can easily sell them a $400+ 9999GTX or something, and the customer doesn't want that.

 

The customer wants to hear he can blow another $50, check the CrossFire box, and play almost any game (whether that game plays well or not is beyond my ability to know; I just know it will play a lot more games than onboard alone).


Give me some time to build up the rant...

"Rant" suggests there's something wrong. So what exactly is wrong with the AMD of 2007 and beyond:

 

- taking risks even Intel isn't ready to take?

- thinking outside the box?

- trying to do too much new stuff simultaneously?

- is it wrong to move ahead in making a complete AMD computer solution possible beyond the usual integrated GPU stuff?

- at last making concrete steps toward giving open-source driver development momentum?

- making human mistakes?

 

I'm not challenging anything you write, Angry_Games, and I'm not saying that AMD is doing great in all aspects; I'm simply interested in grasping why the few enthusiasts like us should be upset. Even though I am one, I tend not to give a toss anymore about whether the "classical hardware enthusiast" is fed with new toys or not; the classical scenario gets less interesting and other forms of computing more interesting.

 

A bigger challenge I see is for software to catch up and effectively make use of new possibilities. Computing today is a mix of stone-age thinking and brave new inventions.


no not so much "wasted" hardware. ......

.....

Customer wants to hear he can blow another $50 and check the crossfire box and play almost any game (whether that game plays well or not is beyond my ability to know, I just know that it will play a lot more games than onboard alone)

 

I had a Biostar TForce 939-6100 for a while. The onboard graphics was plenty for email, web browsing, and even displaying 3D data plots. The new 780G chip is supposed to be about 4x more powerful than the 6100, which was no slouch (at least compared to other onboard offerings :rolleyes:). But the really neat thing about this technology is that eventually it will have the capability to detect whether you are doing something graphically intensive enough to warrant using the extra video card. If not, the card is powered off. The integrated graphics chip will use much less power for the 90% of the time when you don't need the extra capability.
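
Conceptually I picture it something like this (just a sketch of the idea; the load sensor and the 20% cutoff are things I made up, not how AMD's driver actually works):

[code]
# Sketch of the "power off the discrete card when it isn't needed" idea.
# gpu_load and the 20% cutoff are invented for illustration only.
LIGHT_LOAD = 0.20   # assumed: below this, the onboard GPU handles things alone

def pick_graphics(gpu_load):
    """Decide which GPU(s) stay powered for the current 3D workload."""
    if gpu_load < LIGHT_LOAD:
        return "onboard GPU only, discrete card powered off"
    return "onboard GPU + discrete card (Hybrid CrossFire)"

print(pick_graphics(0.05))  # email / web browsing -> card stays off, less power
print(pick_graphics(0.80))  # gaming               -> both GPUs active
[/code]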

 

Like Angry_Games said, this will be a win for AMD with Joe Six-Pack, who is never going to lay out $300-400 for a video card just to play games. There are a lot more of us "poh folks" than "rich folks" :P


This is pretty sweet IMO... in my situation we have 600+ workstations at school. Say I want to upgrade a lab of PCs to be able to do some light CAD work. Now, instead of having to buy a $200 video card for each PC, I could throw in one of these $50 CrossFire cards and get away with it. Another example would be all the customers who walk into Best Buy and spend $150 on an HD 2600 expecting it to handle Crysis. Now they just buy one of these cards and can game at decent frame rates without having to buy a $250 video card. I'm interested in what the power consumption of such an add-in card would be. That's another big issue, since most cheap PCs have 250-watt or 350-watt PSUs, which makes it impossible to run a high-end graphics card without upgrading the PSU as well.
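
Quick back-of-the-envelope on that PSU question (every wattage here is a guess for illustration, not a measured figure):

[code]
# Rough PSU headroom check -- all wattages are assumptions, not measurements.
psu_watts     = 250   # typical cheap OEM supply
base_system   = 150   # assumed: CPU, board, RAM, drives under load
cheap_card    = 25    # assumed: ~$50 card with no extra power connector
high_end_card = 130   # assumed: high-end card needing a PCIe power plug

print("cheap card fits:   ", base_system + cheap_card    <= psu_watts)  # True
print("high-end card fits:", base_system + high_end_card <= psu_watts)  # False
[/code]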


This is pretty sweet IMO... ......

...I'm interested in what the power consumption of such an add-in card would be. That's another big issue since most cheap PCs have 250-watt or 350-watt PSUs...

 

This isn't really taken into account when these users upgrade cards now, is it? If anything, a less powerful card will be required, and thus less drain on the PSU.

Even so, most so-called graphics card upgrades you find in the shops over here are only 1600XTs or similar, so even when they do upgrade, it's not to something that's going to require a lot of power.

