Game Performance - Darkest of Days


Started by Nemo · 55 replies

Give me a day or two, and if I forget, please remind me! I've been swapping cards and testing like a madman tonight. On card number 5 of 6.

Don't worry about it. I feel bad for asking ya to do that 4890 vs. 280/285 test when I went and bought the 4890 anyway... I hope you didn't end up doing it for nothing.


As far as ATI getting physics: Nvidia isn't gonna do it. But really, I think everyone has the right idea. Buy a cheap Nvidia card dedicated to physics and the problem is solved. It's not much different from before, when you had to buy a discrete physics card anyway. Most Nvidia people are gonna want a dedicated card as well, so I think it works out well for both. Now, looking at the chart, the single Nvidia cards playing with physics on high aren't where we'd want to be in a shooter as far as FPS is concerned.

 

BUT, I was impressed...

 

I wasn't looking forward to this game. You know, I was like, gee, another shooter but with physics. I thought for sure Modern Warfare 2 was gonna be my next shooter purchase. But there is no way I'm not gonna get this now. It really does sound more interesting, and a good use of physics.

I'm gonna try this 260 Superclocked with 2xAA at 1680x1050 with physics on high and see how that fares.

I think it should handle it, but if it doesn't...

I also have an old 8600 GTS around; maybe that will do it.

But if that doesn't do well either... then I think I just won't have to use it, lol.

 

So Batman really was good? I didn't even play Mirror's Edge; I bought it but never installed it.

Did anyone check out the Ghostbusters physics? They were all like, "physics should be done on the CPU; the GPU should have plenty to do if the graphics are any good!" or whatever. Anyway, I thought it was a good, fun game, especially if you're a fan. The objects relied heavily on physics; things busted apart into pieces everywhere.


I certainly hope so. The game looks like crap on my rig now. :lol:

 

I'm tempted to buy my brother's old 8800GTX (I still can't believe he spent $600 on that card) to throw into my HTPC to be able to play it smoothly. :P

I spent that 2+ years ago, and although that may seem like a lot, I'm able to max out Batman: Arkham Asylum with FULL PhysX at 1680x1050 (2xAA, though I could probably do more, but I'm too lazy to try)... so yeah, I call that a pretty good investment. How many 2-3 year old cards can still play today's games in all their glory? The 768MB of VRAM probably helps a lot with the PhysX.

 

Frank and I were talking about the differences between ATI and Nvidia while playing DoD and Batman. The difference is staggering, and frankly I feel so bad for ATI owners playing Batman; there are so many little things missing because of PhysX :(
I agree wholeheartedly. PhysX may be merely aesthetic and have no impact on actual "gameplay", but IMO it's not just eye candy. It really does make the world feel so much more real.

 

Nvidia will probably never release the PhysX rights to ATI, will they? With more and more games supporting it, ATI had better think of something if they're going to answer this new incentive to buy Nvidia cards. Then again, you could run a hybrid system like I'm going to and get the best of both worlds :thumbs-up:
Actually, I'm almost certain Nvidia offered to license PhysX to AMD a while back and AMD declined.

 

Landscape and some action shots with PhysX going on :)
Hard to get a screenshot of PhysX in action... you'd really need a video to capture it.

 

For example, in BAA, these cloth banners aren't even available if you have PhysX turned off, but a screenshot doesn't really capture how it's "real cloth":

[Screenshot: PhysX cloth banners in Batman: Arkham Asylum]

 

Also, although some things you'll easily recognize as PhysX (like papers getting thrown around while you fight or walk), there are probably several other things going on that you won't even realize are PhysX because they just seem so natural. Only if you turn it off will you realize what you'd be missing.


Did anyone check out the Ghostbusters physics?

Yeah, there was quite a lot going on in that game, and I gotta say I was impressed with the level of destructibility. I actually played Infernal a few years ago, and while the game wasn't amazing, it was obvious that a lot of work had gone into the engine.

 

I know Valve, too, has been quite vocal in the past about better utilisation of the CPU for physics (and graphics at one point), but after playing Cryostasis it's obvious that PhysX is the most capable physics API at the moment.
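For anyone curious what coding against PhysX actually looks like, here's a minimal rigid-body scene as a rough sketch. Two caveats: it uses the modern (3.x and later) PhysX SDK API, not the 2.x SDK these 2009-era games shipped with, and the setup shown (CPU dispatcher, default filter shader) is just one reasonable configuration, not what any particular game does:

```cpp
// Minimal PhysX rigid-body scene -- a sketch against the modern (3.x+)
// PhysX SDK, not the 2.x SDK contemporary with Darkest of Days.
#include <PxPhysicsAPI.h>
#include <cstdio>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects every PhysX program starts with.
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene with gravity, stepped by two CPU worker threads.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // A ground plane plus a 1 m box dropped from 10 m.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    scene->addActor(*PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *material));
    PxRigidDynamic* box = PxCreateDynamic(
        *physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
        PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 1.0f);
    scene->addActor(*box);

    // Simulate two seconds at 60 Hz and see where the box ended up.
    for (int i = 0; i < 120; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);  // block until the step completes
    }
    printf("box height after 2 s: %f\n", box->getGlobalPose().p.y);

    physics->release();
    foundation->release();
    return 0;
}
```

Worth noting: the GPU acceleration everyone argues about applies to effects like cloth, fluids, and particle debris; plain rigid bodies like this box were generally simulated on the CPU even in PhysX titles.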

 

Actually, I'm almost certain Nvidia offered to license PhysX to AMD a while back and AMD declined.

Did anyone ever get to the bottom of why they refused? In my mind, if the license fee was too high then ATI can't really be blamed, but I took it to mean that ATI just didn't want PhysX to succeed and was willing to let its customers miss out until it had killed it, which obviously doesn't reflect well on them.


Did anyone ever get to the bottom of why they refused? In my mind, if the license fee was too high then ATI can't really be blamed, but I took it to mean that ATI just didn't want PhysX to succeed and was willing to let its customers miss out until it had killed it, which obviously doesn't reflect well on them.

I think they did it because they wanted to shut PhysX down with Havok physics... and we have seen how much has come of that :rolleyes:

Edited by IVIYTH0S


I think they did it because they wanted to shut PhysX down with Havok physics... and we have seen how much has come of that :rolleyes:

Havok isn't hardware-accelerated (yet). When and if it is, it will be open-source and cross-platform, using OpenCL (or whatever that other open GPU computing language is).
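To give a flavor of what "hardware accelerated using OpenCL" would mean in practice, here's a hypothetical sketch; this is illustrative only, not Havok code, and every name in it is made up. It's a toy kernel that integrates particle positions on whatever GPU the OpenCL runtime finds, ATI or Nvidia alike, which is exactly the cross-vendor argument for OpenCL over CUDA:

```cpp
// Toy OpenCL sketch (hypothetical, not Havok code): integrate particle
// positions on any OpenCL-capable GPU, ATI or Nvidia. Error checking
// omitted for brevity.
#include <CL/cl.h>
#include <vector>
#include <cstdio>

// The kernel ships as source and is compiled by the vendor's driver at
// runtime, so the same program runs on either vendor's hardware.
static const char* kSource = R"(
__kernel void integrate(__global float4* pos,
                        __global float4* vel,
                        const float dt)
{
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // gravity
    pos[i] += vel[i] * dt;    // explicit Euler step
}
)";

int main()
{
    // Grab the first GPU the runtime knows about.
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    // Compile the kernel from source.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "integrate", nullptr);

    // 4096 particles, zero-initialized on the host, copied to the GPU.
    const size_t n = 4096;
    std::vector<cl_float4> pos(n), vel(n);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), vel.data(), nullptr);

    // One 60 Hz timestep across all particles in parallel.
    float dt = 1.0f / 60.0f;
    clSetKernelArg(kernel, 0, sizeof(dPos), &dPos);
    clSetKernelArg(kernel, 1, sizeof(dVel), &dVel);
    clSetKernelArg(kernel, 2, sizeof(dt), &dt);
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);

    // Read positions back and spot-check one particle.
    clEnqueueReadBuffer(queue, dPos, CL_TRUE, 0, n * sizeof(cl_float4),
                        pos.data(), 0, nullptr, nullptr);
    printf("particle 0 y after one step: %f\n", pos[0].s[1]);
    return 0;
}
```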

Edited by Waco


Did anyone ever get to the bottom of why they refused? In my mind, if the license fee was too high then ATI can't really be blamed, but I took it to mean that ATI just didn't want PhysX to succeed and was willing to let its customers miss out until it had killed it, which obviously doesn't reflect well on them.

 

Considering Nvidia charges motherboard manufacturers $30,000 plus a flat fee for each motherboard they make to get that SLI Certified logo... they're probably asking much more for PhysX, since it's more than flipping a "0" to a "1" in some internal register :O


I certainly hope so. The game looks like crap on my rig now. :lol:

 

I'm tempted to buy my brother's old 8800GTX (I still can't believe he spent $600 on that card) to throw into my HTPC to be able to play it smoothly. :P

I spent $650 on mine :D


Considering Nvidia charges motherboard manufacturers $30,000 plus a flat fee for each motherboard they make to get that SLI Certified logo... they're probably asking much more for PhysX, since it's more than flipping a "0" to a "1" in some internal register :O

It came down to CUDA. Nvidia offered PhysX to anyone who wanted it (as far as I know, for free), but in order to use PhysX, you need to adopt CUDA, which AMD didn't want to do.


