
Does Crytek's Engine suck?


schoolslave


Seriously, this game plays like crap even for the guys with dual Ultras and quad-cores on phase change at like 4GHz with 8GB of RAM... What gives?

 

I seriously think that Crytek's engine is horrible. Sure, it looks nice and all, but the Source engine did that too when it came out. And I could play HL2 on my old Sony with an MX440, with 2xAA and everything on high except the shaders, and still get 35+ FPS. Does that mean the Source engine is more efficient/better designed? I certainly think so. I don't care if Crytek's visuals are "better"; there's no damn point if no one can see them!

 

And what's Crytek's lame excuse? "We're way ahead of everyone because our visuals will be the standard in 5 years." Umm, by then more efficient engines and games will be out with the same or better visuals, because your engine blows.

 

Same goes for MoH Airborne: nice visuals but extremely craptacular performance. And I did play that on a friend's PC with an 8800GTS 320.

 

Same with Far Cry when it came out. Nobody could play it on high, not even those uber nerds with 6800 Ultras in SLI. Crytek's excuse: "we're way ahead of the game." Umm, by the time people could fully appreciate the visuals in FC (thanks to better hardware), there were better games with better visuals and performance.

 

Just look at the UT3 benchmarks: the framerate is much higher than in Crysis, and the visuals are damn good looking.

 

So you (Crytek) can take your stupid excuses about being ahead back to your headquarters (if it hasn't been raided again :rolleyes:) and come up with a better, more efficient engine that actually lets people enjoy your games in motion and not as slideshows.

 

*End rant*

 

This is with an 8800GT:

UT3 Performance

 

Crysis Performance



It plays fine on my sig system at 1920x1200 with everything on high and three things on medium. I have yet to see any slowdown or glitches. The game was released at the tail end of the 8800 era and is aimed more at the 9800 and next-gen GPUs.


Seriously, this game plays like crap even for the guys with dual Ultras and quad-cores on phase change at like 4GHz with 8GB of RAM... What gives?

 

I very much doubt it plays like 'crap' on such rigs. Yeah, I'm sure it stutters when maxed out, but so what?

 

I don't care if Crytek's visuals are "better"; there's no damn point if no one can see them!

 

So... uh... don't bother developing games that will look good for years to come? Take today's best card (which wasn't out when they began developing the game years and years ago) and make the game run best on that? You think Nvidia and ATI would be okay with games that basically nullified any need to upgrade a graphics card in the future?

 

And what's Crytek's lame excuse? "We're way ahead of everyone because our visuals will be the standard in 5 years." Umm, by then more efficient engines and games will be out with the same or better visuals, because your engine blows.

 

Of course; that's the way things work. People get better at coding or find new, more efficient ways to do the same thing. This goes hand in hand with new graphics cards designed to do more of something faster, or to be more efficient overall.

 

Just look at the UT3 benchmarks: the framerate is much higher than in Crysis, and the visuals are damn good looking.

 

The maps are also much smaller, and the engine has already been out in other games, giving them time to fine-tune it for their own title.

 

So you (Crytek) can take your stupid excuses about being ahead back to your headquarters (if it hasn't been raided again :rolleyes:) and come up with a better, more efficient engine that actually lets people enjoy your games in motion and not as slideshows.

 

If you're really playing the game as a 'slideshow' then I'd probably have to blame you for not turning the graphics down to a level consistent with the performance of your system.

 

Obviously in 4 years a standard ~$250 card will be able to run this game near maxed out, and there will likely be better looking games that will run on the same hardware.

 

So what else is new in the world of computers? :rolleyes:


Does Crytek's Engine suck?
Nope.

 

Sorry it's not working out for you, kid.

The retail version plays fine on my ASRock rig on medium settings with an Opteron 148 and a 7800GT.

Using my Samsung LCD TV at 1360x768.

Looking forward to playing it on my Gigabyte rig after my new 3800-series card arrives tomorrow.

Not a top-of-the-line card, but it will probably suit my needs.

NOTE: The Crysis demo version wasn't so hot.


no, it does not suck

 

no, it is not coded poorly

 

The engine is what I would consider "super-scalar" in the sense that the more power you throw at it, the sharper and more detailed the visuals become.

 

You seem to forget the days of Quake 2, Unreal Tournament, Doom 3, Quake 4, Half-Life 2, etc., games that would simply and easily bring ANY system, no matter how new and powerful it was, to its knees.

 

Why should Crysis be any different? Just because you have a Q6600 and two 8800 Ultras in SLI, you should be able to play this game with everything maxed out? Could you do that with the top-of-the-line hardware you had when certain other big-time games came out, games that would bleed your system dry of silicon and leave it wheezing and nearly dead?

 

No, my friends, we are seeing the same thing we've seen repeated in PC gaming/hardware history since the dawn of 3D gaming: the newest, most powerful game engine will easily destroy the newest, fastest hardware.

 

It will always be like this. Game engines like Unreal 3, Crysis, and even HL2 aren't built for what is best NOW. They are built to scale up with each successive hardware generation.

 

If Doom 3 had played at 1600x1200 with 4xAA and 16xAF and everything on VERY HIGH settings, and you still got 100fps with your hardware on the day it came out, would it scale any higher when the next new video card and CPU came out? No, it would look the same, and honestly it would look dated by then.

 

Unreal 3, Crysis, etc. are built to take advantage not only of what is out right now, but also of what might come out during the lifespan of the game itself. Many times game engines contain code or components that don't actually do anything, or aren't turned on, with current hardware, but are designed to be enabled or modified by a patch later, once hardware comes out that can actually use those functions properly.

 

Also take a second to ponder DX10.1. Just because a card can run DX10.1 code does NOT mean the game code is optimized properly. It does NOT mean the DX10.1 spec itself is optimal. It does NOT mean your particular GPU is truly optimized at the driver level for DX10.1.

 

Then what about those of us with XP and DX9?

 

OK, think back to any and all of my rants about Vista and DX10 being bogus piles of crap spewing from Microsoft's mouth, all that "DX10 can't run on XP" blah blah, when the truth is they figured claiming DX10 would only run on Vista would make you rush out and buy Vista.

 

But in the aftermath of that mistake, you've got the two major GPU manufacturers making DX10 HARDWARE that still has to run on the 90%+ of the world's gaming machines that STILL use DX9 (Windows XP... Vista still has less than a 10% penetration rate).

 

Without going into it all over again: DX9 and DX10 are fundamentally different at the hardware level because of how each uses the GPU's features. DX9 was all about the core clock drawing polygons and shader engine functions running off the core. DX10 is all about shader stream processors being able to handle individual data streams.

 

With DX9 GPUs, did you ever hear of such a silly thing as "96 stream processors" or "shader streams"?

 

Nope. The GPU was just a GPU that used clock cycles to process as much data at once as it could.

 

DX10 is supposed to change all of that. Offloading individual streams to these shaders is more efficient and can deliver visual quality and performance unmatched by previous generations of graphics processors.

 

Think of it sort of like how you can do multimedia on a standard Intel processor, but if you use the SSE (or SSE2, SSE3, etc.) extensions, multimedia performance increases by a HUGE margin when the software is optimized for it. Go into your Intel BIOS and turn off the SSE functions if your BIOS allows it (a LOT of old P3 and P4 BIOSes did), then boot into Windows and go play a game.

 

Yeah... HORRIBLE. Unless you had a brute-force-fast CPU that could do in the main processor the work that a few SSE extensions normally do.
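If you want to see what that looks like in code, here's a rough little C sketch of the same idea: the same mixing-style loop done one float at a time ("brute force") and then four floats per SSE instruction. It's just a toy I made up for the analogy, not anything from a real game or engine, so take it purely as an illustration.

```c
/* Toy sketch of the SSE idea: the same work done one float at a time
 * versus four floats per instruction. Not how a game engine is
 * actually written, just an illustration of why a dedicated code
 * path (SSE here, DX10 stream processors in the analogy) beats
 * brute force on the plain core. Compile with e.g. gcc -msse file.c */
#include <xmmintrin.h>  /* SSE intrinsics */
#include <stdio.h>

#define N 1024

/* Plain "brute force" path: one float per loop iteration. */
static void mix_scalar(const float *a, const float *b, float *out)
{
    for (int i = 0; i < N; i++)
        out[i] = a[i] * 0.5f + b[i] * 0.5f;
}

/* SSE path: the same mix, four floats per instruction. */
static void mix_sse(const float *a, const float *b, float *out)
{
    __m128 half = _mm_set1_ps(0.5f);
    for (int i = 0; i < N; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        __m128 vo = _mm_add_ps(_mm_mul_ps(va, half),
                               _mm_mul_ps(vb, half));
        _mm_storeu_ps(out + i, vo);
    }
}

int main(void)
{
    static float a[N], b[N], out[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = (float)(N - i); }
    mix_scalar(a, b, out);
    mix_sse(a, b, out);
    printf("out[10] = %f\n", out[10]);  /* both paths give 512.0 */
    return 0;
}
```

Both paths give the same answer; the SSE one just chews through four values per instruction, which is exactly the kind of work the dedicated silicon is there for.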

 

So, this is all theory, btw (Nvidia/ATI/Microsoft's theory, so maybe not to be trusted completely).

 

So, in theory, instead of having an 800MHz GPU core that has to do it all, you'd use a 500MHz GPU core made up of 96 (or 112, or 256, or whatever) individual stream processors. Not only would it run cooler (we assume 500MHz runs cooler than 800MHz; a lot depends on things we won't get into, but you know what I mean lol), it could also break the workload down more efficiently across those individual stream processors.

 

Now, again, that's the theory.
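To make the "lots of little processors vs. one big fast core" idea a bit more concrete, here's another made-up C sketch, this time using OpenMP threads on the CPU as stand-ins for stream processors. Real GPU scheduling is nothing like this simple, and the shade() function is just a placeholder I invented; the only point is that the same independent per-pixel jobs can be walked by one worker or handed out to many.

```c
/* CPU-side analogy for the "many slower stream processors vs. one
 * fast core" idea: the same pixel loop run on one worker, then split
 * across several workers with OpenMP. Compile with e.g. gcc -fopenmp */
#include <omp.h>
#include <stdio.h>

#define PIXELS (1 << 20)

/* Stand-in for per-pixel shader work (made up, not a real shader). */
static unsigned char shade(int i)
{
    return (unsigned char)((i * 31u) ^ (i >> 7));
}

int main(void)
{
    static unsigned char frame[PIXELS];

    /* "Brute force" path: one core walks every pixel in order. */
    for (int i = 0; i < PIXELS; i++)
        frame[i] = shade(i);

    /* "Stream" path: the same independent per-pixel jobs handed out
     * to however many workers are available. */
    #pragma omp parallel for
    for (int i = 0; i < PIXELS; i++)
        frame[i] = shade(i);

    printf("workers available: %d, frame[123] = %u\n",
           omp_get_max_threads(), frame[123]);
    return 0;
}
```

If every pixel's work is independent, handing it out to more workers is basically free; if the code insists on doing everything serially (the DX9-style path), all those extra workers just sit there idle.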

 

But what if DX9 isn't coded to break the game code down into the efficient little streams that individual stream processors can handle? (And, btw, it's not, but you already knew that lol.)

 

Then you force the GPU back to brute-force speed. That's like turning off your SSE extensions and then overclocking the CPU by 1000MHz to make up for having no SSE extensions to handle the code that was optimized for SSE!

 

Otherwise known as FUBAR: MS demanded that new hardware be DX10 compliant, so hardware makers like NV/ATI built the hardware that way, but it really isn't built around DX9 routines. So you get a sort of "emulation", forcing the GPU to act like an old DX9 GPU instead of using the theoretically efficient DX10 code paths the DX10 hardware is optimized for.

 

Simple yeah? lol.

 

Of course, that doesn't explain why games running on DX10 hardware but using DX9 under Windows XP still run faster than DX10 under Vista... other than Vista being a huge pile of crap. But that can actually be explained by the relative simplicity of DX9 compared to how the same thing has to be coded in DX10.

 

For example: almost every guitar has 6 strings. Steve Vai wanted one with 7 strings, so Ibanez made him one. You could still play your old 6-string songs on it, but it was a bit more challenging with that extra string in there.

 

BUT

 

once you learned how to use that 7th string properly, you could play songs with a lot more range/detail than any 6-string could ever produce.

 

But you CAN play a 7-string guitar like a 6-string. It still has the original 6 strings you're already familiar with.

 

So why the xxxx is Microsoft saying DX10 won't work on XP?

 

Because they want you to buy Vista.

 

 

Now, with all that explained, back to the task at hand: Crysis is an incredible piece of game technology. You, me, and the rest of the losers can't play it in super mega high detail because, no matter how great our machines are (or how great we THINK they SHOULD be, considering all the big numbers like how much money we spent on quad cores and dual 8800GTX GPUs), the game will crush them.

 

Just like with all the previous killer games like this, the next generation or three of GPU hardware coming down the pipe will take care of it, and pretty soon we'll be complaining that the Crytek/Crysis engine looks dated, because the hardware (and maybe, by then, the software known as Vista and DX10) will finally have maxed out the engine's scalability and it can scale no more. But by then Unreal 4, Crysis 3, Quake 92, etc. will be out, crushing your top-of-the-line hardware because they've been built to scale up to future GPU features that your current card either can't do at all or can only do VERY inefficiently (i.e. slow, laggy framerates when it tries to use all the advanced functions).

 

Hope that helps. Btw, even my Core 2 E6600 + 8800GTS 320MB + 2GB RAM + RAID rig can't play Crysis any higher than 1280x1024 at medium settings without the game starting to drag under 20fps (usually it's down in the 10s haha).

 

Yet I can play CoD4 with everything maxed at 1280x1024 and only get a few slowdowns or frame skips when the battle action and explosions are hot and heavy and frequent.

 

But CoD4, as great as it looks, isn't Crysis.


The HL2 engine just scales down a long way; Crytek's doesn't, but it's still a great engine imo.

And a new game that pushes extreme levels of physics and graphics won't run great on all the hardware that's current at release. That's how it's always been. Those kids with dual 8800 Ultras in SLI and a C2D overclocked to the hilt running the game like crap? Please, either they're doing something wrong or they're lying. Maybe they're trying to max it out 100% at a huge res?

Don't forget, people are stupid. Just because they spend a couple grand on a system, they automatically think every new game should run perfectly on it, and if they can't max out a game, the engine isn't optimized, right? lol. Or the game is crap, and they just complain.

Release a game that doesn't have the best graphics, and those same kids complain.

 

I think 90% of the people with good hardware complaining about performance are just trying to run settings that are too high. I mean, my 7800GT plays the game just fine. In fact, I was playing at 1680x1050 with AA locked at 4x and AF locked at 16x via the NV control panel (because of 1.6), and I forgot to set it back to application-controlled for Crysis, and I still ran the game at like 15-20 fps on medium-to-high settings.

If I turn that off, the game is very playable on medium to medium-high.

 

And about Crytek's argument being retarded because you think their engine will be obsolete in years to come: I disagree. I still think Far Cry's graphics are great and better than many games today.

 

And why bash an option for higher graphics? You can tune settings for a reason, and if there are settings higher than your hardware can handle, why the **** would you complain about that??? Jealousy? C'mon, a game should release with options that are too high for current hardware; it shows it's something that will raise the standard. If game devs only released games that could run on every system, the hardware market would die and everyone would still be rolling with Voodoo cards. Everyone complaining that Crysis was released when no hardware can play it is just being stupid. A big chunk of PC gamers can play the game... look at the system requirements, they definitely aren't anything exceptional.


I also think that a lot of people are judging the game on the demo.

The full version runs much better, at least on my machine.

 

Now, if we want to start a rant: how come Crysis costs $45 or less, while Vista starts at $200+ for the Home Premium edition?

I'll bet there's more talent at Crytek than at MS. :eek2:


Guest SuppA-SnipA

Hey Angry, do you like Crysis? lol

I'm playing that game with everything on high at 1280x720 (I think).

 

CryEngine 2 is a wonderful engine; even Nvidia said so. Now, I know you're thinking "companies say things to support other companies", but I'd believe it's a wonderful engine. Mess around in the Sandbox 2 editor a little bit.

Now, STALKER's engine, X-Ray, was on the bad side: the graphics weren't up to today's standards, and look at that game's recommended system requirements. You need a bloody 7900 to turn everything on with full dynamic lighting, if I remember right, and those graphics look like they're from 2004. Although GSC said they will clean up X-Ray for STALKER Clear Skies.

If you read anything on the incrysis.com forums, most people say not to get a 320MB card; go higher for the best performance (as always lol).

-SuppA-SnipA


I'm at 1280x1024, high detail, no AA.

Fraps shows 25-50 FPS, usually above 30 though; it runs smooth as butter.

Mine is the 320MB GTS, BUT... at 700/2000 with an HR03 cooler! (3DMark06 becomes unstable at 725/2100, so I backed off.)

 

I think this is a great game! Optimization? Who gives a crap! I'll play the game again in 18 months' time, maxed out on my 9800 GTX!!

 

Marco.

