High res or AA?


Desktop-pro


These days I usually play games rendered at 1080p, then downscaled onto my 720p screen.

 

That means I get nice smooth anti-aliased images but don't have to deal with severe framerate drops or lack of AA support in certain games. It also means that everything on screen is anti-aliased, so there's no need to fiddle with transparency AA etc.
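Rendering at a higher resolution and scaling down to the display is essentially supersampling: every screen pixel becomes the average of several rendered pixels, so hard edges come out as gradients. A minimal sketch of that averaging step (pure Python, grayscale, made-up pixel values; real drivers use fancier filters):

```python
def downsample_2x(image):
    """Average each 2x2 block of a grayscale image into one output pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at "high res"...
hi_res = [
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
]
# ...comes out with intermediate grays at "screen res" -- the edge is smoothed.
print(downsample_2x(hi_res))  # [[63.75, 255.0], [63.75, 255.0]]
```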


I find that higher resolution is more effective at smoothing images than AA, but my monitor only supports up to 1680x1050, so I can also run decent AA in most games. If I can't, I try to lower some other settings to balance the whole viewing experience.


I always try the monitor's native resolution first, but this is a matter of personal preference.

:withstupid:

 

Native res here too (1680x1050 for PC, 1440x900 for laptop).

 

AA is typically the first thing I lower if I need extra FPS. I'd rather keep high textures, shadows, lighting, etc.


Surely resolution would win the argument hands down?

 

I thought it worked like this: your graphics hardware renders frames at the maximum size it possibly can, but your resolution is the bottleneck for the size of the rendered models, so the hardware has to scale the models down, causing pixellation on the edges. At higher resolutions there's less scaling, and thus less-to-no need for AA?

 

EDIT: Think about resizing an image in MS Paint. Now THAT is some serious pixellation =P It doesn't occur in Photoshop because Photoshop anti-aliases the image as it resizes, which is essentially what AA does in video games.
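The Paint-vs-Photoshop contrast above boils down to nearest-neighbour resampling versus averaged (anti-aliased) resampling. A rough sketch of both on a 1-D row of pixels (assumed resampling behaviours, made-up values):

```python
# Halving an 8-pixel grayscale row two different ways.
row = [0, 0, 0, 255, 255, 255, 0, 0]

# Nearest-neighbour: keep every other pixel -- hard jumps survive intact,
# which is the blocky "MS Paint" look.
nearest = [row[i] for i in range(0, len(row), 2)]

# Averaging: blend each pair -- edges turn into intermediate values,
# which is the smoothed "Photoshop" (anti-aliased) look.
averaged = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

print(nearest)   # [0, 0, 255, 0]
print(averaged)  # [0.0, 127.5, 255.0, 0.0]
```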

 

If I'm entirely wrong, strike me down where I stand =P Please

Edited by Danrik


Yeah, I usually go for resolution first, although my system is a bit weaker than most. I usually start at 1800x1440 with max settings and turn things down from there. My 9600s seem to handle most of the games I play at 1600x1200 with 2xAA rather nicely, though. (And yes, I am still running CRTs, and will be until they become completely impossible to find :P)


Quote: "Surely resolution would win the argument hands down?"

Don't call me Shirley. :P


I run 1680x1050 and I start turning down other options before I turn off AA. If I can't play at decent settings with some AA... either the game is terribly optimized or I buy new hardware, haha. Thankfully I don't play games much anymore, because my wallet really can't handle building another rig.

Edited by SMeeD

