
Some people just don't know any better.



Guest r3d c0m3t

This revolves around something I've read courtesy of the official Atari/TDU forums, which I am a part of, and I thought I'd share it with my peeps over here who would understand my point better. So, to quote...

 

"And graphics? Psst ... if you have an HD setup for your console, running at 1080i or 1080p, you're there. It's barely noticeable - if you ran a PC and a 360 side by side, you would have to be an expert to spot the differences. And with the way TDU is running on some PCs, no, sir ... gimme something tried and true ...

 

Yeah, you can customize games on the PC, but ... only if they have the right patches and such. Not so on the 360 ..."

 

First, let me say two things. It's fairly easy to spot graphical differences any time you compare a console to a PC. Case in point: the 360 version of TDU overuses HDR and Overbright to a "T", whereas the PC port uses just the right amount, and even if you feel the PC version overuses HDR, you can turn it off. As far as HD goes, PC monitors were pushing "HD"-capable resolutions before the term was ever standardized, so to speak. Yes, the PC version of TDU has a gajillion things wrong with it (I should know), but that's what comes with the territory of ports (look at the port of FEAR for the 360 and PS3).

 

Also, if you're sitting about 4-7 feet away from the television, there's no real practical way to discern differences between HD and SD, or even ED. The only thing you may be able to point out is a seemingly noticeable improvement in "clarity." Nothing more.

 

In closing, I want to say "Wtf is this guy talking about?!" about that closing statement of his. You don't need a patch to customize your graphical experience; a patch can only fix whatever problems those options exhibit. Oh, and not to mention us PC guys have DirectX 10... not much of a gain (or care) at this point, but it's still there.


Well, I can tell the difference between 480i, 480p, and 720p (or upconverted to 1080i/p) easily on my 360. Games like NHL 2K7, Gears of War, and Tony Hawk's Project 8 all look great at 480p, but not nearly as good as at 720p. 480i just looks kind of crappy no matter what, honestly, which is odd. The 360 has an awesome scaler that will upconvert or downconvert anything and everything very nicely.

 

Clarity is important, but it's easy to tell the difference between 480 and everything else (720/1080). Switch your 360 over to 480 and play, then switch it back to 720/1080 and see the difference. The distance you sit from the set makes no difference. It's like watching a Sunday football game in standard def and then switching over to the same channel in high-def. Instant difference, and it's more than just clarity.

 

There are a lot of differences between PCs and consoles in the graphics department. The 360 is basically an X1950XTX GPU (or something close to it, anyway), which, as we all know, paired with an optimized 3.2GHz core (three of them, with two hardware threads each), won't have a single problem pushing 1280x720 with AA enabled. (Sure, there's a RAM size difference, but it's moot since the 360 has a kernel plus drivers that total only about 2MB, while we Windows users load up at about 150-200MB and have every damn device and task under the sun running.)

 

The major difference to me is how much you can tweak your pc's graphics options not only within the game, but within the driver control panel for your gpu. I want to run no AA? 4x AA? 16xAF? 4xAF? HDR on? Bloom on instead of HDR? More detailed models? Less detailed textures?

 

The little things make all the difference to PC gamers, but they also cause all the problems. A console game almost always runs at its locked framerate of 60fps, which is ultra smooth. Sure, some games get slowdown, but they are consistent in their graphics quality and framerate. PCs, on the other hand... well, you might be getting 322fps, then have it drop to 21fps and stutter along real good for a few seconds. (This could be not just the GPU struggling to keep the framerate up because of the game's quality settings; it might be because at that moment your printer decided it needed to query Windows for some reason, or maybe your HDD just did a cache dump, or maybe Windows Update started running in the background, or maybe your anti-virus had to kick in at 100% CPU for 3 seconds because a new thread/process started by Windows looked suspicious... etc, lol.)

 

Side by side, at the same resolutions, there really isn't that much difference if you're using the same graphics quality options. It's hard to tell what all of them might be set to on the 360, since you usually can't tweak anything other than gamma/brightness, but since most graphics options are on for the 360, it makes a reasonable comparison to what you and I might play our PC games at (other than a stupid resolution like 1280x720 on a PC; we'd be playing 1440x900, 1680x1050, etc, hehe).
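
Just to put rough numbers on those resolutions, here's a quick back-of-the-envelope Python sketch (raw pixel counts only, not a benchmark; the list is just the resolutions mentioned in this thread):

```python
# Raw pixel counts for the resolutions mentioned above, relative to the
# 360's native 1280x720. More pixels means more work per frame for the GPU.
resolutions = [
    ("1280x720 (360 native)", 1280, 720),
    ("1440x900", 1440, 900),
    ("1680x1050", 1680, 1050),
    ("1920x1200", 1920, 1200),
]

base = 1280 * 720
for name, w, h in resolutions:
    px = w * h
    print(f"{name:>22}: {px:>9,} px ({px / base:.2f}x the 720p load)")
```

So a typical PC widescreen resolution is pushing roughly 1.4x to 2.5x as many pixels per frame as the 360's native 720p.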

 

So no, I wouldn't say it's ALWAYS easy to spot graphical differences between the two. For the most part, ports to PC from the 360 almost always look the same, because a port's graphics options are already set and the port team usually won't spend the time to enable/disable them or make them an allowed choice (or they let you choose, but regardless of the setting it doesn't make much difference in the look, lol). Ports tend to be relatively poor, honestly. Rainbow Six Vegas is awesome, but the two versions look essentially the same. Sometimes, I've got to say, the 360 even looks a little better (more optimized for its hardware base than any PC game could ever dream of being, since PCs contain literally billions of possible hardware combinations).

 

Games like Gears of War, though, should be one of the few where you CAN easily tell a difference, because the port team is putting a lot of time into letting it take advantage of the hardware superiority a PC might have (i.e. an 8800/2900-series card, possibly DX10, and true 1080p-class output via a 1920x1200 setting, instead of the Xbox 360 upconverting 1280x720 to it, since GoW is a 720p-native game).

 

Really, it just depends. TDU is a direct port, and poorly done. I got to play a total of one hour of the beta, because either the game crashed instantly, or the servers were down, or the servers booted you, or the graphics would corrupt within 30 seconds of driving around after entering the server... ugh, what a nightmare.

 

A patch can most definitely fix performance issues. We see it all the time. Look at the latest STALKER patch, which claims a 15% to 20% performance gain across the board; for me it actually worked out to about a 30% boost on the one machine I've tested it on. Console users, however, don't seem to understand this concept. Most don't understand the true nature of a console's strength: the fact that it only has to have drivers for ONE GPU and ONE sound unit and ONE DVD drive and ONE CPU and ONE chipset and ONE hard drive, which means that games developed for it don't have to worry about their code running like crap on certain hardware pieces or combinations. They don't have to try and 'dumb down' graphics settings to make sure the game plays on a 1.4GHz Intel P4 + 6600GT + 512MB RAM (combined with all the Windows XP/Vista overhead).

 

No, a 360 dev says "this game will ONLY work on a triple-core 3.2GHz CPU with hyperthreading, an ATI X1900/X1950-class GPU with 10MB of embedded DRAM (which can make a tremendous difference if you go look at how this architecture works), and this 5.1 digital sound chip, and it MUST be capable of doing 1280x720 resolution." That makes it soooooooooooo easy, especially on a console that uses the DirectX API that game developers have been mastering for 10+ years. A closed hardware system will yield tremendous results. A closed hardware system also rarely needs a patch for performance issues (most patches are for exploit fixes and online problems, plus the occasional major bug that crops up).

 

It's always painful to read fanboys jostling over console vs PC gaming. Each system has excellent strengths and debilitating weaknesses.

 

And like ALL consoles, at the time of their release, they are almost always better than any PC for about 6-12 months. Then they are equal for about 6-12 months. Then the PC of course leaves the console far behind from then on, getting progressively better until about 5 years into the cycle a new console comes out, which starts the cycle all over again.

 

We are nearing the end of year two for the 360, and while I don't think its full potential has been tapped yet, the GPU in the 360 is not upgradeable, which means anything that is native DirectX 10 should conceivably look better than anything the 360 can produce. Conceivably. However, the 360 and PS3 are the most serious consoles we've ever seen, and they might still be able to surprise us far down the road.

 

Case in point: The Chronicles of Riddick: Escape from Butcher Bay.

 

This game was released when PCs were very much back in front of anything the original Xbox could do, yet it really just blew away anything on the PC in terms of graphics quality. The devs put a lot of effort into the game, and it showed. I'm still amazed at how awesome it looks on the Xbox, even at 480p.

 

 

I just don't get too upset over these things. I'm pretty moderate in my thinking: consoles are awesome, as is my PC. I hate FPS and RTS games on the console because of the controls. I love sports, racing, and some RPGs on the console, because of the controls as well as playing with friends/family all in front of the same TV. Speaking of which, I think I'm ready for some Wii Bowling again!


Guest r3d c0m3t

A lot of good points were made, and if I were to quote any one thing, I might as well just quote your entire post. :sweat:

 

In any case, I agree that consoles do lead the way for an extended amount of time, and I believe the reason lies in the fact that any person, especially your average Joe, can turn on a console and play a game without much trouble. With a PC, on the other hand, a lot of factors come into play, such as hardware compatibility, OS compatibility, a boatload of RAM (in some special cases) for a smooth, FPS-laden experience, and the latest drivers for your GPU, sound card, mouse/keyboard, etc.

 

The only thing a console gamer has to worry about is whether the game is in stock and whether he/she can afford it at that very moment. At the same time, though, no matter who's more popular or better placed at the moment, there will always be that one mental advantage us PC guys have over the console boys: we can honestly claim to have built our own gaming rigs with our own two hands (cuts, scrapes, and all), and within a certain timeframe we know exactly what they're capable of by playing games that heavily rely on the GPU and CPU (again, such as FEAR). Not to mention we have the option to OC and get that extra 7-25% performance gain (depending on experience, and how far you're willing to go) for any slug-like FPS moments we encounter. I'm not really bothered by such words either, but a few of my buddies have often asked me why I bother with PCs when consoles are so much better. Better in all their glory and splendor? No. Easier to comprehend? Yes, that makes more sense.

 

Let's not forget who's based off of what. :rolleyes:


Well, remember also that we can UPGRADE our gaming rigs... consoles cannot be upgraded with hardware (adding in an HD-DVD drive or a memory card for game saves doesn't count either, lol).

 

When our rigs start to drag playing the latest game, we just pop in more RAM, a faster CPU, a better graphics card, etc.

 

Consoles have the money advantage though...

 

 

It is just how it is... many strengths and many weaknesses. I say we should all have a console for casual, family-and-friends gaming in the same room, and then a PC for some hardcore wasting action against asshat 16-year-olds on the internet, lol.


Guest r3d c0m3t

Also, aren't there rumors going around that the next generation of consoles will adopt dual (or, to go a more descriptive route, multi-) GPU configurations? If so, I'd imagine that would be better utilized than the SLI and Crossfire setups we have available to us now, mainly because software developers won't have much choice but to program their code around both GPUs: since it's fixed hardware, the two will act as one, true to the essence. Of course, there's very little need for either on the PC front, because not everyone can afford the luxury of a dual-card setup.

 

And even for games that are optimized for it, or at least support it to some degree, it can actually drop the framerate in some cases because of the additional latency between the two cards and the time it takes for them to communicate, especially if they're at odds with one another.


  • 4 weeks later...

I did the 480p to 720p switch and I did notice a difference, some games less than others. The TV you're using has something to do with that as well. I've hooked up a 360 to a really nice 37" set with 1080p support and it looked amazing, but when I hooked it up to a 42" 1080p set from a cheapo brand it wasn't so nice.


Personally, I'm not a huge fan of the whole 1080p thing. Aside from minimal (though increasing) support on Blu-ray and HD-DVD, there is so little content available for the format that it really doesn't even enter the equation when purchasing a TV. Yet everyone cries 1080p from the rooftops as if it were a gift from god on high (if he/she exists).

 

For the foreseeable future (5+ years), there will not be any 1080p broadcast content. It is not even being talked about by the networks. We are currently still struggling to deploy, at vast expense, enough high-definition broadcast cameras to cover major sporting events.

 

In Canada, this means that all the Toronto Maple Leafs and Montreal Canadiens (boooo) games are broadcast in 720p, but only an occasional smattering of Ottawa Senators games (yahhh), when the cameras are available. The current cost of a single broadcast-grade HD camera is roughly $300,000-400,000. Multiply by the number of cameras required to cover a single event, and you're talking big bucks.

 

Even more of a roadblock is the available bandwidth. There is simply not enough bandwidth on the digital spectrum to switch over to 1080p at any time in the near future. One of the network techs told me that the current spec for coax-delivered content allows for roughly 200-300 720p channels, and nothing else. Adding one 1080p channel would mean eating roughly two 720p channels, so with the current infrastructure we could only have ~150 1080p channels to choose from if that's the way the networks decided to go. Of course, the future is hard to predict.
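
A quick sanity check of that arithmetic (the 200-300 channel capacity and the "one 1080p eats two 720p slots" ratio are the tech's quoted figures, not anything I've measured):

```python
# Back-of-the-envelope check of the quoted coax capacity figures.
capacity_720p_channels = 300      # upper end of the quoted 200-300 range
slots_per_1080p_channel = 2       # one 1080p stream ~= two 720p streams (quoted ratio)

max_1080p_only_channels = capacity_720p_channels // slots_per_1080p_channel
print(max_1080p_only_channels)    # ~150 channels if the whole lineup went 1080p
```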

 

Other disadvantages to 1080p include the fact that at a regular viewing distance, it is almost impossible to see a difference from 720p. I have had two ultra-high-end 60" TVs, both capable of 1080p display, and on neither could I discern a difference between 720 and 1080p content.
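
For what it's worth, the usual visual-acuity math backs that up. Here's a rough sketch, assuming the common 20/20 rule of thumb that the eye resolves about one arcminute of detail, and a 16:9 60" screen (actual eyesight and content vary, so treat the numbers as ballpark):

```python
import math

# Rough visual-acuity check of the claim above. Assumption (not from the post):
# 20/20 vision resolves roughly one arcminute, i.e. about 60 pixels per degree.
ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance beyond which a single pixel subtends less than one arcminute."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from diagonal
    pixel_in = width_in / horizontal_px                      # width of one pixel
    return pixel_in / ARCMIN_RAD / 12                        # small-angle approx; inches -> feet

for label, px in [("720p", 1280), ("1080p", 1920)]:
    print(f'60" {label}: pixel detail stops being resolvable beyond ~{max_useful_distance_ft(60, px):.1f} ft')
```

On those assumptions, the extra detail of 1080p over 720p stops being resolvable somewhere past roughly 8 feet on a 60" set, which lines up with the experience above.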

Plus, SD content (roughly 93% of what's available) often looks much worse when scaled to the 1920x1080 native resolution of 1080p sets. This has a lot to do with the quality of the scaler and the implementation, but I still think SD content looks better on my 50" 720p plasma than on my 46" 1080p LCD, even though I consider the LCD to have the superior PQ and scaler.

 

Ramble ramble ramble. So 720p is the sweet spot as far as I can see. If you're buying a TV, do yourself a favour (Canadian spelling) and don't get hung up on 1080p sets. The added cost will do nothing for your viewing experience. Spend the money instead on a better or larger screen - something that will definitely impact your satisfaction and experience.

 

FYI, the best deals in 720p TVs right now are the 50" Panasonic PX77U and the Samsung 5064. Both feature awesome anti-glare screens and excellent PQ. If you have a darkened room and glare isn't an issue, the PX75U and 5054 models are virtually identical but lack the anti-glare screen. If you've got the bills, I just got the Pioneer 5080HD and it blew my socks off. And yes, I play games on a plasma. Burn-in is virtually a non-issue now with most sets, but do yourself a favour and turn down the brightness for the first couple hundred hours.

 

If you are set on LCD, snag a 46" Samsung 4665F. It sports full 1080p, and the picture is simply mind-blowing with HD DVD/Blu-ray content. But make sure you're sitting close enough to get the full picture.


My only concern when buying a TV is whether it will do the four major formats:

 

480i

480p

720p

1080i

 

That's all I care about. 1080p is just a marketing ploy as far as I'm concerned. And though 360/PS3 games are being done in 1080p now, that's no reason to buy a 1080p set, because 720p, 1080i, and 1080p are nearly impossible to differentiate on a console game, and all my sports and Discovery Channel stuff are in 720p/1080i anyway. ;)

 

Bandwidth won't be an issue down the road as we move to all-fibre (Canadian spelling, haha) networks and copper coax is eventually replaced. ;)


To quote the post above: "My only concern when buying a TV is whether it will do the four major formats: 480i, 480p, 720p, 1080i. That's all I care about. 1080p is just a marketing ploy as far as I'm concerned."

For gamers and general TV watchers, I think 720p is the ideal to reach for in most cases - the cost/performance ratio is good, and 99% of the content is only available at that resolution anyway.

 

For movie buffs, 1080p holds a certain appeal. I have roughly 40 movies on a combination of HD-DVD and Blu-Ray, and yes, I absolutely notice the difference between 720p and 1080p (on a 1080p set) from regular viewing distances. The differences are more than just subtle too.

 

With more and more rental shops (and online renters like Zip and Netflix) offering HD movies and content, 1080p is not the blatant marketing ploy most people think it is. It's more of a subtle and overt marketing ploy. :) I would still think twice about buying 1080p over a comparable, and likely cheaper, 720p set, but I think that in the future we are going to see a move to 1080p as the standard for "true" HD, with more and more content and industry support for it. Simple economics and profit mongering assure us of that much.


I could see a really nice difference from 720p to 1080i when watching broadcast TV on my Toshiba 57HM167, which is top of the line. :)

 

720p also looks nice, but 1080 is much clearer and sharper.

 

I can't see what 1080p looks like because there's no broadcasting for it.

 

I can't compare them on consoles because I don't have a 360 yet.

 

And I don't have an HDMI cable for my PC, or I would test them on that. :) I've only got a VGA cable hooked up to it.

 

But while watching TV I could see the difference.

 

But for PC vs. consoles I pick the PC, because I like having more control over things like changing resolution settings, tweaking, and other stuff, and also nothing beats a mouse and keyboard. :)

 

But I am planning later on to buy a 360 and a Wii, since I've got this big TV. :)


You definitely want a Wii. Then go to Amazon and get some $5.99 MadCatz 480p component cables for it (and then get Guitar Hero III!).

 

All of a sudden DirecTV has started blasting out lots of new HD channels for us, and *gasp* even NHL Center Ice games are frequently in HD now. I must be dreaming.

 

Too bad we get no HD local channels (they barely come in on the antenna, and only ABC gave us "regional", meaning NYC or LA, programming).

 

I see lots of inexpensive yet excellent LCD and plasma TVs that only do 720p, but I don't want to be limited to that. It doesn't have to do 1080p (they all seem to now anyway, I guess), but the ones that do 1080 as well are about $500-$1500 more, lol.

 

I'll just wait. My rear-projection Mitsubishi is starting to fade, I think, as I'm having to recalibrate the convergence zones every couple of days now. =(

 

But it still looks awesome in 1080i!

