
3DMark06 is out


ExtraPickles


That's the opposite of what he said.

Oh yeah, I see. My bad, NEO. Still, I can only tell you from my own personal experience: the lower refresh rate of 60Hz, as opposed to 100Hz, gave me a better score in 3DMark01 SE.

 

All I can think of that may have changed is the drivers' refresh-rate handling, since it was about two years ago that I noticed the difference and I haven't rechecked to see if it still holds true.

 

TGM


607 core, have you done some modifications?

Yes, I have the card volt-modded, but to achieve that core it is running at 569 with a delta of 40 enabled. I have the voltage set at a flat 1.6V for benching, and the card adds an additional 0.030V to it during benching, as verified with a multimeter while running benches.
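For anyone wanting to sanity-check that, here is a minimal Python sketch of the arithmetic described above; the 1.6V setting and the extra 0.030V are the figures quoted in the post, and the variable names are just illustrative:

```python
# Rough sketch of the effective core voltage during a bench run,
# using the figures quoted above: 1.6 V set flat, plus ~0.030 V
# that the card adds under benching load (as measured with a multimeter).
BASE_VCORE = 1.600   # volts, set flat for benching
LOAD_BOOST = 0.030   # volts, extra applied by the card under load

effective_vcore = BASE_VCORE + LOAD_BOOST
print(f"Effective core voltage under load: {effective_vcore:.3f} V")  # ~1.630 V
```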


Oh, sorry, I didn't mean to sound like that or to reply to you alone. I meant to tell everyone in the discussion that the resolution is no longer 1024x768 like in previous Futuremark tests... but I didn't want to write too much, so I used the "FYI" abbreviation just to save time and space.

 

Back to the resolution subject: I don't think you understand the issue yet, since changing the resolution will only accomplish one thing: a completely different result from 99% of the other users, taking into account that free-version users will always outnumber registered ones.

 

To have a good results database, 1280x1024 has to be used; there is no alternative I can think of. Yeah, you can reduce it manually and watch your score go up... but I believe doing so would accomplish little, since the only difference between 06 and 05 other than the higher resolution is a small amount of textures.

 

Please forgive me for using a tone that was not needed in this discussion... however, I would like the same kind of apology from you for the use of similar language in your response. Knowing when I make a mistake isn't the same thing as accepting your sarcasm and passive aggression, Cythrawl. I hope this time I was clear enough for you to understand me.

 

Thanks for your attention.

 

That's OK, it was just that you quoted me, and it really does look like it's referring just to me...

 

So yes, I am sorry for replying in kind. However, when you quote one person and then reply in large bold letters, it looks like you are implying that I didn't know that and are questioning my post. Hence the sarcasm. Really, if you wanted to make a statement, quoting someone and replying with "FYI" in big letters ISN'T the way to do it. With a more constructive post that doesn't seem centered on one person, you wouldn't get such replies back at you.

 

If it were the other way around and I had posted that way in reply to one of your posts, you would feel the same way, which is why I posted the way I did.

 

Just my 2 cents on that.

 

Anyway, as I said before, I think the 1280x1024 thing is totally bogus. As stated before, some people can't even display that resolution on some CRTs, and I think they should patch it back down to 1024x768 to keep it consistent with the previous 3DMark, as you CANNOT do an objective comparison with the previous version due to the extra overhead put on the card by the higher resolution.
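To put a rough number on that extra overhead, here is a minimal Python sketch comparing the pixel counts of the two default resolutions; per-frame fill and shading work scales roughly with pixel count, though the exact score impact obviously depends on the card and the test:

```python
# Pixel-count comparison of the old and new default benchmark resolutions.
old_w, old_h = 1024, 768    # default up to 3DMark05
new_w, new_h = 1280, 1024   # 3DMark06 default

old_pixels = old_w * old_h              # 786,432
new_pixels = new_w * new_h              # 1,310,720
increase = (new_pixels / old_pixels - 1) * 100

print(f"{old_w}x{old_h}: {old_pixels:,} pixels per frame")
print(f"{new_w}x{new_h}: {new_pixels:,} pixels per frame")
print(f"Increase: about {increase:.0f}% more pixels to render")   # ~67%
```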

 

That is why I did them all at the same res (1024x768).

That way you can see a fair decrease in my results, instead of a bigger drop due to Futuremark's decision to push the resolution even higher and reduce scores even more... Why did they do this?

Who knows, but my theory (note that it's just my opinion) is that Futuremark is in cahoots with the video card manufacturers. They make a new 3DMark every couple of years to make the top-of-the-line cards at the time of release look worse than they actually are (I'll guarantee those cards run everything else just fine) in order to get people to upgrade to the latest and greatest, because it runs 3DMark great.

 

You only have to look at how many people in this thread are saying their score sux0rs, or that they're disappointed their scores don't make the grade.

 

I say get over it, people; it's a benchmark that really doesn't mean squat right now. Sure, it's probably telling you that your PC will struggle like that in a year or two in games (but only the cutting-edge ones at that), but right now it means nothing...

 

I don't know ANYONE who uses their PC for games and keeps the same video card for more than 12 months... not a one... but upgrading because of a benchmark score is just way too sad for me to comprehend...


Oh yeah, I see. My bad, NEO. Still, I can only tell you from my own personal experience: the lower refresh rate of 60Hz, as opposed to 100Hz, gave me a better score in 3DMark01 SE.

 

All I can think of that may have changed is the drivers' refresh-rate handling, since it was about two years ago that I noticed the difference and I haven't rechecked to see if it still holds true.

 

TGM

 

Well that's possible :).

I believe that if the video card is too stressed, lowering the refresh rate, and therefore the number of frames drawn, could help out.

Something like that :).

 

I've seen that before too, but on average I think a higher refresh rate is faster, and it's much smoother during gaming.

I mean, say you're spinning your tires in GTA:SA or something; with a higher refresh rate it's gonna look a lot smoother.

That would be a tiny speed increase right there.

On the other hand, take the same game, GTA:SA: a higher refresh rate may actually hurt performance in a few areas, I suppose.

That's just an example.

This probably holds true in other games and apps too, I think.

 

Take PSX emulation: back in the day I used a higher refresh rate, and in the areas where I already got the best FPS, I got even more FPS.

But in the areas where I didn't have much extra speed to spare, it hurt my performance.

Not by much though.

 

In the areas where it boosted speed, it was quite a bit.

 

 

 

As for the 1280x1024 thing...

I agree, not cool.

I would hope any resolution is selectable as standard.

 

I have no idea why they would pick such a weird resolution.

Perhaps the LCD thing...

 

If I were the dev, I would have picked 1024x768x32 as the default.

If I were to pick something higher...

That would be 1280x960x32, which is 2x 640x480 ;); it's all in the aspect ratios...

Or perhaps 1440x960x32, which is 720x480 x2, the NTSC standard but doubled...

 

But nope, they picked 1280x1024, which is 640x512 doubled. Are these guys from Europe or something???
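For what it's worth, here is a quick Python sketch of the aspect ratios being compared in this post (using 1440x960 for the doubled NTSC frame); it shows why 1280x1024 is the odd one out at 5:4, while the others are 4:3 or 3:2:

```python
from math import gcd

# Reduce each resolution discussed above to its simplest aspect ratio.
resolutions = [
    (1024, 768),    # old 3DMark default
    (1280, 960),    # 640x480 doubled
    (1440, 960),    # 720x480 (NTSC frame) doubled
    (1280, 1024),   # 3DMark06 default, 640x512 doubled
]

for w, h in resolutions:
    d = gcd(w, h)
    print(f"{w}x{h} -> {w // d}:{h // d}")
# 1024x768  -> 4:3
# 1280x960  -> 4:3
# 1440x960  -> 3:2
# 1280x1024 -> 5:4
```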

 

 

Edit:

I kept my Quadro4 for a long time before I got this 6600GT.

I mean a long time.

It sucked, yeah, but hey, not everyone can afford a $200-$500 card once a year.

So I've always made do.

I played Far Cry all right on the Quadro4... not as pretty as on my new card, or as fast, but when I first got this 6600GT, believe it or not, I'd had that Quadro4 for so long that, even though it was damaged (200MHz mem) in an accident, it was faster than the 6600GT ^^.

I was so used to the card that I had the drivers tweaked to the max for the GF4 Tis and the Quadro4s.

It wasn't until later that I got my 6600GT up to par and beyond.

 

Thing is, I still get 2-3 FPS at the beginning of the third map of Far Cry :(.

My detail settings are way past the norm, though, so I guess it can't be helped :(.


It doesn't make a difference on LCD screens; I benched at both 60 and 75Hz, no difference whatsoever.

 

Hmm, with 3DMark maybe, but if you bench something like Doom 3 you will find that there IS a difference.

 

Again, a real-world test and not a synthetic benchmark.

 

Plus, I notice tearing much more with VSync off when running at 60Hz vs. 75Hz, even on an LCD.

 

I would much rather run it at 75Hz...

 

If you have VSync on, however, you don't get any tearing, but your FPS in, say, FRAPS will only show you running at 60FPS max vs. 75FPS max in the same game...

 

So like I said, it DOES make a difference, albeit a small one.
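That FRAPS observation comes down to the frame rate being capped at the refresh rate whenever VSync is on. Here is a minimal Python sketch of that relationship, with made-up numbers purely for illustration:

```python
def displayed_fps(render_fps: float, refresh_hz: float, vsync: bool) -> float:
    """Approximate frame rate a counter like FRAPS would report.

    With VSync on, finished frames wait for the next refresh, so the
    reported rate cannot exceed the refresh rate. With VSync off, frames
    are shown as soon as they are ready, which is where tearing comes from.
    """
    return min(render_fps, refresh_hz) if vsync else render_fps

# A card that can render ~90 FPS in a given scene:
print(displayed_fps(90, 60, vsync=True))   # 60  -> capped at a 60 Hz refresh
print(displayed_fps(90, 75, vsync=True))   # 75  -> capped at a 75 Hz refresh
print(displayed_fps(90, 75, vsync=False))  # 90  -> uncapped, but may tear
```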



Yeah, I don't run a low refresh rate in games; it's just one of my odd little tweaks that seemingly gets me a slightly better 3DMark score :). I have a number of them, as any good tweaker does.

 

About the 1280x1024 res... I know it's a huge hit on performance, especially with a new stress test like 3DMark06, but I'm glad they chose that resolution. The native resolution of 19" LCDs, which seem to be the norm and easily affordable now, is 1280x1024. I still use the A+ Certified refurbished Hitachi 21" CRT monitor I picked up for 168 USD :D

 

TGM

