red930 Posted November 8, 2009 (edited)

I can't understand what you're trying to say. I'm thinking the same thing as Waco: tessellation renders faster. What are you trying to say with the line "what effort (GPU power) does a certain quality of image require to render"? Are you saying tessellation doesn't look as good? And what are you trying to say with "so your conclusion on behalf of implementation is above common logic" and "Before putting yourself in the judgement of truth"? Stop speaking in riddles.

I'm not speaking in riddles. I was making a simplified technical description. It's the same surface in both images: notice the complexity of the first, which has added tessellation; the second is "standard". The lines denote a certain standard surface, and this surface gets filled with additional surfaces through tessellation (to simplify). Now, take the first image as the standard. Its complexity is clear, yes? Rendering such an image with the standard method is much harder than with the solution called tessellation. However, adding this complexity to an image requires additional work.

Edited November 8, 2009 by Digitalis
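The simplified description above, a standard surface filled in with additional sub-surfaces, can be sketched in code. This is a hypothetical illustration (the function names and subdivision scheme are mine, not from the thread): each coarse triangle is split at its edge midpoints into four smaller ones, which is roughly the kind of geometry amplification a hardware tessellator performs from a small set of input patches.

```python
# Toy sketch of tessellation as geometry amplification: one coarse
# triangle is subdivided into four smaller ones by splitting each
# edge at its midpoint. Real GPU tessellators (e.g. DX11 hull/domain
# shaders) are far more sophisticated; this only shows the idea of
# getting many triangles out of a few.

def midpoint(a, b):
    return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

def subdivide(tri):
    """Split one triangle (a tuple of 3 vertices) into 4 smaller ones."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply `levels` rounds of subdivision to a list of triangles."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

coarse = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
fine = tessellate(coarse, 3)
print(len(fine))  # 4**3 = 64 triangles from a single coarse one
```

The point of the thread's argument follows from this: the coarse mesh is cheap to store and animate, and the detail is generated on demand, but producing and shading all those extra triangles is the "additional work" mentioned above.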
xchrissypoox Posted November 8, 2009 (edited)

I understand the concept of tessellation (though only at the basic level you described). I will stay out of this now because I don't know what is being argued; it seems like we're thinking the same thing, and I have trouble comprehending your comments.

Edited November 8, 2009 by xchrissypoox
Waco Posted November 8, 2009 (edited)

"I'm not speaking in riddles."

No, but you are posting in them: "Tessellation was not discovered with DX11 hardware, so your conclusion on behalf of implementation is above common logic." That doesn't even begin to make sense based on what I posted. I didn't say anything about DX11, nor about when tessellation was first implemented (with the Radeon 8 series, IIRC). DX11 is the first API to natively support hardware tessellation, though, and that's what we're talking about here. You're also arguing the exact same point as everyone else; I'm not sure why you're taking the tone you are.

Edited November 8, 2009 by Waco
The Smith Posted November 8, 2009 (edited)

Hey, I was thinking about this today. The Hydra chips should be used by a board manufacturer that does not make NVIDIA cards, so they would be completely independent. How many manufacturers qualify for that? Intel does. Sapphire, which manufactures a few AMD boards, also qualifies, but I'd be really surprised to see something from them. It would not be surprising from Intel, though, and I think that's really what Lucid needs to get their technology to lift off, if no other board manufacturers want to face NVIDIA's anger.

Edited November 8, 2009 by The Smith
Avinexis Posted November 8, 2009

Sorry for being a bit slow on the uptake... What is this thread about in the long run? I've never heard of Hydra before. It sounds interesting, but I can't quite get the gist of it all.
Fight Game Posted November 9, 2009

In short, it is a technology that not only allows ATI cards to work alongside NVIDIA cards (like CrossFire or SLI), but also scales them, so that if one card is a lot stronger, more of the load gets distributed to the stronger card. This would allow us to throw in any card and at least get SOME use out of it. I'm hating NVIDIA for trying to stop this.
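The scaling idea described above, splitting a frame's work in proportion to each card's strength rather than 50/50, can be sketched in a few lines. This is purely illustrative (the tile counts, performance scores, and card names below are made up, and real load balancing is dynamic per frame, not a one-shot calculation):

```python
# Toy sketch of proportional load balancing between mismatched GPUs:
# each card gets a share of the frame's tiles proportional to its
# performance score, so a card twice as strong does twice the work.
# Scores and names here are hypothetical, not Hydra's actual scheme.

def split_workload(total_tiles, gpu_scores):
    """Assign tiles to each GPU in proportion to its performance score."""
    total_score = sum(gpu_scores.values())
    shares = {gpu: (score * total_tiles) // total_score
              for gpu, score in gpu_scores.items()}
    # Integer division may leave a few tiles over; give them to the
    # fastest card so every tile is assigned.
    leftover = total_tiles - sum(shares.values())
    shares[max(gpu_scores, key=gpu_scores.get)] += leftover
    return shares

# e.g. a card paired with one half as strong:
print(split_workload(90, {"fast_card": 200, "slow_card": 100}))
# {'fast_card': 60, 'slow_card': 30}
```

That is the contrast with plain SLI/CrossFire alternate-frame rendering, where two mismatched cards would each take whole frames and the pair would tend to run at the slower card's pace.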
psycho_terror Posted November 9, 2009

"What is this thread about in the long run?"

People hating on NVIDIA because they could potentially benefit from Hydra's failure. It's like there's a motive and some poor circumstantial evidence, but no actual crime, and yet NVIDIA is being tried for the death penalty.
Waco Posted November 9, 2009

"The Hydra chips should be used by a board manufacturer that does not make NVIDIA cards, so they would be completely independent."

I think it'd be awesome if it was built into Intel chipsets.

"People hating on NVIDIA because they could potentially benefit from Hydra's failure."

Not quite, but okay.
IVIYTH0S Posted November 9, 2009

"What is this thread about in the long run?"

People like me defending NVIDIA just to further justify my purchase.
The Smith Posted November 9, 2009

"I think it'd be awesome if it was built into Intel chipsets."

No, it would not be awesome at all. IMO, it's much better as a dedicated chip that can be implemented on any platform. I'm all for diversity.
IVIYTH0S Posted November 9, 2009

"IMO, it's much better as a dedicated chip that can be implemented on any platform. I'm all for diversity."

+1
ekiM Posted November 9, 2009

"No, it would not be awesome at all. IMO, it's much better as a dedicated chip that can be implemented on any platform."

Having the chip on Intel boards would still be better than not having it at all.