Zertz

NVIDIA to kill GTX-series?


Sweet, I hope some curtains get peeked under and some whispers come down the lane.

Yeah, who knows what's happening, but my guess is that cards are coming to us pretty quickly. That's just a guess, of course :lol:

Yeah, who knows what's happening, but my guess is that cards are coming to us pretty quickly. That's just a guess, of course :lol:

I hope your guesses come true :thumbs-up:. Oh, happy day!!


This thread needs to make 99 pages. I hope it's scandalous news! Maybe something like: the CEO announced that he did not have sexual relations with that intern! Then we can debate whether Fermi or PhysX played the bigger part in it, and why ATI's CEO doesn't seem to be getting the same benefits.

 

All joking aside, of course. Any news is good news, tbh, when it comes to consumers deciding how and when to spend their money, imho.

Yeah, who knows what's happening, but my guess is that cards are coming to us pretty quickly. That's just a guess, of course :lol:

When is this happening? Today, the 20th? What time?

I would like to see what they can come up with in a month.

I don't expect Fermi until after the new year...

 

(Every time I type or read "Fermi" I think of that Black Eyed Peas girl :lol:)


Are we done arguing about PhysX being killed on ATI-centric platforms yet? Has ocre realized yet that nVidia spent money specifically to block that configuration? Can we finish this pointless argument?

 

/annoyed rant

But in all seriousness, I see the downscaling being unimaginably difficult, considering they JUST released downscaled versions of their 200 series cards... and those are bad. I hope that was the opposite of what's to come.

It's not really as bad as you think.

 

They just released the cut-down 200 series because they really didn't have a reason to before that...they still had the GTS 250/9800GT/8800GT to sell at good margins.

Yes, I agree with you on that; it was a shame they did it. I think it was a desperate move, and considering the circumstances it still wasn't a smart one, because the people who already had this setup were directly affected. I think that was a trade-off in their thinking, and they did leave the older drivers available. But you must see both sides: it could be six months after the 5870 before the GTX 300 series is available. Their ambitious project was a little too big to bite off at one time, and now they are in a bit of a pickle.

I am not defending their decision, nor do I think it was right. It's sad to see this move of desperation. They are just trying to give people a reason to buy the few cards they have in the warehouses. They haven't been making much profit from those cards anyway; the 4800 series was killing them, since they just couldn't compete in a price war and still make a profit. Now, with the 5800, it's even worse.

 

So that's the setup: they were trying to give people reasons to buy NVIDIA in the now, in the moment. The moment in which they aren't looking so competitive. It all led up to this, but this isn't the end of the line. I don't expect them to really push this ATI PhysX-disabling scheme, and I don't think it was meant to punish anyone. They are just trying to save their own arse, and it's a desperate move. This is the only edge they had at the moment, and they are exploiting it. Many people want to see NVIDIA go under. I think they are ambitious, and they have pushed with a little too much force this time. I do see results, and I do see their errors, but I still think they are doing a lot in a short period, and I respect that.

 

Arguing gets you somewhere :P

 

I can agree with this post.

It's not really as bad as you think.

 

They just released the cut-down 200 series because they really didn't have a reason to before that...they still had the GTS 250/9800GT/8800GT to sell at good margins.

But why bother now? The new 200s are awful; they could have just rolled with the 9 series as their midrange/low-end cards.

I think all the 9 series cards have fallen into their correct price points (not including the 9800GT, which should be as cheap as the 8800GT is).

When it comes to business there is no good guy or bad guy; there is profit. Remember the overheating nvidia mobile GPUs that they tried to ignore and hoped would just go away? Nvidia isn't altruistic. Yes, getting temporarily pushed out of the chipset business will hurt them, although I don't feel too sorry for them. How much do they charge for their SLI stamp of approval? How good were their chipsets over the last two years? They had great chipsets previously with Socket A boards, but they fell behind with Socket 775 of their own accord. I can't think of a single well-made nvidia chipset that has come out in quite some time. Will nvidia chipsets really be missed?

 

I don't think of nvidia as the evil villain, but I don't think of them as green-armored heroes either. It's their use of proprietary things like PhysX that casts them in a negative light. They may have offered it to other developers and manufacturers, but there were strings attached. I don't fault them from a business standpoint but rather from a consumer standpoint. Without hacks or patches I can't use an ATI card and an nvidia card at the same time, because nvidia decided to disable PhysX when an ATI card is also in the system. I bought the nvidia card for the PhysX. Now nvidia comes along and tells me, after I've bought the card, that I can't use it for my intended purpose. That I resent. Why can't I use both? Because nvidia doesn't want me to. Nvidia took my money in exchange for a video card; I own it, and they still treat it as if they own it. This doesn't hurt ATI as much as it hurts the consumer, and nvidia seems to have forgotten us when they made this decision. This applies to the average user, not the DIY self-builders who will take the time to find workarounds. ATI isn't innocent in this either.

 

It's like two kids on a playground who fight whenever they get near each other. Lucid has to come along and tell them to play nice or no more recess. Nobody gets to play kickball until these 2 jackasses figure out how to coexist. For now I dislike them both equally.

 

Let Intel push their Larrabee GPU. From what I've seen so far it's little threat: it underperforms most onboard nvidia solutions already in existence, and ATI's onboard solutions appear stronger than Larrabee as well. Intel can't charge a premium for it as it stands, due to the GPU's lack of performance in its early showings.

 

Good write-up, and I am surprised at how many good economic minds are in this forum. I think, as a whole, we get it. There are a few bleeding-heart types, but for the most part people here seem to respect competition and free markets.


I just got this email, and I would assume this was the big news they had for the day, but who knows; I guess we will find out later.

 

NVIDIA and EVGA would like to invite you to a new graphics card launch at the NVIDIA campus in Santa Clara, CA. No it

