ir_cow

I hate to say I told you so


Do you know what NDA is? NDA means NO DISCUSSION AGGREEMENT. Until the NDA is lifted all the benchmarks are BS. I am not saying any results on the net are accurate or unaccurate but Intel does not give discriminatory rights to TOM'S QUEER.

 

I met Anand from Anandtech and he has more clout than anybody period. If Intel does not give Anand any permission, TOM"S QUEER wont either.

First, you're wrong, it's non-disclosure agreement.

 

Second, do you really think any of us don't know what an NDA is? :blink:


 

Do you know what NDA is? NDA means NO DISCUSSION AGGREEMENT. Until the NDA is lifted all the benchmarks are BS. I am not saying any results on the net are accurate or unaccurate but Intel does not give discriminatory rights to TOM'S QUEER.

 

I met Anand from Anandtech and he has more clout than anybody period. If Intel does not give Anand any permission, TOM"S QUEER wont either.

Wow, and here I was all this time thinking NDA was Non-Disclosure Agreement...

 

 

:)


At any rate, one of my beefs with Intel is their iGPU strategy on mid-to-low-end CPUs. With Ivy Bridge, for instance, I would think it would be more beneficial to put an HD 4000 in all of the CPUs, rather than dumb the mid-to-low-end ones down with the HD 2500 and reserve the HD 4000 for the top-end parts. Anyone spending $200 - $300 on a CPU is most likely going to pair it with a discrete graphics card. Anyone spending less and relying on the iGPU for gaming would be far better off with AMD's APUs, which leaves Intel's mid-to-lower-end CPUs in kind of a rut.


 

Do you know what NDA is? NDA means NO DISCUSSION AGGREEMENT. Until the NDA is lifted all the benchmarks are BS. I am not saying any results on the net are accurate or unaccurate but Intel does not give discriminatory rights to TOM'S QUEER.

 

I met Anand from Anandtech and he has more clout than anybody period. If Intel does not give Anand any permission, TOM"S QUEER wont either.

First, you're wrong, it's non-disclosure agreement.

 

Second, do you really think any of us don't know what an NDA is? :blink:

 

ROFL, I have signed so many NDAs that I have lost count, and I just keep pretty much everything secret :wallbash:

 

Second, just because Anand did not get to go first does not mean Tom's would not get the chance. Tom's as a site carries a LOT of weight in the tech industry, and the fact that they published this shows that Intel did give them the go-ahead. No one risks losing Intel as a source.

 

 

 

At any rate, one of my beefs with Intel is their iGPU strategy on mid-to-low-end CPUs. With Ivy Bridge, for instance, I would think it would be more beneficial to put an HD 4000 in all of the CPUs, rather than dumb the mid-to-low-end ones down with the HD 2500 and reserve the HD 4000 for the top-end parts. Anyone spending $200 - $300 on a CPU is most likely going to pair it with a discrete graphics card. Anyone spending less and relying on the iGPU for gaming would be far better off with AMD's APUs, which leaves Intel's mid-to-lower-end CPUs in kind of a rut.

 

This is something that has never made any sense to me. How often is someone going to buy a top-of-the-line processor and NOT put in a discrete video card? I understand some people do, but that cannot be the norm. Drop down to the lower tier, though, and it is very much the norm. It would make a lot more sense to put the best iGPU in the lower-end chips.

Edited by ComputerEd


 

 

At any rate, one of my beefs with Intel is their iGPU strategy on mid-to-low-end CPUs. With Ivy Bridge, for instance, I would think it would be more beneficial to put an HD 4000 in all of the CPUs, rather than dumb the mid-to-low-end ones down with the HD 2500 and reserve the HD 4000 for the top-end parts. Anyone spending $200 - $300 on a CPU is most likely going to pair it with a discrete graphics card. Anyone spending less and relying on the iGPU for gaming would be far better off with AMD's APUs, which leaves Intel's mid-to-lower-end CPUs in kind of a rut.

 

This is something that has never made any sense to me. How often is someone going to buy a top-of-the-line processor and NOT put in a discrete video card? I understand some people do, but that cannot be the norm. Drop down to the lower tier, though, and it is very much the norm. It would make a lot more sense to put the best iGPU in the lower-end chips.

 

OMG Ed, we finally agreed on something! :)


At any rate, one of my beefs with Intel is their iGPU strategy on mid-to-low-end CPUs. With Ivy Bridge, for instance, I would think it would be more beneficial to put an HD 4000 in all of the CPUs, rather than dumb the mid-to-low-end ones down with the HD 2500 and reserve the HD 4000 for the top-end parts. Anyone spending $200 - $300 on a CPU is most likely going to pair it with a discrete graphics card. Anyone spending less and relying on the iGPU for gaming would be far better off with AMD's APUs, which leaves Intel's mid-to-lower-end CPUs in kind of a rut.

 

+3


On the flip side, AMD should do the same thing. An AMD A4-5300 with an HD 7660D for $54.99? That thing would sell like hotcakes!


Yes and no on the AMD side, and let me explain why. I agree that the lower-cost chip should have a lower-end GPU, but in the case of the APUs even the high-end ones are designed from the start to use the IGP and are considered that way in most builds. The entire APU line, all the way up to the A10, is considered their budget side. I do agree, however, that they need to look at beefing up the lower-cost chips' GPUs. I would love to see them offer a single IGP spread across the entire APU lineup; I'm just not sure that is feasible price-wise.


Oh... computer nerd drama, so spicy. :D

+1. I'd only care if there were concrete tests of laptop Haswell CPU battery life (and iGPU performance).

 

 

I also agree that the better on-chip graphics should be on the cheaper CPUs too (but then why would most people want anything higher than a newer-gen Pentium instead of an i3 or i5 for their laptops...)


 

I am not so sure it is a lack of competition, though that is a factor for sure, so much as a lack of need. Even the big professional apps that push the hardware are having trouble keeping the CPUs fully fed.

 

Yes, it is another important deciding factor behind Intel's strategy of virtually stalling architecture improvements.

And as for why Tom's did this 'leak' - who knows? And above all, who cares? :teehee:

 

For one, I do. It means every site that posts a review will be posting old news...


I saw the preview via a Facebook link to Tom's. The benchmarks were limited, and it looks like Intel did an AMD on this one with hand-picked Intel benchmarks (the ones showing the best increases) that do not show the entire picture. The reviewer also did not want to overclock a "borrowed" CPU. It was rather lame.

