

Just Lift Bro · 3,234 Posts · Discussion Starter #1
Quote:
Intel's Larrabee will launch eventually, but not as a GPU. The project has suffered a final delay that proved fatal to its graphics ambitions, so Intel will put the hardware out as a development platform for graphics and high-performance computing. But Intel's plans to make a GPU aren't dead; they've just been reset, with more news to come next year...
Source

So, no Larrabee GPU.


Something else soon, I hope.
 

Just Lift Bro · 3,234 Posts · Discussion Starter #4
Quote:
Originally Posted by Open1Your1Eyes0
Deceiving title is deceiving. Larrabee on LN2?

Better?
 

Registered · 1,884 Posts
Should read: Larrabee Axed.
 

Banned · 4,189 Posts
Quote:
Originally Posted by ericeod
How long has Larrabee been in the works? I've been hearing about it for quite some time now...

I think since 2007.
 

Registered · 226 Posts
Oh wow... couldn't they at least use it somehow for integrated graphics? This seems to be a huuuge bunch of money down the drain for Intel.
 

Registered · 2,555 Posts
Quote:
Originally Posted by Bluelightning
Oh wow... couldn't they at least use it somehow for integrated graphics? This seems to be a huuuge bunch of money down the drain for Intel.

Read the article? Please? Is that so much to ask?
 

Registered · 4,497 Posts
I'm taking this information with a grain of salt.
 

Registered · 226 Posts
Quote:
Originally Posted by Lelouch
Read the article? Please? Is that so much to ask?

I read it:
Quote:
If the fact that Larrabee is "launching" not as a GPU, but as a kind of multicore graphics demo unit, sounds like a cancellation to you, that's because it kinda sorta is.

I don't see this being all that profitable. Maybe I'm wrong, but it's still a lot of money down the drain. Although they are reiterating their commitment to developing a GPU... so it could also be considered a very expensive learning experience. Though I'm certain they will reuse parts of it for whatever they have awaiting us next year.
 

Registered · 3,936 Posts
Quote:
Originally Posted by 45nm
I'm taking this information with a grain of salt.

Ars is usually reputable... I'm not saying they are correct, but this isn't Semi-Accurate, Fudzilla, or The Inquirer we're talking about.
 

Registered · 2,555 Posts
Quote:
Originally Posted by Bluelightning
I read it

I don't see this being all that profitable. Maybe I'm wrong, but it's still a lot of money down the drain.

Oh, did you now? Opening the article in a tab, reading the first paragraph, skimming past every other paragraph, and then posting is not reading the article.

Quote:
Even though Intel couldn't have the Larrabee software ready on a timeframe that would make it competitive with NVIDIA and ATI (again, Larrabee is really a hardware/software hybrid GPU), the chipmaker can still push out the hardware itself and let others have a go at using it for graphics and HPC. Hence the plan to release it as a development platform for doing multi- and many-core research for HPC and graphics.

They aren't just scrapping it completely. As a GPU, maybe, but for other things, no.

This would be obvious if you really had read the article.

And I'm sorry for picking on you, but I'm tired of people not reading articles and then complaining about **** that was addressed in the article.
 

Senior Member · 15,777 Posts
Larrabee Forever?
 

Registered · 226 Posts
Quote:
Originally Posted by Lelouch
Oh, did you now? Opening the article in a tab, reading the first paragraph, skimming past every other paragraph, and then posting is not reading the article.

They aren't just scrapping it completely. As a GPU, maybe, but for other things, no.

This would be obvious if you really had read the article.

And I'm sorry for picking on you, but I'm tired of people not reading articles and then complaining about **** that was addressed in the article.

Again... I did read the article. Honestly. However, that paragraph is listing their options with it. It still won't be mass-produced as a GPU, and it's still a considerable loss of profit for them. In that first post I was just hinting at what I thought would be a smart way for it to be used. However, I will admit that I don't know enough about Larrabee to say whether it could realistically be used for integrated graphics in its current state.
 

Premium Member · 891 Posts
Quote:
Originally Posted by Lelouch
Oh, did you now? Opening the article in a tab, reading the first paragraph, skimming past every other paragraph, and then posting is not reading the article.

They aren't just scrapping it completely. As a GPU, maybe, but for other things, no.

This would be obvious if you really had read the article.

And I'm sorry for picking on you, but I'm tired of people not reading articles and then complaining about **** that was addressed in the article.

He's wondering why Intel put all this money into a product they intended to release to the mainstream, only to pull out of the market and forfeit any profit they might have made that way. Having a new development/ray-tracing demo unit is great, but there's no way you can argue that this is Intel's ideal outcome for this product. None of his posts implied that Intel was scrapping the product; he just wanted to know why so many development dollars went into this, only to have no mainstream (or even integrated, hence his first post) component to show for it.

SELF EDIT, SEE REASON: I still think Intel may use the knowledge they've gained from this to release some form of on-CPU integrated solution. Evidently they'll soon have cores to spare.

http://www.overclock.net/hardware-ne...processor.html
 