

Premium Member · 7,245 Posts · Discussion Starter #1
The guys at the INQ have been talking about a merged CPU+GPU for almost a month, and it seems they were right. DailyTech introduces the new AMD CPU here: http://www.dailytech.com/article.aspx?newsid=4696
Quote:
AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today's CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing. With Fusion processors, AMD will continue to promote an open platform and encourage companies throughout the ecosystem to create innovative new co-processing solutions aimed at further optimizing specific workloads. AMD-powered Fusion platforms will continue to fully support high-end discrete graphics, physics accelerators, and other PCI Express-based solutions to meet the ever-increasing needs of the most demanding enthusiast end-users.
*EDIT: TechReport has some additional details: http://techreport.com/

**EDIT: Tom has something to say: http://www.tgdaily.com/2006/10/25/amd_announces_fusion_processor/
Quote:
Fusion is expected to make its debut in the late 2008/early 2009 timeframe. AMD said that it will use the technology within all major computing segments, including mobile, desktop, workstation, server, consumer electronics as well as products for emerging markets. Further details were not provided, but it is obvious that AMD can play with several ideas, ranging from strategies that reduce the cost of today's CPU/GPU combinations to high-performance platforms that leverage the floating point capabilities of graphics engines. AMD believes that "modular processor designs leveraging both CPU and GPU compute capabilities will be essential in meeting the requirements of computing in 2008 and beyond."
 

Registered · 2,260 Posts
I say it's about time :thumb:
 

Registered · 787 Posts
If it's going to be the heat of the CPU plus the GPU, then they'd better throw in a TTBT as the stock cooler. :D
 

Registered · 1,923 Posts
Quote:
Originally Posted by StepsAscend
Whoever does this right first could dominate the notebook and SFF and HTPC markets.

Yeah, but not the overclocking market! :P

I don't really like the idea... it makes things hard for budget builders, i.e. "I'll buy the best CPU now and a cheap graphics card until Christmas, then buy a good one." You wouldn't be able to do that anymore, would you? :rolleyes:

I'm sure there are many other flaws with this design... but I just woke up. :D
 

Premium Member · 7,245 Posts · Discussion Starter #6
What about NVIDIA? Some interesting assumptions about the future of x86 CPUs here: http://uk.theinquirer.net/?article=35281
 

Registered · 409 Posts
This new technology won't change anything for the most part. What they're describing is taking a CPU and rewriting its architecture so it performs best at graphics. So what I see happening is either a second socket on the mobo for the GPU, or a card like today's with a removable GPU, where upgrading performance just means slipping in a new chip.

Either way, being able to buy exactly the GPU you want would be a great improvement for prices. People who only need a processor for CAD could buy a GPU coded exactly for that, and if gamers only plan on playing games, it could be coded just for gaming. Just as with a CPU, you'd need to buy RAM, so why upgrade the entire card if you just want a little more memory? Slip in some more DDR4 or whatever, and there you go. The major advantage for overclockers is that it would work just like a CPU, so overclocking could be done with the multiplier or FSB, as sketched below. That would make FPS shoot through the roof, unlike what current video card overclocking does.
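A rough sketch of that multiplier/FSB arithmetic in Python; the clocks and multipliers here are purely hypothetical, since no Fusion specs exist yet:

Code:
# Hypothetical sketch: core clock = reference (FSB) clock x multiplier,
# the same arithmetic used for CPU overclocking today.
def effective_clock_mhz(fsb_mhz, multiplier):
    """Core clock resulting from a given FSB and multiplier."""
    return fsb_mhz * multiplier

print(effective_clock_mhz(200, 10))  # stock: 2000.0 MHz
print(effective_clock_mhz(250, 10))  # 25% FSB overclock: 2500.0 MHz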
 

Registered · 787 Posts
I like the idea of the second socket. IMO, the CPU+GPU on one die (or under one IHS) is a bad idea.
 

Premium Member · 4,807 Posts
I wonder how they plan to put all of that on one piece of silicon. GPUs are getting bigger and bigger, and so are CPUs, with quad core right around the corner; this is going to be a HUGE chip. Also, if it's done as a single die, that would dramatically lower yields, since it would have so many transistors.
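To get a feel for why a merged die hurts yield so much, here's a quick sketch using the classic Poisson yield model, Y = exp(-A x D0). The defect density and die areas below are made-up illustrative numbers, not AMD figures:

Code:
import math

def poisson_yield(die_area_cm2, defect_density_per_cm2):
    """Poisson yield model: fraction of dice that have zero defects."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

D0 = 0.5  # assumed defects per cm^2 (illustrative only)

# A standalone CPU die vs. a merged CPU+GPU die with roughly twice the area.
print(f"CPU-only (1.4 cm^2): {poisson_yield(1.4, D0):.0%}")  # ~50%
print(f"CPU+GPU  (2.8 cm^2): {poisson_yield(2.8, D0):.0%}")  # ~25%

Doubling the die area doesn't just halve the yield; it squares the zero-defect probability, which is why huge monolithic dice get so expensive to produce.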
 

OVERCLOCK IT OR DIE · 7,335 Posts
I envision a day where one whole half of the mobo is for our 4 CPUs, 2 GPUs and 1 PPU, and the cooling is one large slab of copper covering it all, with just the memory, cable slots, and back-panel plugs hanging out.
 

Registered · 1,693 Posts
For what it's actually worth, AMD made a smart move. Piggybacking off one of the best card makers in the world for revenue isn't a bad idea during the investment portion of R&D for their new multi-core chips. When the moguls gather again for their mortal combat, AMD can at least outshine Intel with the fact that their video cards come from their newly acquired development team, and that the cards have improved because of it.

I hope the fact that AMD bought ATI doesn't drive the devout ATI card buyers away from that market. The R600, and anything else that comes of breakthroughs and epiphanies, may be a stronger chip if AMD lends a hand. Hell, there's even a possibility they drop a couple of their socket chipsets into an LPE-series card, starting a unifying trend in CPU/GPU frequency tie-ins and in how they use the RAM in the PC and on the card.

There's always something new about a computer that makes someone want more: physics engines that take the brunt of vector, graphics, and trig/geometry calculations and push them to the limit; home sound cards with RAM for more voice and synth counts than ever before; hardware rails for new technologies and applications; or the chips themselves that become our wildest dream in gaming or processing. Graphics and processors will always need improvement, at least in some small facet of detail or design. We like the looks hot. We love the performance big. Now let's just get them home without breaking the bank.
 

Premium Member · 8,373 Posts
Quote:
Originally Posted by Modki
I envision a day where one whole half of the mobo is for our 4 CPUs, 2 GPUs and 1 PPU, and the cooling is one large slab of copper covering it all, with just the memory, cable slots, and back-panel plugs hanging out.

More like the PC would just be one big chunk of copper with cables running out of it... that thing is going to be heavy.
 

OVERCLOCK IT OR DIE · 7,335 Posts
Well, we're seeing unbelievable speeds on CPUs right now with passive cooling. Who knows, maybe future chips will run much cooler.
 

Registered · 4,996 Posts
It's probably aimed at people who would otherwise buy graphics integrated on the mobo; now it's integrated into the processor instead. Saves space, maybe? And it's probably faster.
 

Registered · 1,693 Posts
Quote:
Originally Posted by Modki
Well, we're seeing unbelievable speeds on CPUs right now with passive cooling. Who knows, maybe future chips will run much cooler.

That would mean they'd have to build them to last. Why would a company build a chip that didn't need to be replaced or outmoded as soon as it was built? That'd be like an honest politician, a dependable car, and a rock band that still rocks 60 years later, all rolled into one.

And besides, the thinner the materials in a part, the less resistance it has to electrical wear and tear. Unless they make them more durable, prices will stay the same and users will get the same performance, no matter how much they try to stabilize or analyze what's actually happening under the hood.
 

Banned · 1,195 Posts
It's a trivial technology with an even more trivial future. It will work for workstations, cheaper basic work machines, servers, and the like. Enthusiasts? Short answer: no. A high-end CPU/GPU solution might shave a few watts off in the end, but it would be far too expensive at this point, and by far very impractical. Intel has a better chance of taking off with a technology like this, because they'll already have 45nm down before 2009; with that, they're pretty much holding a piece of the puzzle for making something like this happen. Keep in mind I'm not biased on this, it IS an opinion, but take a moment to consider that AMD hasn't even released 65nm parts yet, let alone gotten comfortable with the process, so they don't seem to be in much of a position to really start this kind of project, even in 2009. It'll have its issues in the beginning, as every initial release does, but it might turn out to be beneficial in the more mainstream markets, just not enthusiast.
 