Overclock.net › Forums › Graphics Cards › AMD/ATI › some questions of the future.gpu and resolution(bottleneck too)

some questions of the future.gpu and resolution(bottleneck too)

post #1 of 9
Thread Starter 
Well, I was thinking of upgrading my GPU in 2015, given that I bought a new CPU (i7 2600K), mobo, and RAM. But I have a 1366x768 monitor, which I want to replace with a 1920x1080 one at 120Hz, and I have some doubts:

When will AMD use stacked DRAM on their consumer-level cards?
Will AMD continue using full compute power (FP64) on their consumer-level cards?
Would a 2016 mid-high range GPU (or a 2015 high-end GPU) be bottlenecked by a 2600K at 4.8GHz at 1080p?
Would a higher refresh rate (120Hz) help with powerful cards?

Could Mantle be a determining factor for choosing AMD over Nvidia?
Edited by PontiacGTX - 3/26/14 at 11:41am
post #2 of 9
When will AMD use stacked DRAM on their consumer-level cards?

Unknown, but probably not for a pretty long time. Even with traditional memory setups you can get 6GB on a card now, and as time goes on there will potentially be higher-density chips available, so it's not unfeasible that there could be 12GB cards without stacking. So really there isn't much incentive for them to put stacked memory on a consumer card.

Will AMD continue using full compute power (FP64) on their consumer-level cards?

Probably, as AMD is trying to push compute into the mainstream.

Would a 2016 mid-high range GPU (or a 2015 high-end GPU) be bottlenecked by a 2600K at 4.8GHz at 1080p?
Would a higher refresh rate (120Hz) help with powerful cards?

As there is no solid info on how fast cards will be a year or two from now, there is no sure way to know. But if current trends hold, then mid-range cards of the next generation (I consider the R9 280X current mid-range) will probably sit in the same range as the current top-of-the-line stuff. So in short, if you buy something in the $250-350 price bracket, there should be no issues combining it with a 2600K.

120Hz won't really help with anything like that. You just get the ability to display higher FPS (assuming the GPU and CPU can push 60+) and reduced motion blur (monitor blur, not in-game blur).

Could Mantle be a determining factor for choosing AMD over Nvidia?

Determining? No, but it is a consideration. Mantle does have the potential to reduce CPU overhead as well as get a bit more performance out of the GPU, but APIs are only as good as the developers who use them, and thus far the results in real-world games have been less than impressive.

IMO, like PhysX, it's a debatable advantage and not really something that should be emphasized too much when selecting a GPU. There are usually more important factors to consider when selecting a card, like price/general bang for buck, size, power draw, noise, etc. If the cards are so similar that the only thing setting them apart is PhysX/Mantle, then that choice will mostly be determined by what games use said features and what games you are going to play.
    
CPU: i7 920 OC 4.0 GHz 1.35v HT on | Motherboard: Asus P6T 1366 SLI | Graphics: Gigabyte GTX 970 | RAM: OCZ 12GB DDR3 GOLD/Platinum mix
Hard drives: Samsung Spinpoint F1 1TB | Western Digital 300GB | Western Digital Caviar Blue 1TB | Samsung 840 EVO 250GB
Optical drive: generic LG DVD | Cooling: WC'd, Supreme LT, NexXxoS Xtreme III 360, Phob... | OS: Windows 7 x64
Monitors: Samsung 27" LED S27A550B | Acer 23" X233H | Dell U3415W | Keyboard: Logitech G11 | Power: Chieftec 850W
Case: Phanteks Enthoo Luxe | Mouse: Razer DeathAdder Respawn | Mouse pad: SteelSeries QcK | Audio: Yamaha HTR-6130 AV receiver | Yamaha NS-50B floor tower speakers | Miditech Audiolink II stereo sound card
    
post #3 of 9
Thread Starter 
Quote:
Originally Posted by Bit_reaper View Post

When will AMD use stacked DRAM on their consumer-level cards?

Unknown, but probably not for a pretty long time. Even with traditional memory setups you can get 6GB on a card now, and as time goes on there will potentially be higher-density chips available, so it's not unfeasible that there could be 12GB cards without stacking. So really there isn't much incentive for them to put stacked memory on a consumer card.

Will AMD continue using full compute power (FP64) on their consumer-level cards?

Probably, as AMD is trying to push compute into the mainstream.

Would a 2016 mid-high range GPU (or a 2015 high-end GPU) be bottlenecked by a 2600K at 4.8GHz at 1080p?
Would a higher refresh rate (120Hz) help with powerful cards?

As there is no solid info on how fast cards will be a year or two from now, there is no sure way to know. But if current trends hold, then mid-range cards of the next generation (I consider the R9 280X current mid-range) will probably sit in the same range as the current top-of-the-line stuff. So in short, if you buy something in the $250-350 price bracket, there should be no issues combining it with a 2600K.

120Hz won't really help with anything like that. You just get the ability to display higher FPS (assuming the GPU and CPU can push 60+) and reduced motion blur (monitor blur, not in-game blur).

Could Mantle be a determining factor for choosing AMD over Nvidia?

Determining? No, but it is a consideration. Mantle does have the potential to reduce CPU overhead as well as get a bit more performance out of the GPU, but APIs are only as good as the developers who use them, and thus far the results in real-world games have been less than impressive.

IMO, like PhysX, it's a debatable advantage and not really something that should be emphasized too much when selecting a GPU. There are usually more important factors to consider when selecting a card, like price/general bang for buck, size, power draw, noise, etc. If the cards are so similar that the only thing setting them apart is PhysX/Mantle, then that choice will mostly be determined by what games use said features and what games you are going to play.

1. My concern was memory bandwidth.

2. That's a plus for AMD if they offer stacked DRAM too.

3. What about waiting for a dual-GPU setup (2015 GPUs) or a single $450 card from 2016? i7 @ 4.8GHz and 1920x1080.

4. That was my guess... maybe it will be useful for high-end cards? But what's the effect on FPS with a 120 cap when there is a bottleneck? Or FPS spikes? Would it be worse than having 60Hz with that as the top?

5. But an exclusive API gives extra choices to make the PC last longer (I would keep the CPU and GPU 3-4 years from now).

Thanks for your reply
post #4 of 9
Almost all 1920x1080 monitors today are LED and have a native refresh rate, or pixel refresh, of 60. 120 and 240 are simply gimmicks to make people think one monitor is better than another; the same can be said for monitors claiming "1 million colors". They are all gimmicks. The 120 you see is supposed to make movies have less blur; however, many movies have been made at as low as 24 fps, and a fake 120 will do nothing for them.

The same can be said of playing games: can you actually tell the difference between 60 fps and 120?

Find a good name brand, such as Samsung, and stay away from the trash. I preferred the now-older high-quality LCD over LED; however, the industry is going LED.

I have a 40-inch Samsung monitor that sits 2 feet from my face to play games on, and it does OK, especially for the money. It uses about 40 watts less than my old 32-inch LCD [it was not a Samsung]. [My Samsung's official designation is: Samsung 5000 40" 1080p Clear Motion Rate 120 LED-LCD HDTV - UN40F5000AFXZA.] Its native refresh rate is 60, and it does not come with a cable. I cannot remember what I paid for it because I purchased other items in that order; however, I think it was around $400. To get a price from Newegg at the moment you have to begin the purchase procedure, but you can back out of it after you get the price, because they are not showing it on the web page. The Newegg page for that monitor is here:

http://www.newegg.com/Product/Product.aspx?Item=N82E16889354264

So, you want to find a monitor that will be the proper size. Also, if you want to use a VGA adapter and cable, the R9 290 and R9 290X do not support those cables. I use an HDMI cable [and HDMI can only send 60fps, so 120 is worthless].

Also, if you find a screen whose native refresh rate is 120, you might want to purchase a GeForce 770 or 780, or an AMD R9 280X, and use a VGA adapter and cable, which is probably the cable you are using now.

EDIT: More information


Edited by Dan848 - 3/29/14 at 8:42am
post #5 of 9
What? A 120Hz monitor is not a gimmick. You mean most TVs use interpolation to achieve "120" or "240" Hz. There are now monitors with a native 120Hz refresh rate from Asus, BenQ, Samsung, Eizo, etc.
Workstation (4 items): CPU: Xeon E5-2690 | Motherboard: Supermicro 2011 | Graphics: Nvidia GP100/Vega FE | Monitor: Dell UltraSharp 4K
post #6 of 9
PontiacGTX,

Quote, "1. My concern was memory bandwidth.

2. That's a plus for AMD if they offer stacked DRAM too.

3. What about waiting for a dual-GPU setup (2015 GPUs) or a single $450 card from 2016? i7 @ 4.8GHz and 1920x1080.

4. That was my guess... maybe it will be useful for high-end cards? But what's the effect on FPS with a 120 cap when there is a bottleneck? Or FPS spikes? Would it be worse than having 60Hz with that as the top?

5. But an exclusive API gives extra choices to make the PC last longer (I would keep the CPU and GPU 3-4 years from now)."

Answers,

1.) Memory bandwidth on my R9 290 [as on the R9 290X] comes from a 512-bit bus, so the memory bandwidth will be around 320 GB/s. The R9 280X was around 288 GB/s. If you overclock the card a little, the memory bandwidth will increase.
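Those figures fall out of a simple formula: bus width in bytes times the effective per-pin data rate. A quick back-of-the-envelope sketch (the 5.0 and 6.0 Gbps figures are the stock GDDR5 rates for the reference cards):

```python
def mem_bandwidth_gb_s(bus_width_bits, effective_gbps):
    # bytes per second = (bus width in bytes) x (transfers per pin per second)
    return bus_width_bits / 8 * effective_gbps

print(mem_bandwidth_gb_s(512, 5.0))  # R9 290 / 290X at stock: 320.0 GB/s
print(mem_bandwidth_gb_s(384, 6.0))  # R9 280X at stock: 288.0 GB/s
```

This is also why a memory overclock raises bandwidth: the bus width is fixed, so bandwidth scales linearly with the effective memory clock.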

2.) Do not worry about stacked VRAM on an R9 290X or R9 290 [especially with an overclock down the road]. Those cards should perform for at least 3 years before you might start looking for something faster.

3.) You will not need more than one high-end video card with a 1920x1080 screen; you might need 2 cards if a game is fully DX12. Most games still support DX9, so DX11 will be used for years to come.

4.) I am not exactly sure what you mean with this question, other than the fps hit with a 120 refresh rate; I am not sure what you mean by "...would it be worse than having 60Hz and the top would be that?" Again, if the native refresh rate of your screen is 60 with a 120 capability, it is best to run the screen at 60Hz, or things could get ugly or dangerous beyond the native rate. As I mentioned above, an LCD or LED with a native refresh rate of 60 is all you will need; 120 and 240 are gimmicks.
I lock my games at 60fps, even though I can go much higher. I can see no difference above 60 fps. Many, if not most, people will not notice a difference above 30fps [except possibly in a flight simulator or a very fast-paced shooter]. My eyes have been trained over many years to see at least 60 fps, and while watching TV I can sometimes see individual frames in a movie [which is troubling because it takes some of the realism away], because my eyes have been trained to see much higher fps in games and simulators.
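To put the 60 vs. 120 debate in numbers: the time the CPU and GPU have to produce each frame halves when the target rate doubles, which is why sustaining 120 fps is much harder than sustaining 60. A small sketch of the arithmetic:

```python
def frame_budget_ms(target_fps):
    # milliseconds available to render one frame at a steady target rate
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 120 fps the whole pipeline gets roughly 8.33 ms per frame instead of 16.67 ms, so any CPU or GPU bottleneck shows up as dropped frames much sooner.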

5.) An AMD R9 290 or 290X or a GTX 780 Ti will last 3 to 4 years easily.

On a separate issue, a motherboard based on a Z77 or later chipset will give you PCIe 3.0, and it will take a very powerful video card to saturate that, something that does not exist yet.

I have no problem running games with lots of eye candy on my R9 290 [my i5 3570K runs at 4.0GHz and easily overclocks to 4.6GHz on a good air cooler (overclocking is a matter of luck; you might get a CPU that overclocks well, or you may not, and the same goes for video cards)].
This may seem odd; however, fast system RAM will make CPU overclocking easier, and 8GB to 16GB will make demanding games less choppy. I noticed a difference going from 8GB to 16GB; you should have at least 8GB for gaming.
post #7 of 9
Quote:
Originally Posted by sugarhell View Post

What? A 120Hz monitor is not a gimmick. You mean most TVs use interpolation to achieve "120" or "240" Hz. There are now monitors with a native 120Hz refresh rate from Asus, BenQ, Samsung, Eizo, etc.

I hope this helps:

http://www.rtings.com/info/what-is-the-refresh-rate


Edit:

The figures I was using, especially refresh rate 60 and CMR 120, were due to my interpretation that the OP was working within a specific budget; for example, I believe he said he just purchased a 2600K CPU, which, to me, indicated a specific budget. However, I am assuming he is currently working off a limited budget.
Edited by Dan848 - 3/29/14 at 9:55am
post #8 of 9
Thread Starter 
Quote:
Originally Posted by Dan848 View Post

PontiacGTX,

Quote, "1. My concern was memory bandwidth.

2. That's a plus for AMD if they offer stacked DRAM too.

3. What about waiting for a dual-GPU setup (2015 GPUs) or a single $450 card from 2016? i7 @ 4.8GHz and 1920x1080.

4. That was my guess... maybe it will be useful for high-end cards? But what's the effect on FPS with a 120 cap when there is a bottleneck? Or FPS spikes? Would it be worse than having 60Hz with that as the top?

5. But an exclusive API gives extra choices to make the PC last longer (I would keep the CPU and GPU 3-4 years from now)."

Answers,

1.) Memory bandwidth on my R9 290 [as on the R9 290X] comes from a 512-bit bus, so the memory bandwidth will be around 320 GB/s. The R9 280X was around 288 GB/s. If you overclock the card a little, the memory bandwidth will increase.

2.) Do not worry about stacked VRAM on an R9 290X or R9 290 [especially with an overclock down the road]. Those cards should perform for at least 3 years before you might start looking for something faster.

3.) You will not need more than one high-end video card with a 1920x1080 screen; you might need 2 cards if a game is fully DX12. Most games still support DX9, so DX11 will be used for years to come.

4.) I am not exactly sure what you mean with this question, other than the fps hit with a 120 refresh rate; I am not sure what you mean by "...would it be worse than having 60Hz and the top would be that?" Again, if the native refresh rate of your screen is 60 with a 120 capability, it is best to run the screen at 60Hz, or things could get ugly or dangerous beyond the native rate. As I mentioned above, an LCD or LED with a native refresh rate of 60 is all you will need; 120 and 240 are gimmicks.
I lock my games at 60fps, even though I can go much higher. I can see no difference above 60 fps. Many, if not most, people will not notice a difference above 30fps [except possibly in a flight simulator or a very fast-paced shooter]. My eyes have been trained over many years to see at least 60 fps, and while watching TV I can sometimes see individual frames in a movie [which is troubling because it takes some of the realism away], because my eyes have been trained to see much higher fps in games and simulators.

5.) An AMD R9 290 or 290X or a GTX 780 Ti will last 3 to 4 years easily.

On a separate issue, a motherboard based on a Z77 or later chipset will give you PCIe 3.0, and it will take a very powerful video card to saturate that, something that does not exist yet.

I have no problem running games with lots of eye candy on my R9 290 [my i5 3570K runs at 4.0GHz and easily overclocks to 4.6GHz on a good air cooler (overclocking is a matter of luck; you might get a CPU that overclocks well, or you may not, and the same goes for video cards)].
This may seem odd; however, fast system RAM will make CPU overclocking easier, and 8GB to 16GB will make demanding games less choppy. I noticed a difference going from 8GB to 16GB; you should have at least 8GB for gaming.

1. This point was exclusively for Bit_reaper's reply.

2. I can only spend 300 USD per year on online purchases, so I saved 200 this year; plus 300 from 2015, that's 500 USD in total. But I want a bit of future-proofing. Maybe the R9 290 will be cheaper, but the PSU I would need for it would be a bit bigger, so I prefer new hardware. And stacked DRAM will bring 1000GB/s to the table. Maybe overkill for 1080p... oh well.

3. Well, as I said, I want a bit of future-proofing... DX12 is more like an answer to Mantle, to share the GPU's load with the CPU and maybe optimize some points of DX11.1.

4. My point there was: what would be the effect of a bottleneck when using a powerful GPU with a 120Hz refresh rate?

5. Yes, I bought 8GB of DDR3 2133MHz with an i7 2600K and a Z68 mobo.

Thanks
post #9 of 9
Quote:
Originally Posted by PontiacGTX View Post

1. This point was exclusively for Bit_reaper's reply.

2. I can only spend 300 USD per year on online purchases, so I saved 200 this year; plus 300 from 2015, that's 500 USD in total. But I want a bit of future-proofing. Maybe the R9 290 will be cheaper, but the PSU I would need for it would be a bit bigger, so I prefer new hardware. And stacked DRAM will bring 1000GB/s to the table. Maybe overkill for 1080p... oh well.

3. Well, as I said, I want a bit of future-proofing... DX12 is more like an answer to Mantle, to share the GPU's load with the CPU and maybe optimize some points of DX11.1.

4. My point there was: what would be the effect of a bottleneck when using a powerful GPU with a 120Hz refresh rate?

5. Yes, I bought 8GB of DDR3 2133MHz with an i7 2600K and a Z68 mobo.

Thanks

I am sorry that I posted to the wrong person; thank you for bringing it to my attention.


2.) I doubt the R9 290 will be less expensive until possibly the 3rd or, more likely, the 4th quarter of 2015, or the 1st quarter of 2016. The world economy is still in bad shape, so R&D will depend on how much money is floating around or available [look for increased taxes to eat into the income of most people, and hope that more people find good jobs].

I expect several variations of the current PCB components, and possibly slight changes to the GPU if the economy is in bad shape.

Prices have fallen for most cards already, after they went up $150 because bitcoin miners were buying most of them. I purchased my R9 290 several minutes after they became available; the bad part of that was no specialized coolers [reference design only], and the good part is I paid $400. Although I have a Sapphire card, MSI Afterburner works well.

Power supply: buy quality. There are several good companies. I have a 650 watt [Corsair TX650M, semi-modular], and it has no problem providing enough power for my i5 3570K at 4.4GHz [it will easily do 4.6GHz; however, it is not necessary] and overclocking my R9 290 to a core clock of 1119MHz [with a 10% bump up in power] and a memory overclock to 1401 [x4 = 5604]. I only use air for cooling, and, as I mentioned, the video card is reference. I do not want water cooling to try to push clocks higher; I do not need it. In fact, I do not need more than 4.0GHz on the CPU and an extra 5% on the GPU to give me very high to maximum eye candy in games.
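The [x4 = 5604] figure is just GDDR5's quad data rate: the effective transfer rate is four times the base memory clock, and the card's bandwidth follows from that rate and its 512-bit bus. A sketch of the arithmetic behind that overclock:

```python
def gddr5_effective_mtps(base_clock_mhz):
    # GDDR5 transfers data four times per clock cycle (quad data rate)
    return base_clock_mhz * 4

def bandwidth_gb_s(bus_width_bits, effective_mtps):
    # MT/s x (bus width in bytes), scaled from MB/s to GB/s
    return bus_width_bits / 8 * effective_mtps / 1000

rate = gddr5_effective_mtps(1401)   # 5604 MT/s, matching the [x4 = 5604] above
print(bandwidth_gb_s(512, rate))    # ~358.7 GB/s, up from ~320 GB/s at stock
```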

If you are going to be doing a lot of demanding benchmarks, I suggest a good 750 watt power supply, because heavy-load benchmarks demand much more power than games. Hope that helps.

I do not see a bottleneck with the CPU if it is overclocked to around 4.2GHz to 4.4GHz, even in games that use the CPU heavily [most games still do not use more than 2 cores; a few use 4]. Current games are not slowed down at very high to max or ultra settings at 1920x1080 if your 2600K is running around 4.4GHz with an R9 290 overclocked by 5% [if you have good case cooling and good CPU and GPU cooling you can safely overclock an R9 290 by 5%; +10% is very easy to reach on a reference unit in a well-cooled case].

Note that even though I overclock and use air, I do not push it so far as to cause damage; I keep a close eye on temperatures. I want my equipment to last a very long time.

4.) 120hz refresh rate:

The answer to that question is determined by the game you play: some are demanding on the CPU, some on the video card, and some on both. So it depends on the games you play, though more and more are becoming GPU-dependent. Look up your games; do a search for "[name of game] demands on CPU or GPU". With the proper CPU [yours at 4.2 to 4.4GHz], the R9 290 does OK even on 4K screens, and that is plenty of power compared to a 1920x1080 screen at a 120 refresh rate. The R9 290 and 290X excel as resolution is increased, especially if slightly overclocked; an R9 290 is faster than the best GeForce currently on the market. Frame rates will drop, but not so much that you will find yourself playing in nothing but chop. Remember the current R9 290s use a 4GB frame buffer [VRAM] and the bus width is 512-bit.

The following is not a very good example, in that only one popular game [the Battlefield series] is listed, and it was done with early drivers and no overclock.

http://www.techspot.com/review/736-amd-radeon-r9-290/page2.html

Take a look at the 2560x1600 results [= about twice as many pixels as 1920x1080]; this will give you a simple idea of how demanding the pixel count is at high res. This may not equal 2x the refresh rate, i.e. 1920x1080 at 60 compared to 120 [my monitor's native refresh rate is only 60, not 120, so I cannot quote you numbers]. I highly doubt doubling the screen size in pixels [resolution] will equal doubling the refresh rate, because the latter is the speed at which the screen redraws at your native resolution, which should not be as demanding as a higher native resolution.
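For the pixel-count comparison above, the ratio works out to just under 2x, which is why 2560x1600 results are a rough stand-in for doubling the rendering load at 1080p:

```python
def pixel_count(width, height):
    # total pixels the GPU must render per frame at a given resolution
    return width * height

ratio = pixel_count(2560, 1600) / pixel_count(1920, 1080)
print(round(ratio, 2))  # ~1.98x the pixels of 1920x1080
```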

You may have to do some searching on the internet to nail this down.

I hope that covered your questions, and that you can read it; it is 1:30 a.m. here atm.

Edit: More information, sorry, it is late here.

Edited by Dan848 - 3/29/14 at 10:24pm