Overclock.net › Forums › Industry News › Rumors and Unconfirmed Articles › [VideoCardz] AMD Radeon RX 490X Specifications

[VideoCardz] AMD Radeon RX 490X Specifications - Page 22

post #211 of 374
Quote:
Originally Posted by prjindigo View Post

HBM2 is twin bank, son. 350(1400) sounds just right... except we know it's ___ and the ____ isn't a Vega card. This "data" is idiocy vomitously spewed by a website that banned me for saying "Videocardz wouldn't know what Journalism was if it was the only question on an open book essay test."

The chart is speculation and lies.

The 490X is most assuredly not gonna be HBM2. With memory compression being used now, there's no need for it. They could put in 8GB of GDDR5X on a 512-bit bus with ~3456 shaders and produce a nice solid $400 card.

What you'll see from AMD that uses HBM2 will be a monster 6000+ shader chip.
A 512-bit bus isn't always an option.
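For what it's worth, the raw-bandwidth arithmetic behind that exchange is easy to sanity-check. A minimal sketch, assuming illustrative per-pin data rates (10 Gb/s for GDDR5X, 1.4 Gb/s for HBM2 per the "350(1400)" figure above); none of these are confirmed 490X specs:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# 512-bit GDDR5X at 10 Gb/s per pin, as proposed above
print(bandwidth_gbs(512, 10))    # 640.0 GB/s

# Two HBM2 stacks (2 x 1024-bit) at 1.4 Gb/s per pin
print(bandwidth_gbs(2048, 1.4))  # 358.4 GB/s
```

Actual sustained rates would be lower; this only shows why a wide GDDR5X bus is a plausible alternative on paper.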
post #212 of 374
Quote:
Originally Posted by slavovid View Post

Why are all those sentences against AMD?
The R9 290X remains relevant 3 years later, and in your opinion it is bad.
The R9 290X matched a newer architecture while being cheaper; it is bad.
Dual AMD GPUs were cheaper and faster than any single-card offering for a long time, and it is bad.
AMD researched HBM and released the first card with this memory, and in your opinion it is bad.
AMD reduced power consumption (compared with GDDR5) while improving bandwidth, getting a smaller form factor, and adding better cooling, and in your opinion it is bad.

You can't help people like that. Don't even bother, they won't look at things objectively and see the pros and cons of either brand.
Quote:
Originally Posted by slavovid View Post

The state of the GPU market is astonishing. I always try to make an educated purchase, yet I picked an Nvidia card because I was tricked into thinking that the game I am playing performs better on Nvidia cards.

Lots of people root for Nvidia; that, plus marketing, tricks other people into thinking they are better.
P.S. It's not about brands or being team red or green. It's about giving people the knowledge to make a better decision investing their money.

+rep for being honest about your experience as a brainwashed consumer!

Wise words in your postscript :)
Edited by Waitng4realGPU - 6/7/16 at 4:12am
post #213 of 374
Quote:
Originally Posted by Defoler View Post

I find the whole thing extremely inaccurate.

Kepler was never meant to be future-proof. No card is. And GCN isn't all that future-proof; it gets updated just like any other architecture. It also has its own limits (namely low DX11 performance across the board). They stuck to one architecture, and it has not gone very well for them over the last four years.

Funny. Anyway, GCN is more future-proof than Kepler and Maxwell, at least.
post #214 of 374
Quote:
Originally Posted by Serios View Post

Funny. Anyway, GCN is more future-proof than Kepler and Maxwell, at least.
Even Pascal, it seems.
post #215 of 374
I remember back when I first started looking into buying a gaming PC, I would request that people provide me builds with the specific request of "NVIDIA only".
Why?
My brand perception of AMD was that of a second-grade company with second-grade components that would last 1/4th of the time.

Until AMD changes that perception, I'm afraid not even a great value product will take the marketshare they're hoping for. They need a good product AND good marketing.
post #216 of 374
Again you are inaccurate, and you even lied, more than once.
Quote:
Originally Posted by PontiacGTX View Post


More than 100% revenue, making a high-end GPU cost the same as midrange GPUs did when they were tagged as such.

You still show a lack of understanding of how revenue even works.
A card costs $200 to make; if you buy it off Amazon for $500, the seller, their supplier, and Amazon are all taking their cuts. They aren't giving you the card as a charity case.
This leaves Nvidia with maybe $300 of income from the card. And that is not 100%.
BTW, Nvidia had $1.25B in revenue for Q4 2015 and an actual profit of under $200M (about 16%). So even if they made 100% like you claim, it means they spend a huge amount to make that money and only got about 16% overall profit from it. And that is not even their bread and butter.
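To make the revenue-vs-profit distinction concrete, here is a minimal sketch using the numbers cited in this post; the $500/$200/$300 per-card split is illustrative, not an actual bill of materials:

```python
def net_margin_pct(revenue_m: float, net_income_m: float) -> float:
    """Net profit margin as a percentage of revenue (both in $M)."""
    return net_income_m / revenue_m * 100.0

# Quarterly figures cited above: $1.25B revenue, ~$200M profit
print(round(net_margin_pct(1250, 200), 1))  # 16.0 (percent)

# Per-card view: $500 retail minus ~$200 of retail/distribution cuts
# leaves the maker ~$300 of income, out of which manufacturing, R&D,
# and overhead are paid before any profit remains.
retail, channel_cuts = 500, 200
maker_income = retail - channel_cuts
print(maker_income)  # 300
```

The point being made: gross income per card is not profit, and company-wide net margin is far below 100% of revenue.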
Quote:
So those cards which don't differ at all from the reference card should cost more from other brands because they are handled by a third party, instead of those brands making an improved PCB/VRM/MOSFET design, which certainly adds costs.
So you are saying that making the PCB, researching, testing, and manufacturing cost Nvidia zero? Again with the numbers issues?
Developing a reference PCB is also time- and money-consuming. And no, I know what you are going to argue: PCBs are not the same every generation. So yes, it does cost.
Quote:
I don't know how much they get from their card; it should be less than the GTX 970, which is a cut-down midrange GPU. The R9 290X went from $550 down to $450 in the 8GB model, to sub-$400 with the R9 390X, and it was still $100-150 cheaper than the GM204 GPU while offering similar or better performance (1440p, 4K, multiple displays, DX12).
Wait a minute! Are you not saying that AMD overpriced the 290X at $550 when they could have sold it for $450 and even thrown in another 4GB of memory?
Holy! So they got $100-150 of extra revenue per card?
Where were you when this monstrosity happened!? Where were you, yelling about how AMD was cheating us and making 1000000% revenue when they could have sold the cards cheaper?
Where were you when they sold us a 290X 8GB by calling it a 390X and charged us more money?
I know. You were standing in front of Nvidia's offices with "we hate you!" signs. Right? Right?
Quote:
The R9 290X (2013) with a slight OC matches Nvidia's new 2014-2015 architecture while getting playable framerates at high resolution, and it gets better multi-engine support than Nvidia cards, which were supposed to improve most of the features from Kepler and still didn't for the new API from Microsoft (DX12).
So you are comparing a "slight OC" to reference? You do remember that Nvidia cards, across the 600, 700, and 900 series, were all quite good overclockers, so a slight OC, which ended up being the ceiling for the 290X, was a low OC for the Nvidia cards, which allowed them to get past the 290X.
Also: CFX issues, huge stuttering, and extremely partial DX12 support, but we should ignore all that because it says DX12, while ignoring the DX11 underperformance completely.
Quote:
The R9 295X2 was 50% lower in price, and later PowerColor decided to reduce the price on their 290X2; it was still cheaper and better than ANY of the Nvidia offerings. How is selling a worse card for more a good thing?
You mean the 295X2, which was pricier than the 980 Ti? So... cheaper?
There were more than a few games the 980 Ti was better at compared to the 295X2, which was pricier. Let alone the Devil 13 based on the 290, or the one based on the 390.
Quote:
What? The R9 290X2, R9 390X2, and R9 295X2 all remain relevant to date, beating Nvidia's best offering, the Titan X, while being cheaper.
Relevant? Not really. Cheaper, because they couldn't sell for $1500 (remember that release price number? No, I'm sure you don't. Because, well...)
Quote:
The Devil 13 290X2 is the only version which improved the PCB, power delivery, and VRM cooling, given that AMD didn't allow modifying the R9 295X2 at all. And it dropped 53% in price when the slower high-end Maxwell GPU was released, while the R9 290X2/295X2 was faster.
I was referring to the Devil 13 390X2, which was actually faster. 290 =/= 390.
Quote:
Many investors hold AMD stock, and AMD brings about new technology which later becomes standard because it is open.
Or like Nvidia GameWorks, which forced AMD to put out GPUOpen? Without Nvidia's tech, AMD would never have thought to go that route.
Or like G-Sync, which forced AMD to convince VESA to support eDP tech (not new tech) and call it their own?
Or AMD taking all of the credit for TressFX, which was actually developed by Crystal Dynamics and is only used by them?
Quote:
If you talk about the R9 290X, it has been their best GPU to date, beating Kepler, matching GM204 in DX11, and beating GM204 in DX12.
And it was bad across DX11. It was only popular because no one had a reason to upgrade to the 390X (the same card, just pricier), so they kept it, and second-hand units from the foolish who did upgrade to the 390X were great for the price. That doesn't mean AMD made great sales on it, just that it was very popular on the second-hand market.
I was actually referring to the Fury X, which came out with a bang and got banged when its DX11 performance was barely above the 980 in some games, let alone compared to the 980 Ti or Titan X.
It was only good in select AMD-sponsored games. For everything else, AMD blamed anything but their architecture (from GameWorks to tessellation to the universe spinning in a certain direction only).
Quote:
it wasn't only showcased; the proof of concept was also taken to market and used effectively to recover a small part of their 5-7 years of R&D investment
the power consumption of the card is merely 275W under load
Are you seriously saying a $3000 card made in very limited numbers (they stated fewer than 100) was meant to recover any R&D investment? 5-7 years? Do you realise they have been profitable in both the consumer and enterprise markets for many years? Those $3000 limited-run cards are barely spare change in their revenue. That would be barely 0.001% of it.
Also, "merely" compared to everything else, or to one card?
Also:

Quote:
it increased SPs by 45% and also improved the bandwidth while keeping the power draw of the R9 290X, with less compute performance
It doesn't matter. They could increase SPs by 100000%.
If they needed that many SPs and that much memory performance and still missed their mark (to remind you, they actually aimed at the Titan X and "settled" for being just below the 980 Ti), what good did it do?
Quote:
why would you care about power consumption on an enthusiast-level card? By then you needed two cards to match proper performance in demanding games at high resolution
I was referring to the fact that it ran way too hot on the core, leaving barely any OC room and requiring liquid cooling.
Quote:
that same memory allowed it to match the 980 at the same power levels in a smaller form factor
But it wasn't aiming for the 980 now, was it? And the huge external cooler? Sorry, that is not a smaller form factor.
The little mini is, and how much did that one cost compared to the 980?
Quote:
Pascal shares Maxwell's pipeline; where are the improvements from R&D in DX12 M.E.?
1080p, and you take that as a reference? :D
Come on, be serious. You might as well run it at 480p.

There you go. That is the right reference, which you tried to hide.
Quote:
Why are all those sentences against AMD?
The R9 290X remains relevant 3 years later, and in your opinion it is bad.
The R9 290X matched a newer architecture while being cheaper; it is bad.
Dual AMD GPUs were cheaper and faster than any single-card offering for a long time, and it is bad.
AMD researched HBM and released the first card with this memory, and in your opinion it is bad.
AMD reduced power consumption (compared with GDDR5) while improving bandwidth, getting a smaller form factor, and adding better cooling, and in your opinion it is bad.


never criticizing Nvidia's bad practices: Nvidia crippling or no longer improving drivers for $1000/$700/$650 GPUs which performed quite similarly to the new architecture's midrange GPUs at release, selling midrange GPUs at +100% of what they used to cost and high-end GPUs for almost 25% more, showing marketing slides with misinformation, and making third-party APIs to cripple whatever isn't their latest GPU architecture.

I never said the 290X was bad.
The Fury X, which was aiming to beat the Titan X in every game, DX11 or DX12, at a cheaper price, failed.
And regardless of the things you wrote, AMD made them as their only way to gain performance, because they couldn't in any other way, while Nvidia could.
And dual AMD cards? Sorry, not relevant, especially with DX12 CFX/SLI issues, overall driver issues, and DX11 underperformance.

I do criticise Nvidia when there is a need to. I don't criticise AMD as much as I criticise you, for criticising Nvidia while ignoring AMD's faults.
I criticise you for referring to Nvidia as the bad guy and AMD as the good guy, like it is a light-versus-darkness issue.
I was referring to the fact that both companies do what they do either to gain market share, because they have to, or because they can. Not because it is good or bad (which is irrelevant).

Nvidia crippling? Like AMD has never, ever, in a million years, done that. Ever. Or I'm sure you have been looking at forums and seeing that Nvidia really never increases performance. Sure.
I'm also sure you have criticised AMD for doing so? And 100%? Stop lying and making numbers up.

You are overall too blinded by your hatred of Nvidia to look at the market itself, or at what is actually going on (hence your links and references to performance at 1080p for a 1440p+ card).

Anyway, I expect you to lie better next time.
Edited by Defoler - 6/7/16 at 5:38am
post #217 of 374
Quote:
Originally Posted by lolerk52 View Post

I remember back when I first started looking into buying a gaming PC, I would request that people provide me builds with the specific request of "NVIDIA only".
Why?
My brand perception of AMD was that of a second-grade company with second-grade components that would last 1/4th of the time.

Until AMD changes that perception, I'm afraid not even a great value product will take the marketshare they're hoping for. They need a good product AND good marketing.

So you went on perception rather than fact?

Components that will last 1/4 of the time?

It's not up to AMD to change such a distorted personal perception, that's on you.
post #218 of 374
Quote:
Originally Posted by Oj010 View Post

A 512-bit bus isn't always an option.

It is to AMD.
post #219 of 374
Quote:
Originally Posted by Waitng4realGPU View Post

So you went on perception rather than fact?

Components that will last 1/4 of the time?

It's not up to AMD to change such a distorted personal perception, that's on you.

If AMD wants to sell, I would say it's up to them to change the distorted perception of future clients. AMD has made rookie mistakes and has paid the price for that.
post #220 of 374
Quote:
Originally Posted by Menta View Post

If AMD wants to sell, I would say it's up to them to change the distorted perception of future clients. AMD has made rookie mistakes and has paid the price for that.

I'm curious to know where you got the perception that AMD uses second-grade components. Where are these threads with people reporting their video cards or CPUs dying on them in droves?

I think it's safe to say your perception is more of an illusion created by other people.