
[Softpedia] AMD Bulldozer CPUs Selling Out Surprisingly Fast - Page 17

post #161 of 202
Quote:
Originally Posted by Homeles View Post

I dunno, while I agree 100% that most people won't be able to tell the difference, I personally am extremely sensitive to refresh rates. LEDs for example bug the crap out of me... while I enjoy their brightness and color compared to incandescent bulbs, they have a very noticeable (to me) refresh rate, which from what I've gathered is 60Hz, and I find it rather annoying.
I find myself getting into arguments consistently over how many frames an eye can see in a second... there are so many fools that believe that the human eye can't notice any changes past 30 fps. To me the difference between 30 and 60 fps is incredibly noticeable when motion is involved. Obviously a static 30 fps image and a static 60 fps image would be incredibly difficult to pick apart, but when you're, say, panning the screen around in an FPS or MMO, the choppiness of a 30 fps image is quite blatant.
Obviously there are diminishing returns involved, and I'm not sure that even I would be able to tell the difference between 100 and 120 fps, but there's still a noticeable difference between 60 and 120.

While some fluorescent and LED lights function at less than 60 Hz, incandescent bulbs MUST operate at the frequency of the alternating current (they turn on and off 60 times per second). Either the LED problem is placebo or the LED is functioning at less than 60 Hz. Further, the fact that the incandescent doesn't bother you proves that 60 Hz is okay for you (i.e. you can't tell it's cycling on and off under normal circumstances), which makes you NOT "extremely sensitive". It's like the fact that 90% of people believe that they are 'supertaskers' when less than 1% actually are.

Minimum framerate is what most people notice. A steady 30 FPS handily beats 60 FPS with dropped frames.
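For what it's worth, here's a toy illustration of why the dips dominate what you feel. The frame sequences below are made up purely for illustration, not taken from any benchmark:

```python
# Toy comparison: steady 30 fps vs a higher average rate with dropped frames.
steady = [1000 / 30] * 30                    # ~1 s of even 33.3 ms frames
hitchy = [1000 / 60] * 55 + [50.0] * 2       # ~1 s, mostly 16.7 ms, two 50 ms hitches

for name, frames in (("steady 30 fps", steady), ("60 fps w/ drops", hitchy)):
    avg_fps = 1000 * len(frames) / sum(frames)
    worst_fps = 1000 / max(frames)           # instantaneous rate of the slowest frame
    print(f"{name:16s} avg {avg_fps:5.1f} fps, worst frame {worst_fps:5.1f} fps")
```

The "60 fps" stream averages far higher, but its slowest frame is effectively a 20 fps moment - and that moment is what you notice.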


OT: GF dropped the ball with 32nm and AMD dropped the ball with Bulldozer. If both of them can fix their problems, then we might just have a winner in CMT. Until that time, I will continue to look incredulously at anyone upgrading/sidegrading to a Bulldozer CPU.
post #162 of 202
Quote:
Originally Posted by hajile View Post

While some fluorescent and LED lights function at less than 60 Hz, incandescent bulbs MUST operate at the frequency of the alternating current (they turn on and off 60 times per second). Either the LED problem is placebo or the LED is functioning at less than 60 Hz. Further, the fact that the incandescent doesn't bother you proves that 60 Hz is okay for you (i.e. you can't tell it's cycling on and off under normal circumstances), which makes you NOT "extremely sensitive". It's like the fact that 90% of people believe that they are 'supertaskers' when less than 1% actually are.
Minimum framerate is what most people notice. A steady 30 FPS handily beats 60 FPS with dropped frames.

this
post #163 of 202
Quote:
Originally Posted by hajile View Post

While some fluorescent and LED lights function at less than 60 Hz, incandescent bulbs MUST operate at the frequency of the alternating current (they turn on and off 60 times per second). Either the LED problem is placebo or the LED is functioning at less than 60 Hz. Further, the fact that the incandescent doesn't bother you proves that 60 Hz is okay for you (i.e. you can't tell it's cycling on and off under normal circumstances), which makes you NOT "extremely sensitive".
Incandescent bulbs have a hot filament that glows because of heat. The heat in the filament is generated by the current. The only way for an incandescent bulb to flicker is for the filament temperature to drop a few hundred degrees at the frequency of the alternating current, which obviously doesn't happen.

Apparently some fluorescent lights do flicker, but they do so at twice the frequency of the current (IIRC), which would mean they flicker at 100-120 Hz in most countries. Something to do with the device that regulates the current (the ballast), or along those lines, if I remember right - apparently this doesn't happen with newer fluorescent lights.
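For what it's worth, the frequency doubling comes from the fact that light output follows the instantaneous power rather than the current, and power pulses twice per AC cycle. A quick numeric check, assuming an idealized sinusoidal supply:

```python
import numpy as np

f = 60.0                                   # line frequency (Hz); 50 Hz in most other countries
t = np.linspace(0, 0.1, 100000)            # 100 ms window
power = np.sin(2 * np.pi * f * t) ** 2     # light tracks i(t)^2, i.e. instantaneous power

# Count the brightness peaks in the window: sin^2 peaks once every half cycle.
peaks = np.sum((power[1:-1] > power[:-2]) & (power[1:-1] > power[2:]))
print(peaks / 0.1, "brightness peaks per second")   # ~120 on a 60 Hz supply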
post #164 of 202
People with a moral compass know that any current CPU will meet the needs of 99% of PC users. wink.gif

They also know that multiple convictions for U.S. tax fraud, in addition to countless convictions and pending convictions for violation of antitrust laws, hurt all consumers. Denial has never changed reality.

Trying to rationalize supporting a criminal corporation is futile and disgraceful. You might as well be supporting Bernie Madoff and Ponzi schemes. No ethical person would do that.

http://www.theinquirer.net/inquirer/news/2120866/intel-antitrust-claims-dismissed
Edited by AMD4ME - 12/5/11 at 10:15pm
post #165 of 202
Quote:
Originally Posted by hajile View Post

While some fluorescent and LED lights function at less than 60 Hz, incandescent bulbs MUST operate at the frequency of the alternating current (they turn on and off 60 times per second). Either the LED problem is placebo or the LED is functioning at less than 60 Hz. Further, the fact that the incandescent doesn't bother you proves that 60 Hz is okay for you (i.e. you can't tell it's cycling on and off under normal circumstances), which makes you NOT "extremely sensitive". It's like the fact that 90% of people believe that they are 'supertaskers' when less than 1% actually are.

There's a problem with your claim: an incandescent bulb is still putting off light between refreshes. The metal glows white hot continuously.

Anyways, 60Hz is fine with me, but whatever these LEDs (talking Christmas lights, generally) are refreshing at annoys me.
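That would actually fit with a 60Hz figure: a lot of cheap LED strings are driven more or less straight off the mains with only half-wave rectification and no smoothing (an assumption about the lights in question, not something stated in the thread), so they go completely dark once per cycle. A rough sketch with made-up component values:

```python
import numpy as np

# Toy model of an unsmoothed, half-wave-rectified LED string.
f, Vpeak, Vforward = 60.0, 170.0, 60.0   # mains freq, peak volts, string forward voltage (made up)
t = np.linspace(0, 1 / f, 10000)         # one mains cycle
drive = Vpeak * np.sin(2 * np.pi * f * t)
drive[drive < 0] = 0                     # half-wave rectification: the negative half is dropped

lit = drive > Vforward                   # the string only conducts above its threshold
print(f"lit {lit.mean():.0%} of each cycle, dark {1 - lit.mean():.0%}, "
      f"repeating {f:.0f} times per second")
```

With numbers like these the string is dark for well over half of every cycle, repeating 60 times a second - much easier to notice than an incandescent filament that never stops glowing.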
My System (13 items)
CPU: Phenom 9750 (stock) | Motherboard: MSI MS-7548 (Aspen) | Graphics: HD 6950 @ 971/1387 1.25v | RAM: 8GB DDR2
Hard Drive: 750GB | OS: Windows 7 64-bit | Monitor: ASUS VH238H 1920x1080 | Power: Seasonic X-650 Gold
Case: Rosewill Smart One | Mouse: Razer Naga | Mouse Pad: Razer Scarab
post #166 of 202
I really love how some people are trying to accentuate the fact that you won't feel a difference between something like 76 fps and 114 fps. Sure, because of the refresh rate there's no noticeable difference right now, but think of it like this: in 2-3 years, when games become more taxing on the CPU, the SB chip will still be chugging along higher than the BD chips, and the difference could theoretically be 40 vs 60. That, to me, defines playability in multiplayer games. To put it into a GPU metaphor, take the 6850 vs the GTX 570: although both cards can produce above 60 fps today, in the near future the 6850 won't be able to hold the golden 60 frames, whereas the 570 is more likely to do so. Yes, some people think future-proofing is BS, but looked at this way SB is a much better purchase; some people don't have the money to upgrade so often and only buy parts every 3-5 years or so. Also, if you are running a high-end setup then the 8150/8120 is definitely a no-go - you are fooling yourself if you think there will be no bottleneck. If you are one of those people who change components often and like to have fun overclocking, then BD is not such a bad purchase. But realistically, in the "real world", I see no reason to buy BD if you want the best bang for buck, longevity from your chips, or the high-end stuff.
Edited by Uncivilised - 12/6/11 at 1:19am
Main Rigg (14 items)
CPU: Intel Core i7 920 C0 @ 4GHz | Motherboard: Foxconn Bloodrage GTI | Graphics: [Temporary] Asus 9600gt | RAM: 3x2GB G-Skill Pi Silver DDR3 1600Mhz
Hard Drive: Western Digital Blue 500GB AAKS | Optical Drive: Asus 24x DVD Drive | OS: Windows 7 Ultimate x64 | Monitor: BenQ G2220HD
Keyboard: Microsoft Reclusa | Power: Thermaltake Evo Blue 750w | Case: A cardboard box | Mouse: Logitech G400
Mouse Pad: Steelseries Qck Mini Diablo III Edition | Audio: Senheisser HD518
post #167 of 202
Quote:
Originally Posted by Stefy View Post

Quote:
Originally Posted by Brutuz View Post

No, a 386 performs horribly in today's software. I'm willing to bet that most people on OCN wouldn't notice the difference if I swapped their motherboard and CPU out for a BD one.
And on the other hand, most wouldn't notice the difference between a 2500k and a Q6600 either.
Anyone with a clue will notice the difference, not to mention most people here on OCN do some sort of benching or gaming, and the difference is definitely noticeable in both. Your point is irrelevant anyway because Intel is faster; whether you'll notice the difference or not doesn't really matter. Why pay the same price for less?

Oh, there's a difference in gaming?

Well, the extra core and 1GHz clock speed (as well as IPC gains) I got going from a Core 2 Duo to a Phenom II brought performance increases in a lot of stuff, but games were not one of them.

As for paying the same... compare the motherboard and CPU together: the CPUs are similarly priced, but motherboards are generally more expensive for Intel. Proof? Go on Newegg: a 2500K for $219.99 and a Gigabyte GA-Z68XP-UD3P (similar features to the AMD board I'm comparing it to - I have it on a source who has owned both, and who I bought my UD3 from) for $174.99, versus an FX-8120 for $209.99 and a GA-990FXA-UD3 for $159.99. That's about $395 for Intel and $370 for AMD.
With that price difference you could get more RAM, a slightly better GPU or PSU, or, after the floods subside, another large HDD.
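Quick sanity check on those numbers, using the same Newegg prices quoted above (they obviously move around):

```python
# Platform cost comparison using the prices quoted in the post above.
intel = {"i5-2500K": 219.99, "Gigabyte GA-Z68XP-UD3P": 174.99}
amd = {"FX-8120": 209.99, "Gigabyte GA-990FXA-UD3": 159.99}

intel_total = sum(intel.values())   # 394.98
amd_total = sum(amd.values())       # 369.98
print(f"Intel ${intel_total:.2f} vs AMD ${amd_total:.2f} "
      f"-> AMD is ${intel_total - amd_total:.2f} cheaper")
```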
Quote:
Originally Posted by Stefy View Post

Quote:
Originally Posted by Vagrant Storm View Post

If you leave the FPS meter off and just play a game the only people that will notice a difference are those few super humans out there that truly can feel the difference between 100 and 175 fps. I have to admit I can notice my mouse moving better at more than 60 fps, but any tiny advantage it would give me would be drastically overshadowed when my eye follows a screen tear so it is V-Sync for me.
Of course if you bench you will know one system is faster than the other...that is the point of benching. However gaming and encoding will be hard to distinguish. Both systems will do the job right now just fine and you'd really have to pay attention to detail to notice a difference if you are not actively monitoring performance.
There are plenty of games that require some pretty good hardware to run above 60 FPS, and yes, the difference between SB and BD can be huge in such games. Not to mention BD will bottleneck two high end GPUs.

I'd call that a pretty big difference. thumb.gif
Quote:
In CPU bound environments in Crysis Warhead, the FX-8150 is actually slower than the old Phenom II. Sandy Bridge continues to be far ahead.

All of those but one are above 60 fps. The one that isn't is within pissing distance, and I severely doubt those benchmarks were run with the whole "randomly clocks back to 1.5GHz" problem fixed.
Quote:
Originally Posted by Homeles View Post

Quote:
Originally Posted by Vagrant Storm View Post

And you could have each of these systems set up and 95% of people will not be able to tell a difference between them. Not even the earlier mentioned superhumans will be able to notice 20 fps more if you are over 100 fps already. The Dawn of War II test might have a higher percentage, but I bet that not even 75% of the people testing would guess which system is slower.

I dunno, while I agree 100% that most people won't be able to tell the difference, I personally am extremely sensitive to refresh rates. LEDs for example bug the crap out of me... while I enjoy their brightness and color compared to incandescent bulbs, they have a very noticeable (to me) refresh rate, which from what I've gathered is 60Hz, and I find it rather annoying.

I find myself getting into arguments consistently over how many frames an eye can see in a second... there are so many fools that believe that the human eye can't notice any changes past 30 fps. To me the difference between 30 and 60 fps is incredibly noticeable when motion is involved. Obviously a static 30 fps image and a static 60 fps image would be incredibly difficult to pick apart, but when you're, say, panning the screen around in an FPS or MMO, the choppiness of a 30 fps image is quite blatant.

Obviously there are diminishing returns involved, and I'm not sure that even I would be able to tell the difference between 100 and 120 fps, but there's still a noticeable difference between 60 and 120.

Human eyes can see upwards of 400 fps IIRC, but a 60Hz monitor (what most of us have) can't display more than 60 fps for the simple fact that it updates 60 times a second; if the game is rendering 120 frames a second (i.e. 120 fps) then it's just going to drop every second frame. That's why (nearly) no one needs more than 60 fps.
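A toy model of that point (it assumes a perfectly regular 120 fps render rate and no vsync, so it ignores tearing and frame-time jitter):

```python
# A 60 Hz panel shows whatever frame is newest at each refresh; extra frames are wasted.
refresh_hz, render_fps, duration = 60, 120, 1.0

refresh_times = [i / refresh_hz for i in range(int(duration * refresh_hz))]
frame_times = [i / render_fps for i in range(int(duration * render_fps))]

shown = set()
for t in refresh_times:
    newest = max(i for i, ft in enumerate(frame_times) if ft <= t)  # latest finished frame
    shown.add(newest)

print(f"rendered {len(frame_times)} frames, displayed {len(shown)}, "
      f"dropped {len(frame_times) - len(shown)}")   # rendered 120, displayed 60, dropped 60
```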
Quote:
Originally Posted by Vagrant Storm View Post

Quote:
Originally Posted by Malcolm View Post

But with Intel you change sockets/motherboards like underwear...

Same with AMD... that backwards compatibility is a joke. Sure, you might be able to stick an AM3 CPU in an AM2+ board and have it work, but you are severely gimping the CPU. It's fine for when performance isn't important. I actually have an old AM3 dual core in an AM2+ socket in one of my media machines. However, no one will ever do this for extended periods of time where performance is important. I can see someone doing this while saving up for a new motherboard... though people usually get the motherboard with the new socket first and then save for the new CPU.

And actually I don't think I've heard of an AM3 board that can support an AM3+ CPU yet. I haven't gone looking though either.

You're a fool if you think there is any difference running the main AM2+ CPU (the Phenom II) in an AM2+, AM3 or AM3+ board.
I've used a Phenom II on each platform; no difference. Unless you're trying to imply DDR2 is a limitation for any CPU... which it isn't.
Quote:
Originally Posted by Malcolm View Post

Quote:
Originally Posted by Vagrant Storm View Post

Same with AMD... that backwards compatibility is a joke. Sure, you might be able to stick an AM3 CPU in an AM2+ board and have it work, but you are severely gimping the CPU. It's fine for when performance isn't important. I actually have an old AM3 dual core in an AM2+ socket in one of my media machines. However, no one will ever do this for extended periods of time where performance is important. I can see someone doing this while saving up for a new motherboard... though people usually get the motherboard with the new socket first and then save for the new CPU.
And actually I don't think I've heard of an AM3 board that can support an AM3+ CPU yet. I haven't gone looking though either.

Interesting. thinking.gif The AMD fanclub makes it sound like the backwards compatibility is 100% and everything is awesome; I suppose not, lol. At any rate, my current and last few previous AMD chips are/were laptops, so I've been out of the loop for a while when it comes to upgrade compatibility and whatnot.

It is fairly good; BD not being backwards compatible with AM3 is a bit of a kick in the nuts, but other than that it seems to work really well.
Quote:
Originally Posted by Steak House View Post

Will the AMD Apologist Ever Give Up!

Intel - Has a BETTER OPTION than ANY of the Bulldozer Chips - END OF DISCUSSION.

And I use 8 cores, as well as run Linux so I can recompile most of my applications to use the new instructions with ease.

Oh what's this? Bulldozer seems to be destroying Sandy Bridge, because SB doesn't have the new instructions!

There are scenarios where BD is faster; there are more where SB is faster. Most of them won't be noticeable for anyone on here, as much as they act like it's a night and day difference.
My system (22 items)
CPU: Intel Core i5 3770k @ 4.7Ghz | Motherboard: ASRock Z77 Pro3 | Graphics: Powercolor Radeon HD7950 3GB @ 1150/1350 | RAM: 4x4GB G.Skill Ares 2000Mhz CL9
Hard Drive: Samsung 840 250GB | Hard Drive: Western Digital Black 1TB WD1002FAEX | Hard Drive: Samsung Spinpoint EcoGreen 2TB | Optical Drive: Pioneer DVR-220LBKS
Cooling: Noctua NH-D14 | Cooling: Scythe Gentle Typhoon 1850rpm | Cooling: Corsair AF140 Quiet Edition | OS: Arch Linux x86-64, amdgpu
OS: Windows 10 Bloatfree Edition | Monitor: BenQ G2220HD | Monitor: BenQ G2020HD | Keyboard: Ducky Shine III Year of the Snake, Cherry Blue
Power: Silverstone Strider Plus 600w | Case: Lian Li Lancool PC-K60 | Mouse: SteelSeries Sensei Professional | Mouse Pad: Artisan Hien Mid Japan Black Large
Audio: ASUS Xonar DX | Other: NZXT Sentry Mesh 30w Fan Controller
post #168 of 202
Quote:
Originally Posted by Brutuz View Post

With that price difference you could get more RAM, a slightly better GPU or PSU, or, after the floods subside, another large HDD - and a much worse CPU.
Fixed.
Quote:
Originally Posted by Brutuz View Post

All of those but one are above 60 fps. The one that isn't is within pissing distance, and I severely doubt those benchmarks were run with the whole "randomly clocks back to 1.5GHz" problem fixed.
Human eyes can see upwards of 400 fps IIRC, but a 60Hz monitor (what most of us have) can't display more than 60 fps for the simple fact that it updates 60 times a second; if the game is rendering 120 frames a second (i.e. 120 fps) then it's just going to drop every second frame. That's why (nearly) no one needs more than 60 fps.
Who cares? There are more demanding games out there where FPS will go way below 60; this only proves how far behind BD really is when it can't even keep up with Intel in less demanding games.
Quote:
Originally Posted by Brutuz View Post

And BD is getting better over time, plus the difference is very small.
proof.gif
Quote:
Originally Posted by Brutuz View Post

There are scenarios when BD is faster, there are more where SB is faster. Most of them won't be noticeable for anyone on here as much as they act like its a night and day difference.
They are most definitely noticeable.
Edited by Stefy - 12/6/11 at 4:04am
AiryBox (15 items)
CPU: Intel i7 3770k | Motherboard: Gigabyte GA-Z77X-UD5H | Graphics: Sapphire 7950 Vapor-X | RAM: Corsair Vengeance 8GB
Hard Drive: Hitachi Desktar 7200RPM 1TB | Hard Drive: Samsung 830 128GB | Cooling: Corsair H100i | OS: W10 Pro
Monitor: BenQ 24" | Keyboard: Qpad Mk-50 Cherry MX Red | Power: CM V700 | Case: NZXT H440
post #169 of 202
Quote:
Originally Posted by Homeles View Post

There's a problem with your claim: an incandescent bulb is still putting off light between refreshes. The metal glows white hot continuously.
Anyways, 60Hz is fine with me, but whatever these LEDs (talking Christmas lights, generally) are refreshing at annoys me.

The light emitted between current peaks has a fall-off. If you were to plot this on a graph you would still see changes in lighting levels (a delta between fully lit and partially lit).

In a good LED, the AC is passed through a full-wave rectifier and then through a capacitor. This creates an almost constant light that has a similar fall-off to an incandescent light bulb. In fluorescent bulbs, the big reason for flickering is a partially bad ballast (note: incandescent and fluorescent lights will also display flicker if the voltage is fluctuating, even if it only has small changes).

All that is still secondary to the statement that a constant lower framerate (~30) beats a high framerate with dropped frames. The biggest causes of low minimum frames are poor software optimization and GPUs spending too many milliseconds on certain types of operations.
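A back-of-the-envelope model of why the rectifier-plus-capacitor design looks steady; all component values here are made up for illustration, and a real driver also regulates current:

```python
import numpy as np

f, Vpeak = 60.0, 170.0                     # 60 Hz mains, ~170 V peak (120 V RMS)
R, C = 1000.0, 470e-6                      # hypothetical load and smoothing cap
t = np.linspace(0, 0.05, 5000)             # 50 ms window
rectified = np.abs(Vpeak * np.sin(2 * np.pi * f * t))   # full-wave rectified input

# Crude hold-up model: the cap charges to the input and decays through R between peaks.
smoothed = np.empty_like(rectified)
v = Vpeak                                  # assume the cap is already charged
dt = t[1] - t[0]
for i, vr in enumerate(rectified):
    v = max(vr, v * np.exp(-dt / (R * C)))
    smoothed[i] = v

ripple = (smoothed.max() - smoothed.min()) / smoothed.max()
print(f"raw rectified swing: 100%   smoothed ripple: {ripple:.1%}")   # a couple of percent
```

With a reasonably sized cap the remaining ripple is down in the low single digits, which is why a well-built LED driver looks as steady as a filament, while a bare rectified (or half-wave) string visibly flickers.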
post #170 of 202
So much stuff is unsupported or plain not working because of the technology AMD used this time that nobody can say how it will fare once it is properly supported. And when I see motherboard makers saying they're 100% ready while even Microsoft itself (the best software maker on the planet) has issues with Bulldozer, I just pause at what the motherboard guys say. Let's face it: nobody was, or is, ready for Bulldozer. At best we'll have benchmarks to test those features - does that mean we'll actually get to use them? Hardly. Look at 64-bit, DX11 and HTML5: all great ideas, but they have one thing in common - they have yet to reach mass-market maturity. We've got the gear, but most game and software makers make sure their toys work on consoles first and then adapt them to PC. Why bother getting a PC if a console will do just as well, and often better?
Also remember, guys, that almost everybody is going 1080p at 24 Hz - Blu-ray uses it, big screens use it, and it is a standard. 24 Hz means 24 fps, so past 24 fps everything is useless; the TV does the job of multiplying the 24 Hz signal (so you end up with 120 Hz or 240 Hz). This is for America - I don't know how they go at it elsewhere. I'm set at 24 Hz myself and I get flickering, but it has nothing to do with the 24 Hz itself; it's just a compatibility issue because some stuff in the computer still expects 60 Hz (the old way), and I bet it will be corrected soon.
Why 24 Hz? Very simple: it takes a lot fewer resources - that's what the PS3 does. The first thing to go when you need power is, or should be, the refresh rate. But remember that a lot of LED screens will flicker because the response time isn't good enough or they don't work as they should (to save money).
As for comparing a CPU+GPU against a plain CPU, I've got to say that's ludicrous. Since when is a CPU expected to be a CPU+GPU? It never happened in the past, and I don't see why today would be any different. Compare CPU vs CPU and then we can talk. I will ignore any comparison with the i5 2500K or i7 2700K for the simple reason that those are CPU+GPU chips; if you want to compare CPU+GPU, use the A8.
Edited by drbaltazar - 12/6/11 at 5:56am