Overclock.net › Forums › Graphics Cards › AMD/ATI › how to turn off ATI graphics card power management?

how to turn off ATI graphics card power management? - Page 2

post #11 of 23
Quote:
Originally Posted by crunchor;12003469 
The GPU is supposed to use the card's decoding ability to play video, so turning it off would not help; it would just make things worse.

I had the same issue since I use 2 monitors and sometimes watch streams/YouTube on one while gaming on the other, and I can confirm that un-checking "enable hardware acceleration" solved all my problems.
Katy Perry (14 items)
CPU: i7 930 @ 4.02 | Motherboard: Rampage III Extreme | Graphics: EVGA GTX 690 Hydro Copper Signature | RAM: 12GB 2000 MHz Patriot DDR3
Hard Drive: 256GB M4 | 2TB Seagate | Cooling: XSPC Rasa | OS: Windows 7 64-bit Professional | Monitor: Alienware OptX 2310
Power: Corsair AX 850 | Case: Corsair 800D
post #12 of 23
Thread Starter 
Quote:
Originally Posted by rent.a.john View Post
I had the same issue since I use 2 monitors and sometimes watch streams/YouTube on one while gaming on the other, and I can confirm that un-checking "enable hardware acceleration" solved all my problems.
What happened after you un-checked it?
post #13 of 23
Quote:
Originally Posted by crunchor View Post
What happened after you un-checked it?
Seriously dude, just try it. It isn't going to kill your system.
post #14 of 23
Quote:
Originally Posted by crunchor View Post
It is just stupid to turn the hardware acceleration function off.
Why? Do you need spare CPU cycles badly?

Quote:
Do you even know what that is?
How does shot web? Yes, I know "what that is" -- but it's not necessary to use if the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Quote:
Adobe spent so much time developing it so Flash can use hardware decoding directly.
Spoon-fed, the way the masses like it.

Quote:
You may want to do more research first.
Says the guy asking for help. Google ULPS and have a field day.
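For anyone who does go Google ULPS: on AMD/CrossFire setups it is controlled by the `EnableUlps` value under the display adapter's class key in the registry. A sketch of the tweak (illustrative only -- the `0000` subkey index varies per system, so locate the subkey that actually contains `EnableUlps` on your machine before applying anything):

```
Windows Registry Editor Version 5.00

; Illustrative sketch: the adapter subkey (0000, 0001, ...) differs
; per system. 0 disables ULPS; delete the value to restore defaults.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```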
post #15 of 23
Thread Starter 
Quote:
Originally Posted by sub50hz View Post
Why? Do you need spare CPU cycles badly?

How does shot web? Yes, I know "what that is" -- but it's not necessary to use if the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Spoon-fed, the way the masses like it.

Says the guy asking for help. Google ULPS and have a field day.
I am just using an E6500 OC'd to 3.66 GHz. Before, with onboard video, 1080p fps was terrible, so without using the GPU I doubt the quality would be good, and I am not at home to try it.
post #16 of 23
Thread Starter 
Quote:
Originally Posted by sub50hz View Post
Why? Do you need spare CPU cycles badly?

How does shot web? Yes, I know "what that is" -- but it's not necessary to use if the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Spoon-fed, the way the masses like it.

Says the guy asking for help. Google ULPS and have a field day.
Disabling GPU acceleration just means not using the video card for the work and letting the CPU do it instead. That is avoiding the graphics card rather than finding a solution that lets us use it the way we want.
post #17 of 23
Quote:
Originally Posted by crunchor View Post
Disabling GPU acceleration just means not using the video card for the work and letting the CPU do it instead. That is avoiding the graphics card rather than finding a solution that lets us use it the way we want.
You're using the CPU to decode the video, not draw it on screen.
post #18 of 23
Quote:
Originally Posted by sub50hz View Post
You're using the CPU to decode the video, not draw it on screen.
Not necessarily. It depends on the API. Some APIs (e.g. DXVA) allow the CPU to pass some of the work to the GPU to decode.
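To make the distinction concrete, here is a toy model of where decode stages can run. The stage names and the exact CPU/GPU split are illustrative assumptions, not the real driver pipeline (DXVA negotiates this per codec with the driver); the point is only that with acceleration off, the GPU still presents the frame while the CPU does the decoding.

```python
# Toy model of which processor handles each video-decode stage.
# Illustrative only: real DXVA pipelines negotiate the split per codec.

STAGES = ["bitstream_parse", "idct", "motion_comp", "deblock", "present"]

def assign_stages(hw_accel: bool) -> dict:
    """Map each decode stage to 'cpu' or 'gpu'.

    With acceleration off, only presentation (drawing the frame on
    screen) stays on the GPU; the CPU does all the decoding work.
    With DXVA-style acceleration, the CPU parses the bitstream and
    hands the heavy math (iDCT, motion compensation) to the GPU.
    """
    if hw_accel:
        return {"bitstream_parse": "cpu", "idct": "gpu",
                "motion_comp": "gpu", "deblock": "gpu", "present": "gpu"}
    return {s: ("gpu" if s == "present" else "cpu") for s in STAGES}

# Even with acceleration disabled, the GPU still draws the frame:
print(assign_stages(False)["present"])  # -> gpu
```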

Quote:
Originally Posted by crunchor View Post
My reason is that when I watch YouTube 1080p, the fps sometimes drops to around 19 to 24 instead of 25 to 30, and only 25 to 30 gives perfectly smooth playback. I can see through CPU-Z or the ATI driver program that my 5770 is only running at around 400 MHz out of a maximum of 850 MHz while watching YouTube 1080p, so I think it should be better in this case if I can set it to always run at maximum.
You don't want to do this. This is your card running in Unified Video Decoder (UVD) mode. This mode allows hardware-accelerated playback of H.264 and VC-1 content. It is completely normal for video cards to drop to specific clocks when playing back video content. Nvidia does the same thing, iirc.

I also have a 5770 (running Cat. 10.12) and it plays back Flash, Blu-rays, and H.264 content well with no artifacts/green screens/etc.
Edited by Riou - 1/13/11 at 8:48pm
post #19 of 23
Quote:
Originally Posted by Riou View Post
Not necessarily. It depends on the API. Some APIs (e.g. DXVA) allow the CPU to pass some of the work to the GPU to decode.

edit: My reply sounded dickish.

Of course there are varying ways of decoding video; I was not implying that ALL video will solely use the CPU to decode, only that it will in this instance, using the provided fix.
Edited by sub50hz - 1/13/11 at 8:56pm
post #20 of 23
Thread Starter 
Quote:
Originally Posted by sub50hz View Post
edit: My reply sounded dickish.

Of course there are varying ways of decoding video; I was not implying that ALL video will solely use the CPU to decode, only that it will in this instance, using the provided fix.
Should we conclude that Flash's hardware acceleration is not well implemented yet, so disabling it and playing YouTube Flash video the old way would work better?