

crunchor · Banned · 43 Posts · Discussion Starter · #1
I have a Gigabyte HD 5770, and the core clock changes based on the demands of whatever software or game is running. Is there a way to turn this power management feature off, the same way you can disable CPU power management in the BIOS?
 

crunchor · Banned · 43 Posts · Discussion Starter · #2
I am surprised that people here really love their ATI graphics cards, yet nobody seems to know how to disable the power management function.

I found out it is called PowerPlay, and came across this post:
http://forums.amd.com/game/messageview.cfm?catid=260&threadid=100048
It is somewhat outdated, since it was written in 2008, and I doubt it is a perfect solution; it might even cause serious hardware problems, since nobody has proven it works reliably.
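For reference, the trick usually passed around in those old threads boils down to editing Catalyst's Profiles.xml so that all three PowerPlay states request the same clocks. Below is a rough sketch of that idea in Python; the file path, the CoreClockTarget/Want_* names, and the 10 kHz units are assumptions recalled from posts of that era and can differ by driver version, so back up the file before touching it:

```python
# Rough sketch: force every PowerPlay state to the same core clock by editing
# Catalyst's Profiles.xml. The path, element names, and units are assumptions
# and may differ by driver version -- back up the file first.
import os
import xml.etree.ElementTree as ET

profile = os.path.expandvars(r"%LOCALAPPDATA%\ATI\ACE\Profiles.xml")
tree = ET.parse(profile)

for feature in tree.iter("Feature"):
    if feature.get("name", "").startswith("CoreClockTarget"):
        # Want_0/Want_1/Want_2 are the idle/intermediate/load states,
        # in units of 10 kHz (85000 == 850 MHz, the HD 5770 maximum).
        for prop in feature.iter("Property"):
            if prop.get("name", "").startswith("Want_"):
                prop.set("value", "85000")

tree.write(profile)  # restart Catalyst (or reboot) for it to take effect
```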
 

sub50hz · Premium Member · 6,867 Posts
The only reason you should WANT to disable PowerPlay is if it's not working correctly, or if you're trying to overclock with 3rd-party software in Crossfire.
 

kiwiasian · Registered · 4,096 Posts
Quote:
Originally Posted by sub50hz;12003214
The only reason you should WANT to disable PowerPlay is if it's not working correctly, or if you're trying to overclock with 3rd-party software in Crossfire.
Clearly you are not aware of the multi-monitor idle/load clock flickering issue.
 

sub50hz · Premium Member · 6,867 Posts
Quote:
Originally Posted by kiwiasian;12003245
Clearly you are not aware of the multi-monitor idle/load clock flickering issue.
Nope, because it's been fixed since 10.3 for me.
 

crunchor · Banned · 43 Posts · Discussion Starter · #6
Quote:
Originally Posted by sub50hz;12003214
The only reason you should WANT to disable PowerPlay is if it's not working correctly, or if you're trying to overclock with 3rd-party software in Crossfire.
My reason is that when I watch YouTube at 1080p, the fps sometimes drops to around 19-24, not the 25-30 that gives perfectly smooth playback. I can see through CPU-Z and the ATI driver software that my 5770 runs at only about 400 MHz out of its 850 MHz maximum while playing 1080p YouTube, so I figure that forcing it to always run at maximum should help in this case.
 

sub50hz · Premium Member · 6,867 Posts
You should right-click the YouTube video and uncheck the "Enable Hardware Acceleration" setting. This will fix your issue, since it will no longer use the GPU.
 

crunchor · Banned · 43 Posts · Discussion Starter · #8
Quote:
Originally Posted by sub50hz;12003413
You should right-click the YouTube video and uncheck the "Enable Hardware Acceleration" setting. This will fix your issue, since it will no longer use the GPU.
The GPU is supposed to use the card's decoding hardware to play video, so turning that off would not help; it would only make things worse.
 

crunchor · Banned · 43 Posts · Discussion Starter · #10
Quote:
Originally Posted by sub50hz;12003493
Did you try it? It might not be a problem.
It is just stupid to turn hardware acceleration off. Do you even know what it is? Adobe spent so much time developing it so Flash can use hardware decoding directly. You may want to do more research first.
 

rent.a.john · Registered · 273 Posts
Quote:
Originally Posted by crunchor;12003469
The GPU is supposed to use the card's decoding hardware to play video, so turning that off would not help; it would only make things worse.
I had the same issue, since I use two monitors and sometimes watch streams/YouTube while gaming on the other, and I can confirm that unchecking "Enable Hardware Acceleration" solved all my problems.
 

crunchor · Banned · 43 Posts · Discussion Starter · #12
Quote:
Originally Posted by rent.a.john
I had the same issue, since I use two monitors and sometimes watch streams/YouTube while gaming on the other, and I can confirm that unchecking "Enable Hardware Acceleration" solved all my problems.

What happened after you unchecked it?
 

kiwiasian · Registered · 4,096 Posts
Quote:
Originally Posted by crunchor
What happened after you unchecked it?

Seriously, dude, just try it. It isn't going to kill your system.
 

sub50hz · Premium Member · 6,867 Posts
Quote:
Originally Posted by crunchor
It is just stupid to turn hardware acceleration off.

Why? Do you need spare CPU cycles badly?

Quote:
Do you even know what it is?

How does shot web? Yes, I know what it is -- but it's not necessary when the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Quote:
Adobe spent so much time developing it so Flash can use hardware decoding directly.

Spoon-fed, the way the masses like it.

Quote:
You may want to do more research first.

Says the guy asking for help. Google ULPS and have a field day.
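For anyone who does go Googling: ULPS (Ultra Low Power State) is the deep idle state AMD's drivers apply, mainly to the secondary card in a CrossFire setup, and it is controlled by the EnableUlps registry value under the Windows display-adapter class key. A rough sketch of flipping it off, assuming the stock class GUID and that your driver exposes the value; run it from an elevated prompt and reboot afterwards:

```python
# Rough sketch: disable ULPS by setting EnableUlps = 0 on every display-adapter
# subkey that has the value. Requires Administrator rights; reboot afterwards.
import winreg

# Standard Windows display-adapter device class GUID.
CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
    i = 0
    while True:
        try:
            name = winreg.EnumKey(cls, i)  # "0000", "0001", ...
        except OSError:                    # no more subkeys
            break
        i += 1
        try:
            with winreg.OpenKey(cls, name, 0,
                                winreg.KEY_READ | winreg.KEY_SET_VALUE) as sub:
                winreg.QueryValueEx(sub, "EnableUlps")  # skip keys without it
                winreg.SetValueEx(sub, "EnableUlps", 0, winreg.REG_DWORD, 0)
        except OSError:
            pass  # value absent or access denied -- not an AMD adapter key
```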
 

crunchor · Banned · 43 Posts · Discussion Starter · #15
Quote:
Originally Posted by sub50hz
Why? Do you need spare CPU cycles badly?

How does shot web? Yes, I know what it is -- but it's not necessary when the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Spoon-fed, the way the masses like it.

Says the guy asking for help. Google ULPS and have a field day.

I am only running an E6500 overclocked to 3.66 GHz. Before, with onboard graphics, 1080p playback was terrible, so without the GPU I doubt the quality, and I am not at home to try it.
 

crunchor · Banned · 43 Posts · Discussion Starter · #16
Quote:
Originally Posted by sub50hz
Why? Do you need spare CPU cycles badly?

How does shot web? Yes, I know what it is -- but it's not necessary when the CPU can properly decode the video. Unless your CPU is old or insufficient, why not just use it? PowerPlay is a useful technology, and I have offered you a solution.

Spoon-fed, the way the masses like it.

Says the guy asking for help. Google ULPS and have a field day.

Disabling the GPU just means not letting the graphics card do the graphics work and handing it to the CPU instead; that just avoids dealing with the graphics card rather than finding a solution that uses it the way we want.
 

sub50hz · Premium Member · 6,867 Posts
Quote:
Originally Posted by crunchor
Disabling the GPU just means not letting the graphics card do the graphics work and handing it to the CPU instead; that just avoids dealing with the graphics card rather than finding a solution that uses it the way we want.

You're using the CPU to decode the video, not draw it on screen.
 

Riou · Premium Member · 13,796 Posts
Quote:
Originally Posted by sub50hz
You're using the CPU to decode the video, not draw it on screen.

Not necessarily. It depends on the API. Some APIs (e.g. DXVA) allow the CPU to hand part of the decoding work to the GPU.

Quote:
Originally Posted by crunchor
My reason is that when I watch YouTube at 1080p, the fps sometimes drops to around 19-24, not the 25-30 that gives perfectly smooth playback. I can see through CPU-Z and the ATI driver software that my 5770 runs at only about 400 MHz out of its 850 MHz maximum while playing 1080p YouTube, so I figure that forcing it to always run at maximum should help in this case.

You don't want to do this. That is your card running in Unified Video Decoder (UVD) mode, which provides hardware-accelerated playback of H.264 and VC-1 content. It is completely normal for video cards to drop to specific clocks during video playback; Nvidia does the same thing, iirc.

I also have a 5770 (running Cat. 10.12), and it plays back Flash, Blu-rays, and H.264 content fine, with no artifacts/green screens/etc.
 

sub50hz · Premium Member · 6,867 Posts
Quote:
Originally Posted by Riou
Not necessarily. It depends on the API. Some APIs (e.g. DXVA) allow the CPU to hand part of the decoding work to the GPU.

edit: My reply sounded dickish.

Of course there are varying ways of decoding video. I was not implying that ALL video is decoded solely on the CPU, only that it is in this instance, with the fix I suggested.
 

crunchor · Banned · 43 Posts · Discussion Starter · #20
Quote:
Originally Posted by sub50hz
edit: My reply sounded dickish.

Of course there are varying ways of decoding video. I was not implying that ALL video is decoded solely on the CPU, only that it is in this instance, with the fix I suggested.

Should we conclude that Flash's hardware acceleration is just not well implemented yet, so disabling it and playing YouTube videos the old way is the better option for now?
 