Originally Posted by killeraxemannic
I have a buddy that has the 8320, and OBS streaming absolutely annihilates that CPU. He gets bad frame rate drops at anything more than ultrafast and a 720p downscale for his encoding, in basically any game that is even slightly CPU bound. I would advise against AMD for anything involving streaming (or gaming, or encoding).
This is a new one. OBS appears to be using x264 encoding, and an FX8350 in x264 is roughly equivalent to an i5 6600K.
If your friend can't record, he is probably using bad settings, and it also depends on the game. I don't know OBS, but the normal x264 encoder, for instance, assigns 12 threads on an 8-core CPU by default. It's rather obvious that if you have a game that uses 4 cores and OBS is left at a similar default of 12 encoder threads, the CPU must run 16 threads at the same time in total. Which is a bit problematic for an 8-thread CPU, isn't it?
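The arithmetic above can be sketched in a few lines. The "1.5x logical cores" figure is the commonly cited rule for x264's automatic thread count (it matches the 12-threads-on-8-cores number in this thread), but treat the exact formula as an assumption, not a spec:

```python
import math

def x264_default_threads(logical_cores: int) -> int:
    """Approximate x264's automatic thread count (assumed ~1.5x cores)."""
    return math.ceil(logical_cores * 1.5)

def total_thread_demand(game_threads: int, logical_cores: int) -> int:
    """Threads the CPU is asked to run when the game and encoder overlap."""
    return game_threads + x264_default_threads(logical_cores)

# FX-8350: 8 hardware threads. A game using 4 cores plus the encoder's
# 12 default threads asks for 16 runnable threads on 8 hardware threads.
print(x264_default_threads(8))     # 12
print(total_thread_demand(4, 8))   # 16
```

So the box is oversubscribed 2:1 before the OS scheduler even gets a say, which is exactly why the frame rate tanks.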
Just saying, I don't know the OBS settings, but if there's one thing the FX is good at, it's encoding.
EDIT: Here it is from the mouth of a moderator at the OBS forum:
Not that I had much doubt about it... As with most things, the problem is lack of knowledge. Video encoding is a particularly heavy task that brings a CPU to its knees at high resolutions and quality settings. One must know what the settings do.
It took me about a year of avidly reading video experts' posts, as well as encoding myself and comparing results, to arrive at a point where I can say I know what I'm doing. And I still don't know everything. I know what I need to know, and I don't use "exotic" values like some real gurus do. And all that was without running a game at the same time. If someone expects to master the "art" of video encoding just by installing a program and pressing a button, and thinks one setting will fit all games, he is probably overly optimistic. The most mundane point is that you must adjust the settings according to the game. It's rather obvious that if you run a game that uses 8 threads, there aren't many resources left free for encoding, so you must use light settings on the encoder. If a game uses 1 core, you can use heavier settings on the encoder.
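The "match the encoder weight to what the game leaves free" idea can be sketched as a toy heuristic. The preset ladder is x264's real one, but the mapping from free threads to preset is my own illustrative rule of thumb, not an OBS or x264 feature:

```python
# x264's actual speed presets, lightest (cheapest) to heaviest.
X264_PRESETS = ["ultrafast", "superfast", "veryfast", "faster", "fast",
                "medium", "slow", "slower", "veryslow"]

def suggest_preset(logical_cores: int, game_threads: int) -> str:
    """Pick a heavier preset the more threads the game leaves free.

    Assumed rule of thumb: one preset step per free hardware thread,
    capped at the heaviest preset. Purely illustrative.
    """
    free = max(logical_cores - game_threads, 0)
    idx = min(free, len(X264_PRESETS) - 1)
    return X264_PRESETS[idx]

print(suggest_preset(8, 8))  # game saturates the CPU -> "ultrafast"
print(suggest_preset(8, 4))  # 4 free threads       -> "fast"
print(suggest_preset(8, 1))  # CPU mostly idle      -> "slower"
```

The point isn't the specific mapping; it's that the right encoder setting is a function of the game's CPU usage, so no single preset fits every game.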
@ IRUSH: I think you're better off asking at the OBS forum about the specific requirements you mention. Personally, I can't answer, because I've no idea what kind of source bitrate a game corresponds to. But 1080p is a tough resolution. For instance, if you take a Blu-ray source and try to convert it to 1080p with at least medium settings, there is no way you can encode at 60fps. And that's without anything else running at the same time. Now, the question is: what's the "source bitrate" of a game? Because this is crucial. The lower the source bitrate, the higher the fps you get.
Edited by Undervolter - 6/27/16 at 11:25am