
[Various] Just Cause 3 PC GPU & CPU Benchmarks (Third Update) - Page 7

post #61 of 166
Quote:
Originally Posted by BiG StroOnZ View Post

I gave you proof and a link to an article, and you decided to refute it with, as stated previously, arrogant absolute statements.

Based on the information we have, as said previously, there is proof that it is the same engine as Mad Max, because the engine they use for all their titles is the Avalanche Engine:


1. Wiki is the foremost absolute truth in the universe rolleyes.gif
2. Engines do not stay the same for every single game. You have still provided zero proof, only speculation, for the whole thread.
3. You, who think you are 100% right above everything else without even the slightest consideration that you might be wrong, because you have zero facts on your side, are the puffy-chested arrogant guy here.

If the dev comes out and says that the game engine is 100% the same engine and that it doesn't require a patch for multi-GPU, then the nvidia guy is wrong.
Until that time, you have zero proof that he is wrong (wiki is your proof even though it doesn't specify anything? :lol:), so claiming he is 100% wrong without giving him any chance that, I don't know, he might actually be right, is the most arrogant, self-centred thing someone can do.
Quote:
Originally Posted by BiG StroOnZ View Post

It even suggests that a patch will allow multi-GPU support. Which, in the end, is what the argument was based on, and means that the game isn't incompatible with multi-GPU.

You also just said yourself that the game might get a patch to add multi-GPU support. This means that currently, it doesn't. So the nvidia guy was correct, but god forbid you accept that, because god forbid you would be wrong.
Needing a patch to get SLI support = can't support SLI right now = it is not capable of running SLI now.
I never said that a patch will not fix it, and the nvidia spokesman did not suggest otherwise either. If the game at the moment does not support SLI, it doesn't support SLI. No matter how arrogant you try to call everyone else, that is the simple truth.
Edited by Defoler - 12/2/15 at 9:52pm
post #62 of 166
Thread Starter 
Quote:
Originally Posted by Defoler View Post

1. Wiki is the foremost absolute truth in the universe rolleyes.gif
2. Engines do not stay the same for every single game. You have still provided zero proof, only speculation, for the whole thread.
3. You, who think you are 100% right above everything else without even the slightest consideration that you might be wrong, because you have zero facts on your side, are the puffy-chested arrogant guy here.

If the dev comes out and says that the game engine is 100% the same engine and that it doesn't require a patch for multi-GPU, then the nvidia guy is wrong.
Until that time, you have zero proof that he is wrong (wiki is your proof even though it doesn't specify anything? :lol:), so claiming he is 100% wrong without giving him any chance that, I don't know, he might actually be right, is the most arrogant, self-centred thing someone can do.

You also just said yourself that the game might get a patch to add multi-GPU support. This means that currently, it doesn't. So the nvidia guy was correct, but god forbid you accept that, because god forbid you would be wrong.

Well, the problem here is that if you understood what the word incompatible means, you would automatically know that it translates to "not possible." Directly quoting the representative himself, this is what he said. Therefore, if they support it in the future simply by releasing a patch for the game, then that means the representative was incorrect or was using the wrong terminology.

Not sure why you are so angry though; it's pretty simple logic to follow.
post #63 of 166
Quote:
Originally Posted by BiG StroOnZ View Post

Well, the problem here, is if you understood what the word incompatible means, you would automatically know that it translates to not possible. Directly quoting the representative himself, this is what he said. Therefore if they support it in the future, simply by releasing a patch for the game, then that means the representative was incorrect or was using the wrong terminology.

Not sure why you are so angry though, but pretty simple logic to follow.

The wrong terminology? laughingsmiley.gif
By that same "terminology", putting a patch into the game will make that engine not the same engine.
So by changing the game engine, the developer can enable multi-GPU. That means he was right, because that game engine is incompatible with multi-GPU, and a new game engine (a patched one) will allow multi-GPU.
That means we will have engine 1.0 without multi-GPU and engine 1.1 with multi-GPU. They are not the same engine.

There is your answer to throwing "terminology" in the air like it suddenly makes you win biggrin.gif
If they bring a new engine to the table, that one might support multi-GPU. Until then, the current game engine of JC3 is incompatible with and incapable of running multi-GPU.

I find it extremely arrogant that you accuse others of not understanding a word while you don't even understand what the word "patch" means.
Edited by Defoler - 12/2/15 at 10:01pm
post #64 of 166
Thread Starter 
Quote:
Originally Posted by Defoler View Post

The wrong terminology? laughingsmiley.gif
By that same "terminology", putting a patch into the game will make that engine not the same engine.
So by changing the game engine, the developer can enable multi-GPU. That means he was right, because that game engine is incompatible with multi-GPU, and a new game engine (a patched one) will allow multi-GPU.
That means we will have engine 1.0 without multi-GPU and engine 1.1 with multi-GPU. They are not the same engine.

There is your answer to throwing "terminology" in the air like it suddenly makes you win biggrin.gif
If they bring a new engine to the table, that one might support multi-GPU. Until then, the current game engine of JC3 is incompatible with and incapable of running multi-GPU.

No, because if it were incompatible with or incapable of running multi-GPU, then it would never be capable of or compatible with doing so. Not with a patch, not at all. That's the meaning of the word incompatible. So now, when developers release a patch, it suddenly becomes a whole new engine? Please, you know that's a bunch of claptrap.
Quote:
Originally Posted by Defoler View Post

I find it extremely arrogant that you accuse others of not understanding a word while you don't even understand what the word "patch" means.

A patch is a piece of software designed to update a computer program.

A patch is a piece of software code that can be applied after the software program has been installed to correct an issue with that program.

Also called a service patch, a fix to a program. A patch is an actual piece of object code that is inserted into (patched into) an executable program. Typically, a patch is installed into an existing software program.
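
To make that concrete, here is a tiny, purely illustrative Python sketch of a patch as a small in-place edit to a file that is already installed, rather than a full reinstall. The file name, offsets, and bytes are invented for the example and have nothing to do with JC3 itself:
Code:
# Toy illustration only: a "patch" that overwrites a few bytes of an
# already-installed file in place, instead of reinstalling the whole program.

def apply_patch(path, changes):
    """Apply a list of (offset, new_bytes) edits to an existing file in place."""
    with open(path, "r+b") as f:
        for offset, new_bytes in changes:
            f.seek(offset)
            f.write(new_bytes)

if __name__ == "__main__":
    # Create a stand-in for an installed binary, then "patch" part of it.
    with open("installed_program.bin", "wb") as f:
        f.write(b"\x00" * 64)

    apply_patch("installed_program.bin", [(16, b"\xDE\xAD\xBE\xEF")])

    with open("installed_program.bin", "rb") as f:
        data = f.read()
    print(data[12:24].hex())  # patched region surrounded by unchanged bytes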
Edited by BiG StroOnZ - 12/2/15 at 10:26pm
post #65 of 166
I don't have the game, but from the gameplay I saw of it, it seems like everyone is jumping on the hate bandwagon. It seems like one of the best-looking open-world games so far if you also count the huge view distance. So I can accept it having high hardware requirements to run on max.

I only just bought JC2, so that's why I won't buy JC3 anytime soon. But I'll definitely buy it at some point.

About the SLI thing and the engine thing: I think people put too much value on what a game engine does. What the developers do with the engine is always far more important.
I think at some point SLI support will come simply due to pressure from Nvidia/AMD, or maybe they'll just do it completely from the driver side, if that's possible.
Quote:
Originally Posted by BiG StroOnZ View Post

No, because if it were incompatible with or incapable of running multi-GPU, then it would never be capable of or compatible with doing so. Not with a patch, not at all. That's the meaning of the word incompatible. So now, when developers release a patch, it suddenly becomes a whole new engine? Please, you know that's a bunch of claptrap.
A patch is a piece of software designed to update a computer program.

A patch is a piece of software code that can be applied after the software program has been installed to correct an issue with that program.

Also called a service patch, a fix to a program. A patch is an actual piece of object code that is inserted into (patched into) an executable program. Typically, a patch is installed into an existing software program.

Actually, a patch nearly always completely replaces the executable. Executables are usually below 50 MB anyway, so it's not a big deal to have everyone redownload it.
Edited by Sisaroth - 12/3/15 at 12:04am
post #66 of 166
Thread Starter 
Quote:
Originally Posted by Sisaroth View Post

Actually, a patch nearly always completely replaces the executable. Executables are usually below 50 MB anyway, so it's not a big deal to have everyone redownload it.

Then why, when you download a patch through Steam for a game, don't you have to reinstall the entire game? Why is the patch just added to the existing installation? That definition doesn't apply primarily to executables.
Edited by BiG StroOnZ - 12/3/15 at 12:35am
post #67 of 166
Thread Starter 
Quote:
Originally Posted by Liranan View Post

I was being sarcastic.

Well, your sarcasm is fruitless, because in the majority of gaming situations a 2500k will undoubtedly outperform an 8350. So just because a handful of games show the opposite result, are the other 90% of results suddenly incorrect because of a measly 10%, and can we suddenly sing Hallelujah, joyously praising Bulldozer? rolleyes.gif
post #68 of 166
Quote:
Originally Posted by Sisaroth View Post

About the SLI thing and the engine thing: I think people put too much value on what a game engine does. What the developers do with the engine is always far more important.
I think at some point SLI support will come simply due to pressure from Nvidia/AMD, or maybe they'll just do it completely from the driver side, if that's possible.

I am not familiar with their engine, but if they use standard deferred techniques where each frame uses data from the previous one, the usual AFR method that SLI and CFX utilize will simply never work. AFR assigns all even frames to one GPU and all odd frames to the other. If the engine needs to carry data over from frame to frame, AFR can't be used without hacks, because alternating frames land on different GPUs with their own memory, whereas this game probably wants every frame done by the same GPU.
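
A minimal scheduling sketch in Python (purely illustrative, nothing to do with Avalanche's actual code; the 10 ms frame time and two-GPU setup are assumed for the example) shows why such a frame-to-frame dependency defeats AFR: with independent frames the two GPUs overlap, while with dependent frames the work serializes and the second GPU adds nothing:
Code:
# Toy model of alternate frame rendering (AFR): even frames go to GPU 0,
# odd frames to GPU 1, each frame takes 10 ms of GPU time.

FRAME_TIME_MS = 10

def simulate(num_frames, inter_frame_dependency):
    gpu_free_at = [0, 0]      # time at which each GPU finishes its last frame
    frame_done_at = []        # completion time of each frame
    for frame in range(num_frames):
        gpu = frame % 2       # AFR: alternate frames between the two GPUs
        start = gpu_free_at[gpu]
        if inter_frame_dependency and frame > 0:
            # Frame N cannot start before frame N-1's result exists.
            start = max(start, frame_done_at[frame - 1])
        done = start + FRAME_TIME_MS
        gpu_free_at[gpu] = done
        frame_done_at.append(done)
    return frame_done_at[-1]

if __name__ == "__main__":
    frames = 100
    print("independent frames:", simulate(frames, False), "ms")  # ~500 ms, GPUs overlap
    print("dependent frames:  ", simulate(frames, True), "ms")   # ~1000 ms, fully serialized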
post #69 of 166
Quote:
Originally Posted by BiG StroOnZ View Post

Well your sarcasm is fruitless, because in the majority of gaming situations a 2500k will undoubtedly outperform an 8350. So just because a handful of games shows the opposite result, suddenly the 90% of the other results are incorrect because of a measly 10%, so we can suddenly sing Hallelujah joyously praising Bulldozer? rolleyes.gif

As I said earlier in the thread, they are at a pretty similar level; the 2500k is slightly better, but not by much. If you had two computers with the same GPUs and SSDs, one with an FX-8350 and one with a 2500k, it would be difficult to tell them apart by looking at gaming performance.

Bulldozer is better if you are doing more stuff at the same time. For example, I stream a football match on one monitor and play Dark Souls on the other. With the 2500k the stream actually stutters a bit, but when I do the same thing with Bulldozer it does not.

I also have a 3770k in my main computer that is better than both by some margin.
post #70 of 166
Thread Starter 
Quote:
Originally Posted by daviejams View Post

As I said earlier in the thread, they are at a pretty similar level; the 2500k is slightly better, but not by much. If you had two computers with the same GPUs and SSDs, one with an FX-8350 and one with a 2500k, it would be difficult to tell them apart by looking at gaming performance.

Bulldozer is better if you are doing more stuff at the same time. For example, I stream a football match on one monitor and play Dark Souls on the other. With the 2500k the stream actually stutters a bit, but when I do the same thing with Bulldozer it does not.

I also have a 3770k in my main computer that is better than both by some margin.

Do I really have to post benchmarks?

Nobody said anything about streaming. In pure gaming, the 2500k wins 9 times out of 10. An 8350 will bottleneck any high-end graphics card. Basically, if you put a card above a 280X in a system running an 8320 or 8350, you are going to lose performance.

I would disagree; there have been many threads all across the web on various tech sites from people who made the switch from a Bulldozer chip to even an i5 and saw noticeable gaming improvements. Most sum it up as a "night and day difference."

There is a very small minority of people who say they saw no difference or that it is unnoticeable, but they are very vocal in comparison.