Overclock.net banner

1081 - 1100 of 1115 Posts

·
Like a fox!
Joined
·
2,724 Posts
We are both positive-thinking people, but my personality is dominated by the troubleshooter and problem-solver side, at any cost, or else I do not get paid.
This link might help you understand that regular computer users do suffer when MS delivers imperfect software.

I do agree with you that Microsoft software is very much "imperfect". However, that does not change the fact that if people want to actually use the Microsoft Windows operating system, they must leave .NET installed. I outlined above how to kill the background processes for .NET which affect the performance of games and any other running software.

As for start-up times, that's largely irrelevant because most people are at least on a modern SATA 6 Gbps SSD on computers with 6 Gbps SATA controllers, which complete the entire cold boot to desktop in under 30 seconds (even with .NET installed). NVMe drives, which are even faster than SATA drives, are becoming far more common today. Having .NET installed is irrelevant to system start-up times in 2021, as most people have computers made within the past 5 years.

I would remind you that you are in the extreme minority with your computer. Most people do not still use 14-year-old hardware for their main computer today. I would suggest you think of this when writing your posts. Just because something being installed makes your computer start up slower does not mean that it will affect everyone else.
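The .NET background activity referred to above usually comes from the .NET Runtime Optimization Service (mscorsvw.exe). A hedged command sketch, assuming a default 64-bit Windows install with .NET 4.x: instead of killing the process each boot, you can tell it to finish its native-image compile queue once so it goes idle on its own.

```shell
:: Run from an elevated Command Prompt.
:: mscorsvw.exe (.NET Runtime Optimization Service) burns CPU in the
:: background until its native-image queue is empty; this drains it once.
%windir%\Microsoft.NET\Framework\v4.0.30319\ngen.exe executeQueuedItems
%windir%\Microsoft.NET\Framework64\v4.0.30319\ngen.exe executeQueuedItems
```

After the queue is drained the service stops on its own, so there is nothing left to kill.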
 

·
Registered
Joined
·
630 Posts
I would remind you that you are in the extreme minority with your computer.
I would remind you that you are the minority that accepts hiding MS software problems under the carpet by using a bunch of SSDs and piles of CPU cores.
I support industry, a sector which values productivity and does not follow Microsoft's fashions about what is supported and what is not.
The only reason I refuse to touch Linux is mostly sentimental, after all these 25 years.
I am no longer proud of my MS certification; I feel like a huge idiot when I go to the MS knowledge base to find a solution for Event Viewer errors and the suggested solution is... this error can safely be ignored.
The Linux guys are justified in laughing in our faces, not that their kernel toy is easy to handle if you are not a rocket scientist or a huge nerd at writing code.
This ping-pong game about MS ends here from my end.

Let's focus on the GTX 1060 and its performance, a topic in which I find pleasure.
 

·
Like a fox!
Joined
·
2,724 Posts
I would remind you that you are the minority that accepts hiding MS software problems under the carpet by using a bunch of SSDs and piles of CPU cores.
I support industry, a sector which values productivity and does not follow Microsoft's fashions about what is supported and what is not.
The only reason I refuse to touch Linux is mostly sentimental, after all these 25 years.
I am no longer proud of my MS certification; I feel like a huge idiot when I go to the MS knowledge base to find a solution for Event Viewer errors and the suggested solution is... this error can safely be ignored.
The Linux guys are justified in laughing in our faces, not that their kernel toy is easy to handle if you are not a rocket scientist or a huge nerd at writing code.
This ping-pong game about MS ends here from my end.

Let's focus on the GTX 1060 and its performance, a topic in which I find pleasure.
I do agree with you that I don't like Microsoft sweeping crap under the rug and hiding it in the OS. It's why I actively use Windows 7 as my primary daily-usage OS, even on my new Ryzen X570 system, with Windows 10 as a secondary boot option only for games that require DirectX 12. However, you and I are the minority here. The large population out there (I would say at least 95% of Microsoft Windows users) just install Windows 10, let Microsoft run all its default software, and go about their daily computing lives despite all that monitoring software being there.

The only reason I spoke up is because your suggestion to uninstall .NET is just flat-out wrong. I wanted to make sure others finding this thread know NOT to follow that suggestion. Unfortunately, .NET is required to play games and use our computers for daily usage, outside of testing scenarios, in any version of Windows. I don't like it, but it is what it is.
 

·
Registered
Joined
·
630 Posts
The only reason I spoke up is because your suggestion to uninstall .NET is just flat-out wrong. I wanted to make sure others finding this thread know NOT to follow that suggestion. Unfortunately, .NET is required to play games and use our computers for daily usage, outside of testing scenarios, in any version of Windows. I don't like it, but it is what it is.
Battlefield V did not request a layer higher than .NET 4.5 (yes, there is a DOT in front of NET) ;) .
NVIDIA drivers do not require a layer higher than .NET 4.5.
AMD drivers are happy using up to .NET 2.0.
Only EVGA appears to bother using .NET 4.6.1.
MS Windows updates keep pushing the .NET version up even when there are no specific applications using it.
Using Win 7 Pro is the wisest choice for experts; the minority is also the class that owns expensive cars, so I have no issue with that word. :)
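As an aside, you can check which .NET Framework 4.x layer a machine actually has: Windows records it as a single "Release" DWORD in the registry, and Microsoft's documentation maps the number to a version. A sketch for cmd:

```shell
:: Installed .NET Framework 4.x version, encoded as a "Release" number:
::   378389 = 4.5    379893 = 4.5.2   394802 = 4.6.2
::   461808 = 4.7.2  528040 = 4.8
reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full" /v Release
```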
 

·
Registered
Joined
·
630 Posts
Regarding my MSI GTX 1060 iGAMER OC, I came to some very useful conclusions.
At 1080p resolution there are no restrictive hardware factors such as the top core clock in MHz or the clock the VRAM operates at stock.
And I will now agree with the opinion of the experts benching these cards that a top-grade 6GB GTX 1060 is optimized out of the box for maximum performance delivery at 1080p.

I now feel eager to discover how much additional benefit my system can get from an Intel Core 2 Quad Q9650 (12M cache, 3.00 GHz, 1333 FSB, Socket 775) instead of the Q6600 now clocked gently at 2.5 GHz.
A 25% increase in the lowest frame rate is all that I need.
 

·
Registered
Joined
·
630 Posts
I made a fresh discovery that might prove helpful to others as well. :cool:
By disabling the start-up of the Ansel plug-in (NVIDIA folder tools), my GTX 1060 now delivers four to five FPS more than before in Battlefield V.

While my Dell IPS monitor is capable of 75 Hz, with the Ansel plug-in enabled the max FPS was limited to 60 (BF setting = limit FPS to screen refresh).

After the change, the highest FPS on my screen is now 69 according to the BF5 internal FPS counter, which is always active but visible only through in-game settings.

I would never have expected that an NVIDIA plug-in toy like Ansel, which works in single-player mode only, would be a significant performance thief in online gaming.
Here is my souvenir: the first and last screenshots Ansel will ever take on my PC.
[Ansel screenshots attached]
 

·
Like a fox!
Joined
·
2,724 Posts
I made a fresh discovery that might prove helpful to others as well. :cool:
By disabling the start-up of the Ansel plug-in (NVIDIA folder tools), my GTX 1060 now delivers four to five FPS more than before in Battlefield V.
Could you please tell us exactly what you did to disable Ansel like this? Did you disable it globally for all games? Or just for this one game?
 

·
Registered
Joined
·
630 Posts
Could you please tell us exactly what you did to disable Ansel like this? Did you disable it globally for all games? Or just for this one game?
Yes, it is now disabled for all games.
Ansel has its own control module; there is no shortcut for this executable.
Yesterday I confirmed the FPS increase again by shooting others. :cool:
I am now enjoying 35 FPS minimum and 47 FPS maximum.

The less access other apps have to the NVIDIA driver, the bigger the performance boost.
Today, with a higher level of confidence, I am ready to switch CPUs and get a Q9650 as a counterbalance to the few BF5 maps with a higher-than-average graphics load.
This last optimization step should deliver 40+ FPS minimum and 55+ FPS maximum.
But there is no rush for that last step; I do not feel limited in the game at all. :)
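For anyone who wants to replicate this without hunting through folders: Pascal-era NVIDIA driver packages ship a small command-line tool for toggling Ansel globally. A hedged sketch; the install path can differ between driver versions:

```shell
:: Toggle Ansel globally (path may vary with driver version; run elevated).
"C:\Program Files\NVIDIA Corporation\Ansel\Tools\NvCameraEnable.exe" off
:: and to turn it back on later:
"C:\Program Files\NVIDIA Corporation\Ansel\Tools\NvCameraEnable.exe" on
```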
 

·
Registered
Joined
·
230 Posts
I hold the computer setup in my forum signature with pride, because I am aware that later-released hardware (CPU and MB) was inferior up to 2019.
I would feel like an idiot if I made a premature hardware upgrade by listening to voices recommending newer hardware just because it is newer.
Right now a CPU upgrade for my socket has a trivial cost of 20-30 Euro on the used-CPU market.
No offence mate, but getting a quarter of your GPU's performance is nothing to be proud of. You would need to overclock it to 10 GHz for a Core 2 Quad to be relevant again, and only until games drop support for the old instruction sets that the Core 2 Quad is limited to. A Core 2 Quad was already only letting you use one third of a GTX 1060 in BF4, which just shows how inadequate it was even back then. It can barely avoid bottlenecking a GTX 750 Ti, which is pretty much useless in modern titles.

The best upgrade you can buy for that board is either a Xeon X3380 or a Core 2 Extreme QX9775. And all you get is a marginally higher clock speed, with no hyperthreading or core-count increase. Socket 775 is no longer relevant. Even if you overclock it to 4 GHz, it's not going to be decent anymore. The Core 2 platform is so low-end that even a stock FX 6300 would wallop it, and if you overclock that FX, then there's just no comparison. Even the 200GE APU is faster than a Core 2 Quad.

So, please, don't waste your cash and time on Socket 775 anymore. It will never be good again. If you actually bought this card for a Core 2 Quad, then in your system it hardly performs any better than a GT 1030 GDDR5.

You know, Haswell Xeon rigs on eBay, AliExpress, or Alibaba are cheap and powerful. Otherwise, the i5 10400F is the best-value brand-new upgrade.
 

·
Like a fox!
Joined
·
2,724 Posts
No offence mate, but getting a quarter of your GPU's performance is nothing to be proud of. You would need to overclock it to 10 GHz for a Core 2 Quad to be relevant again, and only until games drop support for the old instruction sets that the Core 2 Quad is limited to. A Core 2 Quad was already only letting you use one third of a GTX 1060 in BF4, which just shows how inadequate it was even back then. It can barely avoid bottlenecking a GTX 750 Ti, which is pretty much useless in modern titles.

The best upgrade you can buy for that board is either a Xeon X3380 or a Core 2 Extreme QX9775. And all you get is a marginally higher clock speed, with no hyperthreading or core-count increase. Socket 775 is no longer relevant. Even if you overclock it to 4 GHz, it's not going to be decent anymore. The Core 2 platform is so low-end that even a stock FX 6300 would wallop it, and if you overclock that FX, then there's just no comparison. Even the 200GE APU is faster than a Core 2 Quad.

So, please, don't waste your cash and time on Socket 775 anymore. It will never be good again. If you actually bought this card for a Core 2 Quad, then in your system it hardly performs any better than a GT 1030 GDDR5.

You know, Haswell Xeon rigs on eBay, AliExpress, or Alibaba are cheap and powerful. Otherwise, the i5 10400F is the best-value brand-new upgrade.
I tried to tell them this, but they don't seem to understand how their old computer is holding back their new video card. I went to great lengths over about 8 replies and they still don't get it. I wouldn't bother trying to explain it to them at this point.
 

·
Registered
Joined
·
571 Posts
You guys ever try NVCleanstall? It's what I've been using to remove all the extra driver crap like Ansel and telemetry.

 

·
Registered
Joined
·
230 Posts
I tried to tell them this, but they don't seem to understand how their old computer is holding back their new video card. I went to great lengths over about 8 replies and they still don't get it. I wouldn't bother trying to explain it to them at this point.
Oh, well. It's kinda crazy how humans perceive bottlenecks. I'm pretty sure that right now people think that no chip holds back an RTX 3090. But it's pretty crazy to see that in quite a few titles a fast CPU is a bottleneck already at 1440p:

And in titles like Flight Simulator, no matter what faster card comes out later, current fast CPUs will be stuck at around 60 fps average. That's not much, and it will only take a few years for game developers to make a game that is even harder to run on the CPU. Soon even an i9 won't be able to maintain 60 fps. Some titles like Hitman are already heavy on the CPU, and an RTX 3090 cannot be fully utilized with a 10900K clocked at 5.2 GHz at 4K maximum settings.

I remember when I wanted a PC in 2014 and knew almost nothing about them, but watched a lot of videos. I was trying to decide whether the FX 6300 or the i3 4130 was a better deal. I remember that there were titles like ARMA 3 and some others in which CPUs couldn't reach 60 fps no matter what card they were paired with. I settled on the FX 6300 and it lasted me until 2020 (so 6 years), when it became a clear, big bottleneck. Something like a 4770K cost 3 times more in 2014 and, while still usable today, it was already bottlenecking GPUs in 2019, and at stock speeds it will sometimes get less than 60 fps (side note: I didn't know that the 9700K was bottlenecking the 2080 Ti sometimes, ouch):

Hardware aging isn't a graceful process, and yes, sometimes more money buys a CPU that lasts longer than another, but still, it won't take very long for a top-tier CPU to become the weak link that cannot pump out 60 fps in the latest titles. I'm not sure I would still want a 4770K in 2022. So spending 3 times more yields you 2-3 additional years of CPU lifespan, which is at best 50% longer, at a 300% increase in price. While not a bad purchase, it certainly wasn't the best-value purchase either.

Considering that today there are no big IPC differences between AMD and Intel like there were in the FX days, there's no really better or worse choice as long as you get enough fast cores now, plus a bit more. This awful aging process is what keeps me from ever considering top-tier hardware. Especially in terms of GPUs; those don't last as long (except some outliers like the Radeon HD 7970 or Radeon RX 480).

Well, that's me, but that kiriakos dude is just choosing to suffer. 30 fps is just a painful experience. Even more so knowing that the GPU could perform several times better than that.
 

·
Like a fox!
Joined
·
2,724 Posts
Oh, well. It's kinda crazy how humans perceive bottlenecks. I'm pretty sure that right now people think that no chip holds back an RTX 3090. But it's pretty crazy to see that in quite a few titles a fast CPU is a bottleneck already at 1440p:
I have a GTX 1080 Ti in my main gaming computer. I had a Ryzen 5 2600 + X370 system (manual CPU OC @ 4150 MHz + RAM @ 3333 with tight timings (12-14-12-29-2T @ 3333)) and I recently upgraded to a Ryzen 7 5800X + X570 system (manual CPU OC @ 4750 MHz + RAM @ 3800, tight timings again (14-16-14-25-1T @ 3800)), and I mainly game at 1080p. My current monitor maxes out at 80 Hz.

All these years since buying the 1080 Ti, I honestly thought my video card was what was holding me back and that I needed to buy a new GPU. Most games I played were running around 40-55 FPS, some 60 but not many, and some games would even run around 20-30 FPS. But I was pleasantly surprised to find almost all of my games (except for 1-2) are suddenly able to hold 80 FPS for 0.1% minimums now, all the time, everywhere. As in, never drops below 80 FPS even once. And if I turn off Vsync, I see most of them can run around 125-170 FPS, all with my 5-year-old video card. Which means I now need to shop for a "faster monitor" soon, but not a new video card.

So yes, the CPU + RAM + motherboard the video card is paired with does make a REALLY BIG difference in system performance, especially for playing games at 1080p or any resolution below 1440p. I wish there was some way to get people to understand this. But some folks just refuse to acknowledge this is how computers work. Probably because they either don't want to spend money on more hardware, or can't and don't have it.
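The effect described above is just frame-time arithmetic: a frame is ready only when both the CPU (simulation, draw calls) and the GPU (rendering) have finished, so the slower of the two sets the FPS. A minimal sketch with hypothetical numbers, not measurements:

```shell
#!/bin/sh
# The slower pipeline stage sets the frame rate: frame_ms = max(cpu_ms, gpu_ms).
# All numbers below are hypothetical illustrations, not measurements.
cpu_ms=25   # slow CPU needs 25 ms per frame   -> 40 FPS ceiling
gpu_ms=8    # fast GPU needs 8 ms per frame    -> 125 FPS ceiling
echo "Before CPU upgrade: $(( 1000 / (cpu_ms > gpu_ms ? cpu_ms : gpu_ms) )) FPS"
cpu_ms=6    # faster CPU: 6 ms per frame, so the GPU becomes the limit
echo "After CPU upgrade:  $(( 1000 / (cpu_ms > gpu_ms ? cpu_ms : gpu_ms) )) FPS"
```

Note the GPU didn't change between the two lines; only the ceiling set by the CPU moved.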

Also, I'll likely be retiring the GTX 1060 3GB card in my second computer soon and replacing it with an older water-cooled (and +53% overclocked) GTX 780. I want to capitalize on the market and try to get around $300 for it if I can before the bubble bursts. I paid about $75 for that card a few years ago.
 

·
Registered
Joined
·
230 Posts
I have a GTX 1080 Ti in my main gaming computer. I had a Ryzen 5 2600 + X370 system (manual CPU OC @ 4150 MHz + RAM @ 3333 with tight timings (12-14-12-29-2T @ 3333)) and I recently upgraded to a Ryzen 7 5800X + X570 system (manual CPU OC @ 4750 MHz + RAM @ 3800, tight timings again (14-16-14-25-1T @ 3800)), and I mainly game at 1080p. My current monitor maxes out at 80 Hz.

All these years since buying the 1080 Ti, I honestly thought my video card was what was holding me back and that I needed to buy a new GPU. Most games I played were running around 40-55 FPS, some 60 but not many, and some games would even run around 20-30 FPS. But I was pleasantly surprised to find almost all of my games (except for 1-2) are suddenly able to hold 80 FPS for 0.1% minimums now, all the time, everywhere. As in, never drops below 80 FPS even once. And if I turn off Vsync, I see most of them can run around 125-170 FPS, all with my 5-year-old video card. Which means I now need to shop for a "faster monitor" soon, but not a new video card.

So yes, the CPU + RAM + motherboard the video card is paired with does make a REALLY BIG difference in system performance, especially for playing games at 1080p or any resolution below 1440p. I wish there was some way to get people to understand this. But some folks just refuse to acknowledge this is how computers work. Probably because they either don't want to spend money on more hardware, or can't and don't have it.
Congrats on the upgrade, but there's something really odd about the Ryzen 2600 performing that poorly. It shouldn't be this bad unless it's MS Flight Sim or AC Odyssey. Take a look at how a Ryzen 2600 and GTX 1080 combo should perform with stock settings and random 3000 MHz memory:

An overclocked Ryzen 2600 especially shouldn't be so bad.
 

·
Like a fox!
Joined
·
2,724 Posts
Congrats on the upgrade, but there's something really odd about the Ryzen 2600 performing that poorly. It shouldn't be this bad unless it's MS Flight Sim or AC Odyssey. Take a look at how a Ryzen 2600 and GTX 1080 combo should perform with stock settings and random 3000 MHz memory:

An overclocked Ryzen 2600 especially shouldn't be so bad.
The issue is the games played. I can see pretty much right away, from every single one of the games shown in that video thumbnail without even clicking on it, that they are only testing "modern AAA games". The only game out of that entire list I wanted to play was The Outer Worlds, and the R5-2600 system did great in it with no issues. Every single other game shown there I do not want to play, nor would I ever have any interest in playing.

I play mainly Indie, Early Access, and otherwise unoptimized and unfinished games, typically with A LOT of mods in them. Most processors are going to look good if you run them in highly polished games released by major studios that have the money and manpower to optimize them properly. But the type of games I play have to be brute-forced with fast hardware. I sometimes play Original/Classic Skyrim with around 200-250 mods loaded, for example. And I enjoy playing American Truck Sim with about 85 mods loaded, to give you an idea. Or Satisfactory with 128 mods loaded. In the type of games I enjoy, the CPU and the system running it is going to be a big factor. Not everyone wants to play "the latest and greatest" games.
 

·
Registered
Joined
·
230 Posts
The issue is the games played. I can see pretty much right away, from every single one of the games shown in that video thumbnail without even clicking on it, that they are only testing "modern AAA games". The only game out of that entire list I wanted to play was The Outer Worlds, and the R5-2600 system did great in it with no issues. Every single other game shown there I do not want to play, nor would I ever have any interest in playing.
lol that's so relatable. I don't play any of them either and don't care to play them probably ever.

I play mainly Indie, Early Access, and otherwise unoptimized and unfinished games, typically with A LOT of mods in them. Most processors are going to look good if you run them in highly polished games released by major studios that have the money and manpower to optimize them properly. But the type of games I play have to be brute-forced with fast hardware. I sometimes play Original/Classic Skyrim with around 200-250 mods loaded, for example. And I enjoy playing American Truck Sim with about 85 mods loaded, to give you an idea. Or Satisfactory with 128 mods loaded. In the type of games I enjoy, the CPU and the system running it is going to be a big factor. Not everyone wants to play "the latest and greatest" games.
Meanwhile me: crying at 5 fps in Victoria 2. That game is extremely demanding on single-thread performance, and no current chip can run it well. At first the game runs really well, but once there are lots of units and countries going apeshit, the fps tanks at maximum game speed. There's nothing I can buy that will run it well; I would probably need an i5 11600K at 20 GHz. And yet, in terms of graphics, it runs on absolute potato specs. Another game I play is Genshin Impact, which runs well on a 10400F and downclocked RX 580 at 1440p. And if you try to play any Japanese games, they are programmed really poorly and have big spikes in CPU usage. My HTPC with an Athlon X4 845 and RX 560 has enough muscle for Sonic Generations to run, but it still stutters at times due to those spikes. Yakuza 0 also ran on that PC at 1080p maximum settings and 60 fps, but in some places the fps consistently tanked into the 30s for no apparent reason.

I wouldn't say that AAA titles are all that polished. We've got to remember how awful some of them were at optimization: Crysis, AC Odyssey, MS Flight Sim, ARMA 3, GTA 4 (nope, it still doesn't run smoothly for me), Yakuza Kiwami 2, TES Oblivion (it still fails to run at 60 fps), Watch Dogs, TES Morrowind, AC 3, Cyberpunk 2077, pretty much every grand strategy or heavy simulation game, Valheim, PUBG, Nier Automata, Sekiro. All of them are big piles of poo poo at utilizing system resources decently, but people just accept that their hardware sucks and move on. Many barely known games have very low system requirements and are decently made. AAA games just get more coverage, that's it.

I'm surprised that the newer Ryzen helped you so much, as there haven't really been big IPC gains since Zen+.
 

·
Like a fox!
Joined
·
2,724 Posts
I wouldn't say that AAA titles are all that polished. We've got to remember how awful some of them were at optimization: Crysis, AC Odyssey, MS Flight Sim, ARMA 3, GTA 4 (nope, it still doesn't run smoothly for me), Yakuza Kiwami 2, TES Oblivion (it still fails to run at 60 fps), Watch Dogs, TES Morrowind, AC 3, Cyberpunk 2077, pretty much every grand strategy or heavy simulation game, Valheim, PUBG, Nier Automata, Sekiro. All of them are big piles of poo poo at utilizing system resources decently, but people just accept that their hardware sucks and move on. Many barely known games have very low system requirements and are decently made. AAA games just get more coverage, that's it.

I'm surprised that the newer Ryzen helped you so much, as there haven't really been big IPC gains since Zen+.
TES Oblivion and Morrowind both ran best with my pair of GTX 470s in SLI, way back years ago, at 1080p with a high-res texture pack, lots of mods, and forced 16xSSAA, and they still ran a smooth 60 FPS. But on newer video cards like the Pascal 1000 series, and even the R9 290X, they both run terribly, around 30-40 FPS. I don't know why, but I've experienced this. I stopped playing them years ago because of that. Those are actually 2 of the games I'm hopeful I might be able to run well again using the GTX 780 in my tuned R5-2600 system.

Watch Dogs (the original) I can run easily at a flat 80 FPS all the time. I found Watch Dogs 1 performance depends on your storage performance. On a mechanical drive it has horrible FPS dips. On a SATA SSD it's better, but there are still a few. On a fast NVMe drive it's a buttery-smooth 80 FPS, always, for me.

Valheim is one of the games that saw a huge boost for me. I'm now easily at 80 FPS minimums everywhere in Valheim, even in my friend's large base area. With Vsync off, I can see Valheim running at 125 FPS minimums and typically in the 180-200 FPS range everywhere in the game, no matter what's going on. Valheim saw a BIG boost from the 5800X. This is at 1080p, though.

Cyberpunk is good, but not amazing for me. My limitation in CP2077 is the video card, not the system. I can run the game on max ultra settings @ 1080p and it has the 1080 Ti at 100% usage constantly, no matter where I am in the game, and usually runs 50-60 FPS. I'll need a faster GPU to do well in that game, even at 1080p.

The rest of those games I don't even know what they are and I probably wouldn't be interested in them.
 

·
Registered
Joined
·
230 Posts
TES Oblivion and Morrowind both ran best with my pair of GTX 470s in SLI, way back years ago, at 1080p with a high-res texture pack, lots of mods, and forced 16xSSAA, and they still ran a smooth 60 FPS. But on newer video cards like the Pascal 1000 series, and even the R9 290X, they both run terribly, around 30-40 FPS. I don't know why, but I've experienced this. I stopped playing them years ago because of that. Those are actually 2 of the games I'm hopeful I might be able to run well again using the GTX 780 in my tuned R5-2600 system.
I have an RX 580 8GB and it runs at 60 fps or more, but there are nasty drops sometimes for no reason. Morrowind is just really badly optimized for any card; it ran poorly even on an FX 5200 128MB and a GTX 650 Ti.


Watch Dogs (the original) I can run easily at a flat 80 FPS all the time. I found Watch Dogs 1 performance depends on your storage performance. On a mechanical drive it has horrible FPS dips. On a SATA SSD it's better, but there are still a few. On a fast NVMe drive it's a buttery-smooth 80 FPS, always, for me.
It was mostly just way too difficult to run on CPUs. My FX 6300 wasn't enough to get consistent frames; my fps was all over the place. I haven't tried it with the i5, as I no longer care about it.

Valheim is one of the games that saw a huge boost for me. I'm now easily at 80 FPS minimums everywhere in Valheim, even in my friend's large base area. With Vsync off, I can see Valheim running at 125 FPS minimums and typically in the 180-200 FPS range everywhere in the game, no matter what's going on. Valheim saw a BIG boost from the 5800X. This is at 1080p, though.
Valheim doesn't look great, and it has absolutely awful performance on a lot of hardware. There's no reason for it other than stupid programming choices.


Cyberpunk is good, but not amazing for me. My limitation in CP2077 is the video card, not the system. I can run the game on max ultra settings @ 1080p and it has the 1080 Ti at 100% usage constantly, no matter where I am in the game, and usually runs 50-60 FPS. I'll need a faster GPU to do well in that game, even at 1080p.
That's the problem. It shouldn't be this hard to run. It just looks like a Deus Ex game with some GTA elements. To be honest, the initial bugginess was atrocious; it's probably the only reason people even care about that game. Other than that, I don't even know why people cared about such an obscure franchise. Nobody cared about Cyberpunk 2020 or Cyberpunk Red. It was a pretty much dead franchise that nobody really knew. They sure paid a lot for marketing to make it relevant in the media.


The rest of those games I don't even know what they are and I probably wouldn't be interested in them.
I didn't write that list of poorly optimized games to get commentary on them. I'm just saying that so many AAA games are very badly optimized despite having big budgets. I have actually watched this long video about Oblivion's development:

Yeah, it wasn't smooth, and it had the typical time crunch that ruins games.
 

·
Registered
Joined
·
630 Posts
You guys ever try NVCleanstall? It's what I've been using to remove all the extra driver crap like Ansel and telemetry.

I removed the telemetry via the Windows command line; after that I switched between several NVIDIA drivers (reinstalls) and it never got installed again.
Due to its age, the GTX 1060 does not support a huge pile of add-ons beyond Ansel and telemetry.
GeForce Experience nowadays does not display the GTX 1060's advanced hardware toolbox, and therefore I do not install it at all.
I have old memories of others mentioning that there was an Auto-OC checkbox in there.

Either way, after playing Battlefield V for 316 hours so far, I do not miss it.
But the GTX 1060 is a super weapon now that it is optimized.
I thought the GTX 1660 Super would be my lowest limit as a useful upgrade, and I was wrong; there is too much misinformation on forums.
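For reference, the command-line telemetry removal alluded to above is commonly done along these lines. A hedged sketch: the NvTelemetryContainer service exists only on some driver versions, and the scheduled-task names vary per install, so check what the query actually reports on your machine.

```shell
:: Run elevated. Stops and removes NVIDIA's telemetry service (only present
:: on some driver versions), then lists its scheduled tasks for disabling.
sc stop NvTelemetryContainer
sc delete NvTelemetryContainer
schtasks /query /fo csv | findstr /i "NvTm"
:: then disable each reported NvTmRep*/NvTmMon* task by its exact name with:
:: schtasks /change /disable /tn "<task name from the query above>"
```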
 

·
Registered
Joined
·
230 Posts
But the GTX 1060 is a super weapon now that it is optimized.
I thought the GTX 1660 Super would be my lowest limit as a useful upgrade, and I was wrong; there is too much misinformation on forums.
Your 1060 performs the same as a 1050:

See the fps? It's the same, because you have a Core 2 Quad. And if you seriously think that a 1660 Super is the lowest worthwhile upgrade from an HD 5770, you are really badly out of touch. Even a GT 1030 is several times faster than the Radeon 5770:

The GT 1030 goes head to head with the 5870, so perhaps the lame GT 1030 DDR4 version is closer to the 5770. And so many people have been saying that even the GTX 1050 was a good budget GPU (I would argue that the RX 560 was a better deal, with better longevity thanks to its 4GB version). Perhaps you read too much Tom's HW; they don't know how to properly test cheaper cards.
 