
·
Registered
Joined
·
952 Posts
I think I have to teach NVIDIA how to write software. I just discovered another toy, the dedicated NV GPU usage meter :D
They did not leave room to record more than 90 seconds.
They did not leave enough space for the text of my terrific MSI medals taker, either.

Less than 40% GPU usage and I do rock the house !! ;)

 

·
Facepalm
Joined
·
10,048 Posts
TES Oblivion and Morrowind both ran best with my pair of GTX 470s in SLI, way back when, years ago, at 1080p with a high-res texture pack, lots of mods, and forced 16xSSAA, and they still ran a smooth 60 FPS. But on newer video cards like the Pascal 1000 series, and even the R9 290X, they both run terribly, around 30-40 FPS. I don't know why, but I've experienced this. I stopped playing them years ago because of that. Those are actually two of the games I'm hoping I'll be able to run well again using the GTX 780 in my tuned R5-2600 system.

WatchDogs (the original) I can run easily at flat 80 FPS all the time. I found WatchDogs 1 performance depends on your storage performance. On a mechanical drive it has horrible FPS dips. On a SATA SSD it's better but still a few. On a fast NVME drive it's buttery-smooth 80 FPS always for me.

Valheim is one of the games that saw a huge boost for me. I'm now easily at 80 FPS minimums everywhere in Valheim even in my friend's large base area. With Vsync off I can see Valheim running at 125 minimums and typically in the 180-200 FPS range everywhere in the game no matter what's going on. Valheim saw a BIG boost from the 5800X. This is at 1080p though.

Cyberpunk is good, but not amazing for me. My limitation with CP-2077 is the video card, not the system. I can run the game on max ultra settings @ 1080p and it has the 1080 Ti at 100% usage constantly no matter where I am in the game and usually runs 50-60 FPS. I'll need a faster GPU to do well in that game, even at 1080p.

The rest of those games I don't even know what they are and I probably wouldn't be interested in them.
 

·
Like a fox!
Joined
·
2,762 Posts
I finished Morrowind on an R9 290X, and it was running with a high view distance mod, so I was way above 60 FPS. Without the LOD mod, it ran too fast (like a constant 120 FPS).
So you shouldn't have had any of these problems on your 290X. Was your card not upclocking? You can use ClockBlocker to force the card to run at full speed (I forget if that's what it was called, I'm on a 3090 now, but it used some sort of OpenCL thing to stop downclocking).

I still remember when MW first came out. I think I tried playing it on a GeForce 4, and I remember how hard it was to pass 30 FPS at max detail settings back then.
It was probably some of the mods I was running loading the system hard. But what I stated above is true: AMD & Nvidia both "optimized away" from DirectX 9 in their video cards and drivers as time went on. That is, newer cards are slower at DX9 than older cards were. In fact, my GTX 780 is the card that has given me the highest 3DMark Sky Diver & 3DMark 2006 scores (DirectX 9) out of all the hardware I've ever owned (and I've owned a lot of hardware), including vs my 1080 Ti, R9 290X, and 1060 card(s). In fact, the old GTX 470 pair I pulled out of storage for a record 3DMark run when I bought my 5800X was even faster than the GTX 780 with both of them together.
 

·
Registered
Joined
·
952 Posts
When someone wastes five hours of precious personal time reading the user comments written below several NVIDIA driver release pages (download links) for Pascal, he becomes a receiver of first-hand information.
The entire pile of complaints relates to Microsoft patches and the newest DirectX on Win10.
I installed the worst NVIDIA driver, the one most people were having issues with, on my Win7 box, and there was no negative impact.

A second interesting story is that INTEL did everything possible for SLI hardware support and performance, so as to assist NVIDIA marketing in favor of their common profitability.
Now suddenly NVIDIA is changing plans to support only single-card solutions due to their stupid drivers, and a billion or more SLI boards will soon form a HUGE pile of e-waste.

This is my own whom-to-blame list:
1) Microsoft at 100%
2) NVIDIA at 80%
3) INTEL at 25%
4) AMD, which is now a 100% Chinese-owned company with no moral values in regard to PC hardware preservation; therefore AMD alone is now building up its own HUGE pile of e-waste.

Preservation of healthy, well-performing hardware over time is now in consumers' hands, and depends on the degree of intelligence of each individual.
 

·
Registered
Joined
·
952 Posts
Today is my birthday and I just ordered my gift from the EU :)
Intel Core 2 Quad Q9650 3.00GHz
  • The CPU comes with a higher operating frequency than the Intel Q6600.
  • The processor incorporates SSE4.1 instructions.
  • The microprocessor is 10% more energy efficient.
Q6600: 2.40 GHz - 8 MB L2 cache - 1066 MHz bus - 105 W TDP - 582 million processing die transistors

Q9650: 3.00 GHz - 12 MB L2 cache - 1333 MHz bus - 95 W TDP - 820 million processing die transistors

When this chip is delivered, I promise to return and leave reliable feedback about it.
I have repeatedly written that YouTube videos have zero value to me as evidence of hardware performance, since no one has the patience or knowledge to optimize a PC so that actual performance can be proven on footage.
Another thing that bothers me with YouTube videos is the untraceable ownership of video channels: several kids, especially from the third world, just perform video editing on every subject, titles and descriptions included, because they do not own the hardware and have no money to buy it.
Still they expect to get paid from clicks made by people unable to detect this scam technique.


 

·
Stock *ahem*
Joined
·
6,417 Posts
Hi all!

Surprised I never posted in here way back when I first got the card (saga here), but as I have just recently upgraded cards and plan to put the 1060 in a secondary system, I thought I would post some benchmark results :)

TimeSpy_1060_1.png

Heaven_1060_2560x1440Custom.png

I'm going to be officially joining the 2060 club soon, so more results there :p
 

·
Like a fox!
Joined
·
2,762 Posts
When this chip is delivered, I promise to return and leave reliable feedback about it.
I have repeatedly written that YouTube videos have zero value to me as evidence of hardware performance, since no one has the patience or knowledge to optimize a PC so that actual performance can be proven on footage.
The optimization thing may be true but if someone is using your same video card on a newer system and it shows a performance difference of +150% to +200% then it should be pretty obvious that newer systems would be faster with your card. Optimization may help a little bit but there is no amount of optimization you could do with your older system (or even overclocking) that would make up for a 150% - 200% difference in performance.
 

·
Registered
Joined
·
952 Posts
The optimization thing may be true but if someone is using your same video card on a newer system and it shows a performance difference of +150% to +200% then it should be pretty obvious that newer systems would be faster with your card. Optimization may help a little bit but there is no amount of optimization you could do with your older system (or even overclocking) that would make up for a 150% - 200% difference in performance.
Partially correct statement.
I thought I would test my DDR3 at its rated 1333 FSB before the new CPU arrives.
For the past two days my Q6600 has been overclocked to 3 GHz and I do not see any difference in gaming FPS.
Now I have faith that the processing die transistor counts, 820 million (Q9650 3.00GHz) vs 582 million (Q6600 3.00GHz), will make the difference.

A plain OC does not qualify as a counterbalance when your chip is short by several million transistors.
And this is old news for me; I retype it as a fact and a reminder for others to read.

Most people are totally unaware that generating a single FPS requires a portion of millions of transistors working for it.

Using classic math, I am aware that I am enjoying 35 FPS thanks to 582 million transistors = 16.6 million per FPS.
With 820 million transistors (CPU), classic math verifies that I will enjoy 49.39 FPS instead.

If you ask me whether I trust math or people's voices, I will vote Math.
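
(For reference, here is the same back-of-the-envelope arithmetic as a quick Python sketch. The linear transistors-per-FPS scaling is the assumption stated above, not a general rule, and the variable names are only illustrative; the 49.39 figure comes from rounding to 16.6 million per FPS.)

Code:
# Back-of-the-envelope check of the transistors-per-FPS arithmetic above.
# Assumption (from the post, not a general rule): gaming FPS scales linearly
# with the CPU's transistor count, everything else staying equal.
Q6600_TRANSISTORS = 582e6    # Q6600 processing die transistors
Q9650_TRANSISTORS = 820e6    # Q9650 processing die transistors
MEASURED_FPS_Q6600 = 35      # FPS observed on the Q6600 system

transistors_per_fps = Q6600_TRANSISTORS / MEASURED_FPS_Q6600
predicted_fps_q9650 = Q9650_TRANSISTORS / transistors_per_fps

print(f"{transistors_per_fps / 1e6:.1f} million transistors per FPS")  # ~16.6
print(f"predicted Q9650 FPS: {predicted_fps_q9650:.2f}")               # ~49.31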
 

·
Like a fox!
Joined
·
2,762 Posts
Now I have faith that the processing die transistor counts, 820 million (Q9650 3.00GHz) vs 582 million (Q6600 3.00GHz), will make the difference.
I hadn't thought for a while about just how big the performance gap has grown through the years. Supposedly the Ryzen 5800X processor in my primary gaming computer has 12.6 billion transistors in it.

Anyway, all of your discussion of LGA-775 system performance is a bit off topic for this thread. I know you're using a GTX 1060, but you're probably the only person in this entire thread using such a system with their 1060. So I think you might get a better response posting over in the LGA-775 thread instead. If that is the processor you intend to use, then they can help you get the most out of it, including optimizing and overclocking. Here's a link to that thread for you: -=LGA775 Club=-
 

·
Registered
Joined
·
952 Posts
Hi all!

Surprised I never posted in here way back when I first got the card (saga here)
You should be thankful to Amazon; their mistake was a gift of the 6GB version of the GTX 1060 to you instead of the 3GB one. :)
 

·
Registered
Joined
·
952 Posts
I hadn't thought for a while about just how big the performance gap has grown through the years. Supposedly the Ryzen 5800X processor in my primary gaming computer has 12.6 billion transistors in it.

Anyway, all of your discussion of LGA-775 system performance is a bit off topic for this thread. I know you're using a GTX 1060, but you're probably the only person in this entire thread using such a system with their 1060. So I think you might get a better response posting over in the LGA-775 thread instead. If that is the processor you intend to use, then they can help you get the most out of it, including optimizing and overclocking. Here's a link to that thread for you: -=LGA775 Club=-
It's best for you to admit that you are not qualified to offer me further advice.
As for me, I will simply ignore any further messages coming from your end; it's for the best, for keeping the peace.

I do not need 200 FPS when my lovely IPS monitor supports 60Hz max at 1080p.
 

·
Stock *ahem*
Joined
·
6,417 Posts
You should be thankful to Amazon; their mistake was a gift of the 6GB version of the GTX 1060 to you instead of the 3GB one. :)
Well, in a very convoluted way, yes :p

For the past two days my Q6600 has been overclocked to 3 GHz and I do not see any difference in gaming FPS.
Empirically, what this shows is that the CPU is the "performance bottleneck".

I used to not believe they existed, but way back in around 2007-9, somewhere in there, I bought basically an SLI testbed setup - Athlon X2 5200+, Asus M2N something or other, 4 or 8 GB DDR2-800 as I recall, and two 8800GTs. Ran the benches, was happy.

Decided to beef it up with two 9800GTX+'s. Even after a clean install of Windows, no change in the benchmark results.

As soon as I got an Athlon X2 250, those 9800GTX+'s in SLI showed a noticeable improvement.

So what this shows is that the CPU can hold back a GPU or a pair of GPUs.
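
(A minimal way to picture that, as a rough sketch: treat each frame as needing both CPU preparation time and GPU render time, with whichever is slower setting the pace. The function name and the per-frame millisecond figures below are made up for illustration, not measurements.)

Code:
# Toy model of a CPU/GPU bottleneck: a frame has to be prepared by the CPU
# and rendered by the GPU, so the achievable frame rate is capped by the
# slower of the two. All numbers are illustrative, not measurements.
def effective_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate when the slower component sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Slow CPU with an ever faster GPU (or GPU pair): the benchmarks don't move.
print(effective_fps(cpu_ms_per_frame=20.0, gpu_ms_per_frame=12.0))  # 50.0 FPS
print(effective_fps(cpu_ms_per_frame=20.0, gpu_ms_per_frame=8.0))   # still 50.0 FPS

# Faster CPU: now the stronger GPUs finally show up in the results.
print(effective_fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=8.0))   # 100.0 FPS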
 

·
Like a fox!
Joined
·
2,762 Posts
It's best for you to admit that you are not qualified to offer me further advice.
As for me, I will simply ignore any further messages coming from your end; it's for the best, for keeping the peace.

I do not need 200 FPS when my lovely IPS monitor supports 60Hz max at 1080p.
I'm just trying to help. If that is the processor you intend to use then the people in the LGA-775 thread can help you get the most out of your system with that card. Maybe you could get a little better performance out of it if you spoke with people that actually understand the platform you're using.
 

·
Registered
Joined
·
952 Posts
As soon as I got an Athlon X2 250, those 9800GTX+'s in SLI showed a noticeable improvement.

So what this shows is that the CPU can hold back a GPU or a pair of GPUs.
It's all about expectations and the game you are interested in playing.
Even fresher systems have issues with specific games due to the sloppy work of software developers.
If I were an e-sports athlete, I would care about minimizing every bottleneck, so as to get the prize that translates into HOT cash.

Benchmarks are only a partial indication; in real gaming you can set to Low the filters that serve non-essential aesthetics, and keep your CPU & GPU busy computing exclusively the essential workload.
BF5, in order to look appealing to the eye, uses piles of 3D graphics and demands a system of cinema 3D workstation class to be able to deliver.
I prefer to own more essential things than a cinema 3D workstation class PC.
 

·
Registered
Joined
·
952 Posts
I'm just trying to help. If that is the processor you intend to use then the people in the LGA-775 thread can help you get the most out of your system with that card. Maybe you could get a little better performance out of it if you spoke with people that actually understand the platform you're using.
You are welcome. Since yesterday I am a member of the G.SKILL forum, as I plan to push my Q9650 higher than 3 GHz.
Their forum admin is Canadian and extremely expert.
OCN is now empty of elder experts, and it cannot deliver much help to me.
 

·
Registered
Joined
·
952 Posts
Here is another aha moment (a fresh experience of mine): my VGA now also has to deal with the Greek summer temperature range. :)
Today is the first of the hot days in central Greece: 30°C outdoors and 26.2°C indoors.
I thought it a good opportunity to test the NVIDIA power profiles and see their impact at idle.
What is worth noting is that the specific latest BIOS for my GPU does not bump up the DC fan speed when the card is set to: Prefer maximum performance.

I will repeat the experiment when it gets close to 40°C outdoors and around 35°C indoors, which are both annoying temperatures for the human body.



 

·
Like a fox!
Joined
·
2,762 Posts
Here is another aha moment (a fresh experience of mine): my VGA now also has to deal with the Greek summer temperature range. :)
Today is the first of the hot days in central Greece: 30°C outdoors and 26.2°C indoors.
I thought it a good opportunity to test the NVIDIA power profiles and see their impact at idle.
What is worth noting is that the specific latest BIOS for my GPU does not bump up the DC fan speed when the card is set to: Prefer maximum performance.

I will repeat the experiment when it gets close to 40°C outdoors and around 35°C indoors, which are both annoying temperatures for the human body.
Just a note: you probably should never use "Prefer maximum performance". As you saw, it keeps the GPU clocks very high for no useful reason. Using "Prefer maximum performance" does not allow the card to idle and effectively disables NVIDIA's boost for most games. Adaptive should always be used.

Also, 35°C is extremely hot for indoor temps. Do you not have air conditioning?
 

·
Registered
Joined
·
952 Posts
Also, 35°C is extremely hot for indoor temps. Do you not have air conditioning?
We hardcore motorcyclists use only a plain fan in summer, and minimum heating in winter, so as to be in harmony with nature and tolerant of extreme cold and heat.
My motorcyclist jacket is 15 kilos, with removable parts for all seasons; the jacket and helmet together are worth as much as one RTX 3070 :) Safety first!
The worst hot period runs from the 15th of August to the 10th of September.
 

·
Registered
Joined
·
952 Posts
System ready for gaming benchmarks !!
I am now waiting for a fresh Microsoft keyboard and a Microsoft IntelliMouse Pro (the fastest input devices).
Then I will start FPS benchmarks and post the relevant screenshots. :cool:

A system flag, the way we used to make them back in 2005 (with a lot of love for our top hardware choices).


 

·
Registered
Joined
·
952 Posts
3DMark 11's judgement is that I belong to the top 30% of systems worldwide using equivalent technology. 😎🙃
Result details -> Detailed Scores

3DMark 11 Performance settings

When the system was stressed with the 3DMark 11 Extreme settings, the recorded performance was good enough too.
1080p with all bells and whistles active, needed and unneeded 3D filters alike.

Therefore, under real gaming conditions, the truth is somewhere in between those results. ;)

 