Overclock.net

1 - 20 of 23 Posts

Banned · 15,763 Posts · Discussion Starter #1 (Edited)
Google has built a multibillion-dollar business out of knowing everything about its users. Now, a video produced within Google and obtained by The Verge offers a stunningly ambitious and unsettling look at how some at the company envision using that information in the future.

Source

 

Registered · 324 Posts
Couldn't find the original video, only the Verge talking about the video. I'd prefer the source so I can make my own interpretations.
 

ObiwanShinobi · 211 Posts

Tinfoil hat much? lol
 

Registered · 5,374 Posts
The things being done to manipulate society and the way we feel about everything are truly shocking. This is nothing in comparison.
News channels, papers, and even the weather people have strict gag orders on everything. Remember the "forest" fires in Cali that burned city blocks to ash and melted cars while leaving the trees between houses unburned? Probably not, but I bet you heard about the "forest" fires. That was a gag order in action, and just one of hundreds of examples.

These big companies don't care about money; they already have plenty of that, so what's next? Control.
Why else would you want all this data? To help us use the internet more easily? Yeah, sure.

I'm not attacking you at all; it's just very naive to assume that anything marketed as helping you is actually there for that reason.
 

pass the butter · 537 Posts
This stuff doesn't even scare me anymore. It's too late.
 

Registered · 799 Posts
Don't be evil.

Be REALLY evil!
 

Prolly vaping... · 3,355 Posts
I wonder why Google changed their motto from "don't be evil."

More like "don't be Google." lol
 

ObiwanShinobi · 211 Posts
I see where this is going. OCN ToS restricts talking about certain things, and that's why Google keeps coming up in these types of discussions lol. Or maybe Google is trying to disrupt the big round table? :D

Either way, lobbyists destroyed everything that was good on this side of the pond long ago: FCC regulations and the strict chokehold on technology there is today. We haven't really advanced much as a whole since the '60s. Sure, we made the transistor go from the size of a coffee table down to where about a thousand of them fit in a human red blood cell, needing FinFET designs so that quantum mechanics can't mess with the flow of electrons, but have we really advanced much? I mean, LCD technology got smaller and we have more pixels, but how has display tech actually gotten any better when you look at the fundamentals of how it works? Just keep adding more pixels and lie about all the other specs lol.

Look at processor technology. A brand new i7, clock for clock, isn't that much better than my old Core i3 550 was for 4K gaming when you really look at it. Sure, you can have bottlenecks in some games, but that thing should be like 40 times more effective as a processor by now, not just a margin of maybe 10-30 percent depending on the game at 4K. And graphics card technology: two years ago the 1080 came out, and the brand new 1180 or whatever is only going to be 30 percent faster.

The favorite device in the WORLD, the iPhone, couldn't even send picture messages while flip phones could. It's called being robbed as a society and tricked into shelling out all our money for limited advancements. In 100 years we'll still be doing the same stupid circle, because that's how a free market works and that's how it affects our tech. Not trying to get political here; I'm just saying how our economy affects our tech and industries. It's all about lying now, and even people smarter than I am are fooled by it. People get brainwashed so easily it's not even funny. Can't believe specifications until someone trusted tests them for real lol.
 

Registered · 5,374 Posts
> [quoting ObiwanShinobi] "Look at processor technology... a brand new i7 clock for clock isn't that much better than my old Core i3 550 was for 4K gaming [...] the brand new 1180 or whatever is only going to be 30 percent faster."
While you're right that there isn't much difference between CPUs at 4K, you're dead wrong about the CPUs even being close in terms of performance. It's like saying my 4690K is as good as an 8700K because I'm using a 750 Ti. Remove the GPU, compare the CPUs to each other, and you'll see a massive difference. Granted, from generation to generation the difference is usually 5 to 10%, and that's OK. That just means your CPU stays relevant much longer; there's no need to buy a CPU every year lol, and if you did I can imagine your disappointment.

As for GPUs, even a 20% improvement over a two-year period is pretty massive IMO; I'm not sure what more you could ask for. And there's way more to it than raw horsepower. I'm not sure what magic Nvidia did here, but my 980 couldn't do 1440p at 60fps in a great deal of games, while my 1080 gets either well over 60fps in demanding games or well over 144fps in shooters with faster engines. If you look at 1080p benchmarks there isn't a massive gap between them. The way I see it, my 980 was as good at 1080p as my 1080 is at 1440p, which is about 78% more pixels.

At the end of the day, every CPU maker could give it all they've got each time, but then we'd get stuck like Intel with 10nm. Can you imagine if Intel had released a 6-core 4790K? They'd hit that wall much sooner and wouldn't really have anything to sell. Without anything to sell each year they'd hemorrhage money working on 10nm, and we'd all still have 3-year-old CPUs. It's in their best interest to drag this out so OEMs can release new laptops and desktops each year to keep the cash flowing, while the crafty PC gamer keeps milking the $700 they spent on their last CPU and mobo.

I might be a little off base with my hypotheticals, but not by much. Maybe I'm the only one who can see how much better things get every few years, IDK, but I don't see why people get upset that their 1st and 2nd gen i7s are still great chips. I'd be really happy my investment took me that far, while the guy who built his first rig this year is really happy he has DDR4 and a really fast 6-core gaming chip.
 

ObiwanShinobi · 211 Posts
> [quoting the previous reply] "While you're right that there isn't much difference between CPUs at 4K, you're dead wrong about the CPUs even being close in terms of performance. [...] I'd be really happy my investment took me that far [...]"

Encoding performance and what I do on my PC are two very different things. This Coffee Lake i3 would destroy older i7 CPUs at encoding, but I still hate it because I can't overclock it, and Microcenter didn't even have an unlocked i3 to buy, which was stupid. They wanted 3 times the price for any unlocked processor. I should have bought from Newegg, but I wanted to build it right away.

My point was that the pace of technology advancement sucks. Moore's law says transistor counts are supposed to double every 18 months, with much lower power as well as huge performance gains. I'm not seeing computers double every 18 months, and a brand new GTX 1080 can barely max out Crysis from 2007, over a decade later, at 4K/60fps. It still dips below 60fps even with the best i7 overclocked to 5GHz.
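For what it's worth, the "should be like 40 times more effective by now" figure earlier in the thread is just compounded doubling. A back-of-envelope sketch (my own, assuming the classic 18-month doubling period, not anything measured in the thread):

```python
# Projected growth factor if something really doubled every `doubling_months`.
def moores_law_multiple(years: float, doubling_months: float = 18.0) -> float:
    return 2 ** (years * 12 / doubling_months)

print(round(moores_law_multiple(8)))   # ~40x over 8 years (roughly i3-550 to Coffee Lake)
print(round(moores_law_multiple(10)))  # ~100x over a decade
```

That compounding is why an 18-month doubling cadence implies roughly 40x in eight years; the gap between that projection and observed 10-30% per-generation CPU gains is the whole disagreement here.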

You tell me how a decade-old game still can't be maxed out by anything but a $2500+ computer today.

It's because we have lazy, garbage coders, and our hardware companies are complacent, withholding technology from us until we get pissed off enough to buy nothing at all.
 

Registered · 5,374 Posts
Crysis was given options that could tank anything. "Maxed out" is just silly at 4K; why does a GPU need to run 8x MSAA at 4K in a game that wasn't really optimized for 4K and hasn't been updated in years? And if it could do that, everyone would wonder why it can't do 8K. Look at a 980 trying to run Crysis at 4K; I couldn't even do it at 1440p and hold 60fps minimums. The fact that a 1080 can run Crysis 3 at 4K and maintain 60fps with sensible settings is nothing short of amazing when the GPUs are only one generation apart.

When I got into PC gaming I had a Core 2 Duo powered Dell that I put a $200 750 Ti in, which easily ran Crysis 3 at 30fps/1080p with settings that made consoles look horrible. A $100 PC that was old when I bought it, with a $200 GPU, ran every game I needed it to while making the console version look pretty bad. If I'd known anything about PCs at the time I could have had an old i7 rig with a used 750 Ti for around the same price and ended up with a great budget rig. I only bought/built it for Skyrim so I could play with mods, and it was a fairly new game at the time. It ran perfectly at 60fps maxed out. That impressed the hell out of me, because a used console would have cost about the same but looked much worse at a lower FPS.

Yes, the industry is held down to an extent, but it's nothing like the advancements held back by big oil. If Intel showed all their cards every time they'd make less money; same for Nvidia, and they are businesses. If there's a way for them to make more money, then they should do it. That money lines plenty of pockets, sure, but it also goes to research and development of future products.

No matter how good your hardware is, it will always be possible to bog it down. As for software, they aren't being lazy; they're making games for 5+ totally different platforms with different operating systems and hardware. Consoles dominate gaming, so obviously they get priority; they aren't as strong as a PC can be, so devs have to work really hard to make games run as well as they can there.

This stuff isn't made just for you and me, right here right now, so we can play games. It's made for many people with different needs, and companies tend to prioritize the ones that make up the biggest part of the market while also making a great product for us off the same wafer.

Moore's law isn't even a law; it's an observation, and one that held true until the last few years. Silicon has a limit somewhere, and each time they shrink a die they have to work harder on the next one. Intel, with all its money and manpower, is struggling to get good yields at 10nm. If making much better CPUs and GPUs were so easy with today's tech, AMD would have done it, and may still on 7nm.

That said, there have been instances of AMD, Intel, and Nvidia each trying to manipulate devs into coding for their products or against the others', lying about products outright, or using naming schemes to sell lesser silicon as the little brother of a better product, such as the 3GB vs 6GB GTX 1060.

I'm not saying you aren't right; I'm just saying it isn't as bad as you see it when you step outside yourself and look at it from their perspective and the perspective of other users.

Every time a piece of hardware exceeds the performance enthusiasts asked for, they move the goalposts by saying it isn't good enough because it can't do the next higher resolution. Three years ago it was "still can't do Crysis 1080p/60fps maxed out," and now it's 4K. That's way more than 2x the performance to get the same FPS at the same settings: 2,073,600 pixels vs 8,294,400 pixels, which is 4x as many. My 980 could do 1080p/60fps with reasonable settings, and now, just one generation later, my 1080 can do 4K/60fps at reasonable settings. It will dip under that from time to time, but so would my 980 at 1080p.
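The pixel counts being thrown around in this thread are easy to verify; a quick sketch (my own, not from any post, and note it counts raw pixels only — real GPU load doesn't scale perfectly linearly with pixel count):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1080p"])                    # 2073600
print(pixels["4K"])                       # 8294400
print(pixels["4K"] / pixels["1080p"])     # 4.0 -- 4K really is 4x 1080p
print(pixels["1440p"] / pixels["1080p"])  # ~1.78 -- 1440p is ~78% more pixels, not 70%
```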
 

BOINC Cruncher · 1,861 Posts
If you think technology isn't advancing as fast as you'd like, the phone you probably have in your pocket is more powerful than desktop computers from 15 or so years ago. Back then, something this powerful that fits in your pocket would have been unimaginable.
My first ever computer ran at only 4 MHz with an EGA graphics adapter that could only display 16 colors, and that was considered high end.
 

ObiwanShinobi · 211 Posts
> [quoting the previous reply] "Crysis was given options that could tank anything. 'Maxed out' is just silly at 4K [...] My 980 could do 1080p/60fps with reasonable settings, and now, just one generation later, my 1080 can do 4K/60fps at reasonable settings."
Hey, I'm just saying: when I got my old 8800 GTX, a 768MB card with just under 400 GFLOPS of compute, I was running World of Warcraft at over 200fps in non-densely-populated areas, and could run the first Halo at over 500fps at 1080p on my old Gateway photo-editing 1920x1200 panel. Most other older games ran well over 150fps at the time. Back then HUGE technology advancements were being made, going from NetBurst to the Conroe architecture, even in single-threaded performance per clock.

And the thing about Crysis from 2007 that sickens me the most is that it still looks better than most AAA games made to this day. There used to be a market for this stuff until consoles took over. Then look at the fact that the Xbox One X has a GPU with almost as many floating point operations per second as a GTX 1070, yet it's crap compared to my $600 gaming rig with a Coffee Lake i3 and GTX 970, and the Xbox is even locked to 30fps.

Usually I don't run AA or AF in a great-looking game like Crysis, but I will in something like PUBG, Shift 2 Unleashed, or Call of Duty, which have really terribad textures. At 1440p there was barely a need for those settings, period. You'd think that with the GPU shortage and everyone and their sister trying to mine on GPUs, more advancements would come along faster, with more supply and less demand, but the companies still don't care. I mean, we have 8K displays coming out and GPU companies won't even make a new connector with higher cable bandwidth. We should be able to use one cable that supports up to 8K 120Hz by now, expandable to 165Hz later.

Don't get me started on oil companies, because these stupid electric cars are a scam. Each Prius is worse for the environment than a Hummer by a factor of 2, because the Hummer lasts 3 times longer, doesn't have high carbon emissions to be built, and doesn't toxify the environment nearly as badly as the batteries from EVs. The battery plant in Ontario, Canada is literally so toxic that they use its outskirts to test Mars rovers. We should have had brushless motors over a decade ago and should be using graphene batteries by now.

Someone tell me why we're still using 30-pound stamped-steel wheels while they spend millions every year on more attractive hubcaps to literally trick people into thinking they're alloy wheels? Rotational mass is parasitic loss. Why are we still using drum brakes on a brand new Ford Focus? More unsprung weight on the suspension. Why does the brand new Prius have a WORSE drag coefficient than the first model? (Obviously to look more aggressive and trick people into thinking it's more aerodynamic than it is.) A brand new Prius gets the same fuel economy as a mid-'90s Honda EG hatchback. I know which one I'd pick.

> [quoting BOINC Cruncher] "If you think technology isn't advancing as fast as you want, the phone you probably have in your pocket is more powerful than desktop computers 15 or so years ago. [...]"
The phone in my pocket has a Snapdragon 821 and actually got a 6-second Super Pi 1M score; of course it's faster than a decade-old computer. Advancements only get driven by where the market is, but if that were really true, we should be seeing more GPU advancements from mining. :/
 

Tank destroyer and a god · 2,511 Posts
> [quoting pass the butter] "This stuff doesn't even scare me anymore. It's too late."
How much time until the Puppet Master emerges from the sea of information? :)
 
