21 - 26 of 26 Posts
Quote:
Originally Posted by r0ach View Post

The last I heard, Google's flagship product, the Nexus 9, was unusable due to memory leaks. Java precompiles to ARM (normal devices), then Nvidia Denver compiles to its own cached instruction set at runtime (what the @#$%#@?). So good old Nvidia is driving ARM/Android toward even higher latency in the first 64-bit chips.
Well, I would hardly consider the N9 a flagship. It never should have happened, and it was clear it would flop the second it was announced. Same goes for Nvidia's Android chips: they've been at it for years and have only failed. The 5.0 memory leaks were also a giant disaster, but they've been fixed in 5.1, which has been a pleasure to use.
 
Discussion starter · #22 ·
Quote:
Originally Posted by aleexkrysel View Post

Well, I would hardly consider the N9 a flagship. It never should have happened, and it was clear it would flop the second it was announced. Same goes for Nvidia's Android chips: they've been at it for years and have only failed. The 5.0 memory leaks were also a giant disaster, but they've been fixed in 5.1, which has been a pleasure to use.
qsxcv noticed that Maxwell renders borderless window mode much laggier than Kepler and previous chips, but renders full-screen mode the same. I noticed it too. Linux-based distributions seem to render in borderless window mode. So combine the fact that Nvidia is now shipping Android chips with Maxwell cores with the whole Denver runtime-compile thing, and Nvidia seems to be the worst thing to happen to Android in terms of user responsiveness. When I talked to people like ManuelG from Nvidia years ago, it was almost as if he didn't even understand what the term input lag means or why it would be important in a video game.
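As a rough illustration of why composited (borderless windowed) presentation can feel laggier than exclusive fullscreen, here's a back-of-envelope latency model. The numbers and the "one extra compositor frame" assumption are illustrative, not measurements of any particular GPU or driver:

```python
# Rough presentation-latency model (illustrative assumptions only):
# exclusive fullscreen can flip a finished frame straight to the
# display, while a desktop compositor (borderless windowed) typically
# buffers at least one extra frame before scanout.

def present_latency_ms(refresh_hz, compositor_frames):
    """Worst-case added presentation latency in milliseconds:
    one frame of scanout wait plus any compositor queueing."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (1 + compositor_frames)

fullscreen = present_latency_ms(60, 0)  # direct flip
borderless = present_latency_ms(60, 1)  # assumed one compositor frame

print(f"fullscreen ~ {fullscreen:.1f} ms, borderless ~ {borderless:.1f} ms")
```

Under these assumptions the composited path roughly doubles the worst-case frame delivery time at 60 Hz, which is consistent with borderless mode feeling laggier even when frame rates look identical.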
 
Discussion starter · #24 ·
Quote:
Originally Posted by qsxcv View Post

why do you keep bringing that up

i've never used a kepler card so idk when nvidia changed stuff.

plus no one seems to care anyway
http://forums.blurbusters.com/viewtopic.php?f=2&t=1541

Something is strange about your computer, because if I do that test on a GTX 570 on Win 7 with the flip queue set to 0, there's only one tear line on the desktop, and it occurs at the exact same place every time, at the top quarter of the screen. But if you launch a borderless-window or full-screen game, it's gone: zero visible tearing anywhere once you leave the desktop. Meanwhile, your tearing seems to occur in a different place each frame on the desktop.
 
take a picture?

85 Hz CRT, 500 Hz mouse.
It's absolutely expected to see a cascade of tears at different locations, since 500/85 isn't an integer.
 
Quote:


I'm not going to read those sources at the moment, but I will say that audio I/O is a completely different monster from HID I/O, interrupts vs. polling being one obvious reason.
Not for me, it isn't. My gold-standard test: see whether any virtual drum machine / drum kit app is playable, meaning I can play a beat and keep time with it. That's been possible on any iOS device since the iPad 1; the original first-gen iPad has playable music apps on it. I've never seen a playable music app on Android. The input lag plus audio lag pushes the beat out of time. Essentially, the beat drags unless I start to play hits ahead of the time I want to hear them, and that's very tricky to keep up unless the tempo is very low, hence why I slow down on Android.
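The drum-app test above boils down to a simple latency budget: the sum of touch-input lag and audio-output lag has to stay under a human timing tolerance for the instrument to feel playable. A crude sketch of that reasoning, where the threshold and the per-platform figures are rough assumptions for illustration, not measurements:

```python
def playable(touch_ms, audio_out_ms, threshold_ms=30.0):
    """Crude playability test: total action-to-sound latency must fall
    under a rough human timing tolerance (threshold is an assumption;
    commonly cited tolerances for live playing are in the tens of ms)."""
    return touch_ms + audio_out_ms <= threshold_ms

# Illustrative figures only, not benchmarks of real devices:
low_latency_device = playable(touch_ms=15, audio_out_ms=10)    # under budget
high_latency_device = playable(touch_ms=60, audio_out_ms=100)  # way over
```

When the total sits well over the tolerance, the only way to land hits on the beat is to play ahead of it, which is exactly the compensation described above.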
 