Originally Posted by Wall Street
A tear occurs when the display switches from one image to another that is noticeably different mid-screen. The higher the framerate, the less things change between frames, so the smaller the tear will be. For example, at 120 FPS you will get four small tears rather than the one big visible tear you would get at 30 FPS. The speed of the game and how fast things change in-game also affect the perception of tearing: slower games exhibit less tearing because the screen looks very similar frame to frame, while the fastest games have quickly moving objects that you are more likely to see "cut" above and below the tear line.
Actually, it is the opposite. Frames are meant to be shown in sequence, and if your monitor can't refresh fast enough to display every frame, a lower FPS is actually better. If it skipped from frame 1 to frame 4, you might not notice, because there isn't a lot of difference between frames 1 and 4. However, if it jumped from frame 1 to frame 10, you will probably see a large tear, since there is likely quite a bit of difference between frames 1 and 10. Think of it like one of those flip-book pads of pictures, or a movie film: if a lot of the pictures were chopped out or a chunk of film removed, you'd see weird stuff displayed. And while actually playing the game you will be skipping from frame 1 to 10 to 19 to 28, etc., so you get tearing over and over; this cumulative effect is what we see. If you missed 10 frames just once at 60 FPS, you probably wouldn't notice it, or if you did, it would just be something you took in stride and kept playing.
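The frame-skip reasoning above can be sketched with a few hypothetical numbers: if an on-screen object moves a fixed distance every frame, then the visible "cut" at a tear line is just the distance it moved between the frame shown above the tear and the frame shown below it. The speed value here is made up purely for illustration.

```python
# Hypothetical: an object moving 8 px per rendered frame.
OBJECT_SPEED_PX_PER_FRAME = 8

def tear_offset(frame_above: int, frame_below: int) -> int:
    """Horizontal discontinuity (in pixels) at the tear line when the
    top of the screen shows frame_above and the bottom shows frame_below."""
    return (frame_below - frame_above) * OBJECT_SPEED_PX_PER_FRAME

# Small skip (frame 1 to 4): a modest offset you might not notice.
print(tear_offset(1, 4))   # 24 px
# Big skip (frame 1 to 10): a much larger, clearly visible cut.
print(tear_offset(1, 10))  # 72 px
```

The point is only that the tear's visibility scales with how many frames get skipped, which is why a large jump (1 to 10) produces a more obvious cut than a small one (1 to 4).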
I personally don't seem to notice tearing until about 80 FPS. I know it happens from 60 to 79 FPS on my crummy old monitor, but I just don't see it. However, when I do see it, it really bothers me, so I typically have adaptive Vsync turned on.

Edited by Vagrant Storm - 10/23/13 at 1:59pm