Originally Posted by Tator Tot
If they spent the money on CRT research....
but size would be a huge thing to tackle.
For TVs it wouldn't matter so much. But size is still a huge problem to get over for computers.
It could actually be overcome. I don't have the link offhand, and I forget the name they gave the technology, but Canon and a few other companies were working on a CRT variant where, instead of a single gun scanning the whole image, the screen was split up into one gun for every square inch or so.
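The depth win from splitting the gun up is easy to see with a bit of trig: at a fixed deflection angle, the distance the gun needs behind the glass scales with the width of the area that one gun has to sweep. Quick back-of-the-envelope sketch (the 90-degree deflection angle and the 40 cm screen are just typical numbers I picked, not from whatever article that was):

[code]
import math

def gun_depth(covered_width_cm, full_deflection_deg=90.0):
    """Distance a gun must sit behind the region it sweeps:
    depth = (width / 2) / tan(deflection_angle / 2).
    Ignores glass thickness, gun length, geometry correction."""
    half_angle = math.radians(full_deflection_deg / 2)
    return (covered_width_cm / 2) / math.tan(half_angle)

# One gun sweeping a whole 40 cm wide screen:
print(f"single gun:   {gun_depth(40.0):.1f} cm behind the glass")

# One gun per ~1 inch (2.54 cm) tile, same deflection angle:
print(f"per-tile gun: {gun_depth(2.54):.2f} cm behind the glass")
[/code]

Same deflection angle in both cases, but the per-tile guns only need about a centimeter of depth instead of twenty, which is the whole trick.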
Also, if you compare more modern CRTs with older ones, you'll find the gun doesn't have to sit as far back from the glass. The reason for the depth in the first place is geometry: the farther back the gun sits, the smaller the deflection angle needed to cover the screen. A smaller deflection angle makes it easier to drive larger screens at higher resolutions, because steering the beam through a wider arc requires more powerful deflection coils, and the more powerful the coils, the harder it is to place the beam accurately.

This is also why CRT TVs were much shallower than PC CRTs for their size. A TV only had to scan roughly 640x480 at 30 full frames per second (interlacing let it draw the odd lines in one field and the even lines in the next), and its phosphor dots were larger, so it never needed the accuracy of a PC monitor. I'm rambling now, but my point is that it's entirely possible to bring the gun closer to the screen and cut the depth.
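To put rough numbers on the TV-versus-monitor point: the NTSC figures below are the real ones, but the monitor timing is just a typical VESA-ish mode from memory, so treat it as ballpark:

[code]
# NTSC TV: 525 total scan lines at ~29.97 frames/s, drawn as two
# interlaced 262.5-line fields -> the yoke sweeps ~15.7k lines/s.
tv_lines, tv_fps = 525, 29.97
print(f"TV horizontal rate:      ~{tv_lines * tv_fps / 1000:.1f} kHz")

# PC monitor at 1024x768 @ 85 Hz, progressive, roughly 800 total
# lines per frame counting blanking -> over four times the line
# rate, and the beam has to land on far smaller phosphor dots.
mon_lines, mon_fps = 800, 85
print(f"Monitor horizontal rate: ~{mon_lines * mon_fps / 1000:.1f} kHz")
[/code]

That gap in line rate and dot pitch is the accuracy headroom that let TVs get away with wide-angle yokes and stay shallow.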