
Sony GDM-FW900 24" CRT FS @ $1250!!???

13K views 41 replies 11 participants last post by  spacediver  
#1 ·
Isn't this monitor at least a decade old? Is it even possible to get CRTs repaired anymore?

For the record it's on ebay, free local pickup only (I don't have to wonder why).
 
#2 ·
It's a fantastic CRT, one of the best. Seller is probably being a bit optimistic trying to get that much for it though.
 
#3 ·
Might be worth it if it's brand new or in awesome condition.

With a bit of effort, it's possible to maintain these units.

A regular white point balance adjustment is important (see my guide here).

Also, if you can get a CRT tester/rejuvenator, such as the Sencore CR70 or CR7000, you can extend the life of the tube if it starts to fail.

There's a whole thread on hardforum dedicated to this monitor. It's been going strong since 2005.
 
#4 ·
Quote:
Originally Posted by spacediver View Post

Might be worth it if it's brand new or in awesome condition.

With a bit of effort, it's possible to maintain these units.

A regular white point balance adjustment is important (See my guide here)

Also, if you can get a CRT tester/rejuvenator, such as the Sencore CR70 or CR7000, you can extend the life of the tube if it starts to fail.

There's a whole thread on hardforum dedicated to this monitor. It's been going strong since 2005.
Spacediver, do HDMI to VGA converters add much latency? I was a sad puppy the day my Sony 21" CRT died.

Did they ever make CRTs that natively used DVI-I interfaces?

That's an interesting thread over on hardforum -- there are some diehard fans of this Sony FW900.

Why did Sony recommend running this CRT at less than its highest resolution?
 
#5 ·
Quote:
Originally Posted by 8051 View Post

Spacediver, do HDMI to VGA converters add much latency? I was a sad puppy the day my Sony 21" CRT died.
I believe it depends on which one. From what I understand, the HDFury converters are excellent, but for now they won't support super high resolutions combined with high refresh rates (once we get 400 MHz RAMDACs on them, then things will be good).
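To give a rough sense of why the RAMDAC clock matters so much, here's a quick back-of-the-envelope pixel clock estimate for a couple of the FW900's commonly used modes. The ~30% horizontal / ~4% vertical blanking overhead is just an assumed GTF-style figure, not the exact timings:

```python
# Approximate pixel clock needed for typical FW900 modes over VGA.
# Blanking overheads are assumptions for illustration, not measured timings.
h_blank, v_blank = 1.30, 1.04

for w, h, hz in [(1920, 1200, 96), (2304, 1440, 80)]:
    pclk_mhz = (w * h_blank) * (h * v_blank) * hz / 1e6
    print(f"{w}x{h}@{hz}Hz: ~{pclk_mhz:.0f} MHz pixel clock")
```

Both land around 300-360 MHz, which is why a converter DAC that can't clock near 400 MHz ends up being the bottleneck.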
Quote:
Originally Posted by 8051 View Post

Did they ever make CRT's that natively used DVI-I interfaces?
I think for TVs, yes (I believe some of them had HDMI interfaces also), but I don't think for monitors.
Quote:
Why did Sony recommend running this CRT at less than its highest resolution?
I believe it's because 1920x1200 (considered the prime mode) is the highest resolution that can more or less fully resolve each "pixel". If I have time in the future, I may perform proper tests to confirm this.
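In the meantime, here's the rough arithmetic behind it. The pitch and viewable width figures are approximate published specs, not my own measurements:

```python
# How many aperture grille stripes (phosphor triads) fit across the FW900's
# viewable width, compared with 1920 horizontal pixels?
viewable_width_mm = 484    # ~22.5" viewable diagonal at 16:10 (approximate)

for label, pitch_mm in [("center pitch", 0.23), ("edge pitch", 0.27)]:
    print(f"{label}: ~{viewable_width_mm / pitch_mm:.0f} triads across the width")
```

That works out to roughly 1800-2100 triads, so 1920 horizontal pixels is right around what the grille can physically resolve; push the resolution higher and adjacent pixels start sharing stripes.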
 
#7 ·
Quote:
Originally Posted by 8051 View Post

Why did Sony recommend running this CRT at less than its highest resolution?
This is typical of CRTs.

You almost never want to max out the resolution because then the refresh rate would be in the toilet and refresh rate on a CRT matters much more than on an LCD.

I had a pretty high-end CRT back in the day (I always wanted one of the FW900s but it was a bit pricey), and it could run 2048*1536...but only at a maximum of 66Hz. I normally used 1600*1200 @ 90Hz, because that was the sweet spot. For faster paced games I sometimes went down to 1024*768 @ 170Hz.
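The tradeoff falls straight out of the monitor's maximum horizontal scan rate. A rough sketch using the FW900's published 121 kHz figure (my old CRT's limits were a bit different; the ~5% vertical blanking overhead here is just an assumption):

```python
# A CRT's max refresh at a given resolution is roughly its max horizontal scan
# rate divided by the total scan lines per frame (visible lines + vertical
# blanking, assumed ~5% here).
max_h_scan_hz = 121_000    # FW900 published spec; other CRTs differ
blanking = 1.05            # assumed vertical blanking overhead

for w, h in [(2304, 1440), (1920, 1200), (1600, 1200), (1024, 768)]:
    print(f"{w}x{h}: ~{max_h_scan_hz / (h * blanking):.0f} Hz max")
```

More vertical lines means fewer complete frames per second out of the same scan rate, which is exactly the sweet-spot tradeoff described above.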
Quote:
Originally Posted by spacediver View Post

I believe it's because 1920x1200 (considered the prime mode) is the highest resolution that can more or less fully resolve each "pixel". If I have time in the future, I may perform proper tests to confirm this.
This is true, to an extent, as well.

It was actually kind of a nice feature...being able to run resolutions past what the shadow mask supported and getting a form of free analog anti-aliasing...if you could tolerate the rather low refresh rates this usually meant.
 
#8 ·
If you're talking about scaler-based supersampling: before DSR/VSR came out through drivers, some LCD monitors could supersample in the scaler too. Some Dell LCDs were capable of accepting an EDID override up to 3200x1800 (somewhere around there) at 60Hz, and 3840x2160@30Hz. Many of the BenQ 1080p 144Hz monitors (based on the MStar scaler) can run 2560x1440@100Hz. Granted, CRTs doing this type of scaling looked better than LCDs...
 
#9 ·
Quote:
Originally Posted by Blameless View Post

This is typical of CRTs.

You almost never want to max out the resolution because then the refresh rate would be in the toilet and refresh rate on a CRT matters much more than on an LCD.

I had a pretty high-end CRT back in the day (I always wanted one of the FW900s but it was a bit pricey), and it could run 2048*1536...but only at a maximum of 66Hz. I normally used 1600*1200 @ 90Hz, because that was the sweet spot. For faster paced games I sometimes went down to 1024*768 @ 170Hz.
This is true, to an extent, as well.
Yes, I guess $2300 US could be a bit pricey for a CRT.
 
#10 ·
Quote:
Originally Posted by Falkentyne View Post

If you're talking about scaler based supersampling, before DSR/VSR came out through drivers, some LCD monitors can supersample in the scaler also. Some Dell LCD's were capable of accepting an EDID override up to 3200x1800 (something around here) at 60hz, and 3840x2160@30hz. Many of the Benq 1080p 144hz monitors (based on the Mstar scaler) can run 2560x1440@100hz. Granted, CRT's doing this type of scaling looked better than LCD's...
I wonder if this type of LCD supersampling increased latency?
 
#11 ·
There's always some increased latency when using display scaling of any sort; how many milliseconds depends on the electronics. I believe it's 1-2 total frames of latency (at least that's what the Eizo Foris FS2735 manual says about using display scaling!). The added latency is the same whether the monitor's scaler is handling a lower resolution (like 800x600) or a higher one (like 2560x1440 on a 1920x1080 monitor), but this only applies when using the monitor's image scaling settings. GPU scaling also adds latency, but that's video card/driver related. However, using the 1:1 and Aspect ratio settings adds more latency than using "Full screen". Naturally only "Full" is available when doing scaler supersampling, so that distinction only applies at lower resolutions.
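To put "1-2 frames" into milliseconds (just arithmetic, not measurements of any particular monitor):

```python
# Convert "frames of latency" into milliseconds at a few common refresh rates.
for hz in (60, 100, 144):
    frame_ms = 1000 / hz
    print(f"{hz} Hz: 1 frame = {frame_ms:.1f} ms, 2 frames = {2 * frame_ms:.1f} ms")
```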
 
#14 ·
Quote:
Originally Posted by Falkentyne View Post

There's always some increased latency when using display scaling of any sort; how many milliseconds depends on the electronics. I believe it's 1-2 total frames of latency (at least that's what the Eizo Foris FS2735 manual says about using display scaling!). The added latency is the same whether the monitor's scaler is handling a lower resolution (like 800x600) or a higher one (like 2560x1440 on a 1920x1080 monitor), but this only applies when using the monitor's image scaling settings. GPU scaling also adds latency, but that's video card/driver related. However, using the 1:1 and Aspect ratio settings adds more latency than using "Full screen". Naturally only "Full" is available when doing scaler supersampling, so that distinction only applies at lower resolutions.
On CRTs the latency at lower resolutions would decrease, because you could crank the vertical refresh rate higher.
 
#16 ·
Quote:
Originally Posted by Falkentyne View Post

If you're talking about scaler based supersampling
CRTs didn't use scalers, and that's not the phenomenon responsible for the slight blur seen at the extreme end of a CRT's usable resolutions.

The shadow mask or aperture grille has a finite resolution (dot pitch/stripe pitch), and scanning above this blurs the edges of pixels.
 
#17 ·
This is the best gaming monitor ever released, TBH. Remember that CRT is much better technology than LCD; it's just bulky and problematic to set up.

I would NEVER buy one nowadays though, because FD Trinitron tubes are usually junk after 20,000 hours.

It's a novelty and a way to scam people out of their money for a retro holy grail.

Hell, all retro is ridiculously overpriced nowadays, to the point where you're better off not caring about it.
 
#18 ·
Quote:
Originally Posted by Astreon View Post

This is the best gaming monitor ever released, TBH. Remember that CRT is much better technology than LCD; it's just bulky and problematic to set up.

I would NEVER buy one nowadays though, because FD Trinitron tubes are usually junk after 20,000 hours.

It's a novelty and a way to scam people out of their money for a retro holy grail.

Hell, all retro is ridiculously overpriced nowadays, to the point where you're better off not caring about it.
Is there no way to get the Trinitron tubes replaced or reconditioned?

At least this used Sony CRT is cheaper than it was new (even cheaper if you consider over a decade of inflation).
 
#22 ·
Quote:
Originally Posted by Falkentyne View Post

There's always some increased latency when using display scaling of any sort; how many milliseconds depends on the electronics. I believe it's 1-2 total frames of latency
If it only buffers 2 lines or so, then there are only a few tens of microseconds of latency associated with it, which is completely negligible.
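Rough numbers behind that, assuming a standard 1080p60 signal with 1125 total lines per frame (the usual CEA timing):

```python
# Time to scan out one line = 1 / (refresh rate * total lines per frame).
refresh_hz = 60
total_lines = 1125    # 1080 visible + blanking for standard 1080p timing
line_us = 1e6 / (refresh_hz * total_lines)
print(f"one line: ~{line_us:.1f} us, two lines: ~{2 * line_us:.1f} us")
```

About 15 us per line, so a two-line buffer is ~30 us -- nothing compared to a full frame.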
 
#25 ·
Quote:
Originally Posted by spacediver View Post

not sure what the refresh rates were, but there were some high resolution medical grade monochrome CRTs, that I believe reached up to 5 megapixels
ummm... what?

CRTs and megapixels have nothing in common

the best, most accurate, most amazing line of monitors (color ones, at least) you could buy was the Sony BVM line, which had 900 TVL or even more, depending on the model, and absolutely unparalleled color accuracy.

The BVM-F1E is highly sought after by retro videogame freaks nowadays and costs a crapload of money. However, it's really unfit for gaming: 900 TVL will just net you thicker "scan lines" (those aren't really scanlines, but people call them that, so let's not go into detail) unless you like that effect (I don't), it's small (the F1E is 20 inch IIRC), and it's most likely worn out to the point where a regular consumer-grade CRT in good shape offers a better image (!).

However, if you could get your hands on an F1E with under 10,000 hours of operating time... get it, haha.
 
#26 ·
Quote:
Originally Posted by Astreon View Post

ummm... what?

CRTs and megapixels have nothing in common
The term was often used to describe the number of independent modulations of intensity that the CRT could handle in a single refresh period (e.g. a 2000x2000 resolution would be equivalent to 4 megapixels).

For example, see here
Quote:
Originally Posted by Astreon View Post

the best, most accurate, most amazing line of monitors (color ones, at least) you could buy was the Sony BVM line, which had 900 TVL or even more, depending on the model, and absolutely unparalleled color accuracy.
The GDM-FW900 uses the same phosphors as the BVMs (SMPTE-C phosphors), so the display primaries have identical chromaticities. The WinDAS WPB adjustment (see my guide here) allows you to calibrate the white point with excellent accuracy. And if your video card has a 10-bit DAC, you can use software to achieve even finer precision, both for white point and for gamma correction.

So I don't buy the claim that the BVMs were more color accurate than the high end GDMs. As for geometry and convergence, it wouldn't surprise me if the BVMs won out here, especially in the ability to calibrate these properties, but this is not an area I know much about.
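To illustrate the 10-bit DAC point with the raw quantization math (this is generic arithmetic, not an FW900 measurement): with a 2.2 gamma target, the luminance step between adjacent codes near black is roughly 20x smaller with 1024 DAC levels than with 256, which is what lets you dial in white point and gamma tracking that much more finely.

```python
# Luminance jump between adjacent DAC codes near black, for a gamma-2.2 target.
def step_near_black(levels, gamma=2.2, code=16):
    lo = (code / (levels - 1)) ** gamma
    hi = ((code + 1) / (levels - 1)) ** gamma
    return hi - lo    # difference in normalized luminance

for bits in (8, 10):
    print(f"{bits}-bit DAC: step near black ~ {step_near_black(2 ** bits):.1e}")
```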