It's a fantastic CRT, one of the best. Seller is probably being a bit optimistic trying to get that much for it though.
 
Might be worth it if it's brand new or in awesome condition.

With a bit of effort, it's possible to maintain these units.

A regular white point balance adjustment is important (See my guide here)

Also, if you can get a CRT tester/rejuvenator, such as the Sencore CR70 or CR7000, you can extend the life of the tube if it starts to fail.

There's a whole thread on hardforum dedicated to this monitor. It's been going strong since 2005.
 
Discussion starter · #4 ·
Quote:
Originally Posted by spacediver View Post

Might be worth it if it's brand new or in awesome condition.

With a bit of effort, it's possible to maintain these units.

A regular white point balance adjustment is important (See my guide here)

Also, if you can get a CRT tester/rejuvenator, such as the Sencore CR70 or CR7000, you can extend the life of the tube if it starts to fail.

There's a whole thread on hardforum dedicated to this monitor. It's been going strong since 2005.
Spacediver, do HDMI to VGA converters add much latency? I was a sad puppy the day my Sony 21" CRT died.

Did they ever make CRTs that natively used DVI-I interfaces?

That's an interesting thread over on hardforum -- there are some diehard fans of this Sony fw900.

Why did Sony recommend running this CRT at less than its highest resolution?
 
Quote:
Originally Posted by 8051 View Post

Spacediver, do HDMI to VGA converters add much latency? I was a sad puppy the day my Sony 21" CRT died.
I believe it depends on which one. From what I understand, the HDFury converters are excellent, but for now they won't support super high resolutions combined with high refresh rates (once we get 400 MHz RAMDACs on them, things will be good).
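As a rough illustration of why DAC bandwidth is the bottleneck: the pixel clock a mode needs is roughly total pixels per line (active plus blanking) times total lines times refresh rate. A quick sketch with assumed blanking overheads, not exact GTF/CVT timings:

```python
# Rough pixel-clock estimate for an analog video mode.
# The blanking overheads (~25% horizontal, ~4% vertical) are illustrative
# assumptions, not exact GTF/CVT timings.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.04):
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

for h, v, hz in [(1920, 1200, 96), (2304, 1440, 96)]:
    mhz = pixel_clock_mhz(h, v, hz)
    verdict = "within" if mhz <= 400 else "beyond"
    print(f"{h}x{v} @ {hz} Hz needs ~{mhz:.0f} MHz ({verdict} a 400 MHz DAC)")
```

With these assumptions, 1920x1200 @ 96 Hz sits near 290 MHz while pushing both resolution and refresh higher quickly approaches or exceeds a 400 MHz converter DAC.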
Quote:
Originally Posted by 8051 View Post

Did they ever make CRTs that natively used DVI-I interfaces?
I think for TVs, yes (I believe some of them had HDMI interfaces too), but I don't think so for monitors.
Quote:
Why did Sony recommend running this CRT at less than its highest resolution?
I believe it's because 1920x1200 (considered the prime mode) is the highest resolution that can more or less fully resolve each "pixel". If I have time in the future, I may perform proper tests to confirm this.
 
I'm sure there are CRTs with HDMI, as I had one: a Philips 32PW9551 (32 inch).
Still, I wouldn't use it for anything serious these days, as it runs 1080i (not p) and you can still see the screen pixels.
 
Quote:
Originally Posted by 8051 View Post

Why did Sony recommend running this CRT at less than its highest resolution?
This is typical of CRTs.

You almost never want to max out the resolution because then the refresh rate would be in the toilet and refresh rate on a CRT matters much more than on an LCD.

I had a pretty high-end CRT back in the day (I always wanted one of the FW900s but it was a bit pricey), and it could run 2048*1536...but only at a maximum of 66Hz. I normally used 1600*1200 @ 90Hz, because that was the sweet spot. For faster paced games I sometimes went down to 1024*768 @ 170Hz.
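The usual limiting factor is the monitor's maximum horizontal scan rate: the achievable refresh is roughly that scan frequency divided by the total scanlines per frame, so adding vertical resolution eats refresh almost linearly. A rough sketch, assuming an illustrative ~110 kHz scan limit and ~4% vertical blanking rather than any particular monitor's real specs:

```python
# Approximate the maximum refresh rate a CRT can manage at a given vertical
# resolution, limited by its horizontal scan frequency.
# The 110 kHz limit and 4% vertical blanking are illustrative assumptions.

H_SCAN_MAX_HZ = 110_000   # assumed horizontal scan limit
V_BLANK_FRAC = 0.04       # assumed vertical blanking overhead

def max_refresh_hz(v_active):
    v_total = v_active * (1 + V_BLANK_FRAC)
    return H_SCAN_MAX_HZ / v_total

for v in (1536, 1200, 768):
    print(f"{v} active lines -> roughly {max_refresh_hz(v):.0f} Hz maximum")
```

Real monitors also cap the maximum vertical refresh separately, and blanking varies per mode, so the exact sweet spots differ from model to model.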
Quote:
Originally Posted by spacediver View Post

I believe it's because 1920x1200 (considered the prime mode) is the highest resolution that can more or less fully resolve each "pixel". If I have time in the future, I may perform proper tests to confirm this.
This is true, to an extent, as well.

It was actually kind of a nice feature...being able to run resolutions past what the shadow mask supported and getting a form of free analog anti-aliasing...if you could tolerate the rather low refresh rates this usually meant.
 
If you're talking about scaler based supersampling: before DSR/VSR came out through drivers, some LCD monitors could supersample in the scaler as well. Some Dell LCDs were capable of accepting an EDID override up to 3200x1800 (something around there) at 60 Hz, and 3840x2160 @ 30 Hz. Many of the BenQ 1080p 144 Hz monitors (based on the MStar scaler) can run 2560x1440 @ 100 Hz. Granted, CRTs doing this type of scaling looked better than LCDs...
 
Discussion starter · #9 ·
Quote:
Originally Posted by Blameless View Post

This is typical of CRTs.

You almost never want to max out the resolution because then the refresh rate would be in the toilet and refresh rate on a CRT matters much more than on an LCD.

I had a pretty high-end CRT back in the day (I always wanted one of the FW900s but it was a bit pricey), and it could run 2048*1536...but only at a maximum of 66Hz. I normally used 1600*1200 @ 90Hz, because that was the sweet spot. For faster paced games I sometimes went down to 1024*768 @ 170Hz.
This is true, to an extent, as well.

Yes, I guess $2,300 US could be a bit pricey for a CRT.
 
Discussion starter · #10 ·
Quote:
Originally Posted by Falkentyne View Post

If you're talking about scaler based supersampling: before DSR/VSR came out through drivers, some LCD monitors could supersample in the scaler as well. Some Dell LCDs were capable of accepting an EDID override up to 3200x1800 (something around there) at 60 Hz, and 3840x2160 @ 30 Hz. Many of the BenQ 1080p 144 Hz monitors (based on the MStar scaler) can run 2560x1440 @ 100 Hz. Granted, CRTs doing this type of scaling looked better than LCDs...
I wonder if this type of LCD supersampling increased latency?
 
There's always some increased latency when using display scaling of any sort. How many milliseconds depends on the electronics; I believe it's 1-2 total frames of latency (at least that's what the Eizo Foris FS2735 manual says about display scaling!). The increase is the same whether the display scaling (not GPU scaling) is from a lower resolution (like 800x600) or a higher one (like 2560x1440 on a 1920x1080 monitor), but this only applies when using the monitor's image scaling settings. GPU scaling also adds latency, but that's video card/driver related. However, the 1:1 and Aspect ratio settings add more latency than "Full screen". Naturally, only "Full" is available when doing scaler supersampling, so that caveat only applies to lower resolutions.
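For a sense of scale, a latency quoted in frames converts to milliseconds as frames times 1000 divided by the refresh rate, so the same 1-2 frame penalty costs less wall-clock time at higher refresh rates. A minimal sketch:

```python
# Convert a latency quoted in whole frames into milliseconds
# at a few common refresh rates.

def frames_to_ms(frames, refresh_hz):
    return frames * 1000.0 / refresh_hz

for hz in (60, 100, 144):
    lo, hi = frames_to_ms(1, hz), frames_to_ms(2, hz)
    print(f"At {hz} Hz, 1-2 frames of scaler latency is {lo:.1f}-{hi:.1f} ms")
```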
 
Discussion starter · #14 ·
Quote:
Originally Posted by Falkentyne View Post

There's always some increased latency when using display scaling of any sort. How many milliseconds depends on the electronics; I believe it's 1-2 total frames of latency (at least that's what the Eizo Foris FS2735 manual says about display scaling!). The increase is the same whether the display scaling (not GPU scaling) is from a lower resolution (like 800x600) or a higher one (like 2560x1440 on a 1920x1080 monitor), but this only applies when using the monitor's image scaling settings. GPU scaling also adds latency, but that's video card/driver related. However, the 1:1 and Aspect ratio settings add more latency than "Full screen". Naturally, only "Full" is available when doing scaler supersampling, so that caveat only applies to lower resolutions.
On CRTs, the latency at lower resolutions would decrease because you could crank the vertical refresh rate higher.
 
Quote:
Originally Posted by Falkentyne View Post

If you're talking about scaler based supersampling
CRTs didn't use scalers, and that's not the phenomenon responsible for the slight blur seen at the extreme end of a CRT's usable resolutions.

The shadow mask or aperture grille has a finite resolution (dot pitch/stripe pitch), and scanning above this blurs the edges of pixels.
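As a rough sanity check of that limit, dividing the viewable width by the stripe pitch gives the number of phosphor stripe triads the grille actually has across the screen. The pitch and width below are illustrative approximations, not the FW900's exact specifications:

```python
# Estimate how many horizontal "pixels" an aperture grille can resolve
# from its stripe pitch and viewable width.
# Both numbers are illustrative approximations, not exact FW900 specs.

VIEWABLE_WIDTH_MM = 480.0   # assumed viewable image width
STRIPE_PITCH_MM = 0.24      # assumed aperture grille stripe pitch

stripes = VIEWABLE_WIDTH_MM / STRIPE_PITCH_MM
print(f"Roughly {stripes:.0f} phosphor stripe triads across the screen")
```

With around 2000 triads across the tube under these assumptions, roughly 1920 horizontal pixels is about the most the grille can cleanly resolve, which is consistent with treating 1920x1200 as the prime mode; driving 2304 pixels across it smears neighbouring pixels together.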
 
This is the best gaming monitor ever released, TBH. Remember that CRT is much better technology than LCD; it's just bulky and problematic to set up.

I would NEVER buy one nowadays though, because FD Trinitron tubes after 20,000 hours are usually junk.

It's a novelty and a way to scam people out of their money for a retro holy grail.

Hell, all retro is ridiculously overpriced nowadays, to a point where you're better off not caring for it.
 
Discussion starter · #18 ·
Quote:
Originally Posted by Astreon View Post

This is the best gaming monitor ever released, TBH. Remember that CRT is much better technology than LCD; it's just bulky and problematic to set up.

I would NEVER buy one nowadays though, because FD Trinitron tubes after 20,000 hours are usually junk.

It's a novelty and a way to scam people out of their money for a retro holy grail.

Hell, all retro is ridiculously overpriced nowadays, to a point where you're better off not caring for it.
Is there no way to get the Trinitron tubes replaced or reconditioned?

At least this used Sony CRT is cheaper than it was new (even cheaper if you consider over a decade of inflation).
 