[Giz]Samsung’s New TV Tech Is Mind-Bending But Why?

post #1 of 13 - 01-08-2019, 11:47 AM - Thread Starter
New to Overclock.net
 
EniGma1987
 
Join Date: Sep 2011
Posts: 6,057
Rep: 327 (Unique: 240)
[Giz]Samsung’s New TV Tech Is Mind-Bending But Why?

Quote:
On the eve of the Consumer Electronics Show (CES), there’s an itching feeling in the air that this year isn’t a big year for Samsung. The company unveiled (literally) a smaller version of last year’s big TV, the Wall, featuring MicroLED technology at an event in a ballroom. More than one person wondered, what’s the difference? The answer, awkwardly, is everything and nothing.


...


In addition to the smaller version of the Wall, the company revealed tall, skinny MicroLED displays alongside long, fat displays that broke into pieces. Samsung also had a hulking 219-inch version of the Wall, which frankly looks like the highest resolution digital billboard you’ve ever seen.

Source

EniGma1987 is offline  
post #2 of 13 - 01-08-2019, 02:01 PM
Frog Blast The Vent Core
 
Join Date: Jan 2014
Posts: 5,950
Rep: 367 (Unique: 181)
The issue with inorganic emissive displays is that if you're using traditional silicon processing, the wafers just aren't big enough to make a TV. Most wafers are 8", with the top end around 12". Bigger than that and you run into major problems with yield (not to mention lithography), and for a wafer large enough to have a full TV on it, the yield would be abysmal. The reason OLED gets around this is that it can just be printed on whatever size substrate you want; the deposition technology and the substrate are completely different.

The "modular" concept is Samsung working around the wafer size limitation. Make individual displays out of a slice of a wafer, and stitch them together, and now you have an inorganic emissive display. But you're still dealing with 2400 square inches of silicon to make a 75" TV, and that's well outside consumer price ranges.

Micro LED will be great for phones. It'll be really hard to use it for TVs.
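To put rough numbers on the wafer math above, here's a quick back-of-the-envelope sketch (my own figures, assuming a 16:9 panel and circular wafers, ignoring edge exclusion and yield):

Code:
import math

def panel_area_sq_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Screen area in square inches for a given diagonal and aspect ratio."""
    width = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    height = diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)
    return width * height

def wafer_area_sq_in(diameter_in):
    """Area of a circular wafer, ignoring edge exclusion and defects."""
    return math.pi * (diameter_in / 2) ** 2

tv = panel_area_sq_in(75)  # roughly 2,400 sq in for a 75" 16:9 panel
print(f"75-inch panel area: {tv:.0f} sq in")
for diameter in (8, 12):
    print(f"  ~{tv / wafer_area_sq_in(diameter):.0f} wafers' worth of area at {diameter} inches")

That works out to roughly 48 eight-inch wafers or about 21 twelve-inch wafers of emitter area, which lines up with the "about 50 wafers" estimate later in the thread.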
Mand12 is offline  
post #3 of 13 - 01-08-2019, 02:18 PM
Indentified! On the Way!!
 
LancerVI
 
Join Date: May 2012
Posts: 2,496
Rep: 149 (Unique: 111)
Quote: Originally Posted by Mand12
The issue with inorganic emissive displays is that if you're using traditional silicon processing, the wafers just aren't big enough to make a TV. Most wafers are 8", with the top end around 12". Bigger than that and you run into major problems with yield (not to mention lithography), and for a wafer large enough to have a full TV on it, the yield would be abysmal. The reason OLED gets around this is that it can just be printed on whatever size substrate you want; the deposition technology and the substrate are completely different.

The "modular" concept is Samsung working around the wafer size limitation. Make individual displays out of a slice of a wafer, and stitch them together, and now you have an inorganic emissive display. But you're still dealing with 2400 square inches of silicon to make a 75" TV, and that's well outside consumer price ranges.

Micro LED will be great for phones. It'll be really hard to use it for TVs.
Thank you for this. I was thinking this is the year to buy a new TV and finally make the jump to 4K, but all I've heard lately is how MicroLED is going to crush LED and OLED, so it gave me pause. Based on what you're saying, this won't happen for quite some time, if ever?
LancerVI is offline  
post #4 of 13 - 01-08-2019, 02:31 PM
New to Overclock.net
 
ILoveHighDPI
 
Join Date: Oct 2011
Posts: 3,169
Rep: 132 (Unique: 84)
75" is exactly the size of space I measured out for the largest possible display in my Home Theater area.
If they can bring MicroLED to market in that format, at less than astronomical prices, they'll have a winner of a product.
ILoveHighDPI is offline  
post #5 of 13 - 01-08-2019, 02:45 PM
Frog Blast The Vent Core
 
Join Date: Jan 2014
Posts: 5,950
Rep: 367 (Unique: 181)
Quote: Originally Posted by LancerVI
Thank you for this. I was thinking this is the year to buy a new TV and finally make the jump to 4K, but all I've heard lately is how MicroLED is going to crush LED and OLED, so it gave me pause. Based on what you're saying, this won't happen for quite some time, if ever?
Basically. As long as they're using traditional silicon processing for their inorganic LEDs, wafer size will be a dramatic limitation. OLED doesn't use it, so it can be as large as you want it to be. LCD also isn't fabricated by silicon processing, which is how we got where we are now.

I used a lot of microdisplays in my last job, and you can make emissive inorganic displays using quantum dots (which is --NOT-- what Samsung QLED uses). But these were microdisplays: pixels around 8 microns (with color subpixels) and a total display size of around an inch. It was for helmet-mounted displays, not living rooms.
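For a sense of the scale gap between a ~1" microdisplay and a living-room panel, here's a rough pixel-pitch comparison (my own arithmetic, assuming square pixels on a 16:9 panel):

Code:
import math

def pixel_pitch_um(diagonal_in, res_w, res_h):
    """Approximate pixel pitch in microns, assuming square pixels."""
    width_in = diagonal_in * res_w / math.hypot(res_w, res_h)
    return width_in / res_w * 25_400  # 1 inch = 25,400 microns

# A 75" 4K TV has pixels roughly 430 microns across -- huge by wafer-fab standards.
print(f'75" 3840x2160 panel: ~{pixel_pitch_um(75, 3840, 2160):.0f} um pitch')

# An 8 um pitch on a ~1" 16:9 diagonal implies well over 2,500 pixels across --
# microdisplay territory, nothing like a living-room panel.
pixels_across = (1 * 3840 / math.hypot(3840, 2160)) * 25_400 / 8
print(f'1" panel at 8 um pitch: ~{pixels_across:.0f} pixels across')

Same class of emissive technology, but roughly a 50x difference in pixel pitch, which is why scaling up to TV sizes means tiling wafer-sized modules rather than growing one giant die.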

Inorganic emissive displays will be the best, no question: all the advantages of emissive displays but without the fading issues of OLED. Except they'll be small, unless you build them out of pieces, like Samsung says it wants to do.

But again, that 75" TV is twenty four hundred square inches of silicon. That's about 50 wafers worth for your TV. Wafers. Start counting the limbs you're willing to pony up.

Last edited by Mand12; 01-08-2019 at 02:53 PM.
Mand12 is offline  
post #6 of 13 - 01-08-2019, 02:54 PM - Thread Starter
New to Overclock.net
 
EniGma1987
 
Join Date: Sep 2011
Posts: 6,057
Rep: 327 (Unique: 240)
Quote: Originally Posted by Mand12
Basically. As long as they're using traditional silicon processing for their inorganic LEDs, wafer size will be a dramatic limitation.
I'm not really an expert on LEDs and how they're made, but Samsung's MicroLED uses gallium-nitride LED tech, which I think is standard for the LEDs in light bulbs and the like on the market. Do gallium-nitride LEDs actually get made on silicon wafers? I know there's a lot of research into gallium-nitride transistors as a replacement for silicon, so if GaN is set to replace silicon, couldn't the GaN LEDs Samsung uses for MicroLED be manufactured and processed differently?

EniGma1987 is offline  
post #7 of 13 - 01-08-2019, 02:56 PM
Graphics Junkie
 
UltraMega
 
Join Date: Feb 2017
Location: USA
Posts: 589
Rep: 7 (Unique: 7)
Quote: Originally Posted by LancerVI
Thank you for this. I was thinking this is the year to buy a new TV and finally make the jump to 4K, but all I've heard lately is how MicroLED is going to crush LED and OLED, so it gave me pause. Based on what you're saying, this won't happen for quite some time, if ever?

I'd say if you're currently on a 1080p 60Hz display, you should definitely upgrade to a cheap 4K TV. If 120Hz isn't important to you, the TVs out now are great for the price. I got a 43-inch Samsung 6900 or something; it has full 10-bit HDR, and HDR is really cool for the games that support it, and 4K is obviously also great... for about $300. I feel like for the price it's a steal. I'm waiting for 120Hz 4K TVs to become more widespread and affordable for my next upgrade, but I think that's still about two years away, so now is a great time to get a quality HDR 4K TV for the price.

i7-7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce GTX 1080 Ti
UltraMega is offline  
post #8 of 13 - 01-08-2019, 03:10 PM
Overclocking Enthusiast
 
ozlay
 
Join Date: Aug 2009
Location: USA
Posts: 5,330
Rep: 268 (Unique: 215)
But why do we have 4K 5" screens and not 8K 20" screens? I want a 40" 16K. What was the 75" panel? Was it 8K?
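For perspective on the pixel-density side of that question, a quick sketch (my own numbers, assuming 16:9 panels):

Code:
import math

def ppi(diagonal_in, res_w, res_h):
    """Pixels per inch for a panel of the given diagonal and resolution."""
    return math.hypot(res_w, res_h) / diagonal_in

for label, diag, w, h in [
    ('5" 4K phone panel', 5, 3840, 2160),
    ('20" 8K panel', 20, 7680, 4320),
    ('40" 16K panel', 40, 15360, 8640),
    ('75" 4K TV', 75, 3840, 2160),
]:
    print(f'{label:<20} ~{ppi(diag, w, h):.0f} PPI')

So a 20" 8K or a 40" 16K panel only needs about half the pixel density of a 5" 4K phone (roughly 440 PPI vs. 880 PPI); presumably the harder part is manufacturing and driving that many pixels over a much larger area.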

In Loving Memory Of Kassandra

Kaby / Rikka (11 items):
CPU: Intel i5-7600K | Motherboard: ASRock Z270M Extreme4 | GPU: 2x Titan X | RAM: Galax HOF | Hard Drives: 850 Pro, 960 Pro, 970 Evo | Power Supply: Seasonic Prime Snow | Case: Enthoo Evolv mATX Tempered Glass | OS: Windows 10 Pro

Last edited by ozlay; 01-08-2019 at 03:20 PM.
ozlay is offline  
post #9 of 13 - 01-08-2019, 03:19 PM
New to Overclock.net
 
tubers
 
Join Date: Mar 2009
Posts: 3,417
Rep: 50 (Unique: 46)
Quote: Originally Posted by UltraMega
I'd say if you're currently on a 1080p 60Hz display, you should definitely upgrade to a cheap 4K TV. If 120Hz isn't important to you, the TVs out now are great for the price. I got a 43-inch Samsung 6900 or something; it has full 10-bit HDR, and HDR is really cool for the games that support it, and 4K is obviously also great... for about $300. I feel like for the price it's a steal. I'm waiting for 120Hz 4K TVs to become more widespread and affordable for my next upgrade, but I think that's still about two years away, so now is a great time to get a quality HDR 4K TV for the price.
Careful with a cheap 4K TV; the motion handling can be worse. At least go up to maybe $500, since that's the baseline "common sales/deals" price for the TCL 6 series, which gets you much better HDR and Dolby Vision support, or a better SDR TV.

The 55-inch TCL 6 has that weird cross-hatch pixel structure that looks a bit worse, especially as a monitor. Vizio has it too. Get a 65-inch from either company to get a proper pixel layout.

I got a cheap TCL 4, and it has an inherent low level of soap-opera effect, plus quite a bit of motion smearing that can't be turned off. Gamma shift and viewing angles also took a hit. Our eight-year-old IPS 1080p TV was superior in everything except size and black levels.

Lower-end Samsungs also have judder problems at 60Hz if you're sensitive to that, though I'm guessing you can force a PC to output 24p if needed.

That being said, check for deals from now until the new sets arrive (anywhere from March to July) to possibly catch clearance prices on 2018 TVs.

Be careful with Samsungs and check first whether you can completely disable GLOBAL DIMMING. There's no consensus on whether the scene-dependent brightness fluctuation can truly be eliminated without digging into the service menus.

Google the NU8000 and Q6FN subtitle problems and see if that will be an issue for you as well.

If you're picky, maybe get a set with no local dimming, or at least not a Samsung that makes it hard to completely turn off local and global dimming (two different cans of worms). Sony and Vizio let you turn it off completely, with no global dimming problems on the P series and X900F, but it's best to ask in AV forums or check RTINGS to be sure.

RTINGS isn't perfect, so cross-reference with owner forums and other reviews.
tubers is offline  
post #10 of 13 - 01-08-2019, 04:22 PM
Indentified! On the Way!!
 
LancerVI
 
Join Date: May 2012
Posts: 2,496
Rep: 149 (Unique: 111)
Quote: Originally Posted by Mand12
Basically. As long as they're using traditional silicon processing for their inorganic LEDs, wafer size will be a dramatic limitation. OLED doesn't use it, so it can be as large as you want it to be. LCD also isn't fabricated by silicon processing, which is how we got where we are now.

I used a lot of microdisplays in my last job, and you can make emissive inorganic displays using quantum dots (which is --NOT-- what Samsung QLED uses). But these were microdisplays: pixels around 8 microns (with color subpixels) and a total display size of around an inch. It was for helmet-mounted displays, not living rooms.

Inorganic emissive displays will be the best, no question: all the advantages of emissive displays but without the fading issues of OLED. Except they'll be small, unless you build them out of pieces, like Samsung says it wants to do.

But again, that 75" TV is twenty four hundred square inches of silicon. That's about 50 wafers worth for your TV. Wafers. Start counting the limbs you're willing to pony up.
Quote: Originally Posted by UltraMega
I'd say if you're currently on a 1080p 60Hz display, you should definitely upgrade to a cheap 4K TV. If 120Hz isn't important to you, the TVs out now are great for the price. I got a 43-inch Samsung 6900 or something; it has full 10-bit HDR, and HDR is really cool for the games that support it, and 4K is obviously also great... for about $300. I feel like for the price it's a steal. I'm waiting for 120Hz 4K TVs to become more widespread and affordable for my next upgrade, but I think that's still about two years away, so now is a great time to get a quality HDR 4K TV for the price.
Quote: Originally Posted by tubers
Careful with a cheap 4K TV; the motion handling can be worse. At least go up to maybe $500, since that's the baseline "common sales/deals" price for the TCL 6 series, which gets you much better HDR and Dolby Vision support, or a better SDR TV.

The 55-inch TCL 6 has that weird cross-hatch pixel structure that looks a bit worse, especially as a monitor. Vizio has it too. Get a 65-inch from either company to get a proper pixel layout.

I got a cheap TCL 4, and it has an inherent low level of soap-opera effect, plus quite a bit of motion smearing that can't be turned off. Gamma shift and viewing angles also took a hit. Our eight-year-old IPS 1080p TV was superior in everything except size and black levels.

Lower-end Samsungs also have judder problems at 60Hz if you're sensitive to that, though I'm guessing you can force a PC to output 24p if needed.

That being said, check for deals from now until the new sets arrive (anywhere from March to July) to possibly catch clearance prices on 2018 TVs.

Be careful with Samsungs and check first whether you can completely disable GLOBAL DIMMING. There's no consensus on whether the scene-dependent brightness fluctuation can truly be eliminated without digging into the service menus.

Google the NU8000 and Q6FN subtitle problems and see if that will be an issue for you as well.

If you're picky, maybe get a set with no local dimming, or at least not a Samsung that makes it hard to completely turn off local and global dimming (two different cans of worms). Sony and Vizio let you turn it off completely, with no global dimming problems on the P series and X900F, but it's best to ask in AV forums or check RTINGS to be sure.

RTINGS isn't perfect, so cross-reference with owner forums and other reviews.
Thanks to all of you for the info.

Yes, I'd be moving from a Sammy 65" 1080p @ 60Hz (120 Motion blah, blah, blah marketing jargon) to a 65" 4K, and it MUST have native 120Hz. I'm a huge sports fan and I'm tired of "streaky" baseballs when watching MLB. Too afraid of OLED image retention and burn-in, so I'm probably going to stick with LED.

Last edited by LancerVI; 01-08-2019 at 05:13 PM.
LancerVI is offline  