
[Official] The Qnix/X-Star 1440p Monitor Club - Page 529

post #5281 of 25911
Quote:
Originally Posted by Shock96 View Post

Why is it UPS's fault that you can't hear the doorbell?

Because they don't read the note that says "KNOCK ON THE SIDE DOOR," nor do they leave the package when the note says to leave it.
Supa's Beast
(14 items)
CPU: i5-6660 | Motherboard: Gigabyte GA-Z170N-Gaming 5 | Graphics: GTX670-DC2-2GD5 | RAM: G.Skill TridentZ Series 16GB (2 x 8GB) DDR4-3200
Hard Drive: Samsung 950 PRO 512GB M.2-2280 | Cooling: Corsair H100 Liquid CPU Cooler | OS: Windows 10 Pro 64-bit | Monitor: BenQ 2710PT
Monitor: Acer Predator XB271HU 165 Hz | Keyboard: Ducky DK9087 Shine II TKL Blue LED (Brown) | Power: Corsair SF600 | Case: Ncase M1 V5
Mouse: Logitech G502 | Mouse Pad: Perixx cloth
post #5282 of 25911
Quote:
Originally Posted by exzacklyright View Post

Because they don't read the note that says "KNOCK ON THE SIDE DOOR," nor do they leave the package when the note says to leave it.

Although I agree it's pretty lazy of the individual, there is no policy that holds delivery drivers liable to follow any directions that are not attached to the package at shipment. So, however lazy it may have been, he didn't do anything "wrong." USPS and FedEx internal policies are the same or worse in this regard. I know because the company I work for uses all of them.
post #5283 of 25911
OK, I just received my matte Qnix about 20 minutes ago. No dead pixels or BLB, and it went straight to 120Hz. I have tried all of the 120Hz profiles floating around and in the OP, and the color is still really warm and yellow. This is a bit scary since my Shimian is dead on the white point and is much more blue relative to the Qnix. Is there anything I can do, or am I doing something wrong? I'm worried that none of the color profiles I've tried calibrate the white point to 6500K like my Shimian.
post #5284 of 25911
So I will probably update this post, as I feel I should compare this monitor, which I am going to dub the God LCD (at least for someone like me), against the God CRT: the Sony GDM-FW900 vs. the QNIX QX2710 (the jet-black bezel is better for movies in my book).

Overall, the picture quality and color reproduction of the CRT are unmatched, due in part to the invisible aperture grille and the diffusion of the electron beam through the vacuum tube. That diffusion was the feature I feared losing most in moving to this QNIX, but that has not been the case: it is extremely hard to make out the pixels on this screen, just as on the FW900 (though the grille/pixel structure is more apparent in games), and the text is extremely readable, just as Tek Syndicate stated on their YouTube channel. But this is not where the QNIX truly shines. It shines most in video playback. I watched Oblivion at 1080p from a very high bitrate x264 stream, and it was more enjoyable than on the FW900, mostly because of the size and the geometric uniformity of the panel, and because the natural shading and grain of film is somewhat obscured and enhanced by the backlit-LCD effect, which is how film is supposed to be shown: projected through a film onto a screen, which is essentially what an LCD is, albeit in a far more compact configuration. The GDM-FW900 CRT is too precise for video; even though it is the most pixel-dense CRT I have ever seen and its response is very fast, it is not as fast as the QNIX, which owes its speed to its internal rendering chips. The chips that make up a monitor's pixel engine are seldom discussed in commercial or amateur reviews, and they give the QNIX the upper hand in overall response time, even though its individual pixels will never match the speed of the electron beam on phosphors. The CRT shows too many flaws in the source material, and it comes off looking creamy, cheap, and too much on the surface of the screen, while the LCD has an interesting amount of depth to its rendering. And while the QNIX, with its larger size and resolution, shows more detail overall, it is not more detailed per unit area like the CRT.

When it comes to gaming and general OS usage, the GDM-FW900 is the clear winner: webpages, static images, documents, and wallpapers all render more realistically (better color reproduction) on the flat, paper-like surface of glowing milky phosphors, smoothed by the diffusion of the electron beam at very high resolution and pixel density. This is most evident in what I currently think should hold the graphics crown: Borderlands 2. With SweetFX enabled, Borderlands 2 becomes moving art on a CRT; the milky sheet-of-phosphors paper effect and SweetFX anti-aliasing really fool the mind as you watch the mountains in the distance grow with pixel-perfect precision on approach. Walking around in the game feels seamless, continuous, smooth, and creamy: no holes, no empty space, just pure continuous graphics drawn as if by a Japanese lithographer. Glowing Eridium and E-tech bullets shimmer and glow, an effect which seems to be lost on the LCD, and the pixel grid becomes very obvious on the LCD. But once again the offsetting saving grace of the QNIX, its sheer size, continues to make gaming immersive, as does its faster overall response time thanks to the smaller lithography of its internal pixel processors, which makes dynamic images more rapid in many interesting and intriguing ways. On first impression the QNIX seems twice the size of the FW900, but in reality it is probably closer to 1.5x the area, or possibly 1.33x, now that I look it up on a chart. The pitfalls of the QNIX are its geometric uniformity and LCD backlight, which introduce the most distracting artifacts into the image. The most distracting and destructive of all in Borderlands 2 is the horizontal aliasing caused by the vertical pixel alignment: because it is a geometric, printed screen without an electron beam to diffuse and soften the edges, there is a panel-level horizontal aliasing that cannot be removed by the graphics card, since each pixel is either one color or another and cannot be a blend of two at the same time, as on a CRT with beam diffusion. Tearing is far more noticeable and is especially bad in older games with smaller world scale and higher frame rates, once again a product of transistors and a digital screen. Another distracting feature of a translucent/transparent screen is the swizzle effect of pixel glimmer, somewhat below the threshold of conscious perception but noticeable at the center of the eye's focus, intermingling with an otherwise imperceptible grille-like structure. Over time this can be forgiven, as can the DPI reduction, because the extra size and visual real estate still allow faster and more consuming immersion, with eye real estate closer to that of IMAX and the superior 16:9 aspect ratio.
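
As a quick sanity check on the size comparison above, here is a back-of-the-envelope sketch. The FW900's roughly 22.5-inch viewable 16:10 diagonal is my assumption for illustration, not a figure given in the post.
Code:
import math

def screen_area(diagonal_in, aspect_w, aspect_h):
    # Width, height and area (inches / square inches) for a given diagonal and aspect ratio.
    diag_units = math.hypot(aspect_w, aspect_h)
    width = diagonal_in * aspect_w / diag_units
    height = diagonal_in * aspect_h / diag_units
    return width, height, width * height

qx_w, qx_h, qx_area = screen_area(27.0, 16, 9)     # QNIX QX2710: 27" 16:9
fw_w, fw_h, fw_area = screen_area(22.5, 16, 10)    # FW900: assumed ~22.5" viewable, 16:10

print(f"QNIX  : {qx_w:.1f} x {qx_h:.1f} in, {qx_area:.0f} sq in")
print(f"FW900 : {fw_w:.1f} x {fw_h:.1f} in, {fw_area:.0f} sq in")
print(f"Area ratio: {qx_area / fw_area:.2f}x")     # comes out around 1.35-1.4x
That lands at roughly 1.37x, which is in the same ballpark as the 1.33x-1.5x estimate above.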

Another fear I had was game playability and this whole topic of ghosting. Having received the panel, it is no longer a concern (all of these observations are at 60Hz). People in the LightBoost monitor clubs and postings make it seem as if only a LightBoost monitor is capable of pulling off a flick rail in Quake Live (not even Quake 3; once again my noob detector lights up), and all I have to say to that is to call bullcrap. I was pulling flick rails in Borderlands 2, CS:GO, Quake 3, and Quake 2, and the only bad thing about it is the screen tearing, which sadly is unavoidable on an LCD without some form of VSync. I could make out all the frames in between (nothing even close to or as bad as motion blur), with none of the horrible gray mush you would get on early LCDs. Sure, there was some ghosting in there, but let me tell you, even phosphors have a major ghosting effect. I have a few Sony CRTs and they all ghost, probably even more so than this QNIX, because that is a fact of phosphor-glow physics. Just because the QNIX ghost is a digital artifact of a transistor screen emulating the line-by-line rendering of a CRT does not mean it is very noticeable or a major factor in your PC gaming experience. Sure, when you record at 30 to 60 frames per second and play it back in slow motion you can see the difference, and just because humans can perceive differences in frame rate into the hundreds or thousands of frames, possibly hundreds of thousands, does not mean you can perceive individual frames flying by at 60 frames per second. I don't understand all this fuss about ghosting, as on this monitor I don't see any problem at all, and I have been gaming for 13 years on the God CRT and am currently ranked No. 1 in wins in Crysis 3. Don't let this be a factor in your purchase. I think the superior image quality of IPS/PLS color rendition is the deciding factor, and I am now confident that my gut made all the right decisions in choosing this monitor as I move forward in my hardware acquisitions of ever-increasing quality.

So, in summary (to be updated at higher refresh rates): the advantage of the QNIX is size and resolution, giving an overall higher perceived detail, especially in film, although the visibility of the pixel grid in games undercuts this detail effect (I'm also starting to think 27 inches at 16:9 is a more appropriate desktop screen size and configuration). The QNIX's LCD projection-film effect is also superior for video; size with comparably high-density resolution are its strong suits. Its geometric uniformity is also interesting and disorienting (in an interesting, new-detail kind of way, not disorienting as in nauseating or unwanted), especially at the edges of the screen, for those used to the CRT fishbowl effect. The Sony GDM-FW900 is still the overall DPI champ, though its smaller screen seems to make its scenes diminutive, and its creamy phosphors provide a superior desktop and paper effect; that same effect is inferior when it comes to film and cheapens Media Player Classic Home Cinema. The CRT's electron-beam diffusion also provides a smoother image while still delivering incredible detail, lessening the need for anti-aliasing and completely outclassing the digital horizontal aliasing of a screen with vertically aligned pixels, where a pixel cannot sit at the edge of a beam's diffusion with a gradient across it; it must be one color or another, creating aliasing at the edges of contrasting lines. The QNIX's much faster pixel chips completely outclass the internal rendering of the Sony, with its 13-year-older technology. No doubt the QNIX has superior internal processing, whether hardware or software, that adds cleaner focus, overall clarity and transparency, superior speed, and overall latency reduction (within the internal logic of the monitor). It is very interesting that each of these monitors is the opposite of the other at varying levels of abstraction, yet overall they come out comparable even with a 13-year gap in release dates. To finish the CRT vs. LCD debate I would need the impossible: a 27" 2560x1440 CRT with enhanced internals. But that will never happen, because Mitsubishi used to make the tubes and that tooling has been scrapped, plus it is far more expensive to ship a CRT tube. The GDM-FW900 feels like it weighs 100 pounds (complaining about the depth of a CRT monitor is for noobs; besides, I'm sure most of you have nerd headsets, and when do you ever look at the back of the monitor? Just go buy a Mac so you can look pretty in a movie). Also, focusing a beam across a 27-inch screen at that resolution is not easy and was quite a feat on the FW.

Sorry for the long post, but my eyes are getting screwed by the new God screen.
post #5285 of 25911
Quote:
Originally Posted by Barc0de View Post

OK, my PayPal account was used (not sure if it was hacked or if charges were just racked up on my CC), and nearly $3,000 has been charged, all to Korean names, about a week after my purchase. This is the only transaction I have made through PayPal in 3 months. Pretty damn sure my CC got around; I bought from !BEWARE! dream-seller !BEWARE!. I am just praying they haven't been through my bank account (can't check right now, online banking is down at the moment) and being thankful I used my CC rather than a direct withdrawal.

Hmm, that sucks. I am loving my purchase from dream-seller. The only things that sucked were that it took forever to make it to the States and that he didn't bubble-wrap the outside of the box, but no biggie: no holes in the box and no damage to the monitor. My Qnix had very small BLB, but I fixed that with the tape mod and now I hardly notice it. I put it on a VESA mount stand and got the monitor overclocked to 110Hz; anything over 110 and I get lines. No dead or stuck pixels. I'm pretty happy, and games look amazing with this monitor.
Gaming rig
(13 items)
CPU: i7 920 @ 3.8GHz | Motherboard: GIGABYTE GA-EX58-UD5 | Graphics: EVGA GTX 780 SC w/ACX | RAM: Corsair Dominator 12GB
Storage: Samsung 840 Pro 256GB SSD, Western Digital VelociRaptor 300GB | OS: Windows 8 Pro | Monitor: QNIX QX2710 LED Evolution II 27"
Keyboard: Corsair K70 | Power: Corsair HX1050W | Case: Antec 1200 | Mouse: Razer DeathAdder 3.5G
Audio: Creative Sound Blaster X-Fi Titanium HD
post #5286 of 25911
Quote:
Originally Posted by ColdFlo View Post

…

Awesome feedback on the CRT comparison... it was a bit long, but so are my posts, lol.
post #5287 of 25911
Finally got my Qnix (matte). Ordered Wednesday from ipsledmonitors.com and received it today (Monday). But it was shipped from South Korea, not California.



No dead pixels. Overclocked to 120Hz without any fuss, although the colors did seem sort of wrong/washed out when overclocked. Ended up using Windows color management and upping the gamma.
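
For context, "upping the gamma" through Windows color management effectively loads a per-channel lookup table into the GPU. A minimal illustrative sketch of what that curve does to pixel values; the gamma figure here is made up, not a measurement of this panel.
Code:
# Minimal sketch of the per-channel lookup table that Windows color management
# (or an ICC profile's VCGT) loads into the GPU. The gamma value is illustrative,
# not a measured figure for the QNIX.

def build_gamma_lut(gamma, size=256):
    # Map input levels 0..size-1 to output levels 0..255 through a power curve.
    return [round(255 * ((i / (size - 1)) ** (1.0 / gamma))) for i in range(size)]

lut = build_gamma_lut(1.15)   # with this formula, gamma > 1.0 lifts the midtones
print(lut[128])               # mid-gray 128 comes out around 140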

Minimal backlight bleed, although I guess at this point I'm used to it. There is a small smudge in the bottom right corner. Hard to notice unless you're looking for it, but slightly annoying when you do find it. Not sure how to take an accurate picture of it, and not sure what to do about it either. Would opening up the monitor do anything?

Also I'm using their bundled DVI cable.
post #5288 of 25911
Can someone please tell me if it's normal for the entire monitor to have a reddish/warm hue despite trying every ICC profile on this thread and on Google for the QNIX? Is this from a different batch of panels that I need to calibrate differently/manually?
post #5289 of 25911
My [Perfect Pixel] FREE EXPRESS X-STAR DP2710LED 27" 2560x1440 Samsung PLS "Matte" just arrived today and looks great. It was $308.00 plus $49.99 for a 3-year warranty. I am very pleased so far, but I haven't tried overclocking it, partly because I don't know how to. However, I was playing Mortal Kombat and it did not give me the option of 2560x1440 resolution. Can I fix that? Anyway, here's a pic.

[photo uploaded via ImageShack.us]
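
For the overclocking questions in the last few posts: the usual approach is to create a custom resolution at a higher refresh rate (in the GPU driver or a tool such as CRU) and see whether the panel accepts the resulting pixel clock over dual-link DVI. A rough sketch of the arithmetic; the blanking totals below are illustrative assumptions, not the exact timings of any specific profile.
Code:
# Rough pixel-clock arithmetic for a 2560x1440 custom resolution.
# The blanking totals are illustrative (CVT-reduced-blanking-ish) assumptions,
# not the exact timings any particular QNIX/X-Star profile uses.

H_TOTAL = 2720   # 2560 active + assumed horizontal blanking
V_TOTAL = 1481   # 1440 active + assumed vertical blanking

for refresh_hz in (60, 96, 110, 120):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    print(f"{refresh_hz:>3} Hz -> ~{pixel_clock_mhz:.0f} MHz")

# Dual-link DVI is only specified to 330 MHz, so the higher rates work (when they do)
# because the panel, cable and GPU tolerate running past spec, usually with the
# blanking trimmed even further than assumed here.
If a given rate produces lines or a blank screen (like the over-110Hz lines reported earlier), the resulting pixel clock is past what that particular panel, cable, and GPU combination will tolerate.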
post #5290 of 25911
Quote:
Originally Posted by ronquilent View Post

Can someone please tell me if it's normal for the entire monitor to have a reddish/warm hue despite trying every ICC profile on this thread and on Google for the QNIX? Is this from a different batch of panels that I need to calibrate differently/manually?

ICC profiles are mostly per-monitor, as each unit is slightly different. It is normal for the monitor to have a slight hue in one direction or another. The Qnix also has somewhat better blacks than a typical IPS, which can give the IPS panel a more creamy look and the Qnix a warmer look; that part may never completely go away. I also did not have much luck with the posted ICC profiles. So yes, in this case you will likely have to calibrate manually, either with Windows color management, the GPU driver controls, or a proper colorimeter. This is the best calibration site to go by for the most part, and the rest will be your personal preference: http://www.lagom.nl/lcd-test/

All monitors are different when it comes to color hue and correction; yours just didn't closely match anyone else's that posted an ICC profile. No reason to get unsettled.

EDIT: I recommend using the driver controls, as many games ignore ICC profiles anyway. It is not that hard to get really close. If you are really worried about color correction, go pro and get a Spyder calibrator; otherwise get as close as possible with driver calibration, that website, and your own preference.

EDIT2: Lastly, to change the ratio between the colors, adjust the brightness of the color that is overpowering the screen down by itself.
Edited by Spartan F8 - 8/5/13 at 5:25pm
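
For anyone trying the manual route above, the per-channel adjustment Spartan F8 describes amounts to scaling the dominant channel(s) down. A minimal illustrative sketch; the channel values below are made-up numbers, not measurements from any Qnix.
Code:
# Sketch of neutralizing a warm (red-heavy) white point by scaling the stronger
# channels down, as described above. The "measured" white is a made-up example,
# not data from any actual Qnix.

measured_white = {"r": 255, "g": 246, "b": 232}    # red strongest, blue weakest = warm cast

target = min(measured_white.values())              # never push a channel up, only down
gains = {ch: target / level for ch, level in measured_white.items()}
corrected = {ch: round(level * gains[ch]) for ch, level in measured_white.items()}

print({ch: round(g, 2) for ch, g in gains.items()})   # {'r': 0.91, 'g': 0.94, 'b': 1.0}
print(corrected)                                       # every channel lands at 232
In the driver controls this corresponds to pulling red (and a little green) down rather than pushing blue up, which avoids clipping at the top of the range.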