(Ugh, meant to send this to the whole list, but only sent it to Chris
instead. Sorry.)
On 4/8/2010 12:01 PM, Chris Wolf wrote:
> Not to divert the subject, but...I'm curious why "lcd monitors really just
> don't cut it" for their experiments. The three main limitations I can think
> of for LCD vs. CRT are that their color reproduction is often not as good,
> their dynamic range is not as wide, and their viewing angle is more
> restricted.
>
It's been a while since I've looked carefully at LCD monitors myself,
but in the past, one thing we did do was pull out an oscilloscope and
use a photodiode to take some measurements. I'm going from memory here,
but as I recall, what we found was that while CRT monitors do a fairly
good job of going from black to grey, LCD monitors tended to have a
longer ramp-up before the color got to where it was supposed to be, and
ultimately that just proved to be too slow. I haven't had to worry
about it much lately, since I no longer work in the lab where I had to
deal with it personally, so I've fallen out of touch, but from what's
been passed on to me by those who do still care, supposedly this is
still something of an issue (plus some of the color reproduction issues
too).
David McFarlane might be able to chime in on this too; he was involved
in those tests back when we did them, and he's a far more meticulous
note taker than I am.
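If anyone wants to repeat that kind of measurement, the analysis itself
is simple; here's a rough sketch in Python (numpy assumed), with the
file name and column layout standing in for whatever your capture setup
produces:

  import numpy as np

  # Photodiode trace captured during a black-to-grey step; two columns,
  # time (seconds) and voltage. The file name is just a placeholder.
  t, v = np.loadtxt("photodiode_trace.csv", delimiter=",", unpack=True)

  # Normalize so 0 = steady black level and 1 = steady final level.
  v = (v - v[:100].mean()) / (v[-100:].mean() - v[:100].mean())

  t10 = t[np.argmax(v >= 0.1)]   # first sample crossing 10% of the final level
  t90 = t[np.argmax(v >= 0.9)]   # first sample crossing 90%
  print("10-90%% rise time: %.2f ms" % ((t90 - t10) * 1e3))

The same trick works for grey-to-grey steps as well.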
Now, for those who want more of a breakdown from someone who really
would know, I'm pasting a post to the visionscience mailing list that
was forwarded to me on the current state of things.
> Hi Deborah, Visionlist,
>
> I have been actively researching future displays for the last three
> years, in part to get to the future of high dynamic range imaging (I
> work in a lab that studies Brightness; a high-frequency-capable HDR
> display is one of our white whales).
>
> As you know, most LCDs are unsuitable for time-critical work. This is
> unlikely to change, as most technologies that accelerate the temporal
> response are aimed at gamers and perform in unpredictable ways, or use
> algorithms/techniques that are trade secrets and a PITA to reverse
> engineer. Additionally, most LCD panels (including, at this time,
> every Mac laptop panel) are Twisted Nematic (TN) panels, which are
> only capable of 6-bit color or, worse, do some kind of temporal
> dithering/screwing around to give the appearance of 8-bit color.
> In-Plane-Switching (IPS) panels have usable color, and there are even
> ten-bit options (12 bits often claimed in medical grayscale LCDs), but
> the temporal performance of IPS panels is worse than TN. Both exhibit
> slight differences in the time to move from one "gray" to another
> "gray" versus moving from white to black. Recently, the CFL backlight
> tubes of LCDs are being replaced with LED arrays. These LED arrays
> illuminate the edge of a laser-engraved piece of acrylic which is
> supposed to make a homogeneous white field for the LCD to filter. LEDs
> do allow for more careful wavelength selection, to better match the
> bandpass characteristics of the front panel, and sometimes can extend
> the gamut beyond sRGB or NTSC, but they do nothing for the temporal
> properties of the front panel. There have been prototypes of LCD
> displays with AMOLED panels behind them for illumination. This has the
> advantage of being extremely high contrast, but the crossed-polarizers
> inherent to LCD technologies make the transmission rate extremely low;
> the luminance from these AMOLEDs would be reduced by almost 90%.
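>
> To make the temporal-dithering point above concrete: a 6-bit panel can
> fake an 8-bit level by alternating between the two nearest 6-bit levels
> across frames. As a toy illustration in Python (the averaging scheme
> here is made up; real panels use proprietary spatial/temporal patterns):
>
>   import numpy as np
>
>   def frc_frames(level_8bit, n_frames=16):
>       target = level_8bit / 4.0                    # ideal (fractional) 6-bit value
>       lo = int(np.floor(target))
>       n_hi = int(round((target - lo) * n_frames))  # frames shown one step brighter
>       frames = np.array([lo + 1] * n_hi + [lo] * (n_frames - n_hi))
>       np.random.shuffle(frames)                    # spread the brighter frames out
>       return frames
>
>   seq = frc_frames(130)
>   print(seq, 4 * seq.mean())   # the temporal average comes back out near 130
>
> This is exactly the kind of thing that wreaks havoc with brief or
> precisely timed stimuli.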
>
> To evaluate the temporal characteristics of monitors generally, I have
> used instruments ranging from homemade junktronics like a photodiode
> connected to a sound card, to a high-speed video camera. The high
> speed video camera provides very good diagnostic information when
> evaluating monitors. With the 1200 Hz sampling available on some Casio
> cameras, it is possible to see the backlight pulsing on and off, or to
> see the scanlines on a CRT. There are some issues with the "binning"
> that the camera is doing to obtain such high sampling rates, but it is
> good enough to see major problems with most monitors with nothing more
> than visual observation.
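>
> For what it's worth, the sound-card flavor of that measurement needs
> almost no analysis code. A rough sketch (numpy and scipy assumed, with
> a photodiode wired into the line-in jack and a few seconds recorded
> with any audio recorder; the file name is a placeholder):
>
>   import numpy as np
>   from scipy.io import wavfile
>
>   fs, x = wavfile.read("photodiode_linein.wav")   # assumes a mono recording
>   x = x.astype(float)
>   x -= x.mean()
>
>   spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
>   freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
>   peak = freqs[np.argmax(spec[1:]) + 1]           # skip the DC bin
>   print("dominant flicker component: %.1f Hz" % peak)
>
> Because line-in is AC-coupled, this only shows flicker and modulation,
> not absolute luminance, but it is plenty to spot a pulsing backlight.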
>
> As for extant technology:
>
> You can throw DLP projectors out for nearly any speed-critical task,
> as they do all kinds of weird processing. The problem here is that
> they have a monochrome MEMS imaging element, and they have to pass it
> through a color filter wheel to produce color temporally. This wheel
> also has a clear element, which is used to jack the brightness in some
> areas of the display, but it doesn't necessarily refresh the image
> over the entire imaging area every time. In our Dell 5100mp it is
> updated piecemeal. If you must use DLP, you should use a high-speed
> camera or other sensor to check that your stimuli are being presented
> as you think they are.
>
> LCD projectors are better, but not much better. Hit or miss.
>
> I've recently had some success with the 3-chip LCoS projector (a
> hybrid between LCD and reflective technologies) that came with our
> eLumens Visionstation. Although it is low-contrast and the luminance
> sucks (60 cd/m2 max), so far the temporal properties on gratings look
> good at lower frequencies like 30 Hz. No artifacts or tearing. LCoS
> might be a good solution in the near term for slower stuff. Our
> projector is a JVC SX21. It may be that the new projectors are much
> better, but I would check to make sure that their claimed "15000/1"
> contrast ratios are not the result of using a variable aperture. Any
> contrast claim over a few hundred to one is probably some nasty trick
> like that. Ensure that you can disable it, in software or by yanking
> the cable internally.
>
> Getting to the point, the future of displays generally will likely be
> OLED, LCD (it's so cheap and established that it is not likely to go
> away), or TMOS.
>
> OLED does not (yet) offer much hope, in my opinion. OLED panels like
> in the small Sony display are typically driven in patches. By this I
> mean that the whole display is subdivided into smaller grids which are
> driven by individual processors, much like in commercial LED signage.
> The timing between these sections is not guaranteed, or may be
> sequential, left to right, top to bottom. If timing is not important
> to you, the contrast of an OLED display is rather good. This is
> because you can actually turn off the little OLED at whatever
> location, and it's hard to get blacker than that. The brightness
> problem with these displays is a thermal issue. To get the LEDs as
> bright as they can go, you must dissipate tons of heat from a very,
> very small area. If you do not dissipate the heat, there is a thermal
> runaway condition where the OLEDs will destroy themselves. As a
> result, I do not see small-pixel OLED displays getting to be much
> brighter than a few hundred cd/m2 until there are a few more
> breakthroughs in OLED efficiency, or active cooling. This held with
> the small Sony panel that I saw in person. It was clearly not very
> bright compared to the TN LCD panels around it, though admittedly the
> colors and contrast looked very usable. Additionally, the Sony XEL-1
> was basically a little Linux computer (running BusyBox, I think) which
> is just another layer of crap to hack to get your stimuli onscreen
> (though admittedly, Linux is vastly more hackable than Windows or
> Mac).
>
> Now the other technology that I mentioned is called TMOS. It was just
> recently announced; it is a brand-new type of display that relies on
> MEMS technology, like DLP. Personally, I am very excited about this
> technology. The first thing that it has going for it is that it needs
> no new fabrication facilities. It can be made in ordinary LCD
> fabrication plants. That means the time-to-market should be short
> relative to OLED, which is still not cheaply or widely available. TMOS
> means "Time Multiplexed Optical Shutter". Basically, it is a display
> scale DLP device. Each pixel is a little mirror, capable of 2
> microsecond on/off times. They are situated above a backlight/FTIR
> light pipe which is being lit with LEDs that are modulated extremely
> quickly. Color is generated by flashing the mirrored element on and
> off over this blinking backlight as it transitions from R to G to B.
> Early claims from engineering/marketing people are 300 Hz refresh
> rates. If they meet 20% of that, we won't be doing too badly. And for
> those of us who study vision without color, that backlight can be
> composed of only white LEDs, allowing for very, very good temporal
> resolution. In addition, the time-critical nature of this display
> (meaning that the backlight must be refreshed exactly with the
> mirrored pixels, unlike LCD or LCoS, but like DLP) should presumably
> mean that timing is taken seriously with respect to input as well,
> though, since I have seen/analyzed no prototypes, this is just wishful
> thinking/speculation.
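>
> As a back-of-envelope sanity check on those numbers (my arithmetic, and
> it assumes grayscale is built by pulse-width modulation within each
> color field, the way DLP does it):
>
>   import math
>   frame_rate = 300                           # Hz, claimed refresh rate
>   fields = 3                                 # field-sequential R, G, B
>   toggle = 2e-6                              # s, claimed mirror on/off time
>   field_dur = 1.0 / (frame_rate * fields)    # about 1.1 ms per color field
>   slots = field_dur / toggle                 # about 555 shutter slots per field
>   print(field_dur * 1e3, slots, math.log2(slots))   # roughly 9 bits of PWM grayscale
>
> So even the full claim leaves room for around 9 bits per primary at
> 300 Hz, and proportionally more if you trade away color or refresh rate.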
>
> I think LCoS may be a good interim solution (especially because JVC is
> trying to work with the high-end market, see ), and TMOS may be the
> best future solution. Perhaps the vision community could get in touch
> with UniPixel or Samsung (the TMOS people) and play with
> prototypes/help guide development. It seems that all of us could use a
> standard display with good luminance, 200:1 contrast, and fast temporal
> response (reliable 60 Hz, 8 bits per primary), but furthermore, we could
> all use purpose-built displays. Because the TMOS technology is simply
> on-off at its core, there is no reason not to support, for example,
> more than three primaries, infrared plus RGB, or two whole different
> color sets defined by two different sets of primaries. (A photopic and
> scotopic display in one!). People interested in color could select
> their primaries of interest, and people interested in time could
> select fewer primaries to optimize temporal properties. Furthermore,
> since TMOS is completely digital, maybe we can get rid of all those
> nasty analog processors and drive the things ourselves, directly over
> DVI, or some other digital interface. Removing the analog-digital
> conversion step (with all the associated hardware voodoo/signal
> processing) would be a boon to vision researchers everywhere.
>
> In my mind, this is a technology that has the potential to be a magic
> bullet for vision research. They're talking about releases in Q1 2010.
> If you are at all interested, I hope you'll consider making the
> desires of the vision community known to them so we don't lose another
> interesting display down the "cheaper faster crappier" consumer-tech
> plug hole.
>
> Regards,
> Daniel Reetz
>
> PS. Their approach to color-breakup problems is interesting:
> http://www.wipo.int/pctdb/en/wo.jsp?wo=2007016511