The new IBM 9-megapixel monitor works reasonably well, but it is a colour display and only shows 8 bits. It has two graphics cards, and the high resolution means that with some software you can hardly see the icons, text, etc.
It is not built for the medical market, and for the money I think a pair of displays designed for medical use that provide a 10-bit display are better value. Companies like Eizo and Barco have better solutions.
In response to Michael Sparks' question: after comparing several competitors we have chosen the Totoku ME203L greyscale TFT for our radiologists' workstations. These will operate in conjunction with a Dell 19-inch colour TFT.
Royal Bournemouth Hospital (Presently implementing a Storcomm PACS)
Does anyone have any good data on whether monochrome monitors/graphics cards set to display 10-bit greyscale (1024 shades of grey) are perceptibly any better than the same monitors set to display 8-bit (256-shade) greyscale? Conventional teaching was that the human eye cannot resolve more than 256 shades of grey (if they are evenly spaced). Is this not the case with modern high-brightness displays?
Also, does anyone have a view on whether blue-filter LCDs offer any real advantage over clear LCDs for greyscale reporting, or is it more down to personal preference? Can any of the manufacturers tell us which are more popular? Thanks.
[Antonio Antonini] The number of grey levels the human eye can perceive depends on several factors: the eye itself (personal visual acuity, age, and the health of the organ); the contrast ratio of the picture (maximum brightness, black level, ambient light, surface reflections); the colour of the picture (visual perception also depends on the frequency of the stimulus); and the grey-scale resolution of the picture (i.e. the number of TRUE grey levels visible on the screen AFTER linearization). Given all the above, when we talk about the display monitor, these are the factors that increase the chances of seeing a sufficient number of grey levels for a good diagnosis, in order of importance: number of original grey levels, contrast ratio, blue tint.
[Antonio Antonini] Even if it is true that 256 UNIQUE grey levels are believed to be sufficient for a good diagnosis, the real possibility of perceiving them all (or rather, of reading the difference between one level and the next, or the previous) depends on the viewer as well as on the environmental conditions. A display with limited capabilities might do the job in ideal ambient conditions and with a 'high-performing' viewer, but a high-quality display will extend the range of correct utilization to more viewers and to more diverse ambient conditions. firstname.lastname@example.org is my email address for Rhidian Bramley and for others interested in more details.
Thanks Antonio. I'm not sure I've explained the question clearly enough.
If you use the same monitor, same viewing conditions, brightness, contrast ratio etc - in fact everything the same EXCEPT changing the greyscale display settings from 10 bit (1024 unique grey levels) to 8 bit (256 unique grey levels) - can the human eye tell the difference?
Sorry if I did not understand your question well. Before answering I need to clarify that any picture displayed on a screen connected to a workstation based on a Windows operating system can show no more than 256 levels of grey. This is because Windows is an 8-bit OS and cannot go beyond this limit. If you use an 8-bit video board, what happens is that during the DICOM gamma-curve correction some of the grey levels of the original picture are lost. Using a 10-bit LUT (look-up table) you make sure you send 256 'unique' levels of grey to the screen; in other words, you do not lose greyscale information during the DICOM correction. If you want more details on this subject I can send you some literature.

Going back to your question, we can restate it this way: can I see the difference between a 10-bit and an 8-bit system? My answer is, unfortunately: it depends. The answer gets closer to 'yes' if you improve the viewing conditions for both systems (better viewer, higher contrast, lower ambient light/reflections, more bluish light) and closer to 'no' if, again on both systems, you go in the opposite direction (poor visual acuity in the viewer, lower contrast, higher ambient light, more reddish or greenish light). The JND (Just Noticeable Difference) concept was introduced to deal with this subject. In other words, if the basic performance (such as the contrast ratio or the viewer's acuity) and the environmental conditions (such as ambient light or reflections) are poor, you can do a great job with the other parameters (like the number of grey levels) but the viewer will never be able to perceive them. Antonio (email@example.com)
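The LUT point can be sketched numerically. This is only an illustration under assumptions: a plain gamma curve stands in for the real DICOM correction (the actual standard uses the GSDF, not a simple gamma), and gamma = 2.2 is an arbitrary choice. The sketch pushes 256 input grey levels through the correction, quantises the result to an 8-bit versus a 10-bit LUT, and counts how many distinct output levels survive.

```python
import numpy as np

# Illustrative sketch only: a simple gamma curve stands in for the
# DICOM display correction; gamma=2.2 is an assumed value.
def unique_levels_after_correction(lut_bits, gamma=2.2):
    top = 2 ** lut_bits - 1
    levels_in = np.arange(256)                       # 256 original grey levels
    corrected = np.round((levels_in / 255.0) ** (1.0 / gamma) * top)
    return np.unique(corrected).size                 # distinct levels that survive

print("8-bit LUT :", unique_levels_after_correction(8))   # fewer than 256
print("10-bit LUT:", unique_levels_after_correction(10))  # all 256 preserved
```

With an 8-bit LUT, adjacent input levels near the top of the curve round to the same output value, so grey levels are lost; the 10-bit LUT has enough headroom to keep all 256 distinct, which is the "unique levels" point above.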
I thought Windows after Win95 was a 16-bit system, or am I wrong?
posted on Thursday, November 18, 2004 - 11:09 am
Ummm... Well, yes and no. Windows OS was 16-bit from Windows 3.1. ME was structurally based on 16-bit DOS, although it is a native 32-bit OS. NT had a full 32-bit engine. W2000 was mainly a 32-bit OS using the NT code base, but 64-bit versions also came out for Intel's Itanium processor. XP is, well, 32-bit but.......
Are we confusing 8-bit/16-bit/32-bit applications (programs) with 8-bit/16-bit/32-bit Operating Systems?
I think what people are talking about is bit depth per pixel. Obviously 8 bits would only allow 256 shades per pixel. I am not aware of any modern display hardware which only allows 8 bits per pixel, and the OSs are all at least 16-bit, so I was confused as to what Antonio was referring to.
As Alan Carter said, is it not a question of confusing hardware, software, and operating systems? While modern operating systems and hardware are 16- or 32-bit, some application software is only 8-bit. This is particularly true of web viewers, where nearly all are restricted to 8 bits.
This is because some web-viewer developments come from a purely 'image display' point of view, whereas others come from a 'medical imaging' point of view.
One is a 'medical imaging' perspective on what image display means (requiring full bit depth for manipulation), and the other is a more IT-based approach (simply filling the graphics card's memory buffer with a static snapshot of the image that matches the frame-buffer display).
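The "full bit depth for manipulation" point can be illustrated with a simple window/level sketch. The function name and window values here are my own, purely illustrative choices: a 12-bit CT pixel array is mapped down to the 8 bits a display accepts, and a viewer that kept only the 8-bit snapshot could not re-window to recover detail outside the chosen window.

```python
import numpy as np

# Illustrative only: map 12-bit pixel values to an 8-bit display range
# through a window (centre/width values are arbitrary examples).
def apply_window(pixels_12bit, centre, width):
    lo = centre - width / 2.0
    scaled = np.clip((pixels_12bit - lo) / width, 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)     # 8-bit display values

ct = np.arange(4096, dtype=np.int16)           # full 12-bit value range
display = apply_window(ct, centre=2048, width=1024)
# Everything below the window is crushed to 0 and everything above to 255;
# re-windowing needs the original 12-bit data, not this 8-bit snapshot.
```

This is why a viewer built from the 'medical imaging' perspective keeps the full-depth pixel data and re-applies the window on demand, while a static-snapshot viewer has already thrown the extra bits away.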
posted on Thursday, November 18, 2004 - 04:16 pm
Do not confuse whether an operating system is 16-, 32-, or soon 64-bit with the way the graphics card is driven by Windows.
In its maximum colour-resolution mode Windows uses 24 bits per pixel. The '32-bit' mode just packs those 24 bits loosely, padding each pixel to a 4-byte boundary so the machine can run faster. The downside is that more memory is used.
The human eye can definitely detect more than 22 grey scales. Back in the days of MS-DOS and ancient VGA cards I had to re-program the palette registers to get 256 grey scales. I wrote some test programs to set the screen to 64, 128 and 256 grey scales with CT brain images. You could definitely see the difference between 64 and 128; the difference between 128 and 256 was not so obvious.
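That experiment can be sketched in a few lines (a synthetic grey wedge stands in here for the CT brain images, and the `quantise` helper is my own illustration, not the original test program): quantise an 8-bit ramp down to 64 and 128 levels and confirm how many distinct shades each setting leaves on screen.

```python
import numpy as np

# Synthetic stand-in for the CT test images described above.
def quantise(img, levels):
    step = 256 // levels
    return (img // step) * step     # collapse the ramp to `levels` distinct values

wedge = np.tile(np.arange(256, dtype=np.uint8), (64, 1))   # 64-row grey ramp
for n in (64, 128, 256):
    print(n, "levels ->", np.unique(quantise(wedge, n)).size, "unique shades")
```

Displaying the three quantised wedges side by side reproduces the comparison described: banding is obvious at 64 levels, subtle at 128, and hard to spot at 256.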
I think the grey scales on old TV test cards were more a product of the limitations of the cameras and the screens.