UK Imaging Informatics Group
Monitors - again
Tom Naunton Morga posted on Wednesday, February 04, 2004 - 10:40 pm
John, I wholeheartedly agree with your comments. In another vein, I only found out today that Xograph is the supplier for Barco monitors in the UK. We were given ours and they are very good!
Nick Collett posted on Friday, February 06, 2004 - 08:58 am
The new IBM 9-megapixel monitor works reasonably well, but it is a colour display and only displays 8 bits. It has two graphics cards, but the high resolution means that with some software you can hardly see the icons and text.

It is not built for the medical market, and for the money I think a pair of displays designed for medical use that provide a ten-bit display are better value. Companies like Eizo and Barco have a better solution.
Neil Kinsman posted on Wednesday, March 17, 2004 - 06:22 pm
What is the latest thinking on workstation monitors: TFT or CRT? Are there any recent articles comparing the two?
Peter Adams posted on Thursday, March 18, 2004 - 05:31 pm
I do not know if this will help but there is some useful information on the PACSnet website.
Jonathan Teece posted on Thursday, April 15, 2004 - 01:25 pm
In response to Michael Sparks' question: after comparing several competitors we have chosen the Totoku ME203L greyscale TFT for our radiologists' workstations. These will operate in conjunction with a Dell 19-inch colour TFT.

Royal Bournemouth Hospital (Presently implementing a Storcomm PACS)

We are expecting delivery shortly.
Rhidian Bramley posted on Thursday, November 11, 2004 - 10:41 pm
Does anyone have any good data on whether monochrome monitors/graphics cards set to display at 10-bit greyscale (1024 shades of grey) are perceptibly any better than when the same monitors are set to display at 8-bit (256-shade) greyscale? Conventional teaching was that the human eye cannot resolve more than 256 shades of grey (if they are evenly spaced). Is this not the case with modern high-brightness displays?

Also, does anyone have a view on whether blue-filter LCDs offer any real advantage over clear LCDs for greyscale reporting, or is it more down to personal preference? Can any of the manufacturers tell us which are more popular? Thanks.
Antonio Antonini posted on Friday, November 12, 2004 - 04:43 pm
The number of grey levels the human eye can perceive depends on a number of factors: the eye itself (personal visual acuity, age, and the health of the organ); the contrast ratio of the picture (maximum brightness, black level, ambient light, and surface reflections); the colour of the picture (visual perception also depends on the frequency of the stimulation); and the greyscale resolution of the picture (i.e. the number of TRUE grey levels visible on the screen AFTER linearisation). Given all the above, when we talk about the display monitor, these are the factors that increase the chance of seeing a sufficient number of grey levels for a good diagnosis, in order of importance: number of original grey levels, contrast ratio, blue tint.

If it is true that 256 UNIQUE grey levels are believed to be sufficient for a good diagnosis, the real possibility of perceiving them all (better: of reading the difference between one level and the next) depends on the viewer as well as on the environmental conditions. A display with limited capabilities might do the job in ideal ambient conditions and with a 'high-performing' viewer, but a high-quality display will extend the range of correct utilisation to more viewers and to diversified ambient conditions. Rhidian Bramley and others interested in more details are welcome to contact me by email.
Rhidian Bramley posted on Friday, November 12, 2004 - 10:22 pm
Thanks Antonio. I'm not sure I've explained the question clearly enough.

If you use the same monitor, same viewing conditions, brightness, contrast ratio etc - in fact everything the same EXCEPT changing the greyscale display settings from 10 bit (1024 unique grey levels) to 8 bit (256 unique grey levels) - can the human eye tell the difference?
Antonio Antonini posted on Thursday, November 18, 2004 - 08:51 am
Sorry if I did not understand your question well. Before answering I need to clarify that any picture displayed on a screen connected to a workstation based on a Windows operating system will be able to display no more than 256 levels of grey. This is because Windows is an 8-bit OS and cannot go beyond this limit. If you use an 8-bit video board, what happens is that during the DICOM gamma-curve correction some of the grey levels of the original picture are lost. Using a 10-bit LUT (look-up table) you make sure you send 256 'unique' levels of grey to the screen; in other words, you do not lose greyscale information during the DICOM correction. If you want more details on this subject I can send you some literature.

Going back to your question, we can restate it this way: can I see the difference between a 10-bit and an 8-bit system? My answer is, unfortunately: it depends. The answer gets closer to 'yes' if you improve the viewing conditions for both systems (a better viewer, higher contrast, lower ambient light/reflections, more bluish light) and closer to 'no' if, again on both systems, you go in the opposite direction (poorer visual acuity in the viewer, lower contrast, higher ambient light, more reddish or greenish light). The JND (Just Noticeable Difference) concept has been introduced to deal with this subject. In other words, if the basic performance (such as the contrast ratio or the viewer's acuity) and the environmental conditions (such as ambient light or reflections) are poor, you can do a great job with the other parameters (such as the number of grey levels) but the viewer will never be able to perceive them. Antonio
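The grey-level loss Antonio describes can be sketched numerically. This is only an illustration: the gamma exponent 0.6 below is an arbitrary stand-in for the real DICOM GSDF correction curve, not the actual standard. The point is that rounding a corrected 256-level input into an 8-bit LUT output collapses some levels together, while a 10-bit output keeps all 256 distinct.

```python
def correct(inputs, out_bits):
    """Apply a gamma-style correction (a stand-in for the true DICOM
    GSDF curve) and quantise to the LUT's output bit depth."""
    out_max = 2 ** out_bits - 1
    return [round((i / 255) ** 0.6 * out_max) for i in inputs]

inputs = range(256)          # 256 grey levels in the source image
out8 = correct(inputs, 8)    # LUT with 8-bit output
out10 = correct(inputs, 10)  # LUT with 10-bit output

print(len(set(out8)))        # fewer than 256: some levels collapse together
print(len(set(out10)))       # 256: every input level stays distinct
```

With any non-linear correction the steep part of the curve spaces outputs more than one step apart (skipping codes) while the shallow part spaces them less than one step apart (merging codes); the extra two bits of output precision are what prevent the merging.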
Phil McAndrew posted on Thursday, November 18, 2004 - 10:10 am
I thought Windows after Win95 was a 16-bit system - or am I wrong?
Alan Carter posted on Thursday, November 18, 2004 - 11:09 am
Ummm... well, yes and no. The Windows OS was 16-bit from Windows 3.1. ME was structurally based on 16-bit DOS, although it is a native 32-bit OS. NT had a full 32-bit engine. W2000 was mainly a 32-bit OS using the NT code base, but 64-bit versions also came out for Intel's Itanium processor. XP is, well, 32-bit but...


Are we confusing 8-bit/16-bit/32-bit applications (programs) with 8-bit/16-bit/32-bit operating systems?
Phil McAndrew posted on Thursday, November 18, 2004 - 12:03 pm
I think what people are talking about is bit depth per pixel. Obviously 8 bits would only allow 256 shades per pixel. I am not aware of any modern display hardware which only allows 8 bits per pixel, and the OSs are all at least 16-bit, so I was confused as to what Antonio was referring to.
Elizabeth C Beckmann posted on Thursday, November 18, 2004 - 01:48 pm
As Alan Carter said, is it not a question of confusing hardware, software, and operating systems? While modern operating systems and hardware are 16- or 32-bit, some application software is only 8-bit. This is particularly true of web viewers, where nearly all are restricted to 8 bits.
Phil McAndrew posted on Thursday, November 18, 2004 - 01:55 pm
Wonder why that is?
Robin Breslin posted on Thursday, November 18, 2004 - 02:10 pm
This is because some web viewer developments come from a purely 'image display' point of view, whereas others come from a 'medical imaging' point of view.

One is a 'medical imaging' perspective on what image display means (requiring full bit depth for manipulation); the other is a more IT-based approach (simply filling the graphics card's memory buffer with a static snapshot of the image that matches the frame buffer display).
Martin Hoare posted on Thursday, November 18, 2004 - 04:16 pm
Do not confuse whether an operating system is 16-, 32- or soon 64-bit with the way the graphics card is driven by Windows.

In maximum colour resolution mode Windows uses 24 bits. The 32-bit mode just packs the same 24 bits more loosely to allow the machine to run faster; the downside is that more memory is used.

See for a fuller explanation.

The 24 bits are split into three groups of eight: one each for red, green and blue. Each grey value is represented by equal values of R, G and B, hence only 256 greyscales are available in Windows.
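Martin's point is easy to demonstrate: 24-bit "true colour" offers over 16 million colours, but the grey axis through that colour cube (R = G = B) contains only 256 entries. A small sketch:

```python
# A 24-bit "true colour" pixel packs 8 bits each of red, green and blue.
def pack_rgb(r, g, b):
    return (r << 16) | (g << 8) | b

total_colours = 2 ** 24                          # 16,777,216 possible pixel values
greys = [pack_rgb(v, v, v) for v in range(256)]  # grey requires R == G == B

print(total_colours)    # 16777216
print(len(set(greys)))  # only 256 distinct greys on a standard Windows desktop
```

This is why a colour display driven through the standard 24-bit pipeline cannot show more than 256 distinct grey levels, regardless of how many bits the operating system itself uses.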
Richard Wellings posted on Thursday, November 18, 2004 - 05:05 pm
I thought the human eye could only detect 22 greyscale levels as being different - hence the old greyscale on test cards, shown when they didn't put on daytime TV. Well, that was what was taught when I did Part One.
Martin Hoare posted on Friday, November 19, 2004 - 09:15 am
The human eye can definitely detect more than 22 grey scales. Back in the days of MS-DOS and ancient VGA cards I had to reprogram the palette registers to get 256 grey scales. I wrote some test programs to set the screen at 64, 128 and 256 grey scales with CT brains. You could definitely see the difference between 64 and 128; the difference between 128 and 256 was not so obvious.
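Martin's experiment can be sketched in a few lines (this is an illustration of the quantisation, not his original program): quantise a full 0-255 grey ramp down to 64, 128 and 256 levels and look at the step size between adjacent shades, which is what makes the banding visible or invisible on screen.

```python
def quantise(value, levels):
    """Snap an 8-bit grey value onto a coarser scale with `levels` shades."""
    step = 256 // levels
    return (value // step) * step

ramp = range(256)  # a full smooth grey ramp, like a CT window spread over the screen
for levels in (64, 128, 256):
    shades = sorted({quantise(v, levels) for v in ramp})
    step = shades[1] - shades[0]
    print(levels, len(shades), step)  # e.g. 64 levels -> steps of 4 grey values
```

At 64 levels each band is 4 grey values wide, at 128 it is 2, and at 256 the ramp is untouched; the visible banding shrinks accordingly, matching Martin's observation that 64 vs 128 is obvious while 128 vs 256 is not.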

I think that the grey scales on old TV test cards were more a product of the limitations of the cameras and the screens.

Looking at the test cards, they seem remarkably crude by modern standards.