Friday, January 20, 2006

1920x1200? not so fast

Driving a flat panel display at 1920 x 1200 is far more difficult than you might imagine. Sure, it's easy if you only have one, but what if you have three? Well, unless you're building a brand-new custom system with dual PCIe x16 slots and a pair of dual-DVI video cards, life can get a little hairy.

You see, I used to operate with just four displays. One was a 23 inch Apple Cinema Display and three were Sony Trinitron 500PS CRT monitors. This gave me a desktop resolution of 6720 x 1200. If you examine the image below, you can see how this colossal resolution compares to several traditional monitor resolutions.

[Image: Desktop Resolutions]



As a professional software engineer, I've come to not only value but profit from having so much space at my beck and call. All my work is laid out clearly and cleanly before me. I cover my screens with xterms, debug output, monitoring applications, reference material, and development tools. This has boosted my productivity tremendously. Gone are the days of 'virtual desktops' and panning about in a tiny viewport. Now all windows are available at all times with a quick glance and, in extreme cases, a turn of the head.

I have tried working on mere two-head systems, and while they are quite effective, they do not compare to having an odd number of displays. You see, with two displays you have a left and a right display and must always work turned slightly to one side or the other. In my opinion, one should have at least three monitors. One becomes the primary focus of your work, while the others are secondary. In my case, I use the central display as my main workspace and the right-hand one as my test space. The left contains reference material and the far right contains email and messaging apps.

This all worked because one could acquire quite high-quality NVIDIA-based video boards that fit into PCI slots and had plenty of video RAM for high-resolution use. Since my CRTs were limited to 1600x1200, this proved sufficient, and I was able to get by with a set of PCI MX400s by eVGA, a generally excellent video card maker.

I recently acquired two new displays, namely another pair of 23 inch Apple Cinema Displays, and was intent upon replacing the two CRTs that flanked my primary display with them. I thought this would be quite straightforward. I was wrong.

Now, I knew that my new FPDs were DVI based and not VGA, however I felt that surely NVIDIA made a chip and someone made a card that would connect things. It took a significant amount of searching for the right card, but it turns out that the eVGA FX-5200 has not only a PCI interface but also a DVI connector. I thought I was home free. I was even willing to wait the two weeks it would take for mail order. You see, few places stock PCI-DVI video boards.

Two weeks later, I finally have the boards and set about replacing my massive 75 lb CRTs with new svelte aluminum and glass panels. Much heaving and sorting later, I have my new desktop configured. Now all I need to do is power up and tweak some resolution settings. Not so.

If your display has a resolution of 1920x1200 and a refresh rate of 60 Hz, then your video board needs to push 1920x1200x60 pixels each second down the DVI cable. This works out to 138,240,000 pixels per second. Additionally, video signals contain small gaps, called blanking intervals, that tell the monitor when the end of a row has been reached and when the last line has been drawn so drawing should resume at the top. These gaps eat into the time available for sending pixel data, so the actual rate of the signal going down the DVI cable will be somewhere north of 138 MHz. That rate is set by the video board's pixel clock, which is rated in MHz.
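To put a number on that overhead, here's a quick back-of-the-envelope sketch in Python. The blanking figures are the VESA CVT reduced-blanking timings for this mode, which are an assumption on my part, since Apple doesn't publish the Cinema Display's exact timings:

    # Estimate the pixel clock needed for 1920x1200 @ 60 Hz.
    # Blanking values are VESA CVT reduced-blanking timings (assumed).
    h_active, v_active, refresh = 1920, 1200, 60
    h_blank, v_blank = 160, 35              # horizontal/vertical blanking
    h_total = h_active + h_blank            # 2080 clocks per scanline
    v_total = v_active + v_blank            # 1235 lines per frame
    pixel_clock = h_total * v_total * refresh
    print(pixel_clock / 1e6)                # ~154.1 MHz, well north of 138.24

Even with that stingiest of standard blanking schemes, the clock lands around 154 MHz; traditional CRT-style full blanking pushes it past 190 MHz, which single-link DVI can't carry at all.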

Well, it turns out that the manufacturer of my video board decided that their quality NVIDIA video chips should be matched up with a crappy signal generator. The FX-5200 has a maximum pixel clock of 135 MHz. I guess the fact that the DVI spec has a maximum (single link) pixel clock of 165 MHz was not worth trying to match, not even on a board with 128 MB of RAM that is capable of extremely high resolutions. Normally this wouldn't be a problem, as I would be happy with a refresh rate slightly slower than 60 Hz; however, another manufacturer decided to cut a corner or two.
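How slow would I have had to go? Continuing the sketch above (same assumed reduced-blanking totals), a 135 MHz pixel clock tops out just over 52 Hz at this resolution:

    # What refresh rate fits in a 135 MHz pixel clock at 1920x1200?
    # Uses the CVT reduced-blanking totals from the sketch above (assumed).
    max_clock = 135e6                       # FX-5200's pixel clock ceiling
    h_total, v_total = 2080, 1235
    print(max_clock / (h_total * v_total))  # ~52.6 Hz

On an LCD, which holds its image steady between refreshes, 52 Hz would have been perfectly livable.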

It turns out that the 23 inch aluminum Apple Cinema Display has brought us back to the wonderful world of fixed-frequency monitors. It only supports a refresh rate of 60 Hz. No more, no less. This is something I only discovered after I acquired a DVI/PCI board that plugged into it and then refused to drive it. And this is the real moral of the story: Apple refuses to publish specifications on their equipment, a wise approach for a company whose motto seems to be "if it doesn't work automatically, you shouldn't be doing it anyway"; however, it's a little rough when you're obligated to pay a 10% restocking fee upon discovering that your monitor doesn't work with your equipment.

Therefore, if you want to drive an Apple Cinema Display at 1920x1200 (or any resolution, really), you MUST have a pixel clock close to the 165 MHz single-link limit. Without it, you will not be able to use the display at ANY resolution. Of course, the wonders of modern computer retailing mean that if you've opened it, you can't return it.
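Boiled down to a single check (again a sketch: the 154 MHz figure is the reduced-blanking clock computed earlier, and the fixed 60 Hz requirement belongs to the display, not the DVI spec):

    # Can a board drive the 23" Cinema Display? (sketch, values from above)
    REQUIRED = 154.0             # MHz for 1920x1200 @ 60 Hz, reduced blanking
    DVI_SINGLE_LINK = 165.0      # MHz ceiling of a single DVI link

    def can_drive(board_max_clock):
        # The display is fixed at 60 Hz, so there is no slower fallback;
        # the mode must fit under both the board's and the link's limits.
        return REQUIRED <= min(board_max_clock, DVI_SINGLE_LINK)

    print(can_drive(135.0))      # the FX-5200's clock -> False
    print(can_drive(165.0))      # a board built to the DVI spec -> True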

Where things stand now is that I'm waiting on another video board to arrive: a Matrox P650 low-profile PCI with only 64 MB of RAM. The draw is that it has dual DVI outputs, each documented as capable of driving the full 1920x1200 resolution. The lack of video RAM isn't really a problem for me, as I don't do 3D work. The upshot is that if I were insane, I could drive seven of these monitors by replacing all of my PCI video boards. That's something I'm not likely to do anytime soon, especially since they cost over 200 dollars each.

Here's hoping I'll be up and running soon.

1 comment:

  1. Thanks for that clear description of the issues with DVI and clock frequency. I've been in (almost) exactly the same position.

    I've just bought a new Dell 2407 Ultrasharp, which has a native resolution of 1920x1200@60Hz. I thought that since my Nvidia 5200 card said it did that resolution (on the box), I wouldn't have an issue. But I didn't know that they'd cut corners on the DVI interface, so I'm forced to run in analog mode.

    I tried replacing it with a GeForce 5600 Ultra, but I've got exactly the same problem.

    I can't get DVI to display any resolution higher than a 1600x1200 viewport. I thought I was onto a winner with the reduced blanking option, but that isn't enabled when I'm running in DVI mode... only when in analog.

    So I'm stuck with Analog for now. (Perhaps I'll try my 9800 Pro from home as well)