The DisplayPort standard is probably the least appreciated video interface standard in existence. It supports higher resolutions at 30 bpp color depth (i.e., 30-bit true color) than HDMI or even dual-link DVI, and future revisions of the interface will carry USB signals, which essentially means that a monitor with a USB hub will need only a single connection to the PC plus a power source. It also allows multiple monitors to be connected to a single port via a hub while appearing to the PC as independent displays.
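To put rough numbers on the resolution claim: here's a back-of-the-envelope bandwidth check. The link rates and the ~15% blanking allowance are my own ballpark figures (DisplayPort 1.1's nominal payload rate and dual-link DVI's TMDS capacity), not something from the original post, so treat this as a sketch:

```python
# Back-of-the-envelope check: does 2560x1600 @ 60 Hz with 30-bit color
# fit in each link? Assumed (approximate) payload rates:
#   DisplayPort 1.1: 4 lanes x 2.7 Gbit/s, after 8b/10b coding ~= 8.64 Gbit/s
#   Dual-link DVI:   2 links x 165 MHz x 24 bits ~= 7.92 Gbit/s

def video_rate_gbps(width, height, refresh_hz, bpp, blanking=1.15):
    """Approximate required link rate in Gbit/s, with ~15% blanking overhead."""
    return width * height * refresh_hz * bpp * blanking / 1e9

DP_1_1_PAYLOAD = 8.64   # Gbit/s (assumed)
DUAL_LINK_DVI  = 7.92   # Gbit/s (assumed)

needed = video_rate_gbps(2560, 1600, 60, 30)

print(f"needed: {needed:.2f} Gbit/s")                       # ~8.48 Gbit/s
print(f"fits DisplayPort 1.1: {needed <= DP_1_1_PAYLOAD}")  # True
print(f"fits dual-link DVI:   {needed <= DUAL_LINK_DVI}")   # False
```

The raw active-area pixel rate alone (~7.37 Gbit/s) would squeak under dual-link DVI's cap, but once blanking is accounted for it doesn't fit, while DisplayPort 1.1 handles it comfortably.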
It's also very low-profile:
So far, the only companies taking this standard seriously are AMD, NVIDIA, Apple, HP, Lenovo, and Dell. Apple, however, pushes its proprietary "Mini DisplayPort" connector, which offers no benefit over the standard-sized one aside from a narrower (but taller) profile, which allows video cards installed in PCs with standard-sized expansion brackets to provide up to six outputs. The other companies push DisplayPort only on higher-end products such as professional laptops (e.g. Lenovo ThinkPads, Dell Latitudes, HP EliteBooks) and business-oriented displays.
For the last two years, I have not seen many consumer-grade monitors with DisplayPort, whereas I've seen plenty with HDMI, which is simply not suitable for connecting computers to desktop monitors: it was originally designed for televisions and set-top appliances. HDMI requires a royalty for every device sold with an HDMI input or output, plus another royalty for use of the "HDMI" logo, while DisplayPort is royalty-free like Ethernet (although use of the DP logo requires a fee). And even when HDMI is available, most consumers don't use it; most people continue to use analog VGA even for 1920x1080 displays.
I believe the reason HDMI is catching on in the desktop world is simply "HD" marketing: the average Joe sees the gigantic "HDMI" logo on a product and thinks "HD!" DisplayPort also allows for thinner displays, since it lets the video card drive the LCD panel directly rather than converting VGA/DVI/HDMI signals to LVDS with active logic. That in turn means superior longevity: I've seen plenty of monitors where the LCD components work perfectly, but the cheap electronics used to convert the signals are dead, bricking the monitor.
TL;DR: DisplayPort is easily the best available interface for hooking up PCs to displays, but it's seeing very limited adoption.
Got any thoughts about this?
asked Sep 13 '10 at 02:07
As with all standards, adoption takes time. DVI was supposed to replace VGA, yet VGA never died out during DVI's lifetime. Another off-topic example is MP3 and AAC: AAC was supposed to be an open, license-free replacement for MP3 when it was released in 1999, yet it has only recently begun winning market share from MP3.
I've seen DisplayPort adoption grow quite a bit in the past two years. Almost every video card I see now has a DisplayPort connector, and some even go as far as making DisplayPort the primary port, with an optional DVI port:
I guess the issue you're concerned with is displays. Display makers have been a bit slow to adopt it, but it's getting there. Also, Mini DisplayPort is not proprietary; it's an official open standard that anyone can adopt and implement. It just hasn't been very appealing to other computer makers so far (though I do see it being adopted on future (sub)notebooks). The Mac Pro's video cards use normal DisplayPort ports.
Here's a graphics card that supports six displays via Mini DisplayPort, AMD's Radeon HD 5870 Eyefinity, shown below:
On Macs, no. Maybe on PCs, but unless Apple wants to go with a bigger video port, no.
answered Sep 13 '10 at 07:32