Why do video cards have better quality on Windows? Is there something about Linux that stops video cards from being utilized to their full capacity? What would it take to get the best possible graphics out of a video card on Linux?
Related Solutions
Open source drivers are getting pretty good these days. I haven't had any problem with Intel or AMD hardware.
Intel
I hear the old ones are pretty bad, but my G4500HD does everything I need well. Video acceleration could be better, though. There isn't a proprietary driver for Intel either; your only choice is open source. The composited 3D desktop in KDE works great on my laptop, which has an Intel chip.
AMD/ATi
Right now the older cards are better supported than the new ones. If you could somehow get an x1800 or something from the same generation, that would probably be the best. The r300g driver is getting more development work than r600g. That's not to say r600g is bad, in fact it's great! It's just somewhat behind the driver for the older hardware. AMD has a proprietary driver for the new hardware, but in my experience you want to avoid it; it's pretty bad. The hardware covered by r300g isn't supported by that driver, so the open driver is your only option there. And like the Intel chip I have, my Radeon 4850 runs the composited desktop in KDE well.
At the moment, I wouldn't recommend an HD6000-series card. The 6900s have no support at all in the open driver, and the others have only basic support. Go for an HD5000 or an HD4000 instead.
Nvidia
They have a really good proprietary driver, but the open driver is struggling along. It's getting better all the time, but Nvidia is doing nothing to help the developers. At least AMD helps out a little bit for their hardware.
The advantage of having an open driver is that it will work out of the box in any distro. If you install Fedora, everything will work, including dual screens and 3D. The proprietary drivers are painful to set up. Neither of them properly set up my dual screens; it was easier with Nvidia, which isn't saying much because the AMD blob was just awful at this. Also, any time you update the kernel you have to reinstall the driver. Most distros take care of this if you install the in-repo version, but if you don't, it's annoying to boot up one morning and realize you updated the kernel and now X.org doesn't work.
If you aren't planning on playing 3D games, either the Intel or the AMD driver will serve you best. The AMD driver is more modern than the Intel one; it uses the Gallium3D architecture within Mesa (that's what the g stands for in r600g), but they both get the job done.
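If you're curious which driver Mesa picked on your machine, the renderer string from a GL context will tell you; with a Gallium driver like r600g it typically mentions "Gallium". A minimal sketch in C (assuming freeglut and the Mesa development headers are installed; the calls are plain OpenGL/GLUT, nothing specific to any one driver):

```c
/* Print which OpenGL vendor/renderer the system driver reports.
 * Build with:  gcc renderer.c -o renderer -lglut -lGL           */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);          /* needs a running X server        */
    glutCreateWindow("renderer");   /* creates and binds a GL context  */

    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```

The glxinfo utility reports the same strings; the point is just that Gallium and classic Mesa drivers look identical to applications, so switching between them doesn't change how your programs are written.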
This is all due to the fact that the X server is outdated and ill-suited to today's graphics hardware; essentially all direct communication with the video card is done as an extension ("patch") bolted onto the ancient, bloated core. The X server provides no built-in means of synchronizing an application's rendering of a window with the screen's display of that window, so the content can change in the middle of a refresh. This is one of the well-known issues of the X server (it has many; the entire model of what the server does is outdated - event handling in subwindows, metadata about windows, graphical primitives for direct drawing...). Widget toolkits mostly try to gloss over all this, but tearing is still a problem because there is no mechanism to handle it. Additional problems arise when you have multiple cards that require different drivers, and on top of all this, the OpenGL library has a hard-wired dependency on Xlib, so you can't really use it independently without going through X.
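To make that concrete: under X, tear-free output is something each application (or its toolkit) has to arrange for itself, usually by asking the driver to synchronize buffer swaps with the display's vertical blank through a GLX extension. A rough sketch in C (assuming the driver exposes GLX_EXT_swap_control; freeglut is only used here to get a current context with minimal boilerplate):

```c
/* Ask the GLX driver to wait for vblank before swapping buffers.
 * Build with:  gcc vsync.c -o vsync -lglut -lGL                 */
#include <stdio.h>
#include <GL/glx.h>
#include <GL/glut.h>

typedef void (*swap_interval_fn)(Display *, GLXDrawable, int);

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("vsync");      /* gives us a current GLX context */

    /* GLX extensions are looked up at runtime; not every driver offers this one. */
    swap_interval_fn set_interval = (swap_interval_fn)
        glXGetProcAddress((const GLubyte *)"glXSwapIntervalEXT");

    if (set_interval) {
        set_interval(glXGetCurrentDisplay(), glXGetCurrentDrawable(), 1);
        puts("Swap interval set to 1: buffer swaps now wait for vertical blank.");
    } else {
        puts("GLX_EXT_swap_control not available on this driver.");
    }
    return 0;
}
```

Even when this succeeds it only protects that one GL drawable; the server itself still makes no promise about how the rest of the desktop is presented.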
Wayland, which is (somewhat unenthusiastically) trying to replace X, builds strict vsync synchronization into its core and is advertised with the promise that every frame is perfect.
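For comparison, under Wayland the pacing is part of the core protocol: the compositor drives the client's redraw through frame callbacks, and an attached-and-committed buffer is always presented whole. A fragmentary sketch of that loop in C (just the callback part; the registry, surface, and buffer setup a complete client needs are omitted, so treat this as illustration rather than a finished program):

```c
/* Wayland frame-callback loop: the compositor tells the client when to
 * draw the next frame, so each committed buffer can be shown intact.   */
#include <wayland-client.h>

static void frame_done(void *data, struct wl_callback *cb, uint32_t time_ms);

static const struct wl_callback_listener frame_listener = {
    .done = frame_done,
};

static void frame_done(void *data, struct wl_callback *cb, uint32_t time_ms)
{
    struct wl_surface *surface = data;
    (void)time_ms;                  /* presentation timestamp, unused here */
    wl_callback_destroy(cb);

    /* ... render the next frame and wl_surface_attach() the new buffer ... */

    /* Ask to be notified again when the compositor wants another frame. */
    struct wl_callback *next = wl_surface_frame(surface);
    wl_callback_add_listener(next, &frame_listener, surface);
    wl_surface_commit(surface);
}
```

The difference from X is that this handshake and the atomic attach/commit are in the core protocol, not an extension that a particular driver may or may not provide.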
A quick search for "wayland video tearing" will turn up more information on all of this.
Best Answer
There is no fundamental reason why the same graphics hardware cannot produce the same quality of output under Windows and Linux. However, the software, both the drivers and any application software, has to be written, and doing that for both Windows and Linux simply takes extra effort.
Such double effort always takes resources away from, or interferes with, doing it for just one platform. So even if a company might gain additional hardware/software sales by making the effort, this has to be weighed against the advantage it might lose by not focusing more, or exclusively, on one platform.
In the early 90s I worked mostly with Silicon Graphics (SGI) machines running their Unix variant (IRIX). These were the only computers that could handle 3D graphics and video processing at levels acceptable to movie makers. The high-end graphics (for that time) would not work under Windows, because for the longest time Windows was not an installable option on SGI hardware. Movies like Jurassic Park used SGI machines both for making the movie and in the movie itself.
Nowadays, in the gaming industry for example, Windows is dominant in quality of output, but that is simply because more effort is put into that platform due to higher expected sales, not because Linux could not reach those quality levels.
However, for the creation of high-quality graphics for movies like The Hobbit, with its 48 fps (whether you like that or not), the software used is Linux-based. Using Linux was the developers' decision, based among other things on the level of graphics quality that needed to be achieved. (Disclaimer: I have been involved in providing development services for this software company, so you are right to think me biased about the quality of the graphics output.)