"Bit wise, there shouldn't be a problem whether you render in 8 or 10,
but there's a major problem with linux anyway with bit-depths. In
linux, there unfortunately is no front-end software that allows you to
change the bit-depth. The ONLY way to change bit-depth in linux, is to
compile your video drivers to call 10 bits per channel for your
interfaces, instead of 8."
Actually, Nvidia's GeForce drivers have supported 10 bits per channel in Linux for a while now. Of course, you'll need a display with 10 bit per channel color support connected to your video card via DisplayPort for it to work.
In contrast, on Windows, only the Nvidia drivers for their Quadro cards allow 10 bit per channel color. But most applications don't work with it (since the Windows desktop itself is 8 bits per channel, and it's difficult to implement 10 bit per channel color on only part of the screen, as you'd see with an image editing application). Photoshop CS6 does have 10 bit per channel working in Windows using either AMD FirePro cards or Nvidia Quadro cards. With Photoshop CC, though, I've seen reports of problems with FirePro cards.
Basically, Windows requires "Pro" cards (AMD FirePro or Nvidia Quadro) to get that feature (10 bit per channel color with applications that support it), as the drivers for AMD Radeon cards and Nvidia GeForce cards do not support the 10 bit per channel OpenGL buffers used by applications like Photoshop. See this page for some info on that from Nvidia:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011
But, interestingly, the Nvidia proprietary Linux drivers for GeForce cards do support 10 bit per channel OpenGL buffers if you have a card with a DisplayPort output and a display that supports 10 bits per channel. They added 10 bit per channel (a.k.a. 30 bit) color support to their Linux drivers beginning with version 295.20. Here's an old Phoronix article mentioning it:
http://www.phoronix.com/scan.php?page=news_item&px=MTA1NzM
If you go to the Nvidia X Server Settings panel in Linux and look under Display Settings, you should see a Color Depth choice if the detected monitor is capable of 10 bits per channel and is connected via DisplayPort. You'll see a 30 bit choice marked as Experimental there. Or, you can just select X Screen 0 (rather than the detected display) and see the color depth choice that way (it defaults to 24 bit, but 30 bit will be available in the drop-down list). Of course, you don't want to select that choice unless you have a true 30 bit (10 bit per channel) hardware setup: a card with a DisplayPort output and a monitor with a true 10 bit per channel panel connected to it.
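If you'd rather make the setting persistent than toggle it in the control panel each session, the same 30 bit mode can be requested in your X configuration. A minimal sketch, assuming the Nvidia proprietary driver and the usual default section identifier (adjust "Screen0" to match your own xorg.conf):

```
Section "Screen"
    Identifier    "Screen0"
    # 30 bit (10 bits per channel); only works with a DisplayPort
    # connection to a true 10 bit per channel panel
    DefaultDepth  30
EndSection
```

After restarting X, you can confirm what you actually got with `xdpyinfo | grep "depth of root window"` (24 means 8 bits per channel, 30 means 10).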
No, you can't (unless someone writes an application to do so, or you select Mirror displays in System Settings > Hardware > Displays). What you're asking for is inconsistent.
Suppose you have one application (Firefox) open on display #1 and another one (let's say Chromium) on display #2. Now you look at display #2, so you use Chromium, and you will see Chromium's menu in the global menu. What reason would you have to see Firefox's menu in the global menu when Firefox is displayed on display #1? And vice versa. This makes no sense...
Best Answer
Install CCSM (CompizConfig Settings Manager) and the Compiz plugins:
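On Ubuntu, for example, both can be pulled from the standard repositories (package names assumed for Ubuntu's archives; the OBS plugin ships in the extra plugins package):

```shell
# CompizConfig Settings Manager plus the main and extra Compiz plugin sets
sudo apt-get install compizconfig-settings-manager compiz-plugins compiz-plugins-extra
```

Once installed, launch it with the `ccsm` command.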
In CCSM, enable the
Opacity, Brightness, Saturation
plugin. In that plugin, you can reduce the saturation. (I myself set saturation to zero and use the Invert plugin to invert colors, since white-on-black is less strenuous on the eyes for casual reading/browsing, generally to reduce eye strain. :)) To use the OBS plugin, simply create a new rule, which can match windows by name, type, class, etc. For example, to match all windows, use
type=any
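Match rules can also target specific applications rather than everything. A couple of illustrative expressions in Compiz's match syntax (the window class names here are assumptions and depend on which apps you run; check a window's class with `xprop WM_CLASS`):

```
(class=Firefox) | (class=Chromium)
!(type=Dock)
```

The first desaturates only Firefox and Chromium windows; the second matches everything except dock-type windows such as panels.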