One of my professors was telling us about scalability problems, and said that the X protocol was a prime example of a non-scalable protocol. Why is that? Is it because it is very hardware dependent? I know that X is used in modern Unix/Linux environments; if it's not scalable, then why is it used so widely?
Does the X windowing system suffer from scalability problems?

protocols, x11
Related Solutions
GNU (GNU's Not Unix) is an operating system created by Richard M. Stallman. You can use this operating system with different kernels, such as the Linux kernel, the Hurd kernel, or the Darwin kernel.
The X Window System (common on Unix-like systems) is just the basic layer for a GUI environment.
Every Linux distribution is a GNU operating system with a Linux kernel and an X Window System; on top of X, you have a window manager or desktop environment such as Xfce, GNOME, or KDE that lets you use your system easily.
The userland GUI stack begins with the Xorg server, which manages hardware -- both the display and input devices -- and provides the foundations of a windowing environment. It is a server whose clients are GUI applications that appear in a window. I believe the name was originally derived from one of the X-shaped cursors (but treat that as apocryphal; see vonbrand's comment below); the earliest versions of a Unix-based X Window server date back to about the same time as the original Microsoft Windows and Macintosh systems. You can run X all on its own, but you need to configure it to start some applications, or else you are left with an empty screen and a mouse-manoeuvrable X. Not so interesting.
Xorg is common to most general-purpose Linux distributions, though there are some (bleeding-edge) alternatives available. The next layer in the stack is the Window Manager (WM), at which point Linux becomes heterogeneous -- there is a wide variety of window managers available. Their primary purpose is to provide a unified interface to Xorg for the user; they are responsible for titlebars, borders, and arranging and controlling (maximizing, minimizing, iconifying, etc.) all the windows on the desktop. They may also include specialized applications of their own, such as taskbars, and provide application-independent menus. Many, but not all, window managers can be used on their own, as together with X they provide the essential elements of what most people would consider a graphical desktop.
About 15 years ago a third layer began to appear, the Desktop Environment (DE). These built on the more fanciful aspects of window managers and provide various kinds of integrated services. They have as their centrepiece a suite of applications including a file browser and a GUI terminal -- before this, these existed only as independent entities. DEs usually use stripped-down and simplified window managers which are responsible strictly for window decoration and management (i.e., they build upon a clear delineation of responsibilities).
With regard to mixing and matching applications associated with specific DEs, it is usually possible. GNOME is tied into a larger infrastructure, parts of which are very commonly used by Linux applications, so your system will inevitably have bits and pieces of GNOME-related software running regardless of which WM and DE you use.
Best Answer
One reason he may have said this is that if you look at the traffic that flows back and forth between an X client and the X server, it's fairly verbose. This doesn't present an issue when the traffic only has to travel locally, on a single box, between the two; but when it needs to cross a network connection, it becomes painfully obvious that the protocol is inefficient.
The protocol is tolerable on a LAN, but as soon as you try to span it over a WAN connection, or introduce encryption in the form of a VPN or an SSH tunnel between the client and the server, the protocol really starts to show its lack of scalability.
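As a concrete sketch of what "X over SSH" means here (the hostname is a placeholder, not from the original answer): the -X flag forwards the X11 connection over the encrypted channel, and -C compresses the stream, which can take some of the sting out of X's chattiness on slow links.

```shell
# Placeholder host: run a remote X client, displaying on the local X server.
#
#   ssh -X -C user@remote-host xterm
#
# -X forwards X11 over the SSH channel; -C compresses the stream.
# You can check how those flags resolve without actually connecting,
# using ssh -G, which just prints the effective client configuration:
ssh -G -X -C user@remote-host | grep -E '^(forwardx11|compression) '
```

With -X and -C set, the printed configuration should show `forwardx11 yes` and `compression yes`.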
Benchmarking
You can use the tool x11perf to get a sense of the impact of running applications against a local X server vs. running them over an SSH connection from another X system. Here I'm running the -create test to give you a taste of what I'm talking about.

localhost

LAN host

WAN host

Notice the extreme drop-off from:

localhost:

LAN host:

WAN host:
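For reference, the three setups being compared can be invoked along these lines (hostnames are placeholders, and the commands are echoed rather than executed here, since each run needs a live X display to drive):

```shell
# Sketch of the three benchmark setups compared above.
echo "localhost: x11perf -create"
echo "LAN host:  ssh -X lan-host x11perf -create"
echo "WAN host:  ssh -X wan-host x11perf -create"
```

In the LAN and WAN cases, x11perf runs on the remote machine but talks to the local X server through the forwarded SSH channel, so every protocol round trip crosses the network.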
That's a pretty steep decline in performance. Now realize that this isn't all X's fault: the LAN test went over a 100 Mbit network, and the WAN test over a ~20 Mbit connection, but the point still stands. X isn't helping itself with the overly beefy communications it throws back and forth between the X server and the X client.
Communications Breakdown (couldn't resist the Led Zeppelin reference)
This is more for effect, but to give you an idea of roughly how much data flows back and forth during the x11perf -create test I used above, I ran it against my LAN host again, only this time I used tcpdump to capture the SSH traffic and dump it to a file, then checked the size of the resulting capture.
So the resulting amount of traffic was in the ballpark of ~5.5MB. Granted, this is not all X traffic, but it gives you an idea of the amount of data flowing. This verbosity is really the Achilles' heel of X, and the major reason why it can't scale.
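As a back-of-the-envelope check (assuming the link speeds mentioned above are 100 Mbit/s and 20 Mbit/s, and ignoring latency and framing overhead), just pushing ~5.5 MB of protocol chatter across each link already costs noticeable wall-clock time:

```shell
# 5.5 MB of traffic = 5.5 * 8 = 44 Mbit; divide by link speed
# to get the raw transfer time on each link.
awk 'BEGIN {
  mbits = 5.5 * 8
  printf "LAN (100 Mbit/s): %.2f s\n", mbits / 100
  printf "WAN (20 Mbit/s):  %.2f s\n", mbits / 20
}'
```

That's about 0.44 s on the LAN and 2.2 s on the WAN for the data alone, before latency and per-round-trip stalls, which hit a chatty protocol like X hardest.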