Linux – Why Wayland is using OpenGL ES instead of OpenGL

graphics, linux, wayland

As far as I know, Wayland uses OpenGL ES rather than OpenGL for 3D rendering, an API usually found on embedded systems (except for Intel IGPs). I have read that OpenGL support would eventually be implemented, but that it is not a priority for now.

I guess this is because OpenGL ES is somewhat simpler, but that alone does not seem like a strong reason for making such a choice.

I was wondering what the reasons for this decision were, and what the consequences of this choice are (and will be) for the future of Linux.

Update:

The Wayland FAQ was my first stop before even thinking about asking here. Feel free to correct me if I am wrong, but the last part seems, IMHO, not very clear:

EGL is the only GL binding API that lets us avoid dependencies on
existing window systems, in particular X.

As far as I understand, it's not that simple. EGL is an interface between rendering APIs such as OpenGL and OpenGL ES and the native platform. OpenGL ES calls are possible directly through Wayland/Weston, while OpenGL support needs XWayland.

GLX obviously pulls in X dependencies and only lets us set up GL on X drawables. The alternative is to write a Wayland specific GL binding API, say, WaylandGL.

So, this part refers to what I was saying above, and, as far as I know, the Wayland development team does not want to take that alternative route. So, for now, people wanting to port applications that do not make direct use of Wayland/Weston are forced to translate their OpenGL API calls to OpenGL ES ones.

A more subtle point is that libGL.so includes the GLX symbols, so
linking to that library will pull in all the X dependencies. This
means that we can't link to full GL without pulling in the client side
of X, so Weston uses OpenGL ES to render.

This seems understandable, at least in the short term. Still, in the long run, the Wayland development team wants to add OpenGL APIs as well, so to me it looks more like a workaround for now, until things get serious. This is one of the sentences that triggered my question here in the first place.

As detailed above, clients are however free to use whichever rendering
API they like.

If I am not mistaken, this means using XWayland for OpenGL applications and Weston for OpenGL ES ones, which seems to be a bigger deal than the sentence implies, especially when it comes to 3D rendering, not to mention the fact that Wayland/Weston aim to replace Xorg.

For the record:

XWayland is a series of patches over the X.Org server codebase that
implement an X server running upon the Wayland protocol. The patches
are developed and maintained by the Wayland developers for
compatibility with X11 applications during the transition to Wayland,
and were mainlined in version 1.16 of the X.Org Server in 2014. When a
user runs an X application from within Weston, it calls upon XWayland
to service the request.

N.B.: I am trying to learn more about Wayland/Weston, especially when it comes to (3D) rendering, but exact information on this subject is difficult to find, especially because the only really X11-savvy people seem to be the Wayland developers themselves.

As far as I can tell so far, for OpenGL:

  • if OpenGL function calls are made through the GLX interface, they fall back to XWayland, so the program is (really) not using Wayland.

Addendum

It might seem that this discussion is out of the scope of the original question, but it is actually linked to the underlying OpenGL interfaces/libraries, and it is difficult to separate all of this from the original question.

As it seems to be a complicated and confusing subject, here are various links and quotes that led me to think that OpenGL (not ES) is not really supported by Wayland per se, but falls back to X11 through XWayland:

What does EGL do in the Wayland stack

The Wayland server in the diagram is Weston with the DRM backend. The server does its rendering using GL ES 2, which it initialises by calling EGL.

Hacker News comments

Wayland is actually pretty stable. Nvidia has problem with OpenGL in
Xwayland (i.e. 3d accel for x11 apps), otherwise, it should work.
There are warts though, when using Wayland. When using scaling
(doesn't have to be fractional, either), X11 apps are being upscaled,
not downscaled, resulting in blurriness. Unfortunately, neither
Firefox nor Chrome does support Wayland natively, and who wants to use
their most used app on their computer in blurry mode?

How come GLX-based applications can be run on Wayland on Ubuntu?

So based on the link @genpfault provided:

  • XWayland is a part of XOrg that's providing an X server on top of Wayland. Any application that's linked against X11 libs and running
    under Wayland will automatically use XWayland as its backend. So the
    GLX part of XWayland is the mechanism that allows GLX-based OpenGL
    applications to run on Wayland.
  • Not being able to use MSAA in GLX-based applications seems to be a known bug of XWayland, at least for Intel and AMD GPUs (cf.
    https://bugs.freedesktop.org/show_bug.cgi?id=98272 ). But I couldn't
    find any additional information on the matter.

Best Answer

The premise of your question is wrong. Wayland does not use OpenGL ES or OpenGL at all. Let's get things in order to achieve a proper understanding of the software stack:

  1. Wayland is an IPC protocol that allows the clients and the compositor to talk to each other. While technically libwayland is just a single implementation of that protocol and should not be solely identified with it, for now it remains the only implementation and is generally called 'wayland' as well. It is not a full compositor that runs your hardware.

  2. Wayland Compositor is an application that uses the wayland protocol to receive buffers from clients and composite them into a single image shown on the display. The wayland protocol makes relatively few assumptions about the inner workings of the compositor itself. In particular, the choice of the rendering technology is left completely open. The default buffer type defined by the core protocol is a simple shared memory buffer that is not accelerated by the GPU in any way, and is meant mainly for simple applications that render their UI using the CPU only. This buffer type is not interesting in our case, and will be conveniently forgotten in the rest of the answer.

  3. Weston is a reference implementation of a wayland compositor. While it is developed by the people involved in the development of libwayland itself, it is not an essential part of the wayland ecosystem - it is just a single compositor implementation. If you are running any of the Linux distributions that include wayland desktop environments, you are almost certainly not using Weston, but rather some other compositor implementation. Weston uses OpenGL ES for rendering - this is mainly dictated by the fact that current libGL implementations still link to some X-related libraries, and Weston's creators wanted to keep it pure wayland - it is a reference implementation, after all. Additionally, this makes it compatible with embedded devices, which may not support the full OpenGL.

  4. EGL - libEGL is a library containing glue code that allows initializing rendering contexts of a huge variety (OpenGL, OpenGL ES or OpenVG, in different versions). It also allows sharing of data between such contexts - i.e. a framebuffer rendered with OpenGL can be passed to OpenVG for further processing. Sharing of these resources can occur across process boundaries - the receiver of a resource may be a different application than the process that created it. A reference to a shared resource (buffer) can be passed between processes in a variety of ways, e.g. over a compatible wayland IPC connection. A buffer (EGL Image) passed in such a way does not retain any reference to the rendering API used to obtain it. While it is claimed that the EGL layer is also responsible for binding framebuffers to the underlying OS elements like windows or displays, in practice that means sharing buffers with some system process that can use them to e.g. paint them in a window or on a particular display. Therefore, it is just a variation of the above functionality rather than a separate feature. libEGL is heavily extensible, and there is a huge list of extensions available, so your libEGL implementation may also be responsible for other tasks that do not fit the above description.

  5. GLX - an older and more limited variant of EGL. It allows sharing of buffers of various kinds, but only between an X11 client and an X11 server. It is inherently tied to the X11 protocol - if the client application uses the X11 protocol, it can use GLX as well; if it uses the wayland protocol, it cannot. EGL was developed as its replacement, to allow such data to be shared more generally. Modern X11 servers allow clients to use EGL instead of GLX as well.

So the wayland technology does not require you to use OpenGL ES, nor does it even vaguely point in its direction. The reference compositor Weston uses it internally, but that has no influence on the client rendering API. The only requirement is that whatever you render can be transformed into an EGL Image. Since this is the job of libEGL, the choice of the rendering API on the client side is dictated only by the limitations of your libEGL implementation. This is also true for other compositors, which may or may not be using OpenGL ES to render the final desktop image.

libEGL is part of the GPU driver software (just like e.g. libGL), so whether it allows converting an OpenGL buffer into an EGL Image (and subsequently an EGL Image into an OpenGL ES texture on the compositor side) depends on your hardware, but in practice virtually all hardware allows it as long as it supports the full OpenGL at all. This is why you have difficulty finding definitive proof that wayland supports the full OpenGL - wayland does not care about the rendering technology at all. Just as the FAQ says:

What is the drawing API?

"Whatever you want it to be, honey"[...]

Therefore, the question of whether OpenGL is supported is out of scope for wayland. It is actually determined solely by the capabilities of libEGL and the hardware.

The client application must use a particular API in order to initialize its windows and its GL(ES) contexts. If the client application uses the X11 API to create its windows, then it will connect to the XWayland compatibility shim, which pretends to be a full X11 server to the client. The client will then be able to use either GLX or EGL-on-X11 to initialize its contexts and share rendered buffers with the X11 server. If the client uses the wayland client API to create its windows, it will be able to use EGL-on-Wayland to initialize its contexts and share rendered buffers with the wayland compositor. This choice in most cases lies entirely on the client side.

A lot of older software that is not Wayland-aware uses just the X11 API and GLX - simply because wayland and the EGL API did not exist (or were not mature enough) during their development. Even more modern software often uses just the X11 API for compatibility reasons - there are still quite a lot of non-wayland systems out there. Modern UI toolkits like GTK or Qt actually support multiple "backends", which means that they can detect the session type on initialization and use the most appropriate API to create windows and drawing contexts. Since games generally don't use such toolkits, the burden of such detection falls entirely on their developers. Not many such projects bother to actually implement it, and they often rely on the X11 protocol and GLX on both X11 and wayland sessions (through XWayland). So if a game uses GLX to initialize OpenGL, it means that it has opted to use the X11 API. Whether that is because the game does not support wayland or EGL at all, or because the game tried to use EGL to initialize OpenGL and failed for some reason, I cannot judge without a ton of additional information. In any case, it is not in any way dependent on the wayland protocol or the compositor used.