The premise of your question is wrong. Wayland does not use OpenGL ES or OpenGL at all. Let's go through the software stack in order to build a proper understanding of it:
Wayland is an IPC protocol that allows clients and the compositor to talk to each other. While technically libwayland is just a single implementation of that protocol and should not be solely identified with it, for now it remains the only implementation and is generally called 'wayland' as well. It is not a full compositor that runs your hardware.
Wayland Compositor is an application that uses the wayland protocol to receive buffers from clients and composite them into a single image shown on the display. The wayland protocol makes relatively few assumptions about the inner workings of the compositor itself. In particular, the choice of rendering technology is left completely open. The default buffer type defined by the core protocol is a simple shared memory buffer that is not accelerated by the GPU in any way, and is meant mainly for simple applications that render their UI using the CPU only. This buffer type is not interesting in our case, and will be conveniently forgotten in the rest of the answer.
Weston is a reference implementation of a wayland compositor. While it is developed by the same people who develop libwayland itself, it is not an essential part of the wayland ecosystem - it is just a single compositor implementation. If you are running any of the Linux distributions that include wayland desktop environments, you are almost certainly not using Weston, but rather some other compositor implementation. Weston uses OpenGL ES for rendering - this is mainly dictated by the fact that current libGL implementations still link to some X-related libraries, and the Weston creators wanted to keep it pure wayland - it is a reference implementation after all. Additionally, this makes Weston compatible with embedded devices, which may not support the full OpenGL.
EGL - libEGL is a library that contains the glue code for initializing rendering contexts of a huge variety (OpenGL, OpenGL ES or OpenVG in different versions). It also allows sharing of data between such contexts - e.g. a framebuffer rendered with OpenGL can be passed to OpenVG for further processing. Sharing of these resources can occur across process boundaries - the receiver of a resource may be a different application than the process that created it. A reference to a shared resource (buffer) can be passed between processes in a variety of ways, e.g. over a compatible wayland IPC connection. A buffer (EGL Image) passed in such a way does not retain any reference to the rendering API used to produce it.
While it is claimed that the EGL layer is also responsible for binding framebuffers to underlying OS elements like windows or displays, in practice that means sharing buffers with some system process that can, e.g., paint them in a window or on a particular display. Therefore, it is just a variation of the above functionality rather than a separate feature.
libEGL is heavily extensible, and there is a huge list of extensions available, so your libEGL implementation may also be responsible for other tasks that do not fit the above description.
GLX - an older and more limited variant of EGL. It allows sharing buffers of various kinds, but only between an X11 client and the X11 server. It is inherently tied to the X11 protocol - if the client application uses the X11 protocol, it can use GLX as well; if it uses the wayland protocol, it cannot. EGL was developed as its replacement, to allow such data to be shared more generally. Modern X11 servers allow clients to use EGL instead of GLX as well.
So the wayland technology does not require you to use OpenGL ES, nor does it even vaguely point in its direction. The reference compositor Weston uses it internally, but that has no influence on the client rendering API. The only requirement is that whatever you render can be transformed into EGL Image. Since this is the job of libEGL, the choice of the rendering API on the client side is dictated only by the limitations of your libEGL implementation. This is also true for other compositors which may or may not be using OpenGL ES to render the final desktop image.
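To make the buffer path concrete - client renders with its API of choice, wraps the result in an EGL Image, compositor imports it - here is a hedged sketch using the real EGL_KHR_image_base and GL_OES_EGL_image extension entry points. All surrounding setup (displays, contexts, the wayland connection that carries the buffer reference) is elided, and in reality the two halves live in different processes:

```c
/* Sketch of the buffer path only - not runnable as-is: contexts, displays
 * and the IPC transport are assumed to exist. The extension entry points
 * (eglCreateImageKHR, glEGLImageTargetTexture2DOES) are the real names. */
#include <stdint.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* Client side: wrap a texture rendered with full OpenGL into an
 * API-neutral EGL Image. */
static EGLImageKHR export_gl_texture(EGLDisplay dpy, EGLContext gl_ctx,
                                     GLuint tex)
{
    PFNEGLCREATEIMAGEKHRPROC create_image =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    return create_image(dpy, gl_ctx, EGL_GL_TEXTURE_2D_KHR,
                        (EGLClientBuffer)(uintptr_t)tex, NULL);
}

/* Compositor side (a different process in reality - the image reference
 * travels over the wayland connection): bind the same image as an
 * OpenGL ES texture. No trace of the producing API remains. */
static GLuint import_as_gles_texture(EGLImageKHR img)
{
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC target_texture =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    target_texture(GL_TEXTURE_2D, (GLeglImageOES)img);
    return tex;
}
```

Whether these extensions are available is exactly the libEGL/hardware question discussed next.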
libEGL is a part of the GPU driver software (just like e.g. libGL), so whether it allows converting an OpenGL buffer into an EGL Image (and subsequently an EGL Image into an OpenGL ES texture on the compositor side) depends on your hardware, but in practice virtually every GPU allows that, as long as it supports the full OpenGL at all. This is why you have difficulty finding definitive proof that wayland supports the full OpenGL - wayland does not care about the rendering technology at all. Just as the FAQ says:
What is the drawing API?
"Whatever you want it to be, honey"[...]
Therefore, the question of whether OpenGL is supported is out of scope for wayland. It is determined solely by the capabilities of the libEGL implementation and the hardware.
The client application must use a particular API in order to initialize its windows and its GL(ES) contexts. If the client application uses the X11 API to create its windows, it will connect to the XWayland compatibility shim, which pretends to be a full X11 server to the client. The client will then be able to use either GLX or EGL-on-X11 to initialize its contexts and share rendered buffers with the X11 server. If the client uses the wayland client API to create its windows, it will be able to use EGL-on-wayland to initialize its contexts and share rendered buffers with the wayland compositor. In most cases this choice lies entirely on the client side.
A lot of older software that is not wayland-aware uses just the X11 API and GLX - simply because the wayland and EGL APIs did not exist (or were not mature enough) during its development. Even more modern software often uses just the X11 API for compatibility reasons - there are still quite a lot of non-wayland systems out there. Modern UI toolkits like GTK or Qt actually support multiple "backends", which means that they can detect the session type on initialization and use the most appropriate API to create windows and drawing contexts. Since games generally don't use such toolkits, the burden of such detection falls entirely on their developers. Not many such projects bother to actually implement it, and they often rely on the X11 API and GLX on both X11 and wayland sessions (through Xwayland). So if a game uses GLX to initialize OpenGL, it has opted to use the X11 API. Whether this is because the game does not support wayland or EGL at all, or whether it tried to use EGL to initialize OpenGL and failed for some reason, I cannot judge without a ton of additional information. In any case, it is not in any way dependent on the wayland protocol or the compositor used.
Best Answer
My system is Fedora 27 with GNOME on wayland, and I added "export DRI_PRIME=1" to my ~/.profile file. It works well with my AMD HD 8730M card.
For more information, you can visit this website: https://archive.fosdem.org/2014/schedule/event/wayland_gpu/attachments/slides/364/export/events/attachments/wayland_gpu/slides/364/GPU_offloading.pdf
This is a slide deck about GPU offloading on wayland.