Immediately after I upgraded from VMware Fusion 10.x to 11.0.1, I got the following error when trying to start any VM running an "advanced" OS such as Windows 10 or Ubuntu 16.04, even though those VMs worked perfectly fine in VMware Fusion 10:
No 3D support is available from the host.
The 3D features of the virtual machine will be disabled.
After several hours of digging around and experimentation, I sorted out the answer which I'll post below.
Best Answer
Edit your main VMware Fusion preferences file. This should be "~/Library/Preferences/VMware Fusion/preferences". Add the following lines (or edit any that are present to make them conform), then save the file:

mks.enableMTLRenderer = "0"
mks.enableGLRenderer = "1"

If you have a 2013 or later Mac, a recent version of macOS, and are running the Metal 2 3D graphics library, you can try setting the first of those (the MTL one) to "1" and the one below it (GL) to "0", but I have not tested this personally.
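If you prefer to script the preferences edit, here is a minimal shell sketch (not part of the original answer). It assumes the two renderer keys are mks.enableMTLRenderer and mks.enableGLRenderer, and that VMware Fusion is quit before you run it so it doesn't overwrite the file:

```shell
# Sketch only: replace-or-append a `key = "value"` line in a
# VMware Fusion config file. Quit VMware Fusion before running.
set_pref() {
  file="$1"; line="$2"
  key="${line%% =*}"                     # e.g. mks.enableGLRenderer
  if [ -f "$file" ]; then
    # Drop any existing setting for this key, then append the new one.
    grep -v "^${key} " "$file" > "$file.tmp" || true
    mv "$file.tmp" "$file"
  fi
  printf '%s\n' "$line" >> "$file"
}

# Hypothetical usage with the keys discussed above:
# PREFS="$HOME/Library/Preferences/VMware Fusion/preferences"
# set_pref "$PREFS" 'mks.enableMTLRenderer = "0"'
# set_pref "$PREFS" 'mks.enableGLRenderer = "1"'
```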
Next, open your VM's "[whatever].vmx" file and add the

mks.gl.allowBlacklistedDrivers = "TRUE"

line to that file as well. Save the file. Make these edits while VMware Fusion is quit, so it doesn't overwrite your changes to the two files (preferences and *.vmx).

What seems to have happened is that either Hardware Version 14 did support DirectX and that support was removed, or there was a Version 15 that did and that entire version was removed. Either way, any VMs under VMware Fusion 10.x that supported DirectX under the highest Hardware Version then available suddenly lost that ability in VMware Fusion 11.x, until upgraded to Hardware Version 16. On top of this, VMware silently assumed everyone was using Metal 2 and had abandoned OpenGL, which of course isn't true.
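The .vmx edit above is a simple append-if-missing, which can be sketched the same way (again, not part of the original answer; the VM path is hypothetical, and the VM should be shut down first):

```shell
# Sketch only: append a line to a config file unless it's already there.
ensure_line() {
  grep -qF "$2" "$1" 2>/dev/null || printf '%s\n' "$2" >> "$1"
}

# Hypothetical VM path; substitute your own .vmx file:
# ensure_line "$HOME/Virtual Machines.localized/MyVM.vmwarevm/MyVM.vmx" \
#   'mks.gl.allowBlacklistedDrivers = "TRUE"'
```

Running it twice is harmless: the grep check makes the append idempotent, so you won't end up with duplicate lines in the .vmx file.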
On the up side, you now get up to 3GB of shared VRAM instead of 2GB, so you'll be able to play some less-ancient games and use slightly more modern 3D-rendering apps. However, VMware is still lagging badly on 3D support: it's still stuck at DirectX 10.1, and DirectX 10 has been obsolete since 2008, so VMware is a decade behind user needs.
Credits: