Channel: VMware Communities: Message List
Re: Workstation 9 - No 3D Acceleration to Guest - Ubuntu 12.04 Host - Nvidia Card

cmillersp wrote:

 

OK, here's the guide to get VMware Workstation working with NVIDIA acceleration.

 

ASSUMING: VMware Workstation 9.0.1 or 8.0.5, Ubuntu 12.10, Optimus, a working Bumblebee config, an x86_64 OS.

 

If you don't have bumblebee set up yet, there are numerous tutorials to help you. Follow them.

 

1. sudo apt-add-repository ppa:zhurikhin/primus

2. sudo apt-get update; sudo apt-get install primus primus-libs primus-libs:i386

3. Check that primus works. Try primusrun glxspheres.
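To make step 3 more concrete, here is a small sketch (mine, not from the guide) of what "primus works" means: rendering should be handed to the NVIDIA card, which you can confirm from the renderer string that glxinfo (from the mesa-utils package) prints.

```shell
# Sketch: succeed only if GL rendering goes to the NVIDIA GPU.
nvidia_renderer() {
  # Reads glxinfo output on stdin; the NVIDIA driver reports the renderer
  # as either "NVIDIA ..." or a bare "GeForce ..." model name.
  grep -qE 'OpenGL renderer string: (NVIDIA|GeForce)'
}

# On a working bumblebee/primus setup you would run:
#   primusrun glxinfo | nvidia_renderer && echo "primus uses the NVIDIA GPU"
```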

 

4. If you don't have vmware installed, install it now.

5. sudo chmod a+s /usr/lib/x86_64-linux-gnu/primus/libGL.so.1

6. sudo ln -s /usr/lib/nvidia-current/tls/libnvidia-tls.so.304.43 /usr/lib/x86_64-linux-gnu/

sudo ln -s /usr/lib/nvidia-current/libnvidia-glcore.so.304.43 /usr/lib/x86_64-linux-gnu/
sudo ldconfig
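After the symlinks and ldconfig in step 6, you can sanity-check that the dynamic linker cache now resolves the NVIDIA libraries. This helper is my own sketch; the 304.43 version is the one from the guide, so substitute whatever driver version you actually have installed.

```shell
# Sketch: succeed if a library name appears in the linker cache listing.
lib_in_cache() {
  # Reads `ldconfig -p` output on stdin.
  grep -q "$1"
}

# On the machine from the guide you would check:
#   ldconfig -p | lib_in_cache libnvidia-tls.so    && echo "tls OK"
#   ldconfig -p | lib_in_cache libnvidia-glcore.so && echo "glcore OK"
```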

 

7. sudo /etc/init.d/vmware stop

sudo mv /usr/lib/vmware/bin/vmware-vmx /usr/lib/vmware/bin/vmware-vmx.real

8. sudo gedit /usr/lib/vmware/bin/vmware-vmx
(paste in the following)

#!/bin/bash

LD_PRELOAD=/usr/lib/x86_64-linux-gnu/primus/libGL.so.1 exec primusrun /usr/lib/vmware/bin/vmware-vmx.real "$@"
9. sudo chmod u+s,a+x /usr/lib/vmware/bin/vmware-vmx
sudo /etc/init.d/vmware start
10. Run vmware, AS ROOT (sudo vmware)
11. Enjoy 3d acceleration.
So far, no crashes and decent performance. Of course it's nowhere near an ideal solution, but it's the best I've been able to come up with at this point, and I'm really, really happy to have hardware acceleration.
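Steps 7 to 9 above can be sketched as one function. It writes the wrapper script that preloads the primus libGL before exec'ing the real binary (which step 7 renamed to vmware-vmx.real), then restores the setuid bit that vmware-vmx normally carries. The directory argument is my addition so the sketch can be dry-run outside /usr/lib; for real use you would run it as root with no argument.

```shell
# Sketch of steps 7-9: install the primus wrapper in place of vmware-vmx.
install_vmx_wrapper() {
  dir="${1:-/usr/lib/vmware/bin}"
  # The wrapper forces the primus libGL into the process, then hands
  # control to the renamed real binary with all original arguments.
  cat > "$dir/vmware-vmx" <<'EOF'
#!/bin/bash
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/primus/libGL.so.1 exec primusrun /usr/lib/vmware/bin/vmware-vmx.real "$@"
EOF
  # vmware-vmx normally runs setuid root; restore that on the wrapper.
  chmod u+s,a+x "$dir/vmware-vmx"
}
```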

 

 

Great work cmillersp.

Only now have I had the chance to try this out... and it's working great.

Now it's possible to run everything from simple DirectX apps (2D games) to heavy 3D ones (like those using pixel shaders and such), whereas before neither worked with WS 9.

 

It seems that for heavier games (I only tested one so far, and it wasn't that heavy; it's a turn-based game) it still doesn't run as smoothly as one would want.

And it eats up a lot of CPU resources. I'm guessing part of it is from the game itself, while the other part is probably due to all the interface layers involved.

But you can see the difference when running a 2D adventure game (like Resonance) versus a 3D one (like Age of Decadence), both CPU-wise and GPU-wise (GPU usage monitored with optirun nvidia-settings -c :8).

 

 

FWIW, I also get these errors:

Xlib:  extension "NV-GLX" missing on display ":0".

 

 

Will have to investigate it later on.

