A possible way to accelerate 3D and video on USB display devices using VirtualGL (the approach applies to any display device, but I show how to use it with USB display devices). Based on Bumblebee, I have summarised the steps to enable 3D acceleration.
For this to work you need a decent graphics card capable of accelerating video and 3D; most recent Intel, AMD/ATI and nVidia graphics cards support this with the proper drivers installed.
Before I go into further details I would like to inform you that I have not tested this method (as I don’t have a USB display device to use); it may destabilise your system, may have security implications, or cause other issues. As usual I am not ready to take responsibility for any harm this approach may do to your system or data, or in any other way. I always welcome your experience, feedback and suggestions.
Here we go
Install the latest stable version of VirtualGL (or the unstable version if you like a bleeding-edge system, though it may have some bugs) from the VirtualGL website.
Run “vglserver_config” as root (su -c vglserver_config in Fedora, sudo vglserver_config in Ubuntu).
Answer as below – choose option 1 from the menu, then accept the recommended answer to each of the three questions:
1) Configure server for use with VirtualGL in GLX mode
2) Unconfigure server for use with VirtualGL in GLX mode
Restrict 3D X server access to vglusers group (recommended)?
Restrict framebuffer device access to vglusers group (recommended)?
Disable XTEST extension (recommended)?
Now your X server will be configured properly. Quit the application.
Append the following two lines to /etc/profile, or to your ~/.bashrc if it exists.
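The two lines themselves depend on where VirtualGL is installed. A plausible pair – assuming the default /opt/VirtualGL install prefix, and :0 as the display that owns the 3D-capable GPU (VGL_DISPLAY is a real VirtualGL variable, but the value here is my assumption) – would be:

```shell
# Assumption: VirtualGL was installed under /opt/VirtualGL (its default prefix)
export PATH="$PATH:/opt/VirtualGL/bin"
# Tell VirtualGL which X display owns the 3D-capable graphics card
export VGL_DISPLAY=:0
```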
Now you need to make sure that your desktop environment runs the command “vglclient localhost &” after starting up, so add it to your startup location. Now reboot the machine and hope your computer comes up ;). If so, hurray!!! (It should come up; if not, rename /etc/X11/xorg.conf to /etc/X11/xorg_old.conf, reboot, and if needed reconfigure X – the last step is needed only for older versions of X.)
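One way to run vglclient at login in a freedesktop-compliant environment (GNOME, KDE) is an autostart entry. This is an untested sketch; the file name and location are my own choice:

```ini
# ~/.config/autostart/vglclient.desktop (hypothetical file name)
[Desktop Entry]
Type=Application
Name=VirtualGL client
Exec=vglclient localhost
```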
Try vglrun <application> to see if it gets accelerated; vglrun glxgears comes in handy here. If so, you can go through the rest of the steps; if not, revert the changes and report a bug.
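To check beyond frame rates, you can inspect the renderer string. This assumes glxinfo and glxgears (from mesa-utils or a similar package) are installed:

```
# With acceleration working, the renderer string should name the
# native GPU rather than a software rasterizer.
vglrun glxinfo | grep "OpenGL renderer"
vglrun glxgears    # should report a high, steady frame rate
```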
You can configure your system so that 3D acceleration is always on using the following steps. Note: this is ultra experimental.
For KDE, create a session entry (a .desktop file in your display manager’s sessions directory) containing:

Name=KDE Plasma Workspace Accelerated by Native Graphics Card
Comment=The desktop made by KDE
Exec=/usr/bin/startkde-vgl

Then create /usr/bin/startkde-vgl, a one-line script that runs:

/usr/bin/vglrun -d :0.1 /usr/bin/startkde

and make it executable:

chmod +x /usr/bin/startkde-vgl
For GNOME, similarly create a session entry containing:

Name=GNOME Accelerated by Native Graphics Card
Comment=The GNU Network Object Model Environment. A complete, free and easy-to-use desktop environment
Exec=/usr/bin/gnome-session-vgl

Then create /usr/bin/gnome-session-vgl, a one-line script that runs:

/usr/bin/vglrun -d :0.1 /usr/bin/gnome-session

and make it executable:

chmod +x /usr/bin/gnome-session-vgl
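Written out in full, such a session entry might look like the sketch below. The location (/usr/share/xsessions/), the file name, and the Type and TryExec fields are my assumptions – older GDM/KDM versions read sessions from other paths, so check your distribution:

```ini
# /usr/share/xsessions/gnome-vgl.desktop (assumed location and name)
[Desktop Entry]
Type=XSession
Name=GNOME Accelerated by Native Graphics Card
Comment=The GNU Network Object Model Environment. A complete, free and easy-to-use desktop environment
Exec=/usr/bin/gnome-session-vgl
TryExec=/usr/bin/gnome-session-vgl
```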
The accelerated session options should be available in GDM and KDM (i.e., the login screen). Choose the “GNOME Accelerated by Native Graphics Card” session instead of “GNOME”. Once this session is launched, all applications should get 3D acceleration. Compiz did not work in a similar environment, but other applications and games (using Wine) had 3D acceleration.
You get 3D acceleration for USB-based display devices. Video acceleration may also be available with Xv; testers are needed.
No Compiz – some bug fixing, or modifying the code, is needed to make it work with this setup. No idea about GNOME 3; someone has to check and confirm KWin.
I guess this will not work in extended or mirror mode; someone can come forward to test and confirm. I will work on this if I can somehow (magically) get a USB display device.
There is a security issue that makes this unsuitable if you cannot trust the users of the machine (as is the case in internet cafes etc.).
Immediate Improvements possible:
- Automatic changes to xorg.conf whenever a USB display device arrives or disappears. This seems to be implemented already; visit Plugable.
- If the user starts this (accelerated by native graphics card) session on a normal graphics card that has built-in OpenGL support (as in some hybrid graphics setups), the scripts should detect this and open a normal session instead of VirtualGL. This seems easy, but needs to be implemented and tested.
- User test cases, results and feedback. These will be updated here as and when information is available.
- Consistency in naming the device (fb0, fb1, etc.) and remembering its settings (resolution, colour depth, etc.) each time a device is connected or configured by the user. Should we care about this, or should the desktop environment take care of it?
- Making sure all the steps mentioned here are correct. They should be, hopefully, but let me know if not.
Some more info and concerns
This may or may not work on your system; just try it and let me know. I have not tested this myself (because of the lack of a USB display device), so I will be relying on independent users to test this and provide feedback.
VirtualGL also has some security issues (the VirtualGL website states this), so use this only if you are not concerned about security, or if you are going to use it at home with your family and friends.
Please provide feedback and improvement suggestions.
The above steps can be used to accelerate any USB display device, such as DisplayLink USB devices, as these have an open source Linux framebuffer driver (called udlfb) and an X driver (xorg-video-displaylink).
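For reference, an xorg.conf fragment for such a device points the displaylink driver at the device’s framebuffer node. This is an untested sketch; the framebuffer path (/dev/fb1) and the identifiers are my assumptions:

```
Section "Device"
    # Assumes the DisplayLink adapter registered as /dev/fb1
    Identifier "USBDisplay"
    Driver     "displaylink"
    Option     "fbdev" "/dev/fb1"
EndSection

Section "Screen"
    Identifier "USBScreen"
    Device     "USBDisplay"
EndSection
```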
Update 1: (13 August 2011) A couple of weeks ago I tried to test this. The outcome is neither success nor failure. I tried this post on a laptop with an Intel GPU (Intel D945GCCR) running Ubuntu. I was not able to use the Intel and USB display devices at the same time: only one of the devices runs X. When I tried to launch two X sessions, only the USB display device came up and the Intel monitor showed nothing. It seems that using USB display devices on systems with Intel graphics cards has some issues, so for now I am still not able to get OpenGL accelerated on USB display devices. Is anyone daring enough to try this on a machine with an AMD (ATI) or nVidia graphics card? If you have one, please try this with both the closed source and open source drivers and comment with the results on this page. I will be glad to hear more on this.
Update 2: (25 September 2011) A few days back I got access to my friend’s laptop, manufactured by HP, with an nVidia graphics card. I installed VirtualGL and the USB display drivers on Ubuntu Linux and configured the system as above. Instead of a multi seat configuration I tried a dual head config. After modifying xorg.conf I rebooted the system. While booting, a green screen appeared on the second monitor, showing that udlfb was loaded; the laptop monitor was blank. After some time the Caps Lock indicator LED started blinking continuously – I guessed it was a kernel panic. I restarted the system and, in rescue mode, modified xorg.conf so that the laptop monitor acts as screen 1 and the USB display as screen 0. Rebooted again – kernel panic again.
Mistakes I made:
- Used Ubuntu 11.04 (as this is what was installed on my friend’s system)
- Used the nVidia binary blob
- Used dual head mode instead of multi seat mode
My next try is to test multi seat mode instead of dual head mode with the nVidia binary blob. If that fails, I will try nouveau instead of the nVidia binary driver (in both multi seat and dual head configs). I will be able to try these things when I get access to the machine again, most likely in mid October or later. I urge people who read this page, and who possess a USB display adapter, to try this out and provide comments and feedback. I really thank my friend “Vijay” for graciously providing his laptop for my experiments.
Update 3: (5 November 2011) As I did not get access to the laptop that I mentioned, and also due to Deepawali (the festival of lights that destroys evil and brings good things), I had to defer my tests. In the meantime I came to know that “Dave Airlie” is doing huge work to re-architect X and make multi-GPU hotplugging possible (IRC chat), and that he is also trying to accelerate DisplayLink (and similar devices) using the native GPU; more info is available on the Phoronix page. Recently I came across another approach which does a similar thing in the kernel by exposing a Virtual CRTC, making the native GPU’s DRM/KMS driver think of it as another CRT controller attached to it. This module requires a small modification to the native GPU driver’s code, and with that it seamlessly accelerates all operations on the DisplayLink device. If this module gets merged upstream, there will be no need for these dirty hacks: things will work out of the box, with no changes needed in user space (not even in X). More info is available in this HOWTO; code is available on this github page, in a mailing list message by the author, and in a Phoronix article about this. The latter solution has a lot of advantages and will work with any display server that relies on DRM (though people using binary blobs cannot make use of it). The former solution makes huge changes; a binary driver, once modified, can work, but it works with X alone. Both are really interesting and challenging pieces of code.