
Linux: Run Rift as separate X display

bluenote
Level 2
I'm trying to follow this advice from the LINUX_README:


KNOWN ISSUES
* Frame loss on NVidia TwinView setups. Use an independent X screen for an optimal experience.


... but I'm still having trouble setting this up. Since there seems to be a known solution to this problem at Oculus, would it be possible to share a bit more information on it? I don't think my issues are caused by an exotic setup -- in fact my specs are pretty average: Ubuntu 14.04, GeForce 670, nvidia driver 331.

What I have tried so far:

I'm currently trying two different approaches. In the first approach, my xorg.conf looks like this (generated mainly by means of nvidia-xconfig / nvidia-settings):


# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings: version 331.20 (buildd@roseapple) Mon Feb 3 15:07:22 UTC 2014

Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0" 0 0
Screen 1 "Screen1" RightOf "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
Option "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection

Section "Monitor"
# HorizSync source: edid, VertRefresh source: edid
Identifier "Monitor0"
VendorName "Unknown"
ModelName "LG Electronics W2443"
HorizSync 30.0 - 83.0
VertRefresh 56.0 - 75.0
Option "DPMS"
EndSection

Section "Monitor"
# HorizSync source: edid, VertRefresh source: edid
Identifier "Monitor1"
VendorName "Unknown"
ModelName "OVR Rift DK2"
HorizSync 30.0 - 150.0
VertRefresh 56.0 - 77.0
Option "DPMS"
EndSection

Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce GTX 670"
BusID "PCI:1:0:0"
Screen 0
EndSection

Section "Device"
Identifier "Device1"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce GTX 670"
BusID "PCI:1:0:0"
Screen 1
EndSection

Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
Option "Stereo" "0"
Option "nvidiaXineramaInfoOrder" "DFP-0"
Option "metamodes" "DVI-I-1: nvidia-auto-select +0+0"
Option "SLI" "Off"
Option "MultiGPU" "Off"
Option "BaseMosaic" "off"
SubSection "Display"
Depth 24
EndSubSection
EndSection

Section "Screen"
Identifier "Screen1"
Device "Device1"
Monitor "Monitor1"
DefaultDepth 24
Option "Stereo" "0"
Option "metamodes" "HDMI-0: nvidia-auto-select +0+0 {rotation=left}"
Option "SLI" "Off"
Option "MultiGPU" "Off"
Option "BaseMosaic" "off"
SubSection "Display"
Depth 24
EndSubSection
EndSection


So the idea here is to have one ServerLayout with two Screens. With this configuration I launch my projects as follows:


DISPLAY=:0.1 command_to_run


I can confirm that this gives much less "frame loss" compared to what I got with NVidia TwinView setups. However, applications cannot read any keyboard input with this configuration. Mouse input works, as long as I move the mouse over to the Rift screen and programmatically "grab" the cursor. Is there any known trick to send keyboard input to the second screen? I also tested simply running "DISPLAY=:0.1 gnome-terminal", and it is the same: even if I click on the window on the Rift screen, I cannot type anything. I should note that there is no window manager at all on the second screen (not quite sure whether this is intended behavior or a bug). Am I supposed to run a separate window manager on this screen as well to enable input handling? I tried running "DISPLAY=:0.1 compiz" on the Rift, but this fails immediately.
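A quick way to check whether the second screen receives keyboard events at all (a diagnostic sketch; xev and xinput are standard X utilities and may need to be installed separately):

```shell
# open a test window on the Rift screen; xev prints every event it receives,
# so if no KeyPress events appear while typing, the screen gets no key focus
DISPLAY=:0.1 xev
# list the input devices the X server knows about (same server, second screen)
DISPLAY=:0.1 xinput list
```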

The idea of my second approach is not to run two separate screens within one ServerLayout (:0.0 and :0.1), but to configure two independent ServerLayouts (= display IDs), each with only one screen (resulting in :0 and :1). This is probably the cleaner solution, since it makes more sense for the Rift and the main desktop to have individual display IDs rather than being two screens belonging to one desktop. If I understand it correctly, the first ServerLayout in xorg.conf is used as the default desktop, and all others can be started manually with "startx". The modified xorg.conf looks like this:


Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0" 0 0 # only the main screen, no Rift
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
Option "Xinerama" "0"
EndSection

Section "ServerLayout"
Identifier "Rift"
Screen 0 "Screen1" 0 0 # only the Rift, no main screen
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
Option "Xinerama" "0"
EndSection

# the remainder is identical to the version above...


What is strange, though: the Rift is still enabled after rebooting with this configuration (the login screen is visible in the Rift). What is worse: I haven't found a way to start anything on the Rift yet. According to the documentation I was expecting this to work:


sudo startx /full/path/to/command -- :1 -layout Rift


However, this freezes my system completely; not even CTRL+ALT+F1..F7 works. In the corresponding /var/log/Xorg.1.log I at least saw that my command was (correctly) trying to use the "Rift" ServerLayout, but it later bailed out with an error mentioning "device not found".

I'm wondering if anyone has managed to configure the Rift in this way?

I should mention that the "sudo startx" approach changes the owner of the current user's .Xauthority to root. This is pretty nasty, since logging in via lightdm (the default Ubuntu login manager) then just fails without any error, and it is not possible to log in anymore. Login is still possible in a text shell (CTRL+ALT+F1), though. Changing the owner back to the user solves the problem. Took me quite some time to figure this one out...
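For anyone hitting the same thing, the check and the fix look like this (assuming lightdm and a single-user machine):

```shell
# after 'sudo startx', ~/.Xauthority may have ended up owned by root
ls -l ~/.Xauthority
# restore ownership so lightdm logins work again
sudo chown "$USER:$USER" ~/.Xauthority
```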

matus
Level 2
Try starting the X server from a tty console. Also do it without sudo. Works for me.

bluenote
Level 2
You are right, "sudo startx" seems to be a deadly sin. And indeed, running startx from tty1 does not require root privileges. However, it still crashes for me when I specify "-layout Rift". When I omit "-layout Rift", I get a working X session on CTRL+ALT+F8, running on my primary monitor (but in both sessions the Rift is enabled as an extended desktop, which is not quite what I expected from my ServerLayout configuration).
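In case it helps others, the exact invocation I'm testing from the console looks like this (a sketch; /usr/bin/xterm is just a placeholder client, and the explicit vt8 pins the new server to a free virtual terminal):

```shell
# from a text console (CTRL+ALT+F1), without sudo:
startx /usr/bin/xterm -- :1 -layout Rift vt8
```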

I was trying to track down the issue in the Xorg.log for this second session, which looks like this:


[ 683.686]
X.Org X Server 1.15.1
Release Date: 2014-04-13
[ 683.687] X Protocol Version 11, Revision 0
[ 683.687] Build Operating System: Linux 3.2.0-70-generic x86_64 Ubuntu
[ 683.687] Current Operating System: Linux fabuntu 3.13.0-45-generic #74-Ubuntu SMP Tue Jan 13 19:36:28 UTC 2015 x86_64
[ 683.687] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.13.0-45-generic root=UUID=58115c9d-156a-4608-9adb-e20d7ffc9b11 ro quiet splash
[ 683.687] Build Date: 10 December 2014 06:15:52PM
[ 683.688] xorg-server 2:1.15.1-0ubuntu2.6 (For technical support please see http://www.ubuntu.com/support)
[ 683.688] Current version of pixman: 0.30.2
[ 683.688] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 683.688] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 683.690] (==) Log file: "/var/log/Xorg.1.log", Time: Sun Feb 15 11:29:08 2015
[ 683.690] (==) Using config file: "/etc/X11/xorg.conf"
[ 683.690] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 683.691] (++) ServerLayout "Rift"
[ 683.691] (**) |-->Screen "Screen1" (0)
[ 683.691] (**) | |-->Monitor "Monitor1"
[ 683.691] (**) | |-->Device "Device1"
[ 683.691] (**) |-->Input Device "Keyboard0"
[ 683.691] (**) |-->Input Device "Mouse0"
[ 683.691] (**) Option "Xinerama" "0"
[ 683.691] (==) Automatically adding devices
[ 683.691] (==) Automatically enabling devices
[ 683.691] (==) Automatically adding GPU devices
[ 683.691] (WW) The directory "/usr/share/fonts/X11/cyrillic" does not exist.
[ 683.691] Entry deleted from font path.
[ 683.691] (WW) The directory "/usr/share/fonts/X11/100dpi/" does not exist.
[ 683.691] Entry deleted from font path.
[ 683.691] (WW) The directory "/usr/share/fonts/X11/75dpi/" does not exist.
[ 683.691] Entry deleted from font path.
[ 683.691] (WW) The directory "/usr/share/fonts/X11/100dpi" does not exist.
[ 683.691] Entry deleted from font path.
[ 683.691] (WW) The directory "/usr/share/fonts/X11/75dpi" does not exist.
[ 683.691] Entry deleted from font path.
[ 683.691] (==) FontPath set to:
/usr/share/fonts/X11/misc,
/usr/share/fonts/X11/Type1,
built-ins
[ 683.691] (==) ModulePath set to "/usr/lib/x86_64-linux-gnu/xorg/extra-modules,/usr/lib/xorg/extra-modules,/usr/lib/xorg/modules"
[ 683.691] (WW) Hotplugging is on, devices using drivers 'kbd', 'mouse' or 'vmmouse' will be disabled.
[ 683.691] (WW) Disabling Keyboard0
[ 683.691] (WW) Disabling Mouse0
[ 683.691] (II) Loader magic: 0x7f4186908d40
[ 683.691] (II) Module ABI versions:
[ 683.691] X.Org ANSI C Emulation: 0.4
[ 683.691] X.Org Video Driver: 15.0
[ 683.691] X.Org XInput driver : 20.0
[ 683.691] X.Org Server Extension : 8.0
[ 683.691] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 683.693] (--) PCI:*(0:1:0:0) 10de:1189:1043:841b rev 161, Mem @ 0xf6000000/16777216, 0xe8000000/134217728, 0xf0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[ 683.693] Initializing built-in extension Generic Event Extension
[ 683.693] Initializing built-in extension SHAPE
[ 683.693] Initializing built-in extension MIT-SHM
[ 683.693] Initializing built-in extension XInputExtension
[ 683.693] Initializing built-in extension XTEST
[ 683.694] Initializing built-in extension BIG-REQUESTS
[ 683.694] Initializing built-in extension SYNC
[ 683.694] Initializing built-in extension XKEYBOARD
[ 683.694] Initializing built-in extension XC-MISC
[ 683.694] Initializing built-in extension SECURITY
[ 683.694] Initializing built-in extension XINERAMA
[ 683.694] Initializing built-in extension XFIXES
[ 683.695] Initializing built-in extension RENDER
[ 683.695] Initializing built-in extension RANDR
[ 683.695] Initializing built-in extension COMPOSITE
[ 683.695] Initializing built-in extension DAMAGE
[ 683.695] Initializing built-in extension MIT-SCREEN-SAVER
[ 683.695] Initializing built-in extension DOUBLE-BUFFER
[ 683.696] Initializing built-in extension RECORD
[ 683.696] Initializing built-in extension DPMS
[ 683.696] Initializing built-in extension Present
[ 683.696] Initializing built-in extension DRI3
[ 683.696] Initializing built-in extension X-Resource
[ 683.696] Initializing built-in extension XVideo
[ 683.696] Initializing built-in extension XVideo-MotionCompensation
[ 683.696] Initializing built-in extension SELinux
[ 683.697] Initializing built-in extension XFree86-VidModeExtension
[ 683.697] Initializing built-in extension XFree86-DGA
[ 683.697] Initializing built-in extension XFree86-DRI
[ 683.697] Initializing built-in extension DRI2
[ 683.697] (WW) "glamoregl" will not be loaded unless you've specified it to be loaded elsewhere.
[ 683.697] (II) "glx" will be loaded by default.
[ 683.697] (WW) "xmir" is not to be loaded by default. Skipping.
[ 683.697] (II) LoadModule: "glx"
[ 683.697] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/libglx.so
[ 683.703] (II) Module glx: vendor="NVIDIA Corporation"
[ 683.703] compiled for 4.0.2, module version = 1.0.0
[ 683.703] Module class: X.Org Server Extension
[ 683.703] (II) NVIDIA GLX Module 331.113 Mon Dec 1 20:24:35 PST 2014
[ 683.703] Loading extension GLX
[ 683.703] (II) LoadModule: "nvidia"
[ 683.703] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so
[ 683.704] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 683.704] compiled for 4.0.2, module version = 1.0.0
[ 683.704] Module class: X.Org Video Driver
[ 683.704] (II) NVIDIA dlloader X Driver 331.113 Mon Dec 1 20:01:51 PST 2014
[ 683.704] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 683.704] (++) using VT number 8

[ 683.712] (II) Loading sub module "fb"
[ 683.712] (II) LoadModule: "fb"
[ 683.713] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 683.713] (II) Module fb: vendor="X.Org Foundation"
[ 683.713] compiled for 1.15.1, module version = 1.0.0
[ 683.713] ABI class: X.Org ANSI C Emulation, version 0.4
[ 683.713] (WW) Unresolved symbol: fbGetGCPrivateKey
[ 683.713] (II) Loading sub module "wfb"
[ 683.713] (II) LoadModule: "wfb"
[ 683.713] (II) Loading /usr/lib/xorg/modules/libwfb.so
[ 683.713] (II) Module wfb: vendor="X.Org Foundation"
[ 683.713] compiled for 1.15.1, module version = 1.0.0
[ 683.713] ABI class: X.Org ANSI C Emulation, version 0.4
[ 683.713] (II) Loading sub module "ramdac"
[ 683.713] (II) LoadModule: "ramdac"
[ 683.713] (II) Module "ramdac" already built-in
[ 683.713] (EE) Screen 0 deleted because of no matching config section.
[ 683.713] (II) UnloadModule: "nvidia"
[ 683.713] (II) UnloadSubModule: "wfb"
[ 683.713] (II) UnloadSubModule: "fb"
[ 683.713] (EE) Device(s) detected, but none match those in the config file.
[ 683.713] (==) Matched nvidia as autoconfigured driver 0
[ 683.713] (==) Matched nouveau as autoconfigured driver 1
[ 683.713] (==) Matched nvidia as autoconfigured driver 2
[ 683.713] (==) Matched nouveau as autoconfigured driver 3
[ 683.713] (==) Matched modesetting as autoconfigured driver 4
[ 683.713] (==) Matched fbdev as autoconfigured driver 5
[ 683.713] (==) Matched vesa as autoconfigured driver 6
[ 683.713] (==) Assigned the driver to the xf86ConfigLayout
[ 683.713] (II) LoadModule: "nvidia"
[ 683.713] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so
[ 683.713] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 683.713] compiled for 4.0.2, module version = 1.0.0
[ 683.713] Module class: X.Org Video Driver
[ 683.713] (II) UnloadModule: "nvidia"
[ 683.713] (II) Unloading nvidia
[ 683.713] (II) Failed to load module "nvidia" (already loaded, 32577)
[ 683.713] (II) LoadModule: "nouveau"
[ 683.713] (II) Loading /usr/lib/xorg/modules/drivers/nouveau_drv.so
[ 683.715] (II) Module nouveau: vendor="X.Org Foundation"
[ 683.715] compiled for 1.15.0, module version = 1.0.10
[ 683.715] Module class: X.Org Video Driver
[ 683.715] ABI class: X.Org Video Driver, version 15.0
[ 683.715] (II) LoadModule: "modesetting"
[ 683.715] (II) Loading /usr/lib/xorg/modules/drivers/modesetting_drv.so
[ 683.715] (II) Module modesetting: vendor="X.Org Foundation"
[ 683.715] compiled for 1.15.0, module version = 0.8.1
[ 683.715] Module class: X.Org Video Driver
[ 683.715] ABI class: X.Org Video Driver, version 15.0
[ 683.715] (II) LoadModule: "fbdev"
[ 683.715] (II) Loading /usr/lib/xorg/modules/drivers/fbdev_drv.so
[ 683.716] (II) Module fbdev: vendor="X.Org Foundation"
[ 683.716] compiled for 1.15.0, module version = 0.4.4
[ 683.716] Module class: X.Org Video Driver
[ 683.716] ABI class: X.Org Video Driver, version 15.0
[ 683.716] (II) LoadModule: "vesa"
[ 683.716] (II) Loading /usr/lib/xorg/modules/drivers/vesa_drv.so
[ 683.716] (II) Module vesa: vendor="X.Org Foundation"
[ 683.716] compiled for 1.15.0, module version = 2.3.3
[ 683.716] Module class: X.Org Video Driver
[ 683.716] ABI class: X.Org Video Driver, version 15.0
[ 683.716] (II) NVIDIA dlloader X Driver 331.113 Mon Dec 1 20:01:51 PST 2014
[ 683.716] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 683.716] (II) NOUVEAU driver Date: Thu Nov 7 14:56:48 2013 +1000
[ 683.716] (II) NOUVEAU driver for NVIDIA chipset families :
[ 683.716] RIVA TNT (NV04)
[ 683.716] RIVA TNT2 (NV05)
[ 683.716] GeForce 256 (NV10)
[ 683.716] GeForce 2 (NV11, NV15)
[ 683.716] GeForce 4MX (NV17, NV18)
[ 683.716] GeForce 3 (NV20)
[ 683.716] GeForce 4Ti (NV25, NV28)
[ 683.716] GeForce FX (NV3x)
[ 683.716] GeForce 6 (NV4x)
[ 683.717] GeForce 7 (G7x)
[ 683.717] GeForce 8 (G8x)
[ 683.717] GeForce GTX 200 (NVA0)
[ 683.717] GeForce GTX 400 (NVC0)
[ 683.717] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[ 683.717] (II) FBDEV: driver for framebuffer: fbdev
[ 683.717] (II) VESA: driver for VESA chipsets: vesa
[ 683.717] (++) using VT number 8

[ 683.717] (WW) xf86OpenConsole: setpgid failed: Operation not permitted
[ 683.717] (WW) xf86OpenConsole: setsid failed: Operation not permitted
[ 683.717] (WW) Falling back to old probe method for modesetting
[ 683.717] (WW) Falling back to old probe method for fbdev
[ 683.717] (WW) Falling back to old probe method for vesa
[ 683.717] (EE) No devices detected.
[ 683.717] (EE)
Fatal server error:
[ 683.717] (EE) no screens found(EE)
[ 683.717] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 683.717] (EE) Please also check the log file at "/var/log/Xorg.1.log" for additional information.
[ 683.717] (EE)
[ 683.718] (EE) Server terminated with error (1). Closing log file.


It looks like the first difference when using the Rift ServerLayout is the error "(EE) Screen 0 deleted because of no matching config section".

Would you mind posting your xorg.conf for comparison?

matus
Level 2
I just use the Linux Mint display settings GUI to set up the screens instead of editing xorg.conf. Try that. You can then look up the resulting xorg.conf and save it for later use. If that doesn't work for you, I can share my xorg.conf ...

bluenote
Level 2
As far as I can tell, there is no option to save the xorg.conf from the Ubuntu display settings GUI (not sure whether installing the Mint display manager would work). I was using nvidia-settings to generate the xorg.conf, but it does not offer to generate configurations according to the second approach (different display identifiers; it only allows different screen identifiers). A working xorg.conf would still be highly appreciated.

FictionX
Level 2
Are there actually any advantages to messing about with a separate X screen instead of just using xrandr to sync to the Rift?

I tried the xorg way a while ago, but got pretty frustrated with it. I ended up creating a few xrandr shortcuts instead. It seems to work fairly well.

I described my setup here:
viewtopic.php?f=26&t=19893#p242513

Edit: I know the README talks about frame loss on nvidia TwinView - but I'm not sure that's still an issue with the newer drivers (I think I read that somewhere). And even if it were, switching the main monitor off and only having the Rift active would fix that, right? - or am I missing something about the TwinView concept?
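For what it's worth, my xrandr shortcuts boil down to something like this (a sketch; the output names DVI-I-1 and HDMI-0 are taken from the xorg.conf posted above, so check yours with `xrandr -q` first):

```shell
# switch to Rift-only: main monitor off, DK2 on (portrait panel, rotated)
rift_mode() { xrandr --output DVI-I-1 --off --output HDMI-0 --auto --rotate left; }
# switch back to the desktop: DK2 off, main monitor on
desk_mode() { xrandr --output HDMI-0 --off --output DVI-I-1 --auto; }
```

Bind each function to a hotkey (or put them in two small scripts) and you can flip back and forth without opening any settings GUI.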

bluenote
Level 2
"FictionX" wrote:
Are there actually any advantages to messing about with a separate x screen instead of just using xrandr to sync to the Rift?


Well, it seems that this is the only way for me to get rid of my judder issues when using the Rift on the JVM via JOVR.

Apart from that, I think the best long-term solution will be to have two separate X displays. With this you avoid all the issues a TwinView extended setup has (for instance, evince and gedit tend to open on the Rift, which is annoying). Running (only) two separate X _screens_ also feels wrong, since in this case you still have to set up the Rift relative to the main screen (for instance, I often lose the focus of the right-border scrollbars because my mouse cursor jumps over to the Rift). According to X terminology, a display is composed of multiple screens, which form a visual unit (= desktop). Since the main screen and the Rift are not a useful desktop configuration, it only makes sense to configure them as individual displays. This is the closest equivalent to "direct mode": the Rift is not part of the desktop, you bypass any window manager, and it would also allow running a true multi-screen setup as the desktop environment. To run something on the Rift, you can start it from your desktop with "DISPLAY=:1 cmd_to_run", and use CTRL-ALT-F8 and CTRL-ALT-F7 to switch your input devices exclusively to either the Rift or the main screen.

I just saw (thanks for the link) that you always turn off the main display when switching to the Rift. I'll have to try whether this solves frame loss for me as well. However, developing like this must be rather difficult? I typically use automatic reloading of my shaders and certain scene configurations. This allows me to run my stuff on the Rift while modifying things on the regular desktop and quickly checking how it looks in the Rift. Well, maybe if invoking xrandr does not screw up my virtual workspaces, a set of hotkeys may work.

bluenote
Level 2
@FictionX: I ran some experiments with your xrandr-turn-primary-off idea. Surprisingly, it does not help with my judder issues, i.e., whether or not I turn off my primary display does not seem to have any effect.

I have now found a way to make my first approach (two X screens) work with user input. I simply start a window manager on the second screen before running any Rift applications. I had tried this before with "DISPLAY=:0.1 compiz --replace &", but running compiz twice seems to produce some weird errors. Instead of compiz, I picked the first lightweight window manager that came to mind (icewm). Now my start scripts look like this:


export DISPLAY=:0.1
icewm --replace &
command_to_run_on_rift


To get rid of the icewm taskbar, it suffices to create a file "~/.icewm/preferences" containing the line "ShowTaskBar = 0".
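Equivalently, as a one-off setup step (idempotent, so it can also live at the top of the start script):

```shell
# hide the icewm taskbar on the Rift screen
mkdir -p ~/.icewm
grep -qx 'ShowTaskBar = 0' ~/.icewm/preferences 2>/dev/null \
  || echo 'ShowTaskBar = 0' >> ~/.icewm/preferences
```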

Regarding the preferred approach with two individual X displays: all my further attempts have failed so far.

Maybe this also means that the "frame loss" issues are actually caused by using a compositing window manager in the first place.

@cybereality: I would still appreciate it if Oculus could share a bit more information on this - maybe a kind of best-practices guide for setting up the Rift under Linux. I have now lost several weeks of development because I was assuming the judder was caused by a mistake in my code. It also took me quite some time to finally come up with a working X configuration. All of this could have been avoided by just a bit more official documentation. What about the plan to provide some kind of new wiki/knowledge base to support developers, which was brought up some time ago?

FictionX
Level 2
Thanks for the update!

But yeah - if Oculus thinks that this is indeed the best way to go, there should be some best-practice instructions available.

bullale
Level 2
@bluenote and @FictionX,

I know that Oculus doesn't support Linux anymore, but I'm back to trying to get my experiments running in Linux. I have a minor problem that I hope you (or anyone else who uses Linux) can help with.

I'm using Ubuntu 14.04 and an nVidia GTX 970. I've installed ovr_sdk_0.5.0.1.

I use `sudo nvidia-settings` to set my primary monitor (1080p, 144 Hz) to screen 0 and the DK2 (75 Hz) to screen 1. Then I save my /etc/X11/xorg.conf file and apply the settings. The settings cannot be applied, so I do a `sudo restart lightdm`. Then I can run


/usr/local/bin/ovrd &
DISPLAY=:0.1 /usr/local/bin/OculusWorldDemo


Then I have to press F9 to get the application to go full screen and move the mouse over to the 'screen' so I can use mouse-yaw, but that all works. I didn't have to do anything special to get it to capture the keyboard. It's updating at a solid 75 Hz, with no judder or other visual artifacts.
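In case it's useful to anyone, I've wrapped those two steps in a small launcher script (a sketch; the paths are where the 0.5.0.1 SDK installed things on my machine):

```shell
#!/bin/sh
# start the Oculus runtime daemon in the background
/usr/local/bin/ovrd &
OVRD_PID=$!
# run the demo on the second X screen (the DK2)
DISPLAY=:0.1 /usr/local/bin/OculusWorldDemo
# stop the daemon again when the demo exits
kill "$OVRD_PID"
```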

Problem 1: When I restart the computer, my display settings are lost, and I have to `sudo restart lightdm` again to get them to load. It seems to be a common problem that Ubuntu (maybe lightdm specifically) ignores xorg.conf on startup and falls back to whatever was specified in Ubuntu's display settings. However, most of the solutions I found elsewhere were about setting the relative display positions, either by using Unity's display settings app or by changing the display manager. Neither solution worked for us. Any ideas? Did you have to do anything special to get your xorg.conf settings to load on reboot?
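One workaround I'm considering (untested so far, and the script path is my own choice): lightdm can run a script right after it starts the X server, by pointing `display-setup-script=` in /etc/lightdm/lightdm.conf at something like:

```shell
#!/bin/sh
# /etc/lightdm/display-setup.sh (hypothetical path; make it executable)
# re-apply the screen layout every time lightdm starts the X server;
# output names taken from `xrandr -q` on my machine, yours may differ
xrandr --output HDMI-0 --auto --rotate left --right-of DVI-I-1
```

No idea yet whether this also survives the fallback to Ubuntu's display settings, so take it as a sketch, not a confirmed fix.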

I have further problems, but I'm still trying different things, so I'll ask later if still necessary.