OpenXR - Basic integration for Meta Quest #577
Conversation
|
Can I help you with this project? I am a programmer with very little Java experience, but I am a Linux power user and work as a Linux sysadmin; I also have a Quest 2. I also wanted to ask/suggest possibly leveraging wxrd (https://gitlab.freedesktop.org/xrdesktop/wxrd), a lightweight Linux VR compositor based on wlroots, or alternatively xrdesktop-gnome-shell or xrdesktop-kdeplasma. Unsure if this helps or hinders your development, but these compositors provide a 3D VR environment with free-floating app windows and controller support. |
Let me get the basic integration working first. Currently it is just a black screen and it does nothing.
I didn't know about xrdesktop, it looks pretty wild. It would be wild to get it running on standalone. I imagine it would be quite challenging to make that work on Quest, but I might be wrong. |
Termux:X11 is not related to Wayland. |
It was my understanding that Termux:X11 is an XWayland session. Weston reportedly works, and that is Wayland-based. @lvonasek The xrdesktop project also has GNOME- and KDE-specific builds that are X11-based (https://gitlab.freedesktop.org/xrdesktop). The wxrd window manager was created to have an extremely small footprint. In all the aforementioned cases, xrdesktop is the underlying platform, which already has movement tracking and controller support. (Hoping the "Direct Input" option for touchscreen passthrough could pass the controllers and head tracking to Monado without much trouble.) |
|
@beef-ox It is nice to see there are many opportunities. But until I have the basic integration working, I won't distract myself with other possible stuff. The key to success is to take small steps and do them properly. |
That was true for the first few years. Termux:X11 implemented a small subset of the Wayland protocol only to make it possible to run XWayland, but at least a year ago the project dropped it because of architecture restrictions.
Weston works on top of an X11 session. It does not need a Wayland session to work; it starts its own Wayland session.
wxrd requires wlroots, which requires GLES with some extensions that cannot be used on Android. Android vendors do not implement support for these extensions, and even if they did, they are not part of the SDK/NDK and are not guaranteed to work.
You have illusions about how that works. It is implemented only for touchscreen and passes only touchscreen events. |
|
Termux:X11 does not use C++ to keep the APK size as small as possible. Currently I do not intend to merge C++ code, only C. |
Ok, good to know. I will move the XR code to C. |
|
There are a few more things:
|
I believe I can move to GLES2 completely. GLES3 would only be needed if I used stereoscopic rendering via the multiview extension.
The swapchain is required by OpenXR. The only way to render in OpenXR is to render into a texture and then let the headset reproject it. This architecture is very helpful in VR, as you get a fluent experience even when rendering at a lower framerate than the headset's refresh rate. In 2D rendering it doesn't bring much benefit, but it still has to be used.
I would like to avoid mapping the Meta Quest touch controllers' thumbsticks to a joystick. The thumbsticks develop extreme noise after some time. In other XR projects I check whether the stick is 70% to the right, and if so I send a right-arrow key event (see the sketch below). But of course we could make it optional at some point. |
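A minimal sketch of that threshold idea in C (the 0.7 cutoff matches the 70% mentioned above; `sendKeyEvent()`, the key codes and the function names are assumed placeholders, not this PR's actual API):

```c
#include <stdbool.h>

/* Sketch only: map noisy thumbstick axes to arrow-key events.
   All names here are assumptions, not the PR's real code. */
#define THRESHOLD 0.7f

enum { KEY_LEFT, KEY_RIGHT, KEY_UP, KEY_DOWN };   /* placeholder key codes */
extern void sendKeyEvent(int key, bool pressed);  /* assumed X key injector */

static void mapAxis(float value, int negativeKey, int positiveKey) {
    /* Hold the key while the axis is past the threshold, release otherwise,
       so noisy values near the center never generate spurious events. */
    sendKeyEvent(negativeKey, value < -THRESHOLD);
    sendKeyEvent(positiveKey, value >  THRESHOLD);
}

static void mapThumbstickToArrows(float x, float y) {
    mapAxis(x, KEY_LEFT, KEY_RIGHT);
    mapAxis(y, KEY_DOWN, KEY_UP);
}
```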
|
@lvonasek I am not really sure how exactly it works. How exactly are you intending to extract frames in the activity process? Currently LorieView (which works in the context of MainActivity) does not output anything to the Surface. It simply passes this Surface to the X server process via Binder, and the X server (which works in |
|
First, I need to figure out how rendering in this project works. Ideally, I would call glBindFramebuffer (binding my XrFramebuffer) and render the frame into it using OpenGL. That way the frame is in OpenXR. In OpenXR, I'll specify that I want to render it on a plane in 3D space (see the sketch below). This is a work in progress and I am new to this repo, so please be patient if I commit or say something stupid. |
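A rough sketch of that render path against the standard OpenXR C API, assuming the session, swapchain and per-image GL framebuffers are already set up and that `xrWaitFrame`/`xrBeginFrame` run elsewhere in the frame loop; all names below are placeholders, not this PR's code:

```c
#include <GLES2/gl2.h>
#include <openxr/openxr.h>

extern XrSession session;
extern XrSwapchain swapchain;
extern XrSpace space;                /* e.g. a LOCAL reference space */
extern GLuint framebuffers[];        /* one FBO per swapchain image */
extern void renderX11Frame(void);    /* assumed: draws the X11 screen texture */

static void renderQuadLayer(XrTime displayTime, int32_t width, int32_t height) {
    uint32_t index;
    XrSwapchainImageAcquireInfo acquireInfo = {XR_TYPE_SWAPCHAIN_IMAGE_ACQUIRE_INFO};
    xrAcquireSwapchainImage(swapchain, &acquireInfo, &index);

    XrSwapchainImageWaitInfo waitInfo = {XR_TYPE_SWAPCHAIN_IMAGE_WAIT_INFO};
    waitInfo.timeout = XR_INFINITE_DURATION;
    xrWaitSwapchainImage(swapchain, &waitInfo);

    /* Render into the texture OpenXR handed us instead of the screen. */
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffers[index]);
    renderX11Frame();

    XrSwapchainImageReleaseInfo releaseInfo = {XR_TYPE_SWAPCHAIN_IMAGE_RELEASE_INFO};
    xrReleaseSwapchainImage(swapchain, &releaseInfo);

    /* Present the frame as a flat quad floating in 3D space. */
    XrCompositionLayerQuad quad = {XR_TYPE_COMPOSITION_LAYER_QUAD};
    quad.space = space;
    quad.eyeVisibility = XR_EYE_VISIBILITY_BOTH;
    quad.subImage.swapchain = swapchain;
    quad.subImage.imageRect.extent.width = width;
    quad.subImage.imageRect.extent.height = height;
    quad.pose.orientation.w = 1.0f;   /* identity rotation */
    quad.pose.position.z = -2.0f;     /* two meters in front of the user */
    quad.size.width = 2.0f;           /* quad size in meters */
    quad.size.height = 2.0f * height / width;

    const XrCompositionLayerBaseHeader* layers[] =
        {(const XrCompositionLayerBaseHeader*)&quad};
    XrFrameEndInfo endInfo = {XR_TYPE_FRAME_END_INFO};
    endInfo.displayTime = displayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
    endInfo.layerCount = 1;
    endInfo.layers = layers;
    xrEndFrame(session, &endInfo);
}
```

The key point is that the app never touches the display directly: it only fills the swapchain texture, and the headset's compositor reprojects the quad every refresh, which is why a dropped frame still looks smooth.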
I explained where exactly the Surface is being used, so I can explain the rendering process too. Actually, this process is pretty simple. You could reimplement the whole thing in pure Vulkan and integrate it into your OpenXR-related code. But I am not sure why the OpenXR context is initialized with a JavaVM and a global reference to the Activity, so I am not sure if it can run completely in the X server process. I think I will understand it better once you elaborate on how exactly that works. |
Thank you, this is very helpful.
For the JavaVM I could not find any info anywhere on why it is required. The Activity itself is needed for the app lifecycle (listening to onWindowFocusChange, onPause and onResume events). I'll try to elaborate, though I am really not good at explaining: AR/VR headsets have two app modes: 2D (Android apps floating in 3D space) and immersive OpenXR mode. In immersive mode the app cannot render anything using the Android API; the only way to show something on screen is OpenGL/Vulkan. Meta recently added support for hybrid apps, where you can switch between a 2D and an XR activity. I added hybrid app support to this PR and trigger the OpenXR runtime only if the app is running on a headset. The final APK will run on regular Android and on XR headset(s). Currently it is under construction, but in the future I would like to start XR only if the X server is running (currently there is no way in the headset to go into the preferences or open the help page).
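One plausible explanation for the JavaVM requirement: on Android, the OpenXR loader itself is initialized through the XR_KHR_loader_init_android extension, which takes exactly the VM and a context object. A minimal sketch of that call, based on the OpenXR specification rather than on this PR's code:

```c
#include <jni.h>
#define XR_USE_PLATFORM_ANDROID
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>

/* Sketch: hand the JavaVM and a global Activity reference to the Android
   OpenXR loader before creating an XrInstance (per XR_KHR_loader_init_android). */
static void initLoader(JavaVM* vm, jobject activity) {
    PFN_xrInitializeLoaderKHR xrInitializeLoaderKHR = NULL;
    xrGetInstanceProcAddr(XR_NULL_HANDLE, "xrInitializeLoaderKHR",
                          (PFN_xrVoidFunction*)&xrInitializeLoaderKHR);

    XrLoaderInitInfoAndroidKHR info = {XR_TYPE_LOADER_INIT_INFO_ANDROID_KHR};
    info.applicationVM = vm;
    info.applicationContext = activity;
    xrInitializeLoaderKHR((const XrLoaderInitInfoBaseHeaderKHR*)&info);
}
```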
|
With all due respect, I would rather not lose the ability to render as a 2D app in the Quest's home launcher when the X server is displaying 2D content. There should be no need for that. X11 is a very important and well-understood protocol. If you want to implement Quest support, I don't think you should create a custom 3D environment of your own to reproject onto, which all further users of Termux:X11 would then be forced to use instead of the Quest's multi-tasking launcher, which lets you have three 2D apps side by side; that is perfect for my programming workflow, for example (and many others').
The goal of Termux:X11 should be to implement as much of the X11 client protocol as possible, and as close to spec in all respects as possible. The decision between 2D mode and immersive mode should not depend on the device it runs on, but on whether the X server is attempting to display OpenXR content AND the hardware supports it. I 100% agree that if the Linux environment is trying to output stereoscopic content over X11, it should be displayed in immersive mode, but if not, it should be displayed as a 2D app window. Ideally, this could work like fullscreen, where the rendering pipeline is direct instead of going through a compositor: 2D content displays in a traditional desktop "display" as a 2D app within a WM/DE, but attempting to display XR content would switch to immersive mode. |
I will definitely try to make that optional. |
|
I am not skilled enough to check your C/Java OpenXR code, but I think I can review the Java part.
The rest of the code seems to be fine. |
|
Thank you for the review.
|
|
Any other merge blockers, @twaik? |
|
I do not think so. I'll check it later. |
|
Do you mind if I squash all the commits into one? |
|
I am fine with squashing it. |
Without a log from logcat I cannot say much. Maybe an older OS version? (AFAIK it should work on V62 and higher.) The XR mode is mostly usable for gaming (using "mobox").
Do you mean > Mobox is a project designed to run Windows x86 applications in Termux?
Yes
Yes, but without any advantages. I did a similar integration for the Winlator project: https://youtu.be/c4faL1G1St4?feature=shared |
|
Cool! XFCE4 works in both modes, however there is no way to return from the preferences activity; pressing/clicking/touching the BACK button does nothing (in both modes). I have to close tx11 and open it again to apply changes. |
|
If you open the system notifications, there is a tx11 entry, and from there you can open the preferences. I stopped working on Winlator because it isn't fully open source. |
Of course, but the question is how to close the preferences. |
|
With the X button. However, TX11 might need a restart if you change the XR mode while it is running. |
|
@lvonasek
|
|
I mean you can use it directly. |
|
I just followed this: I will create a PR cleaning it up. |
|
I am pretty much sure the is not necessary, but only you can check if that is true. |
|
Can you please find out how to get out of 3D mode programmatically? Starting MainActivity and finishing XrActivity does not seem to work. |
|
I mean |
|
I will take a look |
|
No, this isn't connected to my implementation. It seems to be something to do with the GPU driver. Report it to Mobox or whatever you use. |
|
Hi |
|
If it's possible to move the screen in XR mode, that solves my problem. |
|
It is a GPU problem. XR renders the data into its own framebuffer and then onto the screen. I guess that makes the difference. I am stopping support in this PR; all further messages will be ignored. |
|
Sorry for wasting your time with this 🙏
Introduction
This PR adds OpenXR support for Meta Quest (2, 3, Pro). Using the smartphone Android version in the headset is very hard due to missing controller support and the relative mouse pointer. The intent of this PR is to add full controller support and render the screen using OpenXR (no stereo/6DoF).
How does it work
The code detects whether it is running on a Meta/Oculus device, and OpenXR is initialized only if the detection indicates an XR headset. That means the same APK can be used on mobile and in XR. This is possible thanks to hybrid app support (hybrid app == part of the app can be 2D and another part XR); a sketch of such a check follows.
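A minimal sketch of that device check in C, assuming detection by the vendor string (`isXrDevice()` is a hypothetical helper; the PR's actual check may differ):

```c
#include <stdbool.h>
#include <strings.h>
#include <sys/system_properties.h>

/* Hypothetical helper: start the OpenXR path only on Meta/Oculus hardware. */
static bool isXrDevice(void) {
    char manufacturer[PROP_VALUE_MAX] = {0};
    __system_property_get("ro.product.manufacturer", manufacturer);
    return strcasecmp(manufacturer, "Oculus") == 0 ||
           strcasecmp(manufacturer, "Meta") == 0;
}
```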
Instead of drawing on a screen, the output is rendered into an OpenGL framebuffer, which is then presented as a flat screen in XR space. The mouse cursor is still relative, but it is mapped to the controller translation (sketched below), which works perfectly even in games. Controller buttons are mapped to the most common game keys.
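A sketch of the controller-to-cursor idea, assuming the controller position is read from its tracked pose each frame; `SENSITIVITY`, `sendMouseDelta()` and the coordinate conventions are assumptions, not this PR's code:

```c
#include <stdbool.h>

#define SENSITIVITY 2000.0f  /* pixels per meter of hand movement, arbitrary */

extern void sendMouseDelta(int dx, int dy);  /* assumed relative-pointer injector */

static void mapControllerToCursor(float x, float y) {
    static float lastX, lastY;
    static bool hasLast = false;
    if (hasLast) {
        /* Emit relative deltas only, so the X11 cursor never needs an
           absolute position; screen Y grows downward, hence the sign flip. */
        sendMouseDelta((int)((x - lastX) * SENSITIVITY),
                       (int)((lastY - y) * SENSITIVITY));
    }
    lastX = x;
    lastY = y;
    hasLast = true;
}
```

Because only deltas are sent, this behaves like a physical mouse: games that grab the pointer and read relative motion keep working, which is the point made above.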
Notes