I want to use the Quest to create light painting artwork.
This is essentially a long exposure on a camera where you use light sources to draw something into a real environment. Because it is always a bit of a challenge to keep track of what you have already drawn and where, I had the idea of using the Quest to help with that. This would mean attaching a light source to one of the controllers so it stays in sync with its virtual representation. To start, this could just be a simple LED with a button taped to the controller, but it could be extended if the idea works out.
The first problem I am trying to figure out is whether it is possible to see the environment while a program is running. In another thread here someone was asking whether the live feed from the cameras can be accessed, although I am not sure that is the same thing. Setting up the Guardian is basically what I am trying to achieve, so technically it should be possible. It doesn't need to be live or fully detailed; some kind of collision map that gets generated on command would be more than enough. I really only need obstacles and the positions of objects displayed.
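To make the "collision map" idea concrete: on the application side it could be as simple as a coarse voxel grid of occupied cells, rebuilt whenever the scan command runs. This is just an illustrative sketch of that data structure (class and method names are made up, and it says nothing about how the Quest would supply the obstacle points):

```python
class CollisionMap:
    """Coarse voxel grid of obstacle positions, rebuilt on demand.

    Positions are (x, y, z) tuples in meters; each is bucketed into a
    cubic cell of side `cell`, so only rough obstacle locations are kept.
    """

    def __init__(self, cell=0.1):
        self.cell = cell          # voxel edge length in meters
        self.occupied = set()     # set of integer (ix, iy, iz) cell indices

    def _index(self, p):
        """Map a world-space point to its integer cell index."""
        return tuple(int(c // self.cell) for c in p)

    def mark(self, p):
        """Record that the cell containing point p holds an obstacle."""
        self.occupied.add(self._index(p))

    def is_occupied(self, p):
        """True if the cell containing point p was marked as an obstacle."""
        return self._index(p) in self.occupied
```

A scan command would then just call `mark()` for every obstacle point it gets, and the renderer could draw a cube per occupied cell.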
Then there is the question of which framework is best for this. It doesn't have to be extremely fancy; something like a very lightweight version of Tilt Brush would be all that is needed at this point: the ability to draw simple lines in 3D space, plus loading an image and placing/scaling it somewhere in the room. Later in development I can think about different brush shapes (since the light sources can have different shapes too) and color control, but to start this off, simple lines would be enough.
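Whatever framework ends up being used, the core data model for "simple lines in 3D space" is small: a stroke is a polyline sampled from the controller position, with near-duplicate samples skipped. A minimal sketch (names and the spacing threshold are my own assumptions, not from any particular SDK):

```python
import math

class Stroke:
    """One light-painting stroke: a polyline of 3D points plus a color."""

    def __init__(self, color=(255, 255, 255), min_spacing=0.01):
        self.color = color                # RGB, matching the physical LED
        self.min_spacing = min_spacing    # meters; drop samples closer than this
        self.points = []                  # ordered (x, y, z) tuples

    def add_point(self, p):
        """Append a controller position sample; returns True if it was kept."""
        if self.points and math.dist(p, self.points[-1]) < self.min_spacing:
            return False  # controller barely moved, skip the sample
        self.points.append(tuple(p))
        return True

    def length(self):
        """Total drawn length in meters (useful for exposure planning)."""
        return sum(math.dist(a, b) for a, b in zip(self.points, self.points[1:]))
```

Each frame you would feed `add_point()` the tracked controller position while the draw button is held, and render the polyline as the virtual counterpart of the light trail.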
Also a topic for later would be remote-controlling the light source in some way. Is it possible to send data to external devices (over WiFi, Bluetooth, or the USB port)? All that is needed is probably brightness and color information, so a simple Arduino with WiFi or something like that could be used to control an RGB LED.