About the Next-Mind SDK for Unity for Oculus Quest 2 development

alejandrocastan
Protege
Hi Everyone, 
I am very interested and excited to try the new Next-Mind device to develop my own VR game. Here are some reviews of the device:
https://www.youtube.com/watch?v=SMXfyZc_Gvg
Since there is not a lot of info about developing VR games with it, could anybody here who has tried it share their experiences with me?
Thanks for your time,
Alejandro

kojack
MVP
That's interesting. Shame it's about as expensive as the 256GB Quest 2.

This is an interesting alternative to the Emotiv Epoc style of brain interface.
The Epoc was a general EEG device that sensed certain mental actions via feelings. You could think about how it felt to push, pull, or lift an object, or about sensations like hot and cold. There were something like 12 of those actions it could be trained on.

The Next Mind is quite different. It only senses the visual cortex (the very back part of the brain). In VR your program flashes certain patterns (NeuroTags) over the top of objects (the flickering green lines in the video above). When you concentrate on one of these flashing patterns, the sensor can detect which one. There's only a single action: activate.

So let's say you have three buttons in VR as part of a HUD. You can look at a button and concentrate on it, and it will be clicked. Or you could look at a character in the game and it would select them. It's just a single trigger event.
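
Just to illustrate the concept (this is not the actual Next-Mind SDK API; the names below are made-up stand-ins): each tagged object exposes a single "activate" event that fires when the decoder picks it, and you wire your game logic to that.

    using UnityEngine;
    using UnityEngine.Events;

    // Hypothetical stand-in for an SDK-provided tag component: each tagged
    // object exposes a single "activate" event and nothing else.
    public class NeuroTagStub : MonoBehaviour
    {
        public UnityEvent onTriggered = new UnityEvent();

        // In a real setup the decoder would call this when it picks this tag.
        public void SimulateTrigger() => onTriggered.Invoke();
    }

    // Wires three HUD buttons so that "concentrating" on one clicks it.
    public class HudButtons : MonoBehaviour
    {
        public NeuroTagStub playButton, optionsButton, quitButton;

        void Start()
        {
            playButton.onTriggered.AddListener(() => Debug.Log("Play clicked"));
            optionsButton.onTriggered.AddListener(() => Debug.Log("Options clicked"));
            quitButton.onTriggered.AddListener(() => Debug.Log("Quit clicked"));
        }
    }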

One issue is that the pattern needs to be large-ish. Smaller patterns have lower sensing reliability.

It would make a very interesting device for making VR experiences for people with physical restrictions (in a hospital bed, limited arm movement, etc.). Think of those old first-person RPGs (Dungeon Master, Eye of the Beholder, etc.) that had movement arrows on screen for mouse control; you could have those in VR and move by concentrating on them.
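
A rough sketch of that arrows idea, reusing the hypothetical NeuroTagStub from the sketch above (again, not the real SDK API): four tags in the scene, each stepping the player when its activate event fires.

    using UnityEngine;

    // Dungeon Master style movement: four on-screen arrow tags, each one
    // stepping the player one grid square when "clicked" by concentration.
    public class ArrowMovement : MonoBehaviour
    {
        public NeuroTagStub forward, back, left, right;  // stub from the sketch above
        public Transform player;
        public float stepSize = 1.5f;   // grid-like steps, as in the old RPGs

        void Start()
        {
            forward.onTriggered.AddListener(() => Step(Vector3.forward));
            back.onTriggered.AddListener(() => Step(Vector3.back));
            left.onTriggered.AddListener(() => Step(Vector3.left));
            right.onTriggered.AddListener(() => Step(Vector3.right));
        }

        void Step(Vector3 dir)
        {
            // Move relative to where the player is currently facing.
            player.position += player.TransformDirection(dir) * stepSize;
        }
    }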

For general gaming use though, I think the flashing green lines over the top of everything interactable would drive me crazy after a while. 🙂
Author: Oculus Monitor, Auto Oculus Touch, Forum Dark Mode, Phantom Touch Remover, X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

alejandrocastan
Protege
Hi Kojack,
First, thanks for your comment.
On the other hand, your idea about using it to make VR experiences for people with physical restrictions (in a hospital bed, limited arm movement, etc.) is very interesting, but I have some doubts about whether now is the right moment to get one.
I asked a Next-Mind owner on Reddit about integrating it into my VR environment, and he told me the following:

"I don't think Next-Mind will be the right tool for such an integration. I saw something about Valve's next VR headset including a built-in BCI. Not a lot is known now, but ideally we will see dev kits for that in 2022."

For that reason, I think I will wait for comments from Unity/Oculus devs and other reviews of it before I buy.
Maybe it can be a useful tool for my health project, but I feel it would be a good decision to wait for its second version.
Thanks again for your help 
Best Regards
Alejandro



DrGenome
Explorer
Hello Alejandro,
I do not know the details of your health project, but I got the Dev Kit yesterday and it was a lot of fun. It took some tinkering, but a couple of hours later I had my own very (very) simple interactive experience where I could press three different buttons just by looking at them. While this is extremely simple, it is extensible and transferable to a more complex application, where you could use up to 10 different NeuroTags as triggers for different events. The trigger is not instantaneous, but you can simply look at the flashing textures and the event will fire in about 1-2 seconds. Great concept tool for development.
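
Since the trigger takes a second or two, it helps to show the user something while the decoder makes up its mind. A minimal sketch, assuming you can obtain a 0-1 confidence value from the SDK or from your own test stub (the real API may expose this differently):

    using UnityEngine;
    using UnityEngine.UI;

    // Fills a radial UI ring over the flashing texture while the decoder
    // "charges up", so the 1-2 second delay doesn't feel like a dead UI.
    public class TagFeedback : MonoBehaviour
    {
        public Image fillRing;   // radial Image placed over the NeuroTag
        float confidence;        // latest 0-1 value from the decoder

        // Hook this up to whatever confidence callback your setup provides.
        public void OnConfidenceChanged(float value)
        {
            confidence = Mathf.Clamp01(value);
        }

        void Update()
        {
            fillRing.fillAmount = confidence;   // visualize the ramp-up
        }
    }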

In regards to VR development, this can certainly be an alternative interaction method for users with physical restrictions. However, it really depends on the type of restriction. For instance, if the user can't operate traditional VR controllers due to limited mobility, the next alternative would be controllers that do not require motion, such as a console gamepad or controllers specially designed for specific disabilities.
Next, if the user does not have any arm mobility but can still rotate the head, then gaze interaction is a completely feasible alternative, and it is well developed within Unity. It still lets the user teleport to navigate around environments and gaze-click to interact with objects.
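
For reference, a dwell-based gaze click needs nothing beyond stock Unity; a minimal sketch (the OnGazeClick message name is just a convention made up here):

    using UnityEngine;

    // Gaze "dwell click": cast a ray from the head, and if it rests on the
    // same object long enough, treat that as a click. Head rotation only.
    public class GazeClicker : MonoBehaviour
    {
        public Camera head;                 // the VR camera
        public float dwellSeconds = 1.5f;   // how long a gaze counts as a click

        GameObject current;
        float dwell;

        void Update()
        {
            if (Physics.Raycast(head.transform.position, head.transform.forward,
                                out RaycastHit hit, 50f))
            {
                if (hit.collider.gameObject == current)
                {
                    dwell += Time.deltaTime;
                    if (dwell >= dwellSeconds)
                    {
                        hit.collider.SendMessage("OnGazeClick",
                            SendMessageOptions.DontRequireReceiver);
                        dwell = 0f;         // re-arm for the next click
                    }
                }
                else
                {
                    current = hit.collider.gameObject;  // new target, restart timer
                    dwell = 0f;
                }
            }
            else
            {
                current = null;
                dwell = 0f;
            }
        }
    }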
Finally, if the user does not have head mobility, the VR experience can still provide an exciting field of view, and here is where the Next-Mind could help with interactions. Without voice commands or traditional gaze interactions (which require head movement), the user would need to select clickable objects just by looking at them, which could be accomplished either by eye tracking or by EEG signal processing of the visual cortex. For example, using the Next-Mind Dev Kit the person could teleport to different locations and even rotate the view by activating arrows placed around the scene. I hope this helps!

alejandrocastan
Protege
Hi @germanvargas,
First, thanks a lot for sharing your experience with the Next-Mind device and your ideas about how to use it with people with health problems. At the moment we are making VR games for people with motor, cognitive, visual, auditory, and language problems. If you would like to know more about our VR Health project, please see our website for more details:
https://vrhealth.health/
I am very excited to try it, but unfortunately the developer does not ship to my country (Argentina), so I will have to see how to solve this problem.
If you have any other ideas about the Next-Mind device that fit our project's needs, please let me know.
Thanks one more time.
Have a nice weekend!
Best Regards
Alejandro

DrGenome
Explorer
Very interesting VR Health project! I would be glad to chat in more detail and brainstorm opportunities. Feel free to reach out: germanvargas@yahoo.com.

Kind regards,

German

alejandrocastan
Protege
Hi German,
Thanks for your kind words about our project.
Yes, of course, I will contact you soon.
Thanks again for offering us your help!
Kind regards,
Alejandro


Nekto2
Superstar
Could it detect an eye close/open faster than the 1-2 sec trigger?
Would it detect if one looks at a pattern and then briefly closes and opens the eyes? Or just one eye?
A double click would then be two open/close blinks one after another.
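
Since the device only reports a single "activate" event per tag, a double click would have to be layered on top in software, e.g. two activations of the same tag within a time window. A sketch (given the ~1-2 second trigger time reported above, the window has to be generous):

    using UnityEngine;
    using UnityEngine.Events;

    // Turns two activations of the same tag within a short window into one
    // "double click" event.
    public class DoubleTrigger : MonoBehaviour
    {
        public float window = 3f;   // max seconds between the two activations
        public UnityEvent onDoubleClick = new UnityEvent();

        float lastTrigger = float.NegativeInfinity;

        // Wire this to the tag's single activate event.
        public void OnTagTriggered()
        {
            float now = Time.time;
            if (now - lastTrigger <= window)
            {
                onDoubleClick.Invoke();
                lastTrigger = float.NegativeInfinity;   // reset after firing
            }
            else
            {
                lastTrigger = now;
            }
        }
    }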

Could it be used to auto-calibrate ordinary eye-tracking hardware, using the brain's response to patterns shown at known parts of the screen as feedback?

alejandrocastan
Protege
Hi Nekto2,
Thanks for your suggestions.
Best Regards
Alejandro

Hello, I am a disabled Oculus user. I represent a charity for sports-injured quadriplegics/tetraplegics. The majority of us have good head and arm movement but very weak or no finger movement. Apps and games which use predominantly head movements would be extremely beneficial, but unfortunately they are very hard to find. Could you please suggest some apps/games which use this method? Any other suggestions would be gratefully received. Thanks, Dom