Welcome to the Oculus Developer Forums!

Your participation on the forum is subject to the Oculus Code of Conduct.

In general, please be respectful and kind. If you violate the Oculus Code of Conduct, your access to the developer forums may be revoked at the discretion of Oculus staff.

NerveGear, A Reality Not Far Away

raiogam Posts: 18
NerveGear
edited June 2015 in PC Development
Here I was, sailing the deep seas of the internet, when I suddenly came across a blog that caught my attention. I started browsing threads and found one about the NerveGear, the helmet that takes you into the virtual world of the anime Sword Art Online. My goal is not to debate the anime itself, since this is not the place for that, but to talk specifically about the helmet and explain why it is not such a distant reality. Fair warning: the text will be rather long, but those who stick with it will not regret it. Let's GO!
But... what is the NerveGear?
Many people don't watch anime, or haven't seen this one specifically, so here's a basic explanation: in 2012, the anime Sword Art Online (SAO) showed a fictitious, yet quite palpable, vision of what virtual reality technology could look like in the year 2022. The NerveGear, shown in the anime, is a helmet that reads and sends information to the brain. It reads brain impulses and turns them into commands inside the virtual world, and it sends impulses back to the brain to emulate sensation and perception. In the anime, the sending of impulses is so advanced that the helmet can disconnect the user from the real world and make them feel literally inside a virtual one. In technical terms, the helmet sends many images per second to the brain to emulate an environment (how many frames per second? I don't know, but there's no lag, so it's more than 24), along with sensations of pain, cold, etc. All this while keeping the user in a semi-coma, unable to feel, hear, or move their body in the real world, also the result of the electrical impulses sent to the brain.
When I saw this I found it very plausible, including the date, 2022. When the anime aired, though, even if many people thought it was a cool idea, there was huge skepticism in the discussions about when it would actually be possible: "2022? Not even by 2050 will we have something like that!!!" The skepticism came from the fact that nobody even seemed to be researching anything like it, and there were no virtual reality devices on any reasonable horizon (at least none widely publicized). Coincidence or not, in 2012 (the same year as the anime) the Oculus Rift appeared at a technology fair and left big names in the games industry baffled by the immersion it could deliver.
Oculus Rift?
The Oculus Rift is a device created by Palmer Luckey, a gaming enthusiast in search of a more immersive experience. The idea for the Oculus Rift came out of Palmer's frustration after testing various virtual reality devices and being disappointed by every one of them. In 2012 he presented his project at a technology fair and the response was far greater than he expected. Big fish like the owner of Steam and other companies offered millions in investment for research, development, and mass production of the device.
The Oculus Rift is not just a rectangular display in front of you; it gives you nearly the same field of vision as the human eye, which reaches almost 180 degrees. Combine that with a motion sensor that emulates human vision when you turn your head.
After the capital investment from large companies, the project set out to transform what was a prototype (which could be obtained through a donation of more than $300 to the project on Kickstarter) into a commercially viable device. Then came the changes and improvements, and guess what? SUCCESS!
The biggest problem the company behind the Oculus Rift still faces (or faced) is the motion sickness that many people felt after prolonged use of the device (some had no problems at all). According to analysts, it was caused by the absurd level of immersion the device produces. To put it simply, the sense of immersion is so strong that your brain thinks it is seeing something real; but since the prototype still lags from time to time and still cannot graphically convey the fluidity of the real world (those responsible for the device say this has been almost completely corrected in the final version), the human brain gets confused, and that is where the nausea and malaise come from. This also occurs in first-person games; it is quite common. They say the final version fixed this and other outstanding issues, but I never had the opportunity to test it... Knowing technology, though, I am pretty sure the device still has several bugs and "things" that will need to be improved. "Things" that will be fixed in the second version of the device, which will be better than the first (the same will happen from version 2 to 3 and so on, but not as dramatically as from version 1 to 2). It is the same as with new processor architectures: the jump from the first to the second version is always the biggest.
Still not enough!
Emulating the environment for the eyes alone will not give an experience close to SAO, since you would still need a manual control (mouse or joystick) to move. A solution for this already exists, as the video below shows, through the simultaneous use of the Oculus Rift with the Emotiv EPOC and the Razer Hydra, all available for purchase on the internet.
The Emotiv EPOC can read the electrical signals produced by the brain when we think, feel, or make facial expressions. But how does it work? Example: when you think "move forward," the device detects the brain-wave pattern created by that thought and assigns a code to it, like 11101010; the device's software then converts that pattern into a simpler reading, mapping the code to, say, 1, which in the program's language could be the "walk forward" command. The process is repeated for the thoughts "move backward," "sideways," etc. After this calibration, the game software reads the input sent by the EPOC software (the pattern converted to 1, 2, 3, etc.) and makes the virtual avatar walk forward, backward, or sideways, depending on the code sent. Every person is different, so before using one of these you have to calibrate your own brain impulses to the device: "walk forward" might be detected as code 11153612 for you and 11126001 for someone else. In other words, the impulses may look similar, but they are never the same in two people. Direction is given by the user's head, since the Oculus Rift has a motion sensor. And since the Emotiv EPOC still does not seem accurate enough to move your arms with agility, the Razer Hydra is used to emulate the movement of hands and arms in the game. You can find many videos of users calibrating and using the EPOC; they're worth checking out.
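The calibration loop described above can be sketched in a few lines of code. This is a hypothetical toy, not the Emotiv SDK: the command names, feature vectors, and nearest-template matching are all illustrative stand-ins for what a real EPOC pipeline does.

```python
# Toy calibration: store an average feature vector ("template") per mental
# command, then classify new samples by nearest template. All values are
# made up for illustration; a real BCI uses far richer signal features.
import math

def calibrate(samples_per_command):
    """samples_per_command: {command: [feature_vector, ...]} -> templates."""
    templates = {}
    for command, samples in samples_per_command.items():
        n, dims = len(samples), len(samples[0])
        templates[command] = [sum(s[d] for s in samples) / n for d in range(dims)]
    return templates

def classify(templates, features):
    """Return the command whose template is closest (Euclidean) to features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda cmd: dist(templates[cmd], features))

# Toy calibration session: two repetitions of each imagined movement.
session = {
    "forward": [[0.9, 0.1, 0.0], [1.1, 0.0, 0.1]],
    "back":    [[0.1, 1.0, 0.1], [0.0, 0.9, 0.0]],
}
templates = calibrate(session)
print(classify(templates, [1.0, 0.05, 0.05]))  # -> forward
```

This also illustrates why calibration is per-person: the templates come entirely from that user's own recorded samples.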
Wouldn't it be dangerous?!
Of course, the main part of SAO still doesn't exist (or at least isn't for sale on the internet yet): a device that sends waves to the brain to emulate and control sensations and perceptions. But from the moment reading the brain becomes simple, I don't see why writing to it would stay complicated. If the device can record the impulse coded as 11122210 as "arm pain," nothing prevents it from being adapted to send that 11122210 impulse to your brain to make you think your arm hurts. The biggest problem is the dangers that "writing to the brain" brings (SAO even showed some of them, like holding you in a coma and deleting or manipulating your memories). Thinking logically, though, these devices that send impulses to the brain should come with "engines" incapable of creating impulses strong enough to be harmful, so that even if they malfunction, the signal they send is so weak that it cannot cause real damage to the person's brain. But we will have to wait to be sure.
The Perfect World
I've been sticking to gaming here, but obviously the applications of such a device would go far beyond that. You could travel to France without leaving home, meet your boyfriend or girlfriend in a long-distance relationship (and feel far less lonely), gather with family even if they're on the other side of the planet, or even create your own private virtual world. The porn industry would profit absurdly by creating games with realistic virtual companions (even before a sensation emulator is up and running). For whoever sees the potential: there is already a game with realistic graphics and women created from motion capture of real actresses in development for the Oculus Rift!
My fear is that in the future people will lock themselves inside these virtual worlds, which will be better than their reality in most cases. This already happens with online MMORPGs, but the immersion there is still very low; it feels more like controlling a puppet than actually being inside the game. Now, when you feel you are inside it, can swap your out-of-shape body for a muscular one (or a model's body, for women), and have a partner or companion (an NPC with decent AI, created by you) by your side 100% of the time, perfect both aesthetically and in personality, will you ever want to leave? I'm afraid of the answer, especially for individuals with a weak or disturbed psyche. And for people with a less fertile imagination: imagine you could have your favorite actor, actress, or idol beside you, "doing whatever you want" all day, with sensations very close to the real world. Would you want to leave that place?

The temptation is so great that I would even bet on the device locking itself after a certain number of hours of use, to prevent people from shutting themselves up in their private virtual worlds and refusing to leave the house, go to work, or even feed themselves properly. The problem is that, just as people jailbreak consoles to play pirated games, nothing stops them from discovering how to unlock something like this too.
But when is it coming? I'm dying to buy one!

Taking into account that the beta test of virtual reality will start in a few years with the launch of Morpheus (Sony's virtual reality headset), it is not hard to believe that by 2022 we will have something very close to what was shown in SAO. After all, look where cell phones were 10 years ago: they only made calls. Today they are handheld computers with more processing power than a top PC from 10 years ago. About four years ago, when the first smartphones launched, people doubted they could ever have console-level graphics... Has anyone played Infinity Blade 3? Or Real Racing 3? Neither is far behind the graphics of an Xbox 360.
All we can do now is wait and prepare, because the future of virtual reality starts in 2014. And if there is one thing everyone who has tested the Oculus Rift agrees on, it is that it will change entertainment and gaming as we know them.
What do you think?
And you, what do you think of all this? What are your expectations for 2022 in terms of technology?
Well, if one day it really does become a reality, which is quite likely, I would probably use my time in the real world only to work and spend the rest in a virtual world. I must say I only partly approve of that.
It is also likely that it won't be as striking as in SAO itself. At first, a huge revolution, everyone buying one... Then it would become something very casual. But of course there would be people (whom I consider SICK) who spend all day playing, just like in current MMORPGs.
I think the question of risks to the brain would be the biggest problem. No matter what instructions came with it, or what timers and regulators made the simulation "less real," there would always be tricksters trying to find a way around them. "What's forbidden is always more tempting," as Davy Jones would say. It's like Nintendo: the cartridge said not to blow into it, and when the cartridge didn't work, what did people do?
Of course, I'm also looking forward to it; what I want most is a bit of adrenaline and action. I've always enjoyed more hardcore games (where you die many times before getting through each part, like Dark Souls), so I would love to see an SAO-type game focused on close physical combat and little magic. As a fan of medieval adventure, nothing would please me more than using a sword and shield, or dual blades, and going out killing monsters, bosses, and players while feeling it was really me there doing everything. :geek: :ugeek:

Comments

  • raiogam Posts: 18
    NerveGear
    The NerveGear (ナーヴギア, Nāvugia?) is the second generation of FullDive technology made by Kayaba Akihiko, released in May 2022.

    Appearance
    The NerveGear is a streamlined helmet coated in dark blue. At the back, it has a wire of the same color stretched out of a long pad. It also has a battery and internal memory to store data from the games. 30% of the NerveGear's weight is from its internal battery.

    Transceivers
    The NerveGear's high density microwave transceivers can determine what the user's face looks like. The transceivers not only block every transmission from the brain to the body, but also from the body to the brain; thus, while using it, the player is completely insensate to the physical world. While details are never specified, the powerful electromagnet is able to destroy a person's brain, and SAO's operating system has been programmed to do so if the player's hit points are reduced to zero. Although the exact method that kills a player is never disclosed in the story, some of the signs left behind in the brain are cerebral hemorrhaging and arterial occlusion. The NerveGear is equipped with a new generation diamond semiconductor central processor.

    Chronology
    At the time of the Sword Art Online beta test, there were about 200,000 players in possession of a NerveGear. Half of them applied for the beta test.

    After the events of the Death Game incident, Argus no longer produces new NerveGear units, and most users instead use RECT's replacement for the system - the AmuSphere, which has significantly less powerful transceivers; it is thus incapable of doing harm to a player, but cannot completely eliminate normal sensory input. 

    Kirito is one of the few VRMMO players who used a NerveGear even after the SAO incident, though this is likely due to time and monetary constraints in getting an AmuSphere, as he wanted to check out the possibility of Asuna being trapped in ALfheim Online. Eventually, even Kirito switched to using an AmuSphere after the ALO incident.

    After the SAO Incident, NerveGear units were seized and disposed of in accordance with the Japanese Code of Criminal Procedure, Article 121.

    Even though the NerveGear is considered obsolete, its base technology was used to develop other VR devices, such as the Medicuboid and the Soul Translator.

    Usage
    A NerveGear cartridge.

    To use the NerveGear, the player wears the game console over the head. It is then recommended that the player find a comfortable position for the body, commonly a bed. Afterwards, the game is loaded by speaking the initiation words «Link Start».

    Known Users
    Kirito wearing a NerveGear unit.

    Sword Art Online beta testers
    Sword Art Online players (Heathcliff might have used an advanced version of the NerveGear instead)

    Trivia
    In the Accel World series, written by the same author as Sword Art Online, volume 4, chapter 8 of the novel mentions an unnamed VR machine that is similar in background to the NerveGear. Episode 22 of the anime adaptation of the series confirms that the unnamed machine is a NerveGear.
    Kazuto is one of the few people who still have a NerveGear after the SAO incident. He was able to keep his hardware due to negotiations with Kikuoka Seijirou to let him keep it even though all copies of the NerveGear were meant to be disposed of.
  • raiogam Posts: 18
    NerveGear
    edited June 2015


    We are a community of researchers, engineers, artists, scientists, designers, makers, and more. The one thing we all have in common? We share an unfaltering passion for harnessing the electrical signals of the human brain and body to further understand and expand who we are. As our community continues to grow, so does the range of possibilities of what we can discover and create. What can we build together?

    OpenBCI stands for open-source brain-computer interface (BCI). The OpenBCI Board (https://openbci.myshopify.com/collections/frontpage/products/openbci-8-bit-board-kit) is a versatile and affordable bio-sensing microcontroller that can be used to sample electrical brain activity (EEG), muscle activity (EMG), heart rate (EKG), and more. It is compatible with almost any type of electrode and is supported by an ever-growing, open-source framework of signal processing applications. Check out our GitHub repos (https://github.com/OpenBCI) to learn more about the OpenBCI SDK, firmware, and various ongoing projects, and check out our docs at http://docs.openbci.com


    As humans, the biggest challenges we face in understanding what makes us who we are and the greatest advancements we'll make while deciding what we become, will not be solved by a single company, an institution, or even an entire field of science. These discoveries will only—and should only—be made through an open forum of shared knowledge and concerted effort, by people from a variety of backgrounds. We work to harness the power of the open source movement to accelerate ethical innovation of human-computer interface technologies.





    https://www.kickstarter.com/projects/op ... video.html
    https://github.com/OpenBCI

    http://www.openbci.com/index.php/contact
    GETTING STARTED W/ OPENBCI
    I. WHAT YOU NEED
    OpenBCI Contents
    OpenBCI board
    OpenBCI Dongle
    OpenBCI Electrode Starter Kit (ESK) or your own electrodes (not pictured)
    6V AA battery pack & (x4) AA batteries (batteries not included)
    (x4) plastic feet for board stabilization
    1. Your Board
    This tutorial can be followed if you are working with any OpenBCI board (8bit, 32bit, or 32bit with Daisy). I’ll be working with the 8bit board.
    2. Your OpenBCI USB Dongle
    The OpenBCI USB Dongle has an integrated RFDuino that communicates with the RFDuino on the OpenBCI board. The dongle establishes a serial connection (on Mac or Linux) or a COM connection (on Windows) with your computer via its on-board FTDI chip. You’ll be connecting to this serial port from the OpenBCI GUI, or from whatever other software you end up using to interface with your OpenBCI board.
    3. Your Electrode Starter Kit (ESK) Or Other Electrodes
    If you ordered an OpenBCI Electrode Starter Kit, it should come with:
    10 passive, gold cup electrodes on a color-coded ribbon cable
    4oz Jar of Ten20 conductive electrode paste
    If you plan to work with your own electrodes, the touch-proof adapter may come in handy. It will convert any electrode that terminates in the industry-standard touch-proof design to an electrode that can be plugged into OpenBCI!
    4. Your 6V AA Battery Pack & 4 AA Batteries
    Both the 8bit and 32bit boards have specific input voltage ranges. These ranges can be found on the back side of the board, next to the power supply. BE VERY CAREFUL not to supply your board with voltages above these ranges, or you will damage your board’s power supply. For this reason, we recommend that you always use the battery pack that came with your OpenBCI kit.
    5. (x4) Plastic Feet
    Your OpenBCI kit comes with 4 plastic feet that can be snapped into the holes of your board to provide extra stability while working.
    II. DOWNLOAD/RUN PROCESSING & THE OPENBCI GUI CODE
    1. Download Processing for your operating system
    Before I continue, note that you don’t need to write any code for this tutorial, though you will see all of the code that makes the OpenBCI GUI run! First, go to the Processing Downloads page and download the latest stable release for your operating system. Processing is an open source creative coding framework based on Java. If you are familiar with the Arduino environment, you’ll feel right at home; the Processing IDE is nearly identical. If not, no worries! Once it’s finished downloading, unzip it and place the Processing .app or .exe where you typically place your applications or programs. For more information on Processing, or for debugging the steps in the next section, check out the Processing Tutorials page.
    2. Download the OpenBCI GUI Processing code
    a. Download the necessary files & directories OR clone the OpenBCI/OpenBCI_Processing repo to your desktop (do this only if you’re familiar with GitHub). b. Unzip the download. It should be called OpenBCI_Processing-master after you unzip/extract it. c. Locate the Processing sketchbook directory on your computer. This should have been created automatically when you installed Processing. Depending on your operating system, this directory’s path is:
    On Windows: c:/My Documents/Processing/
    On MAC: /Users/your_user_name/Documents/Processing/
    On Linux: /Home/your_user_name/sketchbook/
    Note: this directory should be called “Processing” on Windows and Mac, and “Sketchbook” on Linux. This directory should already have a subdirectory called “libraries.” If it does not, create the subdirectory. d. Now, from the OpenBCI_Processing-master directory that you downloaded and unzipped in parts (a) and (b) above, copy the OpenBCI_GUI directory and paste it in the Processing sketchbook directory that you located in part (c) above. e. Finally, copy the controlP5 & gwoptics directories from OpenBCI_Processing-master/libraries and paste them into the libraries directory of your Processing sketchbook. f. Now everything is where it should be!
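Steps (d) and (e) above are just two directory copies, so they can be scripted. A sketch, assuming the download and sketchbook locations from parts (a)–(c); the paths are examples, so adjust them to your own machine:

```python
# Copy the GUI sketch and the two Processing libraries into the sketchbook,
# mirroring steps (d) and (e) of the tutorial.
import shutil
from pathlib import Path

def install_gui(download_dir, sketchbook):
    """download_dir: unzipped OpenBCI_Processing-master; sketchbook: Processing dir."""
    download_dir, sketchbook = Path(download_dir), Path(sketchbook)
    libraries = sketchbook / "libraries"
    libraries.mkdir(parents=True, exist_ok=True)  # create "libraries" if missing
    # (d) the GUI sketch goes at the root of the sketchbook
    shutil.copytree(download_dir / "OpenBCI_GUI", sketchbook / "OpenBCI_GUI")
    # (e) the two bundled libraries go under sketchbook/libraries
    for lib in ("controlP5", "gwoptics"):
        shutil.copytree(download_dir / "libraries" / lib, libraries / lib)
```

After running it, restart Processing so the new libraries are picked up, exactly as step 3 below says.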
    3. Open Processing & launch the OpenBCI GUI
    a. If Processing is currently open, close it. The new libraries you added won’t be recognized until you restart the application. b. Double-click any of the .pde files in the OpenBCI_GUI directory and all of the OpenBCI GUI code should open in the Processing IDE. c. Click the “run” button at the top left of the IDE, and the code should run! If it does not, make sure you installed your libraries correctly and that you are using the latest version of Processing. If you continue to have issues, please refer to the software section of our forum for help. d. Once the GUI is running, select “SYNTHETIC (algorithmic)” and hit the “START SYSTEM” button to launch the GUI with a synthetic data generator. e. Click the dark overlay on the GUI to exit the SYSTEM CONTROL PANEL and then hit the “Start Data Stream” button to begin the stream of synthetically generated EEG data. You should then see data streaming across the “EEG Data” graph on the left side of the GUI.
    III. PREPARE YOUR OPENBCI HARDWARE
    1. Make sure your FTDI drivers are installed and up-to-date
    The FTDI chip on your OpenBCI Dongle requires you to install the FTDI drivers on your machine. You may already have these installed if you’ve worked with Arduino or other USB hardware accessories. You can download the latest FTDI drivers for your operating system here. Note: you may need to restart your GUI for this to take effect. If using a Mac: when you try to install the FTDI driver, your computer may tell you that it is unable to install the application because it is from an unidentified developer. In this case, go to System Preferences > Security & Privacy and switch your settings to “Allow Applications Downloaded from: Anywhere.” You will most likely have to unlock the lock (and type in your root password) at the bottom of the Security & Privacy window before you can make this change.
    2. Plug in your OpenBCI USB Dongle
    Plug this in (facing upwards!) and you should see a blue LED light up. Note: make sure your USB Dongle is switched to GPIO 6 and not RESET; the switch should be set closer to your computer.
    3. Plug in your 6V AA battery pack (with batteries)
    Both the 8bit and 32bit boards have specific input voltage ranges. These ranges can be found on the back side of the board, next to the power supply. BE VERY CAREFUL not to supply your board with voltages above these ranges, or you will damage your board’s power supply. For this reason, we recommend that you always use the battery pack that came with your OpenBCI kit. There’s a good reason we put this notice in here twice!
    4. Switch your OpenBCI board to PC (not OFF or BLE)
    Make sure to move the small switch on the right side of the board from “OFF” to “PC”. As soon as you do, you should see a blue LED blink 3 times. If you don’t, press the reset button just to the left of the switch. If the LED still does not blink 3 times, make sure your batteries are fully charged. If you’re sure they are, consult the hardware section of our Forum. Note: it’s important to plug in your Dongle before you turn on your OpenBCI board. Sometimes, if the data stream seems broken, you may need to unplug your USB Dongle and power down your OpenBCI board. Make sure to plug your USB Dongle in first, then power up your board afterwards.
    IV. CONNECT TO YOUR OPENBCI BOARD FROM THE GUI
    1. Relaunch your OpenBCI GUI
    You may need to relaunch the OpenBCI GUI after installing the FTDI drivers.
    2. Select LIVE (from OpenBCI)
    In order to connect to your OpenBCI, you must set the data source to “LIVE (from OpenBCI)” in the first section of the SYSTEM CONTROL PANEL. Before hitting the START SYSTEM button, you need to configure your OpenBCI board (follow the steps below).
    3. Find your USB Dongle’s Serial/COM port
    In the first section of the LIVE (from OpenBCI) sub-panel, find your Dongle’s Serial/COM port name. If you’re using a Mac, its name will be in the following format: /dev/tty.usbserial-DNxxxxxx. If you’re using Windows, it will appear as COM#; on Linux, it will typically appear as /dev/ttyUSB#. Your USB Dongle’s port name will likely be at the top of the list. If you don’t see it:
    Make sure your dongle is plugged in and switched to GPIO 6 (not RESET)
    Click the REFRESH LIST button in the SERIAL/COM PORT section of the sub-panel
    Make sure you’ve installed the latest FTDI drivers, as described in section III.1
    If you’re still having trouble finding your USB Dongle’s port name, refer to the Forum about debugging your hardware connection.
    4.(optional) Edit the Playback file name
    In the DATA LOG FILE section of the LIVE (from OpenBCI) sub-panel you can specify the name of your playback file. This file name defaults to: SavedData\OpenBCI-RAW- + date/time. You can edit the name of this file by clicking in the “File Name” text field. If you’re running the OpenBCI GUI from Processing, this file will be saved at the root of your OpenBCI_GUI directory. If you’re running the OpenBCI GUI as a standalone application, it will be saved in /Contents/Java/Data/EEG_Data/. If working from a Mac, you’ll need to right-click on the OpenBCI_GUI application and select “Show Package Contents” to see the /Contents directory where your playback files are saved. After creating a playback file, it can be replayed by running the OpenBCI GUI in the Playback File data source mode. As a result, you can easily share recorded OpenBCI playback files with your friends and colleagues.
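The default file name is just a fixed prefix plus a timestamp. A sketch of how such a name can be generated; the exact date format is an assumption, so check your own SavedData directory for the real pattern:

```python
# Build a playback file name in the "SavedData/OpenBCI-RAW- + date/time"
# style described above. The strftime pattern is an illustrative guess.
from datetime import datetime

def default_playback_name(now=None):
    now = now or datetime.now()
    return "SavedData/OpenBCI-RAW-" + now.strftime("%Y-%m-%d_%H-%M-%S") + ".txt"

print(default_playback_name(datetime(2015, 6, 1, 12, 30, 0)))
# -> SavedData/OpenBCI-RAW-2015-06-01_12-30-00.txt
```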
    5. Select your channel count (8 or 16)
    The CHANNEL COUNT setting defaults to 8. If you are working with an OpenBCI Daisy Module and 32bit board (16-channel) system, be sure to click the 16 CHANNELS button before starting your system.
    6. Select your SD setting
    If you want to log data to a MicroSD card inserted into the OpenBCI Board, you can select the maximum recording time of the file in the WRITE TO SD (Y/N)? sub-panel section. This setting defaults to “Do not write to SD…” and will automatically fall back to it if you do not have a MicroSD card properly inserted into your OpenBCI Board. Note: be sure to select a file size that is larger than your planned recording time. The OpenBCI writes to the local SD in a way that enables us to write lots of data very quickly; as a result, however, we must specify how large the file will be before we begin. The technique is known as block writing.
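To see why the file size must be declared up front, here is block writing in miniature: the file is preallocated once, and each block is then overwritten in place, so writes never have to grow the file. This is a sketch with a plain file, not the firmware’s SD driver, and the 512-byte block size is illustrative:

```python
# Preallocate a fixed-size file, then write data block by block in place.
import os, tempfile

BLOCK_SIZE = 512  # bytes per block (illustrative, not the firmware's value)

def preallocate(path, n_blocks):
    """Reserve n_blocks * BLOCK_SIZE bytes (zero-filled) up front."""
    with open(path, "wb") as f:
        f.truncate(n_blocks * BLOCK_SIZE)

def write_block(path, index, data):
    """Overwrite block `index` in place; data must fit in one block."""
    assert len(data) <= BLOCK_SIZE
    with open(path, "r+b") as f:
        f.seek(index * BLOCK_SIZE)
        f.write(data)

path = os.path.join(tempfile.mkdtemp(), "eeg.bin")
preallocate(path, 4)          # file is 2048 bytes before any data arrives
write_block(path, 2, b"\xa0" * 16)
print(os.path.getsize(path))  # -> 2048
```

Note how the file is already full-size before any samples arrive; that is the trade-off the GUI’s “maximum recording time” setting is exposing.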
    7. Press “START SYSTEM”
    Now you’re ready to start the system! Press the START SYSTEM button and wait for the OpenBCI GUI to establish a connection with your OpenBCI Board. This usually takes ~5 seconds. During this time, the help line at the bottom of the OpenBCI GUI should be blinking the words: “Initializing communication w/ your OpenBCI board.” TROUBLESHOOTING: If the initialization fails, try the following steps in order:
    Make sure you’ve selected the correct serial/COM port
    Power down your OpenBCI board and unplug your USB Dongle. Then plug your USB Dongle back in and power up your OpenBCI board, in that order. Then try restarting the system by pressing the START SYSTEM button again.
    If this does not work, try relaunching the OpenBCI GUI application and redo step 2 above. Then reconfigure the SYSTEM CONTROL PANEL settings, and retry START SYSTEM.
    Make sure that your batteries are fully charged and then retry the steps above.
    If you are still having troubles connecting to your OpenBCI board, refer to the Forum for extra troubleshooting advice.
    8. Your OpenBCI is now live!
    Once the GUI successfully connects to your OpenBCI Board, click anywhere outside of the SYSTEM CONTROL PANEL to access the rest of the features of the GUI. You can now press the bright green Start Data Stream button (located at the top middle of the GUI) to begin streaming live data from your OpenBCI board. To make sure that it is responsive, after you’ve started the data stream, try running your fingers along the electrode pins at the top of your board. You should see the 8 (or 16, if you’re using a Daisy module) channels on the EEG DATA montage behave chaotically in response to you touching the pins. The headplot on the right side of the GUI should become fully saturated (turning bright red) when you do this, and all the traces of the FFT graph on the lower right should instantly shift upwards. If this is the case, congratulations; you are now connected to your OpenBCI board. It’s time to see some brain waves!
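Under the hood, what streams over that serial port is framed binary data that the GUI decodes into per-channel values. A sketch of that decoding, assuming the documented OpenBCI framing (0xA0 header byte, a sample number, 3-byte big-endian two’s-complement values per channel, 0xC0 stop byte); see the OpenBCI docs for the authoritative packet layout:

```python
# Decode one OpenBCI-style serial packet into (sample_number, channel values).
def int24(b):
    """Convert 3 big-endian bytes of two's complement to a signed int."""
    value = (b[0] << 16) | (b[1] << 8) | b[2]
    return value - (1 << 24) if value & 0x800000 else value

def parse_packet(packet, n_channels=8):
    assert packet[0] == 0xA0 and packet[-1] == 0xC0, "bad framing"
    sample_number = packet[1]
    channels = [int24(packet[2 + 3 * i: 5 + 3 * i]) for i in range(n_channels)]
    return sample_number, channels

# One synthetic 2-channel packet: sample #7, channel values +1 and -1.
pkt = bytes([0xA0, 7, 0x00, 0x00, 0x01, 0xFF, 0xFF, 0xFF, 0xC0])
print(parse_packet(pkt, n_channels=2))  # -> (7, [1, -1])
```

The raw counts would still need scaling by the ADS1299 gain to become the microvolt readings shown in the GUI.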
    V. CONNECT YOURSELF TO OPENBCI
    In this quick demo, we’ll be showing you how to set up 3 channels of electrophysiological data that reveal your heart activity (EKG or ECG), muscle activity (EMG), and brain activity (EEG)! For more information on these three signals, refer to Wikipedia:
    Heart Activity - Electrocardiography (EKG or ECG)
    Muscle Activity - Electromyography (EMG)
    Brain Activity - Electroencephalography (EEG)
    1. What you need
    Necessary:
    Ten20 conductive electrode paste (or other conductive electrode gel)
    Your OpenBCI board, USB Dongle, battery pack, and x4 AA batteries
    x6 gold cup electrodes (from your OpenBCI electrode starter kit or other). If you are using an OpenBCI electrode starter kit, use the following electrodes so as to be consistent with the GUI’s color-coding protocol:
    Black
    White
    Purple
    Green
    Blue
    Red
    Optional:
    Paper towels for cleaning excess Ten20 paste
    Medical tape (or other tape) for adding extra stability to electrodes
    Ear swabs for cleaning paste from electrodes, once you’re finished
    2. Connect your electrodes to OpenBCI
    Electrode Connections 1
    Connect the white electrode to the SRB2 pin (the bottom SRB pin). The SRB2 pin is the default “reference pin” for your OpenBCI input channels.
    Connect the black electrode to the bottom BIAS pin. The BIAS pin is similar to the ground pin of common EEG systems, but it uses destructive interference waveform techniques to eliminate the “common mode noise” of all of the active channels.
    Connect the purple electrode to the 2N pin (the bottom pin of the N2P input)
    Connect the green electrode to the 4N pin (the bottom pin of the N4P input)
    Connect the blue electrode to the 4P pin (the top pin of the N4P input)
    Connect the red electrode to the 7N pin (the bottom pin of the N7P input)
    Basic OpenBCI pin overview: The picture to the right is a perspective view of the electrode inputs that we are working with in this tutorial. The bottom pins are (N) inputs, and the top pins are (P) inputs. The default board settings look at all N channels in reference to SRB2 (the bottom SRB pin). SRB1 (the top SRB pin) can also be used as a reference, but when it is activated, it is activated for ALL channels. If using SRB1 as the reference electrode, P inputs must be used as the other input of the potential-difference measurement. Conversely, individual channels can be removed from SRB2. If a channel is removed from SRB2, it can be examined as a unique voltage potential between the N and P pins of that channel. We will be doing this for the heart measurement in this tutorial, while examining 2 EEG channels in reference to SRB2, using the channel 2 and 7 N pins. For more information on this, refer to page 16 of the ADS1299 datasheet. The ADS1299 chip is the analog front-end at the core of the OpenBCI board.
    3. Connect your electrodes to your head and body
    a) We’re going to start with the electrodes on your head. Begin by scooping Ten20 electrode paste into your white gold cup electrode. This is going to be your reference (or SRB2) electrode for the other electrodes on your head. Fill the electrode so there is a little extra electrode paste spilling over the top of the gold cup, as seen in the picture to the right. Note: use a paper towel or napkin to remove excess electrode paste as you are applying your electrodes.
    b) Now apply this electrode to either one of your earlobes (either A1 or A2 as seen on the 10-20 system image below). You can use some medical tape (or electric tape!) to give this electrode some extra stability, ensuring that it does not fall off. This electrode is the reference that all of the EEG electrodes on your head will be measured in comparison to. The uV reading that will appear in the GUI’s EEG DATA montage is a measure of the potential difference between each electrode and this reference electrode (SRB2). SRB1 (the top SRB pin) can also be used as a reference pin, but we won’t discuss that here. Check out the other docs on how to maximize the usage of the other pins!
    c) Follow the same procedure for the purple electrode and apply it to your forehead, 1 inch above your left eyebrow (as if you were looking at yourself) and an inch to the left of your forehead’s centerline. This electrode location is Fp2 on the 10-20 system, the international standard for electrode placement in the context of EEG. “Fp” indicates a “frontal polar” site.
    d) Now follow the same procedure for the red electrode and place it on the back of your head, 1 inch above the inion (as seen on the 10-20 system), and 1 inch to the left. This electrode location is O1 on the 10-20 system. The “O” stands for occipital, meaning above your occipital lobe (or visual cortex). Note: to do this, pull your hair aside and make sure the electrode is nested as deeply as possible, with the electrode paste making a definitive conductive connection between your scalp and the gold cup.
    e) Now follow the same procedure as step b) above to apply the black electrode to your other earlobe (either A1 or A2 from the 10-20 system). The black electrode is connected to the BIAS pin, which is used for noise cancelling. It is similar to a GROUND pin, which establishes a common ground between the OpenBCI board and your body, but it has some extra destructive-interference noise cancelling techniques built in! You’re now done connecting electrodes to your noggin! I like to use a cheap cotton hairband to add extra stability to all of the electrodes connected to my head, by placing it gently on top of all of the electrodes.
    f) Now connect the green electrode to your right forearm, somewhere on top of a muscle that you can flex easily. With this electrode we will be looking at both heart activity and muscle activity. I also like to use tape to hold this electrode in place. That’s going to hurt a little bit to take off. Hopefully your arms aren’t as hairy as mine…
    g) Finally, connect the blue electrode to the wrist of the arm opposite the green electrode. This will serve as the reference electrode for the green electrode. If you noticed, the blue electrode is on the pin above the green electrode. We will be removing channel 4 from SRB2 so that it is not included in the same reference signal being used to measure brain waves. The main reason for this is that the microvolt (uV) values produced by your heart and muscles are much stronger than the signals we can detect from your brain, so we don’t want these signals to interfere. I’ll go into more detail about this later on, when it comes time to adjust the channel settings in the GUI.
    4. Launch the GUI and adjust your channel settings
    a) If your OpenBCI GUI is not already running, relaunch it and configure the DATA SOURCE mode to LIVE (from OpenBCI). Refer to section IV of this guide for more information on this process. Since we are only using 3 channels, set the channel count to 8, even if you have a Daisy system. Nothing will go wrong if you start the system with 16 channels, except that the EEG DATA montage will be unnecessarily cluttered.
    b) Once you have pressed START SYSTEM and the GUI has connected to your OpenBCI device, exit the SYSTEM CONTROL PANEL and start the live data stream. You should see live data from your body (and the unattached channels) streaming into the EEG DATA montage on the left side of the GUI.
    c) Now we are going to power down the channels we aren’t using. Do this by clicking the channel number buttons on the left side of the EEG DATA montage. We are only using channels 2, 4, and 7, so power down every other channel. Don’t bother with the smaller dark grey squares to the right of the numbered buttons; they are used for impedance measuring, but we won’t go into that now. You can also power down the channels with keyboard shortcuts (1-8). Power them back up with [SHIFT] + 1-8. If you are working with a Daisy module, channels 9-16 can be powered down with q, w, e, r, t, y, u, i, respectively. You can power those channels back up with [SHIFT] + the same key.
    d) Now that you have powered down channels 1, 3, 5, 6, and 8, your EEG DATA montage should look similar to the screenshot on the right (after you relax and let the system settle).
    e) Now it’s time to optimize your OpenBCI board’s channel settings for this setup. Click the CHAN SET tab to the right of the EEG DATA tab, and an array of buttons should appear on top of the EEG DATA montage. These buttons indicate the current settings of the ADS1299 registers on your OpenBCI board. For more information on these settings, refer to pages 39-47 of the ADS1299 datasheet. We have simplified the interface through the OpenBCI firmware and OpenBCI GUI to allow easy, real-time interaction with these registers. For more information on this, please refer to our doc page regarding the ADS1299 interface. By deactivating channels 1, 3, 5, 6, and 8, those channels were automatically removed from the BIAS and SRB2, so as not to interfere with the signal. The only thing left to do is update channel 4, the input we are using for EMG and EKG. Begin by clicking the PGA Gain button for channel 4 until it is set to x8. Then remove it from the BIAS and SRB2. We do this because the uV values for EMG and EKG are much bigger (and easier to pick up) than the EEG signals on channels 2 and 7. As a result, we want to prevent channel 4 from influencing the common-mode noise rejection of the BIAS, as well as remove it from the EEG reference channel (SRB2).
    f) After updating these settings, click the EEG DATA tab again, and your EEG DATA montage should now appear similar to the image on the right. Notice that you no longer see the heart-beat artifacts in channels 2 and 7. Additionally, the heart-beat signal in channel 4 should be steadier, looking more like a typical EKG signal.
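    The GUI handles all scaling for you, but it can help to see roughly how the PGA gain relates raw samples to the uV values on screen: the ADS1299 digitizes against a 4.5 V reference, so each signed 24-bit count corresponds to (4.5 / gain) / (2^23 - 1) volts. A minimal Python sketch of that relationship (the function name is ours for illustration, not part of any OpenBCI API):

    ```python
    # Sketch: converting raw 24-bit ADS1299 counts to microvolts.
    # Assumes the ADS1299's 4.5 V reference; names are illustrative.

    V_REF = 4.5                  # ADS1299 reference voltage, in volts
    ADC_FULL_SCALE = 2**23 - 1   # 24-bit signed full scale

    def counts_to_microvolts(raw_count: int, gain: int = 24) -> float:
        """Convert one signed 24-bit sample to microvolts at the given PGA gain."""
        return raw_count * (V_REF / gain / ADC_FULL_SCALE) * 1e6

    # At the default gain of x24, one count is ~0.022 uV;
    # at the x8 gain we set for the EMG/EKG channel, one count is ~0.067 uV.
    print(counts_to_microvolts(1, 24))
    print(counts_to_microvolts(1, 8))
    ```

    This is also why a higher gain suits the tiny EEG signals, while the stronger EMG/EKG channel can use x8 without clipping.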
    5. Minimizing noise
    So there’s a good chance your current setup isn’t showing clean data like the screenshots above. There are a number of possible reasons for this. We’ll go through troubleshooting them here.
    Get rid of AC noise: Get rid of 60 Hz noise (or 50 Hz if you’re in Europe or any country that operates on a 50 Hz power grid). The OpenBCI GUI has a built-in notch filter that does a decent job of eliminating 60 Hz noise. You can adjust the notch filter to 50 Hz by clicking the “Notch 60 Hz” button. Additionally, if your OpenBCI board is on a table with any power cords or devices that are plugged into a wall outlet, move it to a location away from any electronic devices plugged into the wall. This will drastically reduce the alternating current (AC) influence on your signal.
    Stabilize your electrodes: Make sure your electrode cables are steady. If you shake the electrodes that are dangling from your head/body, you’ll notice that it severely affects the signals. This movement noise is something that could be greatly improved with “active” electrodes, but when using the “passive” electrodes that come with the OpenBCI electrode starter kit, you have to be very careful to remain steady while using the system in order to produce the best signal. Sometimes, I’ll bind all of the electrode cables together with a piece of electric tape to secure them and minimize cable movement. If you do this, don’t worry about including the blue and green electrodes in the bundle, since movement noise doesn’t affect the EMG/EKG signal as significantly.
    Ensure that your electrodes are connected securely (especially your reference)!
    Make sure your OpenBCI hardware is streaming data properly: Every so often, an error will occur with the wireless communication between your OpenBCI Dongle and board. If you’ve followed all of the steps above, and the data that you are seeing in the GUI is still illegible, try the following: power down your OpenBCI board and unplug your USB Dongle. Then plug your USB Dongle back in and power up your OpenBCI board, in that order. Then try restarting the system by pressing the START SYSTEM button again.
    Further troubleshooting: If you’re still having issues, refer to the Forum for further troubleshooting techniques.
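    To see what the notch setting is doing, here is a minimal pure-Python sketch of a 60 Hz biquad notch filter. The coefficients follow the standard audio-EQ-cookbook form; this is an illustration of the technique, not the GUI’s actual filter code, and all names are ours:

    ```python
    import math

    def notch_coeffs(f0, fs, q=30.0):
        """Biquad notch coefficients (standard audio-EQ-cookbook form),
        normalized so a[0] == 1."""
        w0 = 2 * math.pi * f0 / fs
        alpha = math.sin(w0) / (2 * q)
        a0 = 1 + alpha
        b = [1 / a0, -2 * math.cos(w0) / a0, 1 / a0]
        a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]
        return b, a

    def apply_biquad(x, b, a):
        """Direct-form-I filtering of a list of samples."""
        y = []
        for n, xn in enumerate(x):
            x1 = x[n - 1] if n >= 1 else 0.0
            x2 = x[n - 2] if n >= 2 else 0.0
            y1 = y[n - 1] if n >= 1 else 0.0
            y2 = y[n - 2] if n >= 2 else 0.0
            y.append(b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2)
        return y

    fs = 250.0  # OpenCI boards sample at 250 Hz by default
    t = [n / fs for n in range(int(fs) * 4)]
    mains = [math.sin(2 * math.pi * 60 * ti) for ti in t]  # pure 60 Hz "mains" noise
    b, a = notch_coeffs(60.0, fs)
    filtered = apply_biquad(mains, b, a)
    # Once the filter settles, the 60 Hz component is almost entirely removed:
    print(max(abs(v) for v in filtered[len(filtered) // 2:]))
    ```

    A signal in the EEG band (say 10 Hz) passes through this filter nearly untouched, which is the point: the notch removes only a narrow band around the mains frequency.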
    VI. CHECK OUT YOUR BODY’S ELECTRICAL SIGNALS!
    Congratulations! If you’ve made it this far, it’s finally time to check out your body’s electrophysiological signals!
    1. Check out your heart activity (EKG)
    Channel 4 in the GUI should now be producing a nice steady succession of uV spikes. This is your heart beating! Try taking slow, deep breaths and watch how it influences your heart rate. If you look carefully, you may notice your heart beat more rapidly as you’re inhaling, and more slowly as you’re exhaling. For more information on how to analyze an electrocardiography (EKG) signal, or on how to set up a full EKG (with 10 electrodes), check out the Wikipedia page on EKG. The image to the right (pulled from the Wikipedia page) shows the various segments of a single heart beat.
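    As a toy illustration of how those uV spikes could be turned into a beats-per-minute number, here is a hedged Python sketch using simple threshold crossing on synthetic data (this is not something the GUI does for you, and all names and thresholds are illustrative):

    ```python
    # Sketch: estimating heart rate from an EKG-like trace by counting
    # threshold crossings (R-peaks). Synthetic data; values are illustrative.

    def detect_peaks(samples, fs, threshold, refractory_s=0.25):
        """Return sample indices where the signal crosses above `threshold`,
        ignoring crossings within a refractory window of the last peak."""
        peaks = []
        refractory = int(refractory_s * fs)
        for i in range(1, len(samples)):
            if samples[i - 1] < threshold <= samples[i]:
                if not peaks or i - peaks[-1] > refractory:
                    peaks.append(i)
        return peaks

    def heart_rate_bpm(peaks, fs):
        """Average beats per minute from successive R-R intervals."""
        if len(peaks) < 2:
            return 0.0
        intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
        return 60.0 / (sum(intervals) / len(intervals))

    # Synthetic "EKG": one spike every 0.8 s (i.e. 75 bpm) at fs = 250 Hz.
    fs = 250
    samples = [0.0] * (fs * 8)
    for beat in range(0, len(samples), int(0.8 * fs)):
        samples[beat] = 1.0

    peaks = detect_peaks(samples, fs, threshold=0.5)
    print(round(heart_rate_bpm(peaks, fs)))  # → 75
    ```

    Real EKG is noisier than this spike train, which is why the refractory window matters: it stops one wide R-peak from being counted twice.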
    2. Watch your muscles flex (EMG)
    Now, try flexing your forearm (or whatever muscle you placed the green electrode on top of). You should see a high-amplitude, high-frequency signal introduced into channel 4. This is the electric potential created by activating your muscle! If you relax your muscle again, you should see the channel 4 signal return to just your heart beat (EKG only). The picture on the right shows this transition. When you’re flexing your muscle, the electrode is picking up EMG and EKG at the same time. After you relax your muscle, the high-frequency signal disappears, and you’re able to see just the EKG.
    3. Eye blinks and jaw clenching (more EMG)
    Now blink your eyes a few times. Each time you blink, you should see a strong spike on the EEG DATA montage. It should be most visible in channel 2, the channel for the electrode directly above your eye! This uV spike is a result of the muscles in your forehead that make your eyes blink. Now try clenching your jaw. You should see a big uV spike in both channels 2 and 7. Each time you clench your jaw, you are introducing a strong EMG artifact into any electrodes on your scalp. If you put your fingers on the side of your head (above your ear) and clench your teeth, you should be able to feel the muscles in your head flexing. In the photo to the left, you can see what these signals look like: the green highlighted region shows a single eye blink, and the two blue sections show an extended period of jaw clenching. It’s interesting to note that these signals are not picked up in channel 4. This is because channel 4 is only looking at the potential difference across your body—from your right forearm to your left wrist. As a result, the EMG/EEG artifacts being produced on your head (in reference to SRB2) are not visible in this channel.
    4. Brain waves (alpha) with OpenBCI!
    Now, for what we’ve all been waiting for… let’s check out some brain waves! First, deactivate channel 4 so that you are only looking at the EEG channels (2 and 7). It’s best to do this portion of the tutorial with a friend; you’ll understand why in a second. It just so happens that the easiest way to consciously produce brain waves is by closing your eyes. When you do this, your occipital lobe (the part of your brain responsible for processing visual information) enters into an alpha wave state at a frequency between 7.5 and 12.5 Hz. Alpha brain waves are the strongest EEG brain signal! Historically, they are thought to represent the activity of the visual cortex in an idle state. An alpha-like variant called mu (μ) can be found over the motor cortex (central scalp) that is reduced with movement, or the intention to move [Wikipedia]. For more information on alpha waves, check out Wikipedia and Chip’s EEG Hacker blog post about detecting alpha waves with OpenBCI V3. Once you’ve closed your eyes, have your friend press the ‘m’ key on your keyboard to take screenshots. Tell him or her to wait until a strong alpha spike emerges on the Fast Fourier Transform (FFT) graph, the graph in the lower right of the GUI. The spike should be somewhere between 7.5 and 12.5 on the x-axis of the FFT graph, indicating that there is a strong presence of waves in that frequency range. After you’ve taken a few good screenshots, open up the .JPGs and take a look. Note: the screenshots are located in the root directory of your application, or in the OpenBCI_GUI directory if you are working from Processing. You’ll notice that the strongest alpha wave signals should be appearing in channel 7, the O2 (O standing for occipital) electrode on the back of your head. Count the number of waves in a single 1-second time period on channel 7 of the EEG DATA montage. The number of waves should correspond to the x-axis position of the spike on the FFT graph.
    If you’ve identified your alpha waves, congratulations! You’ve now seen your first brain waves with OpenBCI!
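    The GUI’s FFT graph does this analysis for you, but the correspondence between “waves counted per second” and the position of the FFT peak can be sketched with a naive DFT in Python. This uses a synthetic 10 Hz “alpha” wave rather than real board data, and the function name is ours:

    ```python
    import math

    def dominant_frequency(samples, fs):
        """Return the frequency (Hz) of the DFT bin with the largest magnitude,
        ignoring the DC bin. Naive O(n^2) DFT -- fine for a short window."""
        n = len(samples)
        best_k, best_mag = 1, 0.0
        for k in range(1, n // 2):
            re = sum(samples[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = -sum(samples[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            mag = math.hypot(re, im)
            if mag > best_mag:
                best_k, best_mag = k, mag
        return best_k * fs / n

    # One second of a synthetic 10 Hz "alpha" wave at OpenBCI's 250 Hz rate.
    # Ten full cycles fit in the window, so the peak lands on the 10 Hz bin.
    fs = 250
    samples = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
    print(dominant_frequency(samples, fs))  # → 10.0
    ```

    Counting ten waves in one second of the channel 7 trace and seeing the FFT spike at 10 on the x-axis are two views of the same measurement.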
    5. What’s next?
    For more ideas on what to do next, check out Chip’s blog EEG HACKER and the other OpenBCI Docs pages. Also, if you have a great follow-up tutorial to this getting started guide or something else you want to share, feel free to create your own by following the format we have in the Docs repo of our GitHub. It’s really easy to create your own Docs page with a Markdown editor like Mou. If you do so, send us a pull request on GitHub and we’ll add your tutorial to the Docs! If you have troubleshooting questions, be sure to visit the OpenBCI Forum. For all other inquiries, contact us at [email protected].
    http://www.openbci.com
  • raiogamraiogam Posts: 18
    NerveGear
    The Thync Approach

    A soothing neck massage. A splash of cold water. A kiss from someone you love. Each action influences peripheral nerves in your head and face, signaling brain regions to change the way you feel. Thync works using the same pathways by delivering low-level electrical pulses to these nerves.

    Every day, your body balances the activity between your sympathetic and parasympathetic nervous systems. The sympathetic system is associated with a "fight or flight" response to help regulate your reaction to stress. The parasympathetic system counteracts stress to help you enter a relaxed "rest and digest" mode.

    Thync uses neurosignaling to activate specific cranial and peripheral nerves to influence this balance and shift you to a state of calm or give you a boost of energy in minutes.


    Neurosignaling: The Science
    Neurosignaling is the coupling of an energy waveform to a neural structure (receptor, nerve or brain tissue) to modulate its activity.

    Neurosignaling waveforms or Vibes consist of precise algorithms that bias activity of the sympathetic and parasympathetic nervous systems, so that you can enjoy a shift into a more energetic or relaxed state.

    Neurosignaling builds upon the best features of long-standing tDCS and TENS techniques by using pulsed currents with lower-intensity and higher-frequency outputs delivered through bio-compatible materials for greater safety and comfort.

    At Thync, we have developed proprietary neurosignaling technology that delivers signals to the brain through three neural pathways:



    Thync System: The Technology
    The Thync System is an integrated set of components innovatively designed to deliver neurosignaling Vibes. Following years of development, engineering and testing, the Thync System is uniquely designed to fit into active lifestyles.

    The Thync Module and Strips represent groundbreaking advances in technology. From the elegant curved design and fit of the lightweight Module, to the bio-materials used in Thync Strips, to the simplicity of the App, the System is designed to be easy, comfortable and effective in a variety of lifestyle applications.


    Safety
    Thync is the result of years of research and development by Thync neuroscientists and engineers. Thync Vibes were safely tested on several thousand individuals under a variety of conditions to optimize their performance and comfort.

    The Thync System builds upon more than 40 years of extensive research, documentation and consumer use that supports the safety and tolerability of our limited output neurosignaling approach.

    The Thync System is a low-risk transdermal neurostimulation device intended for lifestyle use at home, work, or in wellness applications to temporarily induce mental relaxation or calmness or to temporarily increase energy, awareness, and alertness. The Thync system is a safe and low-risk device. It is not intended to treat or diagnose any disease or medical condition.

    Based on intended use and output characteristics, the FDA notified Thync that its device is not subject to medical device regulations requiring pre-market clearance or approval.

    Safety Study — PDF

    Testing
    Thync Vibes are the culmination of testing on thousands of subjects conducted internally by our scientists, externally through our university collaborators, and by our early adopters in the real world. We capture, record, and analyze data such as heart rate, heart rate variability, galvanic skin response, pupil diameter, and EEG to quantify how Vibes influence the parasympathetic and sympathetic nervous systems. We monitor biometric signals, assay psychophysiological variables, and conduct psychometric evaluations.

    We have developed a technology that can consistently, and in a statistically reliable manner, beat the placebo effect. Our studies incorporate the use of placebo controls in blind tests under a variety of experimental conditions. All studies are performed using IRB-approved protocols and procedures.
    http://www.thync.com/
  • Maddox_J1Maddox_J1 Posts: 1
    Hey! I'm jermiah and I was wondering...um I was thinking Sony was making a VR or um a rift right? Um so what do you think about that? Will it be like Sao? Or something like that
  • YbalridYbalrid Posts: 247
    Art3mis
    Maddox_J1 said:
    Hey! I'm jermiah and I was wondering...um I was thinking Sony was making a VR or um a rift right? Um so what do you think about that? Will it be like Sao? Or something like that
    Sony's making a VR Headset for Playstation 4. It's called PSVR and it's out this fall.
    https://www.playstation.com/en-us/explore/playstation-vr/



    CV1+Touch; Running on GTX980 - 4770K - 32GB DDR3

    I'm writing a C++ open-source game engine for the Rift http://annwvyn.org/ - https://github.com/Ybalrid/Annwvyn
    For now it's only used for my student projects at my engineering school http://en.esiea.fr/
  • jason.bequette.71jason.bequette.71 Posts: 1
    NerveGear
    can't wait till SAO comes out for real
  • SkomasSkomas Posts: 1
    NerveGear
    edited January 2018
    Even though the NerveGear is not possible with current technology (though it may not be far away), Ordinal Scale can already be achieved. Through the use of global positioning software and augmented reality, it would be possible to overlay reality with a virtual layer by creating a virtual image that sits just barely on top of, or overlaps the edge of, any object in reality. By doing so, we would experience a fully virtualized world with the exact same dimensions as the real world, while being able to move our bodies to play inside of it. While overlaying images onto reality, so to speak, it would also be possible to create extra objects, each with its own AI, that would roam around in the virtual realm and be interactable via the use of special objects or a suit recognized by the satellites. The only problems that can't be managed are the fact that monsters are just images that you can see but not touch, and the fact that they cannot make physical contact with you.
  • SaronStrikerSaronStriker Posts: 2
    NerveGear
    You should not be worried about people who will always stay in the virtual world, never come out, not feed themselves properly, etc. Why? Because it's their problem to take care of their health, not the scientists'.

    Take tobacco products as an example: they are sold everywhere, but they carry warnings that they cause cancer. People still buy them, and it's their problem.

    So have no worries (it increases blood pressure).
  • Sao2022hopeSao2022hope Posts: 1
    NerveGear
    Creator, do you still think something like a NerveGear is not that far off? And everyone else, do you think it will be created by 2022-2025?
  • ShadowWolf37ShadowWolf37 Posts: 1
    NerveGear
    edited March 2018
    Sao2022hope Sorry, but the tech is still in its infancy, since, due to certain ethics, making a helmet that blocks your brain's electrical signals to the rest of the body (and vice versa) isn't a good idea. Although there are some news/journalism websites saying that certain researchers are experimenting with different frequencies to help affect neural responses, which could effectively help with things like depression (website: https://www.scientificamerican.com/article/could-certain-frequencies/). Emotiv looks like they are on the right track with their tech, but again, it's still in its infancy, since it's reported that your hair, or certain tech near you, can mess with the readings. Although if the helmet design does use a certain type of material to block outside signals, and is tightened to the person's head to keep the helmet from moving, then Emotiv's headgear could work a little better. Oculus would have to shrink theirs down a bit to be a little less weighty, but that part is already made; it just needs improving.

    SaronStriker As long as we don't block signals from the mind to the body and vice versa, the person will still feel hunger and such in the virtual world.

    In closing, I've also seen a guy going around testing the theory that our mind can produce fake signals: he blindfolds volunteers and tells them he's going to touch them with a match, though he actually uses an ice cube. A lot of people freaked out and thought they were burnt. Basically, if you hear, see, and move like you do in the real world, your mind could confuse which world is real, so never forget.
  • chevadychevady Posts: 1
    NerveGear
    Excuse me, everyone.
    I'm really interested in the NerveGear.
    I have some knowledge of BCI.
    As far as I know, a NerveGear may theoretically be created.
    Quantum computers already have prototypes (I remember it's IBM that makes them).
    I think in 30~50 years brain neuron technology may solve the main problem of connecting human consciousness to a virtual world (computers, the Internet, etc.).

    My questions are:
    1. Are there any formal papers (theses) on the NerveGear or full dive? What kind of keywords should I use? I found nothing on PubMed.
    2. Do any labs do research on this kind of machine?
    3. I hope I can get more formal or official theses. If anyone has any, please share them with me!!!


    Thanks a lot!!  

  • athman80athman80 Posts: 4
    NerveGear
    edited May 2018
    thanks ;)


  • athman80athman80 Posts: 4
    NerveGear
    edited May 2018
    "It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain," Rao said in a statement.

  • Foxtrot_BravoFoxtrot_Bravo Posts: 2
    NerveGear
    Hey, Oculus, leave some for me; I'm going into neurology and mechatronics just to solve this mystery. I don't want it finished before I get a chance at it.
  • Foxtrot_BravoFoxtrot_Bravo Posts: 2
    NerveGear
    In the future, when we do have the technology, it is the cost that will kill it. As always, the more tech you put into it, the more it costs, which is why at this moment I don't have an Oculus.
  • Jesusfreaknerve17Jesusfreaknerve17 Posts: 2
    NerveGear
    edited June 2018
    Hello, I'm Jesus Freak. I'm a fan of SAO and I believe the NerveGear can be created. With God's help, I have an idea that could work, or at the very least help. My hypothesis is that the NerveGear could be created with computers and lucid dreaming.

    Lucid dreaming is when someone is in a dream and the dreamer is aware that they are dreaming. When we sleep, the brain works almost as if it were awake, the difference being that certain chemicals in our brain are almost blocked. When chemicals like norepinephrine, serotonin, and histamine are blocked, the muscles stop moving, so you can move any way you want in your dream and your body won't move. (Except for sleepwalkers, lol.) Another thing about lucid dreaming is that once you realize you're dreaming, you can see, hear, smell, taste, and move like you would in real life (though you don't feel pain). The best part is that since it's your dream, you can even fly, talk to celebrities, and visit Hawaii. The different ways people learn to lucid dream include keeping a dream journal, telling themselves they will lucid dream (believe it or not, that's actually a reliable way to do it), and practicing reality checks like counting fingers. With lucid dreaming you can disconnect from reality and go on adventures just like with the NerveGear of SAO.

    My idea is to make a machine, say a helmet, that puts you in a lucid dream created by a computer. The helmet first takes someone into a dream state; the computer then sends images to the conscious part of the brain, or to the cortex, creating a dream place through electric signals with low or high geomagnetic activity. I also think the helmet should come with earphones, so sounds could help convince the brain to accept the fake dream. For example, say the computer is trying to send you to a prairie: sounds of birds, grass rustling, and wind blowing could influence the brain to take the images as part of the dream. Overall, the computer can send the place, equipment, and perhaps monsters, and since you're lucid dreaming you can move any way you want and travel, collect items, or kill monsters. There is definitely more science and technology to be considered here, but this is only an idea. I'm not sure if it can be proven; I really hope so.

    That was my hypothesis. I hope it helps, or at the very least inspires other ideas.
  • Sanchez852Sanchez852 Posts: 1
    NerveGear
    The NerveGear requires a magical wireless man-machine interface connected to the user's brain, which can paralyze the user but not kill them, and wirelessly override sensory feedback from all of the user's senses.
  • Nameless_SoulNameless_Soul Posts: 1
    NerveGear
    hey, if anyone could give me a link to the website of a company or someone doing work on this project, please let me know. i don't want links to news articles
  • braddelson11braddelson11 Posts: 1
    NerveGear
    Oculus is a great VR headset compared with other VR headsets like Samsung's or HTC's.
  • ForestalKnave0729ForestalKnave0729 Posts: 1
    NerveGear
    So you've figured out EEG, but you are forgetting certain factors. The body is comprised of an electrical impulse circulatory system as well as an impulse-driven neurological system. Each fires and is controlled by separate parts of the brain. Electrodes with EEG interpretation and interceptors would be a further advancement. Next, measure the level of electrical signal per movement so as to create precise movements in a game. Lastly, measure the waves per movement of each body part so as to create character movements paired with size, by touching your body to create movements based on your own body size. I think you could take it from there. If you decide to have someone test it, I would like to take part in the testing.
  • Dragon11rDragon11r Posts: 4
    NerveGear
    can't wait till SAO comes out for real

  • Dragon11rDragon11r Posts: 4
    NerveGear
    edited January 30
    Dude, don't think you're the only one. I know I'm reading this late and it's already 2019, but I have hope that we will get what we want, hopefully soon enough that we all get to try it.
  • Dragon11rDragon11r Posts: 4
    NerveGear
    Creator, do you still think something like a NerveGear is not that far off? And the rest of you, do you think it will be created by 2022-2025?

  • Dragon11rDragon11r Posts: 4
    NerveGear
    Yes, I do believe.
  • ABPro_3ABPro_3 Posts: 2
    NerveGear
    When you sleep, you release a chemical called gamma-aminobutyric acid (GABA). If you were able to get the brain to release this chemical without being in REM sleep, then you could stop all movement in the real world but keep the player in the game. This is similar to sleep paralysis, where you're sending the signal to move your arm, for example, but you can't because the GABA hasn't left the muscle yet, leaving it, in a sense, paralyzed. If you use that to your advantage, you could add a split off an existing nerve, permanent or not, but the muscles won't move because they're "paralyzed," giving the effect of the nerve being redirected. I am unsure of the effects GABA would have on the muscle if released multiple times. I would like to make NerveGear a reality; however, I am only 14 years old, so hopefully no one makes it before me. I won't be surprised if they do, though.
  • ABPro_3ABPro_3 Posts: 2
    NerveGear
    Not sure if someone else mentioned this, I didn't look through the forum.
  • OQrockOQrock Posts: 4
    NerveGear
    I believe that this is a great idea and that it should be reality. You should put a timer on it so people can't play for more than 10 hours, and require a 2-hour break between each 10 hours so they can eat, drink, and sleep.
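    The play-limit rule above (at most 10 hours of play, then a mandatory 2-hour break) could be sketched like this. The function name and the way log-in/log-out state is passed around are illustrative assumptions, not part of any real system:

    ```python
    # Minimal sketch of the suggested play limit: 10 hours of play maximum,
    # then a mandatory 2-hour break before logging back in. Times in seconds.

    PLAY_LIMIT = 10 * 3600   # continuous play allowed per session
    BREAK_TIME = 2 * 3600    # mandatory rest between sessions

    def can_play(now, session_start, last_logout):
        """Return True if the player may be in-game at time `now`."""
        if session_start is not None:
            # Currently logged in: allowed while inside the 10-hour window.
            return now - session_start < PLAY_LIMIT
        # Logged out: may return only after the 2-hour break has passed.
        return now - last_logout >= BREAK_TIME

    # Logged in at t=0: still allowed at 9h, forced out at 10h.
    print(can_play(9 * 3600, session_start=0, last_logout=None))   # True
    print(can_play(10 * 3600, session_start=0, last_logout=None))  # False
    # Logged out at t=0: blocked at 1h, allowed again at 2h.
    print(can_play(1 * 3600, session_start=None, last_logout=0))   # False
    print(can_play(2 * 3600, session_start=None, last_logout=0))   # True
    ```

    A real implementation would also need to persist the timestamps server-side so the limit can't be reset by restarting the headset.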