The "Alone in a Crowd" Issue

CareLevelZero
Honored Guest
Hey folks. I've been considering an issue that's bound to come up eventually among the consumer side of the virtual reality crowd. For those who are married or have family, it's not uncommon for family members to watch somebody else game, particularly where a couch is involved -- while this was once the domain of consoles, that has been shifting of late, with more PCs using televisions as monitors.

Naturally, it's a bit more difficult to watch somebody play a game with a screen strapped to their face, and the doubled per-eye output it generates doesn't make for the best viewing material. I'm wondering if anybody else has started considering ways to combat this issue in order to help maintain the "group solo" gameplay experience. After all, not everybody plays games, but that doesn't mean people aren't interested in watching them.

We're paving a new future here and there will be some bumps in the road. There's no reason we can't try to anticipate and mitigate them where possible. So, a penny for your thoughts.

Maybe two, if you're lucky.
The old rules no longer apply here
10 REPLIES

kojack
MVP
Some 3D TVs can take a side-by-side stereo image (such as my Samsung 3D plasma).

The distortion shader may not be as suitable for that (I haven't tested my Ogre Rift demo on my TV yet), so you might need to give it the pre-distortion view. But you could then have one person playing while everyone else puts on shutter glasses and watches it on the TV.

Or just give everyone a Rift and clone the display, but I think watching a 3D view you can't control would be less confusing on a TV compared to a headset.
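In code, the side-by-side feed could look something like the sketch below -- an OpenGL blit of the two undistorted eye buffers into the left and right halves of the TV's frame. It's untested, and the framebuffer handles and sizes are placeholders rather than anything from a real demo:

#include <GL/glew.h>

// Blit the two pre-distortion eye buffers into the TV window's default framebuffer
// as a side-by-side-half frame (the format most 3D TVs accept).
void PresentSideBySideToTV(GLuint leftEyeFbo, GLuint rightEyeFbo,
                           int eyeW, int eyeH, int tvW, int tvH)
{
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);           // draw into the TV's backbuffer

    glBindFramebuffer(GL_READ_FRAMEBUFFER, leftEyeFbo);  // left eye -> left half
    glBlitFramebuffer(0, 0, eyeW, eyeH,
                      0, 0, tvW / 2, tvH,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);

    glBindFramebuffer(GL_READ_FRAMEBUFFER, rightEyeFbo); // right eye -> right half
    glBlitFramebuffer(0, 0, eyeW, eyeH,
                      tvW / 2, 0, tvW, tvH,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);
}

The TV's side-by-side mode stretches each half back to full width, so the main cost is halved horizontal resolution rather than extra rendering.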
Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080 Ti, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

jaimi
Expert Protege
My guess is that the best way to do this would be to render the second screen (the public one) as a normal game view for the audience, perhaps reusing one of the cameras from the stereo view to avoid rendering a third point of view. It might also be possible to take the rendering of one eye and, using a de-warping shader, reconstruct a passable view (albeit at lower resolution). That's probably the option with the least performance impact.
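To make the first option concrete, one way "reusing one of the cameras" could work is to take the left eye's view matrix, undo the half-IPD offset to get a centered camera, and pair it with an ordinary symmetric projection for the monitor. The names, the column-major layout, and the assumption that the eye view was built as eyeView = translate(-ipd/2, 0, 0) * headView are all just illustration, not anyone's actual engine code:

#include <cmath>
#include <cstring>

// Standard symmetric perspective projection for the monitor view (column-major, OpenGL-style).
void MonitorProjection(float fovyRadians, float aspect, float zNear, float zFar, float out[16])
{
    const float f = 1.0f / std::tan(fovyRadians * 0.5f);
    std::memset(out, 0, sizeof(float) * 16);
    out[0]  = f / aspect;
    out[5]  = f;
    out[10] = (zFar + zNear) / (zNear - zFar);
    out[11] = -1.0f;
    out[14] = (2.0f * zFar * zNear) / (zNear - zFar);
}

// Turn the left-eye view into a centered spectator view by undoing the half-IPD shift.
// Assumes eyeView = translate(-ipd/2, 0, 0) * headView, column-major.
void SpectatorView(const float leftEyeView[16], float ipd, float out[16])
{
    std::memcpy(out, leftEyeView, sizeof(float) * 16);
    out[12] += ipd * 0.5f;   // translation sits in elements 12..14
}

That route still costs one extra scene render for the monitor, which is why the second option -- de-warping one eye's existing image -- is the cheaper of the two.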

Crespo80
Explorer
"CareLevelZero" wrote:

I'm wondering if anybody else has started considering ways to combat this issue in order to help maintain the "group solo" gameplay experience.


so they can choose the exact perfect scary moment to tap on your shoulder and totally freak you out :lol:

CareLevelZero
Honored Guest
"crespo80" wrote:
so they can choose the exact perfect scary moment to tap on your shoulder and totally freak you out :lol:

Damn right! It's what my friends would do to me anyways, if we were couch-gaming. Why should I deny them the opportunity just because I have a headset on? 😄
The old rules no longer apply here

DK_VR
Honored Guest
I honestly think the 'spectator crowd' is small enough that this won't be a big enough problem to dampen people's excitement or cause any real worry. Think about it this way: games over the years have largely done away with couch co-op, which I'd argue was a far greater loss than not being able to watch a game only you are playing, and gaming seems to be doing just fine.

Not everything will fit every situation. I'd much rather see people putting effort into really streamlining the VR experience than worrying about optimizing a 'spectator' mode.

Tummler
Honored Guest
"jaimi" wrote:
My guess is that the best way to do this would be to render the second screen (the public one) as a normal game view for the audience, perhaps reusing one of the cameras from the stereo view to avoid rendering a third point of view. It might also be possible to take the rendering of one eye and, using a de-warping shader, reconstruct a passable view (albeit at lower resolution). That's probably the option with the least performance impact.


I like this idea. Relatively easy to implement and shouldn't really affect performance.

But I wonder if the dynamics of watching someone play the Rift are intrinsically different. Usually with consoles you're playing against others, spectators are commenting on the game, people are chatting with each other, etc. When someone is immersed in the Rift with their headphones on, I wonder if that sensory isolation just kills the (in-person) social dynamic. The Rifter might as well be in another location if you can't interact or communicate with them.

CareLevelZero -- what have you experienced in your household? Do you think once the "new toy" novelty wears off, people will still find it enjoyable to watch somebody else play in VR?

Pyry
Honored Guest
If you have the rendering power available, you can give the spectators their own controllable camera (like the spectator view in multiplayer Valve games such as L4D and TF2). A lot of people like to watch others play video games (see "Let's Play"), especially story-heavy ones.

The Wii U (and earlier the GameCube-Game Boy link, such as with Zelda: Four Swords) does some interesting things with asymmetric displays, where some players have additional views (through the Wii U GamePad or a Game Boy), and some of those ideas could transfer over to a Rift + traditional screen combination. For example, you could imagine an espionage game where the Rift player takes the role of the secret agent, and the traditional screen shows some kind of 'control room' view with security camera feeds for the spectators to aid the main player (like the game Lifeline, except with all human players and without the inconsistent voice recognition).
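A controllable spectator camera is cheap to prototype, too. Below is a bare-bones free-fly camera sketch that could be driven by the spectators' own mouse or gamepad and rendered to the TV with its own projection; the axis conventions (OpenGL-style, -Z forward) and the speeds are just assumptions for illustration:

#include <cmath>

struct SpectatorCamera {
    float pos[3] = {0.0f, 1.7f, 0.0f};   // metres
    float yaw = 0.0f, pitch = 0.0f;      // radians

    // dx, dy: look deltas from the spectators' input; forward, strafe: -1..1 movement input
    void Update(float dt, float dx, float dy, float forward, float strafe)
    {
        const float lookSpeed = 0.0025f;
        const float moveSpeed = 3.0f;     // m/s

        yaw   -= dx * lookSpeed;
        pitch -= dy * lookSpeed;
        if (pitch >  1.5f) pitch =  1.5f; // keep the camera from flipping over
        if (pitch < -1.5f) pitch = -1.5f;

        // Move on the horizontal plane relative to where the camera is facing.
        const float s = std::sin(yaw), c = std::cos(yaw);
        pos[0] += dt * moveSpeed * (forward * -s + strafe * c);
        pos[2] += dt * moveSpeed * (forward * -c + strafe * -s);
    }
};

The extra cost is the same as any second scene render, so it only makes sense where the GPU headroom is there, as noted above.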

jwilkins
Explorer
You could have a separate PC render a spectator view by connecting to the main PC as a client. That would be a generally useful feature, especially for testing.
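At its simplest that's just the main PC broadcasting game state (here only the head pose) each frame and the spectator PC rendering its own view of the same scene. A crude sketch of the sending side using UDP and POSIX sockets -- the address, port, and packet layout are made up for illustration:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Per-frame packet; a real game would need a lot more state than the head pose.
struct PosePacket {
    float position[3];
    float orientation[4];   // quaternion (x, y, z, w)
};

int main()
{
    const char* spectatorIp   = "192.168.1.50";   // hypothetical spectator PC
    const int   spectatorPort = 45000;

    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port   = htons(spectatorPort);
    inet_pton(AF_INET, spectatorIp, &dest.sin_addr);

    PosePacket pkt = {{0.0f, 1.7f, 0.0f}, {0.0f, 0.0f, 0.0f, 1.0f}};

    // In a real game loop this runs once per frame with live tracker data.
    sendto(sock, &pkt, sizeof(pkt), 0,
           reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));

    close(sock);
    return 0;
}

The spectator build would receive these packets and render whatever camera it likes, which would also be handy for testing without wearing the headset.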
(╯°□°)╯︵┻━┻

tlopes
Honored Guest
Since the barrel distortion shader requires (at least with D3D9) using a render target that stores the original unwarped texture (which is presumably not used again after warping), it should be pretty easy to just copy one of the two eyes from that texture and display it on the screen for viewers. Sure, it'd take up GPU bandwidth (another full screen's worth) and texture memory, but I don't think it'd be that significant a hit for most video cards, since rendering the scene twice is typically more expensive than running the warp shader.

Edit: After reading the addendum to the Rift documentation (section A.4.1, Duplicate Mode VSync), I'm guessing that if you were to implement this method you'd want to use two different swap chains -- one with VSync enabled (for the Rift) and the other with VSync disabled (for the monitor).
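Roughly what I mean, in D3D9 terms: create a second swap chain for a window on the monitor with vsync off (the Rift's own swap chain keeps D3DPRESENT_INTERVAL_ONE), then StretchRect the left half of the pre-warp render target into it each frame. This assumes you already have the device, the monitor window, and the pre-warp surface in hand; error checking is dropped and the names are made up:

#include <d3d9.h>

// Second swap chain that presents to a window on the regular monitor, without vsync.
IDirect3DSwapChain9* CreateMonitorSwapChain(IDirect3DDevice9* device, HWND monitorWindow,
                                            UINT width, UINT height)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth      = width;
    pp.BackBufferHeight     = height;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow        = monitorWindow;
    pp.Windowed             = TRUE;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;   // no vsync on the monitor

    IDirect3DSwapChain9* chain = nullptr;
    device->CreateAdditionalSwapChain(&pp, &chain);
    return chain;
}

// Each frame, after the eyes are rendered into the pre-warp target: copy the left eye
// (the left half of that surface) onto the monitor's backbuffer and present it.
void ShowLeftEyeOnMonitor(IDirect3DDevice9* device, IDirect3DSwapChain9* monitorChain,
                          IDirect3DSurface9* preWarpSurface, LONG eyeWidth, LONG eyeHeight)
{
    IDirect3DSurface9* monitorBackBuffer = nullptr;
    monitorChain->GetBackBuffer(0, D3DBACKBUFFER_TYPE_MONO, &monitorBackBuffer);

    RECT leftEye = { 0, 0, eyeWidth, eyeHeight };
    device->StretchRect(preWarpSurface, &leftEye, monitorBackBuffer, nullptr, D3DTEXF_LINEAR);

    monitorChain->Present(nullptr, nullptr, nullptr, nullptr, 0);
    monitorBackBuffer->Release();
}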