
Week 12 : Oculus Launch Pad 2018 - Due Sep 24 @ 11am PST

Anonymous
Not applicable
Wow, we've made it to Week 12!

Anonymous
Not applicable
Week 12. 

We are preparing for OC5!!! YAYYY! 
This week we are going to meet with our mentors and finalize all the writing materials, budget, etc. 
We are also planning on giving back to our hard-working team by taking them out to Korean BBQ 🙂 

See you all at OC5 !!!   

Atley
Protege

Title

Trying To Laugh At This


Gameplay

I wanted the gameplay of fighting a “Fear” to feel like wrestling a witch -- a physical way to lasso a large opponent that resists form.  

I found a weapon with multiple weights more interesting.  (I later learned this imitates the South American bolas.)  The absence of any felt centripetal force and the spring joint connecting the weights make powering and aiming the weapon a challenge.  Aim is all about timing when you let go.
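In Unity terms, the setup looks roughly like the sketch below -- a simplified placeholder (the class and method names are mine, not the project's actual code), assuming the weight is tethered to a kinematic rigidbody that follows the controller and inherits the hand's velocity the moment you let go.

using UnityEngine;

// Sketch: a bola-style weight tethered to the hand by a SpringJoint.
// Releasing the trigger cuts the joint and hands the weight the controller's
// velocity at that instant, so where it flies depends entirely on when you let go.
[RequireComponent(typeof(Rigidbody))]
public class BolaWeight : MonoBehaviour {
    public Rigidbody handAnchor;   // kinematic rigidbody following the controller
    private SpringJoint tether;

    void Start () {
        // Tie the weight to the hand with a soft spring so it lags and swings.
        tether = gameObject.AddComponent<SpringJoint>();
        tether.connectedBody = handAnchor;
        tether.spring = 50f;
        tether.damper = 2f;
    }

    // Call this when the trigger is released, passing the controller's velocity that frame.
    public void Release (Vector3 handVelocity) {
        if (tether != null) {
            Destroy(tether);                                     // cut the spring
            GetComponent<Rigidbody>().velocity = handVelocity;   // inherit the swing
        }
    }
}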


I was grateful to discover this, because I’ve found that learning when to let go is an important step in overcoming trauma -- the core theme of Trying To Laugh At This.

I focused the gameplay on timing and pacing the pull and release of the triggers.  Using Oculus haptics with OpenVR was a bit of a learning curve.  I am trying to use haptics to extend the player's sense of touch -- for example, to indicate that they really touched something far away.
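For anyone else fighting that same learning curve, here is the kind of pattern that works with the legacy SteamVR (OpenVR) Unity plugin -- a minimal sketch only; the class name and the strength-to-duration mapping are placeholders, not the project's actual code.

using UnityEngine;

// Sketch: scale a short haptic pulse by how strongly the far-away object was "touched",
// using the legacy SteamVR plugin's controller API.
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class RemoteTouchHaptics : MonoBehaviour {
    private SteamVR_TrackedObject trackedObj;

    void Awake () {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    // strength01: 0 = barely grazed it, 1 = solid contact.
    public void Pulse (float strength01) {
        var device = SteamVR_Controller.Input((int)trackedObj.index);
        // TriggerHapticPulse takes a duration in microseconds; roughly 3999 is the practical per-call maximum.
        ushort duration = (ushort)Mathf.Lerp(200f, 3999f, Mathf.Clamp01(strength01));
        device.TriggerHapticPulse(duration);
    }
}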


Character Design

Crafting characters out of particles is a ways off for me.  I looked to other ways of making characters feel like fluid compositions of shared materials.

Writing to the alpha channel with volumetric lighting helped to tie the scene together.  This outline could be used as UI to guide player focus.

This opponent is composed of objects that convey “home” to me -- tissue boxes, house keys, a laundry basket.  From afar, it's scary. But up close, it's sad. Maybe this is another way to reward perception with story. 

On impact, the opponent breaks down in a way that conveys fragility.  But then it reforms in a way that's sort of freakish and otherworldly -- like a witch.  Today I'll think more about level design and how to get the most out of this one opponent coming back again and again.  That's how fighting traumatic memories feels.  It's like fighting this insidious and amorphous thing that grips you with fear and sadness at the strangest times, and the only way to defuse its power is to contend with it head on -- which, I realize, I'm actually doing now in this creative process, to be honest.  This is one form of contention for me. 


Story and Marketing

I have been a bit afraid that what I am making is too strange, overwhelming, ugly, or triggering for the player.  I think these feelings of guilt are residue of the original act of trauma itself -- an extension of myself as the victim, at a young age, choosing to blame myself rather than acknowledge my utter lack of control over my own security.  At this point I am calling on several friends who understand where I am coming from to assist with the story, marketing, and sound design.  The company Not Jelly, with Jamie Parreno and Leah Dubuc, is helping me bring these fragments together today.  Musician and sound designer Julie Buchanan is coming on as a creative partner in all things auditory.  

In describing the project to my friends, I realize that I spent the first 90% of my time in this program on technical tests.  I think I had to ignore these creative ideas and emotions burning in the background in order to operate on the technical challenges with my whole mind.  It would have been impossible for me to effectively design something in this semi-emotional creative state of mind without all my technical tools cued up and ready.  I'm grateful for the slow development process in Oculus Launch Pad just for that realization. 

MichaelAdrian
Protege


The Home Stretch

As Oculus Connect approaches (tomorrow) and my team and I finalize the last drips and drabs of work needed to get the prototype ready to go, I have had the added benefit of being able to do a quick diagnostic of my journey and the project.

Over the last almost 3 months I’ve:

• Conceptualized, trashed, conceptualized, trashed and then discovered the diamond idea from the lump of coal.

• Gone into pre-production on a 360 film and a VR experience simultaneously

• Put together an incredible “crew”: writer, director, producer, editor, composer, sound designer, SFX, director of photography, gaffer/grip, location scout, line producer, 3D modeler, UNITY developer, and “talent” actors.

• Learned some new technology (Insta360 Pro) and software (a lot of UNITY, Blender, and After Effects)

• Shot on a green screen for 2/5ths of the project

• Shot the rest

• And made it successfully to post, where we are now (editing, color grading, integration, testing, and upload)


It has been a long journey compressed into a small amount of time and I am thankful for every second of it. I have made some new friends, learned a lot and inspired even more people to simply try their dreams on for size.


belmorea
Explorer

Week 12: With Friends Like These

 

SO excited for OC5!! This week has honestly been a bit of a cluster as we've been scrambling to pick up from last week's medical issues on the team, but I'm super proud of my group because we pulled together and focused on making real strides. Luckily we reserved our one missed/late blog post pass for a crazy situation like this past week, so here is our late, but 100% solid, update:

 

Build: we were originally building the Android app, but managed to crack the code on a web-based design that is mobile-platform agnostic - this is a really big win for us because we all felt strongly that the web-based app would be more effective and user friendly, even though we recognized the Android app was more practical for the demo. Passion won out on this one, though, because our developer burned some serious midnight oil to get this working.

 

Art: we found a cheap art environment package on the Unity Asset Store that actually replicates real places on Mars. We have replaced our original maze environment with these realistic landscapes to make the experience feel more authentic. Our artist who had the major accident last week is finally back home and was able to deliver some VERY cool models for the alien race we've built into the narrative (not in the demo though… teasers for the proposal!)

 

Design: we picked a radiation field that hurts the VR player as the key gameplay feature to highlight in the demo. We initially struggled to model the radiation effectively, but found two approaches that work well with our build. We will be testing both as we integrate other build updates this week to see which one makes it into the final demo. Who knows, maybe we'll sell the other one as an asset 😄

ladykillmonger
Explorer
Update!!!!!!!
Project:  facadeVR
Since there is no Week 13!

(Render of the environment after baking and coloring)

After so many attempts, the baking and coloring are finally right! The game has been developed in a separate project, so we are migrating over and we are excited. (The render above doesn't have the water.)

As an added document during submission, we will include what we learned. In hindsight, we spent too much time on modeling and aesthetics. The one great thing is that now we don't need to worry about a redesign. 

Continuing to work on the proposal to make clear that we hope to create an entire suite of games. 

I am very excited about this game because there are already people of all ages excited to play. 

lildub
Protege

Hey, so this week I attended OC5 and got to meet more OLP members, as well as meet up with new and old friends alike at the conference. I'm unfortunately going through a life crisis with my mom and had to fly from San Jose to the Mayo Clinic, so that's why there is no video this week. 😞 I have been working remotely with our team and my main computer on re-rigging the monster character and finding people to do animation or animation cleanup to make a polished experience for our daughter, cat, and father. We are adding a lot of animation that loops in the beginning, because this is our most interactive part and has to wait on the user's input. I think that most of the animation will look fine; I am a little worried about whether it will all blend together to create a comprehensive piece. A lot of lessons were learned about mocap during this project -- if you export from MotionBuilder you lose the controls, who would have known :neutral:


We submitted to the Alpha channel, and it was super cool seeing it live in the store -- it made what we have been working on feel totally real instead of an abstract idea. I wouldn't have been able to do this project without these talented people taking up the slack while I'm going through something tragic, and I am going to be forever grateful. I've also been slowly working on getting a website up for the project. Below is a mock of the matte painting.
(Mock of the matte painting)

PS: I am terrible at writing, so sorry for any spelling mistakes or missing words. I really tried 🙂 




I am facing a problem in Museum Multiverse: the third-person camera still feels weird following the player. I want to make sure the movement is smooth and comfortable for both the camera and the player, so I have been looking around for solutions and I think I found one.

Limiting the player's peripheral view reduces the motion sickness caused by movement. I learned this from a paper on the subject by Ajoy S. Fernandes and Steven K. Feiner at Columbia University. Their experiments showed a real reduction in motion sickness when the field of view is limited based on player speed.

So how do we do this in Unity?

First we need to import the older Image Effects package into Unity from the Asset Store. We are really just looking for the Vignette And Chromatic Aberration script. After you import it, add it to your main camera. Once the script is added, set everything on it to 0; you will only be playing around with the Vignetting option.
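If you would rather zero those fields from code than from the Inspector, a small helper like this can do it at startup (it is just a sketch, and the field names come from the legacy Image Effects package, so double-check them against your version):

using UnityEngine;
using UnityStandardAssets.ImageEffects;

// Optional helper: attach to the camera to disable everything except vignetting at startup.
public class ResetVignetteEffect : MonoBehaviour {
    void Start () {
        var fx = GetComponent<VignetteAndChromaticAberration>();
        fx.intensity = 0f;             // the only value the limiter script will drive at runtime
        fx.blur = 0f;
        fx.blurSpread = 0f;
        fx.chromaticAberration = 0f;
    }
}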

(Screenshot: the Vignette And Chromatic Aberration component on the main camera)

Try playing around with the Vignetting values to see how it affects your camera!

(Screenshot: vignetting applied to the camera view)

Next we are going to write a script to adjust the vignetting based on the camera speed.

using System.Collections;
using System.Collections.Generic;
using UnityStandardAssets.ImageEffects;
using UnityEngine;

public class FOVLimiter : MonoBehaviour {
    private Vector3 oldPosition;
    public float MaxSpeed = 6f;
    public float MaxFOV = .7f;

    public static float CRate = .01f;
    public static float RateCutOff = .25f;

    // max .7 Vignetting

    private VignetteAndChromaticAberration fovLimiter;
    // Use this for initialization
    void Start () {
        oldPosition = transform.position;
        fovLimiter = GetComponent<VignetteAndChromaticAberration> ();
    }
    
    // Update is called once per frame
    void Update () {
        Vector3 velocity = (transform.position - oldPosition) / Time.deltaTime;
        oldPosition = transform.position;

        float expectedLimit = MaxFOV;
        if (velocity.magnitude < MaxSpeed) {
            expectedLimit = (velocity.magnitude / MaxSpeed) * MaxFOV;
        }

        float currLimit = fovLimiter.intensity;
        float rate = CRate;

        if (currLimit < RateCutOff) {
            rate *= 3; //fast rate since the field of view is large and fast changes are less noticeable
        } else {
            rate *= .5f; //slower rate since field of view changes are more noticeable for larger values. 
        }

        fovLimiter.intensity = Mathf.Lerp (fovLimiter.intensity, expectedLimit, rate);
    }
}

So what the heck is the Field of View (FOV) Limiter script doing? We grab the distance the player has traveled each frame to find the player's speed, and from that we calculate how much the field of view should be limited. Remembering some key points from the paper: the FOV transition can be faster while the field of view is still large, because fast changes are less noticeable there, but changes become more noticeable at larger vignette values, so the transition should slow down.

Right now this is working pretty well, but I know this is only step one toward making a great third-person VR camera. Next week I will be focusing on making a smarter camera that can follow the player without getting stuck on walls.
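One common way to handle that (sketched below with placeholder names -- not something implemented in the project yet) is to sphere-cast from the player toward the desired camera position and pull the camera in whenever geometry is in the way:

using UnityEngine;

// Sketch: keep a third-person camera from clipping through walls by sphere-casting
// from the player toward the ideal camera spot and parking the camera at the first hit.
public class ThirdPersonCameraProbe : MonoBehaviour {
    public Transform player;
    public Vector3 desiredOffset = new Vector3(0f, 2f, -4f);
    public float probeRadius = 0.3f;
    public LayerMask obstacleMask = ~0;

    void LateUpdate () {
        Vector3 target = player.position + desiredOffset;
        Vector3 dir = target - player.position;
        RaycastHit hit;
        // If geometry sits between the player and the ideal camera position,
        // place the camera just in front of it instead of letting it clip through.
        if (Physics.SphereCast(player.position, probeRadius, dir.normalized, out hit, dir.magnitude, obstacleMask)) {
            transform.position = player.position + dir.normalized * hit.distance;
        } else {
            transform.position = target;
        }
        transform.LookAt(player.position + Vector3.up * 1.5f);
    }
}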

If you would like to learn more about limiting the camera view to prevent motion sickness, and other VR tips, I would recommend checking out FusedVR -- these guys are great!