Microsoft HoloLens Mind BLOWN

Peteo Posts: 264
Hiro Protagonist
edited February 2015 in General
Holy crap:
HoloLens has see-through, holographic lenses. Built-in high-end CPU and GPU. We invented a third processor, a holographic processing unit. No wires, no phone required, no connection to a PC needed.

Microsoft has been secretly working with NASA on this technology.


http://www.theverge.com/2015/1/21/7867593/microsoft-announces-windows-holographic

Will be out in the Windows 10 time frame.

Wired mag hands-on: http://www.wired.com/2015/01/microsoft-hands-on/

More hardware info here:
http://www.wired.com/2015/01/microsoft-nadella/
scroll down until you see the headset



Kipman’s prototype is amazing. It amplifies the special powers that Kinect introduced, using a small fraction of the energy. The depth camera has a field of vision that spans 120 by 120 degrees—far more than the original Kinect—so it can sense what your hands are doing even when they are nearly outstretched. Sensors flood the device with terabytes of data every second, all managed with an onboard CPU, GPU and first-of-its-kind HPU (holographic processing unit). Yet, Kipman points out, the computer doesn’t grow hot on your head, because the warm air is vented out through the sides. On the right side, buttons allow you to adjust the volume and to control the contrast of the hologram.

Tricking Your Brain
Project HoloLens’ key achievement—realistic holograms—works by tricking your brain into seeing light as matter. “Ultimately, you know, you perceive the world because of light,” Kipman explains. “If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”

To create Project HoloLens’ images, light particles bounce around millions of times in the so-called light engine of the device. Then the photons enter the goggles’ two lenses, where they ricochet between layers of blue, green and red glass before they reach the back of your eye. “When you get the light to be at the exact angle,” Kipman tells me, “that’s where all the magic comes in.”

Thirty minutes later, after we’ve looked at another prototype and some more concept videos and talked about the importance of developers (you always have to talk about the importance of developers when launching a new product these days), I get to sample that magic. Kipman walks me across a courtyard and through the side door of a building that houses a secret basement lab. Each of the rooms has been outfitted as a scenario to test Project HoloLens.

A Quick Trip to Mars
The first is deceptively simple. I enter a makeshift living room, where wires jut from a hole in the wall where there should be a lightswitch. Tools are strewn on the West Elm sideboard just below it. Kipman hands me a HoloLens prototype and tells me to install the switch. After I put on the headset, an electrician pops up on a screen that floats directly in front of me. With a quick hand gesture I’m able to anchor the screen just to the left of the wires. The electrician is able to see exactly what I’m seeing. He draws a holographic circle around the voltage tester on the sideboard and instructs me to use it to check whether the wires are live. Once we establish that they aren’t, he walks me through the process of installing the switch, coaching me by sketching holographic arrows and diagrams on the wall in front of me. Five minutes later, I flip a switch, and the living room light turns on.

Another scenario lands me on a virtual Mars-scape. Kipman developed it in close collaboration with NASA rocket scientist Jeff Norris, who spent much of the first half of 2014 flying back and forth between Seattle and his Southern California home to help develop the scenario. With a quick upward gesture, I toggle from computer screens that monitor the Curiosity rover’s progress across the planet’s surface to the virtual experience of being on the planet. The ground is a parched, dusty sandstone, and so realistic that as I take a step, my legs begin to quiver. They don’t trust what my eyes are showing them. Behind me, the rover towers seven feet tall, its metal arm reaching out from its body like a tentacle. The sun shines brightly over the rover, creating short black shadows on the ground beneath its legs.

Norris joins me virtually, appearing as a three-dimensional human-shaped golden orb in the Mars-scape. (In reality, he’s in the room next door.) A dotted line extends from his eyes toward what he is looking at. “Check that out,” he says, and I squat down to see a rock shard up close. With an upward right-hand gesture, I bring up a series of controls. I choose the middle of three options, which drops a flag there, theoretically a signal to the rover to collect sediment.

After exploring Mars, I don’t want to remove the headset, which has provided a glimpse of a combination of computing tools that make the unimaginable feel real. NASA felt the same way. Norris will roll out Project HoloLens this summer so that agency scientists can use it to collaborate on a mission.

Comments

  • benplace Posts: 769
    Neo
    I need to see more about this and how it works. It looks AMAZING....
  • VizionVR Posts: 3,022
    Wintermute
    Want
    Not a Rift fanboi. Not a Vive fanboi. I'm a VR fanboi. Get it straight.
  • Here's the video they made to show what it's like. (One of the best examples of what Augmented Reality is like that I've ever seen)


    Still many, many questions, such as FOV, resolution, horsepower in the HPU/GPU, whether the overlay is transparent or not. We should hear more on these questions once people manage to test them at the event later today.
  • kojack Posts: 5,596 Volunteer Moderator
    They've got a Kinect inside a headset?

    I wonder how much battery life this thing will get.
  • nosys70 Posts: 466
    Art3mis
    Seems CastAR has found some big shot to buy them.
  • Wireline Posts: 1,203
    NerveGear
    I need to buy a cabinet for all the VR and AR devices I am going to end up owning.
  • kojack Posts: 5,596 Volunteer Moderator
    I just ordered an Intel Realsense 3d camera the other day (no idea when that will ship, I think I'm in a queue to get into a queue), now there's another 3d camera based thing for me to get.

    I was interested in CastAR too. I've been waiting for SDK info to come out, but this seems to beat it in most ways.
    The one thing CastAR seems capable of that Hololens isn't is easy collaborative viewing. Since it uses markers, multiple CastAR users have a common point of reference, while Hololens is using point clouds from the mini Kinect, so that's going to be much harder to sync between people in different positions. Solo use: Hololens. Groups sitting around a table: that could be CastAR's field.
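    A rough sketch of the alignment problem that implies (purely hypothetical; neither system has published how, or whether, it would handle this): if two headsets could agree on three or more common anchor points visible in both of their point clouds, the rigid transform between their coordinate frames could be estimated with the standard Kabsch / orthogonal Procrustes method:

    import numpy as np

    def align_frames(anchors_a, anchors_b):
        """Estimate rotation R and translation t mapping user A's frame onto
        user B's frame, from N >= 3 corresponding anchor points (N x 3 arrays).
        Standard Kabsch / orthogonal Procrustes solution."""
        a = np.asarray(anchors_a, dtype=float)
        b = np.asarray(anchors_b, dtype=float)
        ca, cb = a.mean(axis=0), b.mean(axis=0)     # centroids
        H = (a - ca).T @ (b - cb)                   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cb - R @ ca
        return R, t                                 # p_in_b ~= R @ p_in_a + t

    The hard part, of course, is getting both devices to identify the same physical anchors in the first place, which is exactly what CastAR's markers give you for free.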
  • jeonunh Posts: 22
    It will be interesting to see what they mean by releasing the product "within the Windows 10 timeframe". That could mean they plan to release it near the launch of Win10, or it could mean they will launch it sometime after the launch of Win10 but before the launch of Win11.

    In any case, if the experience is even remotely close to what they just showed in the demos, I think they are light years ahead of what I was expecting from Microsoft. Light years... Wow. I get that this is AR and not VR, but their AR is better than anyone's VR is at the moment.
  • mattnewport Posts: 92
    Hiro Protagonist
    Very exciting announcement and the demos look promising but the information released so far raises more questions than it answers. I'd love to hear more info about how this works (is the 'holographic' stuff just hype / marketing or is the display actually based on some type of light field display technology?), what kind of specs the hardware in the headset has (CPU/GPU, what exactly does the 'HPU' do?), what do the APIs to interact with this look like (do you still render two stereoscopic views of the scene or is there more to it as the 'holographic' claims would imply), does the display have the ability to block out light from the real world (giving scope for full VR rather than just translucent looking AR)?

    Hoping Microsoft will release more information soon. Also very interested to hear reports from some of the journalists who get to demo it in person.
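    On the API question: the conventional baseline would just be two per-eye view matrices offset by half the IPD, and whether the 'holographic' pipeline needs anything richer than that is exactly the open question. A minimal sketch of that conventional approach (the function name and the 64 mm default IPD are my own assumptions, not anything from an SDK):

    import numpy as np

    def eye_view_matrices(head_view, ipd=0.064):
        """Classic stereoscopic rendering: derive left/right eye view matrices
        by shifting a 4x4 world->head view matrix by half the interpupillary
        distance along the head's x axis."""
        def shift_x(dx):
            m = np.eye(4)
            m[0, 3] = dx
            return m
        left = shift_x(+ipd / 2.0) @ head_view    # left eye sits at -x in head space
        right = shift_x(-ipd / 2.0) @ head_view
        return left, right

    If the display really is closer to a light field, presumably the per-eye matrix gets replaced by something richer, which is what makes the 'holographic' wording interesting.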
  • kojack Posts: 5,596 Volunteer Moderator
    jeonunh wrote:
    I get that this is AR and not VR, but their AR is better than anyone's VR is at the moment.
    However, we don't know the res, the field of view, the processing power left over in the headset for us to use, or the tracking performance. A Kinect 2 with a PC can't track position/orientation (in programs like Kinect Fusion, where you move the Kinect around) as fast and smoothly as the Rift, so can a battery-powered miniature version do better?
  • willste Posts: 675
    Brain Burst
    Sounds like what I wanted Google Glass to be.

    Seems like it could be amazing, but my excitement is tempered until I see some real-world demos and a price tag. Sounds rather expensive.

    In 20 years though, I expect entertainment to be on another level with all the research that is coming out of the woodwork.
  • Semicidal Posts: 297
    Hiro Protagonist
    Guys, I have that same gadget itch I haven't felt since... well, since the DK2 preorder screen went up (right until it arrived).
    I'm giddy.
    It's Christmas!!
    I need to know everything about it.
    .....and Oculus better hurry up....
  • saviornt Posts: 1,951
    NerveGear
    Kipman notes that Microsoft collaborated with NASA while developing the HoloLens and encourages other companies like Motion Leap and Oculus to make a start on creating their own holograms.

    I am drooling over this tech, I really am. However, as someone mentioned above, there are a lot of questions to be answered, and Microsoft usually doesn't answer questions until they release the product. FOV, latency, heat, battery life, weight.

    If, and this is a huge if, as well as a very skeptical IF, this is "perfect AR": 180-degree FOV, single-digit latency numbers, no heat, comfortable to wear, and battery life good for up to 36 hours before it needs recharging (which would give you 4 hours of continuous "game play")... then this would literally kill every VR and AR headset out there.

    We are not at that point.. yet.

    This is an AR device, not a VR device. Oculus is working on VR.

    Virtual reality: Puts you into the virtual world
    Augmented reality: Brings the virtual world to the real world.
  • jeonunh Posts: 22
    kojack wrote:
    jeonunh wrote:
    I get that this is AR and not VR, but their AR is better than anyone's VR is at the moment.
    However, we don't know the res, the field of view, the processing power left over in the headset for us to use, or the tracking performance. A Kinect 2 with a PC can't track position/orientation (in programs like Kinect Fusion, where you move the Kinect around) as fast and smoothly as the Rift, so can a battery-powered miniature version do better?

    I'm as skeptical as you are. I started that paragraph by saying "If" the experience is even remotely as good as the demo. Anyone can make a CGI demonstration that far exceeds the capability of the actual device. That said, this has apparently been in development for a long time, and they have a specially designed chip. Who knows... I'll have to see it first hand to believe anything.
  • EarlGrey Posts: 886
    Art3mis
    SKEPTICAL!!!

    Let's be realistic here. The technology they portray in those MARKETING ADVERTISEMENT VIDEOS simply isn't here yet. They're showing a future which they're trying to reach!! They don't have that technology!

    Now, Oculus has already released a couple of headsets showing virtual reality, and are well on their way to releasing their third consumer headset (well, fourth counting the Gear VR).

    Virtual reality is a technology we've seen and we know how it works. The Microsoft "Hololens" exists only in the imagination of CG artists working for advertisement agencies.
  • donkaradiablo Posts: 310
    Hiro Protagonist
    I don't think this one is CGI



    15:30 - 20:45 on this one:

    Design with input solution, unifying mobile and PC product lines
    the input solution that could have been
    ideas
    Revolutionize the way we interact with...
    Change the world...
    Community...
  • danknugz Posts: 1,988
    3Jane
    It's impressive from a technical standpoint but looks like it could be gimmicky and get old fast. Whatever that lady is doing in that video looks incredibly boring.

    If Oculus were smart they would look into integrating chips into the Rift to reduce the workload on PCs, 'cause let's face it, no one is going to buy the Rift if they have to spend 3 grand on a brand new PC only to work harder than at your day job configuring each and every single game correctly, and then it makes you feel sick and want to vomit 'cause there's no standardization, everything is grossly unoptimized, and the SDK will never get past 1.0 until 2020.

    No one wants to deal with that incredible bullshit.
    A: Because it messes up the order in which people normally read text.
    Q: Why is top-posting such a bad thing?
    A: Top-posting.
    Q: What is the most annoying thing on forums?
  • Peteo Posts: 264
    Hiro Protagonist
    edited January 2015
    More hardware info here:
    http://www.wired.com/2015/01/microsoft-nadella/
    scroll down until you see the headset

    Project HoloLens is built, fittingly enough, around a set of holographic lenses. Each lens has three layers of glass—in blue, green, and red—full of microthin corrugated grooves that diffract light. There are multiple cameras at the front and sides of the device that do everything from head tracking to video capture. And it can see far and wide: The field of view spans 120 degrees by 120 degrees, significantly bigger than that of the Kinect camera. A “light engine” above the lenses projects light into the glasses, where it hits the grating and then volleys between the layers of glass millions of times. That process, along with input from the device's myriad sensors, tricks the eye into perceiving the image as existing in the world beyond the lenses.

    The device has just three controls, one to adjust volume, another to adjust the contrast of the hologram, and a power switch. Its speakers rest just above your ears. Project HoloLens can determine the direction from which a sound originates, so that when you hear something, it'll appear to be coming from where it would be in real life. If a truck is meant to be speeding by your left side, for example, that's where you'll hear the sound of its engine. By the time Project HoloLens comes to market toward the end of this year, it'll weigh about 400 grams, or about the same as a high-end bike helmet. Microsoft's new operating system, Windows 10, powers it, so any developer can program for it.

    NASA has already gotten an early crack at it. As the mission operations innovation lead at the agency's Jet Propulsion Laboratory, Jeff Norris is charged with rethinking how we explore space, with a focus on the interface between humans and technology. He met Kipman nearly five years ago when he was creating Kinect. In Project HoloLens, Norris saw the potential for technology to help space explorers collaborate more closely and to provide them a quality known as presence. (“People make better decisions when they feel like they're in the environment,” he says.) Last March, Norris and several members of his team relocated from Southern California to Redmond for a few months to build a Mars simulation.

    Man my head is going to explode!
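    On the positional sound described above: at a minimum that needs an interaural time and level difference per source (real systems use HRTFs, and nothing here is HoloLens-specific). A very crude illustrative sketch, with assumed names:

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s
    HEAD_RADIUS = 0.0875     # m, rough average

    def crude_spatialise(mono, sample_rate, azimuth_deg):
        """Pan a mono signal to a (left, right) pair using a simple interaural
        time difference (Woodworth approximation) plus a fixed level drop for
        the far ear. 0 degrees = straight ahead, +90 = hard right."""
        az = abs(np.radians(azimuth_deg))
        itd = HEAD_RADIUS * (az + np.sin(az)) / SPEED_OF_SOUND
        delay = int(round(itd * sample_rate))                 # in samples
        near = np.asarray(mono, dtype=float)
        far = np.concatenate([np.zeros(delay), near])[:len(near)] * 0.6
        return (far, near) if azimuth_deg >= 0 else (near, far)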
  • andrewtek Posts: 971
    Art3mis
    edited January 2015
    You will notice in that last video that the videographer is being VERY careful not to let the designer walk in front of the hologram. It would appear that the hologram is always projected in front of everything else and does not (at least in this iteration) attempt to do z-buffering against the real world. Thus, hands which are closer to you appear under/behind the projected hologram... This is most noticeable at about 1:16 in the 3rd video in the article.

    EDIT: Added clarity to statement "under" to include "under/behind".
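    For reference, the masking itself is conceptually simple once you have a depth map of the room registered to the display; the open question is whether the hardware actually does it. A toy sketch (hypothetical names, and it assumes the depth camera's map has already been reprojected into the display's viewpoint):

    import numpy as np

    def composite_with_real_depth(holo_rgb, holo_depth, real_depth):
        """Zero out hologram pixels that lie behind real-world geometry.
        holo_rgb:   H x W x 3 rendered hologram (black where nothing is drawn)
        holo_depth: H x W hologram depth in metres (np.inf where empty)
        real_depth: H x W depth map of the room, metres
        Returns what an additive display would then be asked to show."""
        visible = holo_depth < real_depth          # hologram nearer than the room
        return holo_rgb * visible[..., np.newaxis]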
  • mattnewport Posts: 92
    Hiro Protagonist
    donkaradiablo wrote:
    I don't think this one is CGI
    Realtime motion tracking / match moving is neat but not new technology and from a technical perspective is a lot simpler than what would be required for really effective 'holographic' AR (reproducing effects such as depth of field / convergence when viewed by a human observer). It's hard to tell from the live demo videos what we're seeing there. At one point you see the camera and it looks like it might actually be filming through the lenses from a headset which makes the demo more impressive but still doesn't answer the questions about how 'holographic' the display really is.

    The marketing videos look like they're produced using traditional motion tracking (likely rendered offline and not shot through the lenses of a headset) and so I take them as representative of the effect they're going for but not representative of what the actual hardware is capable of right now.
  • Pyry Posts: 30
    That video looks plausible, except that I'm not sure how the glasses are actually occluding the real scene instead of being additively blended with it. My suspicion is that the video is using the real tracking of the device, but that the rendering is being done separately and then composited, rather than actually viewed through the glasses.

    As for light field displays, fundamentally the big problem is that any light field display will have to trade off resolution in order to produce the light field. Or, to put it another way, if you could build a light field display with XxY resolution, you could build a conventional display with something like 3Xx3Y resolution. To use the Lytro as an example (it's a light field camera, but the tradeoffs are symmetric with respect to displays and cameras), it has an 11 megapixel sensor (~3300x3300) but the images it produces are only 1.1 megapixel (1080x1080).

    Personally, I would rather have a non-lightfield 3300x3300 display in my HMD than a lightfield 1080x1080.
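    To put those numbers in one line (my own framing, just arithmetic on the figures above): if each output pixel has to carry roughly N x N directional samples, per-axis resolution drops by a factor of about N, and the Lytro figures imply N is roughly 3:

    x_{\text{out}} \approx \frac{x_{\text{sensor}}}{N}, \qquad N \approx \frac{3300}{1080} \approx 3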
  • Sharpfish Posts: 1,303
    Neo
    GIVE ME 2 ON RELEASE DAY PLEASE*



    *If they work as in the vids :D
    EX DK2, EX VIVE, EX PSVR, Currently RIFT CV1 | VR developer
    Poster of the week who never got a T-Shirt ;( dayum they looked tasty!
  • snappahead Posts: 2,302
    Trinity
    EarlGrey wrote:
    SKEPTICAL!!!

    Let's be realistic here. The technology they portray in those MARKETING ADVERTISEMENT VIDEOS simply isn't here yet. They're showing a future which they're trying to reach!! They don't have that technology!

    Now, Oculus has already released a couple of headsets showing virtual reality, and are well on their way to releasing their third consumer headset (well, fourth counting the Gear VR).

    Virtual reality is a technology we've seen and we know how it works. The Microsoft "Hololens" exists only in the imagination of CG artists working for advertisement agencies.
    The marketing video is just a visualization. You can't really take it at face value. The on-stage demo looked pretty legit though. I have doubts, but I'm still hopeful that they've got something here. I don't expect much of a gaming device out of this, but it could be a really fantastic tool and media device that could lead to all kinds of applications and future tech.
    i7 3820
    16 gigs of Ram
    GTX 780ti
  • Peteo Posts: 264
    Hiro Protagonist
    NASA has released more information on the software it built for Windows Holographic, a program called OnSight.

    By using Microsoft's HoloLens visor, NASA scientists will be able to virtually explore the areas of Mars that Curiosity is studying in a fully immersive way. It will also allow them to plan new routes for the rover, examine Curiosity's worksite from a first-person view, and conduct science experiments using the rover's data.

    http://www.theverge.com/2015/1/21/7868635/windows-holographic-nasa-curiosity-rover
  • mattnewport Posts: 92
    Hiro Protagonist
    Found a little more info on the display in the Wired article. It sounds like it may be some type of light field display:
    Project HoloLens is built, fittingly enough, around a set of holographic lenses. Each lens has three layers of glass—in blue, green, and red—full of microthin corrugated grooves that diffract light. There are multiple cameras at the front and sides of the device that do everything from head tracking to video capture. And it can see far and wide: The field of view spans 120 degrees by 120 degrees, significantly bigger than that of the Kinect camera. A “light engine” above the lenses projects light into the glasses, where it hits the grating and then volleys between the layers of glass millions of times. That process, along with input from the device's myriad sensors, tricks the eye into perceiving the image as existing in the world beyond the lenses.
  • Pyry Posts: 30
    I think the technology is probably very similar to the "light guide" technology Vuzix had developed in 2012. Compare the description of Vuzix's display:
    This amazing new technology starts with a compact display engine capable of high contrast and brightness for outdoor use. [...] The output is then relayed into a 1.4 mm thick plastic waveguide lens with input and output hologram structures on the surface which squeezes the light down the waveguide and then two dimensionally expands the image back into the user’s eye, creating an image that is then mixed into the real world.

    Edit:
    The technology appears to be "holographic waveguide" as described here:
    The holographic technique is quite close to the diffraction grating technique described above with the exception that a holographic element is used to diffract the light [...]. Holographic elements reflect only one wavelength of light so for full color, three holograms are necessary; one that reflects Red, Green, and Blue respectively.

    It is not a lightfield display, but rather a thin 'conventional' display that is optically at infinity.
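    For anyone wondering why three separate holograms/layers are needed: a grating's output angle depends on wavelength, per the textbook grating equation (standard optics, nothing HoloLens-specific),

    d\,(\sin\theta_m - \sin\theta_i) = m\,\lambda

    with groove spacing d, incidence angle θ_i, diffraction order m and wavelength λ. A structure tuned to steer red correctly sends green and blue to different angles, hence one diffractive layer per primary.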
  • mattnewport Posts: 92
    Hiro Protagonist
    Here's the first journalist hands-on report I've seen from today's demo (Wired had theirs a while ago). The Gizmodo reporter seems pretty impressed; the article's still updating as I post this.

    Some quotes:
    I just put Microsoft's new holographic glasses on my face. It's one of the most amazing and tantalizing experiences I've ever had with a piece of technology.
    And then I was looking at the surface of Mars. Or a narrow sliver of it, anyways. It's not like the Oculus Rift, where you're totally immersed in a virtual world practically anywhere you look. The current Hololens field of view is TINY!
  • donkaradiablo Posts: 310
    Hiro Protagonist
    What they mean by a consumer-level price can't be anything close to what Oculus VR is targeting for the Rift. This is a lot more than an HMD; the price has to be more on par with a high-end laptop.

    The Rift really needs a good input solution and wireless connectivity so badly. Yes, this is AR and yes, it is not here yet, but this creates the impression that the technology to put me in a game and let me interact with it with my hands and gestures, without being wired to a computer limiting my movements, is within reach. That makes the Rift look like old technology. And come to think of it, it is old technology. It is mildly evolutionary, obvious improvements over something an enthusiast and a software wizard could put together with duct tape and custom software 2-3 years ago. After the Facebook deal and the resources it brought with it, I think it's natural to expect more from Oculus VR. The HoloLens presentation just makes it worse.

    I really hope that Gear VR and Crescent Bay merge into a single product with onboard cams. The funny thing is, apart from the display used, this looks just like what I'd expect the CV1 to look like, with multiple cameras on both sides and a nice, thin design with probably a good weight distribution. I'm even more curious now about what we'll see at GDC from Oculus VR.
    Design with input solution, unifying mobile and PC product lines
    the input solution that could have been
    ideas
    Revolutionize the way we interact with...
    Change the world...
    Community...
  • Pyry Posts: 30
    A low FOV is consistent with it being a holographic waveguide. The other tell-tale sign would be if there's significant color fringing.

    Another interesting quote out of the Gizmodo article:
    But that's when I noticed that I wasn't just looking at some ghostly transparent representation of Mars superimposed on my vision. I was standing in a room filled with objects. Posters covering the walls. And yet somehow—without blocking my vision—the Hololens was making those objects almost totally invisible.

    So it appears that it is indeed able to actually occlude the incoming light and not merely add to it. If you look at the pictures of the device, it seems to have two screens: each eye has a small flat glass plate near it, which I think are the holographic waveguides that actually produce the image. But then there's also this big wrap-around screen further out. My guess is that this wrap-around screen is actually an LCD shutter that is used to selectively block out incoming light.
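    If that guess is right, what the eye ends up seeing would be roughly a per-pixel attenuation of the room plus the additive hologram. A purely speculative sketch of that model, with assumed names:

    import numpy as np

    def perceived_image(real_world, hologram, shutter):
        """Speculative shutter-plus-additive model: an outer LCD layer
        attenuates the incoming scene per pixel (0 = blocked, 1 = clear)
        and the waveguide adds the hologram on top.
        real_world, hologram: H x W x 3 arrays in [0, 1]
        shutter:              H x W x 1 (or H x W x 3) array in [0, 1]"""
        return np.clip(shutter * real_world + hologram, 0.0, 1.0)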