I stole this explanation from a Facebook group member:
For a brief explanation: your eyes have 2 main ways to optically judge the distance of things. One is stereoscopic vision: the difference in perspective between your two eyes. The second is focus: how blurred objects are in front of and behind the object you're looking at. Currently, all VR and AR headsets render everything at a fixed focal distance of about 2.5m, no matter whether virtual objects are close or far away.
With varifocal displays, objects can be rendered on different focal planes, coming closer to a more realistic and likely more comfortable viewing experience.
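A rough way to quantify the problem that varifocal displays address: the eye's focus demand is measured in diopters (1/distance in meters), so a headset with a fixed focal plane at 2.5m (the figure quoted above) creates a mismatch between where an object *appears* to be and where the eye must *focus*. The sketch below is my own illustration of that arithmetic, not anything from the article; the 2.5m constant is the only number taken from the text.

```python
# Illustration (not from the article): vergence-accommodation mismatch
# for a headset whose optics are fixed at a single focal plane.

FIXED_FOCAL_PLANE_M = 2.5  # fixed focal distance quoted in the text

def diopters(distance_m: float) -> float:
    """Convert a viewing distance in meters to diopters (1/m)."""
    return 1.0 / distance_m

def focus_mismatch(object_distance_m: float,
                   focal_plane_m: float = FIXED_FOCAL_PLANE_M) -> float:
    """Mismatch, in diopters, between the distance at which a virtual
    object appears and the display's fixed focal plane.

    Zero means the eye can focus exactly where the object seems to be;
    larger values mean a stronger conflict, which is what varifocal
    optics aim to eliminate by moving the focal plane per object."""
    return abs(diopters(object_distance_m) - diopters(focal_plane_m))

for d in (0.3, 0.5, 1.0, 2.5, 10.0):
    print(f"object at {d:>4} m -> mismatch {focus_mismatch(d):.2f} D")
```

Note how the mismatch is worst for close-up objects (about 2.9 D at 0.3m) and vanishes only at the 2.5m plane itself, which is why near-field interactions like reading virtual text in your hand are the hardest case for fixed-focus headsets.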
Facebook Reality Labs, the company’s R&D department, previously
revealed its ‘Half Dome’ prototype headsets which demonstrated
functional varifocal optics small enough for a consumer VR headset. At a
conference earlier this year, the Lab’s Director of Display Systems
Research said the latest system is “almost ready for primetime,” and
also detailed the Lab’s research into HDR (high dynamic range) and
pupil-steering displays for XR headsets.
So it looks like Half Dome is finally going to see the light of day?