Daydream View phone VR headset

I somehow ended up with a $100 credit at the Google Store, and decided to splurge on something I didn’t need but wanted — the Daydream View phone VR headset.

This is basically one step up from the old Google Cardboard viewer where you put your phone into a lens/goggles getup, but it actually straps to your head so you don’t have to hold it to your face. It’s also got a wireless controller instead of just a single clicky button: it acts as a basic pointer within the VR world, with selection & home buttons and even a little touchpad, allowing a fair range of controls within VR apps.

Using the controller, the Daydream launcher also provides an in-VR UI for browsing the Google Play Store and purchasing new games/apps. You can even broadcast a 2d view to a Chromecast device to share your fun with the rest of the room, though there’s no apparent way to save a recording.

The hardware is still pretty limiting compared to what PC VR headsets like the Rift and Vive can do — there’s no positional tracking for either your head or the controller, so you’re limited to rotation only. This means the scene doesn’t shift with parallax as you move your head around, and the controller can only point at things, not pick up VR objects in 3d space. And it’s still running on a phone (Pixel 2 in my case), so you’re not going to get the richest graphics — most apps I’ve tried so far have limited, cartoony graphics.
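To make that limitation concrete, here’s a rough sketch (Python/NumPy; the names and structure are mine, not any actual VR SDK’s API) of the difference between a rotation-only view matrix and one that also uses head position:

```python
import numpy as np

def view_matrix_3dof(head_rotation):
    """World-to-eye view matrix from orientation only (rotation-only tracking,
    like a phone headset): head position is ignored, so leaning or stepping
    sideways changes nothing in the rendered view."""
    view = np.eye(4)
    view[:3, :3] = head_rotation.T  # inverse of a rotation matrix is its transpose
    return view

def view_matrix_6dof(head_rotation, head_position):
    """World-to-eye view matrix from orientation *and* position (full 6DoF
    tracking, like the Rift/Vive): translating your head shifts the scene,
    which is what produces parallax."""
    view = np.eye(4)
    view[:3, :3] = head_rotation.T
    view[:3, 3] = -head_rotation.T @ head_position
    return view
```

In the rotation-only version the translation column stays zero no matter where your head actually is, which is exactly why nearby objects don’t slide past distant ones when you lean around.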

Still, it’s pretty fun to play with; I’ll probably play through a few of these games and then put it away mostly forever, unless I figure out some kind of Wikipedia-related VR side project to work on. :)

Light field rendering: a quick diagram

Been reading about light field rendering for free movement around 3d scenes captured as images. Made some diagrams to help it make sense to me, as one does…
 
There’s some work going on to add viewing of 3d models to Wikipedia & Wikimedia Commons — which is pretty rad! — but geometric meshes are hard to home-scan, and don’t represent texture and lighting well, whereas light fields capture this stuff fantastically. Might be interesting some day to play with a light field scene viewer that provides parallax as you rotate your phone, or provides a 3d see-through window in VR views. The real question is whether the _scanning_ of scenes can be ‘democratized’ using a phone or tablet as a moving camera, combined with the spatial tracking that’s used for AR games to position the captures in 3d space…
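Roughly, the idea as I understand it: you capture a grid of images from slightly different camera positions, and a new viewpoint in between them can then be synthesized by blending rays from the nearest captured views. Here’s a toy sketch in Python/NumPy of the very simplest version of that — all names are mine, and it cheats by blending whole images rather than reprojecting individual rays, which is only correct for scene content at the focal plane:

```python
import numpy as np

def render_view(light_field, u, v):
    """Very simplified light field lookup: blend the four captured images
    whose camera positions surround (u, v) on the camera plane.

    light_field: array of shape (nu, nv, H, W) or (nu, nv, H, W, 3) --
                 images captured on a regular grid of camera positions.
    u, v:        continuous position of the virtual camera, in grid units
                 (0 <= u <= nu-1, 0 <= v <= nv-1).
    """
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    u1 = min(u0 + 1, light_field.shape[0] - 1)
    v1 = min(v0 + 1, light_field.shape[1] - 1)
    fu, fv = u - u0, v - v0

    # Bilinear weights over the four surrounding captured viewpoints.
    return ((1 - fu) * (1 - fv) * light_field[u0, v0]
            + fu * (1 - fv) * light_field[u1, v0]
            + (1 - fu) * fv * light_field[u0, v1]
            + fu * fv * light_field[u1, v1])
```

A real light field renderer would intersect each output ray with both the camera plane and the image plane per pixel before interpolating, but the blend-between-nearby-captures idea is the core of it.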
 
Someday! No time for more than idle research right now. ;)
Here, enjoy the diagrams.