I got an opportunity to try out ‘Tilt Brush’, a room-scale VR 3D drawing app. We had an HTC Vive set up and got to paint life-size 3D brush strokes. Two hand-held controllers are used to manipulate the palette and brush.
Two base stations are positioned on either side of the designated ‘paint area’. It was fairly amazing to be immersed in the environment, painting in real time, and to be able to record, import other 3D objects and walk through what you had just made.
Microsoft’s HoloLens is the first self-contained holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you. You experience Mixed Reality by wearing the HoloLens headset. While wearing it, holograms are superimposed onto whatever environment you are in and controlled by subtle hand gestures.
Note to Self:
- Tilt Brush was great; the only caveat is that the aesthetic it creates is very obviously ‘created by Tilt Brush‘! If one can get away from this, it is an amazing tool.
- I was a bit disappointed by the HoloLens; I found the hand gestures a bit clunky. I’m sure version 2.0 will be better.
After recent experiments with 360 video I have worked out a quick methodology, so I can work fast without coming back a few months later and asking, how the hell did I do that?
So the basic rule of thumb is: take the 360 source video or stills to the laptop from the iPhone, NOT directly from the 360 camera. The image should be in equirectangular format and look like the above. This is monoscopic 360 video, not the stereoscopic 3D used in more high-end 360 cameras. A standard 360 video is just a flat equirectangular video displayed on a sphere. Think of monoscopic as the face of a world map wrapped onto a globe, whereas stereoscopic 3D can add another level of immersion by adding depth between the foreground and background. Stereoscopic means you have two copies, one for the left eye and one for the right eye.
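To make the equirectangular idea concrete, here is a minimal Python sketch (the function name is my own, not from any tool mentioned here) of how a pixel in a flat equirectangular frame maps to a direction on the viewing sphere: the full width covers 360° of longitude and the full height covers 180° of latitude.

```python
def equirect_to_sphere(x, y, width, height):
    """Map a pixel (x, y) in an equirectangular frame to
    (longitude, latitude) in degrees. Longitude runs -180..180
    across the width; latitude runs 90 (top) to -90 (bottom)."""
    lon = (x / width) * 360.0 - 180.0
    lat = 90.0 - (y / height) * 180.0
    return lon, lat

# The centre pixel of a 1920x1080 frame faces straight ahead:
print(equirect_to_sphere(960, 540, 1920, 1080))  # (0.0, 0.0)
```

This is why the flat file looks stretched at the top and bottom: rows near the poles of the sphere get the same number of pixels as the equator.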
The image should NOT look like the above image; if it does, it won’t look correct when played via a VR app in the Google Cardboard headset. The spatial metadata injector only needs to be applied if the video is to be uploaded to YouTube.
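For reference, that injection step is commonly done with Google’s Spatial Media Metadata Injector (the `spatialmedia` tool from their spatial-media repository); a sketch of the command-line usage, with placeholder file names, looks like:

```shell
# Inject spherical (360) metadata so YouTube recognises the upload
# as a 360 video. File names here are placeholders.
python spatialmedia -i my_360_video.mp4 my_360_video_injected.mp4
```

Upload the injected copy to YouTube; the un-injected original is the one to use for local VR playback.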
Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken with my Ricoh Theta S 360 camera, plus a rotating 3D object from Blender, both imported into Final Cut Pro X and exported as an MP4.
The exported MP4 is then transferred back to the iPhone and opened in a free VR app. One that’s been working well for me is the HOMiDO player: open the app, hit ‘video player’, click the folder icon, find your MP4 on the iPhone, then click ‘choose’ and it compresses the video for 360 playback. Slot the iPhone into your VR headset and Bob’s your mother’s brother, you are immersed inside the video.
Note to Self:
- Strange to see a 3D scan from Blender rotating inside my studio.
- Need to source some good 360 source material, maybe from some woods/forest?
- The Ricoh Theta S video quality is very soft/low quality, so try to work around that with lighting perhaps?
- Do some proper stitching/editing experiments so there are no joins/lines and the imagery looks OK across the 360 sphere.
I was in London for a few days so popped down to a UAL-affiliated VR/MR maker day at Chelsea College of Art.
It was fairly informal, with various desks/sections featuring Arduino, Leap Motion, Virtual Reality etc. I was interested in getting an overview of how one sets up a virtual reality environment in the 3D game engine Unity: creating a terrain, working with colliders, prefabs etc.
I also got a demonstration of the Ricoh Theta S, a 360°/spherical camera for grabbing live 360 video or stills. Overall a worthwhile and enlightening day.