Category: Mixed Reality

Holo-Tilt

Tilt Brush

I got an opportunity to try out ‘Tilt Brush’, a room-scale VR 3D drawing app. We had an HTC Vive set up and got to basically paint life-size 3D brush strokes. Two hand-held controllers are used to manipulate the palette and brush.

Two base stations are positioned on either side of the designated ‘paint area’. It was fairly amazing to be immersed in the environment, painting in real time, and to be able to record, import other 3D objects and walk through what you had just made.

HoloLens

Microsoft’s HoloLens is the first self-contained holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you. You experience Mixed Reality by wearing the HoloLens headset: whilst you are wearing it, holograms are superimposed onto whatever environment you are in and are controlled by subtle hand gestures.

Note to Self:

  • Tilt Brush was great; the only thing is that the aesthetic it creates is very obviously ‘created by Tilt Brush’! If one can get away from this, it is an amazing tool.
  • I was a bit disappointed by the HoloLens; I found the hand gestures a bit clunky. I’m sure version 2.0 will be better.

Man with a 360 Camera

Whilst investigating narrative possibilities within mixed reality, immersive environments and 360 spherical films, I can’t help but see the similarities with early 20th century cinema. With this nascent spherical medium, people are experimenting and attempting to work out possible ways of storytelling. Chris Milk, CEO of VR studio Within (formerly VRSE), has likened these first steps into the 360 environment to the Lumière brothers’ first motion picture screenings in 1895: when their short piece ‘Arrival of a Train’ was shown, people in the audience reportedly thought the train was coming towards them and ran out of the cinema.

Milk includes a homage to this in one of his first VR pieces, a showcase of sorts called ‘Evolution of Verse’, in which a train comes headlong at the viewer. The standout of the piece, though, is the foetus reaching out to touch you.

Early Russian cinema was instrumental in pushing the envelope of this new 20th century medium, and Dziga Vertov’s silent documentary Man with a Movie Camera (1929) was central to that effort. Called the ‘greatest documentary ever made’, it has no story and no actors.

[Image: soviet-1]

From dawn to dusk, Soviet citizens are shown at work and at play, interacting with the machinery of modern life. The film is primarily famous for the range of cinematic techniques Vertov invents: double exposure, fast motion, slow motion, freeze frames, jump cuts, split screens, Dutch angles, extreme close-ups, tracking shots, backwards footage, stop-motion animation and self-reflexive visuals (at one point it features a split-screen tracking shot in which the two sides have opposite Dutch angles).

[Image: soviet-2]

Vertov and his wife Elizaveta Svilova were members of a collective of Soviet filmmakers in 1920s Russia called the Kinoks (‘cinema-eyes’). From 1922 to 1923 they published a number of manifestos in avant-garde journals which clarified the Kinoks’ position. The Kinoks rejected ‘staged’ cinema with its stars, plots, props and studio shooting.

Vertov, ‘Kinoks: A Revolution’:

You–filmmakers, you directors and artists with nothing to do,
all you confused cameramen and writers scattered throughout the
world…..
A friendly warning:
Don’t hide your heads like ostriches.
Raise your eyes,
Look around you–
There!
It’s obvious to me
as to any child
The innards,
the guts of strong sensations
are tumbling out
of cinema’s belly,
ripped open on the reef of revolution.
See them dragging along,
leaving a bloody trail on the earth
that quivers with horror and disgust.
It’s all over.

Kino-Eye, p. 13

Another interesting early 20th century filmmaker/theorist was again a Russian, Lev Kuleshov, who carried out what is now called the Kuleshov experiment. It demonstrated a film editing (montage) effect by which viewers derive more meaning from the interaction of two sequential shots than from a single shot in isolation.

[Image: kuleshov-effect]

Kuleshov edited a short film in which a shot of the expressionless face of Tsarist matinee idol Ivan Mosjoukine was alternated with various other shots (a plate of soup, a girl in a coffin, a woman on a divan). The film was shown to an audience who believed that the expression on Mosjoukine’s face was different each time he appeared, depending on whether he was “looking at” the plate of soup, the girl in the coffin, or the woman on the divan, showing an expression of hunger, grief or desire, respectively. The footage of Mosjoukine was actually the same shot each time.

As this seems to have turned into a piece on early 20th century Russian cinema, I may as well mention Sergei Eisenstein’s 1925 classic Battleship Potemkin and its famous Odessa Steps sequence, featured below.

Resulting

After recent experiments with 360 video I have worked out a quick methodology, so that I can work fast without coming back a few months later and asking ‘how the hell did I do that?’

[Image: methology-1]

So the basic rule of thumb is to take the 360 source video or stills from the iPhone directly to the laptop, and NOT from the 360 camera directly. The image should be in equirectangular format and look like the above. This is Monoscopic 360 video, not the Stereoscopic 3D used in more high-end 360 cameras. A standard 360 video is just a flat equirectangular video displayed on a sphere. Think of Monoscopic as the face of a world map wrapped around a globe, whereas Stereoscopic 3D can add another level of immersion by adding depth between the foreground and the background: Stereoscopic means you have two copies of the view, one for the left eye and one for the right eye.
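To get my head around why a flat equirectangular frame plays back as a sphere, here is a minimal sketch (my own illustration, not part of the actual workflow; the function name and frame size are made up for the example) that maps an equirectangular pixel to a viewing direction on the unit sphere around the viewer: the pixel column becomes longitude and the pixel row becomes latitude.

    import math

    def equirect_pixel_to_direction(px, py, width, height):
        # Normalise the pixel position to 0..1 across and down the frame.
        u = (px + 0.5) / width
        v = (py + 0.5) / height
        # Column -> longitude (-pi..pi), row -> latitude (+pi/2 at top, -pi/2 at bottom).
        lon = (u - 0.5) * 2.0 * math.pi
        lat = (0.5 - v) * math.pi
        # Convert to a unit direction vector around the viewer.
        x = math.cos(lat) * math.sin(lon)
        y = math.sin(lat)
        z = math.cos(lat) * math.cos(lon)
        return (x, y, z)

    # The centre of a 3840x1920 frame (note the 2:1 shape typical of Monoscopic
    # equirectangular video) points roughly straight ahead, i.e. about (0, 0, 1).
    print(equirect_pixel_to_direction(1920, 960, 3840, 1920))

A Stereoscopic 3D file would simply carry two of these equirectangular images, one per eye, usually stacked top/bottom or side by side.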

[Image: methology-2]

The image should NOT look like the above image; if it does, it won’t look correct when played via a VR app in the Google Cardboard headset. The spatial injector metadata only needs to be applied if the video is to be uploaded to YouTube.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera, plus a rotating Blender 3D object, both imported into Final Cut Pro X and exported as an MP4.

The exported MP4 is then uploaded back to the iPhone and opened in a free VR app. One that’s been working well for me is the HOMiDO player: you simply open the app, hit ‘video player’, click on the folder icon, find your MP4 on the iPhone, then click ‘choose’ and it compresses the video for 360 playback. Slot the iPhone into your VR headset and Bob’s your mother’s brother, you are immersed inside the video.

Note to Self:

  • Strange to see a 3D scan from Blender rotating inside my studio.
  • Need to source some good 360 source material, maybe from some woods/forest?
  • The Ricoh Theta S video quality is very soft/low resolution, so try to get around that with lighting perhaps?
  • Do some proper stitching/editing experiments so there are no joins/lines and the imagery looks OK across the whole 360 sphere.

Mixed Quasar

An experiment/trial-and-error investigation into making a video more immersive. Here’s the original regular MP4 video below:

and here is the 360 version below, with the ‘Spatial Media Metadata Injector’ metadata added to make it play as 360.
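As a reminder to myself, here is a minimal sketch of how that injection step could be scripted, assuming the command-line version of Google’s Spatial Media Metadata Injector is available (it is normally run from a clone of the spatial-media repository, and the quasar file names here are just placeholders):

    import subprocess

    def inject_360_metadata(src_mp4, dst_mp4):
        # Wraps the injector's documented CLI form: python spatialmedia -i in.mp4 out.mp4
        # As noted earlier, this is only needed if the file is destined for YouTube.
        subprocess.run(["python", "spatialmedia", "-i", src_mp4, dst_mp4], check=True)

    inject_360_metadata("quasar_flat.mp4", "quasar_360.mp4")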

Drag the video above to see it in 360 (view in the Chrome browser). It appears very messy, low resolution and nausea inducing; as the footage was rotating already, a more regular video may be a better example. I also haven’t bothered to edit/stitch the join.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 still .jpg taken from the Ricoh Theta S, plus a rotating Blender 3D object, both imported into Final Cut Pro X and exported as an MP4 with the spatial metadata injector applied; no audio.

Drag the video above to see it in 360 (view in the Chrome browser). This is footage taken by my Ricoh Theta S 360 camera while walking in a circle, plus a rotating Blender 3D object, both imported into Final Cut Pro X and exported as an MP4 with the spatial metadata injector applied; with audio. This video was sourced from my iPhone.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera while walking in a straight line, plus a rotating Blender 3D object, both imported into Final Cut Pro X and exported as an MP4 with the spatial metadata injector applied; with audio. The screen is split into two spheres. It will be interesting to see which of these play as 360 in an app on the Cardboard viewer. This video was sourced from the camera.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera while walking in a straight line, plus a rotating Blender 3D object, both imported into Final Cut Pro X and exported as an MP4 with the spatial metadata injector applied; with audio. This video was sourced from my iPhone.

Note to Self:

  • I will now try to view the various files in Google Cardboard via a few VR apps to see which work and how immersive they are.
  • The aesthetic is not important here, nor is the fact that the footage isn’t stitched properly; it’s just a series of quick tests and a reference for myself, to see how the imported source material and exported files work best, so as to create a quick working methodology.