Tag: 360

Sandbox 2

Had real issues with the build version of the Unity piece I’m doing: it looked perfect in the Unity player, but in the build it wasn’t rendered properly. I had upgraded to a new version, so I think that may have broken some settings. So basically I had to redo the whole thing.

I’ll finish this off this week now, as the listener music is starting to drive me crazy.

Note to Self:

  • A few forum posts actually taught me a lot about output settings.
  • The aesthetic is a tad too ‘gamer’ for my liking, but with my next piece I’ll hopefully combine 360 footage to make it more cinematic-looking: maybe grayscale/B+W, floating cameras, atmospheric with lots of fog, etc.

360 Multi-verse

Firstly, let’s differentiate between Full VR and 360/VR video.
With Full VR:

  • Each moment is generated live
  • The user interacts to control what happens

With 360 Video:

  • It’s a single experience rendered in advance
  • The viewer chooses where to look

In traditional film editing, we think in terms of frames.

“In a 360 environment a ‘frame’ is a relative window of experience derived from the visitor’s field of vision. This makes everything a potential frame, but also makes a premeditated frame based on my own interests presumptuous and, well, wrong most of the time.” [1]

These visuals below are more reflective of the spatial reality of the medium, more apt to its multi-verse tendencies, where every path exists simultaneously: worlds of experience extending from one another, much like ripples in a pond or rings in the trunk of a tree.

points-1

We need to identify the potential experiences in each world, evaluate the probability that they will occur, and take into account how a visitor might engage with them; I could then identify possible paths. I could rotate these worlds around each other, using the most probable potential experiences to guide someone through.

points-2

Then perhaps I could work backwards through these layers of experience, taking the insights that editing these worlds provides and using them to help shape the creation of these worlds from the start.
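
As a rough sketch of this bookkeeping (not a finished method; all world names and probabilities below are hypothetical illustrations), each world could hold its potential experiences with estimated probabilities, and a path could be drawn by following the most probable experience from world to world:

    # Hypothetical worlds, each listing (experience, probability, next world).
    # None as the next world marks the end of a path.
    worlds = {
        "forest": [("deer appears", 0.7, "clearing"), ("nothing happens", 0.3, None)],
        "clearing": [("fog rolls in", 0.6, "ruin"), ("birdsong", 0.4, None)],
        "ruin": [("figure emerges", 0.8, None), ("silence", 0.2, None)],
    }

    def most_probable_path(start, worlds):
        """Follow the most probable potential experience world by world."""
        path, current = [], start
        while current is not None:
            experience, prob, nxt = max(worlds[current], key=lambda e: e[1])
            path.append((current, experience, prob))
            current = nxt
        return path

    for world, experience, prob in most_probable_path("forest", worlds):
        print(f"{world}: {experience} (p={prob})")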

“…there needs to be the existence of a unique link between the mind of the creator and the mind of the visitor. It appears to be very specific to this medium and something that could have never existed until presence became a factor.” [2]

Note to Self:

  • Work out a DIY method of writing a 3D treatment/screenplay; current methods will not suffice.
  • To capture proper 360 green-screen film footage of actors, a curved circular cyc green screen would have to be constructed, with maybe spot lighting from above?

Man with a 360 Camera

Whilst investigating narrative possibilities within mixed reality, immersive environments and 360 spherical films, I can’t help but see the similarities with early 20th-century cinema. With this nascent spherical medium, people are experimenting and attempting to work out possible ways of storytelling. Chris Milk, CEO of VR studio Within (formerly VRSE), has likened these first steps into the 360 environment to the Lumière brothers’ first motion picture screening in 1895, where, watching their short piece ‘Arrival of a Train’, people in the audience thought the train was coming towards them and ran out of the cinema.

Milk includes a homage to this in one of his first VR pieces, a showcase of sorts called ‘Evolution of Verse’, where a train comes headlong at the viewer. The standout of that piece, though, is the foetus reaching out to touch you.

Early Russian cinema was instrumental in pushing the envelope of the new 20th-century medium, and Dziga Vertov’s silent documentary Man with a Movie Camera (1929) was central to that effort. Called the ‘greatest documentary ever made’, it has no story and no actors.

soviet-1

From dawn to dusk, Soviet citizens are shown at work and at play, interacting with the machinery of modern life. The film is primarily famous for the range of cinematic techniques Vertov deploys: double exposure, fast motion, slow motion, freeze frames, jump cuts, split screens, Dutch angles, extreme close-ups, tracking shots, backwards footage, stop-motion animation and self-reflexive visuals (at one point it features a split-screen tracking shot in which the two sides have opposite Dutch angles).

soviet-2

Vertov and his wife Elizaveta Svilova were members of a collective of Soviet filmmakers in 1920s Russia called the Kinoks (‘cinema-eyes’). From 1922 to 1923 they published a number of manifestos in avant-garde journals which clarified the Kinoks’ position. The Kinoks rejected “staged” cinema with its stars, plots, props and studio shooting.

Vertov, Kinoks: A Revolution:

You–filmmakers, you directors and artists with nothing to do,
all you confused cameramen and writers scattered throughout the
world…
A friendly warning:
Don’t hide your heads like ostriches.
Raise your eyes,
Look around you–
There!
It’s obvious to me
as to any child
The innards,
the guts of strong sensations
are tumbling out
of cinema’s belly,
ripped open on the reef of revolution.
See them dragging along,
leaving a bloody trail on the earth
that quivers with horror and disgust.
It’s all over.

Kino-Eye, p. 13

Another interesting early 20th-century filmmaker/theorist was again a Russian, Lev Kuleshov, who carried out what is now known as the Kuleshov experiment. It demonstrated a film editing (montage) effect by which viewers derive more meaning from the interaction of two sequential shots than from a single shot in isolation.

kuleshov-effect

Kuleshov edited a short film in which a shot of the expressionless face of Tsarist matinee idol Ivan Mosjoukine was alternated with various other shots (a plate of soup, a girl in a coffin, a woman on a divan). The film was shown to an audience who believed that the expression on Mosjoukine’s face was different each time he appeared, depending on whether he was “looking at” the plate of soup, the girl in the coffin, or the woman on the divan, showing an expression of hunger, grief or desire, respectively. The footage of Mosjoukine was actually the same shot each time.

As this seems to have turned into a piece on early 20th-century Russian cinema, I may as well mention Sergei Eisenstein’s 1925 classic Battleship Potemkin, whose famous Odessa Steps sequence is featured below.

Resulting

After recent experiments with 360 video, I have worked out a quick methodology so I can work fast without coming back a few months later and asking: how the hell did I do that?

methodology-1

So basically the rule of thumb is: take the 360 source video or stills from the iPhone directly to the laptop, and NOT from the 360 camera directly. The image should be in equirectangular format and look like the one above. This is monoscopic 360 video, not the stereoscopic 3D used in more high-end 360 cameras. A standard 360 video is just a flat equirectangular video displayed on a sphere. Think of monoscopic as the face of a world map wrapped onto a globe, whereas stereoscopic 3D can add another level of immersion by adding depth between the foreground and background. Stereoscopic means you have two copies, one for the left eye and one for the right eye.
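
To make the “flat map on a sphere” idea concrete, here is a small sketch (my own illustration, not any particular player’s code) of how each pixel in a monoscopic equirectangular frame corresponds to a direction on the viewing sphere; a player effectively runs this mapping in reverse for every pixel of the headset view:

    import math

    def equirect_pixel_to_direction(x, y, width, height):
        """Map a pixel in an equirectangular frame to a unit direction
        on the viewing sphere: x spans 360 degrees of longitude,
        y spans 180 degrees of latitude."""
        lon = (x / width) * 2.0 * math.pi - math.pi   # -pi (left) .. +pi (right)
        lat = math.pi / 2.0 - (y / height) * math.pi  # +pi/2 (up) .. -pi/2 (down)
        return (math.cos(lat) * math.sin(lon),        # x: right
                math.sin(lat),                        # y: up
                math.cos(lat) * math.cos(lon))        # z: forward

    # The centre of a 3840x1920 (2:1) frame looks straight ahead:
    print(equirect_pixel_to_direction(1920, 960, 3840, 1920))  # ~(0.0, 0.0, 1.0)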

methodology-2

The image should NOT look like the one above; if it does, it won’t display correctly when played via a VR app in the Google Cardboard headset. The Spatial Media Metadata Injector code only needs to be applied if the video is to be uploaded to YouTube.
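
For the YouTube case, the injection itself can be scripted. A minimal sketch, assuming a local checkout of Google’s open-source spatial-media tool (the file names here are hypothetical):

    import subprocess

    # Run Google's Spatial Media Metadata Injector from a checkout of
    # github.com/google/spatial-media; -i injects spherical (360) metadata
    # into a copy of the video without re-encoding it.
    subprocess.run(
        ["python", "spatialmedia", "-i", "studio_360.mp4", "studio_360_injected.mp4"],
        check=True,
    )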

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera, combined with a rotating Blender 3D object; both were imported into Final Cut Pro X and exported as an MP4.
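
For the Blender half of that recipe, the rotating object can be rendered out as equirectangular frames before the Final Cut Pro X composite. A minimal sketch, assuming an older Blender build where the Cycles panorama settings live on the camera’s cycles properties (the object name and output path are hypothetical):

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'                  # equirectangular needs Cycles

    cam = scene.camera.data
    cam.type = 'PANO'                               # panoramic camera
    cam.cycles.panorama_type = 'EQUIRECTANGULAR'    # full 360 x 180 output

    scene.render.resolution_x = 3840                # 2:1 equirectangular frame
    scene.render.resolution_y = 1920

    obj = bpy.data.objects['ScanObject']            # hypothetical object name
    obj.rotation_euler = (0.0, 0.0, 0.0)
    obj.keyframe_insert(data_path="rotation_euler", frame=1)
    obj.rotation_euler = (0.0, 0.0, 6.2832)         # one full turn (2*pi radians)
    obj.keyframe_insert(data_path="rotation_euler", frame=250)

    scene.render.filepath = '/tmp/quasar_360_'      # hypothetical output path
    bpy.ops.render.render(animation=True)           # write the frame sequence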

The exported MP4 is then uploaded back to the iPhone and opened in a free VR app. One that’s been working well for me is the HOMiDO player: you simply open the app, hit ‘video player’, click on the folder icon, find your MP4 on the iPhone, then click ‘choose’, and it compresses the video for 360 play. Slot the iPhone into your VR headset and Bob’s your mother’s brother, you are immersed inside the video.
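
One quick sanity check before copying files across: a monoscopic equirectangular export should be exactly twice as wide as it is tall. A small sketch using ffprobe from FFmpeg (the file name is hypothetical):

    import subprocess

    def is_equirectangular_2to1(path):
        """Return True if the video's first stream has a 2:1 aspect ratio,
        as a monoscopic equirectangular export should."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error", "-select_streams", "v:0",
            "-show_entries", "stream=width,height",
            "-of", "csv=p=0", path,
        ]).decode().strip()
        width, height = (int(v) for v in out.split(","))
        return width == 2 * height

    print(is_equirectangular_2to1("studio_360.mp4"))  # hypothetical file name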

Note to Self:

  • Strange to see a 3D scan from Blender rotating inside my studio.
  • Need to source some good 360 source material, maybe from some woods/forest?
  • The Ricoh Theta S video quality is very soft/low quality, so try to get around that, perhaps with lighting?
  • Do some proper stitching/editing experiments so there are no joins/lines and the imagery looks OK across the 360 sphere.

Mixed Quasar

An experiment/trial-and-error investigation into making a video more immersive. Here’s the original regular MP4 video below:

and here’s the 360 version below, with the ‘Spatial Media Metadata Injector’ code added to make it 360.

Drag the video above to see it in 360 (view in the Chrome browser). It appears very messy, low-resolution and nausea-inducing; as it was rotating already, a more regular video may be a better example. I also haven’t bothered to edit/stitch the join.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 still .jpg taken from the Ricoh Theta S, combined with a rotating Blender 3D object; both were imported into Final Cut Pro X, exported as an MP4 and the spatial metadata injector code applied; no audio.

Drag the video above to see it in 360 (view in the Chrome browser). This is taken by my Ricoh Theta S 360 camera while walking in a circle, combined with a rotating Blender 3D object; both were imported into Final Cut Pro X, exported as an MP4 and the spatial metadata injector code applied; with audio. This video was sourced from my iPhone.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera while walking in a straight line, combined with a rotating Blender 3D object; both were imported into Final Cut Pro X, exported as an MP4 and the spatial metadata injector code applied; with audio. The screen is split into two spheres. It will be interesting to see which of these play as 360 in an app on the Cardboard viewer. This video was sourced from the camera.

Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken by my Ricoh Theta S 360 camera while walking in a straight line, combined with a rotating Blender 3D object; both were imported into Final Cut Pro X, exported as an MP4 and the spatial metadata injector code applied; with audio. This video was sourced from my iPhone.

Note to Self:

  • I will now try to view the various files in Google Cardboard via a few VR apps, to see which work and how immersive they are.
  • The aesthetic is not important here, nor is the fact that the footage isn’t stitched properly; it’s just a series of quick tests and a reference for myself, to see how the imported source material and exported files work best, so as to create a quick working methodology.