Category: 360

Sandbox 2

Had real issues with the build version of the Unity piece I’m doing: it looked perfect in the Unity player, but in the build it wasn’t rendered properly. I had upgraded to a new version, so maybe that broke some settings. So basically I had to redo the whole thing.

I’ll finish this off this week now, as the listener music is starting to drive me crazy.

Note to Self:

  • A few forum posts actually taught me a lot about output settings.
  • The aesthetic is a tad too ‘gamer’ for my liking, but with my next piece I’ll hopefully combine 360 footage to make it more cinematic looking: maybe grayscale/B+W, floating cameras, atmospheric with lots of fog etc.


Been making a back-up video in case there are any problems with the Vive installation. Some short snippets below.

Note to Self:

  • Nice atmospheric immersive quality to these snippets that gets lost when exported.
  • The tech isn’t there yet to export these Tilt Brush video captures and then import them into Unity as 360 immersive sequences.
  • Was going to import the video and embed it on a few walls in Unity, but there’s a security bug with QT on PC, so decided to leave it alone and use still images instead; might have been too busy anyway.


“The spirit or divine power presiding over a thing or place.
A spiritual force or influence often identified with a natural object, phenomenon, or place.”


I have returned to some 360 video pieces I was making in November 2016 of places that held a significance for me, but which I abandoned because I was frustrated with the quality of the final video. I was using a Ricoh Theta S 360 camera; the still images are excellent, but the video footage is very soft. In retrospect it may just be that when added to YouTube with the 360 meta tag the quality deteriorates because of the added YouTube compression, but even in the Homido player app I have been using on my iPhone the quality looks muddy. In a nutshell, I need a 4K 360 camera. I have been looking at the Vuze camera, due to ship next month, but even the examples shown online look kind of soft too. Sigh. Another route would be to use the Unity gaming engine, but I’m after realistic footage, not a computer-game-type aesthetic.

“The most beautiful thing we can experience is the mysterious. It is the source of all true art and science” – Albert Einstein


“There is no such thing as an environment: wherever we look for it we find all kinds of objects – biomes, ecosystems, hedges, gutters and human flesh. In a similar sense there is no such thing as Nature. I’ve seen penguins, plutonium, pollution and pollen but I’ve never seen Nature.” – Timothy Morton, Realist Magic: Objects, Ontology, Causality.

How did numen, a Latin term meaning “nod of the head”, come to be associated with spiritual power? The answer lies in the fact that the ancient Romans saw divine force and power operating in the inanimate objects and nonhuman phenomena around them. They believed that the gods had the power to command events and to consent to actions, and the idea of a god nodding suggested his or her awesome abilities: divine power.

Eventually, Latin speakers began using numen to describe the special divine force of any object, place, or phenomenon that inspired awe (a mystical-seeming wooded grove, for example, or the movement of the sun), and numen made the semantic leap from “nod” to “divine will or power”.

Jumping from the Romans to the Greeks… the Platonic solids. The five special polyhedra are the tetrahedron, the cube, the octahedron, the icosahedron, and the dodecahedron, but I opted to use some low-poly 3D shapes instead, namely a Cone, Sphere and Torus. Objects to thrust into the landscape.


This also brings me back to the main problem of storytelling and narrative in an immersive environment: you need some kind of device to try and direct the viewer where to go/look. The piece ‘Lost’ from 2015 uses a firefly as a device to follow, but you can look away at any moment, hence two people’s experiences of the piece will be different.


To capture green-screen footage for 360, some kind of 360 cyc like the one above would also need to be built.

Note to Self:

  • Back to the camera/drawing board. All these disparate elements really need to start coming together, too many tangents. Time wasted.
  • The Cone, Sphere and Torus may be narrative devices to compel viewers to look/jump to a certain section of the 360 round.
  • Is gaming narrative the way forward?… leading players/viewers down a set path and explaining their surroundings. But what takes the place of the gameplay, i.e. the journey through a game’s levels, in 360?
  • Does a viewer want to have agency (control), or to be a passive viewer? This is 360, NOT full VR, remember.
  • Immersive, immersive, immersive…
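The note above about using the Cone, Sphere and Torus to pull the viewer’s gaze boils down to a simple check: is a target direction currently inside the viewer’s field of view? A minimal sketch of that check, assuming the player/headset exposes a horizontal yaw angle in degrees (the function name and the 90-degree FOV are my own placeholders, not from any particular SDK):

```python
import math

def is_in_view(viewer_yaw_deg, target_yaw_deg, fov_deg=90.0):
    """Return True if a target direction on the 360 sphere falls inside
    the viewer's horizontal field of view (yaw only, pitch ignored)."""
    # Smallest signed angular difference, wrapped into [-180, 180).
    diff = (target_yaw_deg - viewer_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

# A cue (like the firefly in 'Lost') could fire only while the viewer
# is NOT already looking at the target:
print(is_in_view(0, 30))    # target inside a 90-degree FOV
print(is_in_view(0, 170))   # target behind the viewer
print(is_in_view(350, 10))  # wrap-around across 0 degrees
```

The wrap-around case matters: yaw 350° and yaw 10° are only 20° apart, which the modulo trick handles correctly.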

360 Multi-verse

Firstly, let’s differentiate between full VR and 360/VR video.
With Full VR:

  • Each moment is generated live
  • The user interacts to control what happens

With 360 Video:

  • It’s a single experience rendered in advance
  • The viewer chooses where to look
  • In traditional film editing we think in terms of frames.

    “In a 360 environment a ‘frame’ is a relative window of experience derived from the visitor’s field of vision. This makes everything a potential frame, but also makes a premeditated frame based on my own interests presumptuous and, well, wrong most of the time.” [1]

    These visuals below are more reflective of the spatial reality of the medium, more apt to its multi-verse tendencies where every path exists simultaneously. Worlds of experience extending from one another, much like ripples in a pond or rings in the trunk of a tree.


    We need to identify the potential experiences in each world, evaluate the probability that they will occur, and then take into account how a visitor might engage with them; I could then identify possible paths. I could rotate these worlds around each other, using the most probable potential experiences to guide someone through.
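The step above (list each world’s potential experiences, weigh their probability, derive a guiding path) could be prototyped as plain data before anything is built in Unity. A hypothetical sketch; the world and experience names are invented placeholders:

```python
# Each "world" maps potential experiences to an estimated probability
# that a visitor will encounter/engage with them (values are made up).
worlds = {
    "grove":  {"firefly": 0.7, "fog bank": 0.2},
    "shore":  {"torus": 0.5, "ripples": 0.4},
    "studio": {"cone": 0.8, "sphere": 0.3},
}

def most_probable_path(worlds):
    """For each world, keep only its single most likely experience,
    yielding one candidate guided path through the piece."""
    return [(world, max(exps, key=exps.get)) for world, exps in worlds.items()]

for world, experience in most_probable_path(worlds):
    print(world, "->", experience)
```

Working backwards through these layers (as described next) would then just mean re-weighting the probabilities and regenerating the path.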


    Then perhaps I could work backwards through these layers of experience, take what insights editing these worlds provide and use them to help shape the creation of these worlds from the start.

    “…there needs to be the existence of a unique link between the mind of the creator and the mind of the visitor. It appears to be very specific to this medium and something that could have never existed until presence became a factor.” [2]

    Note to Self:

    • Work out a DIY method of writing a 3D treatment/screenplay; current methods will not suffice.
    • To capture proper 360 green screen film footage of actors, a curved circular cyc green screen would have to be constructed, with maybe spot lighting from above?

    Triptych + Equirectangular


    Another early 20th-century film: Abel Gance’s 1927 film ‘Napoléon’, recently remastered, and featuring a famous triptych finale.



    This is another example of early 20th-century cinema attempting to push the envelope, and again I see similarities with what is going on now with 360 films, especially monoscopic 360 video.



    The three-part triptych screens in the final sequence remind me of the equirectangular format necessary in the 360 environment. Many innovative techniques were used to make the film, including fast cutting, extensive close-ups, hand-held camera shots, location shooting, POV (point-of-view) shots, multiple-camera set-ups, multiple exposure, superimposition, underwater camera, kaleidoscopic images, film tinting, split screen and mosaic shots.

    Man with a 360 Camera

    Whilst investigating narrative possibilities within mixed reality, immersive environments and 360 spherical films, I can’t help but see the similarities with early 20th-century cinema. With this nascent spherical medium people are experimenting and attempting to work out possible ways of storytelling. Chris Milk, CEO of VR studio Within (formerly VRSE), has likened these first steps into the 360 environment to the Lumière brothers’ first motion-picture showing in 1895 of their short piece ‘Arrival of a Train’: people in the audience thought the train was coming towards them and ran out of the cinema.

    Milk includes a homage to this in one of his first VR pieces, a showcase of sorts called ‘Evolution of Verse’, where a train comes headlong at the viewer. The standout of the piece, though, is the foetus reaching out to touch you.

    Early Russian cinema was instrumental in pushing the envelope of this new 20th-century medium, and Dziga Vertov’s silent documentary ‘Man with a Movie Camera’ (1929) was central to this. Called the ‘greatest documentary ever made’, it has no story and no actors.


    From dawn to dusk, Soviet citizens are shown at work and at play, interacting with the machinery of modern life. The film is primarily famous for the range of cinematic techniques Vertov invents: double exposure, fast motion, slow motion, freeze frames, jump cuts, split screens, Dutch angles, extreme close-ups, tracking shots, backwards footage, stop-motion animation and self-reflexive visuals (at one point it features a split-screen tracking shot where the two sides have opposite Dutch angles).


    Vertov and his wife Elizaveta Svilova were members of a collective of Soviet filmmakers in 1920s Russia called the Kinoks (‘cinema-eyes’). From 1922 to 1923 they published a number of manifestos in avant-garde journals which clarified the Kinoks’ position. The Kinoks rejected “staged” cinema with its stars, plots, props and studio shooting.

    Vertov, ‘Kinoks: A Revolution’.

    You–filmmakers, you directors and artists with nothing to do,
    all you confused cameramen and writers scattered throughout the world.
    A friendly warning:
    Don’t hide your heads like ostriches.
    Raise your eyes,
    Look around you–
    It’s obvious to me
    as to any child
    The innards,
    the guts of strong sensations
    are tumbling out
    of cinema’s belly,
    ripped open on the reef of revolution.
    See them dragging along,
    leaving a bloody trail on the earth
    that quivers with horror and disgust.
    It’s all over.

    Kino-Eye, p. 13

    Another interesting early 20th-century filmmaker/theorist was again a Russian, Lev Kuleshov, who carried out what is called the Kuleshov experiment. It demonstrates a film-editing (montage) effect by which viewers derive more meaning from the interaction of two sequential shots than from a single shot in isolation.


    Kuleshov edited a short film in which a shot of the expressionless face of Tsarist matinee idol Ivan Mosjoukine was alternated with various other shots (a plate of soup, a girl in a coffin, a woman on a divan). The film was shown to an audience who believed that the expression on Mosjoukine’s face was different each time he appeared, depending on whether he was “looking at” the plate of soup, the girl in the coffin, or the woman on the divan, showing an expression of hunger, grief or desire, respectively. The footage of Mosjoukine was actually the same shot each time.

    As this seems to have turned into a piece on early 20th-century Russian cinema, I may as well mention Sergei Eisenstein’s 1925 classic ‘Battleship Potemkin’, the famous Odessa Steps sequence of which is featured below.


    After recent experiments with 360 video I have worked out a quick methodology, so I can work quickly without coming back a few months later and saying, how the hell did I do that?


    So basically the rule of thumb is: take the 360 source video or stills from the iPhone directly to the laptop, NOT from the 360 camera directly. The image should be in equirectangular format and look like the above. This is monoscopic 360 video, not the stereoscopic 3D used in more high-end 360 cameras. A standard 360 video is just a flat equirectangular video displayed on a sphere. Think of monoscopic like the face of a world map on a globe, whereas stereoscopic 3D can add another level of immersion by adding depth data between the foreground and background. Stereoscopic means you have two copies: one for the left eye and one for the right.


    The image should NOT look like the above image; if it does, it won’t look correct when played via a VR app in the Google Cardboard headset. The spatial injector code only needs to be applied if it’s to be uploaded to YouTube.
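For reference, the “flat equirectangular video displayed on a sphere” idea comes down to mapping each pixel of the 2:1 frame onto a direction on the viewing sphere. A rough sketch of that mapping (the axis convention here is my own assumption, not taken from any particular player):

```python
import math

def equirect_to_direction(u, v, width, height):
    """Map pixel (u, v) in a 2:1 equirectangular frame to a unit
    direction vector on the viewing sphere (monoscopic 360)."""
    lon = (u / width) * 2.0 * math.pi - math.pi    # longitude/yaw: -pi..pi
    lat = math.pi / 2.0 - (v / height) * math.pi   # latitude/pitch: +pi/2..-pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The centre of the frame looks straight ahead (along +z here);
# the top row collapses to the zenith, which is why stitching
# artefacts are worst at the poles of the sphere.
print(equirect_to_direction(1920, 960, 3840, 1920))
```

It also explains the 2:1 aspect-ratio rule of thumb: 360 degrees of longitude across the width versus 180 degrees of latitude down the height.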

    Drag the video above to see it in 360 (view in the Chrome browser). This is a 360 video taken with my Ricoh Theta S 360 camera, plus a rotating 3D object from Blender, both imported into Final Cut Pro X and exported as an MP4.

    The exported MP4 is then uploaded back to the iPhone and opened in a free VR app. One that’s been working well for me is the HOMiDO player: you simply open the app, hit ‘video player’, click on the folder icon, find your MP4 on the iPhone, then click ‘choose’ and it compresses the video for 360 play. Slot the iPhone into your VR headset and Bob’s your mother’s brother: you are immersed inside the video.

    Note to Self:

    • Strange to see a 3D scan from Blender rotating inside my studio.
    • Need to source some good 360 source material, maybe from some woods/forest?
    • Ricoh Theta S video quality is very soft/low quality, so try to get around that with lighting perhaps?
    • Do some proper stitching/editing experiments so there are no joins/lines and the imagery looks OK across the 360 sphere.