I have revisited an experiment from a few weeks back: creating a visual graph in Processing, generated by triggering a sensor on an Arduino. The graph didn’t look aesthetically pleasing, so I tweaked the Processing code to change the visualisation of the data.
The visuals still don’t look the way I want them to, so I have introduced some new code into the Processing sketch.
This changes the look of the data generated by the Arduino sensor and comes closer to the look I wanted to achieve.
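The kind of tweak I mean can be sketched out in code. The Processing sketch itself is Java-based, but the underlying idea — smooth the jittery raw sensor values, then map them onto a visual parameter — looks roughly like this in Python (the function names and ranges are my own, not the actual sketch):

```python
def smooth(readings, alpha=0.2):
    """Exponential moving average: tames jittery raw sensor values
    so the graph stops looking spiky."""
    out = []
    level = readings[0]
    for r in readings:
        level = alpha * r + (1 - alpha) * level
        out.append(level)
    return out

def to_colour(value, in_min=0, in_max=1023):
    """Map a 10-bit analog reading (0-1023, as Arduino's analogRead
    gives) onto a 0-255 colour channel, like Processing's map()."""
    value = max(in_min, min(in_max, value))
    return round((value - in_min) * 255 / (in_max - in_min))
```

Smoothing first and mapping second is what changes the *look* without changing the data itself.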
Note to Self:
I have always been fascinated by the 16th-century tale of Dr. John Dee and the alchemist Edward Kelly (not to be confused with the Camberwell tutor of the same name!) and their seven-year obsession with ‘conversing with Angels’ in an ancient Enochian language that led them both through dark portals and ultimately to their own downfall. Channel 4 featured a programme on it in 2002, and in a recent online lecture on this course the Dean of Camberwell touched on it, mentioning a failed project and seeing John Dee’s obsidian scrying mirror in the British Museum.
Obsidian is a naturally occurring black glass that has been used throughout the ages by many cultures. John Dee used it in his translucent ‘scrying mirror’, in conjunction with his ‘crystal gazers’, to attempt to foretell the future. Many black surfaces can be used for scrying, even dark water.
Charlie Brooker created the TV series ‘Black Mirror’, which is set in the very near future. Through speculative fiction it examines contemporary society and the unanticipated consequences of new technologies. The term ‘black mirror’ refers to the shiny black screens of our tablets and mobile devices.
I see a parallel between Dee’s 16th-century scrying interfaces and present-day virtual reality (VR). Imagine Dee, the ‘intelligencer’ or seeker of knowledge, with his dubious sidekick Kelly, donning Oculus Rift headsets and immersing themselves in the ‘magic of presence’. Conceptually, both are parallel worlds to ours: one created visually within the mind, the other digitally created and immersive.
Maybe I am reaching here… VR as some means of digital prophecy? The headset supplanting the crystal gazer of old? But both Dee’s journeys into other realms and the Rift’s advanced display technology, combined with its precise, low-latency Constellation tracking system, enable the sensation of presence – the feeling that you are actually there. Both the scrying mirror and the Rift headset have, as objects/divine interfaces, an otherworldly, ritualistic quality to them.
As Oculus say on their own website: “The magic of presence changes everything”.
I have revisited this Blynk gateway experiment from October.
Blynk is an app with a digital dashboard where you can build a graphic interface to virtually control the Arduino microcontroller. I have set up a simple Arduino sketch: an LED triggered by a button. After setting up a virtual button in Blynk, you are sent an authorisation token by email, which you paste into the Arduino sketch; this then enables you to trigger the LED simply by pressing the virtual button.
The problem was getting the app to talk physically to the Arduino board: I had to get either a WiFi or an Ethernet shield to act as an online gateway. WiFi would have been better, but the European WiFi boards were sold out and the Ethernet boards were cheaper, so I opted for Ethernet instead.
As my studio is on a shared network in the building, routing it meant generating special IP addresses and a lot of fiddling about with Ethernet/CAT cables, but after reinstalling the Arduino Blynk libraries I finally got it to work. The virtual button triggers the LED on the Arduino board via the web/app.
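The logic running on the board once the gateway works is very simple: Blynk writes a value to a virtual pin when the on-screen button is pressed, and the sketch mirrors that onto the physical LED pin. The real thing is an Arduino sketch using the Blynk library with the emailed token pasted in; here is just the shape of that logic, sketched in Python with hypothetical names:

```python
class VirtualLED:
    """Stand-in for the LED on the Arduino. In the real sketch, the
    Blynk app writes "1"/"0" to a virtual pin when the on-screen
    button is pressed/released, and the board sets the LED pin."""

    def __init__(self):
        self.lit = False

    def on_virtual_write(self, value):
        # Blynk's virtual button sends "1" on press, "0" on release
        # (push mode); the LED simply mirrors that state.
        self.lit = (value == "1")
        return self.lit
```

So the entire installation boils down to one state mirrored across the network — everything else (shield, IP addresses, token) is plumbing.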
Note to Self:
- This is handy if I want to further develop some proper mobile locative-media installations.
- Get a WiFi shield and set it up to use that instead.
- I had rigged up the Arduino board to talk to the Processing app in another experiment, which worked but seemed to mess up all the Arduino ports and libraries.
Media theorist Lev Manovich said:
“19th Century culture was defined by the novel,
20th Century culture by Cinema,
The culture of the 21st century will be defined by the interface.”
His media visualisation techniques compress massive amounts of data into ‘smaller observable media landscapes’. Rather than searching through metadata, we’re then able to find relevant information in a way that’s more compatible with how humans process information. This is particularly valuable in giving us the ability to observe where patterns of structure and colour may exist.
For me, a lot of digital works in general are conceptually sound, but the outcome arrived at is not always aesthetically pleasing.
Artist Aaron Koblin expands on this: he takes vast amounts of data — and at times vast numbers of people — and weaves them into interesting visualizations. From elegant lines tracing airline flights to landscapes of cell phone data, his works explore how modern technology can make us more human.
We can set parameters and choose what to pull out of the data glut. Below is a visualisation of data from the U.S. Federal Aviation Administration, processed to create animations of flight traffic patterns and density. The outcome is colour-coded by type, altitude, and make and model.
Note to Self:
- I will continue my reading/research into the representation of gathered information, and how best to present it once captured.
#1 Soil Touch Sensitive/Earth as Interface.
The experiment was sparked by a musical effects unit I saw a while back, but it ties in with my current exploration of what exactly constitutes an interface, and of taking various inanimate objects or physical elements across the threshold into the digital realm and onto the screen.
This started out as a proximity sensor generating code from the soil, but as it emerged I thought it would be nicer to make people ‘get their hands dirty’ and actually touch the soil to generate a series of numbers (single and double digit), which can be seen in the serial monitor. These fluctuate depending on how hard the soil is pressed.
The soil is always visible through the perspex; a sensor is hidden just below the top layer of soil. I have added an alchemical Earth symbol atop a piece of wood, to give it a ritual aspect and maybe remind myself that it is ‘Earth as interface’. The Arduino jumper wire is also green, the earth-wiring colour. The numbers can now be manipulated into sound, colour, images, infographics etc., which is the next step.
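As a sketch of that next step — turning the fluctuating serial numbers into sound — the mapping could look something like this. Everything here (the input range of the soil readings, the base note, the octave span) is my own assumption, not something already built:

```python
def soil_to_pitch(reading, low=0, high=100, base=220.0, span=2.0):
    """Map a soil-pressure reading (the single/double-digit numbers
    appearing in the serial monitor) onto a frequency in Hz:
    harder presses give higher pitches. The 0-100 input range and
    the 220 Hz base note are arbitrary choices of mine."""
    reading = max(low, min(high, reading))          # clamp stray values
    fraction = (reading - low) / (high - low)       # 0.0 .. 1.0
    return base * (span ** fraction)                # up to one octave above base
```

Using an exponential rather than linear mapping keeps the result musical: equal changes in pressure move the pitch by equal musical intervals.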
The underlying concept and process are important, but the whole set-up must also look aesthetically pleasing to me, like in a Japanese fruit market where the price is based not on weight or size but on how beautiful the object is.
Note to Self:
- Question: Do I want to bring the virtual object on the screen into the physical world, or the object from the physical world into the virtual, or both?
- Question: Is the information generated, and how it is displayed, the ‘final outcome’, or is it just part of the process, with the finished piece still to be arrived at?
An experiment with sound: a slo-mo musical scale through a paper speaker in leaves. I was getting restless dwelling so much on context, process and methodology, and simply wanted to create some ‘work’, however small – a mini piece – with a view to then developing a larger project based on it called ‘The Old Ohm Tree’, which would feature maybe five speakers and a lot more wooden boughs!
The slo-mo musical scale gives it an eerie autumnal feel. ‘Oíche Shamhna’ (Gaelic for Halloween night) will soon be upon us; the leaves are falling off the trees.
I want to harvest this information and represent it online. I am waiting to receive an Arduino shield that will help connect me via the Blynk app.
Note to Self: Old Ohm Tree.
- Boughs: Source five more silver birch wooden pieces and cut to fit.
- 3D rendered base: Install Cinema 4D? Sketch up a 3D object for the white base.
- 3D printing: Does this need to be modular (not all one piece) for a 3D printer?
“Eunoia” is a performance that uses conceptual artist Lisa Park’s brainwaves (collected via a NeuroSky EEG sensor) to manipulate the motion of water. The title derives from the Greek “eu” (well) + “nous” (mind), meaning “beautiful thinking”. Electroencephalography (EEG) is a brainwave-detecting technique: the sensor measures frequencies of her brain activity (alpha, beta, delta, gamma, theta) relating to her state of consciousness while she wears it. The data collected from the EEG is translated in real time to modulate vibrations of sound using software: the headset sends her brain activity to Processing, which is linked with Max/MSP to receive the data and generate sound through Reaktor.
Park used the EEG headset to monitor the delta, theta, alpha, and beta waves of her brain as well as eye movements and transformed the resulting data with specialized software into sound waves. Five speakers are placed under shallow dishes of water which then vibrate in various patterns in accordance with her brain activity.
While the system is not an exact science, Park rehearsed for nearly a month by thinking about specific people whom she had strong emotional reactions to. The artist then correlated each of the five speakers with certain emotions: sadness, anger, hatred, desire, and happiness.
Composer and experimental musician Alvin Lucier gave a somewhat similar performance, ‘Music for Solo Performer’, back in 1965.