We've come out the other side of the Reframed Project Dev Programme at Lighthouse with a proof-of-concept Monocular, an app, and a piece of content made especially for Kensington Street, Brighton. Here's the video documenting our first user reactions.
We had some development time during Making Ground's Making Space residency at Fabrica. Neil used a Kinect sensor to map where a body was in space, so that walking towards and away from the screen made a film run forwards or backwards. He took a day to write the basic code so we could see how it felt. People's interactions with it were very playful. We realised the length of the film depends on the range of the sensor and the space you have to walk. Real-time or slow-motion film feels better than speeded-up film, so short sequences were the most effective. People examined and amplified small details of movement by repetition.
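Neil's body-to-playback mapping can be sketched roughly like this. This is not his actual code, just a minimal illustration of the idea: the sensor's near/far range is divided across the film's frames, so walking scrubs the film, and the frames-per-metre ratio is why sensor range and walking space set the usable film length.

```python
def distance_to_frame(distance_m, near_m=0.8, far_m=4.0, n_frames=240):
    """Map a body's distance from the screen to a frame index.

    Walking towards the screen scrubs towards frame 0; walking away
    scrubs towards the last frame. Values outside the sensor range
    are clamped. All ranges here are illustrative assumptions.
    """
    d = max(near_m, min(far_m, distance_m))          # clamp to sensor range
    t = (d - near_m) / (far_m - near_m)              # 0.0 at near, 1.0 at far
    return int(t * (n_frames - 1))

# With these numbers the film spends 240 frames over 3.2 m of floor,
# i.e. 75 frames per metre -- a longer film needs a longer walk.
```

One design consequence visible in the numbers: a short sequence spread over the full sensor range gives slow, examinable motion, which matches the observation that real-time or slow-motion material worked better than speeded-up film.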
Robins see the Earth's magnetic field in one eye. I like the idea of turning a smartphone into a near-eye viewfinder using an app and a lens attached to the phone screen. We could place a dynamic robin's-eye view of the magnetosphere into one eye, like a heads-up compass.
We rejigged the brick mould mutoscope after its first outing at the Open Day, replacing the 3D-printed slotted spindles with perfect-bound circles using layers of PVA along the spine, and changing the cogs to a slower ratio.
Exhibiting work-in-progress from Making Ground with Elaine Bolt and Annemarie O'Sullivan, part of Making Lewes Festival.
Tom and Annemarie made this booth to house the flickbooks, mutoscope and looposcope. Neil rigged it and Flora helped paint it.
We had two days experimenting with inventor Simon de Bakker of Commonplace to make a working prototype handheld viewfinder/video player as a moving-image-on-reality layering device. Inside there is a tiny LCD screen, a prism and a lens so that you can hold it close to your eye. The idea is that, if you keep both eyes open, the recorded image will merge with your view of the real environment.
The first problem we encountered was that the scale of the image ended up distorted by the lens, so that even when taken at the same zoom as our eyes see, it didn't match up with the view of the real environment. One solution was to shoot the image from further away from the subject to compensate for the distortion, so we experimented with distances and used a gridded picture in Simon's living room to line the image up with.
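The compensate-by-distance trick can be reasoned about with simple pinhole geometry. This is a back-of-envelope sketch under an assumed simplification (the viewfinder optics scale the apparent angular size of the image by a constant magnification factor), not a model of the actual prototype's lens:

```python
def capture_distance(viewing_distance_m, magnification):
    """Estimate how far from the subject to shoot so the recorded
    image matches the naked-eye view through the viewfinder.

    Pinhole approximation: apparent angular size ~ size / distance,
    and the viewfinder optics multiply that angle by `magnification`.
    Shooting from `magnification` times further away cancels it out.
    Both parameters are illustrative assumptions.
    """
    return viewing_distance_m * magnification

# e.g. if the lens makes things look 1.5x bigger, a subject the viewer
# stands 2 m from should be filmed from about 3 m away.
```

In practice the team calibrated empirically against a grid, which also absorbs effects this first-order model ignores (barrel distortion, off-axis warping).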
It worked. By the end of the second day we had a working prototype and knew how to make image sequences to show on it. It also has a scrolling mechanism, like Commonplace's Bioscope.
We discovered that our brains are good at deciding which eye to relay information from, and at the moment the recorded image appears very faintly. We might benefit from a larger, brighter LCD screen that takes over a higher proportion of our field of vision. At the moment the LCD image is still too ghostly to relay clear point-of-view navigational information. We need to talk to someone who knows about optics and how our brain interprets the visual input from our eyes.
We went to work with Commonplace on the hollow hand/viewfinder idea. We stayed at Metaal Kathedraal, a fantastic fantasy art space in a converted church. It has a backyard like an overgrown fairground, that gives onto fields where, on my first morning, I came across hares, greylag geese and a great white heron.
Here's Blast Theory's video of us talking about what happened. I've updated a stage by stage blog below.
We Skyped with Simon de Bakker from Commonplace about how to make the camera viewfinder idea into a bespoke device. Coming across their piece Bioscope, a device that allows you to hand-crank digital moving-image, encouraged me to look again at the possibility of using digital moving-image rather than paper films. I also like the way their piece Lumiere, a chandelier of light bulbs containing videos of weather systems, makes the moving-image more tangible and in-the-world.
We met a dog walker who told us that fighter pilots have visual input in one eye and one eye free.
A constant conversation between Neil and me is how the development process works. I'm all about the User Experience (the UX in developer speak) and Neil likes to think about how we can get there in achievable stages, balancing requirements against resources.
Neil’s plan for the heartbeat pointer:
Start with the core idea:
1) the heartbox takes you to a single destination while the hunter’s headphones enhance the ambient sounds (no playback of recorded sound)
2) add multiple waypoints if a specific route is important to the story
3) add a sound playback device that can be triggered by arriving at waypoints and routed through headphones. This will give a full enough experience to apply to a site and test.
4) Once the tech is stable, experiment with different qualities of haptic feedback as well as the question of having direction relate to different points of the body (like the feelspace belt).
5) Try picking up the sound of the solenoid heartbeat to add to the audio
6) Try using a WaveShield to add other sounds as a payoff at the destination
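Stages 1 to 3 of the plan above could be sketched like this. Everything here is illustrative (the function names, the arrival radius, the beat-rate range), not Neil's actual code; it just shows the core loop of measuring distance to the next waypoint, beating the heartbox faster as you close in, and triggering a sound payoff on arrival.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def heartbeat_interval_s(distance_m, near_m=10.0, far_m=500.0):
    """Beat faster as the walker nears the waypoint: roughly one beat
    every 2 s at far_m and beyond, tightening to 0.5 s inside near_m.
    The thresholds are assumptions, not tested values."""
    d = max(near_m, min(far_m, distance_m))
    t = (d - near_m) / (far_m - near_m)
    return 0.5 + 1.5 * t

def at_waypoint(distance_m, radius_m=10.0):
    """Stage 3: trigger the sound playback on arrival."""
    return distance_m <= radius_m
```

Multiple waypoints (stage 2) would just be a list of coordinates consumed in order, advancing whenever `at_waypoint` fires.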