Monthly Archives: April 2017

MA show proposal

I am getting closer to understanding what I can do for my MA show. It still needs to be checked for feasibility but here’s the plan so far.

There will be a physical exhibit with video, sculptures, and sound through headphones. Additionally I propose a few separate performances in a different space (probably the Lecture Theatre) using the HoloLens and a single smaller sculpture.

The reason for separating the HoloLens experience from the main exhibit is that it necessarily has to be a performance: the headset cannot be left unattended, and participants need to be instructed and supervised during its use. It also needs entirely different lighting conditions.

The main exhibit will be two spotlit life-size sculptures on plinths looking at each other on either side of a transparent projection of the sea and screeching seagulls. Theresa will be on a taller white plinth, looking down at the refugee on a smaller transparent plinth. The actors’ narratives will be heard through headphones. The projector will be hung from the ceiling, as will the transparent projection screen. The environment needs to be fairly dark, so the ideal location will be in the basement.

Transparent rear projection

I already have an A0-size acrylic sheet and will be coating this with self-adhesive rear-projection film. I need to experiment with this in the proposed exhibition space with various projectors, including my own, to see what works best, and to determine screen and projector placement.
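As a starting point for those placement tests, here is a rough throw-distance check for filling the A0 sheet (1189 × 841 mm in landscape). The throw ratios are placeholder assumptions until I can look up the spec sheets of the actual projectors; the sketch just shows how the numbers relate.

```python
# Rough throw-distance check for rear-projecting onto an A0 acrylic sheet.
# The throw ratios below are placeholder guesses; I will substitute the real
# figures from the spec sheets of whichever projectors I can borrow.

A0_WIDTH_M = 1.189   # A0 landscape width in metres
A0_HEIGHT_M = 0.841  # A0 landscape height in metres

# Hypothetical throw ratios (throw distance divided by image width)
candidate_projectors = {
    "my own projector (assumed ~1.5:1)": 1.5,
    "a standard projector (assumed ~2.0:1)": 2.0,
    "a short-throw projector (assumed ~0.5:1)": 0.5,
}

for name, throw_ratio in candidate_projectors.items():
    distance_m = throw_ratio * A0_WIDTH_M
    print(f"{name}: needs roughly {distance_m:.2f} m behind the screen "
          f"to fill the full {A0_WIDTH_M} m width")
```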

Projector Layout

Leo and Theresa heads

The performance will use the HoloLens to show videos of the actors against a transparent background, with the same audio narrative as the main exhibit, overlaid with the sound of the sea and seagulls. Holograms of the refugee’s head will be scattered around the space, representing heads in the sea. A smaller physical sculpture of Theresa, the optician’s wife, will be on a white plinth.


To accomplish what I have described, the film of the two actors together must be a mid-shot against a black screen, with mics clipped to their clothes, in a recording studio. Hopefully this can be done at either the film studio at CSM or Wimbledon; otherwise it will be offsite in a private studio in Camberwell (not UAL). The video, sound recordings and holograms will be imported into and organised in Unity, and then deployed to the HoloLens via Microsoft Visual Studio. Video will be 2D, and sound will be spatial.

I need to measure the rooms/exhibition spaces and record where the exhibit items are positioned. This is necessary to enable the spatial sound to be correctly positioned in Unity. As there will be two exhibition spaces, there will need to be two set-ups within the Unity project.
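I am not writing the Unity side yet, but to keep the measurements organised before they go in, something along these lines should do: every position recorded in metres from a fixed corner of each room, then expressed relative to where the listener starts, which is roughly how the audio sources will sit around the camera in Unity. This is only a bookkeeping sketch in Python; the room names, source names and every number are placeholders until I have measured the real spaces.

```python
# Bookkeeping sketch for the spatial-sound positions I need to measure.
# All positions are in metres from the front-left corner of each room,
# as (x = across the room, y = height, z = into the room).
# The rooms and numbers below are placeholders until I measure the real spaces.

rooms = {
    "main exhibit (basement)": {
        "listener_start": (2.0, 1.6, 1.0),        # where a visitor stands with headphones
        "sources": {
            "sea_and_seagulls": (2.0, 2.5, 6.0),  # behind the projection screen
            "theresa_narration": (1.0, 1.7, 4.0), # at Theresa's sculpture
            "refugee_narration": (3.0, 1.2, 4.0), # at the refugee's sculpture
        },
    },
    "lecture theatre (performance)": {
        "listener_start": (3.0, 1.6, 2.0),
        "sources": {
            "theresa_hologram": (2.0, 1.7, 5.0),
            "refugee_heads": (4.0, 0.3, 6.0),
        },
    },
}

def relative_to_listener(room):
    """Express each sound source relative to the listener's starting point,
    which is roughly how I will place the sources around the camera in Unity."""
    lx, ly, lz = room["listener_start"]
    return {name: (x - lx, y - ly, z - lz)
            for name, (x, y, z) in room["sources"].items()}

for name, room in rooms.items():
    print(name)
    for source, offset in relative_to_listener(room).items():
        print(f"  {source}: offset from listener {offset} m")
```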

The headphones used in the main exhibit store the audio in the headset itself, so the recordings can be heard wire-free, without Wi-Fi or Bluetooth, and each set of headphones can operate independently.

 

MA show overview – narrowing down options

I am getting closer to understanding how best to present my exhibit based on ‘The Optician of Lampedusa’:

Main Exhibition piece – ground floor room Wilson Road with blinds down

  • Stock film and audio to provide a background of seagulls diving and screeching in open sea, possibly calming down to no seagulls, and a short clip of refugees in the water. This is in effect three videos and one audio clip edited together, looping continuously (see the editing sketch after this list). I have yet to decide how this film will be projected, but hopefully using the existing ceiling projector, or mounting my own projector on its fitting. The audio will come from the whole of the wall on which the film is projected, using my Feonic invisible-speaker device attached to the wall and wired to my digital media player and amplifier.

    Alternatively, I can send the audio over Bluetooth from my projector to my soundbar. I have to consider how these would be hidden from view, and how I can easily adjust the sound level. If I use my projector, I have a remote, so in that case the problem is solved. I also have to see how the projection maps to the wall. I would like to project to where the wall meets the floor, representing the sea, and possibly partially overlay onto the floor itself. If the latter, I could cover the floor with the grey lino I lent Leonie for her installation last year. I may need to paint the wall light grey to match.

  • Two life-sized sculptures on transparent acrylic plinths: one of Theresa, the optician’s wife, and the other of Leo as a refugee. Theresa would look down towards the refugee, representing her in the boat and him in the sea. I need to consider lighting, possibly diffused spots from above, or from below inside the plinths. If possible I will 3D print the head section of Leo’s bust of the refugee in PLA (again – it was too small last time, in spite of taking 7 days to print!) with runners and risers attached, so that this section can be remade in unfinished bronze and attached to the rest of his sculpture in white plaster.
  • The actors narrating their experience. Whilst the sound of the sea can be heard in the exhibition room (hopefully, if others sharing the room do not mind), I will provide two unattached headphones for the audience to listen to this approximately 5-minute track. My FiiO Bravo headphones store the tracks in the headset itself and thus do not need to be attached or linked by Bluetooth or Wi-Fi to an audio player. They operate independently, and will need to be recharged every night.
  • There will be a ‘performance’ for a couple of hours each day using the HoloLens. The actors’ performance will have previously been video recorded with spatial sound against a green screen using my Vuze 3D 360-degree camera, and edited in 2D (sadly, the HoloLens uses a different holographic technology so cannot show stereoscopic 3D). A viewer using the HoloLens will see the background video and sculptures in the real-world view of the room, and the two green-screened actors performing in the headset, while listening to the audio through an attached pair of B&W P7 headphones. The audio in this case will be 3D sound precisely located where the actors performed, with seagulls flying over their heads and the sea lapping around them. A TV monitor will show the rest of the audience what can be seen in the HoloLens.
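For the background loop mentioned in the first item above, the edit could be roughed out with the moviepy library before handing the file to whichever projector or media player ends up looping it. This is only a sketch: the filenames are placeholders for the stock clips and the sea/seagull audio, and the final edit may well be done in a proper editing package instead.

```python
from moviepy.editor import VideoFileClip, AudioFileClip, concatenate_videoclips

# Placeholder filenames for the three stock clips and the seagull/sea audio
clips = [
    VideoFileClip("seagulls_diving.mp4"),
    VideoFileClip("calm_sea.mp4"),
    VideoFileClip("refugees_in_water.mp4"),
]
sea_audio = AudioFileClip("sea_and_seagulls.wav")

# Join the three clips end to end; this single file is what the player will loop
background = concatenate_videoclips(clips, method="compose")

# Fit the audio to the joined video (trim it if it is longer than the picture)
if sea_audio.duration > background.duration:
    sea_audio = sea_audio.subclip(0, background.duration)
background = background.set_audio(sea_audio)

background.write_videofile("background_loop.mp4", fps=25)
```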

Performance in the Lecture Theatre

If possible there will be additional performances whenever the Lecture Theatre in Wilson Road is free, for about 2 hours a day. These will demonstrate the use of the HoloLens for digital art ‘conservation’, which was the theme of my MA dissertation. It will be the same performance as shown in my earlier blog post about my wife’s MA show, using the same sculptures as the main exhibition piece, but smaller and alongside their holographic versions.

I hope also to provide the audience with the opportunity of viewing my main exhibition entirely in Virtual Reality using the HTC Vive. When filming the actors with the Vuze camera I will also undertake a second edit in stereoscopic 3D for viewing in Virtual Reality. I aim to include the 3D images of the sculptures and the background video in this Virtual Reality experience as well. We will see if I have enough time to do this.

Well that’s it for now. Must get back to completing the life size bust of Theresa, which should be finished today. Hoorah.

Happy Easter – a welcome couple of days off with my family.

 

Using my HoloLens and sculpture at another MA Show

This time it’s my wife Suzy’s MA show, entitled ‘Curating the Future: Preserving the Past’, at the Museum of Futures, London. I was the contributing artist, and the work was curated by Suzy to demonstrate what can be done when museum exhibits are away on loan, or have been moved to the museum’s archive or removed for conservation: the object can continue to be seen holographically.

Here is a picture of Suzy alongside her holographic image entitled ‘The Scream 2030’.

20170410_073439_HoloLens (2)

Here is a slideshow I created on Facebook, showing progress making the sculpture: both the 20 cm tall version and how it was used in Suzy’s MA show, and the life-size version which I will use in mine.

IMG_7290

Click the ‘Watch on Facebook’ link in the image above to view the slideshow.

Experimenting with my HoloLens for my MA Show Exhibit

Two video screens: one a still image demonstrating spatial sound (the sea), the other filmed against a green screen with the background converted to black (which appears transparent in the HoloLens). I plan to have two actors narrating the piece, both against a transparent background and with spatial sound. The real-world mixed-reality view of my study can be seen as well as the video.

Screen Shot 2017-04-14 at 16.26.13.png

The video clip can be viewed on my Facebook page if you click the ‘Watch on Facebook’ link in the image below.

Progress – Making my MA Project background video

I have joined Videoblocks and downloaded some of their stock clips. Here are two of them alongside the one I originally downloaded from YouTube which I would need to edit.

This is the original, downloaded from YouTube. The sound would need to be replaced because the music is inappropriate for my exhibit. It is much longer than most stock film clips (2 min 31 sec). I could not make something similar myself with the Vuze camera, as the waterproof case is not available until Q4, and I am disinclined to buy another for this.

The speedboat is either a benefit or a problem. If used, the boat would need to represent the optician and his crew dashing to the scene. There is no sound. It is also very short (15 secs), so I would need to check that there is no glitching when it runs continuously.

This one is similar to the above, but less frenetic and without a boat. It also has no sound, and is also very short (11 secs).
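Because both clips are so short, I have sketched a quick way of judging how jarring the loop point might be before buying anything: compare the first and last frames and see how different they are. A small average difference suggests the repeat will hardly be noticed; a large one means I would need a crossfade or a different clip. The filename is a placeholder and this again assumes the moviepy library.

```python
import numpy as np
from moviepy.editor import VideoFileClip

# Compare the last frame of a short stock clip with its first frame to get a
# feel for how visible the jump will be when the clip loops continuously.
# "sea_clip.mp4" is a placeholder for whichever stock clip I am testing.
clip = VideoFileClip("sea_clip.mp4")

first = clip.get_frame(0).astype(float)
last = clip.get_frame(max(clip.duration - 1.0 / clip.fps, 0)).astype(float)

mean_diff = np.abs(first - last).mean()  # 0 = identical frames, 255 = maximum difference
print(f"Mean per-pixel difference across the loop point: {mean_diff:.1f} / 255")
```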

So I need to look at stock sound clips too. Here is one source.

Seagulls sound clip from AudioSparx

https://www.audiosparx.com/sa/module/searchOpt/srchpost2.cfm/uuid.0D788602-9AC5-5148-B2BCDEB4DB49D747

I have searched for a spatial sound recording of seagulls but cannot find any. I will need to check how to do this, and possibly it is something I could consider doing myself. I will see.

 

Preparing to film my actors in Green Screen

I have met with my film maker/scriptwriter friend and talked through what I have in mind. He is now on holiday and considering what to suggest in the way of a treatment, script and possibilities for presentation.

I have spent a lot of time thinking about alternatives to including holographic recordings of the actors. You may recall from earlier blogs that this is impossible with the HoloLens at present: the capability is currently beyond developers and still in Microsoft’s research labs, where the hardware and technical requirements used to date are being scaled down. So the alternatives are:

  • Use avatars with the faces of the actors superimposed. I have investigated this and it is possible for me to do (after learning the process), but I think that this will detract from the empathy I am trying to create in the exhibit.
  • Film the actors stereoscopically with my Vuze 3D 360-degree camera, which is being delivered next week. However, the HoloLens does not play stereoscopic film. I tried with Jonathan’s 360-degree film, which he exhibited recently, but it only showed two images rather than one merged stereoscopically. If I wanted to do this I would have to use another AR/MR device. I researched all the other AR/MR (not VR) headsets, but none meet my basic requirements, which are listed below. Only the ODG R-9 smart glasses, with a 50-degree field of view (FOV), met the criteria, but these are not yet released, and only become available to developers in late Q2/Q3, too late to consider for inclusion in the MA Show.
  1. Stereoscopic Video Play
  2. At least a 30-degree FOV, which is that of the HoloLens. This is regarded as too small; for comparison, VR headsets have a much wider FOV (the HTC Vive, for example, is 110 degrees).
  3. Lightweight by comparison with the HoloLens.
  4. Can be worn with glasses
  5. Untethered, except perhaps for a smartphone (preferably iPhone which I have) or equivalent.
  • Film the actors conventionally in 2D, which is what I have decided to do.

So I will be in Alasdair’s studio at the bottom of his garden for two days next week. Mady will be joining us and hopefully Vic too (as currently she is the only one that can edit the film using Isadora software). We may have to use After Effects, which Matt (Camberwell Digital Media) can help us with. As well as filming for our Final MA Show we will also be filming for our Symposium 2 on 1 June. We set up the equipment in Alasdair’s studio the week before last, so we are set to go.

Alasdair studio

I have also been researching alternatives for presenting the actors in my MA Show. There are three approaches:

  • Composite the actors (after editing the green-screen recordings) into the seagulls background video (see the chroma-key sketch after this list). I wonder whether I could use the ceiling projector in the MA Show studio space on the ground floor, or whether one of our course’s short-throw projectors could be hung from that fitting?
  • Project the actors against another surface on the set (in front, at the side, on the ceiling, or floor) to be seen alongside the seagulls background video. This would require a second (or third) video projector. I would use my Artograph projector for this.
  • View the edited film in the HoloLens. See an example below, downloaded from Videoblocks. I first tried a clip with a transparent background, which did not work as the transparency showed as white, but the one below with a black background worked well in the HoloLens and appeared transparent. (This makes sense once you know that the HoloLens display is additive: black pixels emit no light and so read as see-through, whereas the transparent clip’s alpha channel was presumably being rendered as white by the player.) So this is a possibility, although I worry about viewers wearing the heavier-than-I-would-like HoloLens for 5 minutes to view the film, and I think that a 30-degree FOV is not as immersive as I would like.
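For the first option, compositing the green-screen footage into the seagulls background, this is the kind of rough chroma-key test I have in mind before committing to After Effects or Isadora. It works on a single exported frame using OpenCV; the filenames and the green thresholds are placeholders that would need tuning against the real studio footage.

```python
import cv2
import numpy as np

# Very rough chroma-key test on a single frame, before committing to
# After Effects or Isadora. File names and the green thresholds are
# placeholders that would need tuning against the real studio footage.
actor = cv2.imread("actor_greenscreen_frame.png")        # frame from the green-screen shoot
background = cv2.imread("seagulls_background_frame.png") # frame from the seagulls video
background = cv2.resize(background, (actor.shape[1], actor.shape[0]))

# Mask out the green backdrop in HSV space
hsv = cv2.cvtColor(actor, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))  # assumed green range
actor_mask = cv2.bitwise_not(green_mask)

# Keep the actor where the mask is set, and the seagull background elsewhere
foreground = cv2.bitwise_and(actor, actor, mask=actor_mask)
backdrop = cv2.bitwise_and(background, background, mask=green_mask)
composite = cv2.add(foreground, backdrop)

cv2.imwrite("composite_test.png", composite)
```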

Another consideration is SOUND.

  • HoloLens allows the use of spatial sound to improve this experience.
  • Another alternative is to use Feonic devices to make the sound come out of the surfaces on which the video is showing. I have one of these devices so will try it out sometime soon.
  • Alternatively, I could use my LG soundbar system, which works over Bluetooth from my own Artograph projector. The benefit of this is that I have used this set-up before and it works well, without wires between the projector and the soundbar.
  • I have to think about how sound will work in the MA Show studio space in terms of practicality, sound quality, and aesthetics.

So more decisions to make.

 

Progress – Making my MA Project Sculptures – Life Size

I have assembled the head of Theresa, the optician’s wife; just the shoulders to go. I collect the remaining parts from CSM on Monday. You can see the size difference below. The smallest is a three-hour 3D print using my Ultimaker 2+ Extended printer. The medium version was printed in one piece at CSM and is 19 cm tall. The largest version is the one I am assembling now. It was meant to be life size, but it is in fact slightly larger than that, at 40 cm tall when fully assembled from 10 parts. Each batch of parts takes about 15 hours to print. It took several batches, but I am unsure exactly how many.

Theresa sculpture life size

The assembly process is not as straightforward as it might seem, due to shrinkage and warping during the printing process. Whilst this is minimal for any two parts, the problems arise when joining several together. I had to flex some parts to make them fit, fill gaps and significant height differences, and then sand. Due to differences in plaster colour, some arising from batch changes during 3D printing and others from filling with a whiter plaster, I will probably also need to paint the finished sculpture.

After a 15-hour false start and another 159 hours of 3D printing using my Ultimaker at home, I have now printed the main face of the refugee (Leo), shown as a 3D image in Cura in my earlier blog. This is one of 9 parts, and the largest.

However, I must have done something wrong, as the scale is not as intended. It is meant to be life size, but it is not: you can see that Leo’s head is only about the same size as the medium-size version of Theresa. It also has some imperfections. These could have been avoided if I had asked for a support structure in Cura, the 3D printing software for my Ultimaker, but that would have taken 17 days of continuous printing at high quality, even at this size. So I tried to get by without the support structure. I had to intervene a couple of times to provide a makeshift version of my own when printing reached overhangs such as the eye sockets and ears, which had filament hanging in the air until I did something about it. Even then, these areas are very thin or a bit of a shredded mess.

So with Leo I have sadly decided that I cannot make a life-sized bust in bronze. The best that I can do is 3D print Leo large, in parts, at CSM, and then reprint his face with my Ultimaker to match that size as closely as I can. The latter could then be used to make an unpolished bronze face to attach to the rest of the bust in white plaster. This approach presents its own challenges.

The first challenge is to take the face I have already printed to the foundry workshop and ask Becky where I should put the runners and risers, and at what angle. The second challenge is to learn a 3D software feature to add the runners and risers to the 3D file of the face. The third challenge is to print the face the right size this time. It will probably take at least 50% more time, namely about 10 or 11 days of continuous 3D printing using my Ultimaker, or a month if I add a support structure, which I will probably need to do with runners and risers hanging in mid-air. The fourth challenge is to clean up the 3D printed face. I have ordered some sculptable wax-like specialist material and tools to do this, as the PLA filament used for the print is too inflexible.

The fifth challenge is to finish the cleaned-up 3D printed face by adding extensions to the runners and risers (as the complete lengths will be outside the Ultimaker printer’s build volume), and then encasing the whole thing ready for the foundry’s lost-wax process. When the bronze is poured, the original PLA face will no longer exist (hence ‘lost wax’). I will then have to break the bronze face out of the mould and clean it up, removing the runners and risers with a power saw and grinder. The sixth challenge is to see whether the bronze face fits with the rest of the plaster sculpture, and if it does, to bond the bronze and plaster together. All this effort involves taking risks, and a lot of time! My back-up plan would be to 3D print the face in a bronze-type PLA and not take it through the foundry process. If this did not work, I would use the face printed in plaster at CSM. I will need to make a decision about which route to take; I will seek advice on the aesthetic result from Jonathan, and then consider whether I have enough time to proceed with either the bronze or pseudo-bronze face option.
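As a sanity check on the timing in the third challenge, here is the arithmetic behind the ‘10 or 11 days’ and ‘about a month’ figures. The 50% scale-up factor is my own assumption about how much longer the correctly sized face will take; the 159 hours and the 17-day Cura quote are from the prints described above.

```python
# Rough check of the reprint time estimates for Leo's face on the Ultimaker.
HOURS_PER_DAY = 24

previous_print_hours = 159   # the undersized face already printed, without supports
scale_up_factor = 1.5        # assumed ~50% more printing time at the correct size

reprint_hours = previous_print_hours * scale_up_factor
print(f"Reprint without supports: ~{reprint_hours:.0f} h, "
      f"about {reprint_hours / HOURS_PER_DAY:.1f} days of continuous printing")

# Cura quoted roughly 17 days at high quality with a full support structure at
# the undersized scale; scaling by the same assumed factor gives the 'month'.
with_support_days = 17 * scale_up_factor
print(f"Reprint with supports: very roughly {with_support_days:.0f} days")
```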

I collect a 20 cm tall head of the refugee printed in one piece, and the parts for the life-size bust, from CSM Digital Fabrication on Monday 28th April. I will keep you posted.

 

 

MA Show – Alternatives to Mixed Reality Video

In an earlier blog I included a video of how Microsoft made holographic videos for the HoloLens. I mistakenly said that they used 106 synchronised cameras; it was actually 160 cameras, as anyone viewing that video may have spotted. However, over the past year the Microsoft Holoportation team have improved on this substantially, as demonstrated in the following two videos. Stand by to be amazed!!!

Terry, “Thank you for your interest in Holoportation. At this time it is a research project and not available for use outside Microsoft, but perhaps that will change in the future. Stay tuned to https://www.microsoft.com/en-us/research/project/holoportation-3/ for any updates. Best regards, Ben Cutler, The Holoportation Team”

Despite these dramatic improvements, holographic video with the HoloLens is sadly still beyond my capabilities to include in my MA project. This is solely because the HoloLens uses point-cloud technology (as in CGI), and not 3D stereoscopic video. So I sought alternatives (I don’t give up easily).

I looked at other mixed-reality glasses, available now or coming to market soon, that support viewing 3D stereoscopic video (as opposed to just 2D 360-degree film). The best of these is from ODG, but unfortunately it is not available until late 2017, so after our MA show.


So where do I go now? Well, I am attending the following AR/VR/MR shows to see if there is anything else around:

Virtual Reality Show – 20-22 April, Business Design Centre, London

Augmenting Reality – 26 April, UCL London

VR World – 16 May, Olympia, London

TechXLR8 – 13-16 June, Excel, London

In the meantime, I will have to put my Vuze 3D 360 degree camera (delivery expected 11 April) to one side, as I do not have a mixed reality means of showing these videos for my MA show project. I could show them on my Google Cardboard or HTC Vive but both are Virtual Reality headsets and will not allow the other two layers (see earlier blogs) of my exhibit to be seen at the same time. Shame.

So I am practising green screen filming and editing with the intention of either adding actors to my background (seagulls diving in the sea) or projecting them separately against a black or grey background.

MA Show possible layout

If the latter, I am investigating the possibility of projecting the seagull video onto netting hanging from the ceiling, but that depends on the room I will be using at the MA show in Wilson Road.

Ceiling projection

That’s it for now.