Author Archives: terencemquinn91

About terencemquinn91

Artist working with Mixed Reality integrated into physical art installations. MA Fine Art Digital (Distinction), and Visiting Practitioner at UAL.

Experimenting with my HoloLens for my MA Show Exhibit

Two video screens: one a still image demonstrating spatial sound (the sea), the other filmed against a green screen with the background converted to black (which renders as transparent in the HoloLens). I plan to have two actors narrating the piece, both against a transparent background and with spatial sound. The real-world mixed reality view of my study can be seen as well as the video.
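For those interested in the mechanics: spatial sound is applied at playback by the engine rather than baked into the recording. Below is a minimal Unity sketch of the idea, assuming a Unity HoloLens project with the Microsoft HRTF spatializer enabled in the project's audio settings; the class and field names are my own placeholders, not anything from my actual project.

```csharp
using UnityEngine;

// Attach to the object the sound should appear to come from
// (e.g. the video screen showing the sea).
public class SeaSound : MonoBehaviour
{
    public AudioClip seaClip; // a looping mono sea recording

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = seaClip;
        source.loop = true;
        source.spatialize = true;   // route through the HRTF spatializer
        source.spatialBlend = 1.0f; // fully 3D: the sound is localised
                                    // to this object's position
        source.Play();
    }
}
```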


The video clip can be viewed on my Facebook page if you click the 'Watch on Facebook' link in the embed below.

Progress – Making my MA Project background video

I have joined Videoblocks and downloaded some of their stock clips. Here are two of them, alongside the one I originally downloaded from YouTube, which I would need to edit.

This is the original, downloaded from YouTube. The sound would need to be replaced, because the music is inappropriate for my exhibit. It is much longer than most stock film clips (2m 31sec). I could not make something similar myself with the Vuze camera, as the waterproof case is not available until Q4, and I am disinclined to buy another camera just for this.

The speedboat is either a benefit or a problem. If used, the boat would need to represent the optician and his crew dashing to the scene. There is no sound, and the clip is very short (15 secs), so I would need to check that there is no glitching when it is looped continuously.

This is similar to the above, but less frenetic and without a boat. It also has no sound, and is even shorter (11 secs).

So I need to look at stock sound clips too. Here is one source.

Seagulls sound clip from AudioSparx

https://www.audiosparx.com/sa/module/searchOpt/srchpost2.cfm/uuid.0D788602-9AC5-5148-B2BCDEB4DB49D747

I have searched for spatial sound recordings of seagulls but cannot find any. I suspect, though, that spatialization is applied at playback rather than baked into the recording, in which case an ordinary (mono) seagull clip positioned in the scene may be all I need. I will need to check how to do this, and possibly it is something I could do myself. I will see.


Preparing to film my actors in Green Screen

I have met with my film-maker/scriptwriter friend and talked through what I have in mind. He is now on holiday, considering what to suggest in the way of a treatment, a script, and possibilities for presentation.

I have spent a lot of time thinking about alternatives to including holographic recordings of the actors. You may recall from earlier blogs that this is impossible with the HoloLens at present: the capture capability is currently beyond developers, and the hardware and technical requirements Microsoft has used to date are still being scaled down in the research labs. So the alternatives are:

  • Use avatars with the actors' faces superimposed. I have investigated this and it is possible for me to do (after learning the process), but I think it would detract from the empathy I am trying to create in the exhibit.
  • Film the actors stereoscopically with my Vuze 3D 360 degree camera, which is being delivered next week. However, the HoloLens does not play stereoscopic film. I tried with the 360 degree film Jonathan exhibited recently, but it showed two side-by-side images rather than one stereoscopically merged view. If I wanted to do this I would have to use another AR/MR device. I researched all the other AR/MR (not VR) headsets, but none meet my basic requirements, which are listed below. Only the ODG R-9 smart glasses, with a 50 degree field of view (FOV), met the criteria, but these are not yet released, and only become available to developers in late Q2/Q3, too late to consider for inclusion in the MA Show.
  1. Stereoscopic video playback.
  2. At least a 30 degree FOV, which is that of the HoloLens and is itself regarded as too small. For comparison, VR headsets have a much wider FOV; the HTC Vive, for example, is 110 degrees. (See the quick calculation after this list.)
  3. Lightweight by comparison with the HoloLens.
  4. Can be worn with glasses.
  5. Untethered, except perhaps for a smartphone (preferably the iPhone, which I have) or equivalent.
  • Film the actors conventionally in 2D, which is what I have decided to do.
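To make the FOV numbers concrete, here is a quick back-of-envelope calculation (my own illustration, treating the quoted figures as horizontal FOV): the width of the "window" a headset offers at viewing distance d is roughly 2 x d x tan(FOV/2).

```csharp
using System;

// Rough illustration of why a 30 degree FOV feels small:
// visible window width = 2 * distance * tan(FOV / 2).
class FovDemo
{
    static double VisibleWidth(double fovDegrees, double distanceMetres)
    {
        double halfAngleRadians = fovDegrees * Math.PI / 360.0;
        return 2.0 * distanceMetres * Math.Tan(halfAngleRadians);
    }

    static void Main()
    {
        // At 2 m, a 30 degree FOV (HoloLens) gives a window about 1.1 m
        // wide, while 110 degrees (HTC Vive) would give about 5.7 m.
        Console.WriteLine(VisibleWidth(30, 2.0).ToString("F2"));  // 1.07
        Console.WriteLine(VisibleWidth(110, 2.0).ToString("F2")); // 5.71
    }
}
```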

So I will be in Alasdair's studio at the bottom of his garden for two days next week. Mady will be joining us, and hopefully Vic too (currently she is the only one of us who can edit the film using the Isadora software). We may have to use After Effects, which Matt (Camberwell Digital Media) can help us with. As well as filming for our final MA Show, we will also be filming for our Symposium 2 on 1 June. We set up the equipment in Alasdair's studio the week before last, so we are ready to go.

Alasdair's studio

I have also been researching alternatives for presenting the actors in my MA Show. There are three approaches:

  • Composite the actors (after editing the green screen recordings) into the seagulls background video. I wonder whether I could use the ceiling projector in the MA Show studio space on the ground floor, or whether one of our course's short-throw projectors could be hung from that fitting.
  • Project the actors onto another surface on the set (in front, at the side, on the ceiling, or on the floor), to be seen alongside the seagulls background video. This would require a second (or third) video projector; I would use my Artograph projector for this.
  • View the edited film in the HoloLens. See an example below, downloaded from Videoblocks. I first tried a clip with a transparent background, which did not work, as the transparent areas showed as white; but the one below, with a black background, worked well in the HoloLens and was transparent. Figure that one out! (A likely explanation, with a sketch, follows below.)

So this is a possibility, although I worry about viewers wearing the heavier-than-I-would-like HoloLens for 5 minutes to view the film, and I think that a 30 degree FOV is not as immersive as I would like.
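As for "figure that one out": my best guess, though I have not confirmed it with Microsoft, is that the HoloLens display is additive, meaning its pixels can only add light on top of the real world. Pure black adds no light, so black regions read as transparent, whereas an alpha channel seems to be flattened onto a light backing before display, hence the white. If that is right, no keying is needed at playback, just a black background. Here is a minimal Unity sketch of playing such a clip on a quad; it assumes Unity 5.6+ for the VideoPlayer component, and the file name is a placeholder of mine.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to a quad placed in front of the wearer. Black regions of the
// clip emit no light on the additive display and so appear transparent.
public class ActorVideoScreen : MonoBehaviour
{
    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.isLooping = true;
        player.url = "actors_black_background.mp4"; // placeholder path
        player.Play();
    }
}
```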

Another consideration is SOUND.

  • HoloLens allows the use of spatial sound to improve this experience.
  • Another alternative is to use Feonic devices to make the sound come out of the surfaces on which the video is showing. I have one of these devices, so I will try it out soon.
  • Alternatively, I could use my LG soundbar system, which connects over Bluetooth to my Artograph projector. The benefit of this is that I have used the set-up before and it works well, with no wires between the projector and the soundbar.
  • I have to think about how sound will work in the MA Show studio space in terms of practicality, sound quality, and aesthetics.

So more decisions to make.


Progress – Making my MA Project Sculptures – Life Size

I have assembled the head of Theresa, the optician's wife; just the shoulders to go. I collect the remaining parts from CSM on Monday. You can see the size difference below. The smallest is a three-hour 3D print made on my Ultimaker 2+ Extended printer. The medium version was printed in one piece at CSM and is 19 cm tall. The largest version is the one I am assembling now: it was meant to be life size, but is in fact slightly larger than that, at 40 cm tall when fully assembled from its 10 parts. Each batch of parts takes about 15 hours to print; it took several batches, though I am unsure exactly how many.

Theresa sculpture life size

The assembly process is not as straightforward as it might seem, due to shrinkage and warping during the printing process. While this is minimal between any two parts, the problems arise when joining several together. I had to flex some parts to make them fit, infill gaps and significant height differences, and then sand. Due to differences in plaster colour, some arising from batch changes during 3D printing and others from filling with a whiter plaster, I will probably also need to paint the finished sculpture.

After a 15-hour false start and another 159 hours of 3D printing on my Ultimaker at home, I have now printed the main face of the refugee (Leo), shown as a 3D image in Cura in my earlier blog. This is one of 9 parts, but the largest.

However, I must have done something wrong, as the scale is not as intended. It is meant to be life size, but it is not: you can see that Leo's head is only about the same size as the medium version of Theresa. It also has some imperfections. These could have been avoided if I had asked for a support structure in Cura, the 3D printing software for my Ultimaker, but that would have meant 17 days of continuous printing at high quality, even at this size. So I tried to get by without the support structure. I had to intervene a couple of times to provide a makeshift support of my own when printing reached overhangs such as the eye sockets and ears, which otherwise had filament hanging in mid-air until I did something about it. Even then, these areas are very thin or a bit of a shredded mess.

So, with Leo, I have sadly decided that I cannot make a life-sized bust in bronze. The best I can do is 3D print Leo large, in parts, at CSM, and then reprint his face on my Ultimaker to match that size as closely as I can. The reprinted face could then be used to make an unpolished bronze face to attach to the rest of the bust in white plaster. This approach presents its own challenges.

The first challenge is to take the face I have already printed to the foundry workshop and ask Becky where I should put the runners and risers, and at what angle. The second is to learn a 3D software feature for adding the runners and risers to the 3D file of the face. The third is to print the face at the right size this time. It will probably take at least 50% more time, namely about 10 or 11 days of continuous 3D printing on my Ultimaker, or a month if I add a support structure, which I will probably need to do with runners and risers hanging in mid-air. The fourth is to clean up the 3D printed face. I have ordered some sculptable, wax-like specialist material and tools to do this, as the PLA filament used for the print is too inflexible.

The fifth challenge is to finish the cleaned-up 3D printed face by adding extensions to the runners and risers (as the complete lengths would fall outside the Ultimaker's bounding box), and then encasing the whole piece ready for the foundry's lost wax process. When the bronze is poured, the original PLA face will no longer exist (hence 'lost wax'). I will then have to break the bronze face out of the mould and clean it up, removing the runners and risers with a power saw and grinder. The sixth challenge is to see whether the bronze face fits the rest of the plaster sculpture and, if it does, to bond the bronze and plaster together. All this effort involves taking risks, and a lot of time!! My back-up plan would be to 3D print the face in a bronze-type PLA and not take it through the foundry process. If that did not work, I would use the face printed in plaster at CSM. I will need to decide which route to take: I will seek advice on the aesthetic result from Jonathan, and then consider whether I have enough time to proceed with either the bronze or the pseudo-bronze face option.

I collect a 20cm tall head of the refugee, printed in one piece, together with the parts for the life-size bust, from CSM Digital Fabrication on Monday 28th April. I will keep you posted.


MA Show – Alternatives to Mixed Reality Video

In an earlier blog I included a video of how Microsoft made holographic videos for the HoloLens. I mistakenly said that they used 106 synchronised cameras; it was actually 160, as anyone viewing that video may have spotted. Over the past year, however, the Microsoft Holoportation team have improved on this substantially, as demonstrated in the following two videos. Stand by to be amazed!!!

Terry, "Thank you for your interest in Holoportation. At this time it is a research project and not available for use outside Microsoft, but perhaps that will change in the future. Stay tuned to https://www.microsoft.com/en-us/research/project/holoportation-3/ for any updates. Best regards, Ben Cutler, The Holoportation Team"

Despite these dramatic improvements, holographic video with the HoloLens is sadly still beyond my capability to include in my MA project. This is solely because the HoloLens pipeline uses point cloud technology (as in CGI), and not 3D stereoscopic video. So I sought alternatives (I don't give up easily).

I looked at other mixed reality glasses, available now or soon to come to market, that support viewing 3D stereoscopic video (as opposed to just 2D 360 degree film). The best of these is from ODG, but unfortunately it is not available until late 2017, after our MA Show.


So where do I go now? Well, I am attending the following AR/VR/MR shows to see if there is anything else around:

Virtual Reality Show – 20-22 April, Business Design Centre, London

Augmenting Reality – 26 April, UCL London

VR World – 16 May, Olympia, London

TechXLR8 – 13-16 June, Excel, London

In the meantime, I will have to put my Vuze 3D 360 degree camera (delivery expected 11 April) to one side, as I do not have a mixed reality means of showing its videos for my MA Show project. I could show them on my Google Cardboard or HTC Vive, but both are virtual reality headsets and would not allow the other two layers of my exhibit (see earlier blogs) to be seen at the same time. Shame.

So I am practising green screen filming and editing, with the intention of either compositing the actors into my background (seagulls diving into the sea) or projecting them separately against a black or grey background.
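For anyone curious what the keying step amounts to, here is a toy version of the idea in Unity C#: replace green-dominant pixels with black, which the HoloLens then treats as transparent. Real tools such as Isadora and After Effects do this with far better edge handling; the names and threshold here are my own placeholders.

```csharp
using UnityEngine;

public static class ChromaKey
{
    // Toy chroma key: any pixel whose green channel clearly dominates
    // red and blue is treated as background and replaced with black.
    // "threshold" controls how aggressive the keying is.
    public static Texture2D KeyToBlack(Texture2D frame, float threshold = 0.15f)
    {
        Color[] pixels = frame.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
        {
            Color c = pixels[i];
            if (c.g > c.r + threshold && c.g > c.b + threshold)
                pixels[i] = Color.black; // invisible on the HoloLens display
        }
        Texture2D keyed = new Texture2D(frame.width, frame.height);
        keyed.SetPixels(pixels);
        keyed.Apply();
        return keyed;
    }
}
```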

MA Show possible layout

If the latter, I am investigating the possibility of projecting the seagull video onto netting hanging from the ceiling. But that depends on the room I will be using for the MA Show at Wilson Road.

Ceiling projection

That’s it for now.


Making my MA Project Sculptures – Life Size

Here are some pictures of progress so far.


Suzy posing as Theresa, the optician's wife. First part of ten: her face, 3D printed in plaster infused with superglue.

3D file ready for printing: the face of Leo, part one of nine. The print file was prepared using the Cura software. Note the print time of six days 15 hours on my Ultimaker 2+ Extended.

159 hours to go! Then only another 8 parts.


Tutorial with Jonathan Kearney – 24 March 2017

Our discussion first focused on my recent work. We talked about what I had done during the Tate Exchange. I explained that I had attended all four Digital Maker Collective days during February and March. The event was held over the entire 5th floor of the Switch House, Tate Modern. I led an activity entitled 'Virtual meets Reality', ably assisted by Kirstin Barnes (MFA CSM) and Aurelie Freoua (MA FAD alumna), and in the February sessions also by some BA students from Camberwell and Wimbledon. The activity involved helping visitors experience for themselves Google Tilt-Brush, a 3D painting app using HTC Vive virtual reality equipment; mixed reality using the Microsoft HoloLens headset; and 3D scanning using the Occipital Structure Sensor attached to an iPad mini. These activities were well attended and had a 'wow' factor for most people trying them out for the first time.

I also offered a similar activity during our Low Residency in February, with much-needed help from Manolis Perrakis, an MA Fine Art Digital first-year online student from Greece who had prior experience with the HoloLens. Jonathan tried both the HoloLens and Tilt-Brush during that session. He has a strong preference for the HoloLens, as you can still see what is around you when using it, the holograms being projected into the real-world space of the Camberwell Photography Studio, whereas with the HTC Vive you are in another, virtual world altogether.

I have assisted my wife Suzy with her installation for an MA Museums and Galleries exhibition at the Platform Gallery, Kingston University, which finished last week. The exhibition transfers to the Museum of the Future, Surbiton, next week. Here we exhibited 'The Scream 2030', a 3D printed sculpture produced from a highly detailed scan made using the Veronica Scanner, developed by the Factum Foundation (who pioneered 3D printing in archaeology, allowing destroyed artefacts from antiquity, such as those at Palmyra, to be reproduced). It is also a hologram, produced from the same scan and shown using the HoloLens. The idea was that an original sculpture on display which had to be removed for conservation, was away on loan, or perhaps was 'conserved' as a hologram, could still be seen in its original setting: to show what is possible now, but which may be commonplace in 2030.

Above you can see ‘The Scream 2030’ on a plinth, and then on the floor, with one of the attendees viewing the exhibit holographically using the HoloLens. Below you can see what the viewer saw in the HoloLens (this picture was taken during the Tate Exchange, hence a different colour plinth).


This has some relevance to my proposed post MA Research, as it illustrates how an object could be ‘conserved’ as a hologram. See my blog on my PhD Research proposal.

Our discussion then moved on to my proposed MA Show exhibit (which I will be installing in only 14 weeks' time!!!). I explained that my exhibit was planned in three layers, based on the book 'The Optician of Lampedusa'. The three layers are: a large video projection of seagulls diving and screeching as a backdrop scene; two life-size sculptures, one representing Theresa, the optician's wife, and one representing a refugee; and, I had hoped, a third layer of holographic recordings of the actors' narratives, viewed and heard in the HoloLens.

I said that I would be using video from a stock library for the backdrop, as previously discussed with Prof. Lucy Orta and covered in my earlier blog.

The sculpture of Theresa is the same as the maquette used in the exhibit 'The Scream 2030', except that it will be life-size and will represent the moment when she first saw refugees drowning in the sea. The face of this sculpture has just been 3D printed and can be seen below. The sculpture is being printed in ten parts, so that each part fits within the bounding box (maximum print size) of the ProJet 360 printer at CSM. The parts will need to be assembled and sanded, which I plan to do over the Easter break.

[Photo: the 3D printed face of the Theresa sculpture]

I also showed Jonathan a small 3D print that I had produced (on my Ultimaker 3D printer) alongside the one I intend to use for the refugee. The latter was produced from the scan I had made of the actor Leo Wringer. Both can be seen below. Jonathan commented that the pose for the refugee was a perfect choice. Well done, Leo.

I related the issue I was having with making a 3-5 minute holographic video of the actors narrating their parts, to be seen in the HoloLens. At the time, I only knew that it was proving difficult. Now I know why (see my last blog): it is impossible with my knowledge and resources. So I set the expectation that it would be a 2D video seen in the HoloLens, so that a viewer could also see the rest of the physical exhibit. This is not so easy either, as I am now beginning to discover (more about this in a future blog).

I talked about Lucy Orta's comment that perhaps all three layers were too much; that the purpose of the piece, to invoke empathy for the refugee situation generally, might be better achieved with either the sculptures or the HoloLens narratives alongside the backdrop video, but not both; perhaps even down-scaling further to only the voices of the actors. Jonathan could see her point and, given the difficulties I was having with the HoloLens narrative, thought that the sculptures against the backdrop video were good enough to be my finished exhibit. This may well turn out to be the case, but I will continue the learning experience: devising a script for the actors, directing and video recording their performances against a green screen, editing the video to remove the background using the Isadora software, and finally exporting the edited videos to the HoloLens or another device (perhaps editing their performances into the backdrop video of the sea).

I concluded that I would make as many of these 'assets' as I could in the time left, and then decide which to use in my final exhibit. That decision will depend on the exhibition space yet to be allocated to me and on the other MA Show exhibits I will be sharing it with.

MA Show: Mission Impossible – where my ambitions exceed my capabilities

One of the layers of my MA Show exhibit is intended to be a holographic recording of two actors narrating the piece. After a great deal of research, I could not find out how to do this. So I asked Microsoft, and was referred to a HoloLens specialist in Romania. He had an idea of how it might be done but did not know for certain, so he consulted some other experts. I did not get an answer, but was instead referred to the HoloLens Forum, which shed no further light on the subject. Then I found this YouTube video, which did.

How Microsoft records Holographic video content for the HoloLens


Apparently you start by capturing performances with 106 (yes, one hundred and six) synchronised RGB and infrared cameras on a calibrated green screen stage. Need I continue? If you do, you can see from the video that it gets MUCH more challenging. Sadly, it is beyond the capabilities and pockets of most mortals, including me.

So I will have to adapt this aspect of my MA Show, and I have yet to figure out what and how. I will post another blog when I do.

Preparing to Make my MA Project Holographic Film

This will be viewed in the Microsoft HoloLens, which I continue to discover more about during my trials at home, my visits to the CSM 4D Digital Studios, and my stints at the Digital Maker Collective Tate Exchange (the last one is next week). I have also bought a book about making holograms for the HoloLens. Terrifying!!! Here is Sharon trying it out at the last Tate Exchange.


My home project has been to take the 3D scanned image of the Theresa sculpture into the HoloLens, scale the result, and place it within a real-life environment. Much easier said than done. However, I did it, following the idiot's guide to the HoloLens (but for geniuses only). Sadly, Theresa literally filled the room when viewed through the HoloLens. I could only scale her down to the size of a house (sorry, Theresa), and could move her only in a limited way. I sought help from Sion Fletcher, who teaches Unity among other things at CSM. His approach was to write a Unity program to achieve what I was looking for. It took him the best part of half a day, which is quick for holographic program development. It worked, but was highly unstable. The images of Theresa were correctly scaled, but they took 5 minutes to appear, then would disappear, only to reappear in a completely different place in my house, which took me 10 minutes to locate. Then the hologram would refuse to be moved more than a few centimetres at a time.
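I do not know what Sion's Unity program actually contained, but the heart of the scaling step is small; the hard parts, stable placement and anchoring in the room, are well beyond a few lines. A minimal sketch of the scaling alone, with placeholder names of my own:

```csharp
using UnityEngine;

// Attach to the root of an imported model to size it to a sensible
// real-world width at start-up.
public class ScaleToWidth : MonoBehaviour
{
    public float targetWidthMetres = 0.2f; // e.g. 20 cm on a tabletop

    void Start()
    {
        // Measure the model's current width from its renderer bounds,
        // then scale uniformly to hit the target width.
        Renderer rend = GetComponentInChildren<Renderer>();
        if (rend == null || rend.bounds.size.x <= 0f) return;
        transform.localScale *= targetWidthMetres / rend.bounds.size.x;
    }
}
```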

Suzy, who is participating in her first MA show, having seen the extremely enticing holograms of ballerinas and the like, asked to include her scan (as Theresa), shown in the HoloLens, in her Kingston University MA show 'Museums in 2030'. I was on the rack, and panicking, as her show set-up is this coming Monday. Fortunately, I read the HoloLens instructions again and realised that there are size and format limitations on any imported file. This was yesterday (Friday). So I set about trying to convert an SLS file (used to print the Theresa sculpture at CSM) to SDK (used by the Ultimaker 3D printers at Camberwell, and the make I own). I experimented with lots of software, as I had to go via a conversion to OBJ (used by my 3D scanner). Ugh! I was pulling my hair out by lunchtime. Then I thought of using Blender (which I had never used before). Jonathan loaded it onto the Mac in our studio, and with a few hints from Alejandro (who was busy using it that afternoon for his own presentation the next day), amazingly, I did it. This time it worked: scaled from over 2 metres to 20 cm wide; reduced from 77,000 vertices to 28,500, under the prescribed maximum of 30,000; converted directly from STL to SDK; and within the 25 MB file size limit, down from 100 MB. So I went home to try it on the HoloLens using their standard method. Miracles. It worked. And it was stable. It did not run about the house, nor did it keep disappearing. Success. Relief.

Now I have to teach Suzy how to use the HoloLens and manipulate her holographic sculpture in the exhibition space at Kingston University. Exciting. The private view is on Tuesday next week. I will put pictures on my blog.

Last week I also tried green screen filming for the first time. This is something I need to understand when it comes to filming my actors for the HoloLens. Alasdair, Vic, Mady and I worked together in the Digital Media studio at Camberwell. We played around with the Isadora software (which Vic understood) and equipment borrowed from the Camberwell loan store (a first for Alasdair and me). Here are a few video clips and pics of how we got on.


Due to the four-week break for Easter and the Camberwell building work, Alasdair and I and the rest of our small team are decamping to Alasdair's studio at the bottom of his garden in Barnes. We could not borrow the equipment over this period, so we decided to buy our own, as it was not as expensive as we first thought. It will give us a level of independence whenever everyone from the Camberwell BA and MA courses wants to use the Digital Media green screen studio and loan equipment at the same time as we do. It will also be a back-up for me in case I am unable to use the wrap-around green screen facility at Wimbledon for my final shoot, which looks exceedingly likely, as it is currently booked out to Wimbledon Theatre Design students until July! So a necessary investment, we think.



Making my MA Project background video

I had intended to film this myself, leaning over the side of a boat with a camera dangling half in and half out of the sea while attracting seagulls to dive and screech. No mean feat for a complete novice film producer!

However, I was rescued by two events. Firstly, Lucy Orta said that this was not the main event of my project, and that it was therefore OK to use stock library clips (expensive for the right quality). Secondly, while my Vuze camera is due to be delivered shortly, the makers have not yet released a waterproof case, and do not expect to until around August/September: after my MA Show!

I have looked at stock film from Getty Images, and Michael is going to point me towards some others. I will find out more when Michael and I meet this coming Friday.


Getty Images library