Jungle Rumbles the Melbourne Festival
Oct 28, 2011 at 12:00 AM
It’s A Jungle In Here opened at the Melbourne International Arts Festival last week. Despite some initial issues with an overheating computer and a troublesome hard drive, audience reactions and feedback were great.

 
We’re looking forward to some well deserved rest…
 
UPDATE: there's a nice review in Real Time by Ulanda Blair.
Atelier Edens Warms Up
Oct 20, 2011 at 12:00 AM
Over the last three days Willoh S Weiland, Thea Baumann, Josh Gardiner, Elizabeth Dunn, Liam O’Keefe (Creative Environment Enterprise) and I went through our plans for the upcoming Atelier Edens labs – and specifically the kinds of projects, equipment, resources and energy requirements for the Wilderness Lab at Point Nepean in December this year.

We came up with a structure that looks like:

[image: AE_LAB-STRUCTURE_FINAL-1]

Basically I’m quite keen to approach the labs as a series of content-gathering and technology-prototyping sessions. In terms of outcomes, we’re interested in working toward performances in which audiences are either physically or digitally relocated to natural and remote environments.

Bunyip Development II
Sep 21, 2011 at 12:00 AM
After a week off for me – and a week of rehearsals for the puppeteers and performers – I’m back in Sydney working on I, Bunyip. The set is now complete and looking awesome:
 
[image: bunyipRockStage]
 
So this week Phil Downing and I are integrating the Kinect tracking and mask textures produced by my openFrameworks application with the overall show control – which is mostly being done in Qlab and Quartz. Although all of the video playback and effects could be done in oF, we’ve opted for a system that maximizes Phil’s control over the content and effects – he is not a C++ programmer (nor does he want to be!), but he has been doing a lot of work with Quartz over the last few years – so it makes sense to ‘serve’ tracking data and mask textures to Quartz, and use OSC/MIDI messages to control the oF application.
 
Getting control data in and out of an oF application is quite straightforward. ofxMidi and ofxOsc are simple to integrate, so cues sent from Qlab can trigger different ‘scenes’ (each with different depths at which we are tracking and/or creating mask textures to produce registered projections).
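
For the curious, the cue-receiving end looks something like this – a minimal sketch rather than the actual show code, and the /scene address, port number and depth values are all made up for the example:

// a minimal sketch (oF 007-era style) of receiving scene cues via ofxOsc
// -- the /scene address, port 12345 and the depth values are invented
#include "ofMain.h"
#include "ofxOsc.h"

ofxOscReceiver receiver;
int nearThreshold = 500;  // tracking depth band in mm
int farThreshold  = 1500;

void setupOsc() {
    receiver.setup(12345); // listen for cues sent from Qlab
}

void updateOsc() {
    while (receiver.hasWaitingMessages()) {
        ofxOscMessage m;
        receiver.getNextMessage(&m);
        if (m.getAddress() == "/scene") { // one cue per 'scene'
            int scene = m.getArgAsInt32(0);
            if (scene == 0) { nearThreshold = 500;  farThreshold = 1500; }
            if (scene == 1) { nearThreshold = 1500; farThreshold = 3000; }
        }
    }
}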
 
Getting the textures out in an efficient way would be problematic, if it weren’t for the genius of Anton Marini aka Vade, and Syphon. Basically Syphon – and the oF addon ofxSyphon – allows you to ‘share’ textures between applications. It’s super efficient and super handy. Using Syphon I am literally uploading the registered projection ‘masks’ to the GPU, and then sharing them – still on the GPU – with Quartz.
 
Phil can then use these textures to mask any color, video and visual effects he builds in Quartz, which is also receiving MIDI cues from Qlab.
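
Publishing the mask is only a few lines with ofxSyphon. A sketch under assumptions – the server name and texture size here are mine, not the show’s:

// a sketch of sharing the mask texture via Syphon; "bunyip mask" is a
// made-up server name, and Quartz sees it as a regular Syphon source
#include "ofMain.h"
#include "ofxSyphon.h"

ofxSyphonServer syphonServer;
ofTexture maskTexture;

void setupSyphon() {
    syphonServer.setName("bunyip mask");          // how the server appears to clients
    maskTexture.allocate(640, 480, GL_LUMINANCE); // Kinect-resolution mask texture
}

void publishMask() {
    // the texture already lives on the GPU; Syphon shares it without any
    // CPU readback, which is what makes the approach so efficient
    syphonServer.publishTexture(&maskTexture);
}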
 
[image: bunyipRockKinect01]
 
Pretty neat really. 
Bunyip Development I
Sep 05, 2011 at 12:00 AM
I just spent two weeks in Sydney working with the inimitable ERTH on their upcoming show ‘I, Bunyip’. Phil Downing, Scott Wright, Steve Howarth and I have done quite a bit of thinking about how registered projection and architectural projection might be used in this show about indigenous mythical creatures from the New South Wales Blue Mountains region.
 
[image: bunyipRockStageBuild]
 
Essentially we’re aiming to use a Kinect camera and a projector to create a virtual lighting system for the main presenter in the show, as well as for special moments when performers and puppets need to move freely around the stage. This is, in effect, a registered projection system – one where we can use any color or video to light performers or puppets.
 
In ‘traditional’ lighting for puppetry it is necessary to use ‘corridors’ of light (usually with one or more profiles, tightly focused from the left and right of the stage). The main problem with this is that performers cannot cross this boundary, and puppets cannot move freely upstage and downstage. By using a ‘virtual’ lighting system – one where we use a camera to dynamically detect the silhouette of puppets and performers – these restrictions can be overcome.
 
In order to simplify the amount of computation and to make the system fast, reliable and easy to set up, I devised a system that uses optical alignment between the Kinect and a particular series of BenQ ultra-wide-throw projectors to eliminate much of the distortion and rectification issues registered projection systems often run into. The BenQ 8xx series is perfect for this as it has a field of view (FOV) very close to the Kinect’s.
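
At its core the mask generation is just a depth threshold. Here’s a minimal sketch, assuming a 640x480 buffer of raw millimetre depth values (e.g. from ofxKinect) – the function and variable names are mine:

// build a projection 'mask' from Kinect depth data: anything inside the
// near/far band becomes white (lit), everything else stays black
#include "ofMain.h"

void updateMask(const unsigned short* rawDepth, // raw depth in mm
                unsigned char* maskPixels,      // 640*480 single-channel output
                ofTexture& maskTexture,
                int nearMM, int farMM) {
    const int w = 640, h = 480;
    for (int i = 0; i < w * h; i++) {
        unsigned short d = rawDepth[i];
        // 0 means 'no reading' on the Kinect, so exclude it explicitly
        bool inBand = (d != 0 && d > nearMM && d < farMM);
        maskPixels[i] = inBand ? 255 : 0;
    }
    // upload to the GPU; because the projector's FOV roughly matches the
    // Kinect's, the mask can be projected more or less as-is
    maskTexture.loadData(maskPixels, w, h, GL_LUMINANCE);
}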
 
In order to disguise the system we put it inside a ‘rock’! 

[image: bunyipRockKinect02]

Ars Electronica Residency
Jul 23, 2011 at 12:00 AM
I’m super-excited to announce that I am going to the Ars Electronica Futurelab to do a residency from May to September 2012. Last April I was invited to apply for the residency program by Antoanetta Ivonava from Novamedia.

The other two recipients for 2011-13 are Jon McCormack and Daniel Crooks. Previous residents at the Futurelab include Matt Gardiner, Leon Cmelewiski, Josephine Starrs and Adam Nash.
 
I had to read the email three times before I actually believed it...!
 
The residency is made possible through the support of the Australia Council for the Arts, Novamedia and Ars Electronica Futurelab.
Flash Is Not My Friend
Jun 27, 2011 at 12:00 AM
Turns out that JSFL is not the answer to our problems. Flash ActionScript – being the super-reliable, always-intuitive and, not to mention, thoroughly documented language it is – has a LOT of problems giving accurate translation, rotation and scale information about graphic and movie clip objects. As previously mentioned, graphic objects do not store their positions (as far as ActionScript is concerned), and MovieClips do not allow Isobel to see onion skinning. So… Isobel has to animate with Graphic instances and then convert these to MovieClips using a variation of the JSFL below.

But the only way to accurately get translation, rotation and scaling of a MovieClip in Flash is to do it from INSIDE that object. Trying to traverse the object tree from the top down results in PAIN. So I came up with the following idea for getting the absolute transforms of the heads for each of the characters:

1) Put a ‘dummy’ MovieClip inside the heads of the characters, aligned to a consistent position in relation to each of the animations. This clip has ActionScript that broadcasts its Parent’s (i.e., the head’s) position, rotation and scale.

2) Broadcast all of the dummy/head clip data via a TCP/IP socket connection to an openFrameworks application that then serializes this data into a more usable format for C++ (we ended up saving this out using boost::serialize as a vector of a custom data type – which makes loading into the application very simple; see the sketch after this list).

3) Tear our hair out over random, single-frame errors in a number of files. Of course the Flash files are huge, and there are literally thousands of frames of animation. So sometimes by accident we end up with multiple instances of heads or dummy MovieClips, or they are named wrong, or some other inexplicable error occurs (of the variety Flash is very good at)… the weirdest being extra heads/dummy clips sitting at positions like -3578.432, -2987.908 which cannot even be selected, since they are completely off-stage and in fact beyond the bounds at which Flash will technically allow an object to be positioned!! In the end Isobel reverse-engineered the selection process: Select All, then deselect everything you can see, one by one, until you are left with only these impossible, off-stage objects.

Then you hit delete.
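
For what it’s worth, the C++ end of step 2 is pleasingly boring. A sketch of the deserialization, with HeadTransform standing in for our custom data type (the actual fields may differ, and I’m showing a binary archive, though any Boost archive type works the same way):

// load the head transforms saved out with boost::serialize
#include <fstream>
#include <string>
#include <vector>
#include <boost/archive/binary_iarchive.hpp>
#include <boost/serialization/vector.hpp>

struct HeadTransform {
    int   frame;                          // timeline frame this sample belongs to
    float x, y, rotation, scaleX, scaleY; // the absolute transform from Flash

    template<class Archive>
    void serialize(Archive& ar, const unsigned int version) {
        ar & frame & x & y & rotation & scaleX & scaleY;
    }
};

std::vector<HeadTransform> loadTransforms(const std::string& path) {
    std::vector<HeadTransform> transforms;
    std::ifstream ifs(path.c_str(), std::ios::binary);
    boost::archive::binary_iarchive ia(ifs);
    ia >> transforms; // one line and the whole animation is back
    return transforms;
}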
JSFL Jungle
Jun 15, 2011 at 03:50 AM
I've just started working with Van & Isobel on their new work 'It's A Jungle In Here'. The work is a darker, two-player animated interactive, loosely based on the aesthetic of web-camera insertion of audience faces explored in You Were In My Dreams.
 
That project was coded in Flash. Although I do a lot of work with Flex/Flash I'm quite keen to avoid it most of the time.
 
So, although it's early days, we're testing out how well openFrameworks can handle HD video, multiple web cams, voice and force interaction. 
 
But first we needed to get access to some properties of animated graphics in FLA files in order to create masks and tracking objects from the animation itself (which Isobel & Van are doing in Flash).
 
One reason I hate Flash/Flex is that objects (and indeed code) often have attributes or properties that only work with one type, and not with another. In order to get onion skinning and 'start-loop-at-frame' functionality you need to work with "Graphic" instance types on the timeline...but to get programmatic access to these same objects I need "Movie Clips"...[sigh]...
 
So how do you convert hundreds of Graphic symbol instances into Movie Clips, rename the symbol instances and crop the Movie Clip library items so they start at the right frame number?
 
You have to learn JSFL [double sigh]. JSFL is another under-documented, non-standard Adobe pain-in-the-… but it does let you do some pretty nifty programmatic things to documents, timelines and objects within the Flash environment.
 
After a lot of mucking around with the syntax (JSFL is VERY picky about what order you can do things and which commands can be combined with others) I came up with the following. I hope it eases someone else's pain. 
JSFL Replace Graphic Instance With Renamed Movie Clip
var name = prompt("New name for all selected instances:");

if (name != null && name != "") {

    var itemCount = 0;
    var curLayer = fl.getDocumentDOM().getTimeline().currentLayer;
    var myElements = new Array();
    var myFrames = new Array();

    // this is kind of nasty as it iterates every instance in every frame of the currently selected layer
    // (i.e., no need to select each element -- for some reason selections are not working with our animations???)
    // e.g., for (var i in fl.getDocumentDOM().selection) should work but it just doesn't, so I'm using the method below

    for (var i in fl.getDocumentDOM().getTimeline().layers[curLayer].frames) {
        // fl.trace(fl.getDocumentDOM().getTimeline().layers[curLayer].frames[i].elements[0].symbolType);
        var currentElement = fl.getDocumentDOM().getTimeline().layers[curLayer].frames[i].elements[0];
        if (currentElement == undefined) continue;    // skip empty frames
        if (currentElement.name != name) {            // dodgy identification of an instance/symbol on the timeline
            itemCount++;                              // count how many we've done for statistical purposes
            myFrames.push(currentElement.firstFrame); // store the frame number where the graphic 'loop' starts
            currentElement.symbolType = "movie clip"; // convert the graphic instance into a movie clip
            currentElement.name = name;               // rename it
            myElements.push(currentElement);          // store a ref to this instance
        }
    }

    fl.trace("Renamed and retyped " + itemCount + " instances");

    // iterate through all the stored refs to instances on the timeline and
    // a) create a copy of the movie clip library item used to create the instance, then
    // b) swap the itemRef for the instance on the timeline to the new library item we just created, then
    // c) edit the new movie clip lib item and crop the starting frame to simulate the graphic 'loop' starting-frame functionality
    // ...simple really -> pity JSFL, like all Adobe scripting languages, is woefully under-documented and doesn't behave as expected
    for (var i in myElements) {
        // get the element ref previously stored
        var originalLibraryItem = myElements[i].libraryItem;
        // get the library path of the instance's item and the actual movie clip item name
        var lib_path = originalLibraryItem.name;
        var mc_name = lib_path.substr(lib_path.lastIndexOf("/") + 1);
        // create a new name for our duplicate lib item
        var newItemName = mc_name + "_" + i;
        // select the original lib item in the library
        fl.getDocumentDOM().library.selectItem(originalLibraryItem.name);
        // duplicate it and set its properties
        fl.getDocumentDOM().library.duplicateItem();
        fl.getDocumentDOM().library.setItemProperty("name", newItemName);
        fl.getDocumentDOM().library.setItemProperty("linkageExportForAS", true);
        fl.getDocumentDOM().library.updateItem();
        // get a ref to this new library item (probably a better way but I couldn't find it)
        var itemIndex = fl.getDocumentDOM().library.findItemIndex(lib_path + "_" + i);
        var newLibraryItem = fl.getDocumentDOM().library.items[itemIndex];
        // set the instance/symbol's item reference to this new item (the swapItem command didn't seem to work)
        myElements[i].libraryItem = newLibraryItem;
        // make sure we have nothing else selected in the library
        fl.getDocumentDOM().library.selectNone();
        // re-select the newly created lib item and open it for editing
        fl.getDocumentDOM().library.selectItem(lib_path + "_" + i);
        fl.getDocumentDOM().library.editItem();
        // tell us about it...
        fl.trace("Editing: " + fl.getDocumentDOM().getTimeline().layerCount);
        // cycle through all layers and remove frames from the first frame to the graphic 'loop' start frame
        for (var j = 0; j < fl.getDocumentDOM().getTimeline().layerCount; j++) {
            fl.trace("Layer: " + j + " :: " + fl.getDocumentDOM().getTimeline().layers[j].name + " remove " + myFrames[i] + " frames");
            fl.getDocumentDOM().getTimeline().setSelectedLayers(j);
            fl.getDocumentDOM().getTimeline().removeFrames(0, myFrames[i]);
        }
    }
}
(c) m gingold
HydraPoesis Prompter II
Jun 03, 2011 at 12:00 AM

Back in Perth this week working on the Prompter development. Finally got sync’d audio and video recording to work in oF, even if it is Mac-only! Had to use QTKit (i.e., QuickTime X) as opposed to the QuickTime 7 (C/C++) SDK. But the results are WAY better. We’re adding a bunch of features to the Flex application and then using local network/socket connections to communicate between a windowless oF application and the Flex application. By using ManyCam (a utility that allows you to mirror your webcam to multiple applications) we can now stream using Flex and record in hi-def using openFrameworks. Cool stuff.

Great Walls of Fire
May 16, 2011 at 09:32 AM

As if Korea wasn’t exhausting enough, we’re now in Shanghai looking for good food, good views, crazy Chinese bars and custom clothes. I'm hoping there's a nightclub as cool as this one I found in downtown Seoul:

[image: hackerKorea]

USD & Tae Eun Kim
May 12, 2011 at 08:02 PM
A major highlight of our time here in Korea has been collaborating with USD (dance) and Tae Eun Kim (video). We performed 4 x 15-minute improvisations for the Hi Seoul Festival. USD performed inside the Book, with Tae Eun’s video projected onto a big rear-projection screen behind them.
Book of Risings
Apr 14, 2011 at 11:10 PM
Back on the Book and back in Seoul!! My brief this time around with the Great Wall of Books is to collaborate with Renato Vacirca on creating sound for all performances and durational events. We’ve got a really nice studio with a grand piano to work in, and I’m honing my Ableton skills. 
 
We’re working mostly on live improvisational strategies and sounds, using a lot of percussion, delay effects and samples from some old records Renato brought back from Europe.
 
It’s really great to get this time to concentrate on music composition – usually I don’t get the time when I’m doing video production.

HydraPoesis Prompter
Apr 04, 2011 at 08:50 AM

Started work this week with Perth’s Sam Fox of HydraPoesis. Sam is developing a performance work set in a TV/online broadcasting studio and wants to create real online collaborative tools for creating content, as well as develop a system for playing back content (some live, some pre-produced) in the performance itself.

I’ve been looking into Flex/Flash Media Server options as well as FFMPEG/QuickTime/DirectShow options for recording and streaming the content. So far I still can’t get oF to do sync’d audio and video recording, and since Flex/FMS has streaming fairly well sorted, I’ve prototyped the collaborative tool in Flex. Basically (!) it’s a multi-user, custom Skype-like interface which allows users to read off a teleprompter and record the content.

I’m off to Korea and Shanghai for a little while but we’ll continue the development in May/June.

Multiple HD Video Sync
Mar 28, 2011 at 04:35 PM

It’s one of those things that media art gallerists (or exhibitionists?) constantly need to do: sync multi-projector video works. Back in the days of SD DVD works you used Pioneer VT7200/7300 DVD players and a David Jones sync box. Now… well, how do you do it with HD video in about a million formats? Richard Manner (Performance Space Tech Manager) and I have talked about this on and off for years. So while I was working over at ERTH I took some time out to code up a simple sync manager for video in openFrameworks. It’s not doing sync lock (just a simple offset and ping-to-play-all videos on any number of connected clients) but it does the job within 2 or 3 frames across as many computers as you hook up. Simple, fast solution.
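
The gist of the ping-to-play approach, sketched with ofxOsc (the /play address, port and client list are invented for the example):

// 'ping-to-play' video sync: the server sends a single /play message and
// every client starts its local copy of the movie on receipt
#include "ofMain.h"
#include "ofxOsc.h"

// server side: one message per connected client
void pingPlayAll(const std::vector<std::string>& clientIPs) {
    for (size_t i = 0; i < clientIPs.size(); i++) {
        ofxOscSender sender;
        sender.setup(clientIPs[i], 9000);
        ofxOscMessage m;
        m.setAddress("/play");
        m.addFloatArg(0.0f); // per-client start offset in seconds
        sender.sendMessage(m);
    }
}

// client side: call every frame; starts the movie when the ping arrives
void checkForPlay(ofxOscReceiver& receiver, ofVideoPlayer& video) {
    while (receiver.hasWaitingMessages()) {
        ofxOscMessage m;
        receiver.getNextMessage(&m);
        if (m.getAddress() == "/play") {
            float offset = m.getArgAsFloat(0);
            video.setPosition(offset / video.getDuration()); // setPosition takes 0..1
            video.play(); // network jitter keeps us within a couple of frames
        }
    }
}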

[image: syncVideo]

ERTH I, Bunyip
Mar 21, 2011 at 10:01 AM

This week I’m in Sydney working with ERTH Theatre on their upcoming I, Bunyip show. Combining performers and puppets onstage, we are developing a system for virtual lighting and set projection. I’m using registered projection with both Kinect and CCTV cameras to allow us to track and light performers/puppets. I’ll also be writing software that allows us to easily calibrate projectors to the set in order to project Sam James’ video design onto individual elements onstage.

Kinect & openNI Development
Mar 09, 2011 at 09:50 AM
I’ve been working for a while with Gorkem Acaroglu on a work called Exception. The show is based in Second Life, and the brief has been to investigate projection techniques to allow interaction between physical and virtual performers, as well as to develop an interface for physical performers to control avatars in the Second Life environment.
 
In parallel I bought myself a Kinect camera for Christmas and, after playing around with it for a while, realized it would be perfect for controlling an avatar. Of course there are a number of gotchas when trying to control an avatar in Second Life (not least of which is the quite closed nature of Linden Labs’ platform). There’s already some existing software called FAAST which can convert a limited number of gestures to key strokes (as Second Life has no direct control of animation you have to input keyboard commands, unless you want to run a Second Life-like environment such as OpenSim, etc).
 
But I was interested to see what I could do with skeleton recognition in openFrameworks. Roxlu had begun wrapping the openNI drivers for oF, but seeing some bugs and scope for further development I’ve taken his project and been steadily extending it, including porting it to Windows, and implementing gesture recognition, hand tracking and ONI recording options (see http://github.com/gameoverhack/ofxopenni and http://forum.openframeworks.cc/index.php/topic,5125.120.html).
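
To give a flavour of the FAAST-style mapping, here's a sketch of turning a skeleton pose into a control message. The Joint struct and /key address are stand-ins for this example (real ofxOpenNI types differ), and the keystroke itself would be injected by a small helper app:

// if the user's right hand is raised above their head, emit a message
// that a helper app can translate into a Second Life keystroke
#include "ofMain.h"
#include "ofxOsc.h"

struct Joint { float x, y, z; }; // hypothetical projective joint coordinates

void mapGesture(const Joint& head, const Joint& rightHand,
                ofxOscSender& sender, bool& wasRaised) {
    // screen-space y decreases upward, so 'above' means a smaller y
    bool raised = rightHand.y < head.y;
    if (raised && !wasRaised) { // only fire on the rising edge
        ofxOscMessage m;
        m.setAddress("/key");
        m.addStringArg("w"); // e.g. walk forward in Second Life
        sender.sendMessage(m);
    }
    wasRaised = raised;
}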

Eulogy for the Living
Mar 07, 2011 at 01:41 PM

Eulogy for the Living premiered in Australia at the Ballarat Mining Exchange over the weekend. The collaboration was with Tony Yapp Company, featuring Tony Yapp (director), Yumi Umiumare (creative collaborator), Pia Interlandi (set & costume), Dori Bicchierai (lighting), Gus Macmillan (pre-recorded sound), Kath Papas (producer), Janette Hoe, Geraldine Morey, Daniel Mounsey, Brendan O'Connor, Adam Forbes, Yoka Jones and Ben Rogan (performers). For more info visit the TYC homepage.

I did the video design, performed live sound and video, and collaborated on the set design with Pia. Here's some video:

The collaboration is one of the best I've had in live performance: given the scale and scope of the project and the limited time for rehearsal and bump-in, the team worked super-seamlessly to create the best possible outcome. And given that Ballarat is my home town, it was especially awesome for me to see ~200 people queue down Lydiard Street to get in to see a contemporary butoh performance!!

Here's a couple of images of the original set design and the finished installation at the exchange:

[image: eulogy04]

[image: eulogy02]

[image: eulogy03]

Big Week in Funding
Feb 28, 2011 at 06:49 PM

It’s been a big week in funding! Firstly, “Stranger Danger” (working title), the new work from Van Sowerwine and Isobel Knowles, for which I am doing the coding and interface design, got funded by Inter-arts (projects). Then the Aphids Atelier Eden inter-lab residency, for which I’m a lead artist, got funded. And then Well got a Theatre Board development grant. Wowser! Big year(s) ahead.

Tomb @ MONA/FOMA
Jan 31, 2011 at 03:23 PM
Last year as part of 24@Dancehouse I collaborated with Balletlab to create a short film. Controversially (for a curated dance show involving 4 choreographers creating a dance work in 24 hours) our collaboration did not include any “dance”. We shot and I edited the movie in 24 hours – it was still rendering 15 minutes before the show! For 24, the film was basically just a rush-cut and ran to ~50 mins…
 
This year Balletlab are presenting Trilogy at MONA/FOMA for the opening of the new Museum of Old and New Art. They are showing Amplification, Miracle and a new work, Above (featuring 2 choirs and 50 toy pianos!). As part of Above I’ve re-edited, re-graded, mixed the audio in surround and mastered the work from 24 to create a short film called Tomb. Check it out:
 
PS: the MONA opening was as amazing as the gallery itself!!
Mirror/Mirage
Nov 01, 2010 at 11:39 AM

Super happy with the finished work and super pleased to present Mirror Mirage at the second Melaka International Art Festival. Janette and I re-shot the original idea and have got Philippe Pasquier to compose music for the installation. Janette is also going to do a performance with the work: one of the figures in the video will be replaced by her performing the choreography live. Unfortunately I can't make it to the Festival (as I'm driving around the south of Mexico, doing the edit of the work on the road, in hotel rooms and hammocks!) but I'm hoping it all goes smoothly. Here's the final work:

Mexico Mexico
Oct 31, 2010 at 01:04 PM

After all that crazy fountain music at the Expo Bicentenario it’s time for a well deserved rest. We’re off to explore Mexico, starting with Pátzcuaro, Michoacán, for Day of the Dead and then heading south. Let’s hope we run into cowboys and the EZLN!

[image: mexicoElectronica]

Solenoid Concert
Oct 30, 2010 at 02:48 AM

And here’s a little video excerpt of the collaboration these solenoids were used for. Roberto Morales is a world-renowned noise musician specializing in generative and physical live composition. He does great things with Wii controllers, flute, piano and any input signal you can throw at him! For the concert I totally re-wrote my VJ software to implement openCL effects and perfect MIDI and OSC signal routing, so that any MIDI or OSC instrument/signal can control any of the software parameters. Roberto’s software does real-time analysis of the sounds he is monitoring and then outputs this analysis as OSC signals, which are then (at various times) either driving video parameters or my drum machine or the solenoids, or all three… there’s more info on the Vimeo page too.

Solenoid Development
Oct 15, 2010 at 02:18 AM

Originally we wanted to turn the book into a giant instrument – but time of course got the better of us, and so instead I’ve begun developing a solenoid percussion instrument. Solenoids are very similar to a DC/AC motor (i.e., they have a shaft which is surrounded by wound copper wire), but instead of turning when electricity is applied to the wire, the magnetic force produces lateral movement (i.e., the shaft goes in and out ;-). Inspired by instruments like this vibraphone and this hit-anything approach, I’ve set out to do much the same thing.

[image: solenoid3]

The circuit is basically a series of power transistors operating a bit like a relay – 5V power from an Arduino (controlled by an application written in openFrameworks) is used to switch 12V high-amperage power, which drives the solenoids. The software lets me use my drum pads (or any other MIDI note) to control the solenoids.

The solenoids needed mounting in aluminium as they get REAL HOT when sucking down 1-2A of 12V power!
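
The firmware side is tiny. A sketch under assumptions – my oF app sends one byte per hit over serial, and the pin numbers and pulse length here are illustrative, not the real build:

// each serial byte from the oF app is a solenoid index; pulse the matching
// transistor pin briefly -- short pulses also keep the 1-2A coils cooler
const int solenoidPins[] = {2, 3, 4, 5};
const int NUM_SOLENOIDS  = 4;
const unsigned long PULSE_MS = 30;
unsigned long pulseEnd[NUM_SOLENOIDS];

void setup() {
    Serial.begin(57600);
    for (int i = 0; i < NUM_SOLENOIDS; i++) {
        pinMode(solenoidPins[i], OUTPUT);
        digitalWrite(solenoidPins[i], LOW);
        pulseEnd[i] = 0;
    }
}

void loop() {
    // a MIDI note in the oF app becomes a single index byte here
    if (Serial.available() > 0) {
        int idx = Serial.read();
        if (idx >= 0 && idx < NUM_SOLENOIDS) {
            digitalWrite(solenoidPins[idx], HIGH); // transistor on -> 12V to coil
            pulseEnd[idx] = millis() + PULSE_MS;
        }
    }
    // non-blocking release so overlapping hits still work
    for (int i = 0; i < NUM_SOLENOIDS; i++) {
        if (pulseEnd[i] != 0 && millis() >= pulseEnd[i]) {
            digitalWrite(solenoidPins[i], LOW); // transistor off
            pulseEnd[i] = 0;
        }
    }
}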

Virtual Book & Telephone Interactive
Oct 04, 2010 at 11:51 AM

For the Book of Revolutions I've completely re-programmed the Well's Virtual Book: ported it from Flash to Flex and taken advantage of the new Spark TextFlow features. I’ve improved the page turning, added the ability to load images for intro pages, fixed up auto-page turning when typing, and completely re-designed the contents pages and the method of leaving a story (previously you had to make up a story title; now you can either make up a story or re-write someone else’s story).

[images: theVbook2, theVbook1, theVbook3]

I've also been working on a new Telephone Interactive for the Book. Basically audience members will be able to walk up to a telephone receiver, and when they pick it up a voice will let them know that they are about to be asked a question. Around 10 to 20 questions will be recorded and one will be played at random from the database. After a beep the audience member can leave a message. All the messages go into a database, and these messages, along with other music and sound effects, will be automatically played back through the main PA and several radio receivers placed inside and outside the Book itself. We'll have radios and headphones so people can wander around and find the radio signals (situated in marked boxes)...

[image: telephoneInteractiveMaking]

Book of Revolutions
Sep 16, 2010 at 10:07 PM
Viva Mexico!! As part of Expo Bicentenario (Mexico) the Great Wall of Books is touring to the land of cactus, cowboys and ice cream, and I’m going with them! The Book of Revolutions (as this tour is called) involves a 6-week development and a 2-week showing. We’re looking at further developing the Virtual Book (an interactive where participants can type stories and read other people’s stories stored in a database), investigating audio interactives and broadcast, and hopefully developing the idea of the-book-as-instrument. More soon, but right now it’s tortilla time!
Mirrored Dance
Aug 31, 2010 at 10:40 AM

Janette Hoe and I met at the Melaka International Art Festival in 2009. After meeting a few times to discuss what we like and don’t like about dance on/in video, we’ve begun a collaboration. Janette is keen to explore ideas about her identity as a Malaysian Australian. I’m keen to continue exploring ways in which physical choreography can be used to highlight inherent properties of digital media (and vice versa). As a starting point we’re looking at ways to create “mirrored” movement without actually mirroring the video – that is, creating movement that is as close as possible to exactly the same, but mirrored physically.

[image: mirrorDev]

We were intending to just have two images of Janette in the frame, but when we were reviewing the footage we put all 9 takes next to each other and really liked the format. Janette is now working on a physical choreography that better highlights moments of synchronicity and difference as well as changing velocities in order to physically approximate the digital “language” of slow motion and fast-forward. More soon.