OrchScape is part of Moon Moons, a multimedia group performance piece that features dance and a projected camera fly-through of different virtual environments. It was realized for the 2019 UCSB MAT End of Year Show by members of the transLAB.
My particular contribution to Moon Moons, OrchScape, is a one-minute-long segment of the fly-through, featuring a spatiotemporal audiovisual environment that explores “spatial orchestration.” The environment contains twelve architectural-scale sculptures, each with an associated audio track mapped to its position. The twelve tracks comprise three separate timbral “orchestrations” (of four tracks each) of an original electroacoustic composition, with each orchestration featuring different artificial electroacoustic timbres, or “instruments.” Using a distance-based amplitude panning (DBAP) approach that accounts for the positions of the sculptures and their affiliated audio tracks, the camera’s tracked position during the fly-through generates the audio mix of the tracks: increased proximity to a sculpture maps to increased amplitude of its track in the resulting mix. The resulting audio thus weaves in and out of different orchestrated possibilities. The orchestrational approach is spatiotemporal, dependent on the moving position of the camera.
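The proximity-to-amplitude mapping can be made concrete with a small sketch of DBAP-style gain computation. This is an illustrative implementation of the general technique, not the project's actual code; the function name, the default 6 dB rolloff, and the 2D coordinates are assumptions for the example.

```python
import math

def dbap_gains(listener, sources, rolloff_db=6.0, eps=1e-6):
    """Distance-based amplitude panning (DBAP) sketch: each source's
    gain falls off with its distance from the listener, and the gains
    are power-normalized so the overall level stays constant as the
    listener (here, the camera) moves among the sources."""
    # Rolloff exponent: a 6 dB drop per doubling of distance gives a = 1.
    a = rolloff_db / (20.0 * math.log10(2.0))
    dists = [max(math.dist(listener, s), eps) for s in sources]
    raw = [1.0 / d ** a for d in dists]
    k = 1.0 / math.sqrt(sum(g * g for g in raw))  # power normalization
    return [k * g for g in raw]
```

Here each sculpture position would be a source; evaluating the function at the current camera position yields the per-track amplitudes for the mix.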
On the technical side, the camera’s path was precomposed in Unity, and the sculptures were 3D-modeled from music data. I used FormZ to compose a base form, Mathematica to revolve the amplitude contours of a sound wave file and blend the vertices of the two meshes into a single point cloud, and MeshLab to remesh the object using Poisson reconstruction.
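The revolution step can be sketched as follows. The original pipeline used Mathematica; this is a minimal stdlib-Python illustration of the same idea, with assumed details (16-bit mono WAV input, per-window peak amplitude as the radius, and the hypothetical function name below).

```python
import math
import struct
import wave

def revolve_envelope(path, n_theta=36, hop=1024):
    """Sketch of the sculpture-modeling idea: take the amplitude
    envelope of a WAV file and revolve it around a vertical axis,
    yielding an (x, y, z) point cloud where the radius at each height
    is the local peak amplitude of the sound."""
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
    # Assumes 16-bit mono samples for simplicity.
    samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
    points = []
    for i in range(0, len(samples), hop):
        # Normalized local peak amplitude -> radius of this ring.
        r = max(abs(s) for s in samples[i:i + hop]) / 32768.0
        z = i / len(samples)  # normalized height along the axis
        for k in range(n_theta):
            t = 2.0 * math.pi * k / n_theta
            points.append((r * math.cos(t), r * math.sin(t), z))
    return points
```

A point cloud like this could then be merged with the base form’s vertices and remeshed, as described above.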
I composed the sonic motifs using a timbral tool in Max/MSP; we then created a link from Unity to Max to spatialize these sound files according to the placement of the sculptures.
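The write-up does not specify how Unity and Max were linked; a common choice for such a link is OpenSoundControl (OSC) over UDP. As a sketch under that assumption, the following encodes a raw OSC 1.0 message by hand (the address `/track/gain` is hypothetical) of the kind a Unity-side script could send to Max each frame with updated per-track gains.

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message with float32 arguments.
    OSC strings are NUL-terminated and zero-padded to a 4-byte
    boundary; float arguments are big-endian IEEE 754."""
    def pad(b):
        # Append 1-4 NULs so the length is a multiple of 4.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for f in floats:
        msg += struct.pack(">f", f)
    return msg
```

Sending such a datagram to Max’s `udpreceive` object would let the camera tracking drive the DBAP mix in real time; again, this reflects a common pattern rather than the documented implementation.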