I’m on a quest! I want to integrate Pure Data with Unity to make a Virtual Reality game focused on performing music. I’m fairly new to game development; my prior experience was making Flash games in my pre-teens. For this quest, I want to focus on catering this project to performers… at least for now. How will that look? I’m not sure, but I have built a framework to start with. I’m very aware there are many similar projects currently in development or that have been around for some time, and I admire those projects immensely. This will just be my flavor…

Backstory!

Let’s get some backstory on how I got here. Music with event-based control over 3D visuals, or better yet, virtual reality worlds, has been a dream of mine since I started making music. I was heavily inspired by Summer Wars and dabbled in Blender. I learned to model by giving myself the task of recreating pieces of the Land of OZ from Summer Wars. Sadly, I’ve lost the file, so I can’t do much with it anymore 🙁 This is my last remaining render:

https://www.youtube.com/watch?v=hk19uvMnFmM

Concept Art

After familiarizing myself with 3D modeling, I set my sights on developing a concept for this project. This was around 2014, and at that time I created a music video using the cube in this render. I mapped the scaling animations to the peaks in the song’s audio file. Not much going on there, but it was technically my first audio-controlled world.

Unity!

I began exploring Unity3D and found it to be very capable for audio purposes. I got sound running and instantiated audio-reactive cubes across the frequency spectrum.
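
For anyone curious what that looks like in practice, here is a minimal sketch of the idea using Unity’s built-in AudioSource.GetSpectrumData. The cube prefab, cube count, and scaling values are placeholder choices for illustration, not the exact setup from my project.

```csharp
using UnityEngine;

// Minimal sketch: spawn a row of cubes and scale each one by the energy
// in a slice of the audio spectrum. `cubeCount` and `scaleMultiplier`
// are illustrative values, not the real project settings.
[RequireComponent(typeof(AudioSource))]
public class SpectrumCubes : MonoBehaviour
{
    public GameObject cubePrefab;      // assumed: a simple cube prefab assigned in the Inspector
    public int cubeCount = 64;
    public float scaleMultiplier = 50f;

    private AudioSource source;
    private Transform[] cubes;
    private float[] spectrum = new float[512];

    void Start()
    {
        source = GetComponent<AudioSource>();
        cubes = new Transform[cubeCount];
        for (int i = 0; i < cubeCount; i++)
        {
            var cube = Instantiate(cubePrefab, new Vector3(i, 0f, 0f), Quaternion.identity);
            cubes[i] = cube.transform;
        }
    }

    void Update()
    {
        // Unity fills `spectrum` with FFT magnitudes for the playing clip.
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Map one spectrum bin per cube and drive its height with it.
        for (int i = 0; i < cubeCount; i++)
        {
            float height = Mathf.Clamp(spectrum[i] * scaleMultiplier, 0.1f, 20f);
            cubes[i].localScale = new Vector3(1f, height, 1f);
        }
    }
}
```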

Learnings!

I continued documenting the project via Instagram as I learned new things. I explored going the iOS/mobile route and found it to be somewhat limiting in terms of user input.

Progress intensifies


Controllers!

A year passed, and while I was still learning, progress was pretty much at a standstill. Then I acquired a Vive and very quickly got my prior proof of concept loading. I shifted to exploring controller input and how to attach a controller’s properties to other objects, which allowed for picking up objects. A small step, but a very important one.
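
The grabbing itself boils down to parenting whatever the controller is touching to the controller’s transform. Here is a rough, SDK-agnostic sketch of that idea; the input calls are placeholders, since the real trigger check depends on which VR plugin you use.

```csharp
using UnityEngine;

// Rough sketch of "attach the controller's properties to another object":
// a trigger collider on the controller tracks what it is touching, and
// grabbing parents that object to the controller so it inherits the
// controller's position and rotation. The grab input is stubbed out here
// because the exact call depends on the VR SDK in use.
public class SimpleGrabber : MonoBehaviour
{
    private Rigidbody touchedObject;   // object currently inside the controller's trigger collider
    private Rigidbody heldObject;

    void OnTriggerEnter(Collider other)
    {
        var rb = other.attachedRigidbody;
        if (rb != null) touchedObject = rb;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.attachedRigidbody == touchedObject) touchedObject = null;
    }

    void Update()
    {
        // Placeholder input checks; swap these for your VR SDK's trigger button.
        bool grabPressed = Input.GetButtonDown("Fire1");
        bool grabReleased = Input.GetButtonUp("Fire1");

        if (grabPressed && touchedObject != null && heldObject == null)
        {
            heldObject = touchedObject;
            heldObject.isKinematic = true;                 // let the controller drive it directly
            heldObject.transform.SetParent(transform);     // follow the controller's transform
        }
        else if (grabReleased && heldObject != null)
        {
            heldObject.transform.SetParent(null);
            heldObject.isKinematic = false;                // hand it back to physics
            heldObject = null;
        }
    }
}
```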

Shaders!

I became obsessed with iridescent shaders in the process of learning all of this (partially because my girlfriend turned me on to how cool iridescence looks). I found that this type of shader was extremely good for learning how to script color and lighting.
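
I won’t paste the shader itself here, but as a small taste of what “scripting color” means, this hypothetical C# snippet shifts a material’s hue based on the viewing angle, a very rough stand-in for the iridescent effect.

```csharp
using UnityEngine;

// Not a real iridescent shader, just a tiny example of scripting color:
// shift the material's hue with the viewing angle so the surface appears
// to change color as you move around it.
public class FakeIridescence : MonoBehaviour
{
    public Transform viewer;            // e.g. the main camera / HMD
    private Material material;

    void Start()
    {
        material = GetComponent<Renderer>().material;
        if (viewer == null) viewer = Camera.main.transform;
    }

    void Update()
    {
        // Angle between the object's forward axis and the direction to the viewer.
        Vector3 toViewer = (viewer.position - transform.position).normalized;
        float facing = Mathf.Abs(Vector3.Dot(transform.forward, toViewer));

        // Map that angle onto the hue wheel.
        material.color = Color.HSVToRGB(facing, 0.6f, 1f);
    }
}
```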

Pure Data!

Pure Data has been an eye-opening experience. I had built a very basic sequencer/osc patch a few years ago and stopped looking into it due to the complexity of the routings. I recently found that an open source framework called Automatonism simplified a lot of my problems with Pd.

Pure Data and Unity!

After experimenting for a day I was hooked. I took a weekend to learn everything I could. I found that there was an amazing C# library called libpd and compiled a 64-bit version of the library for use in Unity3D. The results? INPUT AND CONTROL OVER SOUND! 😀 This was my most significant progress toward this dream.
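
The wiring is roughly this: open a patch, forward control values from the game into named receivers, and let Pd render audio into Unity’s buffer. The sketch below is only an outline; the method names (OpenAudio, OpenPatch, SendFloat, Process) mirror libpd’s C API as exposed by the C# bindings I compiled, and the patch path and the “pitch” receiver are made-up examples.

```csharp
using UnityEngine;
using LibPDBinding;   // assumption: the libpd C# bindings compiled into the project

// Hedged sketch of the Unity <-> Pure Data loop: open a patch, send it
// control values from the game, and let Pd fill Unity's audio buffer.
[RequireComponent(typeof(AudioSource))]
public class PdBridge : MonoBehaviour
{
    private int patchHandle;
    private readonly float[] pdInput = new float[0];   // no audio input in this sketch

    void Start()
    {
        LibPD.OpenAudio(0, 2, AudioSettings.outputSampleRate);  // stereo out, no input
        // "sequencer.pd" is a placeholder patch name.
        patchHandle = LibPD.OpenPatch(Application.streamingAssetsPath + "/sequencer.pd");
        LibPD.ComputeAudio(true);                                // turn on DSP
    }

    // "Input and control over sound": forward a game value
    // (say, a controller's height) to a [receive pitch] object in the patch.
    public void SetPitch(float pitch)
    {
        LibPD.SendFloat("pitch", pitch);
    }

    // Unity hands us its audio buffer here; we let Pd render into it.
    void OnAudioFilterRead(float[] data, int channels)
    {
        // Pd processes audio in blocks of 64 samples per channel.
        int ticks = data.Length / (channels * 64);
        LibPD.Process(ticks, pdInput, data);
    }

    void OnDestroy()
    {
        LibPD.ClosePatch(patchHandle);
    }
}
```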

Proof of Concept

After going on a code binge, I was able to bring a very simple proof of concept to life using the same Pd patch shown above. This gives me the basic building blocks for what I need.

It’s starting to look and feel like something.

After a few years of pushing through the learning curve of C#, Pure Data, and Unity3D, I finally have a working game that feels like something I would be excited to perform in.

I’ll be continuing to update my blog as new things develop on this project. 🙂