

What would a basic WordPress site look like in VR? Using the WordPress REST API, I could easily pull things like author information, post content, or even custom meta. WordPress gives a user-friendly interface for controlling the environment variables of a game. One example would be using custom meta to define something like the color of the world lighting. These sorts of level variables could also be interesting for an application that allows user registration: users could have their own levels or posts to show off their content without knowing how to be a game dev. A travel blog comes to mind for this proof of concept. Using 360 images as the post featured image, I wrapped them around the user, with post content attached to UI elements.
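As a sketch of that lighting idea (the `world_light_color` meta key here is hypothetical, just a stand-in for whatever field you expose in the REST response), a small Unity script could take the hex string from post meta and push it into the scene's ambient light:

```csharp
using UnityEngine;

// Hypothetical example: apply a hex color string pulled from post meta
// (e.g. a "world_light_color" field) to the scene's ambient lighting.
public class WorldLightFromMeta : MonoBehaviour
{
    // Call this with the meta value once the REST response has been parsed.
    public void ApplyLightColor(string hex)
    {
        Color color;
        // ColorUtility handles strings like "#FFB347".
        if (ColorUtility.TryParseHtmlString(hex, out color))
        {
            RenderSettings.ambientLight = color;
        }
        else
        {
            Debug.LogWarning("Could not parse light color: " + hex);
        }
    }
}
```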
After a two-day hackathon the project was loading nicely. SimpleJSON proved to be an interesting library to dissect, but I was able to get JSON objects accessible via C#. In the controllerWordPress scripts (Assets/Scripts/) you'll find the bulk of how this is accomplished. I'll document the code further as I work on this. 🙂
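The repo is the source of truth for the exact code, but the core loop looks roughly like this: fetch `/wp-json/wp/v2/posts` with `UnityWebRequest` and walk the result with SimpleJSON. This is a minimal sketch assuming Bunny83's SimpleJSON and the circa-2017 `UnityWebRequest` API; the field names follow the standard WP REST API response:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;
using SimpleJSON;

// Rough sketch of pulling posts from the WP REST API into Unity.
public class WordPressLoader : MonoBehaviour
{
    public string siteUrl = "https://example.com"; // your WordPress site

    IEnumerator Start()
    {
        UnityWebRequest request = UnityWebRequest.Get(siteUrl + "/wp-json/wp/v2/posts");
        yield return request.SendWebRequest();

        if (request.isNetworkError || request.isHttpError)
        {
            Debug.LogError("Request failed: " + request.error);
            yield break;
        }

        // SimpleJSON turns the raw text into an indexable JSONNode tree.
        JSONNode posts = JSON.Parse(request.downloadHandler.text);
        for (int i = 0; i < posts.Count; i++)
        {
            string title = posts[i]["title"]["rendered"].Value;
            string content = posts[i]["content"]["rendered"].Value;
            Debug.Log("Loaded post: " + title);
            // From here: attach content to UI elements, fetch the
            // featured image, wrap it around the skybox, etc.
        }
    }
}
```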
Made a VR #WordPress in #unity3d using the WP REST api. toggle posts with the controller. Feat. Image gets attached to skybox. #vrdev #vr pic.twitter.com/ictJpERAN5
— Anthony Burchell (@antpb) September 14, 2017
Github: https://github.com/anthonyburchell/VR-WordPress
The readme in the repo should cover basic usage.
I’m on a quest! I want to integrate Pure Data with Unity to make a Virtual Reality game focused on performing music. I’m fairly new to game development; my prior experience with game dev was making Flash games in my preteens. For this quest, I want the project to cater to performers… at least for now. How that will look, I’m not sure, but I have built a framework to start with. I’m very aware there are many iterations currently in development or that have been around for some time, and I admire those projects immensely. This will just be my flavor…
Let’s get some backstory on how I got here. Music with event-based control over 3D visuals, or better yet, Virtual Reality worlds, has been a dream of mine since I started making music. I was heavily inspired by Summer Wars and dabbled in Blender. I learned to model by giving myself the task of recreating pieces of the Land of OZ from Summer Wars. Sadly, I’ve lost the file, so I can’t do much with it anymore.
After familiarizing myself with 3D modeling, I set my sights on developing a concept for this project. This was around 2014, when I created a music video using the cube in this render. I mapped the scaling animations to the peaks in the song’s audio file. Not much going on there, but it was technically my first audio-controlled world.
I began exploring Unity3D and found it to be very capable for sound purposes. I got audio running and instantiated audio-reactive cubes across the frequency spectrum.
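Unity’s built-in `AudioSource.GetSpectrumData` is enough for that kind of experiment. A minimal version of the reactive-cube idea looks something like this (cube count and scaling factors are arbitrary, not the values I used):

```csharp
using UnityEngine;

// Minimal audio-reactive cubes: sample the playing clip's frequency
// spectrum each frame and scale one cube per frequency bin.
[RequireComponent(typeof(AudioSource))]
public class SpectrumCubes : MonoBehaviour
{
    public GameObject cubePrefab;
    public int cubeCount = 64;          // must be <= spectrum buffer size
    public float heightScale = 50f;     // arbitrary visual gain

    AudioSource source;
    Transform[] cubes;
    float[] spectrum = new float[256];  // power of two, per Unity docs

    void Start()
    {
        source = GetComponent<AudioSource>();
        cubes = new Transform[cubeCount];
        for (int i = 0; i < cubeCount; i++)
        {
            GameObject cube = Instantiate(cubePrefab,
                new Vector3(i * 1.5f, 0f, 0f), Quaternion.identity);
            cubes[i] = cube.transform;
        }
    }

    void Update()
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
        for (int i = 0; i < cubeCount; i++)
        {
            float height = Mathf.Max(0.1f, spectrum[i] * heightScale);
            cubes[i].localScale = new Vector3(1f, height, 1f);
        }
    }
}
```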
I continued documenting the project via Instagram as I learned new things. I explored going the iOS/mobile route and found it somewhat limiting in terms of user input.
A year passed, and while I was still learning, progress was pretty much at a standstill. I acquired a Vive and very quickly got my prior proof of concept loading. I then shifted to exploring controller input and how to attach its properties to other objects. This allowed for picking up objects; a small step, but a very important one.
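A sketch of that pick-up mechanic, assuming the 2017-era SteamVR 1.x plugin (`SteamVR_TrackedObject` / `SteamVR_Controller`) rather than my exact code. It hangs off a controller with a trigger collider:

```csharp
using UnityEngine;

// Grab sketch for a SteamVR 1.x controller: parent a touched object
// to the controller while the trigger is held, then hand the
// controller's velocity back to the object on release.
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class SimpleGrab : MonoBehaviour
{
    SteamVR_TrackedObject trackedObj;
    GameObject heldObject;

    SteamVR_Controller.Device Controller
    {
        get { return SteamVR_Controller.Input((int)trackedObj.index); }
    }

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void OnTriggerStay(Collider other)
    {
        // Grab whatever is touching the controller when the trigger closes.
        if (heldObject == null && Controller.GetHairTriggerDown())
        {
            heldObject = other.gameObject;
            heldObject.transform.SetParent(transform);
            heldObject.GetComponent<Rigidbody>().isKinematic = true;
        }
    }

    void Update()
    {
        if (heldObject != null && Controller.GetHairTriggerUp())
        {
            heldObject.transform.SetParent(null);
            Rigidbody rb = heldObject.GetComponent<Rigidbody>();
            rb.isKinematic = false;
            // Throws feel right when the object inherits hand motion.
            rb.velocity = Controller.velocity;
            rb.angularVelocity = Controller.angularVelocity;
            heldObject = null;
        }
    }
}
```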
[Instagram embed: a post shared by Anthony Burchell (@antpb)]
Using Keijiro’s many frameworks, I was able to get sound manipulating a 3D object! I split a song out over three tracks (drums, synth, percussion) and began experimenting with what a live song would look and feel like.
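With the stems on separate `AudioSource`s, each one can drive its own object. A simple way to do that (my sketch, not necessarily how Keijiro’s frameworks do it) is to read each stem’s output buffer and turn its RMS loudness into a scale factor:

```csharp
using UnityEngine;

// Drive one object per stem: read the stem's output buffer and map
// its RMS loudness onto the target's scale. Use one instance per
// track (drums, synth, percussion).
public class StemReactiveScale : MonoBehaviour
{
    public AudioSource stem;       // the isolated track
    public Transform target;       // object this stem controls
    public float gain = 10f;       // arbitrary visual gain
    public float smoothing = 8f;   // higher = snappier response

    float[] buffer = new float[256];
    float level;

    void Update()
    {
        stem.GetOutputData(buffer, 0);

        // Root-mean-square of the buffer approximates perceived loudness.
        float sum = 0f;
        for (int i = 0; i < buffer.Length; i++)
            sum += buffer[i] * buffer[i];
        float rms = Mathf.Sqrt(sum / buffer.Length);

        level = Mathf.Lerp(level, rms * gain, Time.deltaTime * smoothing);
        target.localScale = Vector3.one * (1f + level);
    }
}
```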
[Instagram embed: a post shared by Anthony Burchell (@antpb)]
I became obsessed with iridescent shaders in the process of learning this stuff (partially because my girlfriend turned me on to how cool iridescence looks). I found that this type of shader was extremely good for learning how to script color and lighting.
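Real iridescence belongs in a shader, but the core idea can be sketched in plain C#: shift the hue with the viewing angle. This is a crude approximation for illustration, not the shader I actually used:

```csharp
using UnityEngine;

// Crude iridescence approximation: hue shifts as the camera's viewing
// angle changes, which is the heart of the effect. A real version
// lives in a shader, but this shows the math.
public class FakeIridescence : MonoBehaviour
{
    public Renderer target;
    MaterialPropertyBlock block;

    void Start()
    {
        block = new MaterialPropertyBlock();
    }

    void Update()
    {
        Vector3 toCamera =
            (Camera.main.transform.position - transform.position).normalized;
        // Grazing angles (dot near 0) get a different hue than head-on views.
        float facing = Mathf.Abs(Vector3.Dot(toCamera, transform.up));
        Color color = Color.HSVToRGB(facing, 0.7f, 1f);

        // MaterialPropertyBlock avoids allocating a new material per frame.
        block.SetColor("_Color", color);
        target.SetPropertyBlock(block);
    }
}
```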
Pure Data has been an eye-opening experience. I had built a very basic sequencer/oscillator patch a few years ago and stopped looking into Pd due to the complexity of the routings. I recently found that an open-source framework called Automatonism simplified a lot of my problems with Pd. I've since moved away from this framework, but it was an excellent resource for prototyping.
After experimenting for a day I was hooked, and I took a weekend to learn everything I could. I found an amazing C# library called libpd and compiled myself a 64-bit version of the library for use in Unity3D. The results? INPUT AND CONTROL OVER SOUND! My most significant progress toward this dream.
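The binding boils down to a handful of static calls. Roughly, assuming the libpd C# binding’s `LibPD` class and a patch named `synth.pd` with a `[receive volume]` in it (both names are placeholders for this example):

```csharp
using UnityEngine;
using LibPDBinding;

// Sketch of hosting a Pd patch inside Unity via the libpd C# binding.
public class PdHost : MonoBehaviour
{
    int patch;

    void Start()
    {
        // Match Pd's audio setup to Unity's output settings.
        LibPD.OpenAudio(0, 2, AudioSettings.outputSampleRate);
        patch = LibPD.OpenPatch(Application.streamingAssetsPath + "/synth.pd");
        LibPD.ComputeAudio(true); // equivalent to flipping DSP on in Pd
    }

    // Unity hands us the audio buffer; libpd fills it from the patch.
    void OnAudioFilterRead(float[] data, int channels)
    {
        int ticks = data.Length / (64 * channels); // Pd's block size is 64
        LibPD.Process(ticks, data, data);
    }

    // Call from gameplay code to set a parameter inside the patch.
    public void SetVolume(float value)
    {
        LibPD.SendFloat("volume", value);
    }

    void OnDestroy()
    {
        LibPD.ClosePatch(patch);
    }
}
```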
[Instagram embeds: three posts shared by Anthony Burchell (@antpb)]
After going on a code binge I was able to bring to life a very simple proof of concept using the same Pd patch shown above. This gives the basic building blocks for what I need.
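Tying that back to the VR side, the pattern is just "game event in, Pd message out." The receiver names here (`note`, `trigger`) are made up for the example and would have to match `[receive]` objects in the patch:

```csharp
using UnityEngine;
using LibPDBinding;

// Hypothetical gameplay hook: striking an object in VR sends a note
// into the Pd patch.
public class PdNoteOnHit : MonoBehaviour
{
    public float pitch = 60f; // MIDI-style note number

    void OnCollisionEnter(Collision collision)
    {
        LibPD.SendFloat("note", pitch);  // set the pitch...
        LibPD.SendBang("trigger");       // ...then fire the envelope
    }
}
```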
After a few years of breaking through the learning curve of C#, Pure Data, and Unity3D, I finally have a working game that feels like something I would be excited to perform in.
I’ll be continuing to update my blog as new things develop on this project.