
antpb

Just another WordPress dev


testpod

antpb · Feb 11, 2022

[vr block: embedded 3D model viewer — https://antpb.com/wp-content/uploads/2022/02/space-pod.glb]

Test GLB

antpb · Feb 7, 2022

[vr block: embedded 3D model viewer — https://antpb.com/wp-content/uploads/2022/02/antpbbb.glb]

Interacting with Augmented Reality Music Interfaces

antpb · Jun 3, 2019

Broken Place AR is making leaps of progress! I have officially uploaded the first 0.1 beta to TestFlight and am very close to opening the beta to the public. Of all the progress worth highlighting, the interaction system redesign was particularly interesting. I never thought I’d have to reinvent the knob in the metaverse, but here we are!

Challenges in interacting with UI in Augmented Reality

Imagine going about your everyday life via AR. How would you reach for items without being able to pass through the screen? I initially thought some type of hand recognition using computer vision would be the way to go, but I learned it is not remotely close to being viable on iOS. The computer vision approach to mobile AR also assumes way too much about players and leaves out folks who do not have use of more than one arm. Using touch input all over the screen likewise assumes two hands, so I scrapped that approach quickly.

I have strived to build the systems in Broken Place with the intention of making the game playable using only one hand, and Broken Place VR has mostly done a good job of that. Broken Place AR, however, was severely lacking in that pursuit. In its previous interaction system, a user was expected to keep the phone aligned to the object they wanted to interact with while tapping the desired UI element with their other hand. It didn’t really feel like an AR experience; it felt more like trying to keep UI pages aligned to fill my viewport. Hands are also insanely obstructive to the experience, and it was hard to see where you were touching.

I looked for inspiration. The average camera app is a great example of triggering an action without obstructing the scene, and it can generally be used with only one hand. I decided to take some cues from that approach in the new system.

Click and Rotate 🧬

[Image: knob UI element being clicked and rotated]

Now, a player can use the familiar action of holding a toggle at the right (or left) center of the screen, much like taking a photo. So long as you are holding the toggle, you stay attached to the UI element you are interacting with. Gaze center is where I determined it best to send the raycast to the UI, but it proved a bit confusing to understand that the toggle was associated with the center reticle. I added a thin line that renders from the toggle to the center to give a clearer indicator that the two are related. I’m giving myself bonus points for making the line animate to the music. 😉

The other unintended benefit of this approach is that a user can now attach to a UI element from much further away. Having a clear center reticle also makes selecting your interaction point much more precise.

Almost every music interaction in Broken Place AR can be accomplished with a knob-like Z rotation. Sliders were a bit difficult to translate Z rotation into a value because of how thin they are, so I added pseudo-knobs to the bottom of the sliders to give a wider hit point for attaching to the element. Visually, it just kinda makes more sense.

Explaining the Unity side in a tweet-length description:

Each UI item has a box collider. When the main toggle is pressed, a raycast is sent from the center reticle and hits the collider. The name of the collider is stored while held, and the object’s child knob’s (or slider’s) value increases/decreases based on the Z rotation of the camera.
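To make that concrete, here is a minimal C# sketch of the same flow. To be clear, this is my illustration under assumed names (KnobInteractor, Knob, sensitivity), not the actual Broken Place code:

using UnityEngine;

// Hypothetical sketch of the toggle-driven interaction described above.
// While the on-screen toggle is held, a ray from the center reticle attaches
// to a knob, and the camera's Z (roll) rotation drives that knob's value.
public class KnobInteractor : MonoBehaviour
{
    public Camera arCamera;           // the AR camera; its roll drives the knob
    public float sensitivity = 0.01f; // value change per degree of roll
    public bool toggleHeld;           // set by the on-screen toggle's press/release events

    Knob attachedKnob;                // UI element currently attached, if any
    float lastRoll;                   // camera roll on the previous frame

    void Update()
    {
        if (!toggleHeld)
        {
            attachedKnob = null;      // releasing the toggle detaches
            return;
        }

        float roll = arCamera.transform.eulerAngles.z;

        if (attachedKnob == null)
        {
            // Raycast from the center reticle (middle of the viewport)
            // against the UI elements' box colliders.
            Ray ray = arCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
            if (Physics.Raycast(ray, out RaycastHit hit))
                attachedKnob = hit.collider.GetComponent<Knob>();
        }
        else
        {
            // Stay attached while held; apply the frame-to-frame roll change.
            float delta = Mathf.DeltaAngle(lastRoll, roll);
            attachedKnob.value = Mathf.Clamp01(attachedKnob.value + delta * sensitivity);
        }

        lastRoll = roll;
    }
}

// Minimal knob component; a slider's pseudo-knob could reuse the same type.
public class Knob : MonoBehaviour
{
    [Range(0f, 1f)] public float value;
}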

Lefty FTW!

I’m left handed. No way I’m going to ignore my fellow lefties! I added a setting that flips the landscape orientation and aligns the toggle to the opposite side.
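In Unity terms, a setting like that can be as small as flipping the screen orientation and mirroring the toggle. This is only a guess at one possible implementation (the HandednessSetting name is mine, and it assumes centered anchors with a positive-x right-side default):

using UnityEngine;

// Hypothetical left-handed mode: flip which way landscape renders and
// mirror the on-screen toggle to the opposite side of the screen.
public class HandednessSetting : MonoBehaviour
{
    public RectTransform toggleButton; // the camera-style toggle in the UI canvas

    public void SetLeftHanded(bool leftHanded)
    {
        Screen.orientation = leftHanded
            ? ScreenOrientation.LandscapeRight
            : ScreenOrientation.LandscapeLeft;

        // Place the toggle on the left for lefties, right otherwise
        // (assumes centered anchors and a positive-x right-side default).
        Vector2 pos = toggleButton.anchoredPosition;
        float x = Mathf.Abs(pos.x);
        toggleButton.anchoredPosition = new Vector2(leftHanded ? -x : x, pos.y);
    }
}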

I’m left handed, so you best believe I put a setting for that. 🤗 pic.twitter.com/TsdFdGdp1t

— Anthony Burchell (@antpb) May 30, 2019

Beta Signup

I am very close to a public-ready beta. If you would like to be notified when the beta is ready, please sign up for the newsletter at https://broken.place

Broken Place, but in AR.

antpb · Apr 12, 2019

I have been experimenting with augmented reality music on iOS and I think my game Broken Place would be perfect on mobile! I’m working on building a miniature version of my vision of Broken Place that will focus purely on the music and performance aspects. I think there will be tons of learnings in this pursuit!

I’ll be opening a beta very soon on iOS for a port of Broken Place in its most basic form. I plan to maintain the app for at minimum two years after launch and will be adding more songs and content in that time. I’ve spent the last month building what I think is a fun way to interact with music in AR, and I think I’m ready to give you all a few songs to play with. Below is a very early video I shot last week in the park.

(Note: I found in this test that some Bluetooth headphones have major latency problems.)

[Video: a more recent update to the design/layout]

Keeping Accessibility top of mind

I’ve been diligent in researching accessibility standards in spatial computing. While my early prototypes may seem to lack accessibility options, I have been documenting the things I plan to incorporate before launch, and I have already implemented some of the more immediately actionable items within my current technical abilities.

AR in itself is pretty inaccessible, partly due to hardware limitations. My goal is, at the very least, to have the music playable with only one hand. I think once HMDs are integrated into our everyday lives, that will address many of the accessibility concerns I have, but until then, I’ll cover what I can.

Settings I am currently building out include high contrast, text size, and height adjustment. If you have any other suggestions, please submit them through this contact form.

Content Submission

I’m also still thinking through the content editor side of things. I’m hoping to build a way for users to submit their own patches as a later update. Unfortunately, this requires a bit of Pure Data knowledge on the player’s end. Content submission will likely run on a WordPress app, with access handled on a case-by-case basis. Users would upload their vanilla Pure Data patch in a new WordPress post and load that post from their iOS device running Broken Place. I will initially make song data public only, with very clear indicators that this is the case. I want to be respectful of original content, but in full transparency, it is currently a technical limitation to have patches run remotely without latency. An option will be added later that allows players to log in to their own private songs. Subscribing for updates is a good way to get access first!
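As a rough sketch of how the loading side could work, here is what fetching a patch post over the standard WordPress REST API might look like in Unity. The endpoint shape is stock WordPress, but the post ID, callback, and everything else here are illustrative:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch: fetch a public WordPress post over the standard
// REST API and hand its rendered content to a patch loader. Extracting
// the uploaded .pd file's link from the post body is left to that loader.
public class PatchFetcher : MonoBehaviour
{
    const string Endpoint = "https://broken.place/wp-json/wp/v2/posts/";

    // Usage: StartCoroutine(FetchPatchPost(123, json => LoadPatch(json)));
    public IEnumerator FetchPatchPost(int postId, System.Action<string> onPostLoaded)
    {
        using (UnityWebRequest req = UnityWebRequest.Get(Endpoint + postId))
        {
            yield return req.SendWebRequest();

            if (req.isNetworkError || req.isHttpError)
            {
                Debug.LogError("Patch post fetch failed: " + req.error);
                yield break;
            }

            onPostLoaded(req.downloadHandler.text);
        }
    }
}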

Will it be free?

🙁 I can’t. Moving to the iOS platform required unexpected expenses, and I’d like to be as transparent as I can about this project. Publishing an app on the App Store is difficult for an independent developer like myself. For this game, I had to buy a Mac just to be able to test things via Xcode; I bought a modestly configured Mac Mini for $600 and was in business. I also have to pay for my Apple Developer License ($99) to get it published, not to mention the need for a legitimate Unity Pro license (an expense of a few hundred dollars). Simply put, I need to recoup some of this cost, and that’s partially why I’m racing to a release. I’ve seen many simple AR apps going for around $1.99 and will likely follow suit. Apple will take 30% of that, so I hope you understand why it is a paid app. 🙂

What other cool features are you exploring?

Here’s a list from my whiteboard:

  • BoseAR integration
  • Bluetooth/Wireless MIDI connectivity
  • “Bop-It” mode where you are against the clock to perform actions in the patch
  • Gaze Input for interacting with music UI elements
  • Epic visuals
  • Wireless display mirroring so folks can see what you’re doing in AR more easily
  • More visual indication of where a sequencer is on the UI
  • A dog to pet

Pure Data Gibberish Generator for Unity3D

antpb · Feb 2, 2019

I made a fun dialogue system for my game recently. I thought it would be great to have a gibberish system that translates text dialogue to the user’s preferred language while still feeling conversational. I think I can work out the translation side pretty easily, but what I found to be a blocker was making convincingly conversational gibberish sounds. The solution was obvious: Pure Data excels at this task. Below is a rough sketch of a patch I made to generate the ‘words.’

I took an oscillator and created an 8-step sequencer of slopes. These slopes randomly animate the pitch of the oscillator from one value to another. The slope step is chosen by a random number generator that is triggered on each word of the dialogue. The oscillator’s output feeds another oscillator to give us that robotic FM synthesis sound. I think it’s a solid foundation that I can build on going forward, maybe with sample rate reduction on the voice or some other type of robotic effect.
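The patch itself lives in Pure Data, but to make the idea concrete, here is a rough C# approximation using Unity’s OnAudioFilterRead callback. This is my sketch of the same technique, not the actual patch; the class name, pitch range, and FM values are all made up:

using UnityEngine;

// Rough approximation of the gibberish patch: per word, the carrier pitch
// slopes toward a randomly chosen target, and a second oscillator
// frequency-modulates it for the robotic FM timbre.
[RequireComponent(typeof(AudioSource))]
public class GibberishVoice : MonoBehaviour
{
    public float glideTime = 0.08f; // seconds for a pitch slope to settle
    public float modRatio = 2.5f;   // modulator/carrier frequency ratio
    public float modDepth = 300f;   // FM depth in Hz

    float sampleRate;
    float carrierHz = 220f;
    float targetHz = 220f;
    double carrierPhase, modPhase;

    void Awake()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    // Call once per word of dialogue: picks a new random pitch target,
    // standing in for the patch's random slope-step selection.
    public void TriggerWord()
    {
        targetHz = Random.Range(150f, 500f);
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        // Per-sample smoothing factor approximating the patch's ramps.
        float t = 1f / (glideTime * sampleRate);

        for (int i = 0; i < data.Length; i += channels)
        {
            carrierHz = Mathf.Lerp(carrierHz, targetHz, t);

            // Modulator oscillator tracks the carrier at a fixed ratio.
            modPhase += 2.0 * Mathf.PI * carrierHz * modRatio / sampleRate;
            float mod = Mathf.Sin((float)modPhase) * modDepth;

            // Carrier oscillator, frequency-modulated by the modulator.
            carrierPhase += 2.0 * Mathf.PI * (carrierHz + mod) / sampleRate;
            float sample = Mathf.Sin((float)carrierPhase) * 0.2f;

            for (int c = 0; c < channels; c++)
                data[i + c] = sample;
        }
    }
}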

Here’s a video of it in action:

A bit about the Unity side of things

Let’s talk a bit about the Unity side of this! In the Unity editor screenshot, you’ll see an editor script for the dialogue triggers, plus a manager that assigns information to all the objects and handles the Pure Data messages sent at the start and end of the conversation. In the case of the video above, it sends a message of thronStart to the patch. When dialogue has reached the last line, it sends throneEnd, which I have routed to start the metronome and song. This editor script is great because I can make different versions of the voice patch and change the character name associated with each to give voices some variation. This is stored in the Pure Data Character Name field (in the screenshot, defined as orbVoice).
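Sketched out, the trigger flow looks something like this. PdBridge is a stand-in for whatever libpd-style wrapper actually carries messages to the patch; only the message and field names come from this post:

using UnityEngine;

// Stand-in for a libpd-style bridge; the real project presumably wraps
// Pure Data differently.
public static class PdBridge
{
    public static void SendBang(string receiver)
    {
        Debug.Log("bang -> " + receiver); // replace with the actual Pd send
    }
}

// Hypothetical dialogue trigger: bang the patch when the conversation
// starts, and bang again once the last line has been shown.
public class DialogueTrigger : MonoBehaviour
{
    public string pdCharacterName = "orbVoice"; // selects the voice patch variant
    public string[] lines;
    int currentLine;

    public void BeginConversation()
    {
        currentLine = 0;
        PdBridge.SendBang("thronStart");
    }

    public void AdvanceLine()
    {
        currentLine++;
        if (currentLine >= lines.Length)
            PdBridge.SendBang("throneEnd"); // routed in the patch to start the song
    }
}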

Next, I hope to make the system more aware of pauses in speech. Currently, it just triggers at a uniform interval between words. I’d like it to later recognize commas or periods and translate those into pauses and a more dynamic delivery of the phrase. Maybe even add properties to each line of dialogue that define the emotion of the delivery (angry, sad, scared, etc.).
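One simple way to get the pause handling started, using the same hypothetical names as above: scan each word’s trailing punctuation and stretch the delay before the next word trigger.

// Hypothetical punctuation-aware pacing: sentence and clause breaks
// stretch the otherwise-uniform gap before the next word trigger.
static float PauseAfter(string word, float baseInterval)
{
    if (word.EndsWith(".") || word.EndsWith("!") || word.EndsWith("?"))
        return baseInterval * 3f;   // sentence break: long pause
    if (word.EndsWith(","))
        return baseInterval * 1.5f; // clause break: shorter pause
    return baseInterval;
}

Onward!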

