
Adeeb Syed

Enter the FlowJo VR


Project Description: Enter the FlowJo is a calm, rhythm-based painting VR experience that helps players “learn” Japanese characters using their whole bodies.
Context: MIT Reality Hackathon 2020 Project/Prototype
Contribution/Role: “Developer” and consummate youtuber/googler on a team of 4
Technologies: Unity, C#, Oculus Quest, Blender
Project Duration: 5 long days


Environmental Challenges 

Since we were one of the last teams to be formed at this hackathon, we naturally also had last choice of a workspace.  Unfortunately, one of the last remaining locations was in the basement of an MIT building that was adjacent to a set of doors which led outside to a very cold and snowy January.  

Trying to program or be productive while dressed for a snowball fight (we even toyed with the idea of some sort of VR snowball game-mechanic) inside of what was effectively a giant freezer was difficult indeed.  After attempts to use portable heaters failed, we were eventually moved to a much warmer oasis upstairs.

I am thankful that I had a team with a great sense of humor. I don’t think any of us were particularly upset by the loss of productive working time — we all just wanted to learn something new and create something together.

Technical Challenges

Our initial goal was to create a musical rhythm nunchaku VR game that explored some music theory concepts — sort of like fruit ninja meets beat saber (which if they were actually married together, could be called “Beetsaber”…where you chop…beets…while listening to some dope…beats…get it? ok, i’ll stop).

While great in theory (and also in practice, because we bought some real nunchaku for “research”), our biggest challenge to surmount was tied to the physics system in Unity.  We tried our best to simulate the natural snap motion of nunchaku in VR, but it led to all sorts of undesired behavior.  As the only other developer on the team (and I consistently maintained that I wasn’t a “real” developer, just a “really good researcher”), one of my main tasks was to research how we might go about simulating the physics of nunchaku in the Unity game engine.

I set out to research what other games used mechanics that were proximate to our own goals.  I obviously first searched for existing VR games/experiences that used nunchaku, but none were to be found.  Next, I looked for anything that used a mace, because I was sure someone somewhere was working on a VR gladiator game.  While I did find several examples of gladiator games that used a mace, it wasn’t quite what we were looking for since they all used handheld maces (i.e., more like baseball bats), not the swinging type of mace that would have employed physics similar to that of nunchaku.   

real footage of how I felt at every dead end

Then, I took a step back and decided to focus on the core mechanic that swinging maces and nunchaku had in common: the swinging itself — duh! I found some information on 2D games that used a swinging rope mechanic, which eventually led me to the code on this blog that we adapted for our prototype.  
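In Unity terms, the rope-style approach boils down to modeling the connecting chain as a series of small rigidbodies linked by joints, with the handle driven kinematically from the tracked controller. This is a minimal hypothetical sketch of that idea, not our exact hackathon code; all names and parameter values are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: build the nunchaku chain as a series of small
// rigidbodies connected by joints. The handle is kinematic and follows
// the hand controller; the free stick swings under physics.
public class NunchakuChain : MonoBehaviour
{
    public Transform handController;  // tracked Oculus controller transform
    public Rigidbody handle;          // kinematic handle stick
    public Rigidbody freeStick;       // the swinging stick
    public int linkCount = 4;
    public float linkLength = 0.03f;  // meters

    void Start()
    {
        Rigidbody previous = handle;
        for (int i = 0; i < linkCount; i++)
        {
            var link = new GameObject("link" + i).AddComponent<Rigidbody>();
            link.mass = 0.02f;  // light links keep the joint solver stable
            link.transform.position = previous.position + Vector3.down * linkLength;

            var joint = link.gameObject.AddComponent<CharacterJoint>();
            joint.connectedBody = previous;  // chain each link to the one above
            previous = link;
        }
        // Attach the free stick to the last chain link.
        var endJoint = freeStick.gameObject.AddComponent<CharacterJoint>();
        endJoint.connectedBody = previous;
    }

    void FixedUpdate()
    {
        // Drive the kinematic handle from the tracked controller pose.
        handle.MovePosition(handController.position);
        handle.MoveRotation(handController.rotation);
    }
}
```

Even with a setup like this, fast whip-like motions tend to stress the joint solver, which is roughly where our troubles began.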

While it initially seemed to work quite well, we were still experiencing some undesired behavior from our nunchaku. In particular, the slightest movement in the hand controller sometimes caused violent jerking motions from the nunchaku.

Our hypothesis about what was causing the strange physics led us to yet another root issue, which is actually one of the central problems/trade-offs for immersive technologies in general: positional tracking and occlusion.

To understand the problem with positional tracking, consider, for example, that when Bruce (below) typically engages in nunchakuing (nunchucking?), he tends to swing his hands slightly behind his back or under his arms.  In fact, one of the best parts of using nunchaku is swinging them slightly behind the body in this manner in order to achieve the desired force when snapping them forward.

The issue here is that in the short moments when the hand controllers (and therefore the virtual nunchaku) are behind the body, they lose a direct line of sight with the thing doing the tracking, which in this case is the set of cameras mounted on the Oculus Quest headset itself.  In other words, the hand controllers' position could not be adequately and consistently tracked while they were occluded by the physical body.

To be clear, to my understanding the hand controllers are, in fact, still tracked based on their last known position using some sort of magic AI (Oculus refers to this combination of machine learning, computer vision, and simultaneous localization and mapping, or SLAM, as Oculus Insight).  The longer the tracking is disrupted, however, the more out of sync things may become.

This flickering of the tracking may have been what was causing the violent, explosive physics we experienced during tests.  While this is often not a problem with inside-out markerless tracking in general, it was a problem specifically for a VR nunchaku game, or any experience in which the core mechanic depends on adequate and consistent tracking of the hand controllers.
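One common way to tame this kind of explosion, which we did not get around to trying, is to refuse to believe tracking data that implies impossible hand speeds. The sketch below is hypothetical (the component and field names are mine, not our hackathon code): it clamps how far the physics handle may move per step, so a momentary tracking glitch cannot inject a near-infinite velocity into the joint chain.

```csharp
using UnityEngine;

// Hypothetical mitigation sketch: if the tracked controller position jumps
// farther in one physics step than a hand could plausibly move, treat it
// as a tracking glitch and clamp the motion instead of slamming the rigidbody.
public class TrackedHandFollower : MonoBehaviour
{
    public Transform handController;   // tracked controller transform
    public Rigidbody handle;           // kinematic handle rigidbody
    public float maxSpeed = 6f;        // m/s; rough upper bound for a real hand

    void FixedUpdate()
    {
        Vector3 target = handController.position;
        float maxStep = maxSpeed * Time.fixedDeltaTime;

        // Clamp the per-step displacement before handing it to the physics engine.
        Vector3 delta = Vector3.ClampMagnitude(target - handle.position, maxStep);
        handle.MovePosition(handle.position + delta);
        handle.MoveRotation(handController.rotation);
    }
}
```

The trade-off is that a hard clamp makes the virtual hand lag slightly behind a genuinely fast swing, which is exactly the motion a nunchaku game cares about most.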

We can never be sure whether another headset with outside-in tracking would have solved our issues.  However, an outside-in tracking solution would likely have cost us the wireless headset and the freedom of movement that being tetherless affords.

As things stand now, there are trade-offs with many of these new, emerging technologies and all one can do is make the best possible decision given those trade-offs to achieve a particular design goal.  With advancements developing rapidly, I am quite sure it is a trade-off that won’t matter in the near future. 

After consulting with a Unity expert mentor who was present at the hackathon, we decided to abandon the attempt at realistic nunchaku physics and pivot to something else.  With what limited time we had left, we revisited a calm, flowy mechanic we had experimented with in a previous iteration and added the ability to paint from the tip of each nunchaku.
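The painting mechanic itself is simple to sketch in Unity: attach a TrailRenderer to an empty transform at the stick's tip and toggle it with the controller trigger. The snippet below is a hypothetical illustration of that idea (component and field names are mine); it assumes the Oculus Integration package, which provides `OVRInput`.

```csharp
using UnityEngine;

// Hypothetical sketch of the painting mechanic: paint flows from the
// stick's tip while the trigger is held. Assumes the Oculus Integration
// package for OVRInput.
public class TipPainter : MonoBehaviour
{
    public Transform tip;  // empty transform placed at the stick's tip
    public OVRInput.Controller controller = OVRInput.Controller.RTouch;
    private TrailRenderer trail;

    void Start()
    {
        trail = tip.gameObject.AddComponent<TrailRenderer>();
        trail.time = Mathf.Infinity;  // strokes persist instead of fading out
        trail.startWidth = 0.02f;
        trail.endWidth = 0.02f;
        trail.emitting = false;
    }

    void Update()
    {
        // Emit trail segments only while the index trigger is held.
        trail.emitting = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, controller);
    }
}
```

Because the trail simply follows wherever the tip travels, the flowy swinging motion does all the artistic work.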

I would be lying if I didn’t admit that testing this new iteration was sort of like a therapeutic-Bob-Ross-meets-Bruce-Lee-in-heaven sort of experience, especially given the many hours we wracked our brains against a problem we just couldn’t solve.

The last challenge was that our team was split on the value of making what we had “educational”; that is, whether to include this calligraphy tracing aspect.  While I did not think this addition would be particularly educational at all, sometimes to meet a deadline you have to read the room, make concessions, and get the project done.


Personal Reflection 

This was my first time as "developer" in a social setting. Just one year prior to this, I had worked on two programming projects all by myself, but I had no foundation in programming or in Unity.  In fact, I consistently let people know at this hackathon that I was just a really good googler and that I had no idea what any of the code was really doing under the hood.  However, after five days of engaging with experienced, professional programmers, I learned that much of what actual developers do on a day-to-day basis is sort of what I was already doing.

Nevertheless, I have a very strong idea of what it means to really "know" something, and this hackathon inspired me to pursue programming fundamentals on my own, so that I can finally understand what the code is doing.

Technical Reflection 

My only other prior experience with VR was on a Windows Mixed Reality (WMR) headset just a year before this hackathon.  I purchased an Oculus Quest right before this hackathon started, and the Quest is an incredible improvement and advancement over the WMR.  Although I outlined some of the Quest's downsides above and how they emerged in the specific context of our project, I am sure those downsides will be overcome very soon.  As good as the physics systems in Unity and other game engines are, achieving the smooth snap motion of real nunchaku proved difficult to implement.

In sum, I am glad my team and I chose to explore the nunchaku game-mechanics because I learned more than I otherwise would have from designing, and failing to design, at the edges and limitations of these technologies.

Social Reflection 

There is something deeply special about environments like a hackathon.  This was my first one and I didn't really know what to expect.  I witnessed a vast array of learning/communication styles, what types of set-ups people like to have when learning/working hard, and how people prefer to step away, ruminate on a problem, and come back to it, whether that's napping, smoking cigarettes alone, or chatting with others.

I think there are two main lessons that schools and other environments can learn from a hackathon.  The first is that when everyone freely chooses to be somewhere together to solve problems that they want to solve, beautiful things can happen and people can achieve way more than they otherwise would.  Second, although many of these projects were not beautiful state-of-the-art VR projects (Enter the FlowJo is certainly not), the learning that really happens is not contained in the end product.

Rather, the best parts are in the mind and what people will take to their future projects.  In a way, hackathons are like this weird positive Purgatory that you visit for a short period of time and where you try to achieve extraordinary goals under the strangest and strictest of circumstances — and then simply return to the real world as a new person afterwards.  This is the experience I have been trying to articulate in this post.

Future Work

Unfortunately, life oft gets in the way of revolutionizing how people nunchuck, so we have decided not to continue development on Enter the FlowJo…

…but we still keep in touch and talk about Bruce Lee from time to time.

Here is a corny rhyme that I wrote when submitting a short description for our final project, the stream-of-consciousness that was the result of 5 very sleepless days: 

Music brought the most geographically distant group together. Nunchucks threatened to chuck us apart.  But the slowmo flow of our dojo got us through it together, yo. 

Yes, cringe.
