Indie VR & Game Developer

Hello! I'm Jason.

I'm a computer programmer with almost two decades of experience, now working to turn a hobby into a career. I've spent the last several years learning game development with the Unity game engine, and I specialize in C# scripting.

I publish my own assets on the Unity Asset Store, aspire to make my own games, and will entertain offers for jobs on projects that suit my skills and interests.

Please enjoy my projects, consider my services, and contact me anytime!

Find Out More


Wednesday, December 11, 2019

Update: BodyLanguageVR 1.1


This first update of the asset mainly sees me finish the Detect Inverse option. This basically gives a user more freedom to trigger some inputs in a way that feels more natural. For example, the head-shake "no" motion previously required Right->Left->Right->Left, but a user's natural instinct may be to go Left->Right->Left->Right. Now both work.

Besides some minor bug fixes and the addition of a manual, most of this update is preparation for the future direction of the asset. I wanted to get more done for this update, but felt it better to release what I have early so I can take my time and do things properly.

I've been reflecting on the direction I want to take this asset, and to begin with that I had to break down and define its current goals. I want this asset to allow replacing traditional input with VR motions. Traditional input is stuff like digital button presses and analog stick/button presses. As is, my asset does not imitate such input very well, even a simple digital button press. When you press a button down, sure, it's a simple TRUE/FALSE thing, and my asset does that. But traditional input also provides something my asset did not: the amount of time the button was held down. When you do a motion, it's very event based. Performing a motion triggers a TRUE value for a single frame. That's it. That's only good for yes/no types of scenarios.
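To illustrate the gap described above, here's a rough sketch of the two input styles in a Unity script. The `MotionDetection` class and `GetMotionTriggered()` come from the asset's changelog; the exact wiring shown here is my own assumption, not verified usage.

```csharp
using UnityEngine;

public class InputComparison : MonoBehaviour
{
    // Assumed: a configured MotionDetection instance, assigned in the inspector.
    public MotionDetection myMotion;

    void Update()
    {
        // Traditional input: true on EVERY frame the button is held,
        // so hold duration can be measured naturally.
        if (Input.GetButton("Fire1"))
        {
            // Runs continuously while the button is down.
        }

        // Motion input: true for a SINGLE frame when the full sequence
        // completes -- closer in spirit to Input.GetButtonDown().
        if (myMotion.GetMotionTriggered())
        {
            // Runs once per completed motion.
        }
    }
}
```

The point of the comparison: `Input.GetButton` carries duration information for free, while a completed motion is a one-frame event.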

This update attempts to let you return values for more than a single frame, but it's kind of limited, and not well thought out. I have no stock demonstration input ideas for the additions.

This brings me to the future direction of the asset.

Right now, the asset is designed around what I refer to as a "DoThen" setup. That means the user Does a motion, Then a value is returned. This is as opposed to what I'd additionally like to support, which I currently refer to as "WhileDo": return a value While the user Does a motion/sequence. This would better simulate holding a button/stick for a period of time. While you can currently just make a one-motion sequence, I'd also like to better support singular motions.
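In code terms, the distinction might look something like this. Note that "WhileDo" doesn't exist yet, so `IsMotionInProgress()` is a hypothetical method name invented purely to illustrate the idea.

```csharp
// "DoThen": the user Does a motion, Then a value is returned.
// Fires once, on the frame the sequence completes.
if (myMotion.GetMotionTriggered())
{
    Jump(); // one-shot, like a button press
}

// "WhileDo" (planned, hypothetical API): return a value While the
// user Does the motion -- like holding a button or stick.
if (myMotion.IsMotionInProgress())
{
    Block(); // stays active for the duration of the motion
}
```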

In the future I can see this asset supporting input like Fruit Ninja, oddball movement setups, etc. I may even just be able to finally make the game idea I had that originally sparked creation of this asset! =)


  • Added DetectInverse option. This allows the user to do the mirror image of a sequence to trigger it.
  • Added GetMotionAxis() to MotionDetection. It's still crude at the moment, but it lets you get an analog value back from the input. Right now the value is just a float from 0 to 1.0f, based on the speed at which the user completes the sequence.
  • Along with that, there's a new option called ResultDurationType. This allows you to get the non-zero result of GetMotionTriggered() and GetMotionAxis() for longer than just a single frame.
  • Fixed some issues with changing the size of the Input and MotionSequence lists.
  • Removed the GetMotionTriggered() method in the BodyLanguageVR class in favor of calling it directly from MotionDetection.
  • Tweaked the expected usage to use a more common instance/object oriented approach instead of the string based lookup method.
  • Added an optional OnMotionTriggered() method selector to the UI. I don't recommend its usage for most users; it's just to help new users who aren't great with code grasp the idea of the asset quickly. Proper/recommended usage is getting an instance of the InputMethod, checking myInputMethod.motion.GetMotionTriggered() every frame, and calling your own code when it returns true. GetMotionAxis() is not supported with OnMotionTriggered().
  • Split the InputExample script, along with the Demo scene, into two to reflect the setup differences described above.
  • Created new Manual.pdf instead of a ReadMe.
  • Updated demo scene objects.
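The recommended polling usage mentioned in the notes above might look roughly like this in a user script. This is a sketch: the `InputMethod` type and its `motion` field are taken from the changelog wording, but the assignment workflow and helper method names are my assumptions.

```csharp
using UnityEngine;

public class MyPlayerInput : MonoBehaviour
{
    // Assumed: a configured InputMethod instance, assigned in the inspector.
    public InputMethod myInputMethod;

    void Update()
    {
        // Poll every frame, as recommended, rather than relying on
        // the OnMotionTriggered() UI selector.
        if (myInputMethod.motion.GetMotionTriggered())
        {
            DoMyThing();
        }

        // GetMotionAxis() returns a float from 0 to 1.0f based on how
        // quickly the sequence was completed (per the 1.1 notes).
        float strength = myInputMethod.motion.GetMotionAxis();
        if (strength > 0f)
        {
            DoMyAnalogThing(strength);
        }
    }

    void DoMyThing() { /* one-shot reaction */ }
    void DoMyAnalogThing(float value) { /* scale a reaction by value */ }
}
```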

Sunday, December 8, 2019

ReRelease: AdFighter


Lately something has been irking me: the one game I have made can't be played by anyone! All because I removed it from the internet after a bad experience with the sleazy company that ran the game jam I originally made it for.

At first I was embarrassed by it because of the failure, but I got over that soon enough. Since then I've intended to re-upload it... I just didn't want to do so with references to and promotion of that sleazy company in it. So I needed to remove those... I just never felt up to digging through the source again. I wanted to move on for the time being. And well, I did. But I never came back to release it... because I had moved on. I had forgotten, or been too busy.

But now though, I finally remembered and got around to it!

To frame this re-release: it's exactly as it was at the end of 2016. Same VR SDKs, same gameplay, same menus/text, etc. The only difference is the removal of those references and promotion.

Also, be warned. This isn't an amazing lost treasure or anything. At least I don't think it is. To each their own, I guess. I'm able to look back and be proud of the work I did, considering the constraints I was under. But the game's design is still "meh". I conceived the premise of the game back then and brought my idea to form on the fly, with no iteration or refinement of the idea. So in the end... I wasn't happy with the outcome, as it did not feel like what I was originally aiming for.

I've never played Beat Saber, but I suspect this game could have ended up as something like it... before Beat Saber ever existed. I thought about going down such paths, but I don't care for rhythm games, and... I was looking to make something original, not derivative of other genres. I took a swing of my own... and I missed! Oh well.

Looking back on this right now, I feel it would be good to have this game out there just as something in my portfolio. It showcases what I was able to do in 8 weeks (or less, since I didn't have a working VR device for half of that time): pulling every idea out of thin air, designing the game systems, art, etc. I don't pretend it displays amazing talent or anything... just that I am "capable". I have my Unity Assets in my portfolio, but this is something different. Those showcase my coding skills; this showcases my skills in all areas, for better or for worse.

You can find the download link over on the AdFighter page.

My Services

Unity C# Scripting

I'm available for C# scripting for small Unity projects.

Read More

Basic 3D Modeling

I'm experienced with Blender. Basic modeling and rigging.

Read More

Unity Development

I've created many demos, scenes, and even a simple game.

Read More

Virtual Reality Consultation

I've been a VR enthusiast for a few years, and have made VR content.

Read More

News / Dev Blog


Contact me

You can reach me by email or twitter.


Twitter: @DarkAkuma_


If you want to contact me for Unity Asset product support, instead please try using the forum post linked on the Asset's Unity store page.

The reason I ask is that your question may already be answered there, and if it isn't, posting it there lets others benefit from the discussion that resolves it.

Due to excessive misuse of this form, and a lack of use of the designated forum posts created specifically for this purpose, any support requests sent via this form/email may be ignored.

Thank you!