Indie VR & Game Developer

Hello! I'm Jason.

I'm a computer programmer with almost 2 decades of experience, now trying to transition from a hobby to a career. I've spent the last several years learning game development with the Unity game engine, and specialize in C# scripting.

I publish my own assets on the Unity Asset Store, aspire to make my own games, and will entertain offers for jobs on projects that suit my skills and interests.

Please enjoy my projects, consider my services, and contact me anytime!

Find Out More

Projects

Wednesday, December 11, 2019

Update: BodyLanguageVR 1.1

The first update of this asset mainly sees me finish the Detect Inverse option. This basically gives a user more freedom to trigger some inputs in a more natural-feeling way. For example, the head-shake "No" motion previously required Right->Left->Right->Left, but a user's natural instinct may be to go Left->Right->Left->Right. Now both work.

Besides some minor bug fixes and the addition of a manual, most of this update is preparation for the future direction of the asset. I wanted to get more done in this update, but felt it better to release what I have early so I can take my time and do things properly.

I've been reflecting on the direction I want to take this asset, and to do that I first had to break down and define its current goals. I want this asset to allow replacing traditional input with VR motions. Traditional input is stuff like digital button presses and analog stick/button presses. As is, my asset does not imitate such input very well. Take even a simple digital button press: sure, it's a simple TRUE/FALSE thing, and my asset does that. But a real button also conveys how long it was held down, which my asset did not. Motions are very event-based. Performing a motion triggers a TRUE value for a single frame. That's it. That's only good for yes/no types of scenarios.

This update attempts to let you return values for more than a single frame, but it's kind of limited and not well thought out. I don't yet have stock demonstration input ideas for the additions.

This brings me to the future direction of the asset.

Right now, the asset is designed around what I refer to as a "DoThen" setup. What that means is, the user Does a motion, Then a value is returned. That's as opposed to what I'd additionally like to support, which I currently refer to as "WhileDo": return a value While the user Does a motion/sequence. This would better simulate holding a button/stick for a period of time. While you can currently just make a one-motion sequence, I'd also like to better support singular motions.
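To make the distinction concrete, here's a rough sketch in traditional Unity input terms. The "WhileDo" half is purely hypothetical, since the asset doesn't support it yet:

```csharp
using UnityEngine;

// Rough sketch of the two input styles in traditional Unity input terms.
// "DoThen" behaves like GetButtonDown: true only on the frame the action completes.
// "WhileDo" (hypothetical, not yet supported) would behave like GetButton: true every frame the action is ongoing.
public class DoThenVsWhileDo : MonoBehaviour
{
    void Update()
    {
        // DoThen: fires once, the frame the event happens. Good for yes/no scenarios.
        if (Input.GetButtonDown("Jump"))
            Debug.Log("DoThen-style: triggered this frame only.");

        // WhileDo: stays true for as long as the button is held. Good for charging, moving, aiming, etc.
        if (Input.GetButton("Jump"))
            Debug.Log("WhileDo-style: still held this frame.");
    }
}
```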

In the future I can see this asset supporting input like Fruit Ninja, oddball movement setups, etc. I may even finally be able to make the game idea that originally sparked the creation of this asset! =)

Changelog:

v1.1
  • Added DetectInverse option. This allows the user to do the mirror image of a sequence to trigger it.
  • Added GetMotionAxis() to MotionDetection. It's still crude at the moment, but this allows you to get an analog value back from the input. Right now the value is just a float from 0 to 1.0f, based on the speed at which the user completes the sequence.
  • Along with that, there's a new option called ResultDurationType. This allows you to get the non-zero result of GetMotionTriggered() and GetMotionAxis() for longer than just a single frame.
  • Fixed some issues with changing the size of the Input and MotionSequence lists.
  • Removed the GetMotionTriggered() method in the BodyLanguageVR class in favor of calling it directly from MotionDetection.
  • Tweaked the expected usage to use a more common instance/object-oriented approach instead of the string-based lookup method.
  • Added an optional OnMotionTriggered() method selector to the UI. I don't recommend its usage for most users. It's just for new users who aren't great with code, to help them grasp the idea of the asset quickly. Proper/recommended usage is getting an instance of the InputMethod, checking myInputMethod.motion.GetMotionTriggered() every frame, and calling your own code when it's true (see the sketch after this changelog). GetMotionAxis() is not supported with OnMotionTriggered().
  • Split the InputExample script, along with the Demo scene, into two to reflect the setup differences for the above.
  • Created new Manual.pdf instead of a ReadMe.
  • Updated demo scene objects.
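
For reference, here's that sketch of the recommended usage. The GetMotionTriggered()/GetMotionAxis() calls and the myInputMethod.motion access path come from the notes above; how the InputMethod reference is obtained (an inspector-assigned field here) is just an assumption:

```csharp
using UnityEngine;

// A minimal sketch of the recommended polling usage. The InputMethod field is
// assumed to be assigned in the inspector; the motion calls are from the changelog above.
public class NodListener : MonoBehaviour
{
    public InputMethod myInputMethod; // e.g. the "Yes" nod input

    void Update()
    {
        // DoThen-style: true on the frame the motion sequence completes.
        if (myInputMethod.motion.GetMotionTriggered())
            Debug.Log("Yes!");

        // New in 1.1: a 0..1 analog value based on how fast the sequence was completed.
        // How long it stays non-zero depends on the ResultDurationType option.
        float axis = myInputMethod.motion.GetMotionAxis();
        if (axis > 0f)
            Debug.Log("Motion axis value: " + axis);
    }
}
```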

Sunday, December 8, 2019

ReRelease: AdFighter

Lately something has been irking me. That something being that the one game I have made can't be played by anyone! All because I removed it from the internet after a bad experience with the sleazy company that ran the game jam I originally made it for.

At first I was embarrassed by it because of the failure, but I got over that soon enough. Since then I intended to re-upload it... I just didn't want to do so with references to and promotion of that sleazy company in it. So I needed to remove those... I just never felt up to digging through the source again. I wanted to move on for the time being. And well, I did. But I never came back to release it... because I had moved on. I had forgotten, or been too busy.

But now though, I finally remembered and got around to it!

To frame this re-release: it's exactly as it was at the end of 2016. Same VR SDKs, same gameplay, same menus/text, etc. The only difference is the removal of those references and promotion.

Also, be warned: this isn't an amazing lost treasure or anything. At least I don't think it is. To each their own, I guess. I'm able to look back and be proud of the work I did, considering the constraints I was under. But the game's design is still "meh". I conceived of the premise of the game back then, and brought my idea to form on the fly. No iteration and refinement of the idea. So in the end... I wasn't happy with the outcome, as it did not feel like what I was originally aiming for.

I've never played Beat Saber, but I suspect that this should have ended up more like it... before Beat Saber ever existed. I thought about going down such paths, but I don't care for rhythm games, and... I was looking more to make something original, not derivative of other genres. I took a swing of my own... and I missed! Oh well.

Looking back on it right now, I feel it would be good to have this game out there just as something in my portfolio. It showcases what I was able to do in 8 weeks (or less, since I didn't have a working VR device for half of that time), pulling every idea out of thin air, designing the game systems, art, etc. I don't pretend to think it displays amazing talent or anything... just that I am "capable". I have my Unity assets in my portfolio, but this is something different. Those showcase my coding skills; this showcases my skills in all areas, for better or for worse.

You can find the download link over on the AdFighter page.

Tuesday, October 15, 2019

New Asset: BodyLanguageVR 1.0

Hello!

Today at last I release my next asset!

This asset is something I conceived of and worked on a few years ago, but I was forced to shelve it before it was finished. Ever since then, it's been hard to pick up work on it again, as it's complex and a very daunting task to return to after stopping!

But recently it felt like such a waste that it was so near completion, yet going unsold. So I doubled down and finished it up by removing and shelving all the numerous ideas I had for it that contributed to feature creep and would never have let me get around to releasing it. That was partly why it was so daunting.

At the moment, it's a little basic. But I think it should be well worth its price as is, and over time, as I get back to adding those new features, its value will grow!

For now, this may be my last VR-focused asset, as I try to go in a new, less niche direction and make some assets for a larger pool of developers.

Anyway, onto the asset itself!

The purpose of this asset is to allow more VR devs to include more appropriate, intuitive input for VR. Input that better suits the VR experience and keeps immersion alive. In real life you don't press F to say "Yes". You just speak the word "yes", or in this case, give a nod of your head up and down.

Yes/No nods are included as examples, and will work on any VR device!

Additionally, I've included an example for waving "Hello". This of course requires a VR device with motion-tracked controllers, like the Vive or Rift. It may even work with other AR-type hand-tracking setups, provided you select a hand model as the tracked object. I don't know; I don't own such devices.

I've included a current feature list at the end of this post.

I hope many people enjoy this asset!



Included Example Gestures:
  • Head Shake: Yes
  • Head Shake: No
  • Hand Wave: Hello

Supported Custom Gesture Detection Options:
  • Positional Distance
  • Rotational Degrees
  • Direction Held
  • more TBA...

Available now on the Unity Asset Store!

Monday, June 17, 2019

Update: TransformEx 1.2

TransformEx has been updated to v1.2!

This update was a long time coming, and was very much needed. It's mostly about improving the demo scenes to be more appealing/eye-catching, and to better convey the usage, purpose, and quality of the asset through screenshots, video, and an online demo.

The promotional material in general was really bad and/or lacking. The demo scene/screenshots looked cruddy and amateurish. There was no video, which seems to hurt people's interest and willingness to give it a chance. There was no web demo. But that's all changed now!

This is all something that has bothered me for quite a while, but sales for this asset are slow despite it being a "must have" that all VR devs should be using. I never updated it until now simply because it didn't sell well and thus wasn't worth the time/effort. But perhaps it didn't sell well because it needed such an update? I'm unsure. Either way, I felt my time was better spent on VR3DMediaViewer updates/maintenance, as that sells much better.

Additionally, I made some improvements to the asset's code itself. Like menu options that let you hide units of measurement you probably don't use, which just end up making things more cluttered (like Miles and Kilometers). I polished up the UI a little so it's formatted better. I added an "equalize" button, something I wanted to add long ago in some form, but I wasn't sure how I wanted to go about it. And there's now even minor support for RectTransforms, at least for positions.
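
To illustrate the idea behind the equalize button (this is just the concept, not the asset's actual code): take the object's current scale and make all three axes uniform.

```csharp
using UnityEngine;

// Illustration only: one way an "equalize" operation could work, making the
// object's proportions uniform by averaging the three scale axes.
public static class EqualizeExample
{
    public static void Equalize(Transform target)
    {
        Vector3 s = target.localScale;
        float uniform = (s.x + s.y + s.z) / 3f; // average of the three axes
        target.localScale = new Vector3(uniform, uniform, uniform);
    }
}
```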


Change Log:

1.2:
  • New demo scenes.
  • Added menu options to only see the units of measurement that you want.
  • Improved the look of the Ex Transform controls.
  • Ex Position controls added to RectTransforms.
  • Added equalize button for easy equalization of the object proportions.

Available now on the Unity Asset Store!

Monday, June 10, 2019

Update: VR3DMediaViewer 2.3

VR3DMediaViewer has been updated to v2.3!

I'm not sure what happened to 2.2... I guess I decided not to post about it since it was more minor? I'll include the change logs for both though.

The most notable updates are:

A new Manual. This manual is a PDF that replaces the old ReadMe text file. It's largely copied directly from the ReadMe, but heavily updated with better formatting, images, color, etc. It should be a lot more friendly to new users.

The Stereoscopic 3D screenshot script/scene had been bugging me for a while. I realized it was geared around developers using it themselves in the editor, but it should let users of the dev's app use it too. So I worked on some example scripts/assets/scenes to provide a decent example of how devs can best allow users to make use of it.

There's a new Panoramic media display method available now. The older way had issues in that, as a physical projection, it didn't lock to the viewer's head position, which may not always be desired. Being mesh-based, it's also subject to the mesh's geometry. You can see odd angles in the media caused by the vertex/UV lines of the mesh, only mitigated by using a higher-poly mesh.

The new method is shader-based. It's more like a skybox, viewed at infinity, so it's always centered on the viewer's camera. And the curve is rendered in the shader, so it renders a smooth curve without the odd-angle issues and looks the same regardless of what shape mesh you use! This shader method is set up for both 360 and 180, supports rotation of the image around the viewer, and can even work with a traditional flat quad canvas. The old panoramic method is still available as an option.
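
For the curious, the core trick of a shader-based 360 canvas is mapping the view direction to equirectangular UVs per pixel, instead of relying on the mesh's own UVs. A rough C# version of that math, purely as an illustration of the idea (the asset's actual shader may differ):

```csharp
using UnityEngine;

// Illustration of the math a shader-based 360 canvas performs per pixel:
// map a view direction to equirectangular (longitude/latitude) UVs, so the
// result is independent of the canvas mesh's own UVs and geometry.
public static class EquirectSample
{
    public static Vector2 DirectionToUV(Vector3 dir)
    {
        dir.Normalize();
        float u = Mathf.Atan2(dir.x, dir.z) / (2f * Mathf.PI) + 0.5f; // longitude -> 0..1
        float v = Mathf.Asin(dir.y) / Mathf.PI + 0.5f;                // latitude  -> 0..1
        return new Vector2(u, v);
    }
}
```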

Convergence has also been updated. It supports both the old and new panoramic display methods. It's now split into two convergence modes: Cropped (the previous way) and Tiled. Tiled just lets the texture wrap around as it's shifted in the canvas. It's not ideal for non-panoramic content.
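
The wrap-around behavior behind Tiled is essentially standard texture wrapping. A generic Unity illustration of the idea (not the asset's internal code): shift the texture horizontally, and let the wrap mode decide whether pixels leaving one edge reappear on the other.

```csharp
using UnityEngine;

// Generic illustration of the wrap-around idea behind a "Tiled" shift (not the
// asset's internal code): offset the texture on a material, and let the wrap
// mode decide whether pixels leaving one edge reappear on the other.
public class ConvergenceShiftExample : MonoBehaviour
{
    public Renderer canvasRenderer;
    [Range(-0.1f, 0.1f)] public float horizontalShift = 0.02f;

    void Start()
    {
        Texture tex = canvasRenderer.material.mainTexture;
        tex.wrapMode = TextureWrapMode.Repeat; // Repeat wraps (Tiled); Clamp would behave more like Cropped
        canvasRenderer.material.mainTextureOffset = new Vector2(horizontalShift, 0f);
    }
}
```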

I improved the look/organization of the Stereoscopic3DImage objects. The controls are now better sectioned out, color is used to make the sections pop more, more controls are disabled/hidden when not relevant, etc. I also added a preview of the media to the object. It's similar to the preview you see when selecting an image/video itself, but in the case of images it will display the image converted to an SBS/TB format. The VideoClip preview is limited: just a low-res texture of the first frame.

I added several new demo scenes, and with that, organized all the demo scenes into sub-folders.

Change Log:

2.2:
  • Converted the ReadMe to a Manual.
  • Fixed some position/rotation issues with the Stereoscopic3DScreenshot script.
  • Added new demo content/scene to show how to use the Stereoscopic3DScreenshot script to allow players to make use of it at runtime/gametime.

Change Log:

2.3:
  • Added new Panoramic shader based canvas formats.
  • Added wrapModeOverride option to Stereoscopic3DImage objects. This is mostly for video clip textures that don't expose the WrapMode property in the Import Settings.
  • Added ConvergenceMode option.
  • Fixed a rotation bug with the Example3DScreenshotController.
  • Reorganized the Demo Scenes folder hierarchy.
  • Tuned up the Stereoscopic3DImage object inspector.
  • Tweaked the AnaglyphToSBSTexture() method to support more than just Red/Cyan now.
  • The camera setup notification box will no longer display on the example cross-eyed camera rigs.
  • Added scenes/canvas meshes for 180 Panoramic display.

Available now on the Unity Asset Store!

Thursday, April 4, 2019

Update: Project: Empty Soda Can #1

Isn't that funny? In my last update I made a complete guess at when my next update for this project would come out: 6 months. And wow... here we are, 6 months later! Exactly!

What's that? I made another update in that time, and this one is 6 months after that instead? Weird...

Well anyway, in that time Project: Empty Soda Can has come a long way!


Recent Example #1:

After 6+ months of painstaking research and feedback, we have found that the Red Cube doesn't track well. Apparently, to some Bulls out there, the color is found to be offensive.

*Looks down in shame, as he hears the chorus of Boos.

I know. I understand the outrage. This is unforgivable, but I hope that you please do forgive me.

To rectify this, we've instead elected to go for a calm, cooler, more neutral Blue.



I'd like to take this time to formally apologize to any Bovines out there. It was never my intent to offend. I truly hope that you can forgive me.


Recent Example #2:

Again, upon further research, we have discovered that the term "Internet" is too confusing to a significant portion of the audience.

Some of those people were often offended by it, as it presumed that everyone was educated enough to understand the word.

*Smacks self in face.

I don't know what's wrong, or why we keep making such horrendous mistakes.

*Smacks self in face again.

We briefly considered "World Wide Web", but we quickly deemed that to be too much of a mouthful.

Further, feedback suggested that it wasn't inclusive enough.

So... we think we came up with a solution...

*Holds breath...



Our hope is that "Everybody!" remains inclusive, while retaining an appropriate lack of confusion.


Recent Example #3:

Finally, due to overwhelming feedback, we have realized another mistake...

We have offended yet another portion of the audience...

*Smacks head against wall several times.

While the "Move Right" Animation was an amazing achievement, we made a grievous mistake in not considering the feelings of our Left Handed Audience.

To help be more inclusive to our Left Handed audience out there, we have now added an Animation for you too!

And yes, this too has been updated to now be a BLUE CUBE.



We're extremely sorry that we dropped the ball on this one as badly as we did, and hope that addressing it now is better late than never!

Stay tuned for the next update!

Let's see...

I'll try guessing again...

One plus Two is Three...

...carry the Four...

...one train is going 40 MPH, and the other is going 60 MPH...

...I'll guess... 6 Months!

Yea. Solid guess!


My Services

Unity C# Scripting

I'm available for C# scripting for small Unity projects.

Read More

Basic 3D Modeling

I'm experienced with Blender. Basic modeling and rigging.

Read More

Unity Development

I've created many demos, scenes, and even a simple game.

Read More

Virtual Reality Consultation

I've been a VR enthusiast for a few years, and have made VR content.

Read More

News / Dev Blog

Contact

Contact me

You can reach me by email or Twitter.

Email: darkakumaz-net.us

Twitter: @DarkAkuma_

LinkedIn: Profile