Given the current state of the world, we’ve been looking into immersive apps that could assist in various educational frameworks. To this end, I went through all the “Team Collaboration” apps in this list from Road to VR. Here are my findings.
Character Creator 3 to Unity
I’ve been taking some time to go back to Character Creator 3 and really learn what it has to offer. To that end, I’m working through a lot of tutorials, both on the official site and from freelancers on YouTube. I started from the end of the pipeline, though, for the sake of some project deliverables we needed ASAP. A project required two generic characters with the same animations, so I began by finding out all I could about exporting characters from CC3 to Unity as efficiently as possible. From the tutorials, I was able to fill in a lot of the export settings I had been missing.
Sketchbox VR Exploration
Took some time this morning to learn about Sketchbox on the Oculus Rift S. It’s a cooperative VR application for exploring and making 3D scenes and objects. The range of options available was very robust, and the online collaboration had only one major hiccup.
VRTK Implementation Part 3 of ???
Been a while since I did an update post on this project as a whole. A lot of time has been spent refining certain aspects. For example, the grabbing functionality now shows highlighted drop zones when a piece has been removed, making it easier to put it back in place while still fitting with the Piece Management scripts I set up earlier. Another big update that went by without a post was the lab configuration. The experience can now switch between two modes: one in the large silo from before, and one where horizontal movement is locked, a ceiling and floor appear to close off the rest of the silo, and the scientific payload of the rocket is turned on its side for horizontal exploration. This lab configuration is controlled by a scene-wide Position Manager I set up, which will help as I move on to the final interactions with the rocket this coming week. In scene, the player can switch between the lab and silo configurations using a button on a separate menu screen.
VRTK Implementation Part 2 of ???
Got far enough to make an update. Last time, I got the hands and touch controls more or less working. Since then, a few things have changed in this setup. For one, I went through the VRTK example scenes to figure out how they used the hands. One major change was simply taking the hands from the example scene: they had a cleaner setup, and I realized they also came with position values already configured for VRTK’s hand alias realignment script, which places the hands correctly relative to whichever controller the given SDK uses (for example, the HTC Vive wands reach farther out from your physical hands than the Oculus controllers do). Overall, as I’ve delved deeper, I’ve gained a much better sense of how VRTK works once all the pieces are lined up correctly.
Setting Up GitLab with GitHub
Decided to finally get to know GitLab a bit more, for the sake of backing up my project and hopefully setting things up for version control if I work collaboratively in the future. Thing is, I hate the git shell and command-line tools and vastly prefer the ease of GitHub. So I took some time to figure out the handshake between them. Here I’ll try to explain the setup process as simply as I can.
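As a minimal sketch of one common way to do this handshake (the repository names and URLs below are placeholders, not my actual project): add both hosts as push URLs on the same remote, so one push updates GitHub and GitLab together.

```shell
# Start from a local repository (or clone your existing GitHub repo).
git init -q demo && cd demo

# Point "origin" at GitHub for fetching.
git remote add origin git@github.com:user/project.git

# Once any explicit push URL exists, the fetch URL is no longer used for
# pushing, so add BOTH hosts as push URLs.
git remote set-url --add --push origin git@github.com:user/project.git
git remote set-url --add --push origin git@gitlab.com:user/project.git

# Verify: one fetch URL (GitHub), two push URLs (GitHub + GitLab).
git remote -v
```

After this, a plain `git push origin` mirrors commits to both hosts, and day-to-day work can stay in the GitHub desktop tools.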
Volume Viewer Pro
Going to make a post about this to better explain the situation.
Volume Viewer Pro is a plugin for Unity for viewing volumetric data such as MRI scans. I’ve been exploring it as part of a potential project, and initially things were rather promising. Volume Viewer essentially does the reverse of a scan: it takes the volumetric data and creates cameras around a box. These cameras then cast rays based on the data and seem to calculate the intersections of the rays with the box in order to shave away at the shape’s normal map and create a complex volume rendered by the camera. Thanks to this setup, viewing these volumes in VR is actually possible, at least with the Rift S and our higher-end PCs. We had already tested this using some data we were given from a scan of a mouse brain.
VRTK Implementation Part 1 of ???
This will probably be the first of a few posts about using the VRTK plugin for interactive development within Unity. The Virtual Reality Tool Kit (VRTK) by Extend Reality Ltd. is a plugin that essentially unifies the disparate control schemes of the various VR software development kits (SDKs). Because different VR setups have different input methods, switching between them can be hard, as you have to adjust all your inputs in code accordingly. VRTK takes the inputs from all the VR solutions and creates its own input mapping on top of them. Theoretically, all you need to do is change the target device and SDK within VRTK before building out your project, rather than going in and changing the inputs in all your scripts. I say theoretically because this plugin can be rather confusing and unruly at times. For example, when switching from the Oculus Rift S to the Oculus Quest, two systems that actually HAVE the same controllers, I had trouble finding the proper setup after moving to Android publishing; the answer I eventually found was, oddly enough, setting VRTK to the “GearVR” setting, which is a far different remote than the one the Quest uses.
That said, now that I have a better idea of the implementation and setup, things are going much smoother. So here’s what I’ve figured out so far while working on my interactive exploration of the WRXR rocket from a previous post.
Poly Reduction for VR
When it comes to VR experiences, a major concern in development is maintaining a high frame rate. When the frame rate dips badly in VR, players are more likely to experience motion sickness and other issues. To avoid this, everything in the experience needs to be optimized for performance, and a major part of that optimization is controlling polygon counts on objects.
An object with 1.76 million triangles could certainly run in VR, but it would choke out many other aspects of the experience and leave little headroom for anything else. We’re working on a project involving a rocket model with just such a high poly count, so I’ve spent the past few weeks reducing the model as much as I could while retaining important details. Across all the reduction passes, I’ve brought it from around 1.76 million tris down to about 93 thousand, roughly a 95% reduction.
Implementation and Issues with Human Avatars
I’ve spent the past week (not counting a few earlier days of experimenting) working on getting a fairly simple human avatar into Unity for use in some Virtual Field Trips. The idea was to use Reallusion’s Character Creator (CC) to make a basic character, modify it in Maya, and apply some motion-captured animations for use in Unity. Easy to say, much harder to do. Here I’m going to discuss the steps and pitfalls encountered along the way, to hopefully illustrate the difficulties and lay some groundwork for better character implementation in the future.