XR technologies lend themselves well to explorable content, as they allow the user to physically dive into the material. Both augmented reality, accessible through apps on phones and tablets, and mixed reality, available on headsets like the HoloLens and the Meta Quest Pro, allow camera passthrough, seamlessly bringing virtual elements into your physical space. These tools aid in observing phenomena that would be difficult to examine any other way, such as large-scale or slow-moving processes, which can be shrunk down or sped up for ease of understanding. In this Adobe Aero example, the orbits of planets in our solar system are shown within the bounds of an office space (created by TLT Experience Designer Zach Lonsinger at Penn State).
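To get a feel for the compression involved, here is a minimal sketch in Python of the scale and time factors needed to fit the solar system into an office. The room width and playback duration are hypothetical round numbers, not values from the Aero scene above:

```python
# Scale and time compression for an office-sized solar system.
# Room width and playback duration are hypothetical assumptions.

AU_KM = 1.496e8                 # one astronomical unit in kilometers
NEPTUNE_ORBIT_RADIUS_AU = 30.1  # semi-major axis of Neptune's orbit
NEPTUNE_PERIOD_YEARS = 164.8    # Neptune's orbital period

room_width_m = 4.0              # assumed usable width of the office
playback_seconds = 60.0         # assumed time for one full Neptune orbit

# Spatial scale: real orbit diameter down to the room width.
orbit_diameter_m = 2 * NEPTUNE_ORBIT_RADIUS_AU * AU_KM * 1000
spatial_scale = room_width_m / orbit_diameter_m

# Temporal scale: ~165 years of motion played back in one minute.
seconds_per_year = 365.25 * 24 * 3600
time_scale = (NEPTUNE_PERIOD_YEARS * seconds_per_year) / playback_seconds

print(f"Spatial scale factor: 1:{1 / spatial_scale:.3g}")
print(f"Time compression factor: {time_scale:.3g}x")
```

Running this shows a spatial reduction on the order of trillions to one and a time compression in the tens of millions, which is exactly the kind of transformation that is impractical anywhere but in a virtual scene.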
Internal structures that would typically be too finely detailed to resolve with the naked eye, or that would otherwise be obscured, can be viewed as virtual models without any boundaries or barriers. Here, this honey bee model, outfitted with internal digestive, circulatory, and vascular systems, serves as an example of how things that would typically only be viewable with microscopy can be explored in AR.
While this example is a static model, more complex mechanisms can also be displayed through animations, such as those included in the original desktop tool here: https://cie.psu.edu/BeeModel/. These include the folding of the mouthparts and the wing movement, which are helpful for students to see in action.
Another model that can be well integrated into AR is a segment of the human circulatory system, where it will be interesting to dive among the blood cells in motion.
The Adobe help page describes four ways to integrate custom animations into your AR scenes:
https://helpx.adobe.com/aero/how-to/create-animations-in-aero.html
Another tutorial I’ve used references my native 3D modeling software, Cinema 4D: https://www.schoolofmotion.com/blog/using-cinema-4d-art-augmented-reality-adobe-aero
A challenge I’ve faced is that, since updates to Adobe Aero, the typical suggested workflow for integrating animations diverges from the tutorial: following the steps as described results in a “ghost” static model remaining in the scene alongside the animated model, and the materials fail to transfer to the exported animation file. Once this issue is resolved I will update this post with solutions. In the meantime, the behaviors feature still works well to showcase models in the app with interactivity: https://www.youtube.com/watch?v=bl5ygyAes4A
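While troubleshooting, one thing that can help is checking whether the exported file actually contains the material and animation data before blaming the import into Aero. Here is a minimal sketch using the third-party pygltflib library, assuming the export is a .glb or .gltf file; the filename is hypothetical:

```python
# Sanity-check a glTF export before importing it into Aero:
# confirm that meshes, materials, and animations survived export.
from pygltflib import GLTF2

gltf = GLTF2().load("exported_model.glb")  # hypothetical filename

print(f"Meshes:     {len(gltf.meshes)}")
print(f"Materials:  {len(gltf.materials)}")
print(f"Animations: {len(gltf.animations)}")

# A "ghost" static copy can show up as extra nodes referencing
# the same mesh, so listing the nodes may reveal duplicates.
for i, node in enumerate(gltf.nodes):
    print(f"Node {i}: name={node.name!r}, mesh={node.mesh}")

# If the materials list is empty here, the loss happened at export
# time, and the exporter settings are the place to look first.
if not gltf.materials:
    print("Warning: no materials found; re-check export settings.")
```

If the file checks out but Aero still drops the materials, that at least narrows the problem to the import step rather than the Cinema 4D export.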
While the scanned walls obscure a great deal of the passthrough view, entire rooms and spaces can be captured for AR, as depicted in the example of a full-scale operating room shown below. It is worth mentioning that this model was created with LiDAR scanning, which, as described in another blog post, struggles when it encounters reflective surfaces such as the metal rails and machinery present here, leaving those objects slightly lumpy in appearance. This can be remedied, with labor and care, by manually restructuring the meshes in 3D modeling software.
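That manual cleanup happens in a 3D package, but for a first pass the lumps from reflective surfaces can sometimes be reduced programmatically. Here is a minimal sketch using the trimesh Python library’s Taubin smoothing, a volume-preserving alternative to plain Laplacian smoothing; the filenames and parameter values are hypothetical starting points, not a substitute for the manual retopology described above:

```python
# First-pass smoothing of a lumpy LiDAR scan before manual cleanup.
import trimesh

# Hypothetical filename for the exported room scan.
mesh = trimesh.load("operating_room_scan.obj", force="mesh")

# Taubin smoothing reduces scan noise while roughly preserving
# volume, unlike plain Laplacian smoothing, which shrinks the mesh.
trimesh.smoothing.filter_taubin(mesh, lamb=0.5, nu=0.53, iterations=10)

mesh.export("operating_room_scan_smoothed.obj")
```

Heavier smoothing flattens legitimate detail along with the noise, so it pays to keep the iteration count low and finish the rails and machinery by hand.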