
Apollo 11 VR Exhibit with Azure Text to Speech & MRTK



Made with MRTK is a monthly series where we share sample projects that leverage the Mixed Reality Toolkit (MRTK). You’re welcome to use the resources provided to create your own experiences or iterate on the samples. For those new to MRTK, the toolkit provides a set of components and features that accelerate cross-platform MR development in Unity. To learn more, visit aka.ms/mrtk!

 

 

 

Growing up in the DC area, I loved everything there was to love about visiting the Smithsonian Air and Space Museum! Upon entering the museum, you’re greeted by massive aircraft and the chance to see so many awesome artifacts in person. I’ve personally always been fond of viewing the space exploration exhibits. Over the years, I’ve had my fair share of space exploration encounters – from visiting NASA, to projects for Microsoft Build, and even a conversation where we discussed The Planetary Society – an organization in which I’m a proud member!

 

 

 

[Image: April standing in front of airplanes at NASA Armstrong]

 

 

 

For this week’s Made with MRTK project, I decided to revisit my love of space exploration and bring the educational experience into VR. I created my very own interactive Apollo 11 exhibit with 3D models courtesy of the Smithsonian 3D Digitization project and NASA.

 

 

 

Models

 

 

 

 

I discovered Smithsonian 3D Digitization a few years ago when I first began exploring educational experiences for AR and VR. This resource provides open-source models of artifacts from across the Smithsonian museums. Since I wanted to create a space exploration exhibit, I headed over to their collections and saw that there was already a small collection of models for Apollo 11. Not only do they provide the models, but there’s also a description to accompany each one. This turned out to be really helpful, as I was able to leverage the provided descriptions when writing the text for the exhibit.

 

 

 

The other resource I used for models is NASA Astromaterials 3D. I honestly just learned about this resource while working on the project. I had thought to myself, “How cool would it be to have an exhibit of Moon rocks?” Lo and behold, after a quick search online, I came across this website. There’s an entire Apollo Lunar Collection which consists of rocks from various Apollo missions. The files can be pretty heavy, so I recommend using the low-resolution files, which are just a few megabytes. Likewise, NASA also provides a detailed description for the models.

 

 

 

Exhibit Inspiration

 

 

 

 

The layout of museum exhibits has always fascinated me. They’re all so different in design with regards to the colors, lighting, signage, and surfaces for artifacts. I headed over to Pinterest for some inspiration to help me decide how to lay out the exhibits within the experience. I took a scroll through the results for Space Museum Exhibits to get an idea of how various museums lay out and design their own exhibits. A common theme I encountered was that the rooms were relatively dark, with just the right amount of mood lighting – which made sense given that it’s pretty dark in outer space. I also went on a deep dive in search of examples of exhibit signage. Given that there’s so much to read when viewing a museum exhibit, I wanted to ensure the written parts of the exhibit were both eye-catching and legible.

 

 

 

Since I wanted to feature the models in the Apollo 11 collection and some of the lunar rocks, I narrowed down my exhibit layout to the following:

 

 

 

[Image: Exhibit layout]

 

 

 

Designing the Environment

 

 

 

 

Lately, I’ve been diving into world building and wanted to push myself outside my comfort zone for this project. Typically, whenever I create a simple proof of concept, I don’t put much effort into creating a captivating environment. However, I truly wanted to emulate the feeling of being in a museum. And not just any museum – I wanted it to feel like being in a museum in outer space! I swapped out the usual default sunny horizon in Unity for a space skybox that I found in the Unity Asset Store. The skybox I used provides a view of Earth from space!

 

 

 

[Image: Skybox for the environment is a view of Earth.]

 

Although there’s not a lot of light in space, I wanted to ensure that everything within the exhibit was still visible. Rather than brightening the entire environment with one bright directional light, I opted for gallery lights – or at least the essence of gallery lights! From my search on Pinterest, I saw that most exhibits rely on strategically placed spotlights to brighten the artifacts. I found a model of gallery lights on Sketchfab and paired a Unity spotlight with each fixture so that the light appears to come from the gallery lights themselves.
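For anyone curious how that pairing might look, here’s a minimal sketch of attaching a spotlight to a gallery-light fixture and aiming it at an artifact. The fixture and artifact references, the class name, and the light settings are placeholder values rather than the exact setup from the project.

```csharp
using UnityEngine;

// Minimal sketch: parent a spotlight to a gallery-light fixture so the
// illumination appears to come from the model itself.
public class GalleryLightSpot : MonoBehaviour
{
    [SerializeField] private Transform fixtureHead; // head of the gallery-light model
    [SerializeField] private Transform artifact;    // exhibit piece to illuminate

    private void Awake()
    {
        // Create a spotlight as a child of the fixture so it moves with the model.
        var lightObject = new GameObject("FixtureSpotlight");
        lightObject.transform.SetParent(fixtureHead, worldPositionStays: false);

        var spot = lightObject.AddComponent<Light>();
        spot.type = LightType.Spot;
        spot.range = 6f;      // placeholder distance down to the platform
        spot.spotAngle = 45f; // placeholder cone width
        spot.intensity = 2f;  // placeholder brightness for a dark room

        // Aim the cone at the artifact on the exhibit platform.
        lightObject.transform.LookAt(artifact);
    }
}
```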

 

 

 

[Image: Gallery lights above the exhibits.]

 

 

 

As for the exhibit platforms, I used Unity primitives and modified the shapes based on examples I had seen on Pinterest. I headed into Blender to create the exhibit titles so that there would be some depth between the exhibit titles and the exhibit descriptions. A tutorial I found helped me with creating the text, as I had never created 3D text in Blender before. It’s trickier than it sounds given that the geometry tends to get a bit messy.

 

 

 

[Image: 'Neil Armstrong' written in 3D]

 

 

 

As for the images within the exhibit, the Smithsonian had plenty to leverage across their websites. For example, the ones used for the timeline are courtesy of their very own Apollo 11 Timeline, which is part of the National Air and Space Museum website.

 

 

 

[Image: Photos within the Apollo 11 Timeline]

 

 

 

Interacting with Artifacts

 

 

 

 

Adding an element of interactivity to a museum always makes the experience that much more exciting for me! In the physical world, it’s not often that you can touch and grab the artifacts in a museum. You often must admire from afar. Since I was creating this experience in VR, I decided to break the rules a bit and make it so that you could interact with the artifacts. I chose to add interactivity to the Extra-Vehicular Gloves.

 

 

 

[Image: Extra-Vehicular Gloves]

 

 

 

Using MRTK’s Object Manipulator and Near Interaction Grabbable scripts, I made it so that you could pick up and view the gloves from either a near or far distance. I’m honestly glad I did, because there are a lot of little details that would’ve been missed if the gloves were stationary. For example, 'Armstrong' is written on the inside of each glove – a detail that would surely be missed if a person couldn’t pick up the glove to see it.
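As a rough sketch of how those two MRTK components can be attached to an artifact such as the gloves, here’s one way to set it up in code. The component and namespace names assume MRTK 2.x, and the GrabbableArtifact class name is just for illustration; in practice the components can also simply be added in the Inspector.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Minimal sketch: make an artifact grabbable from both near and far.
public class GrabbableArtifact : MonoBehaviour
{
    private void Awake()
    {
        // MRTK pointers need a collider to target the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // Object Manipulator enables grab, move, and rotate via far rays and hands.
        gameObject.AddComponent<ObjectManipulator>();

        // Near Interaction Grabbable allows articulated hands to grab up close.
        gameObject.AddComponent<NearInteractionGrabbable>();
    }
}
```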

 

 

 

[Image: 'Armstrong' is written on the inside of the gloves]

 

 

 

Azure Text to Speech

 

 

 

 

I’m always a big fan of leveraging our Azure Cognitive Services in my projects – especially when it involves our Speech services! Whether it’s to make an experience more accessible or to provide an auditory layer to the experience, leveraging this service is always a favorite of mine. I chose to include audio narration of the exhibit descriptions to accompany the Neil Armstrong and Command Module exhibits. With Azure Text to Speech, I was able to send a string of text to the service, which returned an audio clip of the synthesized speech. We have a Speech SDK for Unity that needs to be imported into the project – and you’ll need to create a Speech resource in the Azure Portal!
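Here’s a minimal sketch of that workflow using the Speech SDK for Unity, assuming the SDK has been imported and a Speech resource exists in the Azure Portal. The ExhibitNarrator class name, the serialized fields, and the key/region placeholders are illustrative rather than the exact script from the project.

```csharp
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;
using UnityEngine;

// Minimal sketch: send a string to Azure Text to Speech and play the
// returned audio in the scene through an AudioSource.
[RequireComponent(typeof(AudioSource))]
public class ExhibitNarrator : MonoBehaviour
{
    [SerializeField] private string subscriptionKey = "<your-speech-key>";
    [SerializeField] private string region = "<your-speech-region>";
    [TextArea] [SerializeField] private string exhibitDescription;

    private AudioSource audioSource;

    private void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Called when the exhibit button is pressed.
    public void SpeakDescription()
    {
        var config = SpeechConfig.FromSubscription(subscriptionKey, region);

        // Passing a null AudioConfig returns the audio data in the result
        // instead of playing it through the default speaker.
        using (var synthesizer = new SpeechSynthesizer(config, null as AudioConfig))
        {
            var result = synthesizer.SpeakTextAsync(exhibitDescription).Result;

            if (result.Reason == ResultReason.SynthesizingAudioCompleted)
            {
                // The default output is 16 kHz, 16-bit mono PCM; convert to floats.
                int sampleCount = result.AudioData.Length / 2;
                var samples = new float[sampleCount];
                for (int i = 0; i < sampleCount; i++)
                {
                    samples[i] = (short)(result.AudioData[i * 2 + 1] << 8
                        | result.AudioData[i * 2]) / 32768.0f;
                }

                var clip = AudioClip.Create("ExhibitAudio", sampleCount, 1, 16000, false);
                clip.SetData(samples, 0);
                audioSource.clip = clip;
                audioSource.Play();
            }
        }
    }
}
```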

 

 

 

[Image: Pressing the button plays an audio clip of the written text]

 

 

 

I created a script which contains a method to trigger the workflow. To trigger the execution of the method, I created a custom button for each exhibit. When pressed, the method executes, and in less than a second, the audio clip returned by Azure plays in the scene. The custom button is configured with the MRTK Pressable Button and Interactable scripts. Configuring this button was a little tricky at first given that I had the wrong collider on the button. I used a cylinder primitive as the base of the button, which left me with a capsule collider. When creating a custom button with MRTK, the Pressable Button script provides layers which represent the distance values configured in the Press Settings. The capsule collider placed the layers in the wrong direction, so I had to swap out the capsule collider for a box collider.
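As a rough sketch of the wiring – assuming the hypothetical ExhibitNarrator component from the earlier snippet, with the Pressable Button, Interactable, and box collider already configured on the button in the Inspector – the Interactable’s OnClick event just needs to call the text-to-speech method:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Minimal sketch: route the custom button's OnClick event to the narration method.
public class ExhibitButtonSetup : MonoBehaviour
{
    [SerializeField] private ExhibitNarrator narrator;

    private void Awake()
    {
        // When the button is pressed, synthesize and play the exhibit description.
        GetComponent<Interactable>().OnClick.AddListener(narrator.SpeakDescription);
    }
}
```

The same wiring can also be done without code by adding the method to the Interactable’s OnClick event in the Inspector.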

 

 

 

[Image: Press Setting Layers]

 

 

 

Demo

 

 

 

 

I recorded a demo of the experience as viewed on my Meta Quest, running in the Unity Editor in Play mode. Feel free to check out the demo:

 

 

 

 

 

 

I’ve also created a GitHub repository for the project. As-is, the experience can be viewed on a Meta Quest while in Play mode within Unity by using the Quest Link cable. If you intend to view the experience for yourself, be sure to follow the instructions provided in the README as you’ll need to create an Azure Speech resource and add your own Key and Region into the Unity project to generate the transcribed audio clips.

 

 

 

Conclusion

 

 

 

 

Creating VR museum experiences such as the Apollo 11 exhibit shines a light on how many resources are available for us to create our own educational experiences from scratch. Resources such as Smithsonian 3D Digitization and NASA Astromaterials 3D provide what is arguably the most challenging asset one would need for creating VR experiences – the models! And it’s great that each resource also provides accompanying descriptions to help bring more context to an experience. I hope this project inspires you to create your own educational experiences – not just for VR but for AR as well! In a future post, I’ll showcase a similar project for AR that leverages Microsoft Power Apps.

 

 

 

Until next time, happy creating!

 
