Mixed Reality Experience for Mobile Devices


Usage of AR software for Product Visualization. Source: Picture by Mira Miroshnichenko/Pexels


In the last few years, the popularization of the Metaverse has increased the demand for mixed reality (MR) applications, pushing software companies to invest in new technologies that deliver a higher level of immersion. Mobile systems are a natural place for the Metaverse to grow: smartphones are the devices people are most familiar with, used throughout their daily routines, and they carry the features and sensors needed to provide the immersion an MR application requires.

To develop an MR application, developers should understand the features and tools available at the time of development and choose the technology that provides the best experience to the user. In this article, we're going to look at some tools, frameworks, and software for building MR mobile applications. Then we'll cover the capabilities inside mobile systems that make it possible to bring immersion to an application. Finally, we'll present some use cases of MR technologies for mobile devices, with examples of available applications.


Tools and Frameworks for Mixed Reality Apps


Unity

Unity is one of the best-known cross-platform game engines and SDKs on the market. Released in 2005, it set out to democratize game development, making it possible to build 2D and 3D games quickly, without deep coding expertise, thanks to its user-friendly interface and visual scripting tools such as PlayMaker. It provides a range of features that help with making a great game from scratch, such as a robust physics engine, animation tools, rendering capabilities, scripting support in C# (with native plugins written in languages such as C++), and an extensive library of pre-built assets and scripts.

Unity offers many features that make a developer's work easier when targeting VR on mobile, with an easy setup for both iOS and Android devices on the Unity XR Platform. Once configured correctly, Unity provides various options for VR input and interaction, such as touch-based interaction, haptic feedback, and rotation and translation tracking, which can be used to implement locomotion mechanics, grabbing objects, teleportation, and UI interactions. Unity also provides an easy way to debug the application while developing it, making it possible to check on the target device whether the application responds the way it should. For audio and spatialization, Unity offers audio tools and plugins that enhance the user's immersion, making it possible to create realistic soundscapes and positional sound effects.

For packaging, testing, and deployment, Unity gives developers straightforward ways to debug the application and create the final package for the target device.

One of the greatest strengths of working with Unity for VR/AR development is the engagement of its community and the number of resources and tutorials available across the internet. To learn more about MR development with Unity, there are a couple of tutorials you should check out: VR Development and Mobile AR Development.


Unreal Engine

Unreal Engine is a cutting-edge and widely used game engine developed by Epic Games. Tim Sweeney began building it in 1995, originally to power first-person 3D shooter games. Like Unity, it provides developers with a comprehensive set of tools and features to create high-quality 3D experiences, thanks to its advanced graphics capabilities, real-time rendering, and robust physics simulation. The engine supports programming primarily in C++, plus a visual scripting option called Blueprints that lets developers build complex game mechanics without writing a single line of C++ code. Unreal Engine also offers a wide array of features for building your own game, including a visual material editor, a powerful animation system, a physics engine, audio tools, and an extensive asset library.

For building MR experiences on mobile, Unreal is a great choice thanks to its robust features and extensive support for MR development on both Android and iOS. Once the target device is configured, developers can create a project from the VR-specific templates, which include pre-configured settings for MR input and interaction, or start from a default project and configure the MR settings manually. For input, it's possible to define how the user will interact with the app through Blueprint nodes, specifying the interface controls, locomotion, interaction between objects, and other game mechanics.

Sound is a vital element for creating believable and immersive environments, and Unreal's Audio Engine offers many ways for audio designers and engineers to mold audio to match their vision, making it possible to implement spatialization, audio effects, mixes, and sub-mixes. Like Unity, Unreal provides many options for packaging and deployment, alongside fully detailed documentation, making developers' lives much easier.

Unreal provides many plugins, such as ARKit, ARCore, and Vuforia support, that make it possible to integrate with mobile projects built in frameworks like Flutter; in contrast with Unity, however, Unreal Engine has no direct integration with React Native. And despite all of Unreal's great features, its community is not as big and engaged as Unity’s, which makes it harder to find tutorials and documents to guide the development of MR games, especially for mobile devices.


Apple’s ARKit and RealityKit

When building augmented reality specifically for iOS and iPadOS mobile devices, ARKit is an excellent framework to consider. It uses motion tracking, environment and scene understanding, light estimation, and rendering capabilities to integrate virtual content with the real world. It supports surface detection, object recognition, and facial tracking, and it handles complex tasks like camera calibration, motion tracking, and spatial mapping, letting developers focus on creating a compelling AR experience.

According to its documentation, ARKit 6 introduces the option of capturing a 4K video feed with the back camera, an advanced Depth API powered by the LiDAR Scanner, real-time motion capture, and improved object occlusion. Alongside ARKit, Apple also offers RealityKit, a framework that provides more control and customization over the AR experience, with high-quality rendering, animation features, a powerful physics engine, and spatial and mixed audio tools, making it an alternative to game engines like Unreal and Unity. For creating a 3D floor plan of a room, Apple introduced a Swift API called RoomPlan, powered by ARKit, whose real-time scanning makes it easy to represent indoor spaces, an amazing tool for architecture and interior design.
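To give an intuition for what the Depth API enables, here is a minimal, framework-free sketch of depth-based object occlusion. This is illustrative pseudologic in Python, not ARKit code: the function name, pixel format, and values are all hypothetical. The idea is simply that, per pixel, virtual content is drawn only where it is closer to the camera than the real scene measured by the depth sensor.

```python
# Hypothetical sketch of depth-based occlusion (not the ARKit API).
# For each pixel, draw the virtual object only where it is nearer to
# the camera than the real surface the depth sensor measured.

def composite_with_occlusion(scene_depth, virtual_depth, virtual_color, background):
    """Return the final color per pixel: virtual content shows through only
    where its depth (meters from the camera) is less than the scene depth."""
    out = []
    for s, v, c, b in zip(scene_depth, virtual_depth, virtual_color, background):
        if v is not None and v < s:    # virtual object in front of the real surface
            out.append(c)
        else:                          # real world occludes the virtual object
            out.append(b)
    return out

# A 4-pixel row: the real scene (e.g., a table edge) is 1.0 m away in the middle.
scene = [2.0, 1.0, 1.0, 2.0]
virtual = [1.5, 1.5, 0.5, None]        # None = no virtual content at that pixel
print(composite_with_occlusion(scene, virtual, ["V"] * 4, ["R"] * 4))
# -> ['V', 'R', 'V', 'R']
```

In a real renderer this comparison happens on the GPU against the LiDAR-derived depth texture, but the per-pixel test is the same in spirit.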

Google's Cardboard and ARCore

Google created Cardboard, an open-source SDK for creating multiplatform VR applications, and ARCore, a platform for building augmented reality experiences. An amazing feature of both projects is their multi-device support: both integrate with multiple development environments (Android NDK, iOS, and Unity), with ARCore additionally supporting Unreal and the Web.

The Cardboard SDK provides essential VR features, such as motion tracking, stereoscopic rendering, and user interaction via the viewer's button, making it possible to build entirely new VR experiences or enhance existing apps with VR capabilities. ARCore integrates virtual content with the real world using three key capabilities:

  • Motion tracking: using SLAM and IMU data, the phone combines feature points captured by the camera with its own pose and rotation to estimate its position relative to the world
  • Environment understanding: the phone detects the size, location, and boundaries of surfaces
  • Light estimation: virtual content is rendered with the scene's average light intensity and proper color correction
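To make the last capability concrete, here is a hedged sketch of the idea behind light estimation, written as plain Python rather than the ARCore API (the function and pixel format are invented for illustration): average the camera frame's brightness and per-channel balance, then use those values to tint virtual content so it blends with the real scene.

```python
# Illustrative sketch of light estimation (not the ARCore API): derive an
# average intensity and per-channel correction gains from a camera frame.

def estimate_light(frame):
    """frame: list of (r, g, b) pixels, each channel in 0..1.
    Returns (average_intensity, per-channel correction gains)."""
    n = len(frame)
    avg = [sum(px[c] for px in frame) / n for c in range(3)]
    intensity = sum(avg) / 3.0
    # Gains that rebalance each channel toward the frame's average hue,
    # so virtual content picks up the scene's warm or cool cast.
    gains = [a / intensity if intensity > 0 else 1.0 for a in avg]
    return intensity, gains

# A warm, dim frame: red-heavy pixels.
frame = [(0.6, 0.3, 0.3), (0.6, 0.3, 0.3)]
intensity, gains = estimate_light(frame)
print(round(intensity, 2), [round(g, 2) for g in gains])
# -> 0.4 [1.5, 0.75, 0.75]
```

A renderer would multiply the virtual object's material color by these gains and scale its lighting by the intensity, which is what makes a white virtual cube look correctly yellowish under warm indoor light.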

It's also possible to use ARCore with frameworks beyond native Android and Swift, such as Flutter, thanks to the arcore_flutter_plugin, and React Native, by using the ViroReact platform.

Other Tools

Alongside the tools, software, and frameworks shown above, there are many more platforms for building mobile MR applications. For React Native, there's a library called ViroReact: developers write in React Native, and Viro runs their code across mobile VR and AR platforms, including Google Cardboard, Android ARCore, and iOS ARKit. Another tool for building games is Godot Engine, a free, open-source, community-driven 2D and 3D game engine. Just like Unity and Unreal, it lets you build a game through the visual editor or programmatically in the code editor. Since version 3, it's possible to create MR experiences thanks to its AR/VR Server architecture, which can also be deployed to native mobile.

According to the VR Tutorial and Godot’s XR Tools documentation, it’s possible to set up a project with an MR camera, configure different kinds of movement, like walking, jumping, and flying, and develop interactions and different kinds of objects. For delivering and testing on the target device, you can check the Platform-Specific documentation, which covers integration with Android, iOS, and even HTML5.

Specifically for AR development, there are solutions like Vuforia, Wikitude, Spark AR, and AR Foundation, which ease cross-platform app development, simplify integration with ARCore and ARKit, or enable building AR experiences for social media.


Mobile Sensation Feedback

Considering the key aspects of MR technology, here are some of the many ways today's smartphones can provide immersive sensations to the user:

  • Visual: The screen is the smartphone's defining feature and the most important channel for feedback from the device. For MR applications in general, it's essential that the user perceives visual information through the screen and gets a response upon interacting with it. SDKs such as Google Cardboard make it possible to deliver a VR experience on a smartphone by splitting the image on the screen and showing each eye a slightly offset view of the scene, producing the sensation of depth
  • Hearing: Environment sounds, speech, and music are great ways to convey immersion and connection, and they also improve accessibility. VR apps usually deliver a deeper experience through headphones, which isolate the user from surrounding noise and can even emulate spatial sound, giving the user the sensation of being inside the scene. There are three main methods of spatialization: panning, which adjusts the relative gain between audio channels; sound-field spatialization, which uses a spherical harmonic representation of a sound field; and binaural audio spatialization, which takes advantage of psycho-acoustic phenomena to increase the quality of spatialization
  • Haptics: For input, smartphones have sensors that capture the user's tactile interaction, like the touchscreen and buttons, and kinetic information from accelerometers, making it possible to sense the user's touches and movements of the device. For output, most mobile devices have vibration motors, which can alert the user and draw their attention
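The simplest of the spatialization methods listed above, panning, is easy to sketch. The snippet below is a generic illustration in Python, not any engine's audio API: it uses constant-power panning, the common variant that keeps perceived loudness steady as a sound source moves from left to right.

```python
import math

# Illustrative constant-power panning: map a source's horizontal position
# to left/right channel gains so that total power (l^2 + r^2) stays 1.

def pan_gains(position):
    """position: -1.0 (full left) .. 1.0 (full right).
    Returns (left_gain, right_gain)."""
    angle = (position + 1.0) * math.pi / 4.0   # map -1..1 to 0..pi/2
    return math.cos(angle), math.sin(angle)

for pos in (-1.0, 0.0, 1.0):
    left, right = pan_gains(pos)
    print(pos, round(left, 3), round(right, 3))
# A centered source gets equal gains of cos(pi/4) ≈ 0.707 per channel,
# and left^2 + right^2 equals 1 at every position.
```

Multiplying a mono sample by these two gains before writing it to the stereo output is enough to make a sound appear to come from one side, which is the building block the richer sound-field and binaural methods improve on.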


Examples of Mobile MR Applications

From what we've seen, mobile devices are the main route to making an MR application widely used. With that in mind, here are some examples of successful MR applications:

  1. Gaming: As we've seen, many game engine platforms have implemented MR support, making it easier to develop games with VR techniques and, especially, AR using the phone's built-in camera. Pokémon Go is a popular example of an AR game that combines virtual creatures with the real world, but there are many others, such as Angry Birds AR: Isle of Pigs and the scary Five Nights at Freddy’s AR
  2. AR Navigation and Wayfinding: MR can enhance navigation applications by overlaying digital information, such as directions, points of interest, or real-time data, onto the user's real-world view. This provides an intuitive and interactive way for users to navigate their surroundings. Google Maps and Apple Maps incorporate AR features to guide users with visual overlays and directions
  3. AR Product Visualization: AR applications can allow users to preview and visualize virtual products in their real environment. This can be used in e-commerce applications to let users see how furniture, home decor, or other products would look in their own space before making a purchase, or even try on clothing, accessories, or cosmetics without physically wearing them. IKEA Place is an example of an app that enables users to place virtual furniture in their homes using AR. Another example is Sephora Virtual Artist, which offers virtual try-on for makeup products
  4. AR Education and Training: By overlaying interactive 3D models, information, or instructions onto the real environment, MR can enhance education and training in fields like medicine, architecture, or engineering, providing interactive learning experiences. AnatomyAR+ is an example of an app that lets users explore and learn about human anatomy through AR
  5. AR social media and Filters: MR features are commonly used in social media applications to provide users with filters, effects, and interactive experiences. Users can overlay virtual objects, masks, or effects on their photos or videos to create engaging and shareable content. Snapchat and Instagram offer a variety of AR filters and effects


Key Takeaways

Developing with MR is one of the key trends of the last few years, becoming a reality thanks to hardware and software improvements and community collaboration. With a wide variety of tools available to implement MR solutions, user experience design and team management become key factors in choosing the right tools and the right user interactions. That way, it’s possible to give users an enhanced experience, increasing their immersion and satisfaction when using an MR application.



This piece was written by Ian Moura, Fullstack Developer at Encora Inc. Thanks to Flávia Negrão, João Caleffi and João Pedro São Gregório Silva for reviews and insights.


About Encora

Fast-growing tech companies partner with Encora to outsource product development and drive growth. Contact us to learn more about our software engineering capabilities.
