Written By

Arie Williams



Stay up-to-date with OST blog posts.

February 22, 2023

An Introduction to Mobile Augmented Reality Applications

Image showing a mobile augmented reality application

At OST, we’ve had numerous conversations with clients and potential clients about augmented reality (AR). Augmented reality has become increasingly popular in the last few years and is an exciting frontier in the mobile space. According to Statista and the Digital Journal, the market value of AR applications is estimated to reach $200 billion or more by 2028. Additionally, 71% of consumers prefer to order online from shops that utilize AR if no brick-and-mortar shop is available.

For this article, we’ve created a (relatively) comprehensive guide to AR, discussing the basics of mobile augmented reality, looking at potential business use cases and evaluating development platforms that best suit your specific AR software needs.

What Is Augmented Reality?

Augmented reality integrates digital information with a user’s real-world space using technology to create an enhanced version of reality viewed through an interface — such as a mobile app. That is a high-level definition, but what does all that mean? 

Typically, the precise way augmented reality places items in the user’s 3D space varies from device to device. For mobile devices, the camera on the back of a user’s device can calculate and detect surfaces based on how light interacts with different planes. 

Below, we’ll break down the differences between iOS and Android devices:


On iOS devices equipped with LiDAR sensors, the device can scan and measure the distance between the camera and objects in its surroundings by emitting pulses of infrared light and timing their return, working much like radar but with light instead of radio waves. This sensor allows incredibly accurate mapping of surroundings and more seamless integration of virtual objects placed in a real-world space, enabling proper lighting and physics values to be attributed so the object reacts to the user’s plane more accurately. All the information gathered from the LiDAR sensor and camera is fed into ARKit, the iOS AR SDK, and translated to the model placed in the user’s environment.
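The ranging principle behind LiDAR can be sketched in a few lines. This is an illustration of the time-of-flight math only, not SDK code; the function name and the sample timing value are ours:

```python
# Time-of-flight ranging, as used by LiDAR: emit an infrared pulse,
# time its round trip, and convert that time to a distance.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface, given a pulse's round-trip travel time."""
    # The pulse travels out and back, so divide the total path by two.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 13.34 nanoseconds corresponds to a
# surface about 2 meters away.
print(distance_from_round_trip(13.34e-9))
```

Real sensors repeat this measurement across a grid of points many times per second to build a depth map of the scene.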


On Android devices, like iOS devices, the camera is used to detect surfaces. However, Android devices do not possess the LiDAR functionality found on recent iPhones. Instead, much of the heavy lifting happens programmatically. Information about light, distance and position is calculated through the ARCore SDK, which relies heavily on its ability to detect planes and surfaces. Though notably less accurate than measurements gathered through LiDAR, ARCore’s plane detection allows realistic light and physics values to be added to a model placed in the user’s real-world space.
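To give a sense of what programmatic surface detection involves, here is a toy sketch, not ARCore code, that fits a plane to a handful of 3D feature points by least squares. The function and the sample points are hypothetical; real plane detection adds filtering, clustering and tracking on top of math like this:

```python
# Fit a plane z = a*x + b*y + c to 3D feature points by least squares,
# a simplified stand-in for the surface detection an AR SDK performs.
import numpy as np

def fit_plane(points: np.ndarray) -> tuple:
    """Return (a, b, c) for the best-fit plane z = a*x + b*y + c."""
    # Build the design matrix [x, y, 1] and solve for the coefficients.
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return tuple(coeffs)

# Points lying on the floor plane z = 0; with a little sensor noise the
# fit would still recover coefficients near zero.
floor = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0],
])
a, b, c = fit_plane(floor)
```

Once a plane like this is known, the SDK can anchor virtual objects to it and compute plausible shadows and collisions.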


Common Types of AR

Now that we’ve explained what AR is, we’ll dive into several types of AR and how each functions differently while accomplishing the same general end goal: placing an object in the user’s 3D space. Below are the most common types of AR:

Markerless AR

Markerless AR does not require a predefined cue from the real-world environment, meaning no specific surface, color or shape needs to be present in the user’s space for a 3D model or object to render. Most of the time, however, the developer still accounts for plane detection so the objects are aware of things like the ground or walls.

For example, users of the Pokémon GO mobile game can view and interact with Pokémon placed in their environment. Amazon also uses markerless AR when rendering a product demo, such as furniture or paint. Their mobile application allows users to move objects within their space while taking the shape of the area into account and reacting accordingly.

Marker-Based AR

Marker-based AR requires a physical item or marker that the application recognizes and tracks to ensure the AR object is properly placed in the user’s space. Often the scene presented is related to its marker in some way. Marker-based AR is prevalent in art exhibitions and museums, where AR objects are associated with an artist’s collection. It is also commonly used on business cards to showcase 3D models in a compact, interactive format.
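Many marker systems encode an ID in a printed black-and-white grid, in the style of ArUco or QR markers, so the app knows which scene to attach. The following toy decoder is ours, not part of any AR SDK, and skips the hard parts (finding and de-warping the marker in the camera image); it only shows how a detected grid maps to an ID:

```python
# Decode a black/white marker grid into an integer ID, the last step of
# a marker-based AR pipeline after the grid has been located on camera.

def decode_marker(grid: list) -> int:
    """Read a grid of 0 (white) / 1 (black) cells as a binary marker ID."""
    marker_id = 0
    for row in grid:
        for cell in row:
            # Shift previous bits left and append the current cell.
            marker_id = (marker_id << 1) | cell
    return marker_id

# A 4x4 pattern with the first and last cells filled: the top bit
# (32768) plus the bottom bit (1) gives ID 32769.
pattern = [
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 1],
]
print(decode_marker(pattern))  # 32769
```

The app then looks the ID up in a table of scenes and renders the associated 3D content anchored to the marker's position.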

Projection AR

Projection AR does not require a mobile device to show AR objects. Instead, the object is superimposed onto a plane by a projector or other device so that multiple people in the same space can view the item without using individual devices. One example is projection mapping, which BMW recently used during a vehicle launch to show different customizable designs. Another example of projection AR is holograms, including holographic performers.

Why Are Organizations Using Mobile Augmented Reality?

With the growing adoption of e-commerce services, organizations understand the importance of utilizing digital solutions to optimize the selling process of products and services. And when it comes to AR, companies have several opportunities to drive new value. From furniture to fun, consumers and businesses are using AR in their day-to-day lives. Whether it’s developing a new frontier in mobile gaming, creating interactive maps and travel functions, designing room layouts, customizing vehicles, or creating an augmented reality medical room, there are numerous possibilities for businesses to invest in AR experiences.

For example, augmented reality product configurators allow consumers to easily configure custom designs and digitally position a product model in their real-world space. Through this virtual “try before you buy” experience, consumers can visualize, place and configure a product in context before making a purchase.




Best Mobile AR Development Platforms

Before starting development work, organizations need to decide what type of application they need to best fit their AR needs: native or cross-platform.

Native development lets developers take full advantage of platform-specific features. In contrast, cross-platform development enables you to build the application for both Android and iOS without creating two different applications that may look or behave differently.

Once the type of application is selected, it’s time to focus on what development tool to use. For this article, we will focus on the most fleshed-out and accessible tools for creating an application, highlighting the benefits and limitations of each option in the AR space.

ARKit (Native Swift Development)

ARKit is the main framework for creating iOS augmented reality applications in Swift (Apple’s object-oriented successor to Objective-C). With its newest version, ARKit 6, released in September 2022, Apple’s AR solution boasts plenty of new features that optimize the development process. Applications built on this framework typically combine Swift with SwiftUI.

Benefits
  • Swift is a safe, open-source, and fast programming language that reduces redundancy in code and makes upkeep an approachable task. 
  • ARKit smoothly integrates with SwiftUI, which utilizes declarative syntax to produce modern and clean designs. 
  • ARKit allows you to create high-resolution AR experiences. Coupled with the LiDAR sensor, it produces incredibly accurate and interactive scenes.
  • With motion capture, depth API, and scene geometry, ARKit is one of the most robust AR libraries in mobile development today.
  • A native iOS application can reach about 56% of the mobile device market share in the United States.

Limitations
  • This framework is iOS-specific, meaning any application created using ARKit and its related framework, RealityKit, is restricted to iOS devices.
  • The exclusivity also extends to the lack of support for cross-platform libraries that tap into the capabilities of ARKit, as Apple makes it incredibly difficult for developers to create SDKs for their frameworks. 
  • Limits which iOS users can access its AR functionality since ARKit is only accessible on iPhone 8 and newer.
  • Currently only supports Alembic (.abc), Wavefront Object (.obj), Polygon (.ply) and Standard Tessellation Language (.stl) file types. While these are the most standard formats, problems can arise if models are created through another platform.
  • Building exclusively in iOS restricts applications to less than 50% of the global mobile market share.

ARCore (Cross-Platform Development)
ARCore, released in March 2018, is Google’s platform for building augmented reality experiences. It is a cross-platform SDK that uses motion tracking and the device camera to keep track of points in space, creating an understanding of the world and the user’s position. This understanding allows the user to place objects, make annotations, or use the data to change how an application integrates with the world.
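The pose-tracking idea can be made concrete with a toy example. This sketch is not ARCore API code and the names are ours; it treats a camera pose as a 4x4 rigid transform and re-expresses a world-anchored point in camera coordinates, which is what lets a virtual object appear to stay put as the user moves:

```python
# Re-project a world-anchored point into the camera's coordinate frame,
# the core transform behind keeping virtual objects fixed in space.
import numpy as np

def world_to_camera(camera_pose: np.ndarray, world_point: np.ndarray) -> np.ndarray:
    """Transform a world-space point into camera space.

    camera_pose is the 4x4 camera-to-world transform reported by the
    tracker, so we invert it to go from world to camera coordinates.
    """
    point_h = np.append(world_point, 1.0)  # homogeneous coordinates
    return (np.linalg.inv(camera_pose) @ point_h)[:3]

# If the camera has moved 1 meter along +x, an anchor at the world
# origin should appear at (-1, 0, 0) in camera space.
pose = np.eye(4)
pose[0, 3] = 1.0
print(world_to_camera(pose, np.array([0.0, 0.0, 0.0])))
```

Running this transform every frame, with the pose updated by motion tracking, is what makes an anchored object look stationary while the phone moves around it.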

Benefits
  • ARCore has cross-platform capabilities which provide native APIs for several platforms, including Unity, iOS, Unreal Engine, and web. 
  • This platform uses Kotlin, Java or C as its primary programming languages, allowing developers to choose what best suits their needs.
  • ARCore’s ability to programmatically understand its surroundings without a dedicated depth sensor is accurate, and models integrate seamlessly into a scene when rendered.
  • The ability to calculate relative position and track user movement allows dynamic changes like light estimation and physics calculations to be quick and clean.
  • Utilizes Google’s Cloud Anchors API, allowing an application to share scenes between different devices: the same 3D object can be rendered and manipulated by multiple users simultaneously, with real-time updates reflecting all interactions.
  • Since ARCore is cross-platform, you can reach most of the global mobile market.

Limitations
  • Some of the biggest hurdles ARCore encounters are its relatively limited features for scene creation, such as the lack of body tracking, poor depth sensing, and absence of large-scale object handling. 
  • The platform is not supported on many Android devices; it only works on Android 8.1 or later.
  • The supported languages, Kotlin, Java and C, have limitations that can cause significant slowness when running an ARCore application.
  • Java and C can easily produce bloated code with limited scalability while using this SDK.
  • Kotlin, which runs on the Java Virtual Machine, removes a lot of Java’s clunkiness but has a steeper learning curve for AR, and AR documentation for all three languages is limited.

Unity (Cross-Platform Development)
Unity is a powerful rendering engine, first released in June 2005 and typically known for its game development capabilities. This cross-platform software provides an interface that gives organizations access to all the essential tools needed for 2D and 3D development in one place.

Benefits
  • Allows for cross-platform mobile development as its AR Foundation currently supports features from ARKit, ARCore, HoloLens, and more. 
  • Projects in Unity are scripted in C#, with native plugins possible in C++, both object-oriented languages. The code is typically modular, easy to update and reusable.
  • Code is often easier to maintain than other platforms, like ARCore or Xamarin, and can significantly reduce the redundancy that other programs may experience when writing out classes for AR objects.
  • Supports features of native Android and iOS devices, including device tracking, raycasting and 2D object tracking.
  • Animations, modifications, and optimizations of 3D models and their components are typically prioritized, creating high-quality renders.

Limitations
  • Unity’s AR Foundation has several limitations when building out Android-compatible applications, such as issues with rendering and compatibility issues with camera positioning.
  • Any cross-platform application created in Unity is still bound by ARCore’s limitations and restrictions on Android.
  • Unity projects are written in object-oriented languages, which can make applications more complex as features are added and can sometimes result in bloated code that affects scalability.

Xamarin Forms (Cross-Platform Development)
Xamarin Forms is an open-source UI framework from Microsoft that allows developers to build Android and iOS applications from a shared codebase. Xamarin added augmented reality capabilities in October 2018 and continues to be a popular cross-platform mobile app development framework.

Benefits
  • Xamarin projects are typically written in XAML with C# code behind, allowing the two platforms’ applications to share UI, code, tests, and business logic.
  • Developers can create platform-specific code for deeper customization and more utilization of platform-specific features.
  • Using C# and .NET eases updates and reduces redundancy, meaning a single development team could reasonably switch between various platform-specific designs.
  • Cross-platform development allows for a ‘write once, use everywhere’ approach that reduces duplicated effort.
  • Being able to integrate native code gives developers the ability to utilize features that aren’t available for both platforms without making the entire application structure suffer from one platform’s limitations.
  • Since Xamarin compiles apps into native binary, they can be as performant as native apps.

Limitations
  • While Xamarin is as close to native as possible, it can only achieve about 60% to 95% reusable code between platforms.
  • There is limited overlap between the way Android and iOS handle augmented reality, so any app created in Xamarin requires a large amount of platform-specific code.
  • The need to build out a lot of native code can be time-consuming and reduce the optimization of the code base. 
  • Frame rate throttling can occur when more complex models animate due to graphic and UI code sharing.

No matter what AR development platform is used, our team of experts will work alongside your organization to avoid and overcome the biggest challenges of developing a mobile AR application to drive immediate value.

OST: Your Mobile AR Development Partner

At OST, we help businesses of all sizes strategically develop, support and optimize AR applications. We can help your organization at any stage of development, from answering your AR questions to planning and creating your organization’s AR application. We have a proven process for working alongside clients, building alignment across teams, executing projects efficiently, supporting mobile products over the long term and analyzing data to drive new insights for your business. Whether you are seeking help with IoT devices, connected products, or AR applications, OST is ready to help you with all your mobile needs. Contact OST today!




About the Author

Arie Williams is a mobile developer and consultant at OST. With an abundance of client experience ranging from e-commerce to IoT development, she prides herself in her hunger for knowledge. When she's not working on projects, she either spends time working with local after-school programs to develop computing curriculum for young students, or she is out looking for the best food experiences Charlotte, NC has to offer.