Immersive Technology Enters Reality

By Adam Kimmel for Mouser Electronics

This article looks at the three types of immersive reality: virtual reality, augmented reality, and mixed reality. It examines how these technologies will change the human experience beyond gaming and entertainment, and how four enabling technologies are shaping their growth: processing power/thermal management, visual display/illumination, connectivity, and motion tracking.

How do you define reality? On the surface, the concept of reality is easy to understand. It is the world around us—all we see, all we feel in a given space. Reality is simply what exists.

At least, this is how we have viewed reality historically. However, with the proliferation of the Internet of Things (IoT), smart devices, and 5G, the premise of reality has changed from constant, predictable surroundings to a continuously evolving blend of real and virtual experience. We see and feel physical reality, but that is only part of the story. Physical reality is now complemented by immersive/extended reality (XR), which can add to it and create entirely new experiences.

Immersive realities have three principal types:

  • Virtual reality (VR)–An entirely simulated environment that lets the user feel immersed in a digital-only world.
  • Augmented reality (AR)–The enhancement of the physical environment by adding digital images and experiences. Users most often experience AR through smartphone cameras.
  • Mixed reality (MR)–The interaction between physical and digital objects. MR uses the strengths and benefits of both AR and VR to optimize the user experience.

Given the market saturation of smartphones, AR and MR are gaining popularity quickly. A 2021 Statista market report covering 2018-2023 provides the following insights into immersive tech:

  • The immersive technology industry’s global revenue is close to $6.3 billion (USD) as of 2020 and is forecast to continue growing exponentially for the next five to 10 years.
  • Virtual reality is the largest segment, although its growth rate is slowing, from 42 percent in 2020-21 to 10 percent in 2022-23.
  • Mobile augmented reality and augmented/mixed reality technology together generate $3.7 billion (USD) of that revenue, with the remaining $2.6 billion (USD) coming from VR.
  • Augmented and mixed reality are growing at higher rates, with MR experiencing exponential growth as the IoT expands.

Beyond smartphone cameras, the primary technology that delivers an immersive user experience is the XR headset. XR devices contain computational and processing elements, actuators, and sensors, among other components. These components link the two worlds together and enable transformative applications that benefit humanity. Examples include creating safe, simulated social interactions for children and aiding in the treatment of ailments ranging from phobias to pain and anxiety. Along with these clear healthcare benefits, XR also improves everyday convenience through smart cities and connected streetlights, aids in space travel, and supports thousands of other applications in between.

The intersection of the virtual and physical worlds can change humanity for the better. The following reviews each type of XR and outlines the roles played by the relevant signal chain components that enable this transformative technology.

Immersive Tech Frames of Reference

Since the invention of the technology in 1957, extended reality has captivated the scientific and consumer spaces. Although the terms virtual reality (1975) and augmented reality (1990) came much later, the drive to use computers and technology to enhance the human experience was already well underway.

It is worth defining the frames of reference of immersive tech to understand the signal chain and how relevant processing elements help deliver the experience.

Physical Reality

The simplest to comprehend, physical reality acts as the canvas on which the technology applies digital features. Using all five senses, we receive immediate, accurate feedback about the world around us. It is this notion of feedback that sets the guardrails for what the user can expect.

Physical Reality Example: Turning the Corner

A simple example of a shopping cart running into the corner of a grocery shelf illustrates this. A distracted shopper attempts to turn the corner of an aisle but hits the shelf corner. However, they quickly back up the cart, adjust the turning radius, and complete the turn.

The feedback from the shelf was that the turning radius was incorrect. This information provided the instant, accurate response the shopper needed to navigate the turn. Because physical reality is constant and not subject to digital processing latency, the feedback was instantaneous. In addition, the shelf is stationary, so its position is never in question.

The position and size of physical objects are critical input and feedback for creating digital realities, and physical reality engages all five senses. With that framework set and the canvas defined, programmers can create digital experiences from and within the physical environment.

Virtual Reality (VR)

Virtual reality is perhaps the best-known immersive experience. It serves to upgrade the human experience, providing users the chance to do things they could never have done otherwise given physical, financial, or chronological limitations. In addition, VR provides its operator with a fully immersive experience, simulating as many of the five senses as possible.

VR Example: Roman Rumble

Because VR creates an entirely simulated experience, it is possible to spend a day watching gladiators compete at the Roman Colosseum 2,000 years ago. The technology can ignite the senses to see, hear, smell, and feel the experience. Should a user be brave enough, they can participate in the games themselves, with all the historical accuracy of the day.

One of the primary differences between VR and AR is the location of the user within the environment. AR uses smartphone (or other) cameras to orient the user’s sight within the physical environment before adding digital features. VR digitally positions sightlines entirely within the digital environment, usually with a head-mounted display (HMD). The technology reacts to the user’s vision movement, so it is critical that the technology receives, processes, analyzes, and returns data logically and rapidly. The better job the technology does, the more natural the experience seems.

Augmented Reality (AR)

This form of extended reality is where the advancements in connected technology shine. Already present in our everyday lives with Google Maps, NFL’s yellow first-down line, and Pokémon GO (among many other applications), AR enhances what users see to add functionality or features.

The key to this technology is meeting users where they are in terms of the experiential difference the digital features add. The first-down line should be a subtle addition when watching a game so the technology does not distract the audience from the human action. Users want the Pokémon or street intersections on the map to be clearly visible, which calls for a more conspicuous digital feature.

AR Example: How High the Moon

Stargazers can employ AR when looking at constellations and planets in the night sky. The technology gives users information about what they see: the position, distance, and movement patterns of objects, along with context about their origins. AR provides a real-life planetarium visit from the comfort of home, and it answers many of the common questions asked by space fans of all experience levels.

AR offers helpful and entertaining features and gains popularity with each new connected application that improves the experience. Image capture and mapping are essential to how well AR works. When done correctly, the digital features feel as if they are present in real life, behaving like physically present objects.

Mixed Reality (MR)

Even experienced developers and users of immersive tech often conflate the definitions of MR and AR. While both combine physical and virtual features, MR refers to the interaction between the two rather than digital elements simply overlaying the physical. The utility of this interaction is driving mixed reality to the highest year-over-year market growth, according to the Statista report. The example of engineering CAD design helps illustrate the primary difference between MR and AR.

MR Example: A Winning Quality

In an Industrial IoT setting, applying MR to a production process can determine whether a machine's tolerance is sufficient, validating its specification. Quality engineers can project a finished part geometry over a partial physical part to predict whether the components are dimensionally accurate. Achieving this level of predictive accuracy early in the process can decrease scrap rate and increase throughput, improving operating efficiency and reducing cost. MR can also remotely guide mechanics through a detailed repair process.
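The go/no-go check described above can be sketched in a few lines of Python. The feature names, nominal dimensions, and tolerance band are hypothetical, intended only to illustrate the kind of comparison an MR quality tool might run behind its overlay:

```python
# Hypothetical dimensional check of the kind an MR quality tool might run:
# compare measured part features against nominal CAD values.
# Feature names, nominals, and the tolerance band are illustrative only.

NOMINAL_MM = {"bore_diameter": 25.00, "flange_width": 80.00, "hole_spacing": 40.00}
TOLERANCE_MM = 0.05  # assumed symmetric tolerance band

def check_part(measured_mm):
    """Return a dict mapping each feature to True (in spec) or False (out of spec)."""
    return {
        feature: abs(measured_mm[feature] - nominal) <= TOLERANCE_MM
        for feature, nominal in NOMINAL_MM.items()
    }

measured = {"bore_diameter": 25.03, "flange_width": 80.08, "hole_spacing": 39.98}
report = check_part(measured)
# flange_width deviates by 0.08 mm, outside the 0.05 mm band, so it fails
```

In an MR workflow, the measured values would come from the headset's depth sensors and the pass/fail result would be rendered directly on the part.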

MR has the chance to be massively disruptive, finally bridging the gap between humans and machines like never before. As in VR and AR, physical object positional accuracy and digital feature response times are critical to achieving the desired results with MR.

Technical Signal Chain Components

Immersive technology has far-reaching benefits for humankind. These advantages are only as good as the technology can deliver, however. Like the supply chain for sourcing materials, the signal chain comprises the network and order of components that provide the XR experience. Processing power/thermal management, visual display/illumination, connectivity, and motion tracking are the limiting technologies to unlock immersive tech’s full power.

Processing Power/Thermal Management

The more intensive the display, the higher the power required to supply the digital features. Integrating power-dense thermal management with the processors ensures the technology does not distract from the immersive experience. In addition, the system must sufficiently cool the increasing processing load to protect the user from equipment failure, especially in medical and human-care applications. The data center cooling market is helping to drive processing and chip-cooling solutions that can benefit XR.
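A first-order way to reason about this cooling constraint is the steady-state thermal-resistance model, T_junction = T_ambient + P × θ_JA. The sketch below uses assumed values, not figures from any actual XR processor datasheet:

```python
# First-order steady-state thermal check: T_junction = T_ambient + P * theta_ja.
# All values are illustrative assumptions, not from any real XR processor.

T_AMBIENT_C = 35.0       # air inside a worn headset runs warmer than the room
T_JUNCTION_MAX_C = 95.0  # assumed silicon temperature limit

def junction_temp_c(power_w, theta_ja_c_per_w):
    """Steady-state junction temperature for a given load and thermal path."""
    return T_AMBIENT_C + power_w * theta_ja_c_per_w

passive = junction_temp_c(power_w=6.0, theta_ja_c_per_w=12.0)  # 107 C: over the limit
active = junction_temp_c(power_w=6.0, theta_ja_c_per_w=8.0)    # 83 C: within the limit
```

The comparison shows why a denser thermal solution (lower θ_JA) matters more than raw power reduction once display loads grow: the same 6W chip passes or fails on the cooling path alone.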

Visual Display/Illumination

The eyes are the first sense to engage with immersive tech, so display quality is critical. A display with an illumination strategy that lets the user's vision transition from real to virtual (and back) provides the transformative experience the user seeks. Increasingly high-resolution cameras offer more realistic experiences, and balancing real-life and digital illumination can smooth the boundary between the two realms. Smartphone and display makers pace the development of this component.


Connectivity

5G will be the most transformative connectivity technology. For the processing elements to access data for rapid analysis, connectivity cannot be in question. The amount of data required to render realistic environments will only increase as systems transmit more of it to the user. The high speed and low latency of 5G will propel XR into applications where the technology's real-time response to user behavior is non-negotiable. The widescale rollout of 5G also enables decoupling data analysis from a central hub, processing more of the data at the device level (the edge). Edge processing reduces the time and distance data must travel.
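A back-of-the-envelope latency budget illustrates why edge processing matters. All figures below are assumed; XR guidance often cites a motion-to-photon budget of roughly 20 ms for a comfortable real-time response:

```python
# Back-of-the-envelope motion-to-photon latency budget (all values assumed).
# XR comfort guidance often cites ~20 ms; the figures below are illustrative.

def round_trip_ms(network_ms_one_way, processing_ms):
    """Total latency: capture -> network -> processing -> network -> display."""
    return 2 * network_ms_one_way + processing_ms

cloud = round_trip_ms(network_ms_one_way=40, processing_ms=5)  # distant data center
edge = round_trip_ms(network_ms_one_way=3, processing_ms=8)    # nearby 5G edge node
# cloud: 85 ms total; edge: 14 ms total. Only the edge path fits a ~20 ms budget,
# even though the edge node has less processing power and takes longer to compute.
```

The arithmetic makes the trade-off concrete: network distance, not compute speed, dominates the budget, which is why 5G's edge architecture is central to real-time XR.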

Motion Tracking

Because minor shifts in vision patterns and motion matter, immersive technology must be highly sensitive to user movement. Sensors collect data from the environment and capture the user's motion, while actuators translate the system's response into physical feedback. Autonomous vehicle technology is pulling the development of both of these components, especially sensors. For accurate interaction between the physical and digital environments, the sensors must build an accurate map of the physical space, and the actuators must deliver the response as intended.
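One common technique for fusing these sensor streams into a head pose is a complementary filter, which trusts the gyroscope over short intervals and the accelerometer's gravity reference over longer ones. The single-axis sketch below uses assumed sensor readings, sample rate, and blend factor purely for illustration:

```python
import math

# Minimal complementary filter for one head-rotation axis (pitch).
# Sensor values, sample rate, and blend factor are illustrative assumptions,
# not taken from any particular headset IMU.

ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term

def update_pitch(pitch_deg, gyro_dps, accel_y, accel_z, dt_s):
    """Blend integrated gyro rate with the accelerometer's gravity reference."""
    gyro_pitch = pitch_deg + gyro_dps * dt_s                  # dead-reckon from rate
    accel_pitch = math.degrees(math.atan2(accel_y, accel_z))  # absolute, but noisy
    return ALPHA * gyro_pitch + (1 - ALPHA) * accel_pitch

# Simulate 100 samples at 1 kHz while the head tilts at a steady 10 deg/s.
pitch = 0.0
for _ in range(100):
    pitch = update_pitch(pitch, gyro_dps=10.0, accel_y=0.17, accel_z=0.98, dt_s=0.001)
```

The blend suppresses gyro drift without inheriting accelerometer jitter, which is why variants of this filter (and the related Kalman filter) appear throughout motion-tracking hardware.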


The benefits of seamlessly integrating immersive technology are clear. This transformative technology movement improves entertainment and helps society by improving the products we develop, medical care and procedures, safety when traveling, and the quality and depth of education. Developments in the areas below will fuel this integration:

  • Processing power and cooling from the data center industry
  • Display and illumination led by smartphones and display manufacturers
  • Connectivity led by 5G manufacturers and network service providers
  • Motion tracking led by sensor and actuator electronics manufacturing

Each of these technologies carries the risk of limiting immersive technology's growth, and it will take a combined effort to develop each segment to realize the exponential gains the market reports predict. But given the benefits extended reality offers humanity, the reward is clearly worth the investment.