#3 AR – a problem solver in the fashion industry?

The aim of this blog entry is to outline the main problem online apparel retailers are facing.

Do we have a problem?

Indeed. It’s the holiday season, which means it’s the most popular time of the year for online shopping. A quarter of holiday shoppers already intend, at the moment of purchase, to return items at a later date. As a result, e-commerce brands register the most returns throughout December, January and February. But high e-commerce return rates occur throughout the year and across all industries. With a return rate of 12.2%, apparel retailers struggle with the second-highest return rate of all industries and lie above the average return rate of 10.6%.

A closer look at return rates by category shows that womenswear has the highest return rate at 23%, followed by footwear at 20%. Although the figures presented here relate to the combined product categories of multichannel retail, they reflect the trend in digital sales.

But why does the fashion industry have so many returns?

It is a fact that returns are the new normal of e-commerce and central to the customer experience. Many customers buy with the explicit intention of immediately returning some or all of their items. Around 70% of fashion returns are related to size and fit, a challenge that doesn’t occur when ordering a TV or a coffee table, for instance. Size and fit contribute to the problem because customers who are unsure what size they need order multiple sizes of one item, fully intending to send back the rejects.

Consumer preference-based reasons like size, fit and style drive the majority of returns of fashion items. Non-preference-based reasons like defective products account for about 10% of fashion returns. The return reasons and their respective shares are listed below.

  1. Size too small: 30%
  2. Size too large: 22%
  3. Changed my mind: 12%
  4. Style: 8%
  5. Not as described: 5%
  6. Defective: 5%
  7. Other or not specified: 18%
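Just for illustration, the preference/non-preference split can be tallied from the list above; a minimal sketch (the grouping of reasons into the two buckets is my own reading of the list, not from the source):

```python
# Return reasons and their share of fashion returns (from the list above).
return_reasons = {
    "size too small": 30,
    "size too large": 22,
    "changed my mind": 12,
    "style": 8,
    "not as described": 5,
    "defective": 5,
    "other or not specified": 18,
}

# Assumed grouping: size, fit and style-related reasons are preference-based.
preference = {"size too small", "size too large", "changed my mind", "style"}
non_preference = {"not as described", "defective"}

preference_share = sum(v for k, v in return_reasons.items() if k in preference)
non_preference_share = sum(v for k, v in return_reasons.items() if k in non_preference)

print(preference_share)      # 72, the clear majority
print(non_preference_share)  # 10, matching the ~10% stated above
```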

And why is returning clothes a problem?

The high return rates are not only an economic problem for retailers but also an environmental issue, given that free shipping and returns come at a high, unsustainable cost. It’s estimated that return shipping in the US alone creates 15 million tons of carbon emissions per year, the same amount produced by five million people. Moreover, in some cases returned items are destroyed by the retailer for cost and time reasons (see picture below).

So by reducing the likelihood of returns, retailers shrink their brand’s carbon footprint. The fewer the returns, the fewer resources are wasted.

How can online fashion retailers reduce their return rates?

For starters, brands need to check that all public-facing content—including product descriptions—is accurate and detailed. If the product arrives different from what was expected, there’s a high chance of it being returned.
To provide more detail on an item’s fit and size, AR technology comes into play.

Using AR on e-commerce websites

Buying clothes in-store naturally has some advantages over shopping online. The ability to see, try on, and interact with products in person before purchase makes returns less likely. To bring those advantages to online shopping, AR technology should offer customers an in-store-like experience when shopping via a device. Retailers can use AR to show what their products look like tried on, in a customer’s home, or next to an item the customer owns for size comparison.
AR addresses the reason behind the majority of returns: the fact that the item looks different in person than it did online.

As a further step, my upcoming blog entry will focus on the market of AR in general and Virtual Fitting Rooms in detail.

References

https://fashionunited.de/nachrichten/mode/augmented-fashion-experience-wie-augmented-reality-kunden-fasziniert-und-zum-mode-accessoire-werden-koennte/2020051935697

https://www.shopify.com/enterprise/ecommerce-returns

https://www.reboundreturns.com/blog-articles/returns-conundrum-blog

https://f.hubspotusercontent10.net/hubfs/2182667/The%20Returning%20Conundrum.pdf?utm_campaign=IMRG%20Report%20-%20July%202021&utm_medium=email&_hsmi=140696218&_hsenc=p2ANqtz-8K4OK-9cu2w0SgS3F7Hxhq9BfXJAwV9iAP11ROVu-_9SJt3PfMrvfHiKB03e0g880hjiNyvU0JH2BwNbpUqlJFCq-f5w&utm_content=140696218&utm_source=hs_automation

https://3dlook.me/blog/how-to-impelement-ar-in-fashion/

Storytelling with Augmented Reality | Part 2

In the last post, I gave an overview of the technical aspects of Augmented Reality storytelling and its three main components. In this post, I want to focus more on the story itself. I want to give an insight into Interactive Storytelling, which can be combined with Augmented Reality to create an immersive experience for the user.

Interactive Storytelling

Interactive stories are stories that can and need to be influenced by the user. Throughout the story, the user needs to make decisions in order to continue, and these decisions influence the further course of the story. The user is no longer a passive observer of a linear story but can be an active part of it.

An interactive story is usually divided into different parts. At the end of a storyline, the user is asked to make a decision based on the different options provided. After making the decision, the user is forwarded to another storyline.

The term is sometimes used synonymously with digital and transmedia storytelling (storytelling through the use of digital media), but this is not always the case. Interactive Storytelling can also be applied in books, for example. At some point in the story, the reader has to make a decision and has several different choices to choose from. Depending on the decision they make, they turn to a certain page where the story continues.

Use of Interactive Storytelling

Interactive Storytelling often finds its use in marketing. There are several campaigns which make use of Interactive Storytelling to promote products. But Interactive Storytelling can also be used to communicate social and otherwise difficult topics. One example is a campaign by “Wiener Linien”, who created an interactive campaign to educate about civil courage. Another example is from the “International Red Cross”, which made a spot to raise and show awareness of work in crisis areas.

Common Structures

There are several different options to structure an interactive story. These are some of the most common structures:

Branching Narrative

A relatively classic narrative structure in which viewers can make increasingly far-reaching decisions about the course of action. The narrative branches into different endings depending on the choices made. Depending on how many branches the narrative contains, this type of structure can get very complex very quickly.
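One way to make the branching structure concrete is as a graph of story nodes, where each choice leads to another node. A minimal sketch (the node names and texts are invented for illustration):

```python
# A branching narrative as a dict of nodes; each choice points to another node.
# Nodes with no choices are endings.
story = {
    "start":  {"text": "You find a sealed door.",
               "choices": {"open it": "inside", "walk away": "leave"}},
    "inside": {"text": "A hidden workshop.", "choices": {}},
    "leave":  {"text": "You never learn what was behind it.", "choices": {}},
}

def endings_reachable(story, node):
    """Count distinct endings reachable from a node (assumes no cycles)."""
    if not story[node]["choices"]:
        return 1
    return sum(endings_reachable(story, nxt)
               for nxt in story[node]["choices"].values())

print(endings_reachable(story, "start"))  # 2
```

Counting reachable endings like this is one way to see how quickly the structure grows: every extra choice point multiplies the branches an author has to write.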

Fishbone Narrative

This is a traditional linear structure that allows viewers to explore sub-stories but keeps bringing them back to the main theme of the story. This structure still gives the author a lot of control over the route viewers take through the project.

Parallel Narrative

With this structure, viewers are offered choices in the story on the one hand, and on the other hand are repeatedly returned to the main theme of the narrative for decisive moments.

Threaded Narrative

The perfect structure for telling a story from multiple angles. The story consists of a number of different threads that develop largely independently of one another; these threads can be linked to one another or remain completely separate.

Concentric Narrative

In this structure, there are different storylines which orbit around a shared central point. Viewers are provided with different entry points and get to choose one. No matter which entry point the viewer chooses, in the end they will always return to the core.


#2 Virtual Fitting Room – let’s see behind the curtain

In my second blog entry, I dive deeper into the topic of the technology behind virtual fitting rooms – Augmented Reality. My research should lay the groundwork for understanding how AR works and how it can be used in a fashion context.

And what exactly is Augmented Reality?

Augmented Reality (AR) is available on any camera-equipped device – mostly smartphones and tablets – on which the corresponding AR software is installed. AR adds digital content to a device’s live camera feed, making the digital content seem to be part of the real world. Unlike Virtual Reality (VR), which replaces reality with a completely digital environment, AR enhances the real world with overlaid digital information or virtual details, which means the real environment remains central to the user experience.

Widely known and used examples of AR include Pokémon Go and the IKEA app. The picture below shows a range of different applications and fields taking advantage of AR.

But how does AR work?

When a user points the device’s camera at an object, the software recognizes it through computer vision technology, which analyzes the content of the camera feed. The device then downloads information about the recognized object from the cloud and presents the AR information as an object overlay in a 3D experience. The content displayed is part real and part virtual. Computer vision determines an object in terms of semantics (what) and 3D geometry (where): it first recognizes the object, then understands its 3D position and orientation. With this geometry, the rendering module can display the AR content at the right place and angle, which is essential for a realistic AR experience. AR is real-time 3D – in other words, it is live – which means the process above has to occur every time a new frame comes from the camera (most smartphones today work at 30 fps). As a result, when the device moves, the size and orientation of the augmentation adjust to the changed context automatically.
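The “right place and angle” part can be illustrated with the standard pinhole projection, one common way renderers map a 3D position onto the screen. A simplified sketch (the focal length and coordinates are made-up values, not taken from any specific AR SDK):

```python
def project_point(x, y, z, focal_px=800, cx=640, cy=360):
    """Project a 3D point in camera coordinates (metres, z pointing forward)
    onto a 1280x720 image plane using a pinhole camera model."""
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return u, v

# An object 2 m ahead and 0.5 m to the right: overlay lands right of centre.
u, v = project_point(0.5, 0.0, 2.0)
print(u, v)  # 840.0 360.0

# Twice as far away: the on-screen offset from centre halves, which is why
# the augmentation shrinks automatically as the camera backs off.
u_far, _ = project_point(0.5, 0.0, 4.0)
print(u_far)  # 740.0
```

In a real AR pipeline this projection is recomputed for every camera frame using the pose that computer vision just estimated, which is what keeps the overlay glued to the object.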

Finally: AR & fashion?

Yes – one field of application for AR is virtual try-ons. However, the experience should go beyond merely trying on clothes, and it has to be said that AR cannot yet completely replace a real try-on due to limited display quality. The key aspect of such experiences is the part-real, part-virtual nature of AR (person = real, garment = virtual).

According to Vogue Business, the “AR clothing try-on is nearly here” and it “is getting closer to reality, and the pace of acceleration is increasing”. Companies and start-ups are working on improving try-on capabilities with updates including 3D body meshes to define 3D shapes and cloth simulation to model fabric behavior more precisely. Since there are many interesting possibilities for both in-shop and online experiences, investors and tech companies see great potential in AR clothing. The value of the technology extends beyond its entertainment aspect.

In my next blog entry, I will focus more on the benefits of AR clothing and how it could potentially solve current problems concerning the fashion industry with a focus on e-commerce.

References

https://hbr.org/2017/11/how-does-augmented-reality-work

https://www.blippar.com/blog/2018/08/21/what-is-augmented-reality-and-how-does-augmented-reality-work

https://medium.com/@shivsoni377/the-difference-between-augmented-reality-virtual-reality-and-mixed-reality-a028bdd81f9d

https://www.robotlab.com/blog/what-is-the-difference-between-ar-and-vr

https://www.voguebusiness.com/technology/why-ar-clothing-try-on-is-nearly-here

Further automotive AR examples

| continuing my last blog entry on examples

In the past week I was searching for further examples of AR implementation in cars’ user interfaces. After three weeks of research into this topic, I currently have the impression that the industry is mainly focusing on visual augmentation as an aid for the driver. Here are some further examples that offer new aspects and features.

GMC Sierra HD trailer camera

The 2020 GMC Sierra HD pickup truck featured a novel implementation of AR technology. The truck was designed to tow heavy-duty trailers and has a total of 15 cameras to help the driver navigate the really large vehicle. One camera can be mounted at the back end of the trailer, looking at the road behind. Its image can then be blended into the truck’s built-in rear camera view, making the trailer almost disappear. [3]

I personally find the solution to be a nice gimmick but would question its practical benefit. The view certainly doesn’t help with manoeuvring the attached trailer.

GMC trailer camera [3]

Land Rover Clearsight ground view

Looking at special utility solutions, Land Rover has also implemented camera augmentation in the Evoque and Defender models – on the main screen. The system works with cameras on the side mirrors and on the front grille and helps the driver see a 180° ground view in front of the car and between the front wheels, below the normal field of vision. As Land Rover targets off-road enthusiasts, this feature could be welcome for revealing dangerous obstacles on rough terrain, or simply when climbing steep hills. A similar system is also implemented in the higher-class Bentley Bentayga SUV. [7]

Besides this “transparent bonnet” system, Jaguar Land Rover was also conducting research on a “transparent pillar” solution. It should help drivers in more urban environments see their surroundings in 360°, including objects hidden by the roof pillars, with the help of cameras and AR. The research was done in 2014, and I couldn’t find any further outcome of the idea. Additionally, they have shown a unique AR navigation aid: a ghost car projected in front of the driver that has to be followed along the route. [8] [9]

Jaguar transparent pillar and ghost car concept [8]

Panasonic’s state-of-the-art AR HUD

Panasonic Automotive is another supplier (like Continental, etc.) developing onboard systems for automotive OEMs, such as an AR head-up display with high-end features. The product was shown at the CES 2021 exhibition and is claimed to be implemented in a series-production car of an unknown brand in 2024. The system stands out from other existing HUDs with the following features:

  • AI software for 3D navigation graphics, supporting smooth responses to sudden changes ahead of the car. It also uses information from all the onboard ADAS systems (e.g. a 180° forward-facing radar with 90 m range) and generates updates in less than 0.3 seconds. (The spatial-AI AR navigation platform is a patent by Phiar.)
  • Eye-tracking technology to ensure that the driver always sees the projected information in the right place regardless of head movement.
  • Vibration control: image stabilization for bumpy roads.
  • Advanced optics, 4K resolution with laser and holography technology (by Envisics).

[5] [6]

To cover all relevant sensory fields in a car interior interface, next week I want to focus on the topic of haptics and tactile feedback solutions on the market.

Sources

[1] YouTube video from Roadshow: Car Tech 101: The best ways AR is being installed in cars | Cooley On Cars. Retrieved on 11.12.2021.
https://www.youtube.com/watch?v=PHhvCRexjWQ

[2] YouTube video from Roadshow: 2020 GMC Sierra HD: Heavy-duty hauler debuts “Transparent Trailer” tech. Retrieved on 11.12.2021.
https://www.youtube.com/watch?v=U0gZ9HaCWsA

[3] Online article by Road And Track: The 2020 GMC Sierra HD Can Make Your Trailer “Invisible”. Retrieved on 12.12.2021.
https://www.roadandtrack.com/new-cars/future-cars/a26009122/2020-gmc-sierra-heavy-duty-invisible-trailer/

[4] YouTube video by About Cars: Panasonic’s Innovative Augmented-Reality HUD Could Be in Cars by 2024. Retrieved on 12.12.2021.
https://www.youtube.com/watch?v=cLgMnSTSxog

[5] Online article by Panasonic: Panasonic Automotive Brings Expansive, Artificial Intelligence-Enhanced Situational Awareness to the Driver Experience with Augmented Reality Head-Up Display. Retrieved on 12.12.2021.
https://na.panasonic.com/us/news/panasonic-automotive-brings-expansive-artificial-intelligence-enhanced-situational-awareness-driver

[6] Online article by Auganix.org: Panasonic collaborates with Phiar to bring real-world AI-driven Augmented Reality navigation to its automotive solutions. Retrieved on 12.12.2021.
https://www.auganix.org/panasonic-collaborates-with-phiar-to-bring-real-world-ai-driven-augmented-reality-navigation-to-its-automotive-solutions/

[7] Online article by Car Magazine: Does it work? Land Rover’s ClearSight handy X-ray vision tech. Retrieved on 12.12.2021.
https://www.carmagazine.co.uk/car-news/tech/land-rover-clearsight/

[8] Online article by Autocar: Jaguar Land Rover previews transparent pillar technology. Retrieved on 12.12.2021.
https://www.autocar.co.uk/car-news/new-cars/jaguar-land-rover-previews-transparent-pillar-technology

[9] Online article by Jaguar: Jaguar Land Rover Develops Transparent Pillar And ‘Follow-Me’ Ghost Car Navigation Research. Retrieved on 12.12.2021.
https://media.jaguarlandrover.com/news/2014/12/jaguar-land-rover-develops-transparent-pillar-and-follow-me-ghost-car-navigation

Automotive AR examples

| looking at some state-of-the-art examples of in-car AR systems on the market

MBUX – the newest infotainment system of Mercedes-Benz

2018 was the year Mercedes-Benz introduced its newest infotainment system, called MBUX. It uses the front camera (originally used for parking) to create a live stream of the road ahead, combined with graphics for navigation hints or finding addresses. Since then it has been continuously improved; the latest version was revealed in 2021 in the S/EQS-Class models, featuring an AR navigation display and a HUD with distance assist, lane-keeping assist and dynamic arrows showing directions.

Video demonstrations of the 2021 MBUX system:
https://www.youtube.com/watch?v=hnRbi5UcJnw
https://www.youtube.com/watch?v=DCgy3askMcM

Audi AR HUD

Audi announced its augmented reality HUD as an optional feature for its newest high-end electric SUV, the Q4 e-tron, for 2021. The visual information shown in front of the driver is similar to MBUX’s content. Audi explicitly defines two areas: the status field (at a visual distance of ca. 3 m) and the AR field (at a visual distance of ca. 10 m), which seems to be bigger than in the German competitor’s solution.

Demo video of the Q4 e-tron HUD:
https://www.youtube.com/watch?v=Ea6o-_smVk8

Hyundai and WayRay

Looking further at HUDs, Hyundai/Genesis was the first brand to implement a laser-holographic AR head-up display, in its G80 model, developed by the young AR company WayRay and presented in 2019. It is said to have tremendous benefits over past HUDs (which use reflected LCD screens) in terms of precision and visibility for the driver.

The Swiss startup WayRay claims to be the only company to have implemented holography in HUDs. The holographic optical elements (HOEs) in its displays should provide unprecedented 3D images while remaining transparent and able to conform to curved windshields. The company presents its uniqueness in the field by covering “deep-tech” holography hardware development (e.g. blue-laser beams) as well as software development, all realised in-house.

They have already received substantial funding from Hyundai and Porsche, have presented a 180° AR cockpit experience, and offer various add-on solutions for vehicles, boats and airplanes. Their newest project is a shared-car concept (“Holograktor”) designed for the “Metaverse”, with full gaming/working/learning possibilities while traveling autonomously. In their cooperation with Pininfarina on a concept car, they proposed “True AR” displays for the side windows as well, providing new kinds of passenger infotainment and entertainment experiences.

A report from FIA Formula E on WayRay’s developments also predicts the use of HUD systems in race cars in the future. Drivers could get visualized ideal racing lines, braking points or a ghost car to chase on the track.

Hyundai’s In-Car Noise Cancelling

Besides HUDs, Hyundai is pushing the development of AR solutions in cars in other respects as well. As in our headphones, noise cancelling has also found its way into car interiors, bringing more comfort to the passengers. According to Hyundai, earlier systems were only capable of masking steady engine noises, but its newest solution (“Road Noise Active Noise Control”) in the upcoming Genesis GV80 will be capable of cancelling different tire noises at different speeds. It uses multiple microphones placed directly in the wheel wells, accelerometers, amplifiers and a digital signal processor. As a result of the complex calculations for each individual wheel, the in-car noise should be reduced by half (3 dB).
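The principle behind active noise control can be sketched in a few lines: the system emits a phase-inverted copy of the measured noise, and the quoted 3 dB corresponds to halving the sound power. A simplified single-tone sketch (the sample rate and frequency are invented, and real systems must adapt continuously rather than invert a known signal):

```python
import math

def residual_power(noise, anti_noise):
    """Mean power of the sum of the noise and the emitted anti-noise."""
    residual = [n + a for n, a in zip(noise, anti_noise)]
    return sum(x * x for x in residual) / len(residual)

# A pure 100 Hz tone sampled at 8 kHz stands in for road noise.
noise = [math.sin(2 * math.pi * 100 * t / 8000) for t in range(8000)]

# Perfect anti-noise (an exact inverse) cancels the tone entirely...
print(residual_power(noise, [-x for x in noise]))  # 0.0

# ...while a 3 dB reduction, as quoted for the GV80, means half the power:
print(round(10 * math.log10(0.5), 2))  # -3.01
```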

Engine sound enhancements

Writing about car noises, we also have to take a short look at the opposite of noise cancelling: engine sound enhancement devices. Due to the downsizing of engine displacements, the roar coming from combustion has also been reduced. To keep the emotions connected to sporty engine sounds, though, manufacturers use additional devices to create compensating sound effects.

These can be pipes from the intake manifold connected to the dashboard walls, in some cases with an extra flap to control the sound throughput only in sporty driving situations (Toyota, Ford, Porsche).

BMW was known to amplify engine sound through a synthesised reproduction of the actual engine noise, played simply over the car’s speakers.

The Volkswagen Group made it a bit more complicated by adding a special speaker device (“Soundaktor”) below the windshield to produce deep, buzzing tones resembling larger engines. In some models there are also speakers built into the exhaust pipes to alter the natural noises coming from the engine and make them more emotional or masculine.

Sound module for the Mercedes G350d

3D ADAS system of Arkamys

Beeping noises have existed in cars for many years, with the intention of helping drivers. But beeping by itself is not always enough to give an understandable signal about what is happening or dangerous around the driver. The company Arkamys presented an intuitive alerting concept for Advanced Driver Assistance Systems – parking, lane keeping, blind spot and other assistants – by placing many different speakers inside the cabin and generating a 3D sound experience. This makes it possible to signal the direction from which a possible danger is coming, making the information easier and more intuitive for the driver to recognize and process.

Electric cars

Electric cars are further good examples where in-car noise generators are used to give the driver and passengers the familiar feeling of vehicle driving dynamics. Porsche is a perfect example of specific sounds being developed to represent the brand’s identity within the driver experience. Its system, called “Porsche Electric Sport Sound”, enhances some natural noises of the drivetrain while reducing disturbing ones, and implements sounds to comply with the legal regulations for electric vehicle alerting sounds.

Thinking further about sound augmentation in cars, probably the most widespread system is the parking assistant, which gives beeping feedback on the remaining distance to obstacles around the car. The design of these systems could probably fill a chapter of its own, but as it is already an everyday tool, I won’t go into further detail on it.

The examples listed above are not even close to a complete list of use cases. Therefore, I want to research the current technologies further. The next step will then be to look into the reasons for these systems: why they were developed and what practical needs, feelings and experiences are the underlying causes.

Sources

Online article on Wired: With In-Car AR, Drivers Get a New View of the Road Ahead. Retrieved on 05.12.2021
https://www.wired.com/story/in-car-ar-drivers-get-new-view-road-ahead/

Article on Wired: Hyundai’s Luxury SUV Mixes Mics and Math for a Silent Ride. Retrieved on 05.12.2021
https://www.wired.com/story/hyundai-genesis-gv80-suv-noise-cancelling/

Online Article on FIA Formula E: How AR and VR are revolutionising the car industry. Retrieved on 05.12.2021
https://www.fiaformulae.com/en/news/2020/june/ar-vr

WayRay – offical website. Retrieved on 05.12.2021
https://wayray.com/#who-we-are
https://wayray.com/press-area/#media_coverage

Online article on WayRay by CNET and Autocar. Retrieved on 05.12.2021
https://www.hyundai.news/eu/articles/press-releases/hyundai-wayray-unveil-next-generation-visual-technology-at-ces-2019.html

YouTube video by Roadshow: CES 2019: WayRay’s holographic AR windshield is real, hitting the road soon. Retrieved on 05.12.2021
https://www.youtube.com/watch?v=HFIgjQI2E6Y

AutoCar article on the Pininfarina concept car. Retrieved on 05.12.2021
https://www.autocar.co.uk/car-news/new-cars/pininfarina-concept-car-showcased-holographic-ar-display

Online article by AutoZeitung: Mercedes entwickelt MBUX weiter. Retrieved on 05.12.2021
https://www.autozeitung.de/mercedes-infotainment-192628.html

Mercedes-Benz MBUX System – online articles and images, retrieved on 05.12.2021
https://www.wired.com/story/in-car-ar-drivers-get-new-view-road-ahead/
https://www.extremetech.com/extreme/314758-2021-mercedes-s-class-2-hud-sizes-level-3-autonomy-4d-sound-5-lcds
https://carbuzz.com/news/new-mercedes-s-class-shows-off-amazing-augmented-reality-display

Audi AR HUD system: online article and YouTube video on Slashgear. Retrieved on 05.12.2021
https://www.slashgear.com/the-audi-q4-e-trons-augmented-reality-head-up-display-is-dashboard-genius-09662735/
https://www.audi-technology-portal.de/de/elektrik-elektronik/fahrerassistenzsysteme/audi-q4-e-tron-ar-hud-de/

Online article on GeekDad: Augmented Reality for Your Ears. Retrieved on 01.02.2022
https://geekdad.com/2016/02/arkamys/

Image of Mercedes G350d soundmodule. Retrieved on 01.02.2022
https://www.tuningblog.eu/kategorien/tuning-wiki/soundgenerator-nachruesten-232502/

CarThrottle article on sound enhancers. Retrieved on 05.12.2021
https://www.carthrottle.com/post/5-ways-that-manufacturers-enhance-the-sound-of-their-cars/

The Porsche Sound – online article, retrieved on 05.12.2021
https://newsroom.porsche.com/de/produkte/taycan/sound-18542.html

Storytelling with Augmented Reality | Part 1

Augmented Reality opens up new possibilities for storytelling. With Augmented Reality, you are not just watching a story being told. You are immersed in the experience and become part of the story.

“We witness stories our entire lives. All the storytelling mediums we know and love are ones where an author recounts a tale and we bear witness to that tale. What gets me so excited about these immersive mediums is it feels like we’re crossing the threshold from stories we witnessed to stories we live as our own.”
– CEO of the VR tech and entertainment company, Within

You experience the story as one of its characters; you can interact with other characters, they interact with you, and you have the ability to influence the story. You walk away with the memory of your own story, not just of media you have consumed.

Three main components of Augmented Reality Stories

In most AR scenes, you need to focus on three main aspects.

1. Assets

Assets are all the elements of an AR story, like 3D or 2D models, audio files or videos. They help you tell your story. 3D models, especially when combined with audio, can create an immersive experience by taking the user into the world of the story. 2D assets can also play an important part, for example by providing information via text.

You also need to keep in mind which device the user will experience your AR story on. Not every user has the latest device, so you need to pay attention to the size of your assets.

2. Interactions

While creating an AR story, you have to consider how you want the user to be able to interact with it. These could be really simple interactions: the user can, for example, rotate assets, take a closer look at some of them, or look at the scene from a distance. Or more complex ones, for example interacting with characters, speaking to them and thereby influencing the story.

3. Environment

Augmented Reality takes place in the real world, so you need to consider where the story takes place and how that influences the role of the user. Does it take place in a room, for example on the surface of a table, where the user is in the middle of the story? Or does it take place outside, where the assets are far away and the user takes the role of an observer?

Example: Between Worlds by Skip Brittenham

A great example of storytelling with Augmented Reality is Skip Brittenham’s book “Between Worlds”. Through the use of Augmented Reality technology, the fantasy world comes alive in interactive 3D.

AR in Education #3: Technological aspects of AR

Hello again! In this 3rd blog entry I will give an overview of the technology behind AR that makes the magic happen. Let’s go.

Technology

To superimpose digital media on physical spaces in the right dimensions and at the right location, three major technologies are needed: 1) SLAM, 2) depth tracking and 3) image processing & projection.

SLAM (simultaneous localization and mapping) renders virtual images over real-world spaces/objects in the right dimensions. It works with the help of localizing sensors (e.g. gyroscope or accelerometer) that map the entire physical space or object. Today, common APIs and SDKs for AR come with built-in SLAM capabilities.
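To give a feel for the sensor side: a gyroscope reports angular velocity, and integrating it over time yields an orientation estimate, one of the ingredients SLAM fuses with camera data. A toy single-axis sketch (the sample values are invented):

```python
def integrate_gyro(angular_velocities, dt):
    """Integrate one-axis gyroscope readings (deg/s), sampled every dt
    seconds, into a heading angle in degrees. Real SLAM fuses this with
    camera features to correct the drift that pure integration accumulates."""
    heading = 0.0
    for omega in angular_velocities:
        heading += omega * dt
    return heading

# Ten samples at 90 deg/s over one second: the device has turned 90 degrees.
samples = [90.0] * 10
print(round(integrate_gyro(samples, dt=0.1), 6))  # 90.0
```

This is also why inertial sensors alone are not enough: tiny per-sample errors add up, so the camera-based mapping half of SLAM is needed to anchor the estimate to the physical space.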

Depth tracking is used to calculate the distance of an object or surface from the AR device’s camera sensor. It works the same way a camera would to focus on the desired object and blur out the rest of its surroundings.

The AR program then processes the image as required and projects it onto the user’s screen (for more information on the “user’s screen”, see the section “AR Devices” below). The image is captured through the user’s device lens and processed in the backend by the AR application.

To sum up: SLAM and depth tracking make it possible to render the image in the right dimensions and at the right location. Cameras and sensors collect the user’s interaction data and send it for processing. The result of the processing (= digital content) is then projected onto a surface to view. Some AR devices even have mirrors that assist human eyes in viewing virtual images by performing proper image alignment.

Object detection

There are two primary approaches used to detect objects, each with several subsets: 1) Trigger-based Augmentation and 2) View-based Augmentation.

Trigger-based Augmentation

Specific triggers such as markers, symbols, icons, GPS locations, etc. can be detected by the AR device. When pointed at such a trigger, the AR app processes the 3D image and projects it onto the user’s device. The following subsets make trigger-based augmentation possible: a) marker-based augmentation, b) location-based augmentation and c) dynamic augmentation.

a) Marker-based augmentation

Marker-based augmentation (a.k.a. image recognition) works by scanning and recognizing special AR markers. It therefore requires a special visual object (anything from a printed QR code to a special sign) and a camera to scan it. In some cases, the AR device also calculates the position and orientation of the marker to align the projected content properly.

Example for marker-based augmentation with a special sign as trigger

b) Location-based augmentation

Location-based augmentation (a.k.a. markerless or position-based augmentation) provides data based on the user’s real-time location. The AR app picks up the location of the device and combines it with dynamic information fetched from cloud servers or from the app’s backend. For example, maps and navigation with AR features or vehicle parking assistants work based on location-based augmentation.
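The core decision in such an app, whether a point of interest is close enough to the user to show an overlay, can be sketched with the haversine distance between two GPS coordinates (the coordinates and the 200 m range below are invented for illustration):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def show_overlay(user, poi, max_range_m=200.0):
    """Render the POI's overlay only when the user is within range of it."""
    return haversine_m(*user, *poi) <= max_range_m

# Invented coordinates: a user roughly 100 m from a nearby point of interest.
user = (48.2082, 16.3738)
near_poi = (48.2091, 16.3738)  # ~100 m north
far_poi = (48.2182, 16.3738)   # ~1.1 km north
print(show_overlay(user, near_poi))  # True
print(show_overlay(user, far_poi))   # False
```

A real navigation overlay would additionally use the device’s compass heading to place the marker in the correct part of the camera view, but the range check above is the location-based part of the idea.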

BMW’s heads-up display as an example of location-based augmentation
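The idea of combining the device’s GPS position with backend data can be sketched in a few lines of Python. The point-of-interest list and all names here are hypothetical; a real app would fetch this data from a server:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # Earth radius approx. 6371 km

# Hypothetical POIs that would normally come from the app's backend
POIS = [
    {"name": "Parking garage", "lat": 48.2090, "lon": 16.3730},
    {"name": "Museum", "lat": 48.2030, "lon": 16.3610},
]

def nearby_overlays(device_lat, device_lon, radius_km=0.5):
    """Return the POIs close enough to the device to be shown as overlays."""
    return [p["name"] for p in POIS
            if haversine_km(device_lat, device_lon,
                            p["lat"], p["lon"]) <= radius_km]

print(nearby_overlays(48.2082, 16.3738))  # device somewhere in central Vienna
```

The app would then render the returned overlays at the matching positions on the camera view.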

c) Dynamic augmentation

Dynamic augmentation is the most responsive form of augmented reality. It leverages motion tracking sensors in the AR device to detect real-world images and superimpose them with digital media.

Sephora’s AR mirror as an example of dynamic augmentation. The app works like a real-world mirror reflecting the user’s face on the screen.

View-based Augmentation

In view-based methods, the AR app detects dynamic surfaces (like buildings, desktop surfaces, natural surroundings, etc.), connects the dynamic view to its backend to match reference points, and projects related information on the screen. View-based augmentation works in two ways: a) Superimposition-based augmentation and b) Generic digital augmentation.

a) Superimposition-based augmentation

Superimposition-based augmentation replaces the original view, fully or partially, with an augmented one. It works by detecting static objects that are already fed into the AR application’s database. The app uses optical sensors to detect an object and overlays digital information on top of it.

Hyundai’s AR-based owner’s manual allows users to point their AR device at the engine and see each component’s name + instructions for basic maintenance processes.
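A minimal sketch of this database-lookup idea in Python, with a made-up object database loosely inspired by the owner’s-manual example (all entries and names are purely illustrative):

```python
# Hypothetical object database shipped with the AR app: each recognised
# static object maps to the digital information to superimpose on it.
OBJECT_DB = {
    "oil_cap": "Engine oil filler - check level monthly",
    "coolant_tank": "Coolant reservoir - fill between MIN and MAX",
}

def overlay_for(detected_object):
    """Look up the info to superimpose on a recognised object."""
    return OBJECT_DB.get(detected_object, "No information available")

print(overlay_for("oil_cap"))
```

The hard part in practice is the detection step that produces `detected_object`; the lookup and overlay themselves are comparatively simple.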

b) Generic digital augmentation

Generic digital augmentation is what gives developers and artists the liberty to create anything they wish within the immersive experience of AR. It allows 3D objects to be rendered and imposed on actual spaces.

The IKEA catalog app allows users to place virtual items of their furniture catalog in their rooms based on generic digital augmentation.

It’s important to note that there is no one-size-fits-all AR technology. The right augmented reality software technology has to be chosen based on the purpose of the project and the user’s requirements.

AR Devices

As already mentioned in my previous blog entry, AR can be displayed on various devices, from smartphones and tablets to gadgets like Google Glass, and these technologies continue to evolve. For processing and projection, AR devices require hardware such as sensors, cameras, an accelerometer, a gyroscope, a digital compass, GPS, a CPU, a GPU and displays. Devices suitable for Augmented Reality can be divided into the following categories: 1) mobile devices (smartphones and tablets); 2) special AR devices, designed primarily and solely for augmented reality experiences; 3) AR glasses (or smart glasses) like Google Glass or Meta 2 Glasses; 4) AR contact lenses (or smart lenses); and 5) virtual retinal displays (VRD), which create images by projecting laser light into the human eye.

That’s it for today 🙂 

_____

Sources:

https://thinkmobiles.com/blog/what-is-augmented-reality/

https://learn.g2.com/augmented-reality-technologies

AR in Education #2: Comparing XR, AR, VR & MR

Hello again! My second blog entry will be about the differences between four concepts: Extended Reality (XR), Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR).

XR, AR, VR, MR,… What??

Extended Reality (XR): XR is a “catch-all”-term for technologies that enhance or replace our view of the real world. This can be done through overlaying or immersing computer text and graphics into real-world and virtual environments, or even a combination of both. XR encompasses AR, VR and MR.

Augmented Reality (AR): AR enhances our view of the real world by overlaying the real-world environment with digital content across multiple sensory modalities. It detects objects in the real-world environment and overlaps those with computer-generated data such as graphics, sounds, images, and texts. In other words: AR combines the real world with the digital world. Users can experience AR very easily through a smartphone application, but also through special AR wearables (i.e. headsets, glasses), displays, projectors or even contact lenses.

Virtual Reality (VR): While AR enhances the user’s real environment, VR completely replaces it with a virtual one. Full-coverage headsets shut out the user’s real-world surroundings completely while in use. Advanced VR experiences even allow users to move in a digital environment and hear sounds. Moreover, special hand controllers can be used to enhance VR experiences.

Mixed Reality (MR): MR is the newest of these immersive technologies and combines aspects of AR and VR. When experiencing MR, virtual content is not only overlaid on the real environment (as in AR) but is anchored to and interacts with that environment. Instead of relying only on remote control devices, smart glasses, or smartphones, users can also use their gestures, glancing or blinking, and much more to interact with the real and the digital world at the same time. 

Long Story short:

  • Extended Reality (XR) is an umbrella term for technologies that enhance or replace our view of the real world
  • Augmented Reality (AR) overlays virtual objects on the real-world environment
  • Virtual Reality (VR) immerses users in a fully artificial digital environment
  • Mixed Reality (MR) not just overlays but anchors virtual objects to the real world

For a better understanding, I found this nice infographic:

Comparison of VR, AR and MR

Okay, got it. But why AR?

As far as I know at this point, all three technologies – AR, MR & VR – can be useful for educational purposes. The choice of technology might depend on several factors like the field of education, the equipment or the target group. Still, I chose to focus on AR for several reasons: 1) I like the idea of learning new things by enhancing the user’s environmental view instead of replacing it as VR does (my subjective opinion); 2) AR is easily accessible via smartphones or tablets, while VR and MR need more advanced technology (i.e. headsets). More advantages (and maybe some limitations and disadvantages too) might come up the further I dive into the topic, let’s see. But that’s it for now! 🙂

_____

Sources:

Headerimage: https://www.vsight.io/the-difference-between-ar-vr-and-mr/

Infographic: https://rubygarage.org/blog/difference-between-ar-vr-mr

Arm Blueprint. (2021, August 11). xR, AR, VR, MR: What’s the Difference in Reality? Available at: https://www.arm.com/blogs/blueprint/xr-ar-vr-mr-difference

RubyGarage. (2021, August 27). VR vs AR vs MR: Differences and Real-Life Applications. Available at: https://rubygarage.org/blog/difference-between-ar-vr-mr

Vsight. (2021, October 9). The difference between AR, VR, and MR. Available at: https://www.vsight.io/the-difference-between-ar-vr-and-mr/

AR basics and automotive trends

| a short and basic definition of Augmented Reality, the first implementations in vehicles and current innovation trends

What exactly is Augmented Reality and when was it first used?

To have a clear distinction between related expressions, Paul Milgram’s Reality-Virtuality Continuum from 1994 shows the relation of Augmented, Mixed and Virtual Reality in a very comprehensible way. [3] As shown in the illustration below, AR is the evolution of real environments in the direction of complete virtuality, while still consisting mostly of real content. Augmented Virtuality, on the other hand, describes systems using more virtual than real models.

Illustration by P. Milgram and H. Colquhoun Jr., in A Taxonomy of Real and Virtual
World Display Integration [4]

To have an official definition, in The Concise Fintech Compendium AR is described as “an enhanced version of the physical, real-world reality of which elements are superimposed by computer generated or extracted real-world sensory input such as sound, video, graphics or haptics.” [1]

Already in 1997 R. T. Azuma stated three essential characteristics of AR systems [2]:

  • combining reality with a virtual world
  • interacting in real-time
  • registering in 3D space

Azuma also described the two basic possibilities of combining virtual inputs with the real world: virtual objects can be added to the real perception or real objects can be hidden by overlaying virtual effects. This may be possible not only for optical perception, but also for sound and haptics. He described systems with speakers and microphones, altering the incoming sound of our surroundings (like today’s noise-cancelling), or gloves with additional haptic feedback of simulated forces. [2] Basically AR could help us to enhance all of our senses, but it is mostly implemented in visual systems. [6]

After reading basic theories on Augmented Reality from the early 1990s, one wouldn’t expect that the first personal AR system – an HMD (Head-Mounted Display) – was developed as early as 1968 at Harvard University by Ivan Sutherland, the “father of computer graphics”. [8]

Regarding vehicles and the first implementation of AR, we have to go even further back in time. The predecessor of today’s BAE Systems plc, Elliot Flight Automation, along with Cintel, claims the development of the first Head-Up Display (HUD) in operational service in 1961 – for a military aircraft of the British Royal Navy, the Blackburn Buccaneer. [9]

The first HUD in a passenger car is stated to have been used in the Oldsmobile Cutlass Supreme Indy 500 pace car made by General Motors in 1988. [10] The following photo depicts this very simple AR solution on the windscreen.

The HUD in the Oldsmobile Cutlass Supreme Indy500 pace car, from 1988.
Source: https://www.autoevolution.com/news/how-to-add-a-head-up-display-to-your-car-136497.html

In the last decades, AR was further developed and implemented in many different areas, and with the evolution of displays, projectors and computer graphics, we can now have our own AR applications on our smartphones or in passenger cars. While starting to dig deeper into existing automotive AR solutions, I found the following interesting study as a foundation for narrowing down my topic of interest.

AR innovations in the automotive industry today

A study carried out by the Austrian “innovation intelligence company” StartUs GmbH analysed over 400 startups and created an overview of the most innovative use cases of AR in the automotive industry [7]:

The study chart by StartUs GmbH [7]

They state that the total augmented reality automotive market is growing by 177% every year and will reach $5.5 billion by 2022. [7]
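As a quick plausibility check of the cited figures, here is a compound-growth projection in Python. The base value is my own assumption, chosen so that three years of 177% annual growth land near the $5.5 billion mark; the study itself does not state a base year or starting value:

```python
def project_market(base_usd_bn, growth_rate, years):
    """Compound-growth projection: base * (1 + growth_rate)^years."""
    return base_usd_bn * (1 + growth_rate) ** years

# Assumed starting value (not from the study): roughly $0.26 bn in 2019.
base_2019 = 0.26  # $ bn, hypothetical
projection_2022 = project_market(base_2019, 1.77, 3)
print(round(projection_2022, 2))  # 5.53
```

In other words, a 177% yearly growth rate means the market almost triples each year, so even a small starting market reaches billions within a few years.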

From their five areas of innovation, my main focus will be on “Experience Enhancement”. Its use cases are see-through displays, windshield projectors and various wearables that can give the driver additional, immediate information on important events in the surroundings without distraction. [7]

Existing solutions for this area will follow in my further research.

Sources

[0] Wikipedia – Summaries on Augmented Reality
https://en.wikipedia.org/wiki/Augmented_reality
https://de.wikipedia.org/wiki/Erweiterte_Realität
https://en.wikipedia.org/wiki/Mixed_reality

[1] Schueffel, P.: The Concise Fintech Compendium. Fribourg: School of Management Fribourg/Switzerland, 2017
https://web.archive.org/web/20171024205446/http://www.heg-fr.ch/EN/School-of-Management/Communication-and-Events/events/Pages/EventViewer.aspx?Event=patrick-schuffel.aspx

[2] Azuma, R. T.: A Survey of Augmented Reality. In: Presence: Teleoperators and Virtual Environments. 6, No. 4, 1997, pp. 355–385

[4] Milgram, P., Colquhoun Jr., H.: A Taxonomy of Real and Virtual World Display Integration. In: Mixed reality: Merging real and virtual worlds, Springer, 1999, p. 1-26

[5] The basics of Augmented Reality – Interview with an AR expert; Indestry.com; Retrieved on 27.11.2021
https://www.indestry.com/blog/the-basics-of-augmented-reality-interview-with-an-ar-expert

[6] Kipper, G., Rampolla J.: Augmented Reality: An Emerging Technologies Guide to AR; Elsevier; 2013

[7] Online article: How Augmented Reality Disrupts The Automotive Industry; by StartUs Insights Research Blog. Retrieved on 28.11.2021
https://www.startus-insights.com/innovators-guide/how-augmented-reality-disrupts-the-automotive-industry/

[8] Online article by Javornik, A: The Mainstreaming of Augmented Reality: A Brief History; Harvard Business Review; 2016. Retrieved on 28.11.2021
https://hbr.org/2016/10/the-mainstreaming-of-augmented-reality-a-brief-history

[9] Online article by BAE Systems: The evolution of the Head-Up Display. Retrieved on 28.11.2021
https://www.baesystems.com/en/feature/our-innovations-hud

[10] Wikipedia summary on automotive Head-Up Displays:
https://en.wikipedia.org/wiki/Automotive_head-up_display

AR in Education #1: Intro

Hello there! This is my very first blog entry about my journey of finding a suitable topic/project for my master’s thesis, so here we go: I chose “AR in Education” as an overall topic, which I would like to approach rather broadly at first and then gradually narrow it down in order to find a specific research question to work with. The aim of this first blog entry is to give a quick overview of 1) what AR is and 2) how it’s used in the educational sector. Let’s get started:

AR in a nutshell

Augmented Reality (AR) enhances the real physical world through digital visual elements, sound or other sensory stimuli delivered via technology. It incorporates three basic features: 1) a combination of real and virtual worlds, 2) real-time interaction and 3) accurate 3D registration of virtual and real objects. AR thus provides both the real and the virtual world simultaneously to its users – either in a constructive (i.e. additive to the natural environment) or a destructive (i.e. masking of the natural environment) way. Further information on the technology behind AR (i.e. hardware, software, algorithms and development) will be covered in another blog entry.

AR in the educational sector

AR techniques are already used in various fields like entertainment, tourism, health care or cultural heritage – just to mention a few. But it’s the educational sector that caught my attention – especially children’s education. I asked myself: “Can AR be used to make learning faster, better and more fun?” As far as I know at this point, the answer is yes. There is already a range of educational materials like textbooks or flashcards that contain embedded “markers” or triggers that, when scanned by an AR device, produce supplementary information rendered in a multimedia format. But that doesn’t mean I am not sceptical about AR as an educational tool – in my opinion, “children & digital devices” is a double-edged sword. That’s why I would like to take a very close look at where AR adds value and where it doesn’t (in another blog entry).

My next steps

  • Dive deeper into the technology behind AR
  • Find out, what already exists on the market (and hopefully find a niche, where there’s a need)
  • Discuss where AR adds value and where it doesn’t

That’s it for today! 🙂

_____

Sources:

Afnan, Muhammad, K., Khan, N., Lee, M.-Y., Imran, A., & Sajjad, M. (2021). School of the Future: A Comprehensive Study on the Effectiveness of Augmented Reality as a Tool for Primary School Children’s Education. Applied Sciences, 11(11), 5277. MDPI AG. doi: http://dx.doi.org/10.3390/app11115277

Elmqaddem, N. (2019). Augmented Reality and Virtual Reality in Education. Myth or Reality? iJET, 14, 234-242. doi: 10.3991/IJET.V14I03.9289