Microsoft Flight Simulator 2020 – ready to take the front seat?

Introduction

Microsoft Flight Simulator (abbreviated as MSFS) is a series of flight simulation applications that was first released in 1982. Starting with the 2020 version, the application now runs in Virtual Reality (VR) mode, allowing users to experience a highly interactive and realistic flight simulation.

What makes it engaging?

Porting the simulator to VR makes the application especially engaging and entrancing. Users feel as if they are sitting inside the cockpit and maneuvering an actual aircraft. This sense of immersion is strengthened by the detailed depiction of the surroundings, from airports, cities, and skyscrapers to the natural landscape, providing a life-like experience. While flying over the Grand Canyon, users can make out the intricate details of the canyon's rock formations and the winding Colorado River; over a city like New York, they can see detailed 3D models of famous landmarks such as the Empire State Building and the Statue of Liberty.

Features that are well-done

High-fidelity representation of the surroundings

Being a simulator, MSFS strives for realism in its representational fidelity. It employs a number of methods to make the surroundings users interact with as realistic as possible. First, it uses high-quality 3D photogrammetry data from Microsoft's Azure 3D maps library. Where an area is not captured well by that in-house data, it applies a deep-learning algorithm to 2D satellite imagery to reconstruct the scene as 3D graphics. Lastly, areas worthy of special attention are modeled manually by designers[1].
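The tiered sourcing strategy described above can be sketched as a simple per-tile decision. This is an illustrative assumption about how such a pipeline might prioritize its data sources, not the actual MSFS implementation; all field names and thresholds are invented.

```python
# Hypothetical sketch of tiered terrain sourcing: hand-made landmark models first,
# then photogrammetry, then ML reconstruction from 2D satellite imagery.
# Field names and the quality threshold are illustrative assumptions.

def choose_tile_source(tile):
    """Pick the best available 3D data source for a map tile."""
    if tile.get("hand_modeled"):                          # landmarks modeled by artists
        return "manual_model"
    if tile.get("photogrammetry_quality", 0.0) >= 0.8:    # good Azure 3D coverage
        return "photogrammetry"
    if tile.get("satellite_image"):                       # reconstruct 3D from 2D imagery
        return "ml_reconstruction"
    return "generic_terrain"                              # last resort: elevation data

# Example: a tile with poor photogrammetry but usable satellite coverage
tile = {"photogrammetry_quality": 0.3, "satellite_image": True}
print(choose_tile_source(tile))  # ml_reconstruction
```

The ordering encodes the trade-off the text describes: hand-made models are the most accurate but the scarcest, so they take priority, while ML reconstruction acts as a broad fallback.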

User-friendly interactions inside the cockpit

When run in VR mode, MSFS does not use a HUD. Instead, it relies on the virtual-world reference frame: users read the status of the current flight from the dashboard inside the cockpit. Previously, VR motion controllers were not supported by MSFS, meaning that users had to purchase simulator-compatible hardware to maneuver the aircraft fully[2]. With the update in November 2021, users can now use VR controllers to interact with the cockpit, greatly increasing the immersion of the simulation.

Features that need improvement

Performance

As MSFS tries to make the flying experience as realistic as possible, performance and hardware requirements are a natural concern. Users have complained about FPS drops and performance issues, especially in VR. While Asobo Studio, the simulator's developer, regularly releases updates and hotfixes to improve frame rates, more can be done to optimize hardware usage and graphics rendering[3].

Control complexity

While it is fantastic that users can directly interact with the dashboards and controls, the complexity of flying an aircraft poses a significant learning curve for most users. Since MSFS positions itself as a simulator for amateur pilots, it needs to balance preserving a realistic flight experience with catering to new users, for example by adding voice-recognition commands or a virtual co-pilot who handles the heavy lifting.

Conclusion

If you are into flight simulations and dream of experiencing what it is like to be a pilot, MSFS is a phenomenal application that can fulfil that dream. Nevertheless, beware of the influx of complex instructions and settings, on both the hardware and software side, that you will need to learn before embarking on your dream journey.

References

[1] “Exploring the Whole World in VR with Bing 3D Maps and MRTK.” TECHCOMMUNITY.MICROSOFT.COM, 1 May 2022, https://techcommunity.microsoft.com/t5/mixed-reality-blog/exploring-the-whole-world-in-vr-with-bing-3d-maps-and-mrtk/ba-p/2245284.

[2] Feltham, Jamie, et al. “Microsoft Flight Simulator Finally Has VR Controller Support.” UploadVR, 22 Nov. 2021, https://uploadvr.com/microsoft-flight-simulator-vr-controller-support/.

[3] Chawake, Anurag. “Microsoft Flight Simulator ‘Fps Drops’ & ‘Performance Issues’ in V1.25.9.0.” PiunikaWeb, 12 May 2022, https://piunikaweb.com/2022/05/12/microsoft-flight-simulator-fps-drops-performance-issues/.

Hololy – The AR app that brings your idol to the real world.

Hololy is an Augmented Reality (AR) application that lets you project 3D anime characters into the real world through your phone. You can choose from a selection of models, poses, and dances, making your AR model appear as if it were right where you are.

How it works

The main attraction of the application is using your creativity to bring the 2D world into the 3D world and create interesting pictures and videos. First, the application identifies a flat area where the model can be suitably placed; this prevents the model from floating in mid-air. That area becomes the reference point for the model. After confirming the location, you can place the model and rotate it. Moving the camera around shows different perspectives of the model, as if it were really standing in that spot. From there, you can choose poses, expressions, and dance moves to make the model look alive in the scene.
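The placement flow above (detect a flat area, use it as the model's reference point) can be sketched as a plane-selection step. This is a toy illustration of the general AR pattern, not Hololy's actual code; the plane and anchor structures are invented, and real frameworks compare normals with a tolerance rather than exact equality.

```python
# Illustrative sketch of AR model placement: filter detected planes down to
# horizontal ones large enough to stand a model on, then anchor to the largest.
# Data structures are assumptions for illustration.

def find_anchor(planes, min_area=0.25):
    """Return an anchor on the largest horizontal plane, or None to keep scanning."""
    horizontal = [p for p in planes
                  if p["normal"] == (0, 1, 0) and p["area"] >= min_area]
    if not horizontal:
        return None                      # no surface yet; placing now would float
    best = max(horizontal, key=lambda p: p["area"])
    return {"position": best["center"], "plane_id": best["id"]}

planes = [
    {"id": 1, "normal": (0, 1, 0), "area": 0.1, "center": (0, 0, -1)},   # too small
    {"id": 2, "normal": (1, 0, 0), "area": 2.0, "center": (1, 0, -1)},   # a wall
    {"id": 3, "normal": (0, 1, 0), "area": 1.5, "center": (0, 0, -2)},   # the floor
]
print(find_anchor(planes))  # anchors to plane 3, the floor
```

Returning `None` until a suitable plane exists is what prevents the floating-model artifact criticized later in this review.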

What makes it fun?

The idea that you can see something usually confined to the 2D world, whether from a cartoon or a game, in a 3D sense is pretty amazing. Usually we view 3D models and animation through a 2D screen, so your perspective is locked to that screen; even when the perspective changes, you stay glued to the same spot. Through this AR app, you can walk around a model, see it from different angles, and watch how it fits in with the environment. Being able to position your model to sit on a chair or let it dance in your room allows a lot of wacky photos to be taken.

Things to improve on

The first glaring issue upon using the app is that the 3D models are lit in a fixed, uniform way, which conflicts with the actual lighting of the environment: shadows appear on sides that are clearly lit in the scene. If the app could estimate the scene lighting, for instance with machine learning, the model would blend into the picture and the immersion would improve.
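To make the suggestion concrete, here is a deliberately toy sketch of image-based light estimation: compare the average brightness of the left and right halves of a camera frame to guess which side the dominant light comes from, then light the model from that side. Real systems (e.g. ARCore's light estimation) are far more sophisticated; this only shows the idea.

```python
# Toy light-direction estimate from a camera frame.
# frame: 2D list of pixel luminances in [0, 1].

def estimate_light_side(frame):
    """Return 'left' or 'right' depending on which half of the frame is brighter."""
    width = len(frame[0])
    left = [px for row in frame for px in row[: width // 2]]
    right = [px for row in frame for px in row[width // 2 :]]
    avg = lambda xs: sum(xs) / len(xs)
    return "left" if avg(left) > avg(right) else "right"

# A frame brighter on the right: the model's shadow should fall to its left.
frame = [[0.2, 0.3, 0.8, 0.9],
         [0.1, 0.2, 0.7, 0.8]]
print(estimate_light_side(frame))  # right
```

Even a crude estimate like this, fed into the renderer's directional light, would remove the most jarring mismatches the review describes.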

The second issue is that identifying a base for the model to stand on is not very accurate; at times the model seems to float in the air. Also, restricting the reference point to standing poses limits creativity. For example, sitting or leaning poses could use a reference point on furniture. This would enhance the experience, since standing bolt upright in someone's house is not the most natural pose to be in.

Conclusion

Hololy is a decent app for AR immersion, especially for fans of the anime characters. It allows you to utilize your creativity along with the AR powers of the application to create memorable photos and videos. However, there is plenty of room for improvement as the model does not really fit in with the 3d world.


[2022/23S2 CS4240 VAR] XR – Ikea Kreativ

Introduction

Ikea Kreativ is a Virtual Reality (VR) design tool that helps customers visualise how furniture will look in their room. Powered by VR and AI technology, it makes home redesign and renovation easy, helping customers project their ideas into their own home. Features include:

  1. Scanning the user's room with an iPhone camera to accurately capture room dimensions.
  2. Removing existing furniture and placing virtual furniture in its place to see how it fits.
  3. More than 50 virtual showrooms for users to place furniture in.
  4. Thousands of furniture pieces and decorations for users to interact with.
Figure: Ikea Kreativ

What do I like about it?

Ikea Kreativ makes redesigning a space convenient and hassle-free, skipping the process of measuring furniture and space, and lets users see their ideas not just in their mind but right in front of their eyes.

It also provides a good degree of customisability, with thousands of furniture pieces, decorations, and accessories for users to furnish their space with. The ability to select and remove items in the room offers even greater flexibility for users who wish to integrate new items into a space that already contains existing ones.

Why is it engaging?

I found the application engaging because of the good degree of freedom it provides. Ikea Kreativ lets users adjust Ikea items with ease, rotating and moving them around the space just as they would move items in real life.

What features are done well?

Placing Ikea's furniture into the virtual room does a good job of letting users gauge the proportion of the furniture relative to the room. The VR element helps visualise how items would look in the user's space, skipping the hassle of measuring the dimensions of both furniture and room.
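Once the room's dimensions are captured by the scan, the fit check that users would otherwise do with a tape measure reduces to simple arithmetic. A minimal sketch, with made-up dimensions in metres (this is an illustration of the idea, not Ikea Kreativ's actual logic):

```python
# Does a furniture footprint fit a patch of free floor space, allowing 90° rotation?
# Dimensions are (width, depth) in metres; values below are invented examples.

def fits(furniture, space):
    fw, fd = furniture
    sw, sd = space
    return (fw <= sw and fd <= sd) or (fd <= sw and fw <= sd)

sofa = (2.2, 0.9)                 # width x depth
alcove = (1.0, 2.5)               # narrow but deep: fits only when rotated
print(fits(sofa, alcove))         # True
print(fits((2.2, 1.1), alcove))   # False: too wide even when rotated
```

The value of the scan is precisely that it supplies the `space` numbers automatically, so the user never has to measure anything.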

The wide selection of items and showrooms also does a good job of replicating Ikea's real-life stores, letting users browse the catalogue without compromising the experience they would get in a physical store.

What features could be improved and how?

Though marketed as a selling point, the erasing tool is not perfect. When removing existing items, the software does not always fill in the empty space convincingly. This could be improved by training the underlying model on more data. Nonetheless, it does not prevent users from adding Ikea's furniture into the space. After all, the eraser tool is a new feature, and I believe Ikea will continue to improve it for a better user experience.

Another feature that could be improved is the degree of freedom to navigate around the room. This would require more devices, such as a head-mounted display and trackers, allowing users to "feel" the surrounding space and the ambience of the newly furnished environment.


Extended Reality (XR) in Arts and Entertainment

Van Gogh experience Virtual Reality (VR)

I am not a big art fan; art is so abstract and open to interpretation that it usually bores me. However, I recently had the chance to visit the Van Gogh exhibition, which includes a 360º immersive experience of his art and his journey through life. It was held in a cathedral where every wall, and even the floor, had projections on them. It was quite a spectacle, walking the audience through his thought process and showcasing his artwork, and this was just the beginning.

Image of 360º projection display in York, St Mary’s Cathedral [1]

After the 35-minute showcase, there is a VR experience with a headset and speakers, for which we were seated on a chair that allowed us to turn a full 360º. It brought us through the time periods and locations that inspired Van Gogh and explained the motivation behind each painting.

A snippet of Van Gogh Experience VR section [2]

One of the more famous pieces showcased was Starry Night. We were brought to the exact location of the painting, with Van Gogh "narrating" his thought process, such as the colors he saw and why he chose a particular color for the piece. I liked that the information was easy to digest, on top of the fact that we could see what Van Gogh saw and thought while painting his various works.

However, one thing I think can be improved is the mobility of the experience. I think it would have been better if we could walk through the whole “exhibit” as if we were in that time period and explore the area and paintings at our own pace. This could be done by providing users with controllers and adding demarcations in the VR to allow users to move around the area without having a big venue to move around in.

VR brings art to life, making it easier for people to understand the artist’s point of view. This reinvents museums and strays away from traditional art galleries, which is likely to attract more youths to the art scene.

Zero Latency (Sol Raiders)

VR gaming is nothing new anymore; the rise of games like Beat Saber keeps pushing innovation, including the possibility of multiplayer games. Zero Latency is a free-roam, multiplayer VR experience offering games such as Survivor, Outbreak Origin, and Sol Raiders. I, alongside seven other players, participated in Sol Raiders, a 4v4 game where the objective is to complete as many tasks as possible while minimizing the number of deaths on each team.

Sol Raiders trailer by Zero Latency [3]

I do not usually play first-person shooters, but this was so much fun! After putting on the vest, headset, headphones, and gun, we were transported into the game dimension, and all the players (teammates and opponents) were dressed up like robots! Everything was so life-like that it really felt like I was a robot in that reality, especially with the sound effects.

Players in real world [4]
Players in game world [5]

At some point during the game, I felt very lightheaded because of the mismatch between the real world and the game world. The game world was multi-level, with slopes and lifts, while the real-world room was flat ground, so walking up the slopes and taking the lifts was disorienting.

I think one thing that can be improved is the UI of the headset. The in-game view did not include any information about the game status, which was only shown on a screen at the end of each round. Since this was a team game with a common objective, the display could have included information like the number of kills and deaths and the objectives fulfilled. This would let us plan our game better instead of constantly trying to keep track of these details ourselves.

Overall, I think the game was very well done, especially since it had to sync the 8 players throughout the game.

References

[1] Google maps. [Online]. Available: https://www.google.com/maps/place/Van+Gogh:+The+Immersive+Experience+York/@53.9571904,-1.0809633,3a,83.9y,90t/data=!3m8!1e2!3m6!1sAF1QipO3uPo03Wb4cJR9A05JAQ7rzK_QfEesIPJ3zPY8!2e10!3e12!6shttps:%2F%2Flh5.googleusercontent.com%2Fp%2FAF1QipO3uPo03Wb4cJR9A05JAQ7rzK_QfEesIPJ3zPY8%3Dw203-h135-k-no!7i1024!8i682!4m7!3m6!1s0x48793130726fa6ed:0x651d1a44837b68c1!8m2!3d53.9572689!4d-1.0808674!14m1!1BCgIgARICGAI. [Accessed: 20-Jan-2023].

[2] “Van Gogh: The immersive experience,” YouTube, 24-Feb-2021. [Online]. Available: https://youtu.be/dZkQSjZYsgc?t=60. [Accessed: 20-Jan-2023].

[3] “Sol Raiders – trailer – zero latency VR,” YouTube, 07-Feb-2019. [Online]. Available: https://www.youtube.com/watch?v=ADOEuzyYstc&%3Bab_channel=ZeroLatencyVR. [Accessed: 20-Jan-2023].

[4] “Zero Latency’s Latest Free-Roam Experience Made Me A Believer In VR Esports” VRScout, 10-Aug-2018. [Online]. Available: https://vrscout.com/news/zero-latency-vr-esports/. [Accessed: 20-Jan-2023].

[5] “Sol Raiders,” Zero Latency Luxembourg, 07-Aug-2022. [Online]. Available: https://zerolatencyvr.lu/en/experiences/sol-raiders/. [Accessed: 20-Jan-2023].

Snapchat 👻

Snapchat is primarily a social media application where people send photos and videos, in the form of snaps, directly to their friends or post them on their stories for all of their friends to see. The app also offers highlights, other people's stories, and many other features. In addition, it has an AR component: filters and lenses.

Snapchat Filters

Snapchat is unlike most AR apps: where most of them interact only with the surroundings, Snapchat interacts with both your face and your surroundings. It was one of the first apps to offer interactive filters, a trend other social media applications soon caught on to. Some filters or lenses simply add colors or effects to the surroundings, while others are more complex, like swapping your face with your friend's or changing your appearance to look like a different gender. Snapchat also has a newer Scan feature that lets users interact with their surroundings, for example solving a mathematics problem or identifying a plant or a car, among many other possibilities. Scan is relatively new compared to the filters and lenses and adds to the AR experience. On its own, the scan function is very similar to Google Lens; what sets it apart is the integration with filters, lenses, and the social media side of the app.

Soon after this new feature, Snapchat also released Snap AR Lens Studio, where developers and artists can create new augmented reality experiences for users and exercise their creativity.


Why do I like it?

It is simple and fun. It lets you experiment with many different filters, and users can create filters as well. Part of the fun is that the filters are very realistic, and on top of that, the app is very easy to use: point the camera at your face and it detects your face and applies the filter. I also like the "explore" option, which does not limit a user to a small set of filters. Snapchat keeps introducing new filters and ideas and rotating the default ones, so users do not get bored. The Scan feature, on the other hand, is like a visual search engine: useful, quick, responsive, and easy to use. It is linked with the filters and lenses as well; for example, once a user points the camera at a face, Scan will suggest face filters. It offers many options, and having all of these features in one place makes it even better. Users can use it to find a product online, and the feature is not limited to visual elements: it can also identify the music playing, which I find very useful and fun.


Why is it engaging?

The filters are the most engaging aspect, as they let you play around and see how you would look in different ways. Some filters simply enhance facial features, while others add dog ears, tongues, and the like. They become more engaging as they add interactions. For example, one of the most famous dog filters adds dog ears and a nose, and if you stick your tongue out, it plays a dog tongue-flapping animation. Many other filters respond to interactions such as raising your eyebrows or smiling. The filters even detect faces on pets, on screens, or on anything that looks like a face.

The dog filter on pets
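The interaction triggers described above typically work by measuring face landmarks each frame and firing an animation when a metric crosses a threshold. The sketch below is an illustration of that general pattern, not Snapchat's actual implementation; the landmark format and threshold are invented.

```python
# Toy landmark-driven filter trigger: a mouth-openness ratio toggles an animation.
# Landmarks are (x, y) points; all values below are invented for illustration.

def mouth_open_ratio(landmarks):
    """Vertical lip gap divided by face height."""
    gap = landmarks["lower_lip"][1] - landmarks["upper_lip"][1]
    face = landmarks["chin"][1] - landmarks["forehead"][1]
    return gap / face

def update_filter(landmarks, threshold=0.15):
    return "play_tongue_animation" if mouth_open_ratio(landmarks) > threshold else "idle"

closed = {"upper_lip": (0, 60), "lower_lip": (0, 62), "forehead": (0, 0), "chin": (0, 100)}
open_ = {"upper_lip": (0, 55), "lower_lip": (0, 75), "forehead": (0, 0), "chin": (0, 100)}
print(update_filter(closed))  # idle
print(update_filter(open_))   # play_tongue_animation
```

Normalizing by face height keeps the trigger working whether the face is near or far from the camera, which is part of what makes these interactions feel reliable.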

Another aspect that makes these filters engaging is that you get to share them with other people. You can send snaps to your friends or put them on your stories for more people to see. It creates something fun for people to do together. The scan feature is also engaging as it now suggests lenses or filters based on what the camera is pointed at. This makes it less tedious for users and allows them to explore new lenses easily.

What features are well done?

I think the filters look very realistic, and that is the part I believe is well done. For example, with some of these filters a guy can look exactly like a girl, and it even fools some people. The ability to create your own filters is also a really good feature: users get to showcase their creativity, and some use it for marketing and publicity. The Scan features are also built very well. For example, the find-music feature identifies any music playing in the background; it is very accurate and provides links to the most popular music applications. The find-a-product feature is also really helpful, linking directly to the Amazon page so users can buy the item. Small details like these links, which minimize the effort required from users, are a well-thought-out aspect. Scan's other features, such as identifying a plant or a car or solving a math problem, are also very useful.


What features can be improved and how?

I think the one thing that can be improved is face detection, which is not accurate all the time.

Inaccurate face detection

It is especially poor for people wearing glasses: a lot of the time, if a person with glasses tries on a filter, it is not positioned correctly on the face. This excludes a portion of the users and denies them the full experience. Another weakness is filter accuracy while the face moves: make-up filters in particular do not stay in the correct position when someone is speaking, which makes them look less realistic and therefore less engaging and fun. The filters should move with the facial features. The Scan feature is relatively new, but so far I have not encountered any problems; the main way to improve it is to introduce more options beyond the limited set available now.


Overall, Snapchat is a fun and engaging application, which makes use of Augmented Reality in various ways. It has many good features, giving its users an enjoyable user experience, with an easy-to-use and fun user interface.


MEDIVIS – AR Surgery Platform

Introduction

MEDIVIS is an XR application for surgery. It creates 3D models of the human body and its organs from Magnetic Resonance Imaging (MRI) scans, and those models can then be used as the environment in AR. The application provides realistic 3D structural visualization of the human body, and the user can zoom the visualization in and out.

Attractive Reasons

MEDIVIS provides two primary services: an AR-based surgical support platform and a VR/AR-based health education platform. The application combines the training and working stages in the same interface, which smooths the transition from student to surgeon. This integration of teaching and clinical tools is the highlight that attracts me.

Another advantage is that it lets medical students gain realistic surgical experience before they are qualified. Every surgeon has a first time operating on a real patient, and practising on an AR/VR platform beforehand can reduce the nervousness and lack of skill that might otherwise show in real surgery. This would increase the overall success rate of operations, saving more lives.

Engaging Reasons

Compared to other XR applications, the highlights of MEDIVIS listed below make it stand out:

  1. The application gives more medical students realistic chances to practice; by contrast, the number of medical students far exceeds the traditional practice materials available for training.
  2. Compared to a monitor screen or textbook, which only provide a 2D view, the application offers a 3D visualization of the human body and organs. Students benefit from AR visualization because the real world is 3D, and they get more intuitive interaction without relying on their imagination.
  3. The application needs only a single pair of AR glasses to deliver its features, whereas the devices traditionally used to support surgery, teaching, and training are sets of multiple large items. The application achieves better performance overall and is more convenient to use.

Highlights Features

Two main features are well done:

  1. Reusability is a critical feature of the application. In healthcare, every case is unique, and one limitation for practitioners is that experience with individual cases cannot traditionally be saved for later review. The AR application builds the model once and keeps it in a database to share for future reference.
  2. Sharing the same view synchronously with other students makes abstract medical concepts easier to understand. In the 3D environment, students gain a deeper understanding of concepts, and communicating with others takes the learning experience to a new level. For example, the application could let students stand inside a blood vessel, look around, and examine the same structures together while discussing them.

Improvable Features and the Way to Implement

  1. Latency during real surgery may cause serious consequences. The devices supporting surgery are connected wirelessly, and switching to a cable can reduce the delay by an order of magnitude while also making the platform more stable. Using high-bandwidth cables instead of a wireless connection during real surgery is necessary to improve latency and stability.
  2. Real-time scan modeling, offering painless examination for patients, would be a further improvement. It could be used right before surgery to give the surgeon feedback on the patient and to evaluate the risk of proceeding, reducing the danger of the operation. At the same time, a customized profile for each patient would aid further decision-making. Real-time modeling needs more computing power and efficient algorithms, so adding high-performance computing units and implementing customized algorithms would be the next step to improve the application.
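The latency argument in point 1 can be made concrete with a back-of-the-envelope motion-to-photon budget. All numbers here are illustrative assumptions, not measurements of MEDIVIS hardware; the point is only that transport delay dominates on a wireless link.

```python
# Rough motion-to-photon budget, in milliseconds. Values are assumed for
# illustration: ~11 ms to render a 90 Hz frame, ~5 ms display latency,
# tens of ms for wireless transport vs a few ms for a cable.

def motion_to_photon(render_ms, transport_ms, display_ms):
    """Total delay from head motion to updated pixels."""
    return render_ms + transport_ms + display_ms

wireless = motion_to_photon(render_ms=11, transport_ms=30, display_ms=5)
wired = motion_to_photon(render_ms=11, transport_ms=2, display_ms=5)
print(wireless, wired)  # 46 18
```

Under these assumed numbers, moving from wireless to wired more than halves the end-to-end delay, which is why the review argues for cabled connections in the operating room.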

Conclusion

In healthcare, this XR application offers a more intuitive way to help both patients and doctors through 3D visual interaction. By restoring the third dimension, it makes up for the information lost when anatomy is shown in only two dimensions.


Google Earth VR | Breathtaking VR Experience

Google Earth VR

Do you still remember the wonder of exploring Google Earth for the first time? The amusement of seeing the place you live in 3D, or experiencing a first-person view via Street View in a country you have never visited? Now imagine that in Virtual Reality! That is exactly what Google Earth VR is about: you put on a VR headset and travel the world from the comfort of your own home.

Google Earth is a program that uses satellite imagery to render accurate 3D representations of our planet. Users traverse the world by rotating the globe and zooming in and out for a closer view. The fantastic Street View functionality is included too, letting users stand on a street and soak in the view from a first-person perspective [1]. The VR version was introduced back in 2016 [2] and is currently free on Steam [3], which means any VR headset owner can jump into this amazing experience at no additional cost.

Given how the original Google Earth works, it makes a whole lot of sense that using it in VR is nothing short of incredible. I have tried it on an HTC Vive, and the experience was simply breathtaking. The immersion of catching amazing views around the Earth is greatly complemented by the intuitive controls and functionality the program provides for travelling the world.

Controlling the Earth [4]

Figure 1: View of Google Earth VR Tutorial

When booting up the program for the first time, users are greeted by an outer-space view of Earth and a quick guide to the controls. The software shows an actual representation of both controllers to demonstrate how to use them. Let's run through some of the basic controls together:

  1. Rotate the Earth by holding a button on the controller and dragging the globe around.
  2. Fly in a direction by pointing the controller that way and moving the joystick or pressing the touchpad.
  3. Change your orientation to place the Earth below or in front of you with the press of a button. Having the Earth below you gives a natural perspective of a place, while having it in front of you makes searching for a desired location easier.
  4. See featured places, saved places, and search for places by opening the menu and selecting from the user interface.
  5. Change the time of day by dragging the sky while holding the trigger button.
  6. Enter Street View when a tiny globe appears at the controller: hold the globe closer to your eyes for a quick glance, or press a button to enter Street View fully.
  7. Teleport to a new place by opening a tiny globe on one controller and using the other controller to select the destination.

With these controls, this VR experience enables users to travel to any part of the world and view the place from any angle as if they are there in person.
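A control scheme like the one above is commonly implemented as a dispatch table mapping controller events to handler functions that mutate the session state. The sketch below illustrates that pattern; the event names, handlers, and state fields are invented for illustration and are not Google Earth VR's actual API.

```python
# Toy input-to-action dispatcher for a VR globe viewer.
# Events and state fields are assumptions, not Google Earth VR internals.

def make_dispatcher():
    state = {"time_of_day": 12, "mode": "globe"}

    def drag_sky(hours):
        state["time_of_day"] = (state["time_of_day"] + hours) % 24

    def enter_street_view():
        state["mode"] = "street_view"

    handlers = {
        "grip_drag": lambda: state.update(mode="rotating_globe"),
        "trigger_drag_sky": lambda: drag_sky(6),
        "press_globe": enter_street_view,
    }

    def dispatch(event):
        handlers[event]()          # unknown events would raise KeyError
        return dict(state)         # return a snapshot of the new state
    return dispatch

dispatch = make_dispatcher()
print(dispatch("trigger_drag_sky"))  # time_of_day advances from 12 to 18
print(dispatch("press_globe"))       # mode becomes 'street_view'
```

Keeping every control in one table is what makes schemes like this easy to teach in an in-headset tutorial: the tutorial can simply iterate over the table and demonstrate each binding.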

Why I absolutely loved Google Earth VR

Google Earth VR is a one-of-a-kind experience that makes you realize the potential VR has for the future. It is a perfect demonstration of integrating VR into an existing piece of software to increase immersion. The original software gives the user a bird's-eye view of different places on Earth, but it is confined to a two-dimensional monitor. In VR, you can simply turn your head to look around the environment, which greatly enhances immersion. The way you control your position feels incredible too: you can fly around like an eagle or plant yourself on a street as if you were there in real life. I fell in love with this software when I travelled to places I have never visited, such as the peak of Mount Everest and the Eiffel Tower, and caught the breathtaking views from those places.

The other reason Google Earth VR excited me so much on first use is imagining its potential. If Google can model the entire Earth in 3D and let people access it in VR, imagine future applications where we place our own avatars in a metaverse that perfectly replicates the Earth, traverse anywhere in an instant, and experience other countries' cultures from the comfort of our own homes. Even now, people can use it to plan travel to a different country, checking locations in advance and building a mental map of the places they will visit. Urban planners could also use this technology to experience, in person, what their future buildings and roads will be like.

Why is Google Earth VR engaging?

I believe that you, as a reader, would find the prospect of seeing different places on Earth in VR exciting too. Part of what makes it exciting is its vastness and how unbounded it is. With software of this scale, users are at first in disbelief that they can visit every part of the Earth, even their own home. Once they realize it is entirely possible, their imaginations run wild thinking about the next location to visit. Google Earth VR doesn't bind its users to certain places but gives them limitless freedom to decide where to go next; one could say the only limit is the user's imagination. Furthermore, people are compelled to explore for a very simple reason: Earth is beautiful. There are so many good sights that most people will never experience in person in their lifetime, but this VR experience lets them catch a close representation of them.

We can learn a few things about creating an engaging VR experience from this. One is to make users feel that the VR software contains limitless possibilities, which pushes them to use their creativity and imagination when interacting with it. This engages users because they feel more involved in the VR world they inhabit. A perfect complement to that is a beautiful VR environment that captivates users and keeps them in it.

Great features in Google Earth VR

The controls in Google Earth VR deserve high commendation, as they not only let users reposition themselves easily but make traversal feel extraordinary. It starts with the software's choice to show an actual representation of the controllers instead of some kind of virtual hands, which makes it easy to indicate to users what each button on their controller does. The experience is further enhanced by attaching objects such as a tiny globe to the controller, which the user can hold closer to choose where to teleport or to access Street View. This approach of adding flair to the controllers creates a big impact because the controllers are the closest interactable objects to the user, the ones users constantly look at.

The way the controllers are used is also very intuitive, especially the feature that lets the user fly in the direction a controller is pointed: it is straightforward, yet creates an amazing experience of soaring through the skies. I also especially enjoyed adjusting the time of day by dragging the sky to control the position of the sun and moon, as lighting has a huge impact on the visuals of a location, and users get to see how a place looks at different times of the day.

Lastly, I was amused by a feature in Google Earth VR that limits your field of view to a small circle while you are moving around. I later found out this is "comfort mode", designed to reduce the potential for VR motion sickness. The feature is toggleable, since enabling it trades full immersion for a more comfortable experience for people prone to motion sickness [4].
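At its core, comfort mode is a speed-dependent vignette: the faster the camera moves, the smaller the visible circle becomes. The sketch below illustrates how such a mapping might work; the function name, thresholds and linear interpolation are my own illustrative assumptions, not Google Earth VR's actual implementation.

```python
def comfort_vignette_radius(speed, min_radius=0.3, max_radius=1.0, max_speed=20.0):
    """Map movement speed (m/s) to a visible field-of-view fraction.

    At rest the full view (max_radius) is shown; at or above max_speed the
    view shrinks to a small circle (min_radius), with linear interpolation
    in between.
    """
    t = min(speed / max_speed, 1.0)  # normalised speed, clamped to [0, 1]
    return min_radius + (1.0 - t) * (max_radius - min_radius)

# At a standstill the whole view is visible; at full flying speed only a
# small central circle remains, easing motion sickness.
assert comfort_vignette_radius(0.0) > comfort_vignette_radius(20.0)
```

A real renderer would feed this radius into a fragment shader that darkens pixels outside the circle, but the principle is the same: immersion is traded away in proportion to how much vection the user is experiencing.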

What features can be improved and how?

Despite the amazing feats Google Earth VR has achieved, it still struggles to fully represent our Earth: users will sometimes see blocky textures, texture pop-in, and sparsely detailed areas. Understandably, this is largely a limitation of VR headset hardware and of satellite scanning capability, but in a 3D rendering of the real world, immersion breaks when these occurrences appear. Hopefully, as VR hardware and software progress, we will achieve a closer, more detailed representation of our Earth in the VR world.

The other lackluster part of Google Earth VR is the difficulty of finding specific locations or points of interest. The software has a location search feature, but typing in VR is difficult: you must point at the letters of a virtual keyboard and select them one by one, which can be quite time-consuming. One way to improve the search experience would be to add the voice-to-text functionality commonly found on smartphones so that users can fill in search fields more quickly. Besides that, the software offers few recommendations for points of interest and few place labels of the kind found in Google Maps; it does show road names, but they disappear when you get too close to the surface, making it hard for users to find interesting locations to look at. Good geographic knowledge can be very helpful for finding desired locations, but adding more indicators and labels to this 3D rendering of the Earth would improve the experience of searching for places. The feature should be toggleable, since visual indicators can sometimes break immersion.

Conclusion

Google Earth VR is a breathtaking experience every VR headset owner should absolutely try. It's simply amazing to visit every corner of the world to see what this beautiful Earth has to offer, and this experience is perfectly complemented by well-thought-out VR interaction designs. It is exciting to live in times where VR technology is greatly advancing, and Google Earth VR is undoubtedly one of the cornerstones that show the vast possibilities VR technology has to offer.

[1] R. Carter, “Google Earth VR Review: Explore the world,” 04 October 2021. [Online]. Available: https://www.xrtoday.com/reviews/google-earth-vr-review-explore-the-world/. [Accessed 20 January 2023].

[2] M. Podwal, “Google Earth VR – bringing the whole wide world to virtual reality,” Google, 16 November 2016. [Online]. Available: https://blog.google/products/google-ar-vr/google-earth-vr-bringing-whole-wide-world-virtual-reality/. [Accessed 20 January 2023].

[3] “Google Earth VR on steam,” [Online]. Available: https://store.steampowered.com/app/348250/Google_Earth_VR/. [Accessed 20 January 2023].

[4] A. Courtney, “Google Earth VR controls – movement, Street View & Settings,” 13 March 2022. [Online]. Available: https://vrlowdown.com/google-earth-vr-controls/. [Accessed 20 January 2023].

An Immersive Shopping Experience with Shopify AR

Introduction

Due to the Covid-19 pandemic over the past few years, the use of AR in the retail sector has grown rapidly. It is even estimated that by 2025, up to 4.3 billion people will be using AR on a frequent basis. The top reasons cited by more than 40% of global consumers for not shopping online are not being able to see products in person and not being able to try things out before making a purchase. With AR and VR, fit uncertainty can be reduced, increasing purchase confidence by providing a more immersive shopping experience.

Shopify AR

An example of how consumers can use Shopify AR

Shopify is an e-commerce platform for online stores and retail point-of-sales systems. It allows retailers to set up online stores and provides many different services and solutions for the convenience of both retailers and consumers. One of the services Shopify provides is Shopify AR, bringing a new dimension to customer service with the Augmented Reality (AR) experience. 

Introduced in 2018, Shopify AR allows retailers to create interactive, personal AR experiences for their consumers on iOS devices, where products can be viewed from all angles and at true scale in the Safari web browser.
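The reason this works in Safari without any app install is Apple's AR Quick Look mechanism: an anchor tag with `rel="ar"` that wraps an image and links to a USDZ 3D model opens the model in AR when tapped. The helper below generates such a snippet; the function and the product file paths are hypothetical examples, not part of Shopify's actual API.

```python
def ar_quick_look_link(model_url, preview_image_url, alt_text):
    """Build the HTML snippet that triggers AR Quick Look in iOS Safari.

    Safari recognises an <a rel="ar"> element whose href points to a
    .usdz model and whose first child is an <img>; tapping it opens the
    model in AR instead of navigating to the file.
    """
    return (
        f'<a rel="ar" href="{model_url}">'
        f'<img src="{preview_image_url}" alt="{alt_text}">'
        f"</a>"
    )

# Hypothetical product assets for a retailer's page:
snippet = ar_quick_look_link("/models/armchair.usdz", "/img/armchair.jpg", "Armchair")
```

Because the entry point is just a link plus a 3D asset, even a small store can add a "view in your room" button to an existing product page without building a native app.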

Immersive Shopping Experience

According to research, there are four broad uses of AR technology in retail settings: to entertain customers, educate customers, help customers evaluate product fit, and enhance their post-purchase consumption experience. Shopify AR has use cases at every stage of the customer journey that retailers can leverage.

Rebecca Minkoff shoppers can virtually ‘place’ 3D models in their environment with AR

One of the features is showcasing a product from different angles and at different scales via any iOS device with a camera and Safari. Consumers can also place the object in their own environment using AR, to visualise how the product would look and feel if it were there. This simple feature entertains customers through its ability to transform inanimate 2-dimensional pictures into 3-dimensional, animated and interactive objects, creating fresh experiences that entertain and captivate. Additionally, with the physical environment as the backdrop to the virtual products, consumers can visualise how products would appear in their actual environment and context, helping them evaluate a product more accurately. The brand Rebecca Minkoff also reported that shoppers were 65% more likely to make a purchase after using this feature on its website.

A traditional retail display reimagined in AR.

Shopify AR allows retailers to go beyond showcasing their products and having consumers place them in their environment. Firstly, retailers can choose to recreate a traditional retail display in a virtual setting. There is no need to consider the physical restraints of space and inventory, and consumers can explore the virtual space and products from the comfort of their homes. AR being a rather new technology, engaging AR displays are also an avenue to enhance marketing campaigns and product launches.

Secondly, retailers can create their own immersive local AR experiences to enhance customers' shopping experiences. For example, in a collaboration between the Jordan Brand, Snapchat, Shopify, and Darkstore for a pre-release sneaker during the NBA All-Star game in February 2018, customers invited to the exclusive event were prompted to scan a mobile Snap code that revealed the new sneakers, which could then be purchased and delivered to them the same day.

Demonstration of how a Gramovox floating record player is installed using AR animation.

Lastly, the platform can also be used to enhance consumers' post-purchase experience. It helps put products into context and provides more valuable, personalised information with clear and simple graphics. For instance, after a customer receives a product, an interactive tour can show them how to assemble and install it. In the video above, each step of the installation process is slowly animated, adding a 3D visual that helps customers better understand the included instruction manual.

If brands and retailers can offer this post-purchase experience, customers could get more immediate and long-term value from the product, which improves their purchase experience and general satisfaction.

Why do you like this XR application?

As an avid online shopper, I find that the added AR experience would greatly enhance my online shopping. On most e-commerce platforms, only 2D pictures are shown, usually in much better lighting and staged in environments unlike the ones I would actually use the products in. Hence, the product images shown may not be an accurate depiction of the product itself. Having previously received products that did not match their depiction, I would find even the simplest feature of viewing a product from all angles and zooming in on its texture and details a great help. I would also be more likely to make a purchase if I could see how the product would look where I intend to use it, especially for furniture and larger items.

Object View
AR view

The feature is also easy to use, as I could easily switch between the AR and object modes with the click of a button. If I needed to move the object, all I had to do was pinch it with two fingers and drag it where I desired. The AR mode shown in the image lets me picture the product in my own space and interact with it. If I find that it is the right fit, I am more likely to make a purchase than I would be without the AR experience. If I like how the product fits in my space, I can also take a picture using the button on the right to save the image.

What features are well done?

In general, Shopify AR has a simple interface and is easy to use. As mentioned earlier, the control for switching between the AR and object views is simple. Both views fit the mental model of how such functions should behave and carry no extra functions that might confuse the user, making the feature intuitive to pick up with little to no learning required.

The barrier to entry for retailers to adopt the simple AR functions is low, as they can easily embed the feature in their own websites. This makes it much easier for smaller-scale retailers to integrate simple AR features without having to invest the time and effort to create their own applications (e.g. Ikea Place).

It is also easy for consumers to use, since Safari and an iOS device are all that is needed; no additional downloads or installs are required.

Potential Improvements

While Shopify AR greatly enhances the customer experience, there is still room for improvement.

Firstly, Shopify could try to include beauty categories, giving users the option to see how they would look with different hair colours, hairstyles, makeup and nails.

Secondly, Shopify could also provide templates to lower the barrier to entry for smaller-scale retailers to create their own virtual shops and spaces. I have observed that most retailers that have collaborated with Shopify on virtual or virtual-local experiences are larger brands, as they have the resources to create these spaces for their potential consumers. Smaller retailers tend to have no AR experience at all, or opt only for the simplest features, depriving potential customers of a fully immersive shopping experience.

Thirdly, if Shopify AR could be integrated with e-commerce platforms with large user bases, such as Shopee and Lazada, it would benefit both sellers and buyers on those platforms. Even the simplest Shopify AR features would give consumers a much better shopping experience, especially for larger items like furniture. This could also reduce the need for customers to return furniture of the wrong size, increasing customer satisfaction.

Lastly, Shopify could extend this feature to non-iOS devices and other web browsers, since 71.8% of mobile phone users are on the Android operating system while only 27% are on iOS. By accommodating Android, the Shopify AR user base would grow rapidly, allowing retailers to reach more potential consumers and giving consumers a better shopping experience.

Conclusion

The rise of Augmented Reality technology, accelerated by the pandemic over the past few years, presents great potential for AR in the retail industry. While there is room for improvement, Shopify AR has done well in enhancing users' shopping experience. More research can be conducted into the impact of AR technology on various aspects of consumer behaviour, to further improve its use in the retail sector.

References

Augmented reality brings a new dimension of engagement to the customer experience. (2018, November 15). Retrieved January 18, 2023, from https://www.shopify.com/sg/blog/augmented-reality-commerce

Augmented Reality in Retail A Business Perspective. (n.d.). Retrieved January 16, 2023, from https://www.byteplus.com/en/blog/detail/Augmented-Reality-in-Retail-A-Business-Perspective-

K, S. (2020, May 27). Retail in a new dimension. Retrieved January 16, 2023, from https://medium.com/scapic/retail-in-a-new-dimension-503249c4e46e

Laricchia, F. (2023, January 17). Global Mobile OS Market Share 2022. Retrieved January 20, 2023, from https://www.statista.com/statistics/272698/global-market-share-held-by-mobile-operating-systems-since-2009/#:~:text=Android%20maintained%20its%20position%20as,the%20mobile%20operating%20system%20market.

Tan, Y. C., Chandukala, S. R., & Reddy, S. K. (2022). Augmented reality in retail and its impact on sales. Journal of Marketing, 86(1), 48-66. Research Collection Lee Kong Chian School Of Business.

Lost Judgment

Lost Judgment is an action-adventure video game developed by Ryu Ga Gotoku Studio (RGG Studio), published by Sega and released in 2021. It is a spin-off of, and set in the same universe as, RGG Studio's flagship franchise, Yakuza, and is the sequel to Judgment, which came out in 2018. The game is a single-player experience featuring an expressive combat system and is also heavily story-driven, touching on topics such as bullying, justice and the importance of knowing the truth.

https://yakuza.sega.com/lostjudgment/

Lens 9: Elemental Tetrad

Aesthetics

The game has a realistic art style and mostly uses 3D models and assets. It is set in two cities the player is free to roam around in. The first is Kamurocho, a shady entertainment district based on Kabukicho, a real red-light district in Shinjuku, Tokyo. At night, Kamurocho's many bright lights and signs contrast with its dark roads and alleyways, doing an excellent job of selling the city's culture of nightlife.

The second city is Isezaki Ijincho, based on Isezakicho, a shopping district in Yokohama. Ijincho is much bigger and has a variety of environments such as a high school, a park, and a bar district. It has a very vibrant and varied color scheme that suits a lively city.

However, I feel that Lost Judgment’s biggest strength aesthetically lies in its animations. Yagami’s fighting styles consist of moves inspired by various martial arts such as Aikido, Boxing and Kung Fu with stylish and over-the-top movements that are fun to execute and watch. Each fighting style also has its own combat music to make the player feel energized, and every attack feels impactful and weighty, making the combat not just fun to look at but also satisfying to play.

Technology

Lost Judgment was developed in RGG Studio's Dragon Engine, originally built for Yakuza 6 in 2016. The game was initially a PlayStation exclusive but was eventually ported to PC and is now available for purchase on Steam. This has also allowed the game to be modded more easily: as RGG Studio's engines are highly scalable, players have been able to import old models and animations from earlier games in the series, as well as mod Lost Judgment's animations into RGG Studio's other titles.

Mechanics

Lost Judgment’s primary gameplay loop comes from its combat system, where the player can switch between 4 distinct fighting styles and perform combos on enemies. The system’s appeal mainly comes from the wide variety of moves the player can use, which allows for very expressive and varied combat experiences. There are also heat actions, special moves that deal heavy damage and play cinematic shots of their animations; they consume heat, a resource the player gains by damaging enemies. Furthermore, some moves can inflict effects on enemies such as stun, fear and agony, which open up special heat actions for the player.

Story

Lost Judgment follows the story of Takayuki Yagami, a private detective investigating Akihiko Ehara, a criminal accused of both sexual harassment and murder. The main draw comes from the mystery: the initial evidence used to accuse Ehara of sexual harassment acts as an alibi for him during the murder, as the two crimes seemingly take place at the same time, which would allow him to escape the murder charge with a much lighter sentence. As Yagami works to uncover the truth, the case eventually leads him to the school Ehara’s son once attended, where he discovers a link between Ehara and a series of murders committed against bullies in Kamurocho and Ijincho.

Lens 50: Character

When it comes to character, games by RGG Studio are widely known for having extremely serious and climactic stories while also including silly and over-the-top moments, and Lost Judgment is no exception. For instance, the player can perform a heat action on enemies afflicted with fear where Yagami performs a Kabedon on them, a move associated with a confession of love in Japanese culture, causing the enemy to pass out from fear in response.

Lens 79: Freedom

In Lost Judgment, players are free to roam around the cities of Kamurocho and Ijincho, and are rewarded with interesting substories and activities for doing so. The only instance where the player may not be free to roam the city is during a main mission, where the game has a fixed path and sequence of events it wants the player to follow, so it restricts them to a certain area or level.

Another aspect of the game which gives freedom is its combat system, where the player is free to tackle combat encounters in many ways due to the wide variety of moves and fighting styles. Together with the animations, this makes the game’s combat feel incredibly satisfying and entertaining.

Lens 87: Character Traits

As a private detective and ex-lawyer, Yagami has a strong sense of justice and regards the truth very highly. This can be seen in his actions and dialogue throughout the story. Although he does not go through much of a character arc in this game, his personality can be largely attributed to the events of Judgment, as much of his growth as a character was written in the first game.

The rest of the game’s cast are also quite likeable and fun, and are generally regarded quite highly among fans for their interesting and charming personalities.

Lens 68: Moments

Many of the key moments in Lost Judgment are climactic boss battles against powerful enemies that are also significant to the story. These are elevated by the intense soundtrack, as well as the dynamic intros that RGG Studio is known for. All of these serve to hype the player up for the encounter and do a great job at it.

XR for Construction and Interior Design | Magicplan

Introduction

Magicplan is an AR mobile application that can be used for interior design. In an industry that typically relies on pen and paper or a computer to draw floor plans and design rooms, Magicplan provides a convenient alternative that can be used on the go. Users can "add a room" to the floor plan and, by selecting the "Scan with camera" option, are brought to an interactive AR screen where they scan the corners of their rooms. Magicplan then calculates the room's length, breadth and height and produces a floor plan, with up to 95% accuracy.
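Once the corner positions have been captured, the floor plan's dimensions follow from straightforward coordinate geometry. The sketch below is a hypothetical illustration of that step, not Magicplan's actual algorithm: wall lengths come from the distances between consecutive corners, and the floor area from the shoelace formula.

```python
import math

def wall_lengths(corners):
    """Distance between consecutive corners, closing back to the first."""
    n = len(corners)
    return [math.dist(corners[i], corners[(i + 1) % n]) for i in range(n)]

def floor_area(corners):
    """Shoelace formula: area of the polygon outlined by the corners."""
    n = len(corners)
    twice_area = sum(
        corners[i][0] * corners[(i + 1) % n][1]
        - corners[(i + 1) % n][0] * corners[i][1]
        for i in range(n)
    )
    return abs(twice_area) / 2

# A 4 m x 3 m rectangular room, corners listed in order:
room = [(0, 0), (4, 0), (4, 3), (0, 3)]
print(wall_lengths(room))  # [4.0, 3.0, 4.0, 3.0]
print(floor_area(room))    # 12.0
```

The hard part in practice is estimating the corner coordinates accurately from the AR session in the first place; once they are known, the arithmetic above is enough to produce the floor plan figures.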

Why is Magicplan engaging?

Magicplan has an intuitive, interactive design that makes it beginner-friendly. Instructions are readily available on the AR screen for first-time users. It gives people interested in interior design a unique platform to try designing their rooms to their liking without the hassle of measuring tapes and other physical equipment. Users also get the chance to plan out the rough design of their house before engaging a professional interior designer. This benefits both parties: users can better articulate their vision for the design, and designers have a platform to share their opinions. Doing so reduces the chance of misunderstandings, which would prove costly later if the house were furnished wrongly.

What is Well Done about Magicplan?

Clear markings and measurements placed on objects in the environment

As users scan the room, Magicplan shows the room's measurements on the objects in the AR environment. This makes it very intuitive for users to know which measurement belongs to which part of the room, rather than listing the measurements in a side menu next to the AR screen.

The application also marks out exactly where the start and end points are, allowing users to double-check and ensure that the marked points are correct. If users are unsatisfied, they can easily delete that point and mark it again.

Measurements between some marked points

Realistic visualisation of the room (3D View Mode)

After the user has finished scanning the room, they can design the floor plan accordingly, adding furniture and adjusting its size to their liking. After finishing the design in 2D, users can switch to the 3D mode and see how the room will look. This was a game-changer in an industry that mainly used pen and paper during the initial design stage. Interior designers can now show how a client's room is going to look before any renovation begins, giving clients a concrete visual of the end product rather than leaving it to their imagination.

Connect to Bluetooth sensors

While Magicplan can measure the dimensions of a room via AR, it is only up to 95% accurate. That may not be good enough, considering that the next step after designing the room is an expensive renovation. Mistakes in measurement can result in the wrong number of wall tiles, or a wardrobe too big for the room: expensive mistakes that users would like to avoid. To improve the accuracy of the measurements to 100%, Magicplan allows users to connect a wide variety of Bluetooth laser measurers to aid the process. This helps double-check all measurements and ensure a mistake-free design phase.

Possible improvements?

Corner detection

One of the major flaws I came across while trying Magicplan was the application's inability to detect a corner. As I scanned my room, I could not accurately mark its first corner; the marker always sat slightly above or below it. I am not sure whether this was due to the poor lighting of my room, but I think it will affect the user experience. The clear, distinct corner markers I praised as a "well done" feature end up backfiring: if a marker isn't exactly at the corner, users will keep rescanning that corner until it lands exactly where they want it. The problem is exacerbated because all other points use the first corner point as a reference, so if the first point sits slightly above the corner, the lines drawn between corners look elevated on the AR screen. This may affect the accuracy of the measurements, especially the height. A possible way to avoid this is for the application to detect corners automatically.
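One geometric route to automatic corner detection, sketched hypothetically below, is to use the wall and floor planes that AR frameworks can already detect and compute the corner as the point where three planes meet, i.e. the solution of the 3x3 linear system n_i . p = d_i. None of this is Magicplan's actual implementation; it only illustrates the idea.

```python
def plane_intersection(planes):
    """Intersect three planes given as (normal, d), where normal . p = d.

    Solves the 3x3 linear system with Cramer's rule; returns None when
    the planes are (near-)parallel and no unique corner point exists.
    """
    (a, d1), (b, d2), (c, d3) = planes

    def det3(r1, r2, r3):
        # Determinant of a 3x3 matrix given as three row tuples.
        return (r1[0] * (r2[1] * r3[2] - r2[2] * r3[1])
                - r1[1] * (r2[0] * r3[2] - r2[2] * r3[0])
                + r1[2] * (r2[0] * r3[1] - r2[1] * r3[0]))

    det = det3(a, b, c)
    if abs(det) < 1e-9:
        return None
    ds = (d1, d2, d3)
    # Cramer's rule: replace each column of the coefficient matrix
    # with the constant terms.
    x = det3((ds[0], a[1], a[2]), (ds[1], b[1], b[2]), (ds[2], c[1], c[2])) / det
    y = det3((a[0], ds[0], a[2]), (b[0], ds[1], b[2]), (c[0], ds[2], c[2])) / det
    z = det3((a[0], a[1], ds[0]), (b[0], b[1], ds[1]), (c[0], c[1], ds[2])) / det
    return (x, y, z)

# Floor z = 0, wall x = 2, wall y = 3 meet at the corner (2, 3, 0):
corner = plane_intersection([((0, 0, 1), 0), ((1, 0, 0), 2), ((0, 1, 0), 3)])
```

Snapping the user's marker to the computed intersection would remove the "slightly above or below the corner" drift, since the corner no longer depends on where the user happens to point.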

Conclusion

Magicplan is an interesting and ground-breaking application that aids interior design and construction. It uses the potential of AR to improve an industry that mainly relies on pen and paper. Although there are some kinks to be ironed out, it is a useful application for anyone interested in interior design or working in that field.

References

Augmented reality apps archives. Indovance Blog. (2022, August 11). Retrieved January 19, 2023, from https://www.indovance.com/knowledge-center/tag/augmented-reality-apps/

Extended reality in construction – a new frontier for the AEC Industry. Indovance Blog. (2022, September 7). Retrieved January 19, 2023, from https://www.indovance.com/knowledge-center/extended-reality-in-construction/

Magicplan Help center. Magicplan Help Center. (n.d.). Retrieved January 19, 2023, from https://help.magicplan.app/