Experiential Design // Task 1: Trending Experience

21/4/2025 - 15/8/2025 / Week 1 - Week 14
Angel Tan Xin Kei / 0356117
Major Project 1 / Bachelor of Design (Hons) in Creative Media 



⋆ ˚。⋆୨୧˚ Index ˚୨୧⋆。˚ ⋆
  • Instruction
  • Lecture
  • Task 1: TRENDING EXPERIENCE 
  • Feedback
  • Reflection

⋆ ˚。⋆୨୧˚ Instruction ˚୨୧⋆。˚ ⋆


TASK 1: TRENDING EXPERIENCE 

Instructions: 
  • Explore the current popular trends in the market to gain a better understanding of the technologies and the knowledge needed to create content for them.
  • Conduct research and experiments to find out features and limitations, which will later allow us to decide which technologies to proceed with in the final project.
  • Complete all exercises to demonstrate understanding of the development platform fundamentals.
  • Submit the link to your blog post, making sure all the exercises are updated on your blog.

⋆ ˚。⋆୨୧˚ Lecture ˚୨୧⋆。˚ ⋆

Week 1/ Introduction to Module

In the first week, Mr. Razif introduced the subject outline and outlined his expectations for the course. We were shown examples of previous student projects, which helped us understand the concept of augmented reality (AR) and gain some ideas for the following tasks. He also explained the various types of AR experiences and the technologies used to develop them, guiding us through the basics of designing an AR concept. Through this session, we developed a solid understanding of the distinctions between AR, VR, and MR, learned to identify AR and MR applications, created our own AR experience, and even built a simple AR app during the lecture.

  • Augmented Reality (AR): Enhances the real, physical world by overlaying digital content.
  • Mixed Reality (MR): Combines the physical and digital worlds, allowing interaction between real and virtual elements.
  • Virtual Reality (VR): A fully immersive digital environment, separate from the physical world.
  • Extended Reality (XR): A broad term encompassing AR, MR, and VR technologies.
Fig 1 Differences between AR, MR, and VR (Week 1)

Week 2/ Designing Experiences using User Journey Maps
During class, Mr. Razif divided us into four groups of approximately seven members each. Our task was to select a location, create a user journey map that highlighted the user experience, identified pain points, and proposed solutions, and then present our findings before the class ended.

My group chose a theme park as our scenario. We decided to use Figma as our design tool and found a well-structured template that helped us effectively organize and present our ideas.

In our journey map, we visualized the user experience within the Tokyo Disneyland theme park, as most of us had been there before, moving through the following stages:

  • Parking Space
  • Ticket Counter
  • Main Entrance Photo Spot
  • Park Navigation
  • Attraction Queue
  • Attractions
  • Parades & Shows
  • Dining
  • Merchandise
  • Night Finale
  • Exits & Memories
We also included gain points, pain points, and solutions, including the integration of augmented reality to enhance user engagement and flow.
Fig 3.1 Miro Link of Tokyo Disneyland

Self-Reflection:
Creating the Tokyo Disneyland journey map was a valuable learning experience that deepened my understanding of a visitor's journey through a theme park. By examining each stage from parking to the exit, I developed greater insight into how user interactions and emotions shift throughout the visit. Mr. Razif's suggestion to explore augmented reality solutions pushed us to go beyond conventional ideas and think creatively about how features like interactive maps, virtual queues, and immersive content could enhance the overall experience, especially by keeping visitors engaged during queuing or buffer time. Collaborating with my group encouraged rich idea-sharing, helping us uncover various user pain points and devise well-rounded solutions. This experience highlighted the need to align technological innovations with user expectations to deliver a smoother and more engaging journey. Ultimately, this project sharpened my user journey mapping skills and reinforced the importance of user-focused design in shaping memorable experiences.

Week 3/ Introduction to Unity

Fig 3.2 Week 3 Lecture

Task 1: Identify What Type of XR Experience? Is it AR or MR? Describe Why?

Objective: The purpose of this task is to better compare the difference between Augmented Reality (AR) and Mixed Reality (MR) by identifying which type of XR experience is demonstrated in a given example.

Activity 1: Launch Google on phone to search for animals to view in 3D (View in AR) 

Objective: To conceptualise AR using Google's feature on a smartphone.
I love hamster :3
Fig 3.3 Hamster accompanying me in Lesson

Activity 2:
Imagine a scenario in either of the two places. What would the AR experience be, and what visualizations could be useful? What do you want the user to feel?



Fig 3.4 Group Exercises

Scenario Overview:

Our group has chosen a gym setting as the basis for our AR experience. Many gym-goers—especially beginners—struggle with understanding how to correctly use workout machines. This often leads to improper form, inefficient workouts, or even injury.

AR Experience Implementation:

We designed an augmented reality (AR) solution that allows users to scan gym equipment with their phones. Once scanned, the AR system displays a virtual trainer or step-by-step instructions demonstrating the correct usage of the machine. This includes animated guidance on posture, movement, and targeted muscle groups.

Extended Visualization:

The AR overlay features interactive 3D models animated in real time, showing how each machine should be used. It also provides tips on recommended sets, reps, and the muscle areas each exercise targets. Based on the equipment present in the gym, the system can suggest personalized workout routines.

User Experience and Emotion:

The primary goal is to make users feel confident and empowered during their workouts. With clear, instant guidance, users gain practical knowledge that helps them exercise safely and effectively. The AR experience enhances their sense of security and motivation, reducing anxiety and the risk of injury.

Presentation Slides:



Self-Reflection:

In this project, our team developed the Gym AR Trainer concept, aimed at helping beginners navigate the gym with greater ease and confidence. This experience deepened my understanding of how augmented reality can deliver real-time assistance and visual feedback, especially in environments where users might feel unsure or self-conscious. Designing the AR system highlighted the value of tailored support and clear, intuitive visuals to provide hands-free guidance, eliminating the need for users to seek help directly. Working on features such as the Rep & Set Tracker and the Virtual Personal Trainer helped me explore how AR can reduce workout anxiety, improve precision, and boost user confidence. Overall, this project demonstrated how thoughtful experiential design can enhance everyday routines, making physical activity more accessible, interactive, and user-friendly through technology.

AR Design Workshop 1 - AR Experience Development (Unity & Vuforia)

This week, we installed Unity on our laptops as the main platform for developing our AR applications. Before class, we were required to complete the MyTimes tutorial, which introduces the fundamental steps for creating a basic AR experience.

The tutorial involved scanning an image marker that would trigger two AR elements: a 3D cube and a video plane. For my project, I used a selfie as the image marker and linked it to an online video. By carefully following each step of the tutorial, I was able to ensure that the AR elements appeared correctly during testing and functioned as expected.

Using the Vuforia Engine:

  • Registered a Vuforia account

  • Selected version 10.14 and downloaded the Vuforia Engine

  • Added Vuforia to my Unity project (or upgraded to the latest version if already installed)

  • After the download was complete, I double-clicked the file and imported it into Unity

This hands-on exercise helped solidify my understanding of how to integrate Vuforia with Unity and build a simple yet functional AR experience from scratch.

 
Fig 3.5 Download Vuforia Engine (Week 3)

In this session, we explored how to build a basic augmented reality (AR) experience using Unity and the Vuforia Engine. The process involved several essential steps:

1. Initializing the Unity Project:

We started by creating a new project in Unity, selecting the Universal 3D template, which is compatible with AR development and supports the Universal Render Pipeline (URP).

2. Integrating Vuforia SDK:

To enable AR features, we imported the Vuforia SDK into Unity. This toolkit facilitates image recognition and allows 3D objects to be displayed when specific images are detected.

3. Scene Construction:

We then set up the scene with key components:

  • ARCamera: Responsible for recognizing the image target and aligning virtual content with it.
  • Image Target: A designated visual marker that triggers the AR experience (see the script sketch after this list).
  • 3D Object: A simple cube was added as the 3D content to appear when the image target is identified.
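
To make this more concrete, here is a minimal script sketch of how the Image Target's tracking events can be hooked, assuming Vuforia's DefaultObserverEventHandler component (added to the Image Target by default) and a hypothetical TargetStatusLogger class:

using UnityEngine;
using Vuforia; // assumes the Vuforia Engine package has been imported

// Hypothetical helper: logs when the Image Target is found or lost.
// Attach to the ImageTarget GameObject, next to the
// DefaultObserverEventHandler that Vuforia adds by default.
public class TargetStatusLogger : MonoBehaviour
{
    void Start()
    {
        var handler = GetComponent<DefaultObserverEventHandler>();
        handler.OnTargetFound.AddListener(() => Debug.Log("Target found"));
        handler.OnTargetLost.AddListener(() => Debug.Log("Target lost"));
    }
}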

4. Creating the Vuforia Target Database:

Using the Vuforia Developer Portal, we uploaded our chosen image target, named hamster, which received a 2-star rating for tracking accuracy. We then generated and downloaded a target database to import into our Unity project.

Fig 3.6 Hamster as the target image imported into the Vuforia Engine (Week 3)


5. Configuring the AR Elements:

In Unity, we positioned the cube over the image target so that it would be visible when the target was detected. We tested this interaction using Unity’s simulator to preview how the AR content would appear.

6. Building and Testing on Device:

Finally, we deployed the AR scene to a mobile device to verify that the cube correctly appeared and aligned with the image target in a real-world setting.

Fig 3.7 Final Outcome: Cube appears on screen (Week 3)

Self-Reflection : AR Development Experience

This hands-on AR development session was an insightful introduction to working with Unity and Vuforia. It helped me understand the process of triggering and displaying AR content through image recognition. Initially, coordinating the setup between Unity and the Vuforia platform seemed complicated, especially in terms of database management and aligning configurations. However, once everything was correctly connected, witnessing the AR object come to life on a mobile device was incredibly rewarding. Collaborating with classmates and referencing the tutorial video helped resolve confusion and clarified each step of the process. I also realized how critical precise placement of assets is, as small misalignments can negatively affect the AR experience. Overall, this activity significantly increased my confidence in creating AR applications and provided a solid foundation for developing more complex AR projects in the future.


Week 4/ Markerless AR Experience

Exercise 1: User Control & UI

In this session, we advanced our AR development by focusing on screen navigation, building on the previous tutorial. The main goal was to incorporate interactive buttons that let users switch between screens and control the display of AR content. We continued to use the Spiderwoman image target to maintain consistency throughout the project.

1. Scene Initialization:

We began by creating a fresh scene in Unity, opting for the Basic (Built-in) template to ensure a clean and straightforward user interface.

2. UI and Canvas Configuration:

A Canvas was added to the scene to contain all user interface elements. Within the Canvas, we inserted a Panel to serve as the background layer. We then renamed the first button to BTN_HIDE and changed its label to “HIDE”, and did the same for the second button, renaming it BTN_SHOW with the label “SHOW”. This was done by expanding each button object and editing the child Text (TMP) component in the Inspector panel.

3. Button Interactivity:

We used Unity’s UI Button component to define the functionality of each button. Pressing the Hide button concealed the AR object, while the Show button revealed it again. This allowed users to interactively control what appears on the screen, simulating a simple navigation system.

4. Scripting the Behavior:

We attached a script using the SetActive method to control the cube’s visibility. Each button was connected to a specific function within the script, allowing users to toggle the panel and cube based on their inputs. We assigned the GameObject containing the AR cube to each button’s OnClick() field and then used Unity’s built-in GameObject.SetActive(bool) method: for the “HIDE” button, SetActive(false) makes the AR cube disappear; for the “SHOW” button, SetActive(true) makes it reappear.
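
For reference, here is a minimal sketch of the kind of script described above, using hypothetical names (CubeVisibilityController, arCube); the cube is assigned in the Inspector and each button’s OnClick() is wired to the matching method:

using UnityEngine;

// Sketch: toggles the AR cube's visibility from the UI buttons.
// Wire BTN_SHOW's OnClick() to ShowCube and BTN_HIDE's to HideCube.
public class CubeVisibilityController : MonoBehaviour
{
    public GameObject arCube; // the cube, assigned in the Inspector

    public void ShowCube() { arCube.SetActive(true); }
    public void HideCube() { arCube.SetActive(false); }
}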

5. Testing and Troubleshooting:

We tested the scene to confirm that the buttons worked correctly and that the AR content continued to respond appropriately to the hamster image target.

Fig 4.2 Final Outcome: Square was hidden (Week 4)

Fig 4.3 Final Outcome: Cube and Square appeared together (Week 4)

Self-Reflection: Screen Navigation
This session offered valuable insight into how user interface elements can enhance interactivity within AR applications. Setting up and linking the buttons to control AR visibility gave me practical experience in integrating UI with AR features in Unity. Initially, aligning the buttons with the C# script posed some challenges, particularly in ensuring responsive behavior. Through testing, I also learned how small adjustments in Canvas properties could influence usability, highlighting the importance of accurate alignment and scaling. Overall, this exercise deepened my understanding of screen navigation in AR, showing how even simple interactions can improve user engagement and control.

Exercise 2: Animation of 3D Cube & Controlling Movement

In this session, we built on the previous exercise by adding animation to the 3D cube and giving users on-screen controls to play and stop it.

1. Maintaining the Scene:

We used the same AR setup with the Spiderwoman image target and retained the existing cube as the animated object.

2. Creating an Animation Clip:

Using Unity’s Animation tab, we created a new clip named CubeAnimation, incorporating basic movements such as rotation, scaling, and position changes to give the cube more visual appeal. Keyframes were inserted to define specific animation sequences.

3. Animator Controller Setup:

An Animator Controller was created and assigned to the cube. This allowed us to manage the animation’s behavior and transitions within Unity’s Animator window.

  • When the "PLAY" button is clicked, we enabled the Animator component to be ticked and the cube’s animation to play. 
  • When the "STOP" button is clicked, we disabled the Animator component to be unticked, which stopped the animation playback.
  • When the "HIDE" button is clicked, we SetActive(true) to make the AR Cube reappear.

4. Triggering the Animation:

The animation was configured to play when the AR image target was detected. Additional UI buttons were added to allow users to play and stop the animation manually, further integrating user input.
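
One way to wire playback to detection, sketched under the same assumptions as above (Vuforia’s DefaultObserverEventHandler plus the hypothetical CubeAnimationController), is to subscribe the play function to the target-found event:

using UnityEngine;
using Vuforia;

// Sketch: starts the cube's animation automatically when the target is found.
// Attach to the ImageTarget GameObject.
public class PlayOnTargetFound : MonoBehaviour
{
    public CubeAnimationController controller; // hypothetical controller from the sketch above

    void Start()
    {
        GetComponent<DefaultObserverEventHandler>()
            .OnTargetFound.AddListener(controller.PlayAnimation);
    }
}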

5. Testing and Debugging:

We tested the functionality to ensure that the animation played smoothly upon detection of the image target and responded accurately to user commands.


Fig 4.4 Connecting Functions to Buttons (Week 4)

Self-Reflection – Cube Animation

Incorporating animation into our AR scene significantly enhanced the user experience by adding motion and responsiveness to the content. This session allowed me to explore how animation can make digital objects feel more dynamic and engaging. Fine-tuning the animation flow and achieving smooth transitions proved to be a learning curve, as even slight timing differences affected the visual output. Adding user-triggered controls through buttons demonstrated how users can directly influence AR elements, leading to a more interactive experience. Overall, this part of the project reinforced the role of animation in capturing attention and conveying action, forming a crucial step toward creating richer, more immersive AR environments.


⋆ ˚。⋆୨୧˚ Task 1 ˚୨୧⋆。˚ ⋆

A. Ideation:

During the first week, Yuk Guan and I worked together as a pair for this task. We brainstormed and came up with a total of six ideas to suggest to our lecturer. For every idea, we wrote down the title, the problem it solves, the solution we suggest, the target audience, the steps or process, the advantages (pros), the disadvantages (cons), and the feedback we gave each other. Below are our ideas, documented in a Google Doc.

Google Docs Link: https://docs.google.com/document/d/1Lg4c1uSZ8CLwflM3ud4_vZObHbC-0IATCjV4znpDvl4/edit?tab=t.0#heading=h.tbdxroct9z1y

Mr. Razif also mentioned that we could try out his GPT, which gives feedback based on our proposed ideas, so here is the feedback from the TDS XD Assistant.

🔍 Reflection on AR Trends 
Over the past few weeks, I explored Unity and Vuforia through exercises and tutorials. I experimented with both marker-based AR (using Image Targets) and markerless AR (Ground Plane), observing how they behave across different lighting and surfaces. I learned that markerless AR offers more flexibility in real environments, while marker-based AR is excellent for location-triggered or poster-based experiences. These insights shaped the three project ideas below. 

B. Finalised Proposal: 
 
Then, after consulting the TDS XD Assistant and making sure my ideas did not clash with my partner Phoon Yuk Guan’s, I started to focus on my research, structured as follows:
1. Introduction 
2. Ideation 
3. Target Audience 
4. Problem Statement 
5. Proposed Solution 
6. Goals 
7. Visual Mockup & Sketch

Document Link: 

Finalized Document: 



⋆ ˚。⋆୨୧˚ Feedback ˚୨୧⋆。˚ ⋆

Week 2 Consultation

1. Culture Heritage

This idea is about scanning pictures in books or artwork. After scanning, an animation will pop up to explain the history and meaning behind it—like in a museum. But the lecturer said it’s too common and similar to other ideas that already exist.

2. Pet Care Trainee

This app helps people who want to own a pet. It teaches how to feed, clean, and take care of animals through a fun game. But the lecturer said it feels like a normal quest game and is not very special or new.

3. AR Study Buddy

The idea is to have a virtual study friend that works like ChatGPT to answer questions and help you study. But the problem is you have to keep holding your phone while studying, which is uncomfortable and not very practical.

4. AR Cooking Guide

This app scans fruits or vegetables and shows step-by-step cooking instructions. But food changes every day, so it’s hard for the camera to scan it correctly. The lecturer said it might be better to scan kitchen tools like an oven instead.

5. AR Planting Tips

This idea helps people take care of plants. You scan the plant, and the app tells you what’s wrong with it. But the problem is the app may not always recognize the plant clearly because of light or camera angle, so the results might be wrong.

6. AR Furniture Placement

Like the IKEA app, this idea lets users place virtual furniture in their room to see how it looks. But users might find it hard to measure the furniture size correctly, so the placement might look confusing or not match real size.


⋆ ˚。⋆୨୧˚ Reflection ˚୨୧⋆。˚ ⋆

Experience:

Collaborating with my teammate to conceptualize the AR trend experiences was both inspiring and deeply rewarding. I aimed to create an experience that fused imagination, emotional resonance, and interactive technology. Influenced by traditional pet simulation games, culture heritage games, and AR study experiences, I imagined a world where users could form real-time connections with magical pets in their everyday environments. The development process enabled me to weave together narrative elements, gamified caregiving, and augmented reality, resulting in an experience that felt both creatively rich and emotionally engaging.

Observation:

While exploring the landscape of augmented reality experiences, I noticed that many AR applications prioritize visual novelty or location-based interaction but often lack emotional depth or sustained user engagement. Despite advancements in AR technology, a significant number of apps still offer limited interactivity or storytelling, leaving users with short-lived experiences. I also observed that users increasingly seek more meaningful, emotionally resonant interactions in digital environments—experiences that go beyond entertainment to provide a sense of connection or companionship. This highlighted an opportunity to create an AR experience that balances immersive technology with emotional engagement and long-term value.

Findings:

The ideation process revealed that combining magical fantasy with real-world interaction significantly enhances user engagement and long-term interest. I also found that integrating physical elements, such as scanning a tangible potion book, strengthens the link between the digital and physical worlds, making the experience more immersive and memorable. In the end, the concept evolved into something both enchanting and meaningful: a digital companion that brings joy, emotional connection, and care through technology.
