Experiential Design / Task 4: Final Project & E-Portfolio

14/7/2025 - 28/8/2025 / Week 12 - Week 14

Angel Tan Xin Kei / 0356117

Experiential Design / Bachelor of Design (Hons) in Creative Media 

Task 4: Final Project & E-Portfolio



⋆ ˚。⋆୨୧˚ Index ˚୨୧⋆。˚ ⋆
  • Instruction
  • Task 4: Final Project & E-Portfolio
  • Feedback
  • Reflection

⋆ ˚。⋆୨୧˚ Instruction ˚୨୧⋆。˚ ⋆

Module Information Booklet

Timeframe: Week 12  – Week 14
Deadline: Week 14

Description: 
A) Final Project – Completed Experience – 30 %

Students will synthesise the knowledge gained in Tasks 1, 2 and 3 and apply it in Task 4. Students will create and integrate visual assets and refine the prototype into a complete, working and functional product experience.

B) E-Portfolio - 10 %
Students describe and reflect on their social competencies within the design studio context, supported by evidence. Reflect on how to empathize with others within group settings, interact positively within a
team and foster stable and harmonious relationships for productive teamwork. The reflective writing is part of the TGCP.

Requirements: 
  1. Project file and Folders
  2. Application installation files (APK for Android, iOS build folder for iOS/iPhone)
Submission: 
  1. Online posts in your E-portfolio as your reflective studies
  2. Video walkthrough (Presentation)

⋆ ˚。⋆୨୧˚ Task 4: Final Project & E-Portfolio ˚୨୧⋆。˚ ⋆

Figma Link: https://www.figma.com/design/OyLme6jzrYzZSu9zWZrnPj/Focus-Space?node-id=732-294&t=W95lTgomZTaGSIoT-1

A. Study Scene

For this scene, the main functions to be added are a study panel and a break panel, each with a countdown timer that lets users set how long they want to study or take a break, and start or pause the timer. Furthermore, the Study Now and Break Now buttons only appear once the user scans the study material and the app detects it.

1. Creating Study Panel & Break Panel

I created two main panels: StudyPanel and BreakPanel.

Each one is designed as a separate panel layout under a shared UI Canvas in World Space, so I could freely reposition and scale them in the 3D scene.

For both StudyPanel and BreakPanel, I created the same internal layout using Unity’s UI system:

  • A Title Label (TextMeshProUGUI) that says “Study Timer” or “Break Timer”
  • A Countdown Timer Display in the center
  • A row of control buttons:
    • Start
    • Stop
    • Restart
    • + (Add 1 Minute)
    • − (Subtract 1 Minute)

Study Panel and Break Panel Internal Layout on the Hierarchy

Study Timer

Break Timer

To prepare the panels for interaction and make sure the buttons inside each panel are clickable, I added:

  • A Box Collider on each panel to detect pointer or raycast hits in 3D space
  • A Canvas Group to control visibility and interaction toggles later if needed

This setup ensures that each panel is treated like a proper 3D object with UI logic and can respond to input events like clicks or taps.


2. Countdown Timer Logic

I created a custom script, CountdownTimerUI.cs, to handle the timer logic, UI interaction, and panel switching. It is attached by dragging it onto the ScanCanvas GameObject.


using UnityEngine;
using UnityEngine.UI;
using TMPro;

public class CountdownTimerUI : MonoBehaviour
{
    public TextMeshProUGUI timeText1;
    public TextMeshProUGUI timeText2;

    public Button startButton1, startButton2;
    public Button stopButton1, stopButton2;
    public Button restartButton1, restartButton2;
    public Button plusButton1, plusButton2;
    public Button minusButton1, minusButton2;

    private int totalSeconds = 1500; // Default 25 minutes
    private int originalSeconds;
    private bool isRunning = false;
    private float countdown;

    void Start()
    {
        originalSeconds = totalSeconds;
        UpdateTimeText();

        // Hook up all button events
        startButton1.onClick.AddListener(StartTimer);
        startButton2.onClick.AddListener(StartTimer);

        stopButton1.onClick.AddListener(StopTimer);
        stopButton2.onClick.AddListener(StopTimer);

        restartButton1.onClick.AddListener(RestartTimer);
        restartButton2.onClick.AddListener(RestartTimer);

        plusButton1.onClick.AddListener(AddMinute);
        plusButton2.onClick.AddListener(AddMinute);

        minusButton1.onClick.AddListener(RemoveMinute);
        minusButton2.onClick.AddListener(RemoveMinute);
    }

    void Update()
    {
        if (isRunning)
        {
            countdown -= Time.deltaTime;
            if (countdown <= 0)
            {
                countdown = 0;
                isRunning = false;
            }
            UpdateTimeText((int)countdown);
        }
    }

    void StartTimer()
    {
        // Resume from a pause if a countdown is in progress;
        // otherwise start fresh from the configured time
        if (countdown <= 0f)
        {
            countdown = totalSeconds;
        }
        isRunning = true;
    }

    void StopTimer()
    {
        isRunning = false; // pause; countdown keeps its value
    }

    void RestartTimer()
    {
        isRunning = false;
        countdown = 0f; // discard any paused countdown
        totalSeconds = originalSeconds;
        UpdateTimeText();
    }

    void AddMinute()
    {
        if (!isRunning)
        {
            totalSeconds += 60;
            originalSeconds = totalSeconds;
            UpdateTimeText();
        }
    }

    void RemoveMinute()
    {
        if (!isRunning && totalSeconds > 60)
        {
            totalSeconds -= 60;
            originalSeconds = totalSeconds;
            UpdateTimeText();
        }
    }

    void UpdateTimeText()
    {
        UpdateTimeText(totalSeconds);
    }

    void UpdateTimeText(int t)
    {
        int minutes = t / 60;
        int seconds = t % 60;
        string timeString = $"{minutes:00}:{seconds:00}";
        timeText1.text = timeString;
        timeText2.text = timeString;
    }
}

After attaching the script to the ScanCanvas, I linked it to the UI by assigning all the references in the CountdownTimerUI.cs component in the Inspector.


3. Panel Switch Button

To toggle between panels, I added two buttons inside ScanCanvas, but outside of the panels.

Created two buttons:

  • StudyTabButton → Text: “Study Now”
  • BreakTabButton → Text: “Break Now”

These buttons determine which panel is visible and which timer mode is active. I used the simplest approach in Unity to toggle panel visibility: each button's OnClick() event sets one panel's GameObject active and the other's inactive. Clicking the Study Now button activates the study panel and deactivates the break panel, while clicking the Break Now button activates the break panel and deactivates the study panel.

Study Button On Click

Break Button On Click
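To show the same wiring in script form, here is a hedged sketch of a small component that mirrors the Inspector OnClick() setup described above (the class and field names are illustrative; the actual project wires this directly in the Inspector):

```csharp
using UnityEngine;

// Illustrative sketch only; the project wires this directly in the Inspector.
public class PanelSwitcher : MonoBehaviour
{
    public GameObject studyPanel; // assigned in the Inspector
    public GameObject breakPanel;

    // Hooked to StudyTabButton's OnClick()
    public void ShowStudyPanel()
    {
        studyPanel.SetActive(true);
        breakPanel.SetActive(false);
    }

    // Hooked to BreakTabButton's OnClick()
    public void ShowBreakPanel()
    {
        breakPanel.SetActive(true);
        studyPanel.SetActive(false);
    }
}
```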


4. Showing Study Now and Break Now Button Only When Target is Found

To let the user see the Study Now and Break Now buttons only when the target image is found, I modified the Image Target's tracking logic using the DefaultObserverEventHandler:

  • When target is found → activate ScanCanvas
  • Alternatively, I also used Unity Events to trigger buttons like StudyNow and BreakNow via:
    • GameObject.SetActive → Enable UI on detection

This ensures the Study and Break modes only appear when the camera detects the physical marker.
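The same logic can be sketched as a script whose public methods are assigned to the DefaultObserverEventHandler's target found/lost Unity Events (the class name and field below are assumptions for illustration):

```csharp
using UnityEngine;

public class ScanCanvasToggle : MonoBehaviour
{
    public GameObject scanCanvas; // holds the Study Now / Break Now buttons

    // Assigned to the Image Target's OnTargetFound event
    public void HandleTargetFound()
    {
        scanCanvas.SetActive(true);
    }

    // Assigned to the Image Target's OnTargetLost event
    public void HandleTargetLost()
    {
        scanCanvas.SetActive(false);
    }
}
```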


B.  Quiz Scene

After completing the Study Scene and Break Timer system, I moved on to building the Quiz Scene, which required a fully working interactive quiz system that activates only when an image target is detected. My main goals were:

  • Create a multiple-choice quiz interface
  • Make the quiz functional with questions and answers loaded dynamically
  • Only show the quiz when a physical image target is scanned using Vuforia
  • Track correct/incorrect answers and show results

1. Activating Quiz When Image Target is Detected

To ensure that the quiz panel only appears when a physical quiz marker is scanned, I used Vuforia’s Image Target tracking.

I added a DefaultObserverEventHandler script to my Image Target. Then, inside the Unity Editor, I created a QuizTarget.cs script that listens for OnTargetFound() and OnTargetLost() events.

In the script, I enabled or disabled the QuizCanvas based on tracking state:

Image Target Event Handler for Heart


Since I have two image targets, one for the heart quiz and one for the lung quiz, detecting the heart image target hides the lung quiz panel and shows the heart quiz panel, and vice versa.

Image Target Event Handler for Lung
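A minimal sketch of what QuizTarget.cs could look like, assuming each target's canvases are assigned in the Inspector (the field names are illustrative):

```csharp
using UnityEngine;

public class QuizTarget : MonoBehaviour
{
    public GameObject ownQuizCanvas;   // e.g. the heart quiz panel
    public GameObject otherQuizCanvas; // e.g. the lung quiz panel

    // Called via the DefaultObserverEventHandler's OnTargetFound event
    public void HandleTargetFound()
    {
        ownQuizCanvas.SetActive(true);
        otherQuizCanvas.SetActive(false);
    }

    // Called via the OnTargetLost event
    public void HandleTargetLost()
    {
        ownQuizCanvas.SetActive(false);
    }
}
```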

2. Attempt Quiz Panel
Once the image target is detected, this attempt quiz panel appears to tell the user which quiz they will be doing based on the image target detected, either the lung or the heart. The panel also has two buttons, Yes and No, for the user to click. If the user selects the Yes button, the first question is shown; if the user selects No, the quiz panel closes.

Instead of writing a complex script for the logic, I kept it clean and efficient by using Unity's built-in OnClick() events from the Button component in the Inspector. Here's how each button was set up:

  • Yes Button
    • When clicked, it deactivates the Attempt Quiz Panel (SetActive(false)), removing the prompt screen.
    • Simultaneously, it activates the first quiz question panel, typically named Question1Panel or controlled via the QuizManager script.
    • This creates a seamless transition from the entry screen to the first question without loading a new scene or breaking immersion.
Yes Button OnClick


  • No Button
    • This button acts as a back or exit option.
    • When clicked, it calls a method from my scene controller script that performs a scene change back to the Main Menu Scene.
    • I used a simple call like SceneManager.LoadScene("MainMenu") to achieve this.
    • This gives the user a way to opt out of the quiz experience if they scanned the wrong image or changed their mind.

No Button OnClick

Attempt Quiz Panel for Heart

Attempt Quiz Panel for Lungs
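The No button's scene change can be handled by a one-method controller like the sketch below. The class name QuizSceneController is hypothetical, but the SceneManager.LoadScene("MainMenu") call matches the one described above:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class QuizSceneController : MonoBehaviour
{
    // Assigned to the No button's OnClick() event
    public void BackToMainMenu()
    {
        SceneManager.LoadScene("MainMenu");
    }
}
```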



3. Setting Up the Quiz UI

Just like the Study Scene, I designed the quiz interface using Canvas in World Space so the quiz panel can be placed and repositioned like a 3D object on the ground.

Inside the Unity Hierarchy, I created a parent GameObject called QuizCanvas that holds all the quiz-related UI components. Here's the structure I built under that:

  • QuestionPanel (GameObject): This acts as the main container for all quiz UI elements. I made sure this object has a Box Collider, so it can register raycast hits from user touches or clicks in AR mode.
  • Quiz Background (Image): A simple clean background panel to make the quiz visually readable in any environment.
  • Question Text (TextMeshProUGUI): Displays the current question dynamically, such as "What organ pumps blood throughout the body?"
  • Answer Buttons (A, B, C) (Buttons + Text): Three selectable answer buttons, each linked to its respective answer logic.
  • Close Button (Button): Allows users to exit the quiz panel anytime and return to the main experience.
  • Next Question Button (Button): Hidden by default, but becomes visible once the user selects the correct answer.
  • Correct Panel (Panel): A feedback panel that shows a congratulatory message and confetti when the user selects the correct answer.
  • Wrong Panel (Panel): A feedback panel that gently informs the user their answer was incorrect.


I styled the panel with a clean design and added a Box Collider to the parent panel so that it would register raycasts (important for clicking in AR scenes).

Each quiz question provides three options — two wrong answers and one correct answer. I handled the logic using button OnClick() events in the Unity Inspector and some custom script functions.

If the User Clicks the Correct Answer:

  • Show the Correct Panel:
    • I set the CorrectPanel GameObject to active, making it appear immediately on screen to acknowledge the correct choice.
  • Confetti Animation:
    • To make the experience fun and rewarding, I added a confetti explosion. I used a pre-made Confetti prefab (dragged into the scene) and triggered it using ParticleSystem.Play() in the script. This added a nice visual celebration.
  • Hide the Wrong Panel:
    • Just in case it was previously active, I made sure the WrongPanel GameObject is set to inactive to avoid overlapping feedback.
  • Show the Next Question Button:
    • Once the answer is correct, the NextQuestionButton GameObject is set active, allowing the user to progress to the next quiz item.
  • Play Sound Effect:
    • For positive reinforcement, I played a cheerful "correct" sound using my custom SoundManager script
OnClick Correct Answer

Correct Answer

If the User Clicks a Wrong Answer:

  • Show the Wrong Panel:
    • I set the WrongPanel GameObject to active to inform the user of the incorrect selection.
  • Hide the Correct Panel:
    • Just like before, I ensured the CorrectPanel is turned off, so only one panel is visible at a time.
  • Play Incorrect Sound:
    • A short "buzz" or error sound plays
OnClick Wrong Answer

Wrong Answer
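The Inspector wiring above can be consolidated into one hedged script sketch (the field names, the AudioSource stand-ins for the SoundManager, and the confetti reference are assumptions for illustration):

```csharp
using UnityEngine;

public class AnswerFeedback : MonoBehaviour
{
    public GameObject correctPanel;
    public GameObject wrongPanel;
    public GameObject nextQuestionButton;
    public ParticleSystem confetti;  // pre-made confetti prefab instance
    public AudioSource correctSound; // stands in for the SoundManager call
    public AudioSource wrongSound;

    // Assigned to the correct answer button's OnClick()
    public void OnCorrectAnswer()
    {
        correctPanel.SetActive(true);
        wrongPanel.SetActive(false);
        nextQuestionButton.SetActive(true);
        confetti.Play();
        correctSound.Play();
    }

    // Assigned to each wrong answer button's OnClick()
    public void OnWrongAnswer()
    {
        wrongPanel.SetActive(true);
        correctPanel.SetActive(false);
        wrongSound.Play();
    }
}
```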


To make the quiz more meaningful and educational, I included 5 questions each for the Heart and Lung image targets. The logic for handling these questions follows the same method I used for the first one, but repeated and structured neatly using Unity’s GameObjects and button events.

Since each image target (Heart or Lung) should trigger a different set of questions, I first used Vuforia’s Image Target Detection to detect which model was found. Based on the target detected, I display the appropriate set of questions — either from the Heart Quiz or Lung Quiz group.

Each question has its own QuestionPanel with 3 answer buttons (A, B, C), just like I described earlier.

 Here’s how I implemented the structure:

1. Five Separate Panels per Organ:

  • For Heart, I created: HeartQuestion1Panel, HeartQuestion2Panel, ..., HeartQuestion5Panel
  • For Lung, I created: LungQuestion1Panel, LungQuestion2Panel, ..., LungQuestion5Panel

2. Each Panel Contains:

  • A Question Text component
  • Three Answer Buttons (A, B, C)
  • A Correct Panel and a Wrong Panel
  • A Next Question button (only appears after the correct answer is selected)

3. Answer Button Logic:

  • Every question panel has two wrong buttons and one correct button.
  • When a correct answer is selected:
    • Show the Correct Panel
    • Hide the Wrong Panel
    • Show Next Question button
    • Play the correct sound effect
    • Trigger confetti particle effect
  • When a wrong answer is selected:
    • Show the Wrong Panel
    • Hide the Correct Panel
    • Play the wrong sound effect

Next Question Flow:

  • Clicking the Next Question button will deactivate the current question panel and activate the next one.
  • I did this using SetActive(false) on the current panel and SetActive(true) on the next.
OnClick Next Question

Quiz Panels for Heart: 





Quiz Panel for Lung:






4. Completed Panel

After the user has completed all 5 quiz questions (whether it’s for the Heart or Lung), I wanted to give them a satisfying wrap-up experience — something that not only tells them how they did, but also encourages them to revise if they want to try again.

So once the user answers the final question, a Quiz Completed Panel will automatically appear. This panel is styled cleanly and placed in World Space Canvas, just like the other panels, so it feels like a 3D pop-up in the AR space.

What the Completion Panel Shows:

  • A Congratulations Message ("You’ve completed the quiz!")
  • A Score Display Text that shows how many questions the user got right out of 5
  • Two interactive buttons:
    • Revise Again
    • Back to Menu

The Revise Again button is for users who want to try the quiz again and revise the content.

OnClick:

  • The final question panel (e.g. HeartQuestion5Panel or LungQuestion5Panel) is deactivated using SetActive(false)
  • The very first question panel (HeartQuestion1Panel or LungQuestion1Panel) is re-activated using SetActive(true)
  • The score counter is also reset to 0 (handled in script using a simple score = 0; logic)
  • The Completed Panel itself is hidden (SetActive(false)) so the user can go back through the quiz naturally
Revise Again Button OnClick
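The Revise Again steps above can be sketched as a single method; the panel field names are illustrative, following the panel naming described earlier:

```csharp
using UnityEngine;

public class QuizReset : MonoBehaviour
{
    public GameObject finalQuestionPanel; // e.g. HeartQuestion5Panel
    public GameObject firstQuestionPanel; // e.g. HeartQuestion1Panel
    public GameObject completedPanel;

    private int score = 0; // running score, incremented elsewhere

    // Assigned to the Revise Again button's OnClick()
    public void ReviseAgain()
    {
        score = 0;                           // reset the score counter
        finalQuestionPanel.SetActive(false); // hide the last question
        firstQuestionPanel.SetActive(true);  // restart from question 1
        completedPanel.SetActive(false);     // hide the completion pop-up
    }
}
```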

The second button is Back to Menu, which exits the quiz scene entirely and takes the user back to the main menu.

To do this, I used Unity’s scene management system. On the OnClick event of this button, I attached the following script logic:

SceneManager.LoadScene("MainMenuScene");

Back to menu Button OnClick


This makes the navigation smooth and responsive. The user can choose to either stay and revise, or go back and choose another experience (like viewing a different organ or going to the Study Scene).

Completed pop up Panel

C. Notes Scene

The Notes Scene is a key part of my AR MVP prototype, allowing users to explore the heart and lungs in an interactive and immersive way. Through this scene, users can view a rotating 3D model, reveal a mind map with clickable anatomy parts, and watch a short educational video.

 1. Image Target Recognition – Switching Between Heart and Lung Content

This scene begins when a user scans either the Heart or Lung image target. I set up two image targets in Unity using Vuforia Image Target Behaviour:

  • One for the Heart model
  • One for the Lung model

Each target is linked to:

  • A unique 3D model (Heart or Lung)
  • A dedicated World Space UI canvas (NotesCanvas_Heart / NotesCanvas_Lung)

In Unity:

  • I parented the 3D model and the canvas to the corresponding image target so that they appear fixed in AR space when tracked.
  • I used SetActive(true/false) to toggle which canvas appears depending on which image is detected.
  • For example, if the Heart is scanned, the script activates NotesCanvas_Heart and deactivates NotesCanvas_Lung.

This ensures a context-aware experience: the correct content is always shown based on the image target, without needing to switch scenes.

Event Handler Script for Heart

Event Handler Script for Lung

2. Making the 3D Organ Model Rotatable in AR

To make the learning experience dynamic and visually engaging, I made each 3D model rotatable by the user:

  • I created a script called RotateObject.cs.
  • Inside Update(), while the screen is pressed, the script reads the drag movement with Input.GetAxis("Mouse X") and Input.GetAxis("Mouse Y") and applies it with transform.rotation = Quaternion.Euler(...), so the model rotates smoothly as the user drags.

This script is attached to the organ model (or its parent object), allowing the user to spin it even while interacting with other elements.

This keeps the scene visually alive and lets users see the organ from all angles, adding to spatial understanding.

RotateObject.cs Script:

using UnityEngine;

public class RotateObject : MonoBehaviour
{
    public float rotationSpeed = 10f; // Speed of rotation (adjustable)
    private float rotationX = 0f;
    private float rotationY = 0f;

    void Update()
    {
        // Check if the mouse button (or touch) is held down
        if (Input.GetMouseButton(0)) // Left mouse button (0)
        {
            // Get the mouse movement along X and Y axes
            float mouseX = Input.GetAxis("Mouse X") * rotationSpeed;
            float mouseY = Input.GetAxis("Mouse Y") * rotationSpeed;

            // Update rotation angles
            rotationX -= mouseY;
            rotationY += mouseX;

            // Apply the rotation to the object
            transform.rotation = Quaternion.Euler(rotationX, rotationY, 0f);
        }
    }
}

Rotate Script Attached to Image Target

3. Clicking the Organ to Reveal “Show Mind Map” Button

Next, I added tap interactivity. When users tap the 3D model in AR, a "Show Mind Map" button appears.

To achieve this:

  • I created a HeartClick.cs script and attached it to the Image Target GameObject.
  • I placed a Box Collider over the image target to detect raycast hits.
  • Inside the script, I used Unity’s OnMouseDown() method to detect when the user taps/clicks the image target in AR.
  • In the Unity Inspector, I exposed a public GameObject mindMapButton, which I manually assigned to the “Show Mind Map” button object in the scene.

HeartClick.cs Script:

using UnityEngine;

public class HeartClick : MonoBehaviour
{
    public GameObject infoPanel;

    void OnMouseDown()
    {
        if (infoPanel != null)
        {
            infoPanel.SetActive(true);
        }
    }
}


Heart Click Script Attached

Box Collider on Image Target


4. Showing the Mind Map Panel

When the “Show Mind Map” button is clicked, it opens the Track Info Panel, a larger canvas that acts as the interactive learning panel.

  • I created a new World Space Canvas called TrackInfoPanel, parented it to the image target, and set it inactive by default.
  • The button's OnClick() event calls TrackInfoPanel.SetActive(true).
Show Mind Map Button On Click

The Track Info Panel contains:

  • A larger 3D model of the organ
  • UI Buttons positioned around the organ (using anchored UI elements or world space placement)

Track Info Panel for Heart

Track Info Panel for Lung

5. Organ Part Buttons

Each anatomical button (e.g., "Left Atrium", "Right Ventricle") is a Unity Button object.

1. Each button has its own associated info panel, created as a UI Panel GameObject with a Text Description.

2. Each button’s OnClick() event is wired to:

public GameObject descriptionPanel;

public void ShowDescription()
{
    descriptionPanel.SetActive(true);
}

Organ Parts Button OnClick


3. Inside each description panel, there is a Close Button which calls:

public void HideDescription()
{
    descriptionPanel.SetActive(false);
}

Close Button On Click

This design allows the user to tap on individual parts, read their function, and close them to return to the overall view.
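Both fragments above can live together in one small component attached to each part button, with descriptionPanel assigned per button in the Inspector (the class name is illustrative):

```csharp
using UnityEngine;

public class PartDescription : MonoBehaviour
{
    public GameObject descriptionPanel; // assigned per button in the Inspector

    // Wired to the part button's OnClick()
    public void ShowDescription()
    {
        descriptionPanel.SetActive(true);
    }

    // Wired to the description panel's Close button
    public void HideDescription()
    {
        descriptionPanel.SetActive(false);
    }
}
```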


6. Play Video Button

To further support visual learners, I added a Play Video feature within the Track Info Panel.

I created a UI Panel called VideoPanel, containing:

  • A Video Player component (linked to a .mp4 file in Unity)
  • A Restart Button
  • A Close Button

I also created a script called VideoPanelController.cs.

This script manages all interactions with the video panel:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoPanelController : MonoBehaviour
{
    public GameObject videoPanel;
    public Button playVideoBtn;
    public Button restartBtn;
    public Button closeBtn;
    public VideoPlayer videoPlayer;

    void Start()
    {
        // Assign button listeners
        playVideoBtn.onClick.AddListener(OpenAndPlayVideo);
        restartBtn.onClick.AddListener(RestartVideo);
        closeBtn.onClick.AddListener(CloseVideoPanel);

        videoPanel.SetActive(false); // hide at start
    }

    void OpenAndPlayVideo()
    {
        videoPanel.SetActive(true);
        videoPlayer.Stop();
        videoPlayer.Play();
    }

    void RestartVideo()
    {
        videoPlayer.Stop();
        videoPlayer.Play();
    }

    void CloseVideoPanel()
    {
        videoPlayer.Stop();
        videoPanel.SetActive(false);
    }
}

In Inspector:

  • I dragged the VideoPlayer and buttons into the script fields.
  • The “Play Video” button on the Track Info Panel triggers OpenAndPlayVideo() to activate the panel and start playback.

Video Panel Controller Script

Video Panel UI

D. Applying Materials to 3D Models to Add Color

When I downloaded my 3D organ models (Heart and Lung) from Sketchfab and imported them into Unity for use in my AR prototype, I noticed that the models appeared completely grey and lacked any textures or color. This is a common issue when importing external models, as Unity doesn’t always automatically assign the correct materials or shaders from online sources.

To solve this and make the models appear more realistic and visually appealing, I needed to manually create and assign materials in Unity. 

1. Inspect the Imported Model

Once the model was imported:

  • I expanded the model’s folder in the Project window and checked the Mesh and Material files.
  • Often, the model contains several child meshes representing different parts (e.g., ventricles, arteries).
  • However, the material file imported from Sketchfab usually doesn’t function properly in Unity (especially if it used PBR or advanced shaders).

2. Create Custom Materials in Unity

To give my model color, I created new materials in Unity manually:

  • In the Assets folder, I right-clicked > Create > Material.
  • In the Inspector, I selected a Base Color using the Albedo color picker or added a texture if I had one.
  • I repeated this process for each distinct part of the model that required a unique color or shading.
Material Colors Created

Materials Inspector

This allows for full control over how each section of the organ looks in AR, helping to highlight anatomical structures visually for educational purposes.

3. Assign Materials to Model Meshes

After creating the materials:

  • I selected the imported 3D model in the Hierarchy.
  • I expanded the model’s children in the Inspector to access individual mesh renderers.
  • For each part (e.g., left atrium, right ventricle, trachea), I dragged and dropped the appropriate material onto the Mesh Renderer’s Material slot.
Materials Assigned in Inspector

Unity immediately updated the model in the Scene View, now displaying the correct color or texture.

This process had to be repeated for each sub-mesh of the model, especially since Sketchfab models are often composed of several combined objects.

Final Heart Model with Materials

Final Lung Color with Materials

E. Redesigned Overall UI

After receiving feedback from Mr. Razif, I redesigned the overall UI of the application. One of the key improvements is the addition of onboarding pages using a PageView container.

This onboarding sequence consists of three sliding landing pages that introduce and explain the core purpose of the app, Focus Space. Each page is designed to give users a clear and engaging overview of what the app offers, enhancing the first-time user experience.

Get Started Pages
Onboarding Pages

Menu Scene

I used an OnboardingManager script to create seamless transitions between the onboarding pages, with the script attached below:

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class OnboardingManager : MonoBehaviour
{
    [Header("Page Setup")]
    public RectTransform[] pages;
    public float transitionDuration = 0.5f;
    public float slideOffset = 1000f;

    private int currentPage = 0;
    private bool isTransitioning = false;

    void Start()
    {
        // Hide all pages first
        for (int i = 0; i < pages.Length; i++)
        {
            pages[i].gameObject.SetActive(false);
            pages[i].anchoredPosition = Vector2.zero;

            CanvasGroup cg = pages[i].GetComponent<CanvasGroup>();
            if (cg != null) cg.alpha = 0;
        }

        // Show first page
        ShowPage(0);
    }

    public void NextPage()
    {
        if (currentPage < pages.Length - 1 && !isTransitioning)
        {
            StartCoroutine(TransitionToPage(currentPage + 1));
        }
    }

    public void PreviousPage()
    {
        if (currentPage > 0 && !isTransitioning)
        {
            StartCoroutine(TransitionToPage(currentPage - 1));
        }
    }

    public void LoadMenuScene()
    {
        SceneManager.LoadScene("MenuScene"); // Make sure this matches your actual scene name
    }

    private void ShowPage(int index)
    {
        RectTransform page = pages[index];
        CanvasGroup cg = page.GetComponent<CanvasGroup>();

        page.anchoredPosition = Vector2.zero;
        page.gameObject.SetActive(true);
        cg.alpha = 1;
        currentPage = index;
    }

    private IEnumerator TransitionToPage(int nextPage)
    {
        isTransitioning = true;

        RectTransform current = pages[currentPage];
        RectTransform next = pages[nextPage];

        CanvasGroup currentCG = current.GetComponent<CanvasGroup>();
        CanvasGroup nextCG = next.GetComponent<CanvasGroup>();

        next.gameObject.SetActive(true);
        next.anchoredPosition = new Vector2((nextPage > currentPage ? slideOffset : -slideOffset), 0);
        nextCG.alpha = 0;

        float t = 0f;
        while (t < transitionDuration)
        {
            float progress = t / transitionDuration;

            // Slide both pages
            current.anchoredPosition = Vector2.Lerp(Vector2.zero, new Vector2(nextPage > currentPage ? -slideOffset : slideOffset, 0), progress);
            next.anchoredPosition = Vector2.Lerp(new Vector2(nextPage > currentPage ? slideOffset : -slideOffset, 0), Vector2.zero, progress);

            // Fade both pages
            currentCG.alpha = 1 - progress;
            nextCG.alpha = progress;

            t += Time.deltaTime;
            yield return null;
        }

        // Final positions
        current.anchoredPosition = new Vector2(nextPage > currentPage ? -slideOffset : slideOffset, 0);
        currentCG.alpha = 0;
        current.gameObject.SetActive(false);

        next.anchoredPosition = Vector2.zero;
        nextCG.alpha = 1;

        currentPage = nextPage;
        isTransitioning = false;
    }
}


Sliding Page Container

Organising the Onboarding Pages Accordingly

F. Study Scene Chill Music

I found three chill music clips which users can play consecutively and then stop, using the script below attached to the music button:

Each time the player clicks the music icon, it cycles through the three tracks (music1, then music2, then music3), and a fourth click stops the music.

using UnityEngine;
using UnityEngine.UI;

public class MusicToggleButton : MonoBehaviour
{
    public AudioSource music1;
    public AudioSource music2;
    public AudioSource music3;

    private int pressCount = 0;

    void Start()
    {
        GetComponent<Button>().onClick.AddListener(CycleMusic);
    }

    void CycleMusic()
    {
        StopAllMusic();

        pressCount = (pressCount + 1) % 4;

        switch (pressCount)
        {
            case 1:
                music1.Play();
                break;
            case 2:
                music2.Play();
                break;
            case 3:
                music3.Play();
                break;
            case 0:
                // stop: no music plays
                break;
        }
    }

    void StopAllMusic()
    {
        if (music1.isPlaying) music1.Stop();
        if (music2.isPlaying) music2.Stop();
        if (music3.isPlaying) music3.Stop();
    }
}

script for music button on study scene

music button placement

chill and jazz music for study scene

Final Submission:

Youtube Presentation Walkthrough Link: https://youtu.be/AzDTBEkQCmI


Walkthrough of AR App: 



Phoon Yuk Guan Blog's Link: 

Drive Link: 

⋆ ˚。⋆୨୧˚ Feedback ˚୨୧⋆。˚ ⋆

Week 13
Mr. Razif highlighted that the UI design could be improved to enhance clarity, visual appeal, and user engagement. He suggested that the current layout and interface elements lacked cohesion and could benefit from a more refined and consistent visual style.

Week 14
Overall, he said our app's functionality is fine and it is good to go.


⋆ ˚。⋆୨୧˚ Reflection ˚୨୧⋆。˚ ⋆
Experience
Throughout Weeks 12 to 14, I focused on finalizing and refining my AR-based Focus Space study app, building on the foundational work from previous tasks. This phase involved transforming my MVP into a complete, immersive user experience, integrating UI logic, functional timers, quizzes, and interactive 3D models across multiple scenes (Study, Notes, and Quiz). It was also the most challenging period, when I truly saw how the AR learning ecosystem I envisioned could function cohesively in a real environment. From scanning image targets to launching timers or triggering 3D quiz interfaces, every interaction had to feel natural, meaningful, and intuitive for the user.

This phase pushed me and my teammates to go beyond technical assembly. I had to problem-solve, design with clarity, and polish the details that create user delight. Creating seamless transitions between UI states, handling conditional visibility based on AR tracking, and implementing educational features like confetti feedback and mind maps deepened my understanding of experiential design.

Observation:

During the final development of my Focus Space Study App, I observed that user engagement significantly improved when interactive feedback such as confetti, sounds, and visual cues were introduced. These small yet effective elements helped make the experience more rewarding, especially in the quiz section. Additionally, organizing UI elements within modular world space canvases made the interface more manageable and responsive in AR. By structuring Study, Quiz, and Notes panels separately, I could easily control their visibility depending on the user’s interaction or tracking events without causing layout conflicts.

I also noticed that the stability of AR image target detection played a crucial role in the overall usability of the app. When markers were not recognized properly, key features like the timer or quiz panels would not appear, disrupting the experience. To address this, I improved tracking reliability by adjusting lighting and using high-contrast image markers. Lastly, simplifying interaction logic using Unity’s Inspector-based OnClick system helped me manage button functionality more clearly, reducing the chance of bugs and improving efficiency during testing.

Findings:

Through this process, I found that AR experiences are most effective when they combine spatial immersion with clear user control. Simply placing 3D content in the real world wasn't enough; users needed functional elements like timers, buttons, and feedback to feel truly engaged. This highlighted the importance of blending interaction design with visual storytelling to create a more meaningful and educational experience. Structuring quiz flows with progress indicators and layered UI also ensured users could stay oriented and motivated throughout their learning journey.

I also realized how crucial interdisciplinary thinking is in experiential design. Bringing together coding, interface layout, content accuracy, and user empathy allowed the app to function smoothly across all touchpoints. Designing from the user's perspective led to choices like adjustable timers, retry buttons, and intuitive panel transitions, all of which helped reduce friction and support continuous engagement. Ultimately, I learned that a successful AR learning app must not only be technically functional but also emotionally and cognitively supportive.
