Experiential Design / Task 4: Final Project & E-Portfolio
14/7/2025 - 28/8/2025 / Week 12 - Week 14
Angel Tan Xin Kei / 0356117
Experiential Design / Bachelor of Design (Hons) in Creative Media
- Instruction
- Task 4: Final Project & E-Portfolio
- Feedback
- Reflection
- Project file and Folders
- Application installation files (APK for Android, iOS build folder for iOS/iPhone)
- Online posts in your E-portfolio as your reflective studies
- Video walkthrough (Presentation)
Figma Link: https://www.figma.com/design/OyLme6jzrYzZSu9zWZrnPj/Focus-Space?node-id=732-294&t=W95lTgomZTaGSIoT-1
A. Study Scene
For this scene, the main functions to add were a Study Panel and a Break Panel, each containing a countdown timer where users can set how long they want to study or take a break, then start or pause the timer. Furthermore, the Study Now and Break Now buttons should only appear once the study material has been scanned and detected.
1. Creating Study Panel & Break Panel
I created two main panels: StudyPanel and BreakPanel.
Each one is designed as a separate panel layout under a shared UI Canvas in World Space, so I could freely reposition and scale them in the 3D scene.
For both StudyPanel and BreakPanel, I created the same internal layout using Unity’s UI system:
- A Title Label (TextMeshProUGUI) that says “Study Timer” or “Break Timer”
- A Countdown Timer Display in the center
- A row of control buttons:
- Start
- Stop
- Restart
- + (Add 1 Minute)
- − (Subtract 1 Minute)
| Study Panel and Break Panel Internal Layout on the Hierarchy |
| Study Timer |
| Break Timer |
To prepare the panels for interaction and make sure the buttons inside each panel are clickable, I added:
- A Box Collider on each panel to detect pointer or raycast hits in 3D space
- A Canvas Group to control visibility and interaction toggles later if needed
This setup ensures that each panel is treated like a proper 3D object with UI logic and can respond to input events like clicks or taps.
2. Countdown Timer Logic
I created a custom script, CountdownTimerUI.cs, to handle the timer logic, UI interaction, and panel switching, and attached it by dragging it onto the ScanCanvas GameObject.
After attaching the script to ScanCanvas, I linked it to the UI by assigning all the references in the CountdownTimerUI.cs component in the Inspector.
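As a reference for how this countdown logic fits together, here is a minimal sketch of a CountdownTimerUI-style script. The field and method names are illustrative assumptions, not the project's exact code; in practice each public method would be wired to the Start/Stop/Restart/+/− buttons' OnClick events.

```csharp
using TMPro;
using UnityEngine;

// Minimal countdown sketch: Start/Stop/Restart plus +1/-1 minute buttons.
// Names are illustrative; the actual CountdownTimerUI.cs may differ.
public class CountdownTimerUI : MonoBehaviour
{
    public TextMeshProUGUI timerLabel;      // central countdown display
    public float remainingSeconds = 25f * 60f;
    private bool running;

    public void StartTimer()   { running = true; }
    public void StopTimer()    { running = false; }
    public void RestartTimer() { remainingSeconds = 25f * 60f; running = false; UpdateLabel(); }

    public void AddMinute()      { remainingSeconds += 60f; UpdateLabel(); }
    public void SubtractMinute() { remainingSeconds = Mathf.Max(0f, remainingSeconds - 60f); UpdateLabel(); }

    void Update()
    {
        if (!running) return;
        remainingSeconds = Mathf.Max(0f, remainingSeconds - Time.deltaTime);
        UpdateLabel();
        if (remainingSeconds <= 0f) running = false; // timer finished
    }

    void UpdateLabel()
    {
        int m = Mathf.FloorToInt(remainingSeconds / 60f);
        int s = Mathf.FloorToInt(remainingSeconds % 60f);
        timerLabel.text = $"{m:00}:{s:00}";
    }
}
```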
3. Panel Switch Button
To toggle between panels, I added two buttons inside ScanCanvas, but outside of the panels.
Created two buttons:
- StudyTabButton → Text: “Study Now”
- BreakTabButton → Text: “Break Now”
These buttons control which panel is visible and which timer mode is active. I used the simplest approach in Unity: toggling panel visibility with OnClick events that call GameObject.SetActive. Clicking the Study Now button activates the Study Panel GameObject and deactivates the Break Panel; clicking the Break Now button activates the Break Panel and deactivates the Study Panel.
| Study Button On Click |
| Break Button On Click |
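For clarity, the Inspector OnClick setup described above is equivalent to this small script (illustrative only; in the project the SetActive calls were wired directly in the Inspector without a script):

```csharp
using UnityEngine;

// Script equivalent of the Inspector-based panel toggle described above.
public class PanelSwitcher : MonoBehaviour
{
    public GameObject studyPanel;
    public GameObject breakPanel;

    // Wired to the Study Now button's OnClick()
    public void ShowStudyPanel()
    {
        studyPanel.SetActive(true);
        breakPanel.SetActive(false);
    }

    // Wired to the Break Now button's OnClick()
    public void ShowBreakPanel()
    {
        breakPanel.SetActive(true);
        studyPanel.SetActive(false);
    }
}
```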
4. Showing Study Now and Break Now Button Only When Target is Found
To show the Study Now and Break Now buttons only when the target image is found, I modified the Image Target's tracking logic using the DefaultObserverEventHandler:
- When target is found → activate ScanCanvas
- Alternatively, I also used Unity Events to trigger buttons like StudyNow and BreakNow via:
- GameObject.SetActive → Enable UI on detection
This ensures the Study and Break modes only appear when the camera detects the physical marker.
B. Quiz Scene
After completing the Study Scene and Break Timer system, I moved on to building the Quiz Scene, which required a fully working interactive quiz system that activates only when an image target is detected. My main goals were:
- Create a multiple-choice quiz interface
- Make the quiz functional with questions and answers loaded dynamically
- Only show the quiz when a physical image target is scanned using Vuforia
- Track correct/incorrect answers and show results
To ensure that the quiz panel only appears when a physical quiz marker is scanned, I used Vuforia’s Image Target tracking.
I added a DefaultObserverEventHandler script to my Image Target. Then, inside the Unity Editor, I created a QuizTarget.cs script that listens for OnTargetFound() and OnTargetLost() events.
In the script, I enabled or disabled the QuizCanvas based on tracking state:
| Image Target Event Handler for Heart |
Basically, since I have two image targets (one for the heart quiz and one for the lung quiz), detecting the heart image target hides the lung quiz panel and shows the heart quiz panel, and vice versa.
| Image Target Event Handler for Lung |
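A sketch of the per-target toggle described above might look like this, with the methods hooked to the DefaultObserverEventHandler's On Target Found / On Target Lost events in the Inspector. The class and field names are illustrative assumptions:

```csharp
using UnityEngine;

// When this target (e.g. Heart) is found, show its quiz panel and hide
// the other organ's panel; hide our panel again when tracking is lost.
public class QuizTarget : MonoBehaviour
{
    public GameObject ownQuizPanel;    // e.g. heart quiz panel
    public GameObject otherQuizPanel;  // e.g. lung quiz panel

    // Hooked to On Target Found in the Inspector
    public void OnTargetFound()
    {
        otherQuizPanel.SetActive(false);
        ownQuizPanel.SetActive(true);
    }

    // Hooked to On Target Lost in the Inspector
    public void OnTargetLost()
    {
        ownQuizPanel.SetActive(false);
    }
}
```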
Instead of writing a complex script for the logic, I kept it clean and efficient by using Unity's built-in OnClick() events from the Button component in the Inspector. Here's how each button was set up:
- Yes Button
- When clicked, it deactivates the Attempt Quiz Panel (SetActive(false)), removing the prompt screen.
- Simultaneously, it activates the first quiz question panel, typically named Question1Panel or controlled via the QuizManager script.
- This creates a seamless transition from the entry screen to the first question without loading a new scene or breaking immersion.
| Yes Button OnClick |
- No Button
- This button acts as a back or exit option.
- When clicked, it calls a method from my scene controller script that performs a scene change back to the Main Menu Scene.
- I used a simple call like SceneManager.LoadScene("MainMenu") to achieve this.
- This gives the user a way to opt out of the quiz experience if they scanned the wrong image or changed their mind.
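The No button's scene change can be sketched as a one-method controller. The class and method names here are illustrative; the scene name "MainMenu" follows the call quoted above:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Wired to the No button's OnClick(): exits the quiz and returns
// to the main menu scene.
public class QuizSceneController : MonoBehaviour
{
    public void BackToMainMenu()
    {
        SceneManager.LoadScene("MainMenu");
    }
}
```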
Just like the Study Scene, I designed the quiz interface using Canvas in World Space so the quiz panel can be placed and repositioned like a 3D object on the ground.
Inside the Unity Hierarchy, I created a parent GameObject called QuizCanvas that holds all the quiz-related UI components. Here's the structure I built under that:
- QuestionPanel (GameObject): This acts as the main container for all quiz UI elements. I made sure this object has a Box Collider, so it can register raycast hits from user touches or clicks in AR mode.
- Quiz Background (Image): A simple clean background panel to make the quiz visually readable in any environment.
- Question Text (TextMeshProUGUI): Displays the current question dynamically, such as "What organ pumps blood throughout the body?"
- Answer Buttons (A, B, C) (Buttons + Text): Three selectable answer buttons, each linked to its respective answer logic.
- Close Button (Button): Allows users to exit the quiz panel anytime and return to the main experience.
- Next Question Button (Button): Hidden by default, but becomes visible once the user selects the correct answer.
- Correct Panel (Panel): A feedback panel that shows a congratulatory message and confetti when the user selects the correct answer.
- Wrong Panel (Panel): A feedback panel that gently informs the user their answer was incorrect.
I styled the panel with a clean design and added a Box Collider to the parent panel so that it would register raycasts (important for clicking in AR scenes).
Each quiz question provides three options — two wrong answers and one correct answer. I handled the logic using button OnClick() events in the Unity Inspector and some custom script functions.
If the User Clicks the Correct Answer:
- Show the Correct Panel:
- I set the CorrectPanel GameObject to active, making it appear immediately on screen to acknowledge the correct choice.
- Confetti Animation:
- To make the experience fun and rewarding, I added a confetti explosion. I used a pre-made Confetti prefab (dragged into the scene) and triggered it using ParticleSystem.Play() in the script. This added a nice visual celebration.
- Hide the Wrong Panel:
- Just in case it was previously active, I made sure the WrongPanel GameObject is set to inactive to avoid overlapping feedback.
- Show the Next Question Button:
- Once the answer is correct, the NextQuestionButton GameObject is set active, allowing the user to progress to the next quiz item.
- Play Sound Effect:
- For positive reinforcement, I played a cheerful "correct" sound using my custom SoundManager script
| OnClick Correct Answer |
| Correct Answer |
If the User Clicks a Wrong Answer:
- Show the Wrong Panel:
- I set the WrongPanel GameObject to active to inform the user of the incorrect selection.
- Hide the Correct Panel:
- Just like before, I ensured the CorrectPanel is turned off, so only one panel is visible at a time.
- Play Incorrect Sound:
- A short "buzz" or error sound plays
| OnClick Wrong Answer |
| Wrong Answer |
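The correct/wrong answer handling listed above can be combined into one sketch. The field names and the audio setup are illustrative stand-ins for the project's actual components (e.g. the custom SoundManager):

```csharp
using UnityEngine;

// Combined sketch of the answer feedback described above: panels,
// confetti, sound, and the Next Question button.
public class AnswerHandler : MonoBehaviour
{
    public GameObject correctPanel;
    public GameObject wrongPanel;
    public GameObject nextQuestionButton;
    public ParticleSystem confetti;
    public AudioSource correctSound;   // stand-in for the SoundManager call
    public AudioSource wrongSound;

    // Wired to the correct answer button's OnClick()
    public void OnCorrectAnswer()
    {
        wrongPanel.SetActive(false);       // avoid overlapping feedback
        correctPanel.SetActive(true);
        nextQuestionButton.SetActive(true);
        confetti.Play();
        correctSound.Play();
    }

    // Wired to the two wrong answer buttons' OnClick()
    public void OnWrongAnswer()
    {
        correctPanel.SetActive(false);
        wrongPanel.SetActive(true);
        wrongSound.Play();
    }
}
```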
To make the quiz more meaningful and educational, I included 5 questions each for the Heart and Lung image targets. The logic for handling these questions follows the same method I used for the first one, but repeated and structured neatly using Unity’s GameObjects and button events.
Since each image target (Heart or Lung) should trigger a different set of questions, I first used Vuforia’s Image Target Detection to detect which model was found. Based on the target detected, I display the appropriate set of questions — either from the Heart Quiz or Lung Quiz group.
Each question has its own QuestionPanel with 3 answer buttons (A, B, C), just like I described earlier.
Here’s how I implemented the structure:
1. Five Separate Panels per Organ:
- For Heart, I created: HeartQuestion1Panel, HeartQuestion2Panel, ..., HeartQuestion5Panel
- For Lung, I created: LungQuestion1Panel, LungQuestion2Panel, ..., LungQuestion5Panel
2. Each Panel Contains:
- A Question Text component
- Three Answer Buttons (A, B, C)
- A Correct Panel and a Wrong Panel
- A Next Question button (only appears after the correct answer is selected)
3. Answer Button Logic:
- Every question panel has two wrong buttons and one correct button.
- When a correct answer is selected:
- Show the Correct Panel
- Hide the Wrong Panel
- Show Next Question button
- Play the correct sound effect
- Trigger confetti particle effect
- When a wrong answer is selected:
- Show the Wrong Panel
- Hide the Correct Panel
- Play the wrong sound effect
Next Question Flow:
- Clicking the Next Question button will deactivate the current question panel and activate the next one.
- I did this using SetActive(false) on the current panel and SetActive(true) on the next.
| OnClick Next Question |
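The Next Question flow above reduces to two SetActive calls; a minimal sketch, assigned per button, could look like this (names illustrative):

```csharp
using UnityEngine;

// Hide the current question panel and reveal the next one.
public class NextQuestionButton : MonoBehaviour
{
    public GameObject currentQuestionPanel;
    public GameObject nextQuestionPanel;

    // Wired to the Next Question button's OnClick()
    public void GoToNextQuestion()
    {
        currentQuestionPanel.SetActive(false);
        nextQuestionPanel.SetActive(true);
    }
}
```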
After the user has completed all 5 quiz questions (whether it’s for the Heart or Lung), I wanted to give them a satisfying wrap-up experience — something that not only tells them how they did, but also encourages them to revise if they want to try again.
So once the user answers the final question, a Quiz Completed Panel will automatically appear. This panel is styled cleanly and placed in World Space Canvas, just like the other panels, so it feels like a 3D pop-up in the AR space.
What the Completion Panel Shows:
- A Congratulations Message ("You’ve completed the quiz!")
- A Score Display Text that shows how many questions the user got right out of 5
- Two interactive buttons:
- Revise Again
- Back to Menu
The Revise Again button is for users who want to try the quiz again and revise the content.
OnClick:
- The final question panel (e.g. HeartQuestion5Panel or LungQuestion5Panel) is deactivated using SetActive(false)
- The very first question panel (HeartQuestion1Panel or LungQuestion1Panel) is re-activated using SetActive(true)
- The score counter is also reset to 0 (handled in script using a simple score = 0; logic)
- The Completed Panel itself is hidden (SetActive(false)) so the user can go back through the quiz naturally
| Revise Again Button OnClick |
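The Revise Again steps above can be sketched as one reset method. The panel names are illustrative placeholders for the actual HeartQuestion/LungQuestion panels:

```csharp
using UnityEngine;

// Reset sketch: hide the completion panel and final question, reset
// the score, and restart from question 1.
public class QuizReset : MonoBehaviour
{
    public GameObject completedPanel;
    public GameObject lastQuestionPanel;   // e.g. HeartQuestion5Panel
    public GameObject firstQuestionPanel;  // e.g. HeartQuestion1Panel
    public int score;

    // Wired to the Revise Again button's OnClick()
    public void ReviseAgain()
    {
        completedPanel.SetActive(false);
        lastQuestionPanel.SetActive(false);
        score = 0;                         // simple score = 0; reset
        firstQuestionPanel.SetActive(true);
    }
}
```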
The second button is Back to Menu, which exits the quiz scene entirely and takes the user back to the main menu.
To do this, I used Unity’s scene management system. On the OnClick event of this button, I attached the following script logic:
SceneManager.LoadScene("MainMenuScene");
| Back to Menu Button OnClick |
This makes the navigation smooth and responsive. The user can choose to either stay and revise, or go back and choose another experience (like viewing a different organ or going to the Study Scene).
| Completed Pop-up Panel |
C. Notes Scene
The Notes Scene is a key part of my AR MVP prototype, allowing users to explore the heart and lungs in an interactive and immersive way. Through this scene, users can view a rotating 3D model, reveal a mind map with clickable anatomy parts, and watch a short educational video.
1. Image Target Recognition – Switching Between Heart and Lung Content
This scene begins when a user scans either the Heart or Lung image target. I set up two image targets in Unity using Vuforia Image Target Behaviour:
- One for the Heart model
- One for the Lung model
Each target is linked to:
- A unique 3D model (Heart or Lung)
- A dedicated World Space UI canvas (NotesCanvas_Heart / NotesCanvas_Lung)
In Unity:
- I parented the 3D model and the canvas to the corresponding image target so that they appear fixed in AR space when tracked.
- I used SetActive(true/false) to toggle which canvas appears depending on which image is detected.
- For example, if the Heart is scanned, the script activates NotesCanvas_Heart and deactivates NotesCanvas_Lung.
This ensures a context-aware experience: the correct content is always shown based on the image target, without needing to switch scenes.
| Event Handler Script for Heart |
| Event Handler Script for Lung |
2. Making the 3D Organ Model Rotatable in AR
To make the learning experience dynamic and visually engaging, I applied a continuous rotation to each 3D model:
- I created a script called RotateObject.cs.
- Inside Update(), I used transform.Rotate(Vector3.up * speed * Time.deltaTime); to make the model rotate smoothly around the Y-axis.
This script is attached to the organ model (or its parent object), allowing it to spin automatically even while the user interacts with other elements.
This keeps the scene visually alive and lets users see the organ from all angles, adding to spatial understanding.
RotateObject.cs Script:
| Rotate Script Attached to Image Target |
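A minimal version of the rotation script described above is just the Update line wrapped in a MonoBehaviour; the default speed value here is an illustrative assumption:

```csharp
using UnityEngine;

// Continuously rotates the attached model around the Y-axis.
public class RotateObject : MonoBehaviour
{
    public float speed = 30f; // degrees per second (illustrative default)

    void Update()
    {
        transform.Rotate(Vector3.up * speed * Time.deltaTime);
    }
}
```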
3. Clicking the Organ to Reveal “Show Mind Map” Button
Next, I added tap interactivity. When users tap the 3D model in AR, a "Show Mind Map" button appears.
To achieve this:
- I created a HeartClick.cs script and attached it to the Image Target GameObject.
- I placed a Box Collider over the image target to detect raycast hits.
- Inside the script, I used Unity’s OnMouseDown() method to detect when the user taps/clicks the image target in AR.
- In the Unity Inspector, I exposed a public GameObject mindMapButton, which I manually assigned to the “Show Mind Map” button object in the scene.
| Heart Click Script Attached |
| Box Collider on Image Target |
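Based on the description above, a sketch of HeartClick.cs could look like this: OnMouseDown fires when the Box Collider on the image target is tapped, revealing the button assigned in the Inspector.

```csharp
using UnityEngine;

// Requires a Box Collider on the same GameObject so taps/clicks
// register as raycast hits.
public class HeartClick : MonoBehaviour
{
    public GameObject mindMapButton; // assigned in the Inspector

    // Called by Unity when the collider is clicked/tapped
    void OnMouseDown()
    {
        mindMapButton.SetActive(true);
    }
}
```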
4. Showing the Mind Map Panel
When the “Show Mind Map” button is clicked, it opens the Track Info Panel, a larger canvas that acts as the interactive learning panel.
- I created a new World Space Canvas called TrackInfoPanel, parented it to the image target, and set it inactive by default.
- The button's OnClick() event calls TrackInfoPanel.SetActive(true).
| Show Mind Map Button On Click |
The Track Info Panel contains:
- A larger 3D model of the organ
- UI Buttons positioned around the organ (using anchored UI elements or world space placement)
| Track Info Panel for Heart |
| Track Info Panel for Lung |
5. Organ Part Buttons
Each anatomical button (e.g., "Left Atrium", "Right Ventricle") is a Unity Button object.
1. Each button has its own associated info panel, created as a UI Panel GameObject with a Text Description.
2. Each button’s OnClick() event is wired to:
public GameObject descriptionPanel;

public void ShowDescription() {
    descriptionPanel.SetActive(true);
}
| Organ Parts Button OnClick |
3. Inside each description panel, there is a Close Button which calls:
public void HideDescription() {
    descriptionPanel.SetActive(false);
}
| Close Button On Click |
This design allows the user to tap on individual parts, read their function, and close them to return to the overall view.
6. Play Video Button
To further support visual learners, I added a Play Video feature within the Track Info Panel.
I created a UI Panel called VideoPanel, containing:
- A Video Player component (linked to a .mp4 file in Unity)
- A Restart Button
- A Close Button
I also created a script called VideoPanelControl.cs, which manages all interactions with the video panel.
In Inspector:
- I dragged the VideoPlayer and buttons into the script fields.
- The “Play Video” button on the Track Info Panel triggers PlayVideo() to activate the panel.
| Video Panel Controller Script |
| Video Panel UI |
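A sketch of the VideoPanelControl.cs behaviour described above could look like the following. The method names mirror the interactions listed (play, restart, close); the exact names and fields in the project may differ:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Manages the video panel: open and play, restart from the beginning,
// and stop/close.
public class VideoPanelControl : MonoBehaviour
{
    public GameObject videoPanel;
    public VideoPlayer videoPlayer; // linked to the .mp4 clip

    // Wired to the "Play Video" button on the Track Info Panel
    public void PlayVideo()
    {
        videoPanel.SetActive(true);
        videoPlayer.Play();
    }

    // Wired to the Restart button
    public void RestartVideo()
    {
        videoPlayer.time = 0;
        videoPlayer.Play();
    }

    // Wired to the Close button
    public void CloseVideo()
    {
        videoPlayer.Stop();
        videoPanel.SetActive(false);
    }
}
```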
D. Applying Materials to 3D Models to Add Color
When I downloaded my 3D organ models (Heart and Lung) from Sketchfab and imported them into Unity for use in my AR prototype, I noticed that the models appeared completely grey and lacked any textures or color. This is a common issue when importing external models, as Unity doesn’t always automatically assign the correct materials or shaders from online sources.
To solve this and make the models appear more realistic and visually appealing, I needed to manually create and assign materials in Unity.
1. Inspect the Imported Model
Once the model was imported:
- I expanded the model’s folder in the Project window and checked the Mesh and Material files.
- Often, the model contains several child meshes representing different parts (e.g., ventricles, arteries).
- However, the material file imported from Sketchfab usually doesn’t function properly in Unity (especially if it used PBR or advanced shaders).
2. Create New Materials
To give my model color, I created new materials in Unity manually:
- In the Assets folder, I right-clicked > Create > Material.
- In the Inspector, I selected a Base Color using the Albedo color picker or added a texture if I had one.
- I repeated this process for each distinct part of the model that required a unique color or shading.
| Material Colors Created |
| Materials Inspector |
This allows for full control over how each section of the organ looks in AR, helping to highlight anatomical structures visually for educational purposes.
3. Assign Materials to Model Meshes
After creating the materials:
- I selected the imported 3D model in the Hierarchy.
- I expanded the model’s children in the Inspector to access individual mesh renderers.
- For each part (e.g., left atrium, right ventricle, trachea), I dragged and dropped the appropriate material onto the Mesh Renderer’s Material slot.
| Materials Assigned in Inspector |
Unity immediately updated the model in the Scene View, now displaying the correct color or texture.
This process had to be repeated for each sub-mesh of the model, especially since Sketchfab models are often composed of several combined objects.
| Final Heart Model with Materials |
| Final Lung Model with Materials |
E. Redesigned Overall UI
After receiving feedback from Mr. Razif, I redesigned the overall UI of the application. One of the key improvements is the addition of onboarding pages using a PageView container.
| Get Started Pages |
I also noticed that the stability of AR image target detection played a crucial role in the overall usability of the app. When markers were not recognized properly, key features like the timer or quiz panels would not appear, disrupting the experience. To address this, I improved tracking reliability by adjusting lighting and using high-contrast image markers. Lastly, simplifying interaction logic using Unity’s Inspector-based OnClick system helped me manage button functionality more clearly, reducing the chance of bugs and improving efficiency during testing.