24/04/2024 - 02/08/2024 / Week 1 - Week 14
Joey Lok Wai San / 0350857
Experiential Design / Bachelor of Design (Hons) in Creative
Media
Task 1: Trending Experience
TABLE OF CONTENTS
LECTURES
YouTube Playlist Link: https://www.youtube.com/playlist?list=PLZXlN49OF94r7vxcxwp6Af7loi05erRwc
WEEK 1
For our first class, Mr. Razif briefed us on the
module as well as the tasks and the expectations. Mr.
Razif also provided us with several examples of our
senior's work, to help us gain a clear understanding of
the design ideas for our upcoming project.
Augmented Reality (AR)
- The content is overlaid on top of a real object
Mixed Reality (MR)
- Can interact with the virtual objects
Virtual Reality (VR)
- Uses a headset in a computer-generated world
- Can interact with the virtual objects
Designing AR
Marker-based Augmented Reality (AR)
Requires a marker, such as images or shapes, to activate the virtual object
Markerless AR
The surrounding environment is scanned so that the virtual object can be placed on surfaces in the real world
Design Thinking Process
The design thinking process consists of empathise, define, ideate, prototype and test. There is no strict order to follow, as designers often move back and forth between stages.
AR Animals on Google
Fig. 1.4 Lecture Slides - AR Design, Week 1 (24/04/2024)
WEEK 2
There was no lecture this week due to a public holiday.
WEEK 3
Mr. Razif introduced us to User Mapping and Journey
Mapping to help us understand a user's experiences. This
allows us to understand how to enhance the quality of
the overall experience with products or services.
Terminology
- XD: Experience Design
- UX: User Experience
- UI: User Interface
- CX: Customer Experience
- BX: Brand Experience
- IxD: Interaction Design
- SD: Service Design
User Mapping
Empathy Map
- Four components: Says, Does, Thinks, and Feels
- Creates a shared understanding of others
- Aids in decision-making
Journey Map
- To map the user’s journey when they perform a
certain task
- Current Journey Map: The current user experience, which has many touchpoints. Each touchpoint may have a gain point, a pain point, and a solution to the pain point.
- Future Journey Map: Addresses all the issues from the current journey map
Fig. 1.7 Lecture Slides - Experience Design, Week 3 (08/05/2024)
INSTRUCTIONS
https://drive.google.com/file/d/1VTud1K8MvUQGR3C0XpZ7vf3aqRaEh_eq/preview
TASK 1: Trending Experience
WEEK 1
Imagine the scenario in either of the two places. What would the AR experience be and what extended visualisation can be useful? What do you want the user to feel?
Scenario: Shopping Mall - Sunway Pyramid
The scenario I chose is Sunway Pyramid, a large shopping mall with
over 1,000 retail stores whose multi-level design can make navigation
challenging. I lose my sense of direction every time I visit, which
makes it difficult to find a particular store, the toilets, the
elevators, and so on. A navigation app built on AR technology would
benefit me and other visitors: for example, a directory with arrows
overlaid on the floor or space showing the shortest route to a
location, along with how long it would take to travel there.
- Detects the user's current location within the mall and provides navigation to ensure users can follow the route easily without getting lost.
- Displays the approximate distance and time to reach the nearest toilet.
- Voice-guided directions for users who prefer auditory instructions over visual ones.
- Information about other stores along the way to the destination.
After selecting the desired location, a layout plan of the shopping
mall will appear, indicating the customer's current location and the
route to get to the location. The live view navigation utilises AR to
indicate the directions.
WEEK 3
We were assigned to form a group and collaborate on a user journey map for visiting a location while implementing an augmented reality (AR) experience. We had to list out the tasks, identify gain points, pain points and possible solutions using the AR mobile app. We then distributed the tasks and used Miro to work on our parts collectively.
We worked on Taylor's University, and how to navigate from the LRT station to the Experiential Design classroom.
https://miro.com/app/board/uXjVKK5NQog=/?share_link_id=171639004414
WEEK 3
For this week, we were guided on how to use Unity Hub. We were instructed to download the Vuforia Engine package and import it into Unity.
The image target needs at least a 4-star rating in the Vuforia database to ensure it can be recognised properly.
Fig. 2.3 Database in Vuforia Engine, Week 3 (08/05/2024)
We also learned how to add a license key in Unity using the Vuforia Engine: key in the license name under 'Get Basic', copy the license key, and paste it into the Vuforia configuration in the Inspector.
1) Add Vuforia Package into Unity
2) Add AR Camera
3) Add license
3D Object on Image Target
I was having some difficulties with Unity and Vuforia, which meant I could not keep up with the tutorial. My friend sent me this YouTube video on how to add image targets in Unity with Vuforia.
Video tutorial:
https://www.youtube.com/watch?v=Z4bBMpa4xWo&t=627s&ab_channel=Vuforia%2CaPTCTechnology
To place a 3D object as a child of the Image Target, right-click the
Image Target and add the 3D shape of your choice. I went with a sphere
at first, then changed to a cube.
This is the final outcome of a 3D object on the Image Target.
Fig. 2.8 Outcome of 3D cube on the Image Target, Week 3 (11/05/2024)
Video on Image Target
We then learned to incorporate a video on the Image Target.
- Add a Plane object
- Add a Video Player component on the Plane
- Drag the video file to the Video Clip. The video can be adjusted based on loop, playback speed and volume
For the chosen video, I found a GIF of a cute cat (because cats are
adorable, cats are life) and converted it to MP4 format, because
Unity does not recognise or play GIFs.
Following what I have learnt from the tutorial, I decided to try my
hand at adding another video to the Image Target. This time, I
decided to add a Video Player to the 3D cube.
Fig. 2.11 Outcome of Video on the Image Target, Week 3 (11/05/2024)
WEEK 4
This week we continued our lessons on Unity, specifically focusing
on how to add functions to an object, creating buttons that play or
pause a video, animating, and using the build settings.
Add functions to an object
We learned how to make an image or object appear or disappear (with
play/pause functions) depending on whether the Image Target is in
frame. When the Image Target is scanned, the video appears and plays
automatically; when we move the camera away from the Image Target,
the video pauses.
- Click on the Image Target
- In the Inspector panel, go to 'Event(s) when target is found'
- Drag the image or video in as the object
- On 'Target Found': click + and add the object you want to control → choose the VideoPlayer - Play function
- On 'Target Lost': click + and add the object you want to control → choose the VideoPlayer - Pause function
- Go to 'Advanced Settings'
- Under 'Device Tracker Settings', uncheck 'Track Device Pose'
Track Device Pose checked: if the tracker is lost, the
object remains in the same position
Track Device Pose unchecked: if the tracker is lost, the
object disappears as well

Fig. 3.1 'Target Found' and 'Target Lost' - object disappears when image target is out of frame, Week 4 (17/05/2024)
Fig. 3.2 Uncheck 'Track Device Pose', Week 4 (17/05/2024)
Fig. 3.3 Outcome of Play/Pause functions, Week 4
(17/05/2024)
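The Inspector wiring above can also be expressed as a small script. This is a rough sketch only: the class name VideoTargetEvents and the method names are hypothetical, and it assumes the Vuforia 'On Target Found' / 'On Target Lost' events are pointed at these methods in the same way the Inspector entries were.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: methods intended to be hooked into Vuforia's
// 'On Target Found' / 'On Target Lost' events in the Inspector.
public class VideoTargetEvents : MonoBehaviour
{
    public VideoPlayer videoPlayer; // assign the Plane's Video Player component

    public void PlayVideo()  { videoPlayer.Play();  } // call on Target Found
    public void PauseVideo() { videoPlayer.Pause(); } // call on Target Lost
}
```

Either approach gives the same result; the tutorial's Inspector-only method simply avoids writing any code.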
Adding a canvas and changing the canvas size
- Right-click in the Hierarchy panel → UI → Canvas (an EventSystem is created automatically)
- Set 'Render Mode' → 'Screen Space - Overlay'
- Click on Build Settings
- Switch the platform to Android
- Go to the 'Game' tab and change the aspect ratio to a portrait view (1080 x 1920)
- Every time you create a canvas: set 'UI Scale Mode' → 'Scale with Screen Size'
- Every time you create a canvas: adjust the 'Reference Resolution' based on the aspect ratio
Play & Pause buttons
The play/pause functions can be controlled using buttons. On the
canvas, add two buttons (Play and Pause).
- Right-click in the Hierarchy panel → UI → Button. Put it under the canvas.
- Click + to add a component to 'On Click'
- Insert video target → select object: Plane - Video Player
- Select VideoPlayer - Play for Play button, VideoPlayer - Pause for Pause button
Show & Hide objects
The show and hide function can either reveal or hide an object upon clicking the corresponding button.
- Right-click in the Hierarchy panel → UI → Button. Put it under the canvas.
- Select object
- Apply the GameObject - SetActive function to show or hide the object
SetActive checked: the value passed is true, so
clicking the Show button will show the object.
SetActive unchecked passes false, hiding the object.
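What the Show/Hide buttons do under the hood is a single call to GameObject.SetActive. As a minimal sketch (the class and method names here are hypothetical, not from the class files):

```csharp
using UnityEngine;

// Hypothetical helper showing what the Show/Hide buttons do:
// SetActive(true) reveals the object, SetActive(false) hides it.
public class ShowHideObject : MonoBehaviour
{
    public GameObject target; // assign the object in the Inspector

    public void Show() { target.SetActive(true); }
    public void Hide() { target.SetActive(false); }
}
```

In the tutorial the same effect is achieved without a script, by selecting the GameObject.SetActive function directly in the button's 'On Click' event.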
Animations on an object
- Click on the object you want to animate.
- Go to the animation tab → add a new animation clip (located in the Windows panel)
- Click record and animate the clip. Stop recording.
- Create a new folder named "Animations". In this folder, save the animation named "Object_idle"
- Create another animation named "Cube_stop" without recording any movements
- Open Animator
- Set the animation with keyframes as the 'Layer Default State'
Fig. 3.11 Set the Layer Default State, Week 4 (17/05/2024)
Fig. 3.12 Outcome of Animate function, Week 4
(17/05/2024)
Animate/Stop Animate buttons
For the Start Animate button:
- Right-click in the Hierarchy panel → UI → Button. Put it under the canvas.
- Select object
- Choose Animator - Play
- Type the animation file name (e.g. "Object_Idle")
For the Stop Animate button:
- Select the button
- Select object
- Choose Animator - Play
- Type the name of the empty animation file (e.g. "Object_Stop")

Fig. 3.14 Adding a Stop Animate function to one button, Week 4 (17/05/2024)
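Both buttons end up calling Animator.Play with a state name, which is why the Stop Animate button simply plays the empty clip. A hypothetical sketch (the class name and the exact state names are assumptions based on the steps above):

```csharp
using UnityEngine;

// Hypothetical sketch of the Animate / Stop Animate buttons:
// both call Animator.Play, just with different state names.
public class AnimateButtons : MonoBehaviour
{
    public Animator animator; // the animated object's Animator component

    public void StartAnimate() { animator.Play("Object_Idle"); } // clip with keyframes
    public void StopAnimate()  { animator.Play("Object_Stop"); } // empty clip
}
```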
Build to Device
Fill in the basic information such as the product name, app version,
icon, and splash image.
- Access 'Player Settings' in the Build Settings
- Select Android
- Uncheck 'Auto Graphics API' and remove 'Vulkan'
- Locate 'Identification' and change the 'Minimum API Level' to Android 11
- Locate 'Configuration' and change the 'Scripting Backend' to IL2CPP
- Change 'Target Architectures' to 'ARM64'
- Click 'Build And Run'
Fig. 3.18 Android 11 / IL2CPP / ARM64, Week 4 (17/05/2024)
Fig. 3.19 Outcome of Build to Device, Week 4 (17/05/2024)
WEEK 6
This week we were taught how to make a markerless AR experience, where the camera only needs to detect a flat surface (e.g. floor/ground) to spawn our 3D object.
Previously, we did a marker-based AR experience, which is why we need an ImageTarget to trigger the AR visuals. In our case, we spawn a 3D cube when the image target is detected. We then changed it to a video and we added controls to the video.
Fig. 4.1 Marker-less AR Experience Tutorial, Week 6 (29/05/2024)
To start, set up the Vuforia Package in Unity, and add the following:
Plane Finder: Allows you to detect the ground/ surface.
Ground Plane Stage: Allows you to place objects on the ground/ surface. The object to spawn in the real world has to be a child of the Ground Plane Stage.
Anchor Stage: To anchor the Ground Plane Stage to the Plane Finder, drag and drop the Ground Plane Stage to the 'anchor stage' in the Plane Finder.
Duplicate Stage:
- Checked: spawns a new copy of the object each time you tap
- Unchecked: tapping moves the existing object
'On Target Found' and 'On Target Lost' allow you to customise the interactivity, e.g. showing other panels or screens. The grid on the Ground Plane Stage shows real-world measurements.
Ground plane simulators can be used to simulate the real ground and to check whether the object spawns on it correctly before building the app.

Fig. 4.4 Ground Plane Simulators, Week 6 (31/05/2024)
Fig. 4.5 Outcome of Markerless AR with a Cube, Week 6 (31/05/2024)
We were required to create a markerless AR experience, as if it were the IKEA app, using furniture such as chairs, tables, etc. Unity supports .FBX and .OBJ files for 3D models. I tried out a variety of 3D furniture models to test how to download and import the files into Unity.
After lots of browsing, I chose a coffee table as the furniture I wanted to use. I downloaded the files provided, then imported them and made it a child of the Ground Plane Stage.
3D Model source: https://free3d.com/3d-model/coffee-table-54484.html
Once I imported the model into Unity, I faced some problems with adding the texture and materials. A big thank you to my friend for helping me; I watched so many tutorials and nothing worked.
To add texture and materials to the object, drag the downloaded .jpeg texture and drop it onto the desired surface/object.
Fig. 4.10 Outcome of Markerless AR with a Table, Week 6 (31/05/2024)
WEEK 9
This week we were taught how to create a 3D space and scenes, as well as create a script for screen navigation.
3D Room - ProBuilder
To create a 3D room, we use ProBuilder.
- Install ProBuilder into Unity: Window → Package Manager → Unity Registry → Search → Install
- Open ProBuilder: Tools → ProBuilder
- Create walls: New shape → Drag to create walls (1 square = 1 meter)
Scenes
We can create scenes and navigate between multiple scenes using buttons. For instance, we renamed the current scene with the 3D space and object to "ARScene" and created a "Menu" and "Exit" scene, as well as added buttons for navigation on all scenes on a Canvas.
- Menu scene has Start and Exit buttons
- Exit scene has Start and Menu buttons
- ARScene (Start scene) has Menu and Exit buttons
Scene Manager Script
The scenes must be properly arranged before writing a script for navigation. Add the scenes in Build Settings → First scene on top, last scene on the bottom. Then, create a script file.
Script in Visual Code
There are two different ways to create the script for navigation (1) individual script for each scene and (2) general script for all scenes using string.
- Create a new folder "Scripts" in the "Assets" folder (in this case, the script lives in the "Menu" scene)
- Create a C# script file
- Rename it right away to "MySceneManager" or "MyScript". The file name must match the class name inside the script.
- Open the script in Visual Code
Fig. 5.4 Individual script for each scene, Week 9 (19/06/2024)

Fig. 5.5 General script for all scenes using string, Week 9 (19/06/2024)
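The general, string-based version shown in Fig. 5.5 can be sketched roughly as below. This is a minimal sketch under assumptions: the class name MySceneManager and the lowercase method name follow the naming used in class, and it assumes the scenes have already been added in Build Settings.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement; // Unity's built-in scene-loading API

// General navigation script: one method works for every scene,
// because the target scene name is passed in as a string.
public class MySceneManager : MonoBehaviour
{
    // Hooked up to a button's 'On Click' event in the Inspector;
    // the scene name (e.g. "ARScene") is typed into the string field.
    public void changetoscene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

The per-scene alternative from Fig. 5.4 would instead expose one parameterless method per scene (e.g. changetoARScene), each calling SceneManager.LoadScene with a hard-coded name.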

Add script to buttons in Unity
- Create an empty object and name it "SceneManager"
- Drag and drop the script to "SceneManager"
- Copy "SceneManager" from the scene and paste it to the chosen scene (e.g. copy from "Menu" to "Exit" scene)
- Go to the button, drag in the "SceneManager" to "On Click"
- Change the function to "MySceneManager" → "changetoscene" → type the scene name (e.g. "changetoARScene")
WEEK 10
This week we were taught how to import and apply UI elements to buttons. It is better to separate the text from the button (use a blank button design): keeping the text baked into the button image gives less flexibility, is harder to adjust, and the colour tint is applied over the text as well.
Fig. 6.1 x Week 10 (26/06/2024)
- Import or drag and drop the UI elements into the Assets folder.
- Select all the buttons/UI elements
- Change the Texture Type to "Sprite (2D and UI)": since this is a 3D project, Unity assumes every imported asset is 3D, but we are now creating a UI design
- Click "Apply"
Change button style and size
- Click on the button
- Change the "Source Image" of the button to the UI element/ button style you want
- Under Image, click "Preserve Aspect" to maintain the aspect ratio and make sure it is not distorted
Fig. 6.1 x Week 10 (26/06/2024)
Change button colour using colour tint
Under Interactable, change "Transition" to "Colour Tint"
- Normal colour: the default colour; nothing changes
- Highlighted colour: changes colour when the mouse hovers over the button
- Pressed colour: changes colour when the button is clicked/tapped
Change button style
You can swap the button image:
- Change "Colour Tint" to "Sprite Swap"
- Drag a different button design under "Highlighted Sprite"
When hovered over, the button will change from the original style to the other button style.
Button animation
- Under Interactable, change "Transition" to "Animation"
- Click "Auto Generate Animation"
- Create a new folder called "Animation" and save it inside
- Inside the "Animation" folder, click on the arrow button to see all the clips (they do not have any animation yet)
- Check "Loop Time" if you want the animation to loop. Uncheck "Loop Time" if you do not want the animation to loop.
- Click on the "Animation" tab
- Under Preview, you can change which clip you want to animate. E.g. Change the "Normal" to "Highlighted" to show animation when hovered over
- Click on the record button
- The first keyframe is the normal state; move the button to register its position on the first keyframe
- To make the animation, move the cursor → select the next keyframe → change the scale/make the animation
- To check, go to "Animator" on the top tab
Add image in 3D Room
- Create a new canvas on the "ARScene", call it "RoomCanvas", and place it under "MyRoom"
- Add an image under it
- Hold SHIFT + ALT and click the bottom-right anchor preset to stretch the image to fit the canvas
- Change the canvas Render Mode from "Screen Space - Overlay" to "World Space" so it can be resized
- Drag "AR Camera" into "Event Camera"
- Resize the canvas to fit in the 3D room
- Insert image
Add video in 3D Room
- Create a new plane on the "ARScene", call it "Video Plane", and place the plane under "MyRoom"
- Place the plane on 3D Room wall
- Add video component → Video Player
- Add video clip
- Check "Play on Wake" to play automatically; uncheck "Play on Wake" before adding control buttons
- Add button under "RoomCanvas" → overlay on top of the video plane
- Play Button: "On Click" → Under "None" → Drag plane with video → Function: Play
- Pause Button: "On Click" → Under "None" → Drag plane with video → Function: Pause
Overlay a guide over the device
- Add canvas → rename it to "GuideCanvas" (overlay on top of the phone to guide people on what they need to do)
- Add text under canvas (e.g. 'Scan the floor slowly')
- The guide canvas should hide once the floor is detected and show again when tracking is lost:
- On "Ground Plane Stage" → "On Target Found" → drag "GuideCanvas" to None → Function: GameObject.SetActive → uncheck the box
- On "Ground Plane Stage" → "On Target Lost" → drag "GuideCanvas" to None → Function: GameObject.SetActive → check the box
FEEDBACK
WEEK 3
Misunderstood the instructions a little bit. The
group's user journey map assumes there is an app
already being used. Try not to focus on the existence
of the app next time.
REFLECTION
WEEK 1 - 14
Experience
My experience with this module thus far has been a very mixed one. On the one hand, I appreciate getting to learn new things as well as having very insightful lectures provided to us. This was also my first class with students outside my course, so it was interesting to see everybody's perspective, especially when it came to group activities. However, I felt very overwhelmed at the beginning of this module, mainly because I completely forgot to download Unity for our first tutorial and had no prior knowledge of it. I found it challenging to keep up during tutorial classes, especially with my short attention span. I am grateful for my classmates who guided me and for Mr. Razif's class recordings.
Observations
There are not many observations that I made during this task. The main one is that it is important to stay up to date with the lectures: miss even one, and it becomes easy to fall behind and struggle to keep up with the work. This is something I noticed happening to myself. Moving forward, I learned to be more prepared with the class materials and to keep myself disciplined to do the work (if I even can).
Findings