Dive into the intersection of education and cutting-edge technologies like Virtual Reality (VR), Metaverse Technology, and Extended Reality. Explore research, innovations, and applications transforming the way we learn and interact in digital spaces.

Ye Jia

MPhil Student
The Hong Kong Polytechnic University

About Me

Research Interests

  • Virtual Reality
  • Metaverse Technology
  • Extended Reality
  • Human-Computer Interaction

Education

  • MPhil in Computer Science
  • The Hong Kong Polytechnic University
  • BEng in Robotics Engineering
  • Chongqing University of Arts and Sciences

Featured Publications

Traceable teleportation: Improving spatial learning in virtual locomotion

Ye Jia, Zackary P. T. Sin, Chen Li, Peter H. F. Ng, Xiao Huang, George Baciu, Jiannong Cao, Qing Li. International Journal of Human-Computer Studies (2025)

In virtual reality, point-and-teleport (P&T) is a locomotion technique popular for its user-friendliness, low workload, and mitigation of cybersickness. However, most P&T schemes use instantaneous transitions, which are known to hinder spatial learning. Replacing instantaneous transitions with animated interpolations can address this issue, but the animations may inadvertently induce cybersickness. To counter these deficiencies, we propose Traceable Teleportation (TTP), an enhanced locomotion technique grounded in a theoretical framework designed to improve spatial learning. TTP incorporates two novel features: an Undo-Redo mechanism that facilitates rapid back-and-forth movement, and a Visualized Path that offers additional visual cues. We conducted a user study via a set of spatial learning tests within a virtual labyrinth to assess the effect of these enhancements on the P&T technique. Our findings indicate that the TTP Undo-Redo design generally facilitates the learning of orientational spatial knowledge without incurring additional cybersickness or diminishing the sense of presence.
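An undo-redo mechanism for teleportation can be pictured as a pair of history stacks, much like undo-redo in a text editor. The following is a minimal Python sketch of that general idea; the class and method names are illustrative and are not taken from the paper's implementation:

```python
class TeleportHistory:
    """Illustrative undo-redo stack for point-and-teleport locomotion.

    Positions are plain coordinate tuples; a real system would store
    full poses (position + orientation).
    """

    def __init__(self, start):
        self._undo = [start]   # visited positions; current position on top
        self._redo = []        # undone positions, available for redo

    @property
    def current(self):
        return self._undo[-1]

    def teleport(self, pos):
        """Jump to a new position chosen by the user."""
        self._undo.append(pos)
        self._redo.clear()     # a fresh teleport invalidates the redo chain

    def undo(self):
        """Step back to the previous position, if any."""
        if len(self._undo) > 1:
            self._redo.append(self._undo.pop())
        return self.current

    def redo(self):
        """Re-apply the most recently undone teleport, if any."""
        if self._redo:
            self._undo.append(self._redo.pop())
        return self.current
```

In this sketch, a fresh teleport clears the redo stack, so the stored history always reflects a single consistent branch of movement.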

A scoping review on the role of virtual walking intervention in enhancing wellness

Yushen Dai, Jiaying Li, Yan Li, Frances Kam Yuet Wong, Mengqi Li, Chen Li, Ye Jia, Yueying Wang, Janelle Yorke. npj Digital Medicine (2025)

Virtual walking has the potential to be an adjunct to traditional physical therapy. This scoping review synthesizes evidence on the characteristics, effectiveness, feasibility, and neurological mechanisms of virtual walking interventions on health-related outcomes. Articles in English were retrieved from twelve databases (January 2014–October 2024). Thirteen interventional studies were included, covering three types of virtual walking: passively observing movement (71.4%), arm-swing locomotion (21.5%), and foot-tracking locomotion (7.1%). Most studies (84.6%) involved individuals with spinal cord injuries, while the remaining studies focused on lower back pain (7.7%) and lower limb pain (7.7%). Over 70% of the studies used sessions of 11–20 min, delivered 1–5 times per week over 10–14 days. Statistically significant findings included pain reduction (84.6%), improved physical function (mobility and muscle strength), and reduced depression. Mild adverse effects (fatigue and dizziness) were transient. Neurological evidence indicates somatosensory cortex activation during virtual walking, possibly linked to neuropathic pain.

illumotion: An Optical-illusion-based VR Locomotion Technique for Long-Distance 3D Movement

Zackary P. T. Sin, Ye Jia, Richard Chen Li, Hong Va Leong, Qing Li, Peter H. F. Ng. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (2024)

Locomotion has a marked impact on user experience in VR, but common go-to techniques such as steering and teleportation have their limitations. In particular, steering is prone to cybersickness, while teleportation trades presence for mitigating cybersickness. Inspired by how we manipulate a picture on a mobile phone, we propose illumotion, an optical-illusion-based method that, we believe, can provide an alternative to these two typical techniques. Just as we zoom in on a picture by pinching two fingers, we can move forward by "zooming" toward part of the 3D virtual scene with pinched hands. Not only is the proposed technique easy to use, but it also appears to minimize cybersickness to some degree. illumotion relies on the manipulation of optics; as such, it requires solving motion parameters in screen space and a model of how we perceive depth. To evaluate it, we conducted a comprehensive user study with 66 users. Results show that, compared with teleportation, steering, or both, illumotion achieves better performance, presence, usability, user experience, and cybersickness alleviation. We believe this result is a clear indication that our novel optically driven method is a promising candidate for generalized locomotion.

Knowledge-Graph-Driven Mind Mapping for Immersive Collaborative Learning: A Pilot Study in Edu-Metaverse

Ye Jia, Xiangzhi Eric Wang, Zackary P. T. Sin, Chen Li, Peter H. F. Ng, Xiao Huang, George Baciu, Jiannong Cao, Qing Li. IEEE Transactions on Learning Technologies (2024)

One of the promises of the edu-metaverse is its ability to provide a virtual environment that enables learning activities similar to, or on par with, reality. The digital enhancements introduced in a virtual environment contribute to our raised expectations of novel learning experiences. However, despite its promising outcomes, adoption of the edu-metaverse for practical learning appears limited at this time. We believe this can be attributed to the lack of investigation into learners' behavior in a social learning environment. This gap is critical: without behavioral insight, the development of educational material and the direction of an edu-metaverse are hindered. From our pilot user studies, we provide the following insights: 1) compared to Zoom, a typical video conferencing and remote collaboration platform, learners in the edu-metaverse demonstrate heightened involvement in learning activities, particularly when drawing mind maps aided by the embedded knowledge graph, and this co-presence significantly boosts learner engagement and collaborative contribution to the learning tasks; and 2) the design of interaction and learning activities within the edu-metaverse, particularly the use of mind mapping, warrants further investigation.

Featured Talks

Towards Effective Collaborative Learning in Edu-Metaverse: A Study on Learners' Anxiety, Perception, and Behaviour

ICWL 2024, Shanghai, China

In the evolving landscape of educational technology, Edu-Metaverse presents a unique technological platform for collaborative learning (CL), which can be especially useful in distance learning settings. Researchers have taken a keen interest in the potential of Edu-Metaverse for enabling and improving CL; however, the effects of various factors on CL behaviours and performance have yet to be fully understood. This study used a within-subjects design involving 32 participants (16 females and 16 males) to investigate how learners' attributes and environmental attributes affect CL in Edu-Metaverse. The participants were randomly assigned to groups of four for a CL session in Edu-Metaverse. Confirmatory factor analysis revealed that various behavioural metrics in Edu-Metaverse mediated the effects of trait anxiety and virtual space satisfaction on CL performance; perceived understanding of messages, under the umbrella of social presence, also had a direct effect on CL performance. These insights underscore the importance of optimising interactive, perceptual, and social components to make CL more effective in Edu-Metaverse.

Towards Effective Collaborative Learning in Edu-Metaverse: A Study on Learners' Anxiety, Perception, and Behaviour

2nd Annual IEEE International Conference on Metaverse Computing, Networking, and Applications, Hong Kong SAR

Edu-metaverse is a specialized metaverse dedicated to interactive education in an immersive environment. Its main purpose is to immerse learners in a digital environment and conduct learning activities that could mirror reality. Not only does it enable activities that may be difficult to perform in the real world, but it also extends interaction to personalized and collaborative learning (CL). This is a more effective pedagogical approach, as it tends to enhance students' motivation and engagement and increases their active participation in lessons. To this end, we propose an interactive virtual teaching assistant called NivTA. To make NivTA easily accessible and engaging for multiple users simultaneously, we also propose using a CAVE virtual environment (CAVE-VR) as a "metaverse window" into concepts, ideas, topics, and learning activities. Students simply step into the CAVE-VR and interact with a life-size teaching assistant they can engage with naturally, as if approaching a real person. Instead of the text-based interaction currently developed for large language models (LLMs), NivTA is given additional cues about the users so it can react more naturally via a specific prompt design. For example, a user can simply point to an educational concept and ask NivTA to explain what it is. To guide NivTA onto the educational concept, the prompt is also designed to feed in an educational knowledge graph (KG) that provides NivTA with the context of the student's question. The NivTA system integrates several components that are discussed in this paper. We further describe how the system is designed and implemented, along with potential applications and future work on interactive collaborative edu-metaverse environments dedicated to teaching and learning.
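The general idea of feeding KG facts into an LLM prompt can be sketched in a few lines. This is a hypothetical illustration, assuming the knowledge graph is available as a simple adjacency mapping; none of the names below come from the NivTA implementation:

```python
def build_prompt(question: str, concept: str, kg: dict) -> str:
    """Assemble an LLM prompt grounded in knowledge-graph context.

    kg maps a concept name to a list of (relation, object) pairs,
    e.g. {"stack": [("is_a", "data structure")]}. Purely illustrative.
    """
    facts = kg.get(concept, [])
    context = "\n".join(f"- {concept} {rel} {obj}" for rel, obj in facts)
    return (
        "You are a virtual teaching assistant.\n"
        f"The student is pointing at the concept: {concept}\n"
        "Relevant knowledge-graph facts:\n"
        f"{context}\n"
        f"Student question: {question}\n"
        "Answer using only the facts above when possible."
    )
```

Grounding the prompt in explicit KG facts keeps the assistant's answer anchored to the curriculum rather than relying solely on the LLM's general knowledge.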