Project Lead: N. Weliweriya

Enhancing Student Understanding through Immersive Learning Environments

This project proposes extending recent technological developments to the department's astronomy courses, strengthening STEM education research and enhancing the quality of instruction in the physics department and across other STEM disciplines. We propose to develop a virtual reality (VR) platform with specific modules and to test the platform's effectiveness for enhancing the quality of astronomy instruction and research, and STEM education more broadly.

# The ideas below relate only to astronomy topics, but Ania has expressed interest in using a similar approach in her Physiology and Pharmacology courses.

The project aims to overcome the limitations and issues in current astronomy lab courses and provide a method for students to engage in real-world scenarios and practice problem-solving and decision-making skills in astronomy courses. The proposed VR platform will include specific modules like Virtual Night Sky, Eclipses, Tides, and Parallax, which will be tested for effectiveness in enhancing student understanding.

Current limitations and issues:

Current astronomy lab courses at UGA utilize both indoor theoretical activities and outdoor observational activities. The indoor activities cover vital concepts such as parallax calculations, light pollution, Kepler's laws, and more. Current indoor lab exercises rely on students making basic measurements and/or interpreting pre-generated graphs. These activities would benefit from the addition of virtual reality tools that provide realistic depictions of phenomena in the night sky. Labs can be updated to involve observations made with virtual reality headsets that otherwise cannot be made due to time constraints and technological limitations. These simulated observations can replace the pre-generated graphs currently in use, allowing students to experience the process of collecting data and interpreting it.

The other component of current astronomy labs is the outdoor observing session. The main limitation students face in these courses is the weather: outdoor observing sessions must often be postponed or canceled due to cloudy skies. Additionally, light pollution on campus makes most observations difficult, if not impossible, especially when the stadium lights are on at Sanford Stadium. Other obstructions, such as trees, buildings, and construction equipment, block critical viewing windows. Simulated observation environments will allow students to observe regardless of weather and other factors. Moreover, simulated environments will give instructors a way to demonstrate a range of observing conditions, from optimal dark-sky sites to poor viewing conditions under light-polluted and obstructed city skies. This is currently not possible, as astronomy courses are limited to the viewing conditions on campus.

Planned Virtual environments include:

Virtual Night Sky: The main component of astronomy lab courses is learning the night sky and how it maps to the celestial sphere, which requires both indoor activities and outdoor observations. Understanding how the night sky maps to the celestial sphere and how to orient oneself under the sky are foundational concepts for students to learn. Implementing an environment that displays the night sky from any location on Earth will provide students with a method to learn the locations of stars, constellations, and planets. Additionally, instructors will be provided with methods to control this environment to show the movement of these astronomical bodies throughout the night. This environment can be extended to show an outside view of the celestial sphere and the 3D nature of the cosmos.
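At the core of such an environment is the standard conversion from equatorial coordinates (right ascension and declination) to the altitude and azimuth an observer at a given latitude would see. A minimal sketch of that computation (the function name and interface are illustrative, not part of the proposal):

```python
import math

def altaz(ra_h, dec_deg, lat_deg, lst_h):
    """Convert equatorial coordinates (RA in hours, Dec in degrees) to
    horizontal altitude/azimuth (degrees) for an observer at latitude
    lat_deg when the local sidereal time is lst_h hours."""
    ha = math.radians((lst_h - ra_h) * 15.0)   # hour angle (15 deg per hour)
    dec = math.radians(dec_deg)
    lat = math.radians(lat_deg)
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.asin(sin_alt)
    # azimuth measured from north, increasing eastward
    cos_az = (math.sin(dec) - math.sin(alt) * math.sin(lat)) / (
        math.cos(alt) * math.cos(lat))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if math.sin(ha) > 0:                        # object west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# e.g., for an equatorial observer, a star on the celestial equator one hour
# past the meridian sits 15 degrees west of the zenith (alt 75, az 270)
alt, az = altaz(ra_h=5, dec_deg=0, lat_deg=0, lst_h=6)
```

Stepping this conversion forward in sidereal time is what animates the movement of stars across the virtual sky over the course of a night.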

Measures of Success

We propose to develop an astronomical VR platform with two specific modules to test the platform's effectiveness in enhancing student understanding. The two specific modules for the pilot study are:

  • Eclipse
  • Parallax and proper motions

If this pilot study yields encouraging results, we plan to expand the scope with more modules and seek external funding support. With the UGA LTG grant, we propose to create a platform with the two modules above. In the proposed astronomical 3D VR platform, students can add components such as stars and planets. Students can then define the dynamical and kinematic properties of the added components, such as space and orbital motions. Lastly, students can control viewing parameters such as the observer's location. Key parameters can be adjusted on the fly, and the dynamical system will reflect the changed parameters accordingly.
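For the Parallax module, for example, a student could read off a star's apparent shift between two observer positions and convert it to a distance. A minimal sketch of the underlying arithmetic (function names are illustrative):

```python
def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from an annual parallax angle in arcseconds.
    By definition, 1 pc is the distance at which the parallax is 1 arcsec."""
    if parallax_arcsec <= 0:
        raise ValueError("parallax must be positive")
    return 1.0 / parallax_arcsec

def apparent_shift_arcsec(distance_pc, baseline_au=2.0):
    """Total angular shift of a star seen from the two ends of a baseline
    in AU (2 AU = opposite sides of Earth's orbit), small-angle approx."""
    return baseline_au / distance_pc

# a star with a 0.1-arcsec parallax lies at 10 pc and shifts by
# 0.2 arcsec between opposite sides of Earth's orbit
d = parallax_distance_pc(0.1)
shift = apparent_shift_arcsec(d)
```

In the VR module, the student would set the observer's position on either side of the orbit, measure the shift against background stars, and recover the distance from this relation.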

Learning outcomes of the astronomical VR platform:

The proposed astronomical VR platform can offer a variety of learning outcomes.

Improved understanding of astronomical concepts: The three-dimensional immersive experience in our proposed VR platform can help students better understand basic astronomical concepts, such as the solar system's structure, as well as more complex concepts such as eclipses, tides, and parallax.

Improved critical thinking and problem-solving skills: Astronomy involves analyzing data, making predictions, and evaluating evidence. Using a VR platform, users can develop their critical thinking and problem-solving skills by exploring astronomical phenomena, interpreting data, and testing hypotheses.

Enhanced spatial reasoning skills: Astronomy involves thinking about objects and phenomena on a vast scale, often at distances and sizes that are difficult to comprehend. Users can use a VR platform to develop their spatial reasoning skills by visualizing and manipulating astronomical objects in three dimensions.

Increased engagement with astronomy: A well-designed VR platform can be highly engaging, motivating users to explore and learn more about astronomy. This can lead to a greater interest in astronomy and may encourage users to seek further information or resources.

Exposure to scientific research methods: Astronomy is a highly data-driven field, and scientific research methods are essential to the discipline. Using a VR platform allows users to gain exposure to these methods, such as observing phenomena, collecting data, and interpreting results.

Assessing the effectiveness of the VR platform:

With approval from UGA's IRB office, we will recruit students enrolled in the relevant astronomy courses (ASTR 1010, ASTR 1110, and ASTR 2030L) to test the VR platform's effectiveness.

Assessing the effectiveness of our astronomical VR platform can involve several factors.

Testing students' understanding of concepts:

Method 1: Collect pre-post test scores of students using standardized astronomy assessments such as the Test of Astronomy Standards (TOAST), the Astronomy Diagnostic Test 2.0 (ADT2), and the Star Properties Concept Inventory (SPCI). (Link:

These tests will measure students' general astronomy content knowledge across topics including, but not limited to, gravity, the evolution of the universe, stars and stellar evolution, the evolution and structure of the solar system, seasons, scale, yearly patterns, daily patterns, and moon phases.
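A common way to summarize such pre/post concept-inventory scores is Hake's average normalized gain, which measures how much of the available improvement students achieved. A minimal sketch, using invented example scores:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre),
    averaged over students, for paired pre/post test scores."""
    gains = [(b - a) / (max_score - a)
             for a, b in zip(pre, post) if a < max_score]
    return sum(gains) / len(gains)

# synthetic example scores (percent correct), for illustration only
pre  = [40.0, 55.0, 30.0]
post = [70.0, 85.0, 58.0]
g = normalized_gain(pre, post)   # average fraction of possible gain realized
```

Comparing the normalized gain of sections using the VR modules against sections using the current labs would give a standard effect measure familiar from physics education research.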

Method 2: Collect data from think-aloud interviews with individual students or groups of students. These interviews will follow up on the user surveys we collect after each session.

Collecting user feedback: End-of-session survey questions and follow-up think-aloud interview questions will be designed to explore user satisfaction, learning outcomes, and areas for improvement. Responses from both methods will help us collect student feedback on their experiences using the platform.

Measuring user engagement: We will track the number of users who access the platform, how long they spend on it, and how frequently they return to use it. Most importantly, the newer Meta VR headsets allow us to collect users' eye-movement data, which provides vital information about user engagement.

Further, STEM research on students' ability to solve physics problems has found that students have difficulty interpreting, constructing, and switching between representations (algebraic, gestural, graphical, and verbal). In the PI's recent work on upper-division students' problem-solving processes, we use students' oral exam data to examine representations at a microscopic level. He uses the disciplinary affordances of social semiotic resources to describe how representations are developed, judged insufficient, and replaced or augmented by new ones that students bring in. The previous analysis depends solely on student reasoning and the interviewer's notes on students' thought processes. As a next step of this project, we wonder whether we could track students' eye movements to investigate which representations, or features of representations, they pay attention to while solving problems.

Comparison with other instructional methods and technical performance: The platform's performance will be compared with traditional classroom instruction or online learning modules already available. We will also assess the quality of 3D graphics and the platform's stability.

Involving Courses:

Astronomy lectures and labs. (Ania showed an interest in using a similar approach in her Physiology and Pharmacology courses.)

Current Funding:

  • CTL’s learning technologies grant (LTG) 2023

Future Funding opportunities:

  • next rounds of CTL’s LTG grants (Spring 2024)
  • NSF: IUSE proposal for January 2023

This is a collaboration with Ania Mejewska at the School of Veterinary Medicine.

Ania's paper on using VR in an online class:

Ania is working with my graduate student on the following project idea:

Testing whether AR is more conducive than traditional visual aids to learning about the heart and the spatial relationships of its various parts. In other words, does AR provide benefits for the acquisition of conceptual and spatial knowledge?

The experimental design follows work done by Logan Fiorella at the School of Education:

After recruitment, the participants come in for a first visit, where they take a knowledge pre-test and a paper folding test (a spatial ability test). Next, they read text about the cardiovascular system and are randomly assigned to one of four heart modalities (paper, 3D-printed plastic, AR tablet, or AR headset). The four modalities display the 'same' heart and are as similar as possible. During the 15-minute visual learning phase, the participants hear audio about the anatomy of the heart and are free to 'explore' the heart. Following this, there is a post-session survey to gauge whether participants were interested and motivated by the model, as well as to assess any motion sickness. About a week later, the participants return for a second visit to take a post-test, which includes conceptual and spatial questions.

For the statistical test, we could do something to the effect of: PostTestScore ~ ModalityType + PreTestScore + SpatialAbilityScore
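This model is an ANCOVA: the modality effect on the post-test, adjusted for pre-test score and spatial ability. A minimal sketch of fitting it by ordinary least squares, with all data invented purely for illustration (and only two modalities for brevity):

```python
import numpy as np

# synthetic data: 8 participants; 0 = paper model, 1 = AR headset (dummy-coded)
modality = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
pre      = np.array([40, 55, 30, 60, 45, 50, 35, 65], dtype=float)
spatial  = np.array([10, 14,  8, 16, 11, 13,  9, 15], dtype=float)
post     = np.array([60, 75, 50, 80, 72, 78, 60, 90], dtype=float)

# design matrix: intercept, modality dummy, and the two covariates
X = np.column_stack([np.ones_like(pre), modality, pre, spatial])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)

# beta[1] estimates the modality effect on the post-test,
# adjusted for pre-test score and spatial ability
```

With the full four-modality design, the single dummy column would become three dummy columns (one per non-reference modality), and a dedicated package such as statsmodels would also provide standard errors and p-values for the adjusted effects.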

Allison Howard

Check "HoloLens", "exposure theory", "rubber-hand feelings".

Models were developed with ShapeXR and Joy (made by Adobe), tools that allow the creation of VR content from within VR itself.

She has also used a "clean box" to sanitize the headsets.