Ever wonder what happens when VR is used in real classrooms?

How can the real environment of the classroom interact and intersect with the virtual environment for deep learning and fun?

Virtual reality is becoming ubiquitous and affordable, and people are asking how it might be used to offer students of all ages wondrous learning experiences. When I scroll through my Twitter feed, I see all kinds of educational technology (EdTech) articles on virtual and augmented technologies, usually featuring glossy stock photos of children and young people sporting the wide-mouth VR gape, a kind of visual short-hand for just how amazing an immersive VR experience can be. Most of the articles that accompany these images are about how the special affordances of VR (its properties or possibilities for action) can be used for learning – for example, virtual field trips to amazing places on and beyond the planet, and the ability to manipulate the scale of virtual objects from the smallest (exploring a single human cell that appears as large as a person) to the largest (zooming in and out of archaeological sites from an aerial view to a single in-situ artefact).

While there is imaginative thought in EdTech evangelist articles, there is also a surprising lack of evidence about what actually happens when immersive technologies are introduced into real, live schools.

There is research from the fields of computer science and health on lab-based or clinical experimentation using immersive VR with children, but this research often has small numbers of participants and can be limited in its implications for everyday ‘natural’ settings. Classrooms are dynamic natural settings where learning, in all its complexity, is influenced by a range of factors: the individual differences of students and their socio-cultural and geographic backgrounds, peer interaction, mandated curriculum and assessment options, and the pedagogy or instructional choices teachers make every time they plan a lesson or step into a classroom.

So what happens when you provide students and teachers with the opportunity to use virtual reality for learning?

How can the curriculum be tailored to use immersive virtual reality for deep learning and how can we assess if VR actually enhances learning?

What are the opportunities and challenges of using the latest VR technology in school communities, including equity concerns?

How do students and teachers experience VR in their classrooms?

Importantly, given the developmental stages of learners, how can we use this type of technology safely and ethically in schools?

The purpose of the VR School Study is to create a robust, evidence-informed dialogue on these questions based on the data collected during our collaborative research with teachers in real schools. The project involves openly sharing insights and resources so that VR can be used in classrooms across subject areas. The focus is on developmentally and pedagogically appropriate and imaginative uses of the technology for deeper learning. We welcome dialogue with students, teachers, policy-makers, researchers and developers on using VR in schools and other educational settings.

This website is regularly updated with new insights and resources so follow it or check back every so often to find out what we are up to.

Erica Southgate (PhD), VR Enthusiast and Associate Professor of Emerging Technologies for Education, University of Newcastle, Australia.

 

 

Featured post

Dr Rick Skarbez on the past, present and future of mixed reality for education

As promised in my previous post, The Alphabet Soup of AR, VR and MR, I caught up with computer scientist, researcher and educator Dr Rick Skarbez (La Trobe University) to have a video conversation about his opinion piece It Is Time to Let Go of ‘Virtual Reality’. We discuss why so many terms are used for spatial computing (that’s the new buzzword) and the past, present and future of mixed reality technology for education, including the use of AI to improve user experience.

You can read more of Rick’s interesting work here.

The alphabet soup of AR, MR and VR

Dr Rick Skarbez, a computer scientist from La Trobe University, recently co-wrote an opinion piece entitled It Is Time to Let Go of ‘Virtual Reality’, published in Communications of the ACM. It’s a timely provocation that argues that virtual reality (VR) and augmented reality (AR) are subsets of mixed reality (MR). I realise this is a lot of acronyms, but it is an argument that deserves unpacking, especially in light of advances in VR hardware that allow virtual objects (holograms, 3D models) to be placed and interacted with in the real-life environment of the person using the equipment.

Of course, there have been MR headsets for some time. HoloLens, Magic Leap and the forthcoming Apple Vision Pro all fit within the mixed reality category, even though they are sometimes referred to as AR. All this is very confusing because ideas about MR have morphed over time. For example, I’ve read papers that describe MR experiments involving the addition of real-life objects (tangibles) to virtual reality environments. An example of this might be a VR system that associates real objects with 3D models; that is, it associates an ordinary box that a user can pick up in the real world with a 3D model of a house that the user can then interact with in a fully realised (synthetic) VR environment.

There have also been other ways to explain the difference between AR, VR and MR, like this infographic:
[Infographic: the differences between AR, MR and VR]

The above graphic is adapted from https://www.mobileappdaily.com/2018/09/13/difference-between-ar-mr-and-vr

Or this more detailed explanation:

[Image: a more detailed comparison of AR, VR and MR]

Some headsets now have passthrough cameras which record and track, in real time, the user’s environment and, like a live stream, render it in front of the user’s eyes in the headset. This means that virtual 3D objects can be seemingly overlaid on the user’s real environment, allowing the user to interact with those objects, increasingly through hand gestures and voice control, although hand-held controllers are still commonly used. This might allow you to put a virtual alpaca in what looks like your lounge room and, depending on the sophistication of the programming of the virtual beastie, interact with it in a playful manner. This is mixed reality because it blends the real (your lounge room) with the virtual (the synthetic alpaca) that is programmed to respond to you and your space. Another example is the use of MR headsets such as HoloLens to teleoperate robots with varying levels of autonomy through gestural control. The mixed reality headset manufacturer Magic Leap offers this useful overview of the difference between AR, VR and MR in terms of user experience:

 

Traditionally, many researchers have used the reality-virtuality continuum from Milgram and Kishino’s (1994) Taxonomy of Mixed Reality Displays (diagram below).

[Diagram: Milgram and Kishino’s reality–virtuality continuum, running from the real environment through augmented reality and augmented virtuality to a fully virtual environment]

Some university and industry commentators have introduced eXtended Reality (XR) as an umbrella term for AR, VR and MR. The term “immersive” is also used to encompass these technologies, as in “immersive education”. Others, such as Apple, have gone back to the technical language of spatial computing.

For the non-technical person, it continues to be a confusing terminology mess. The melding of AR/MR capability (via passthrough cameras) into what are conventionally thought of as VR headsets has prompted a rethink.
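Returning to those passthrough cameras: to make the idea a little more concrete, here is a minimal sketch of the per-frame loop such a headset performs – capture the real scene, track the user’s pose, and composite virtual objects over the camera feed. This is illustrative Python only; the names (Pose, capture_camera_frame, composite and so on) are my own stand-ins, not any headset vendor’s actual API.

```python
from dataclasses import dataclass

# Hypothetical data types standing in for what a real MR runtime provides.
@dataclass
class Pose:
    position: tuple   # (x, y, z) of the headset in the room, in metres
    rotation: tuple   # orientation as a quaternion (x, y, z, w)

@dataclass
class VirtualObject:
    name: str
    position: tuple   # where the object is anchored in the real room

def capture_camera_frame():
    """Stand-in for reading one frame from the headset's passthrough cameras."""
    return "camera_frame"

def track_headset_pose():
    """Stand-in for the tracking system estimating where the user is and where they are looking."""
    return Pose(position=(0.0, 1.6, 0.0), rotation=(0.0, 0.0, 0.0, 1.0))

def composite(frame, pose, objects):
    """Stand-in for rendering virtual objects on top of the camera frame so they
    appear fixed in the user's real environment."""
    rendered = [f"{obj.name}@{obj.position}" for obj in objects]
    return f"{frame} + {rendered} as seen from {pose.position}"

# One virtual 'alpaca' anchored two metres in front of the user.
scene = [VirtualObject(name="alpaca", position=(0.0, 0.0, -2.0))]

# A single iteration of the passthrough loop.
frame = capture_camera_frame()
pose = track_headset_pose()
print(composite(frame, pose, scene))
```

A real MR runtime repeats this loop many times a second and uses depth and spatial-mapping data so that virtual objects stay anchored to real surfaces.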

To return to Skarbez et al. (2023): they argue that the term mixed reality should be used as an “organizing and unifying concept… (to) harmonize discordant voices” in the ongoing “terminology wars” of the field (p. 41). They have previously written an article that reworks Milgram and Kishino’s (1994) reality-virtuality continuum. In these writings, they suggest that all technology-mediated realities are mixed reality because it is now common to have environments “in which real-world and virtual-world objects and stimuli are presented together… as a user simultaneously perceive both real and virtual content” (Skarbez et al. 2023, p. 42). Their articles present both the potential positive and negative outcomes of moving to a mixed reality umbrella term, and there is certainly a need to consider what may be lost without the precision of distinguishing between types of technologies.

From an educational perspective, AR, VR and MR have some similar but also quite different learning affordances (or properties that can enable educational experiences). Likewise, there are common and unique ethical considerations now that AI is integrated into immersive technologies.

I will be having a videoed conversation with Rick Skarbez on his ideas on the future of immersive technologies and their terminology in January 2024 and posting this to the VR School website, so stay tuned.

 

This post brought to you by A/Prof Erica Southgate who is an alphabet soup kind of person who simultaneously uses the terms VR, MR, XR, metaverse and immersive technologies. 

 

References

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, 77(12), 1321-1329.

Skarbez, R., Smith, M., & Whitton, M. C. (2021). Revisiting Milgram and Kishino’s reality-virtuality continuum. Frontiers in Virtual Reality, 2, 647997.

Skarbez, R., Smith, M., & Whitton, M. (2023). It Is Time to Let Go of ‘Virtual Reality’. Communications of the ACM, 66(10), 41-43.

 

Cover image from Art with Mrs Filmore, 1st Grade– “Mixed media alphabet soup!” https://www.artwithmrsfilmore.com/tag/alphabet-soup-art-lesson/

AI in VR: Uses and concerns

Artificial intelligence is now integrated into most applications and platforms we interact with in everyday life. It can be user-facing, such as virtual assistants, or operate behind the scenes to collect and analyse data, producing predictions and profiles about us that are then used to create personalised experiences. Some uses of AI in VR are:

Design and user experience: AI tools can facilitate the efficient design and generation of virtual environments and related content. There are examples of text-to-3D model generative AI and 3D data visualisation (stepping inside the data). AI tools can produce digital twins, or replicas of real-world objects or spaces, that are real-time interactive spaces for single or multiple users. AI is also used for object recognition and tracking that adapts, in real time, to user action, which makes a virtual experience feel immediate. There is also integration of natural language processing so that users can use speech for interaction, navigation and translation purposes, and so that non-player (computer-generated, synthetic) characters in virtual environments can communicate more capably. This all makes the experience of virtual reality feel more natural and engaging. And AI’s promise of personalisation could translate into better assistive and adaptive technology integrated into spatial computing products.

Profiling and predictive analytics: Machine learning algorithms collect, analyse and produce analytics about users of VR, often in real time. These can be used to personalise or customise the experience of VR; for example, by offering the user certain content, options for customisation of avatars, or particular non-player character interactions. Individual and social behavioural data related to action, eye-gaze attention, proxemics and so on can also be captured by machine learning algorithms in VR simulations designed to understand and model how humans might react under certain conditions.
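As a rough illustration of how this kind of behavioural analytics can feed back into the experience, here is a small, purely hypothetical sketch: gaze dwell times on objects in a scene are logged, and whatever the user looked at longest is used to ‘personalise’ what they are offered next. None of the names correspond to a real VR platform’s API, and production systems use far richer models than this.

```python
from collections import defaultdict

# Hypothetical gaze log: (timestamp in seconds, name of the object being looked at).
gaze_log = [
    (0.0, "volcano_model"),
    (0.5, "volcano_model"),
    (1.0, "poster_on_wall"),
    (1.5, "volcano_model"),
    (2.0, "lava_flow_animation"),
]

def dwell_time_per_object(log, sample_interval=0.5):
    """Sum how long the user's gaze rested on each object."""
    totals = defaultdict(float)
    for _, obj in log:
        totals[obj] += sample_interval
    return totals

def recommend_next_content(dwell_times):
    """A toy 'personalisation' rule: offer more of whatever was looked at longest."""
    favourite = max(dwell_times, key=dwell_times.get)
    return f"Offer an interactive activity related to '{favourite}'"

dwell = dwell_time_per_object(gaze_log)
print(dict(dwell))
print(recommend_next_content(dwell))
```

Even this toy example shows how quickly ordinary interaction data becomes a behavioural profile, which is why the privacy issues discussed below matter.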

The combination of contemporary VR hardware and software allows machine learning algorithms to collect a great deal of personal information about a user. The Information Technology and Innovation Foundation has characterised this data as:

  • Observable: information about an individual that AR/VR technologies as well as other third parties can both observe and replicate, such as digital media the individual produces or their digital communications;
  • Observed: information an individual provides or generates, which third parties can observe but not replicate, such as biographical information or location data;
  • Computed: new information AR/VR technologies infer by manipulating observable and observed data, such as biometric identification or advertising profiles; and
  • Associated: information that, on its own, does not provide descriptive details about an individual, such as a username or IP address.
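To make these four categories more concrete, here is a small, purely illustrative sketch of how a single frame of VR telemetry might be grouped under them. The field names and values are my own assumptions for the sake of example, not any vendor’s actual data schema.

```python
from dataclasses import dataclass

@dataclass
class TelemetryFrame:
    # Observable: content the user produces that third parties could capture and replicate.
    voice_chat_snippet: str = "hello from my virtual classroom"

    # Observed: data the user generates that third parties can see but not replicate.
    headset_position: tuple = (0.2, 1.5, -0.3)   # metres, in room coordinates
    gaze_direction: tuple = (0.0, -0.1, -1.0)    # where the eyes are pointing
    pupil_diameter_mm: float = 3.4

    # Computed: inferences derived from observable and observed data.
    inferred_attention_score: float = 0.87       # e.g. a proxy for engagement
    inferred_height_cm: float = 152.0

    # Associated: identifiers that say little on their own but link records together.
    username: str = "student_042"
    ip_address: str = "203.0.113.7"

print(TelemetryFrame())
```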

There is a significant amount of biometric data, or data of the body, collected in and by VR, such as vocalisations, height, movement and use of the head, body, limbs and hands, facial expression, and eye gaze and pupil dilation, with ‘passthrough’ cameras on headsets also capable of capturing information about a user’s environment. There is research indicating that biometric data can be highly identifiable, and this poses privacy and security concerns, especially where children and young people are concerned. Tricomi et al. (2023) explain:

“Currently, there is an ongoing discussion on the potential protocols that will govern the Metaverse, with a particular focus on the controversial interplay between openness and privacy… (V)irtual devices allow tracking a large number of behavioral metrics, such as the headset’s and controllers’ position and rotation (which reflect the users’ physical actions), all the interactions between the user and any virtual object present in the scene, and also eye movements. All these data can be source of personal information, and even the user’s identity.”

Some jurisdictions have strong laws governing the harvesting and use of personal information, including biometric data, while others do not. Under the Australian Privacy Act 1988, biometric information is considered sensitive information for which consent is required before collection. The Act has been under review for several years; the hope is that this will result in stronger protections around the collection of such data. Internationally, there are several interesting policies and guidelines on biometrics for schools, including those for children.

It is worth closely examining the terms and conditions and disclosure statements of VR companies, especially those related to children, such as Meta’s Parent Privacy Disclosure statement, which includes data collection related to the following:

  • “With your approval, information about the size of walls, surfaces and objects in your child’s room and the distances between them and your child’s headset to offer experiences that blend their virtual and real-world environments
  • Personal information necessary to provide services so that the device and any features that you or your child turn on function optimally
  • Physical information about or related to your child, such as estimated hand size and hand pose data, if you choose to enable the hand tracking feature
  • Information about or related to the position and orientation of the headset, controllers and body movements to determine body pose, make your child’s avatar’s movements more realistic and deliver an immersive virtual experience
  • Information about or related to your child’s fitness activities in virtual reality if you choose to enable fitness-related experiences such as Meta Quest Move.”

All such data collection in VR is powered by AI. There is a significant amount of work going on internationally to ensure that the human rights of children are protected in the digital realm, and this includes VR and other immersive technologies such as augmented and mixed reality. With companies lowering the minimum user age for VR to 10 years old, now is the time for a more robust and critical conversation on the ethical use of this web of technologies in schools, and in society more broadly. As educators we cannot delay action on this; in fact, we must lead the conversation.

References

Tricomi, P. P., Nenna, F., Pajola, L., Conti, M., & Gamberi, L. (2023). You can’t hide behind your headset: User profiling in augmented and virtual reality. IEEE Access, 11, 9859-9875.

This post brought to you by a real human, A/Prof Erica Southgate.

Inquiry into AI

The Australian Government is holding an Inquiry into the use of generative artificial intelligence (AI) in the Australian education system. AI is integrated into extended reality (VR, AR, MR) applications and platforms in ways that are not usually discernible to users, such as in the design of virtual worlds, the powering of non-player character interactions, and biometric tracking (tracking of eye gaze, pupil dilation, limb movements, voice, etc.). Shortly, I will be writing a post on AI in VR. In the meantime, you can read my submission to the Inquiry, which details pedagogical and ethical issues. You can also check out other submissions here.

This post brought to you by A/Prof Erica Southgate, who is not (yet) a bot.

Cover Image – Street art mural by the artist Phlegm from https://www.kidsnews.com.au/arts/monster-20m-robot-mural-by-uk-artist-phlegm-moves-with-ar-app/news-

Students creating VR worlds for maths

Learning mathematics through creativity is not an approach we often associate with junior high school classrooms. In this post, Jessica Simons, mathematics teacher and co-researcher on the VR School Study, explains how she went about designing curriculum that allowed her Year 9 extension mathematics class to use 360-degree virtual reality to demonstrate their depth of understanding of linear and non-linear graphs.

Jessica developed a unit of work which scaffolded students towards growing their mathematical knowledge and applying this to the environment of their school. Students were asked to produce imaginative 360-degree virtual worlds that could be used to teach their peers about graphs. Working in small groups, students scouted locations around their school where graphs might be represented and then they planned and storyboarded their ideas to produce original immersive adventures in mathematics to share with others. The cover image for this blog post is of an initial brainstorm from one group on the VR task.
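As a purely hypothetical illustration (my example, not one drawn from Jessica’s unit of work), a group might model a straight ramp or handrail in the schoolyard with a linear function, and a curved archway with a non-linear (quadratic) one:

\[ y = 0.4x + 1 \qquad \text{(linear: a ramp rising 0.4 m for every metre along the ground)} \]

\[ y = -0.05(x - 10)^2 + 5 \qquad \text{(non-linear: an arch 20 m wide that peaks at 5 m where } x = 10\text{)} \]

Matching functions like these to real places gives students a concrete anchor for gradient, intercepts and turning points before they storyboard their virtual world.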

The unit of work can be found here.

In the video below, Jessica explains how she developed the curriculum, brought a creative lens to mathematics teaching, and describes the value VR added to student learning.

This post brought to you by real live educators A/Prof Erica Southgate and Jessica Simons (Assistant Head of Mathematics, Trinity College, Adelaide).

Curriculum to empower student VR design on sustainability

This post provides an excellent example of how to design curriculum that integrates 360-degree VR content creation for authentic research and problem-solving in the design and technology subject area. Ella Camporeale, teacher and co-researcher on the VR School Study, discusses how she went about producing a unit of work for her Year 9 students that allowed them to use VR to create their own virtual world on the topic of sustainability at school.

The unit of work scaffolded students towards understanding the creative potential of VR and how it might be used to demonstrate deep understanding of digital design, along with content knowledge of sustainability grounded in research into the school’s own sustainability practice. Students needed to investigate the issue of sustainability at school by engaging with key stakeholders to collect and represent data in their virtual worlds. They were required to produce a range of original content, including 360-degree scenes of the school and different media (text, photos, video, gifs, sound files) to embed into the scenes, that would tell a story about the broader issue of sustainability and their school’s approach to it. There was also the opportunity to integrate game mechanics to increase user engagement in the VR product.

The unit of work can be found here.

In the video below Ella Camporeale and Erica Southgate discuss curriculum design that integrates VR for deeper learning:

This post brought to you by real humans, Associate Professor Erica Southgate and Ella Camporeale.

The platform used in the project was VRTY.

This research has been funded by the Association of Independent Schools of South Australia (AISSA).

Cover image from Pexels by Polina Tankilevitch.

Designing curriculum for creative learning about Biomes with VR

Teacher Toni Maddock from Southern Montessori School (Adelaide) set about the task of designing an integrated unit of work (science and geography) on biomes and food security that would allow her middle school students to demonstrate both content mastery and develop communication and creativity skills by using 360-degree VR via the VRTY platform.

There are few available examples of how teachers go about designing curriculum to scaffold student VR content creation. Pedagogically, the unit of work involved a combination of direct instruction and collaborative and discovery-based learning activities. There was a staged approach, with students, in the first instance, being supplied with existing 360-degree scenes of biomes from around the world, which they then needed to enhance by doing research on the biome and adding relevant facts and media to their 360 scene. This was followed by the class skilling up with the 360 camera and moving to a more complex task involving research on, and an excursion to, a local biome. Informed by their research, students took their own 360-degree base scenes of the biome while on the excursion. They also conducted experiments to generate data to include in their 360-degree virtual biome world, and produced other media (such as text, sound files, photos and videos) on information relevant to the biome and local food security issues. Best of all, and a key feature of VR, students got the opportunity, through a school expo, to easily immerse their peers, family and community members in the educative virtual world they created, making the task genuinely authentic.

The unit of work can be found here.

Through careful curriculum planning, Toni provided rich, scaffolded tasks that leveraged the properties of VR to develop her students’ higher-order thinking and gave them a creative way to express their mastery of scientific and geographic content. This was very different to how she would usually teach the topic of biomes. Toni talks a bit about the curriculum planning process in the video below:

This post brought to you by actual humans – Associate Professor Erica Southgate and Toni Maddock.

This project has been funded by the Association of Independent Schools of South Australia (AISSA).

Cover image of rainforest by Jahoo Closeau from Pexels.

The rights of the child, XR technology and schools

In March 2021, as the Covid-19 pandemic raged and school students in many countries were adapting to online learning, the United Nations (UN) released “General comment No. 25 on children’s rights in relation to the digital environment”. Drawing on an extensive international consultation process with children and a raft of expert submissions, General comment 25 provides guidance on how children’s rights should be fostered and protected in digital environments. This post outlines some key areas in General comment 25 in order to pose some thoughts on how they relate to the use of XR (eXtended Reality, including augmented and virtual reality) technology in schools.

Before outlining these key areas, it is worth historically situating General comment 25. It is part of a children’s rights lineage running from the UN’s adoption of the Declaration of the Rights of the Child (1959) to the Convention on the Rights of the Child (1989), which recognised the social, economic, cultural and civil roles of children and set minimum standards for protecting their rights. Below is a poster version which provides a snapshot of the principles that underpin the Convention on the Rights of the Child. Nation state signatories to the Convention can be found here.

[Poster: UN Rights of the Child, teen edition]

To return to General comment 25, the document begins by using the four principles from the Convention to provide guidance on children’s digital rights. The principles and some of my thoughts on their implications for XR in schools are outlined below:

  1. NON-DISCRIMINATION – “The right to non-discrimination requires States parties ensure that all children have equal and effective access to the digital environment in ways that are meaningful to them. States parties should take all measures necessary to overcome digital exclusion.” (p. 2).

Implications: All schools, not just wealthy ones, should be able to provide their students with continuous, equitable and meaningful access to XR learning technologies including the infrastructure (connectivity, bandwidth etc) that powers the tech. Teachers should be provided with independent, evidence-based professional learning opportunities and ongoing pedagogical support to assist them to integrate XR in ways that are most effective for learning across subjects and in integrated units of work. Digital divides are born in policy (and funding) failures, no more so than in the field of school education.

  2. BEST INTERESTS OF THE CHILD – “States parties should ensure that, in all actions regarding the provision, regulation, design, management and use of the digital environment, the best interests of every child is a primary consideration” (p. 2-3).

Implications: Most countries are at an early stage of regulation governing XR technology and the development of ethical standards informing its design is also nascent. In the meantime, there are some existing frameworks such as safety by design, privacy by design and guidelines on automated decision making that schools should utilise to guide procurement and implementation. I realise this feels like yet another thing to learn and do beyond the core business of schooling; however, until there is strong regulation and industry-wide accepted ethical standards in place, it is perhaps the only way most teachers in most countries will be able to uphold the digital rights of the child.

  3. RIGHT TO LIFE, SURVIVAL AND DEVELOPMENT – “Opportunities provided by the digital environment play an increasingly crucial role in children’s development… States parties should identify and address the emerging risks that children face in diverse contexts, including by listening to their views on the nature of the particular risks that they face…. States parties should pay specific attention to the effects of technology in the earliest years of life, when brain plasticity is maximal and the social environment…. Training and advice on the appropriate use of digital devices should be given to parents, caregivers, educators and other relevant actors, taking into account the research on the effects of digital technologies on children’s development … ” (p. 3).

Implications: Teachers use their knowledge of child development every day in the classroom. This knowledge needs to be extended to include the potential effects of XR technologies on children and adolescents. There is no other technology like XR – it can make the user’s brain and body feel as though they are in a totally different place, imaginary or actual, with real and computer-generated actors interacting in real time, for better and for worse. There is evidence that children have developed false memories after a VR experience. There are also child protection issues related to the use of VR equipment in classrooms and open social VR platforms. The current evidence base on the immediate and longer-term effects of immersive technology on children is inadequate: very few studies have been conducted, and more work is required to ensure that research with children using XR technology is ethical. Most manufacturers of VR headsets provide health and safety information and suggested age limits; however, like Terms of Service and company privacy policies, these are often skimmed over or not read at all. There is a great deal of work to be done by both government and industry in developing plain-English and child-friendly policy related to technology risks, including but not limited to privacy issues. In the digital sphere of education policy and in industry, there are either opaque or non-existent accountability mechanisms to query or contest data extraction and use, and third-party data interests, or to seek redress if something goes wrong. There is significant work to do if children and their parents/caregivers are to be given a voice and ways to effectively exercise rights in the digital learning space generally and with XR specifically.

  4. RESPECT FOR THE VIEWS OF THE CHILD – “When developing legislation, policies, programmes, services and training on children’s rights in relation to the digital environment, States parties should involve all children, listen to their needs and give due weight to their views. They should ensure that digital service providers actively engage with children, applying appropriate safeguards, and give their views due consideration when developing products and services.” (p. 3-4).

What are the views of children on the digital environment including XR technologies for leisure and learning? How do schooling systems and teachers amplify these voices for good transparent policy development and to inform classroom practice? How can schools engage in critical conversations with technology companies and ask the right ethical and educational questions about EdTech to seek evidence of effectiveness for learning and to advocate on behalf of children especially when so much of schooling has become platform dominated (often one-platform dominated)? Why is there a dearth of independent professional learning on digital technologies available to teachers?  It is fair to say that these are generally unanswered yet vital questions that deserve more than lip service from state education authorities and those in charge of schooling systems. The proliferation of digital literacy curricula is a good place to start classroom conversations. In case you are interested, here is a child friendly version of General comment 25 that can be used in class.

It is worth ending this whirlwind tour through some sections of General comment 25 by highlighting section 42 of the document, which specifically relates to XR technologies:

“States parties should prohibit by law the profiling or targeting of children of any age for commercial purposes on the basis of a digital record of their actual or inferred characteristics, including group or collective data, targeting by association or affinity profiling. Practices that rely on neuromarketing, emotional analytics, immersive advertising and advertising in virtual and augmented reality environments to promote products, applications and services should also be prohibited from engagement directly or indirectly with children.” (pp.7-8).

There is a lot to unpack in this paragraph. Here are some key points to consider. The intersection between XR and artificial intelligence (AI) has hastened the harvesting of highly identifiable data from people’s bodies, known as biometric data. This is harvested using the tracking and sensors built into XR hardware and software products and represents a significant privacy risk to users of the technology, including children. Data can be, and is being, collected through the tracking of limb, head and finger movements, gaze patterns and pupil dilation as proxy measures for attention, facial expressions, speech and written communication, geolocation sensors, and information about the surrounding environment captured via pass-through camera technology in headsets. As boring as it seems, it is well worth reviewing the privacy policies of XR software and hardware companies. For example, check out Meta’s supplementary privacy policy, which also has a separate eye tracking policy embedded into it, to get a sense of the degree of biometric data harvesting and potential sharing of this with third parties.

The thing about biometric data is that it is so personal that it can be used to identify individuals and settings. While the privacy implications of this for adults are serious, the implications for children and schools are even more concerning. In many countries and jurisdictions there is weak regulation around biometric data collection, storage, use and commercial currency for third-party transactions (selling on bodily information), despite its sensitivities. In addition, the use of that data, linked to other information collected via multiple platforms and online interactions, for surveilling, unfairly profiling, and manipulating or ‘nudging’ people’s emotional states and behaviour, covertly and overtly, raises serious ethical issues, especially for vulnerable populations such as children. Hence, General comment 25 specifically identifies virtual and augmented reality technology as representing a special class of risk to children. If you want to learn more about the ethics and implications of AI-powered biometric and affective computing applications for schools, check out the ethical framework for education contained in this report.

Now is the time for teachers, educational policy makers, researchers and industry to have serious conversations WITH children and their parents and caregivers about the digital rights of the child broadly, and especially in relation to the unique challenges that emerging technologies such as XR and AI bring. But conversations will not be enough. Consultation and engagement need to be accompanied by practical educational, accountability and regulatory initiatives if the digital rights of the child are to be endorsed and celebrated in schools.

This post brought to you by A/Prof Erica Southgate.

Cover image by https://oscaw.com/art-camp-week-2-lets-make-eyes 

New VR survey for teachers

Virtual reality (VR) has a lot of potential for learning and teaching. However, we don’t know a lot about why and how Australian teachers use VR and what the perspectives are of those who haven’t tried it in class. This 15-20 minute survey is for early childhood, primary and secondary teachers who are considering the technology or have used it in their classroom. Your participation would assist in finding solutions that address the implementation and scaling up of VR for education.

Information about the study, and the survey itself, can be found here.

Results will be available for free download from this website.

This study is being conducted by A/Prof Erica Southgate (UON), Prof Matt Bower (MQU), A/Prof Michael Cowling (CQU), Dr Paul Unsworth (UniSA) and A/Prof James Birt (Bond Uni).

On researching VR with teachers in schools

I recently did a podcast with VR enthusiast and educator Craig Frehlich on why we need to do more research WITH teachers, and not on them, to really understand the enablers and barriers to integrating a wide range of powerful, curriculum-aligned VR learning opportunities into classrooms:

https://podcasts.apple.com/us/podcast/episode-82-vr-participatory-research-and-pedagogy/id1333244708?i=1000580379562

This of course extends to providing genuine opportunities within research projects for students to provide their perspectives on the use of the technology for learning and to showcase their virtual creations to authentic audiences (more on this in a future blog post).
