During the 2017-2018 school year I was contracted as an Expeditions Lead for the Google Expeditions AR Program, an emerging segment of Google EDU! The program aimed to bring virtual reality and augmented reality technology into classroom lesson plans.
I visited schools on the East Coast, across NYC and Pittsburgh, and worked alongside teachers to test our new app. My team and I facilitated lesson-based focus groups in each school, with close to 500 students per day on average for user testing. This feedback provided valuable guidance for design iterations.
This project focused on the Augmented Reality segment of Expeditions, to determine whether such technology could act as a visual aid and foster group learning.
I operated as the liaison between our on-the-ground associate team, project coordinator, and manager. I also addressed associate concerns and increased outreach.
In this experience I was involved in only one segment of the Agile process: the user testing and iteration phase of prototyping. The endeavor sparked my interest in pursuing UX so that I could take a more comprehensive role in researching, designing, and prototyping products and ideas moving forward.
daily field visits across different schools
personal notes on activities and teacher/student interactions
teacher feedback forms
quotes of impact
positive and critical feedback
Another important discovery occurred while testing the app with students in inner-city schools, such as those in the NYC school system. Many students came from international backgrounds, and some classes had 3 or 4 native languages spoken. This technology allowed for fully immersive visual examples of content, ranging from biology to history, which served as an important aid across languages. This eased the burden placed on in-classroom translators.
One example was a visual showcase of the heart with its four chambers in motion. It was far easier for teachers to explain the material across language barriers with the 3-dimensional animation present.
intro to UX research
Our research led to improvements in the AR education technology, as well as invaluable student feedback. For example, based on our tests, the team decided to switch to a QR code-based scan system to place virtual objects, instead of scanning the entire room. This decreased the bandwidth needed by individual devices, meaning more students could participate, and it ensured objects were placed in the correct area due to the visual markers.
An example of this system is here:
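To give a rough sense of how marker-based placement works in general (this is an illustrative sketch, not the actual Expeditions implementation): once a detector reports the corner points of a QR code in the camera frame, the app can anchor the virtual object at the marker's center and scale it to the marker's apparent size. The function name and geometry below are my own assumptions.

```python
def place_anchor(corners):
    """Return (center, size) for a virtual object anchored on a marker.

    corners: four (x, y) points of a detected QR code, as a typical
    marker detector (e.g. OpenCV's QRCodeDetector) would report them.
    """
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    # Anchor the object at the marker's center point.
    center = (sum(xs) / 4.0, sum(ys) / 4.0)
    # Use the bounding-box diagonal as a rough cue for object scale.
    size = ((max(xs) - min(xs)) ** 2 + (max(ys) - min(ys)) ** 2) ** 0.5
    return center, size

# A square marker detected at (100,100)-(200,200) anchors the
# object at (150.0, 150.0).
center, size = place_anchor([(100, 100), (200, 100), (200, 200), (100, 200)])
```

Because every device resolves the same printed marker to the same spot, the class sees the artifact in a shared, consistent location without any room scanning.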
We facilitated lessons in student focus groups of 10 to 40 students at a time. Teachers sometimes took the lead; other times my colleagues and I administered the lessons.
Each student received a handheld device and was free to roam the assigned area to view each artifact as we discussed as a class. 3D animations were controlled by a central device, held by the teacher or myself.
Feedback about the experience and learning process was collected at the end of each topic.
Notable feedback from science teachers concerned the viability of this technology in a lab-based setting, where students learn in groups at different workstations, each focusing on a different system.
Outside of testing and overseeing other associates across the East Coast...
My teammate Virginia and I represented our emerging tech at the student incubator CODE Next, as well as at From Stem to Screen, held at The Paley Center For Media (in partnership with The Mayor's Office for Media and Entertainment NYC).
Students attended from Black Girls Code, The Young Women’s Leadership Schools, Lower East Side Girls Club, James Madison High School, Cush Center, Fannie Lou Hamer Freedom High School, Bronx International High School, and Academy of Innovative Technology High School.
More information about the project is available here: