Project Teams

Hairy Otters

Team Leads: Yena Kang, John Yoon

Hairy Otters is a team that came together to create an immersive and inspiring experience: spellcasting in virtual reality. We currently use Voice SDK (Oculus's built-in speech recognition system) to recognize voice commands. We welcome people interested in spellcasting or in various aspects of the technology, including VR/AR, AI/machine learning, game graphics, game design, 3D modeling, and UI/UX.

Expected experience in Unity XR.

ISAACS (Research) - Immersive Semi-Autonomous Aerial Command System

Team Leads: Archit Das, Harris Thai

ISAACS is an undergraduate-led research group within the Center for Augmented Cognition of the FHL Vive Center for Enhanced Reality. Our research is in human-UAV interaction, with a focus on teleoperation, telesensing, and multi-agent interaction.

We are also collaborating with Lawrence Berkeley National Laboratory to perform 3D reconstruction of the environment via state-of-the-art methods in radiation detection. Our vision is to create a scalable, open-source platform for beyond-line-of-sight flight that is compatible with any UAV or sensor suite.

Virtual Bauer Wurster

Virtual Bauer Wurster (VBW) is an app that allows students to edit and publish their architectural models. VBW enables users to share and explore 3D environments together.

This project comes from the XR Lab, a research lab within the College of Environmental Design whose goal is to develop innovative, impactful research and applications in VR/AR/MR.

Expected experience in Unity XR.

OpenARK (Research)

OpenARK is an open-source wearable augmented reality (AR) system founded at UC Berkeley in 2016. The C++ based software offers innovative core functionalities to power a wide range of off-the-shelf AR components, including see-through glasses, depth cameras, and IMUs.

As an open-source Augmented Reality SDK, OpenARK allows you to rapidly prototype AR applications.

Expected computer vision experience.

ROAR (Research)

ROAR stands for Robot Open Autonomous Racing, and it is the FHL Vive Center for Enhanced Reality's autonomous driving research group.

Our goal is to advance XR and AI technologies used in vehicles, through a fun intercollegiate driving competition at the heart of the iconic Berkeley campus.

Expected hardware experience.

TacticalMR (Research)

Team Leads: Edward Kim, Daniel He

The vision of this research is to train humans in virtual reality to enhance skills involving dynamic tactical coordination or interaction with other dynamic entities. To augment one's experience in VR, the content of the situations, or scenarios, one experiences in VR is crucial. Hence, to achieve our vision, we aim to develop an algorithm that procedurally generates a customized curriculum, or sequence, of training scenarios (displayed in VR) according to a trainee's learning progression.

This algorithm entails the following elements: (a) modeling and generating realistic behaviors of environment agents in VR, (b) modeling and tracing the trainee's knowledge and learning progression, and (c) adaptively synthesizing training scenarios in accordance with the trainee's learning progression.
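To make the idea concrete, here is a minimal, hypothetical sketch of elements (b) and (c): a simple exponential-moving-average mastery estimate stands in for the knowledge-tracing model, and the next scenario is chosen so its difficulty sits just above the trainee's current mastery. All names, the difficulty scale, and the update rule are illustrative assumptions, not the team's actual algorithm.

```python
# Hypothetical curriculum-generation sketch. Assumptions (not from the
# project): scenarios have scalar difficulties in [0, 1], and mastery is
# tracked with a simple moving-average update instead of a real
# knowledge-tracing model.

class CurriculumGenerator:
    def __init__(self, scenario_difficulties, step=0.3, alpha=0.2):
        # scenario_difficulties: dict mapping scenario id -> difficulty in [0, 1]
        self.scenarios = scenario_difficulties
        self.step = step      # target gap above current mastery
        self.alpha = alpha    # learning rate for the mastery estimate
        self.mastery = 0.0    # trainee's estimated skill in [0, 1]

    def next_scenario(self):
        # Element (c), crudely: pick the scenario closest to
        # (mastery + step), i.e. hard enough to teach something new
        # but still within reach.
        target = min(self.mastery + self.step, 1.0)
        return min(self.scenarios,
                   key=lambda s: abs(self.scenarios[s] - target))

    def record_outcome(self, scenario, success):
        # Element (b), crudely: on success, move the mastery estimate
        # toward the scenario's difficulty; on failure, nudge it down.
        d = self.scenarios[scenario]
        if success:
            self.mastery += self.alpha * (d - self.mastery)
        else:
            self.mastery = max(0.0, self.mastery - self.alpha * 0.5)


gen = CurriculumGenerator({"drill_basic": 0.2,
                           "drill_mid": 0.5,
                           "match_full": 0.9})
print(gen.next_scenario())  # with mastery 0.0, the easiest drill is chosen
```

A real system would replace the scalar mastery value with a per-skill model and generate scenario content, not just select from a fixed pool, but the selection loop above captures the adaptive-difficulty idea.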

The training techniques we develop using VR could help people in various application domains. For example, as factories are automated, we could train factory workers in VR to safely coordinate with various robots prior to physically interacting with them. We also anticipate training groups of officers in VR for rescue missions amid natural disasters, prior to deployment, where mistakes or inexperience may lead to casualties. Finally, we foresee our technology helping sports players coordinate and work together as a team more effectively.

For the scope of our project, we are focusing on training people in Echo Arena, a competitive Oculus virtual reality esport. Our goal is to help trainees enhance their (1) situational awareness, (2) tactical decision making, and (3) execution of those decisions in order to master Echo Arena.

Expected experience in Unity XR and/or 3D modeling.

RehabVR (Research)

Team Leads: Edward Kim, Alton Sturgis, James Hu

We currently lack methods to reliably generate training experiences for cases such as the physical rehabilitation of stroke patients. Over the years, advances in virtual and augmented reality (VR/AR) have considerably reduced the cost of augmenting experience.

Using an algorithm and methodology similar to those of the project described above ("Training Humans in VR for Sports"), we can help with the rehabilitation of stroke patients using VR/AR by synthesizing scenarios with tasks that become incrementally more difficult, personalized to each patient's physical limitations.

This research is in collaboration with Stanford Medical School (Neurology Dept.).

Expected experience in Unity XR and/or 3D modeling.