Interactive VR Mesh Deformation via Hand-Object Interaction
Names: Yun Chung Chang, Jason Du, Panfeng (Gavin) Jiang, Zitong (Peter) Hu
Link to webpage:
https://zx40224617.github.io/CS184FinalProposal/
Link to GitHub repository:
https://github.com/zx40224617/CS184FinalProposal
Overview
This project aims to create an immersive virtual environment using VR
where users can interact with objects through natural hand gestures. By
leveraging VR technology and Unity's robust development platform, the
project focuses on real-time mesh deformation when a user’s hand contacts
virtual objects. The goal is to simulate realistic physical responses and
enhance the tactile feedback within the VR space.
Problem Description
As VR gradually becomes a major part of computer graphics, with applications in medicine, entertainment, education, and many other fields, we want to understand how physical hand-object interaction can be achieved in a VR world. We are aware that tools for this already exist, but we aim to understand the pipeline and reimplement the idea ourselves. The challenge then becomes how to simulate the deformation of different objects given different hand gestures, forces, and object properties. We will look for existing mesh deformation algorithms that we can implement, see how well they handle different scenarios, and evaluate them by how fast and realistic they are.
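As a concrete illustration of the kind of algorithm we have in mind, the Unity C# sketch below (our own hypothetical example, with invented names such as SimpleDeformer; a starting point rather than a committed design) displaces mesh vertices near a contact point along a given contact normal, with a linear distance falloff scaled by a per-object stiffness parameter:

```csharp
using UnityEngine;

// Hypothetical sketch of one candidate algorithm: displace mesh vertices
// near a contact point along the contact normal (e.g., the inward collision
// normal for a dent), with a linear distance falloff scaled by stiffness.
// Attach to any GameObject that has a MeshFilter.
public class SimpleDeformer : MonoBehaviour
{
    [Range(0f, 1f)] public float stiffness = 0.5f; // 1 = rigid, 0 = very soft
    public float radius = 0.1f;                    // contact influence radius (m)

    Mesh mesh;
    Vector3[] original;   // rest-pose vertices (local space)
    Vector3[] deformed;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        original = mesh.vertices;  // Mesh.vertices returns a copy each call,
        deformed = mesh.vertices;  // so these are two independent arrays
    }

    // contactPoint/normal are in world space; force is in arbitrary units.
    public void Deform(Vector3 contactPoint, Vector3 normal, float force)
    {
        Vector3 localPoint = transform.InverseTransformPoint(contactPoint);
        Vector3 localNormal = transform.InverseTransformDirection(normal).normalized;
        float depth = force * (1f - stiffness); // softer objects dent deeper

        for (int i = 0; i < deformed.Length; i++)
        {
            float d = Vector3.Distance(original[i], localPoint);
            if (d > radius) continue;
            float falloff = 1f - d / radius;    // linear falloff with distance
            deformed[i] = original[i] + localNormal * depth * falloff;
        }
        mesh.vertices = deformed;
        mesh.RecalculateNormals();
    }
}
```

An elastic material could then be approximated by interpolating the deformed vertices back toward their rest positions each frame, while a plastic one would simply keep the dent; we expect to replace this toy scheme with a more principled algorithm from the literature.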
Goals
- Develop a VR application in Unity that uses Meta's SDK to achieve real-time hand tracking and gesture recognition.
- Implement collision detection between the user's hands and virtual objects.
- Design a mesh deformation system that alters an object's geometry based on its properties (e.g., material stiffness, elasticity).
- Provide a visually compelling and interactive experience that mimics realistic hand-object interactions.
- Investigate how well mesh deformation alone can represent the physical interaction, without considering volumetric rendering.
How we measure quality and performance:
Since we hope to deliver a real-time interactive system, we will measure performance by the computing time and by the fidelity of the simulated object deformation under interaction (e.g., how realistic it looks compared to real objects and hand interactions).
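For the computing-time half of this metric, a minimal harness (a hypothetical sketch of our own, not a finished tool) could log the average frame time over one-second windows while the deformation system runs; fidelity would still be judged by eye:

```csharp
using UnityEngine;

// Hypothetical sketch of the timing side of our evaluation: log the average
// frame time (and implied FPS) over one-second windows.
public class FrameTimer : MonoBehaviour
{
    float accum;
    int frames;

    void Update()
    {
        accum += Time.unscaledDeltaTime;
        frames++;
        if (accum >= 1f)
        {
            float avgMs = accum / frames * 1000f;
            Debug.Log($"avg frame time: {avgMs:F2} ms ({frames / accum:F0} FPS)");
            accum = 0f;
            frames = 0;
        }
    }
}
```

For a comfortable VR experience, the measured frame time must stay below the headset display's refresh interval.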
Deliverables
- What we plan to deliver:
  - A functional VR application demonstrating real-time hand-object interaction.
  - A dynamic mesh deformation system that adjusts object shapes based on interaction properties.
- What we hope to deliver:
  - Realistic haptic feedback based on the properties of the object (a rough sketch follows this list).
  - A custom shader that renders the deformation of objects realistically and in real time.
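For the haptic stretch goal, one simple mapping we might try (an assumption on our part, not a settled design) is to scale controller vibration by material stiffness through the Oculus Integration SDK's OVRInput API. Controller vibration is only a stand-in here, since camera-based hand tracking provides no haptics by itself:

```csharp
using UnityEngine;

// Hypothetical sketch: pulse the right Touch controller with an amplitude
// proportional to the touched object's stiffness. Note that the vibration
// persists until it is explicitly reset with zero amplitude.
public static class StiffnessHaptics
{
    public static void Pulse(float stiffness)
    {
        float amplitude = Mathf.Clamp01(stiffness);
        OVRInput.SetControllerVibration(0.5f, amplitude, OVRInput.Controller.RTouch);
    }

    public static void Stop()
    {
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```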
Schedule
- Familiarize ourselves with Unity, the headset, and existing tools (~1 week)
- Research and try different mesh deformation algorithms (~1 week)
- Test different object properties and gestures, and modify the current system as needed (~1.5 weeks)
- If time remains, see whether there is anything we can add or improve, and finalize the report (rest of the time)
Resources
- Platform:
  - Hardware: Meta VR headset
  - Software:
    - Oculus Integration SDK
    - Unity XR Interaction Toolkit
- Custom Mesh Deformation Library:
  - Develop or integrate existing libraries for real-time mesh modification based on collision and force feedback
- Custom Shader (Unity):
  - Create custom shaders that support dynamic deformation effects
- Physics Engine:
  - Leverage Unity's built-in physics for realistic collision detection and response dynamics
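To make the last point concrete, here is a hypothetical sketch of how Unity's built-in physics callbacks could feed contact information to the deformer sketched in the Problem Description (SimpleDeformer is our invented name from that earlier sketch):

```csharp
using UnityEngine;

// Hypothetical sketch: when a hand collider hits this object (Unity's
// collision callbacks require a Rigidbody on at least one side), forward
// the first contact point, its normal, and a crude impact strength to the
// SimpleDeformer component sketched earlier.
[RequireComponent(typeof(SimpleDeformer))]
public class DeformOnCollision : MonoBehaviour
{
    SimpleDeformer deformer;

    void Start()
    {
        deformer = GetComponent<SimpleDeformer>();
    }

    void OnCollisionEnter(Collision collision)
    {
        ContactPoint contact = collision.GetContact(0);
        // Scale "force" from impact speed; units are arbitrary here, and the
        // normal's sign may need flipping depending on the collider setup.
        float force = 0.01f * collision.relativeVelocity.magnitude;
        deformer.Deform(contact.point, contact.normal, force);
    }
}
```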