Pablo Ortega Pardina
Ionizing Radiation in the Diagnosis and Treatment of Diseases
Time: 3:00 p.m.
Duration: 2 hours
Location: Room A.06, Ada Byron building
The world is becoming unprecedentedly connected thanks to emerging media and cloud-based technologies. The holy grail of the metaverse is recreating a remotely shared world as a digital twin of the physical planet. In this world, the human is probably the most complex mechanical, physiological, and biological system: unlike computers, humans are remarkably challenging to model, and engineering how they perceive and react in a virtual environment is an open problem. By leveraging computational advances such as machine learning and biometric sensing, this talk will present recent research on measuring and optimizing human visual and behavioral perception toward creating the ultimate metaverse.
Qi Sun is an assistant professor at the New York University Tandon School of Engineering, jointly appointed in the Department of Computer Science and Engineering and the Center for Urban Science and Progress. Before joining NYU, he was a research scientist at Adobe Research and a research intern at NVIDIA Research. He received his Ph.D. from Stony Brook University. His research interests lie in computer graphics, VR/AR, vision science, machine learning, and human-computer interaction. He is a recipient of the IEEE Virtual Reality Best Dissertation Award.