I earned my M.S. in Mechatronics and Robotics from New York University Tandon School of Engineering—supported by the Tandon Merit Scholarship and an Andhra Pradesh State Government Fellowship—where I completed my master’s thesis under Prof. Yi Fang. During that time, I also served as a Research Assistant in the Ai4CE Lab with Prof. Chen Feng and in the MCRL with Prof. Vikram Kapila.
I hold a B.Tech in Mechanical Engineering from Acharya Nagarjuna University, India. After earning my bachelor’s degree, I helped develop telepresence conferencing and inspection robots at a startup. Following my master’s, I joined NYU as a Research Scientist before moving to NYU Abu Dhabi.
My research centers on embodied AI and physical intelligence, with a special focus on intuitive human-robot collaboration. I aim to develop adaptive robotic systems—especially humanoids—that seamlessly interact with people in dynamic, real-world environments.
A wavelet-based policy learning framework that enhances decision-making in complex, long-horizon tasks by analyzing observations across multiple scales for precise and reliable action planning.
A socially aware robot navigation framework that combines deep reinforcement learning with language interaction, enabling robots to communicate with pedestrians and navigate safely in dynamic environments.
A foundation-model framework that lets humanoid robots understand text instructions and perform complex loco-manipulation through multi-step reasoning.
This work exposes vulnerabilities in LLM-based robot navigation through novel prompt attacks and proposes initial defenses to enhance the security of autonomous navigation systems.
MapBERT is a transformer-based framework that predicts unobserved indoor regions from partial semantic maps, enabling spatially aware and efficient navigation for embodied agents.