Hi, I'm Sai Jagadeesh

Robotics & Perception Engineer

Developing autonomous systems, machine learning pipelines (including DARPA Triage Challenge work), multi-robot coordination, and embedded platforms, with expertise in ROS 2, C++, and Python.

Available for work

About Me

# sai_jagadeesh_mk1.py

import warnings

class SaiJagadeeshMuralikrishnan:
    """Robotics & Perception Engineer | v2025_grad | Status: Caffeinated + Compiling"""

    def __init__(self):
        self.mode = "Self-Calibrating"
        self.energy = ["Python", "C++", "ROS 2", "coffee", "panic"]
        self.stack = ["Reinforcement Learning", "SLAM", "Debugging reality"]
        self.calibration_accuracy = 0.85  # rest = duct tape
        self.dependencies = ["gazebo.launch.py", "low light", "hope"]
        self.mission = "Make robots independent. Until then, I supervise their tantrums."
        self.task = "Running: ros2 launch life bring_meaning.launch.py"

    def report_status(self):
        print(f"[BOOT] Mode: {self.mode}")
        if "SLAM" in self.stack:
            print("[INFO] SLAM active with minor weekend drift.")
        if self.calibration_accuracy < 0.9:
            warnings.warn("Extrinsics may be slightly... extra.")
        print("[TASK] " + self.task)

    def future_goals(self):
        return [
            "Train drones that don’t crash (often).",
            "Design robots that work *and* pass code review.",
            "Survive ROS 2 Foxy... gracefully."
        ]

sai = SaiJagadeeshMuralikrishnan()
sai.report_status()

# ⚠️ This file has fewer launch errors than my actual robots.

Who Am I?

I'm a Robotics & Perception Engineer with 1.5+ years of hands-on experience in autonomous systems, multi-robot coordination, and embedded platforms. I specialize in developing and deploying reinforcement learning pipelines, SLAM, and real-time object detection systems.

My expertise lies in hardware-software integration, calibration, and simulation frameworks including Gazebo and Isaac Gym. I'm passionate about writing production-level code, optimizing latency, and validating perception systems.

Currently pursuing my Master's in Robotics at the University of Maryland, I'm constantly expanding my knowledge in control systems, machine learning, computer vision, and perception.

College Park, MD
saijagadeesh.muralikrishnan@gmail.com
saijagadeesh.com

Education

University of Maryland

College Park, MD

Master of Engineering in Robotics

Expected Graduation: May 2025

GPA: 3.78/4.0

Relevant Coursework:

Control Systems, Machine Learning, Computer Vision, Perception, Planning

Rajalakshmi Engineering College

Chennai, India

Bachelor of Engineering in Mechatronics

Graduated: July 2022

CGPA: 8.7/10.0

Relevant Coursework:

Embedded Systems, Controls, Power Electronics, Computer Vision

Professional Journey

Robotics Engineering Intern

May 2024 – Aug 2024
KICK Robotics, College Park, MD
  • Built ROS 2 validation pipelines in Python/C++ using PyTest and GoogleTest to cover motion, sensor-integrity, and firmware-stability tests, cutting QA cycle time by 20%.
  • Refactored the ROS 2 control stack to optimize real-time command execution, cutting system latency by 15%.
  • Calibrated Basler ToF and RealSense D435 cameras via Open3D checkerboard rectification and CUDA/PyTorch refinements, cutting reprojection error by 0.7 px (a 12% depth-map gain).
  • Led Jetson Nano firmware, driver, and ROS 2 node integration to synchronize actuators and sensors, boosting motion smoothness by 30%.
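The validation work above can be sketched with a toy sensor-integrity check of the kind such a test suite might run. This is a minimal illustration with hypothetical names and thresholds, not the actual KICK Robotics pipeline:

```python
def check_sensor_integrity(samples, expected_rate_hz, tolerance=0.1):
    """Return True if a stream of (timestamp, value) samples arrives
    within `tolerance` (fractional) of the expected publish rate."""
    if len(samples) < 2:
        return False  # can't estimate a rate from fewer than two samples
    timestamps = [t for t, _ in samples]
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_interval = sum(intervals) / len(intervals)
    actual_rate_hz = 1.0 / mean_interval
    return abs(actual_rate_hz - expected_rate_hz) / expected_rate_hz <= tolerance
```

In a PyTest suite, checks like this become one-line assertions over recorded topic data.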

Embedded Systems Engineer

Oct 2022 – Jul 2023
TuTr Hyperloop, Chennai, India
  • Designed C++ std::thread & Python threading pipelines for real-time IMU/camera fusion over CAN, cutting latency by 25%.
  • Implemented PLC–VCU control signaling over TCP/IP automotive Ethernet (TSN), ensuring 99.9% uptime.
  • Built Git-based CI/CD pipelines for unit and HIL tests with JIRA traceability, increasing test coverage by 40% and accelerating release cycles by 20%.
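A minimal Python sketch of the producer/consumer threading pattern behind that IMU/camera fusion (toy data and hypothetical names; the production pipeline used C++ std::thread over CAN):

```python
import queue
import threading

def fuse_streams(imu_q, cam_q, out, n_pairs):
    """Consumer thread: pair one IMU sample with one camera frame per cycle."""
    for _ in range(n_pairs):
        out.append((imu_q.get(), cam_q.get()))  # .get() blocks until data arrives

imu_q, cam_q = queue.Queue(), queue.Queue()
fused = []
worker = threading.Thread(target=fuse_streams, args=(imu_q, cam_q, fused, 3))
worker.start()
for i in range(3):  # producers: push simulated readings
    imu_q.put(("imu", i))
    cam_q.put(("cam", i))
worker.join()
```

The blocking queues give back-pressure for free; a real-time variant would add timestamps and drop stale frames instead of blocking indefinitely.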

Graduate Research Assistant

Sep 2024 – Present
Maryland Robotics Center, College Park, MD | DARPA Triage Challenge
  • Benchmarked SLAM Toolbox, RTAB-Map, and Cartographer in ROS 2 for RGB-D and LiDAR mapping, identifying configurations that improved localization accuracy by 18%.
  • Automated sensor-calibration validation in Python, cutting manual verification overhead by 40%.
  • Implemented multi-view image registration under varying lighting in OpenCV, raising feature-match rate by 20%.
  • Fine-tuned YOLOv8 on remapped VisDrone datasets with custom augmentations, boosting UAV person-detection recall by 35%.
  • Trained lightweight PyTorch detection models optimized for embedded deployment, increasing mAP by 30% under aerial occlusions.
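Automated calibration validation ultimately reduces to gates like the following hypothetical reprojection-error check (function names and threshold are illustrative, not the project's code):

```python
import math

def reprojection_rmse(projected, observed):
    """Root-mean-square pixel distance between projected and observed 2D points."""
    errors = [math.dist(p, o) for p, o in zip(projected, observed)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def calibration_ok(projected, observed, threshold_px=1.0):
    """Accept a calibration only when its reprojection RMSE is under threshold."""
    return reprojection_rmse(projected, observed) < threshold_px
```

Running such a gate over every recalibration is what replaces manual spot-checking of extrinsics.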

Projects

Multi-Agent Exploration Simulation

Multi-Agent Robotic Exploration Using Monte Carlo Tree Search

Designed AI-driven search algorithms using Python and C++ for robot path planning. Integrated AWS RoboMaker and optimized path planning speed by 30%.

Robotics and Autonomy Laboratory, UMD | Nov 2023 – Dec 2023
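The selection step at the heart of MCTS can be sketched with the standard UCB1 rule. This is a generic illustration of the technique, not the project's actual planner:

```python
import math

def ucb1(total_value, visits, parent_visits, c=1.4):
    """UCB1 score: exploitation term plus exploration bonus."""
    if visits == 0:
        return float("inf")  # always expand unvisited children first
    return total_value / visits + c * math.sqrt(math.log(parent_visits) / visits)

def select_child(children, parent_visits):
    """MCTS selection: descend into the child with the best UCB1 score."""
    return max(children, key=lambda ch: ucb1(ch["value"], ch["visits"], parent_visits))
```

The constant c trades off exploring rarely visited branches against exploiting high-value ones; tuning it is a large part of making MCTS exploration efficient.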

Autonomous Navigation Using DQN

Autonomous Navigation Using Double DQN and Dueling Architecture

Implemented AI-based navigation with Double DQN, increasing path planning efficiency by 75% in simulated multi-obstacle environments.

Maryland Applied Graduate Engineering, UMD | Mar 2024 – May 2024
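The core idea of Double DQN, decoupling action selection from action evaluation to curb Q-value overestimation, fits in a few lines. A toy sketch with list-based Q-values, not the trained model:

```python
def double_dqn_target(reward, done, next_q_online, next_q_target, gamma=0.99):
    """Double DQN target: the online net picks the next action,
    the target net evaluates it."""
    if done:
        return reward  # terminal state: no bootstrapped future value
    best_action = max(range(len(next_q_online)), key=next_q_online.__getitem__)
    return reward + gamma * next_q_target[best_action]
```

Vanilla DQN would instead take max(next_q_target), letting one network both choose and score the action, which systematically inflates the targets.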

NLP-Based Robot Navigation

Adaptive Text-to-Command Translation for Robot Navigation (NLP)

Developed a transformer-based model (T5-Small with LoRA) to convert natural language commands into robot navigation sequences.

Maryland Applied Graduate Engineering, UMD | Nov 2024 – Dec 2024

Versa-BOT Mobile Manipulator

Versa-BOT V1.0 – A Shop-Floor Mobile Manipulator

Designed a 7-DOF mobile manipulator in ROS 2 with trajectory planning and LIDAR-based obstacle detection.

Personal Project | 2023
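A minimal sketch of the joint-space interpolation a trajectory planner like this builds on (straight-line interpolation between configurations; a real planner adds velocity limits and collision constraints):

```python
def interpolate_joints(q_start, q_goal, steps):
    """Straight-line joint-space trajectory from q_start to q_goal
    in `steps` waypoints (steps >= 2)."""
    return [
        [a + (b - a) * k / (steps - 1) for a, b in zip(q_start, q_goal)]
        for k in range(steps)
    ]
```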

MoveIt Motion Planning

Implementation of MoveIt Motion Planning on the Panda Robotic Arm

Developed a MoveIt-based pick-and-place operation in ROS 2 Humble with trajectory execution in RViz2.

Personal Project | 2023

Gate Detection for Drone

Autonomous Gate Detection for Drone Navigation

Developed a ROS-based gate detection pipeline for real-time navigation of a drone using MAVROS and PX4.

Academic Project | 2022

Wireless Animatronic Hand

Wireless Animatronic Hand Using Infrared Sensor

Designed a wireless animatronic hand using infrared sensors, published in ICDSMLA 2021, achieving Best Paper (Third Place).

Undergraduate Project | 2021

TurtleBot Lane Following

Mobile Robot Hardware Challenge

Created a perception pipeline using ROS 2 and OpenCV for dynamic obstacle detection and lane-following.

Academic Competition | 2023
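The lane-following half of such a pipeline reduces to proportional steering from the lane-pixel centroid in a binary mask. A pure-Python toy version (gain and sign convention are hypothetical; the real pipeline thresholds camera frames with OpenCV first):

```python
def steering_from_lane(mask_row, image_width, gain=0.005):
    """Proportional steering command from the lane-pixel centroid in one
    row of a binary lane mask (negative = steer right, toy convention)."""
    lane_cols = [i for i, v in enumerate(mask_row) if v]
    if not lane_cols:
        return 0.0  # no lane visible: hold course (a real stack would search)
    centroid = sum(lane_cols) / len(lane_cols)
    error = centroid - image_width / 2  # pixel offset from image center
    return -gain * error
```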

Technical Skills

Programming Languages & Tools

I develop robust applications using a variety of languages and development tools.

Python
MATLAB
C
C++
Bash
CUDA

Software & Systems

I build scalable solutions using modern DevOps practices, cloud services, and containerization.

AWS
Docker
Git
JIRA
Linux
Windows

Robotics & Machine Learning

I integrate advanced robotics frameworks and machine learning libraries to deliver intelligent solutions.

ROS
PyTorch
TensorFlow
OpenCV
Keras
Scikit-learn

Simulation & Control

I create realistic simulations and control systems for robotic applications.

Gazebo
Isaac Sim
Isaac Gym
PyBullet
MuJoCo
MoveIt

Computer Vision

I develop advanced computer vision solutions for perception tasks.

OpenCV
YOLOv8
PCL
Open3D
Depth Sensing
Stereo Vision

Networking & Embedded

I work with embedded systems and networking protocols for robotic applications.

Socket Programming
Arduino
Raspberry Pi
STM32
ESP32
CAN Bus

Publications & Awards

Wireless Animatronic Hand Using Infrared Sensor (2023)

ICDSMLA 2021: 3rd International Conference on Data Science, Machine Learning and Applications
Authors: M Balakarthikeyan, D Rajesh, M Sai Jagadeesh, G Santhosh Kumar
Abstract: Designed a wireless animatronic hand using an infrared sensor for data transmission, enabling pick-and-place operations and reducing human risk. Published by Springer Nature Singapore.

Best Paper Award (Third Place)

2021 | International Conference on Data Science, Machine Learning & Applications

Best Paper Award Certificate


Get in Touch

I'd love to hear from you! Fill out the form below and I'll get back to you soon.

Contact Information

Location

College Park, MD

Connect with me