Developing autonomous systems, machine learning pipelines, multi-robot coordination, and embedded platforms with expertise in ROS 2, C++, and Python.
# sai_jagadeesh_mk1.py
import warnings


class SaiJagadeeshMuralikrishnan:
    """Robotics & Perception Engineer | v2025_grad | Status: Caffeinated + Compiling"""

    def __init__(self):
        self.mode = "Self-Calibrating"
        self.energy = ["Python", "C++", "ROS 2", "coffee", "panic"]
        self.stack = ["Reinforcement Learning", "SLAM", "Debugging reality"]
        self.calibration_accuracy = 0.85  # rest = duct tape
        self.dependencies = ["gazebo.launch.py", "low light", "hope"]
        self.mission = "Make robots independent. Until then, I supervise their tantrums."
        self.task = "Running: ros2 launch life bring_meaning.launch.py"

    def report_status(self):
        print(f"[BOOT] Mode: {self.mode}")
        if "SLAM" in self.stack:
            print("[INFO] SLAM active with minor weekend drift.")
        if self.calibration_accuracy < 0.9:
            warnings.warn("Extrinsics may be slightly... extra.")
        print("[TASK] " + self.task)

    def future_goals(self):
        return [
            "Train drones that don’t crash (often).",
            "Design robots that work *and* pass code review.",
            "Survive ROS 2 Foxy... gracefully.",
        ]


sai = SaiJagadeeshMuralikrishnan()
sai.report_status()
# ⚠️ This file has fewer launch errors than my actual robots.
I'm a Robotics & Perception Engineer with 1.5+ years of hands-on experience in autonomous systems, multi-robot coordination, and embedded platforms. I specialize in developing and deploying reinforcement learning pipelines, SLAM, and real-time object detection systems.
My expertise lies in hardware-software integration, calibration, and simulation frameworks including Gazebo and Isaac Gym. I'm passionate about writing production-level code, optimizing latency, and validating perception systems.
Currently pursuing my Master's in Robotics at the University of Maryland, I'm constantly expanding my knowledge of control systems, machine learning, computer vision, and perception.
University of Maryland (College Park, MD)
Master of Engineering in Robotics
Expected Graduation: May 2025
GPA: 3.78/4.0

Chennai, India
Bachelor of Engineering in Mechatronics
Graduated: July 2022
CGPA: 8.7/10.0
A ROS 2-based, user-friendly GUI designed for seamless multi-robot monitoring, control, and experimentation. It integrates real-time visualization, teleoperation, voice commands, and custom experiment displays, offering a richer alternative to standalone RViz workflows.
Designed AI-driven search algorithms in Python and C++ for robot path planning. Integrated AWS RoboMaker and improved path-planning speed by 30%.
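A minimal sketch of the kind of grid-based search such a planner builds on (A* with a Manhattan heuristic on a 4-connected grid; the grid, costs, and function name here are illustrative, not taken from the actual project):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; cells equal to 1 are obstacles."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set, (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no path exists
```

Because the Manhattan heuristic never overestimates on a unit-cost grid, the first time the goal is popped the path is optimal.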
Implemented AI-based navigation with Double DQN, increasing path planning efficiency by 75% in simulated multi-obstacle environments.
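The core of Double DQN is decoupling action selection from evaluation: the online network picks the greedy next action, while the target network scores it, which reduces overestimation bias. A minimal NumPy sketch of that target computation (the Q-value arrays are stand-ins, not the project's networks):

```python
import numpy as np

def double_dqn_targets(q_online_next, q_target_next, rewards, dones, gamma=0.99):
    """Compute Double DQN TD targets for a batch.

    q_online_next, q_target_next: (batch, n_actions) Q-values for the next
    states from the online and target networks respectively.
    """
    # Online network selects the greedy next action...
    best_actions = np.argmax(q_online_next, axis=1)
    # ...but the target network evaluates it.
    next_q = q_target_next[np.arange(len(best_actions)), best_actions]
    # Terminal transitions (dones == 1) get no bootstrapped value.
    return rewards + gamma * next_q * (1.0 - dones)
```

In vanilla DQN, both the argmax and the evaluation would use `q_target_next`, which systematically inflates the targets.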
Developed a transformer-based model (T5-Small with LoRA) to convert natural language commands into robot navigation sequences.
Designed a 7-DOF mobile manipulator in ROS 2 with trajectory planning and LIDAR-based obstacle detection.
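Joint-space trajectory planning of this kind typically reduces to fitting a smooth time-parameterized polynomial per joint. A hedged sketch of a cubic segment with zero boundary velocities (the joint values and function name are illustrative):

```python
import numpy as np

def cubic_trajectory(q0, qf, T, n=50):
    """Cubic joint trajectory from q0 to qf over T seconds,
    with zero velocity at both endpoints."""
    t = np.linspace(0.0, T, n)
    s = 3 * (t / T) ** 2 - 2 * (t / T) ** 3          # smooth 0 -> 1 blend
    q = q0 + (qf - q0) * s                           # joint position profile
    qdot = (qf - q0) * (6 * t / T**2 - 6 * t**2 / T**3)  # joint velocity profile
    return t, q, qdot
```

The same blend function applies elementwise per joint, so a 7-DOF arm just runs seven of these with shared timing.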
Developed a MoveIt-based pick-and-place operation in ROS 2 Humble with trajectory execution in RViz2.
Developed a ROS-based gate detection pipeline for real-time navigation of a drone using MAVROS and PX4.
Designed a wireless animatronic hand using infrared sensors, published in ICDSMLA 2021, achieving Best Paper (Third Place).
Created a perception pipeline using ROS 2 and OpenCV for dynamic obstacle detection and lane-following.
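A common lane-following primitive in pipelines like this: binarize the lane markings, take the centroid of lane pixels in the lower band of the image, and steer proportionally to its offset from image center. A minimal NumPy sketch (the threshold band, gain, and function name are illustrative; the actual pipeline used ROS 2 and OpenCV):

```python
import numpy as np

def steering_from_mask(mask, gain=0.005):
    """mask: (H, W) binary array where lane pixels are 1.
    Returns a proportional steering command from the lane centroid's
    horizontal offset, computed over the bottom third of the image."""
    h, w = mask.shape
    band = mask[2 * h // 3:, :]            # bottom third: closest to the robot
    cols = np.nonzero(band)[1]
    if cols.size == 0:
        return 0.0                         # no lane visible: hold course
    centroid = cols.mean()
    error = centroid - (w - 1) / 2.0       # pixels right of center -> positive
    return -gain * error                   # steer back toward the lane centroid
```

In a ROS 2 node this value would feed the angular component of a `Twist` command at each frame.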
I develop robust applications using a variety of languages and development tools.
I build scalable solutions using modern DevOps practices, cloud services, and containerization.
I integrate advanced robotics frameworks and machine learning libraries to deliver intelligent solutions.
I create realistic simulations and control systems for robotic applications.
I develop advanced computer vision solutions for perception tasks.
I work with embedded systems and networking protocols for robotic applications.
ICDSMLA 2021: 3rd International Conference on Data Science, Machine Learning and Applications
Authors: M Balakarthikeyan, D Rajesh, M Sai Jagadeesh, G Santhosh Kumar
Abstract: Designed a wireless animatronic hand using an infrared sensor for data transmission, enabling pick-and-place operations and reducing human risk. Published by Springer Nature Singapore.
I'd love to hear from you! Fill out the form below and I'll get back to you soon.