Below are some of the exciting projects I have been a part of throughout my studies.
During my time as a Graduate Assistant at Boston University's Material Robotics Laboratory, I contributed to a pioneering project focused on enhancing colonoscopy procedures. Working alongside a PhD student, I helped develop a soft robotic sleeve, made from flexible EcoFlex silicone, designed to reduce discomfort and improve safety during colonoscopies. The sleeve uses embedded soft optical waveguides for accurate force measurement and distribution, making the procedure less invasive and more comfortable for patients. Fabrication involved meticulous molding, calibration for precise functionality, and extensive testing with medical professionals to refine its clinical application. This project is featured in a forthcoming research paper titled "Soft Robots for Colonoscopy: Monitoring Forces, Increasing Safety, and Reducing Discomfort," which details these innovations and their role in improving patient comfort and procedural safety.
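The waveguide sensing idea is that contact force bends the waveguide and attenuates the light passing through it, so a calibration maps measured light loss back to force. As a minimal sketch (assuming a simple linear model and synthetic calibration data, not the project's actual calibration):

```python
def calibrate(losses_db, forces_n):
    """Least-squares fit of a hypothetical linear model: force = a * loss + b."""
    n = len(losses_db)
    mean_x = sum(losses_db) / n
    mean_y = sum(forces_n) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(losses_db, forces_n)) / \
        sum((x - mean_x) ** 2 for x in losses_db)
    b = mean_y - a * mean_x
    return a, b

def estimate_force(loss_db, a, b):
    """Convert a measured light loss (dB) into an estimated contact force (N)."""
    return a * loss_db + b

# Synthetic calibration points, for illustration only
a, b = calibrate([0.0, 1.0, 2.0, 3.0], [0.0, 0.5, 1.0, 1.5])
print(estimate_force(2.0, a, b))  # 1.0 N on this synthetic data
```

In practice the intensity-to-force relationship of a soft waveguide is often nonlinear, which is why the calibration step against known loads matters.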
During my summer internship at the Morphable Biorobotics Lab, I developed stacked balloon actuators (SBAs) for a soft robotic octopus project, focusing on the manufacturing process and improving the design. I fabricated and rigorously tested these SBAs, enhancing their performance through fiber reinforcement. This involved crafting the components, assembling them, and assessing their functionality both in lab conditions and in real-world simulations. This work is part of an upcoming research paper titled "Harnessing Fiber-Reinforced Bellows Actuators for Locomotive Soft Robots," which details significant advancements in actuator technology for soft robotics.
As part of a team project for my Medical Robotics course at Boston University, we developed a soft robotic glove to aid hand therapy for individuals with severe spinal cord injuries, such as quadriplegia. This glove uses a material called Dragon Skin 30, which is both comfortable and flexible, adapting to various hand shapes to support natural movement during daily activities. The glove operates through a cable system controlled by servo motors. These motors adjust the tension and positioning of the cables, allowing for precise finger movements. Adjustments can be made in real-time using an Arduino controller, tailoring the therapy to each user's needs. An innovative feature of the glove is its integration with a sphygmomanometer connected to a pressure sensor, which helps therapists monitor and fine-tune the force applied during therapy sessions. This not only tracks progress but also ensures the therapy is effective and adapted to the patient’s recovery needs.
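The control flow above can be sketched in a few lines: a commanded finger flexion is mapped to a servo angle that sets cable tension, and the pressure reading is used as a safety interlock. The angle range, pressure limit, and function names below are illustrative stand-ins, not the glove's tuned values:

```python
def flexion_to_servo_angle(flexion_pct, min_angle=0.0, max_angle=120.0):
    """Map a commanded finger flexion (0-100 %) to a servo angle in degrees.

    The angle range is a hypothetical stand-in for the cable's travel.
    """
    flexion_pct = max(0.0, min(100.0, flexion_pct))  # clamp out-of-range commands
    return min_angle + (max_angle - min_angle) * flexion_pct / 100.0

def safe_angle(target_angle, cuff_pressure_kpa, current_angle, limit_kpa=20.0):
    """Freeze cable motion if the cuff pressure exceeds a safe limit."""
    return current_angle if cuff_pressure_kpa > limit_kpa else target_angle

print(flexion_to_servo_angle(50))   # 60.0 degrees
print(flexion_to_servo_angle(150))  # clamped to 120.0
```

On the real glove this logic would run on the Arduino, with the servo angle written out via the Servo library each control cycle.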
In my Soft Robotics course at Boston University, my team and I developed the "Soft Robotic Starfish," a bio-inspired robot for a RoboSoft competition. This robot utilized soft pneumatic actuators, origami bellows, and granular jamming, adapting seamlessly to complex environments. Our design incorporated cost-effective materials and was validated through Finite Element Analysis (FEA) simulations and empirical testing. The project exemplifies the potential of soft robotics to enhance flexibility and safety in fields like search-and-rescue and human-robot interaction. By successfully navigating challenging environments and demonstrating unique methods of locomotion and object manipulation, the Soft Robotic Starfish represents a significant advancement in robotic design, blending innovation with practical applications.
In an effort to improve colon cancer screening efficacy and patient comfort, this project introduces a haptic feedback system for colonoscopy. Utilizing a glove with integrated force sensors and pneumatic chambers, the device provides real-time tactile feedback to surgeons, enhancing their ability to detect and respond to tissue contact during procedures. This innovative approach aims to reduce procedural complications, increase screening compliance, and ultimately save lives by facilitating early cancer detection.
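The core of the feedback loop is a mapping from the measured fingertip contact force to a chamber pressure the surgeon can feel. A minimal sketch, assuming a proportional mapping with a noise deadband and a comfort ceiling (all constants here are illustrative, not the device's tuned values):

```python
def feedback_pressure(contact_force_n, gain_kpa_per_n=5.0,
                      deadband_n=0.2, max_kpa=40.0):
    """Map a sensed contact force (N) to a pneumatic chamber pressure (kPa).

    Forces below the deadband are treated as sensor noise; the output
    saturates at max_kpa so strong contacts stay comfortable to feel.
    """
    if contact_force_n < deadband_n:
        return 0.0
    return min(max_kpa, gain_kpa_per_n * (contact_force_n - deadband_n))

print(feedback_pressure(0.1))   # 0.0  (below deadband, ignored)
print(feedback_pressure(1.2))   # 5.0  (proportional region)
print(feedback_pressure(100.0)) # 40.0 (saturated)
```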
In my senior capstone project at the University of Hartford, my team and I developed the Human Assistive and Robust Quadruped (HARQ), nicknamed "Project Cheetah." This quadruped robot was designed to aid healthcare workers by delivering medicine and food to isolated hospital patients, reducing direct contact and potential disease transmission. The project revamped an initial robot design, enhancing stability and mobility through adjusted leg movements and a robust, lightweight frame. HARQ integrates a remote operation system using a camera and joystick, extending its utility to a variety of hospital settings. Our design process included extensive testing and iterations, focusing on stability, mobility, and remote operability, addressing real-world healthcare challenges.
During my internship at Kaman, I designed and built a custom KRP Torque Tester using Siemens NX. I created CAD models and made detailed engineering drawings, applying GD&T (Geometric Dimensioning and Tolerancing) to ensure precise specifications. I then carefully machined all the parts myself using a mill and lathe, maintaining the tolerances defined in the drawings. Additionally, I sourced and ordered all other components required for the assembly. The completed device was used for accurate torque measurements and quality assurance in production testing.
In two of my industrial electronics courses at the University of Hartford, my team and I engineered an advanced automation system by integrating a dual conveyor work cell with an Epson SCARA robotic arm. This system is designed to enhance assembly line efficiency in industrial settings, capable of precise pick-and-place tasks using the Epson robot's versatile gripper and of sorting metal and plastic components via conveyor. The entire system is orchestrated using CLICK PLC programming and ladder logic, ensuring synchronized operations between the conveyors and the robotic arm.
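Ladder logic evaluates boolean "rungs" top to bottom on every scan cycle, which is easy to mirror in ordinary code. The sketch below simulates one scan with a few interlocks of the kind used here; the signal names are illustrative, not the actual CLICK PLC tags:

```python
def scan(inputs, state):
    """One PLC-style scan: evaluate rungs top to bottom from input bits."""
    out = dict(state)
    # Rung 1: run conveyor 1 only when a part is requested and the robot is clear
    out["conveyor1"] = inputs["part_requested"] and not inputs["robot_in_cell"]
    # Rung 2: divert metal parts toward conveyor 2 via the proximity sensor
    out["divert_gate"] = inputs["metal_detected"]
    # Rung 3: signal the SCARA to pick once a part reaches the end stop
    out["robot_pick"] = inputs["part_at_stop"] and not out["conveyor1"]
    return out

inputs = {"part_requested": False, "robot_in_cell": False,
          "metal_detected": True, "part_at_stop": True}
print(scan(inputs, {}))  # conveyor1 off, gate diverting, robot commanded to pick
```

Modeling the rungs this way was a handy desk-check before committing the logic to the PLC, since the same truth table has to hold in both places.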
Working with a team in my Introduction to Robotics class at Boston University, we introduced Coco-Bot, an innovative agricultural robot designed to automate the harvesting of coconuts from tall trees, addressing the safety risks and inefficiencies of traditional methods. Utilizing the Intel Depth Camera D405 and a YOLOv5 neural network, Coco-Bot achieves precise detection and localization of coconuts, enabling efficient robotic harvesting. This integration of advanced robotics, computer vision, and machine learning allows Coco-Bot to operate effectively in varied orchard environments.
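Localization here means turning a YOLOv5 bounding-box center plus the depth camera's range reading into a 3D point the arm can reach for. That step is the standard pinhole back-projection; the intrinsics below are placeholders, not the D405's calibrated values:

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a depth reading into camera-frame XYZ.

    (fx, fy) are focal lengths in pixels and (cx, cy) the principal point,
    per the standard pinhole camera model.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# A coconut detected at the image center, 2 m away, lies on the optical axis
print(deproject(320, 240, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0))  # (0.0, 0.0, 2.0)
```

With a real RealSense camera, the librealsense SDK provides this same computation from the calibrated intrinsics.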
In my Independent Study at the University of Hartford, my team and I developed a computer vision application to enhance a conveyor belt system by incorporating a color detection feature using OpenCV and a Logitech webcam. This project involved programming a Raspberry Pi to detect yellow objects and activate an LED, which was integrated into a scaled industrial setup. We mounted the camera on a 3D-printed bracket, which we adjusted to optimize object recognition. Through this project, we gained practical experience in computer vision, programming, and hardware design, successfully demonstrating how real-time color detection can be implemented in automation processes.
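The detection logic amounts to thresholding pixels in HSV space and driving the LED once enough of them fall in the yellow band. A dependency-free sketch using the standard library's colorsys (the thresholds are illustrative; note OpenCV scales hue to [0, 180) rather than the [0, 1) range colorsys returns):

```python
import colorsys

def is_yellow(r, g, b, hue_lo=40/360, hue_hi=70/360, min_sat=0.4, min_val=0.4):
    """Decide whether an RGB pixel (0-255 channels) counts as yellow."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return hue_lo <= h <= hue_hi and s >= min_sat and v >= min_val

def led_state(frame_pixels, min_yellow=50):
    """Drive the LED high once enough yellow pixels appear in the frame."""
    count = sum(1 for p in frame_pixels if is_yellow(*p))
    return count >= min_yellow

print(is_yellow(255, 255, 0))  # True  (pure yellow)
print(is_yellow(255, 0, 0))    # False (red)
```

On the Pi, the same decision would gate a GPIO write; using a pixel count rather than a single pixel makes the LED robust to sensor noise.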
In my Learning from Data course at Boston University, my partner and I explored the effectiveness of ARIMA, Random Forest, and Recursive Linear Regression (RLS) models in forecasting life expectancy using the WHO Life Expectancy dataset. This project involved extensive data cleaning, feature selection, and the application of Python for analysis. We rigorously tested these models to evaluate their predictive accuracy and their ability to incorporate socio-economic and health indicators. Our findings revealed that while ARIMA excelled in short-term forecasting, RLS provided valuable adaptations for long-term predictions, demonstrating the potential of integrating these models to enhance forecasting accuracy for public health planning.
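The adaptive behavior that made RLS useful for longer horizons comes from its recursive update: each new observation corrects the weights by the prediction error, scaled by a Kalman-style gain. A minimal scalar sketch (one weight, no intercept, synthetic data; the actual analysis used multiple socio-economic features):

```python
class ScalarRLS:
    """Recursive least squares for y ≈ w * x, shown with a single weight."""

    def __init__(self, forgetting=1.0):
        self.w = 0.0   # current weight estimate
        self.p = 1e6   # inverse input covariance (large = uninformed prior)
        self.lam = forgetting  # forgetting factor; < 1 discounts old data

    def update(self, x, y):
        # Gain for this sample, then correct the weight by the prediction error
        k = self.p * x / (self.lam + x * self.p * x)
        self.w += k * (y - self.w * x)
        self.p = (self.p - k * x * self.p) / self.lam
        return self.w

model = ScalarRLS()
for x, y in [(1, 2), (2, 4), (3, 6)]:  # synthetic data from y = 2x
    model.update(x, y)
print(round(model.w, 4))  # converges near 2.0
```

Setting the forgetting factor below 1 is what lets the model track slow drifts in the underlying relationship, which is the property that helped with long-term life-expectancy trends.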
In my Cyber-Physical Systems class at Boston University, my team and I developed an autonomous robot system named "Fetch" using the ROSMaster X3 platform, designed to locate, retrieve, and return a ball autonomously. This project involved utilizing a finite state machine (FSM) for decision-making, where the robot employed LIDAR for navigation and computer vision via OpenCV to recognize specific colors related to its objectives. Key functionalities included dynamic obstacle avoidance and precise object retrieval, demonstrating advanced robotic capabilities in real-world scenarios.
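An FSM of this kind reduces to a transition table keyed by (state, event) pairs, with perception events (ball seen, ball lost, home reached) driving the transitions. The state and event names below are illustrative, not the actual implementation:

```python
# Hypothetical transition table for the retrieve-and-return task
TRANSITIONS = {
    ("SEARCH",   "ball_seen"):    "APPROACH",
    ("APPROACH", "ball_reached"): "GRASP",
    ("APPROACH", "ball_lost"):    "SEARCH",   # re-acquire if vision loses it
    ("GRASP",    "ball_held"):    "RETURN",
    ("RETURN",   "home_reached"): "DONE",
}

def step(state, event):
    """Advance the FSM; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "SEARCH"
for event in ["ball_seen", "ball_reached", "ball_held", "home_reached"]:
    state = step(state, event)
print(state)  # DONE
```

Keeping the table explicit like this made it easy to reason about failure cases, such as dropping back to SEARCH whenever the color target left the frame.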