The Foundations of Heart Sensing
This document is an overview of the progress made during the first week of my robotics project. In embarking on this exploration into advanced robotics, I tackled several aspects of this challenging venture. Here, I provide a breakdown of my initial milestones and outline the roadmap for forthcoming tasks.
Gantt Chart Creation: Visualizing the Future
In the initial phase of my robotics project, the focus was on effective project management, particularly in the planning and tracking of tasks. To this end, I developed a detailed Gantt chart in the first week. This chart provides a holistic view of the project's timeline and key milestones. It decomposes intricate processes into individual, manageable tasks, thereby offering a clear visualization of planned activities against the designated timeline. This Gantt chart will serve as a roadmap for the project, ensuring the alignment of progress with the predefined schedule.
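As a small illustration of the idea, the sketch below plots a handful of tasks against a weekly timeline in Python with matplotlib. The task names and week numbers are hypothetical placeholders, not the contents of the actual chart, which was built in a dedicated planning tool.

import matplotlib.pyplot as plt

# (task name, start week, duration in weeks) -- illustrative values only
tasks = [
    ("Gantt chart & planning", 1, 1),
    ("Hardware selection", 1, 1),
    ("Pi / UR10e connectivity", 2, 2),
    ("Dataset compilation", 3, 3),
    ("Radar sensor integration", 4, 3),
]

fig, ax = plt.subplots()
for i, (name, start, duration) in enumerate(tasks):
    ax.barh(i, duration, left=start)  # One horizontal bar per task
ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()  # First task at the top, as in a conventional Gantt chart
ax.set_xlabel("Week")
plt.show()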
Hardware Selection: The Physical Building Blocks
After careful evaluation, I have finalized the hardware components for my robotics project. The centrepiece will be the UR10e robotic arm, a piece of equipment well known in the field for its power and adaptability. The UR10e's exceptional flexibility allows it to perform tasks with noteworthy precision, making it an invaluable asset to the project.
For the sensing and processing roles, I have chosen the Raspberry Pi 4, a 60GHz mmWave sensor, and the Pi Camera. The Raspberry Pi 4, selected for its strong processing capabilities and compact size, will serve as the project's primary processing unit. The 60GHz mmWave sensor, critical for heart rate detection, and the Pi Camera, used for visual object identification, round off the key components of the hardware infrastructure.
Connectivity: The Communication Bridge
In the initial phase of my project, I planned out the interconnectivity between the Raspberry Pi and the UR10e robotic arm. The Python script below is a rudimentary version of the envisaged interaction between these two key components, establishing the communication groundwork needed for the Raspberry Pi and the UR10e to operate harmoniously in the larger system.
import urx  # Python library for controlling Universal Robots arms over the network
import numpy as np
from tensorflow.keras.models import load_model
from picamera.array import PiRGBArray
from picamera import PiCamera
import radar_sensor_module  # Placeholder driver for the 60GHz mmWave radar sensor
import pir_sensor_module  # Placeholder driver for the PIR proximity sensor

# Load the model trained for chest detection
model = load_model('model_path.h5')

# Connect to the UR10e robotic arm
robot_ip = "192.168.1.100"
robot = urx.Robot(robot_ip)

# Initialize the radar sensor
def setup_radar_sensor():
    radar_sensor_module.initialize()

# Read the current heart rate from the radar sensor
def read_from_radar_sensor():
    heart_rate = radar_sensor_module.get_heart_rate()
    return heart_rate

# Initialize the PIR sensor
def setup_pir_sensor():
    pir_sensor_module.initialize()

# Detect whether an object is too close using the PIR sensor
def read_from_pir_sensor():
    is_object_close = pir_sensor_module.is_object_close()
    return is_object_close

# Capture a single video frame from the Pi Camera
def capture_frame_from_camera(camera, rawCapture):
    camera.capture(rawCapture, format="bgr")
    frame = rawCapture.array
    rawCapture.truncate(0)  # Clear the buffer for the next capture
    return frame

# Use the trained model to detect chest coordinates in the frame
def detect_chest_coordinates(frame):
    prediction = model.predict(np.array([frame]))
    chest_coordinates = prediction[0]  # Single frame in, single prediction out
    return chest_coordinates

# Move the robot to specific coordinates
# (mapping image coordinates to a robot pose is left simplified here)
def move_robot_to_coordinates(coordinates):
    robot_pose = coordinates
    robot.movej(robot_pose, acc=0.1, vel=0.1)

# Main loop
if __name__ == "__main__":
    setup_radar_sensor()  # Radar sensor setup
    setup_pir_sensor()  # PIR sensor setup

    # Set up video capture
    camera = PiCamera()
    rawCapture = PiRGBArray(camera)

    while True:
        frame = capture_frame_from_camera(camera, rawCapture)  # Capture video frame
        chest_coordinates = detect_chest_coordinates(frame)  # Detect chest

        if read_from_pir_sensor():  # If an object is too close, stop the robot
            print("Object too close, stopping the robot.")
            robot.stopj()
            continue

        if chest_coordinates is not None:  # If a chest was detected
            move_robot_to_coordinates(chest_coordinates)  # Move robot to chest coordinates
            heart_rate = read_from_radar_sensor()  # Get heart rate
            print("Heart Rate: ", heart_rate)
The Python script I've developed follows a simple yet effective flow, illustrating the cooperation between the Raspberry Pi and the UR10e robotic arm. A frame is captured with the Pi Camera, the trained model processes it to locate the chest coordinates, and those coordinates guide the UR10e towards the target. As a safety measure, the PIR sensor is polled on every cycle, and the arm is stopped if an object comes too close; once the chest is located, the radar sensor supplies the heart rate reading. This workflow demonstrates the division of labour in the overall architecture: the Raspberry Pi handles sensing and image processing, while the UR10e executes the physical tasks.
What's Next: Datasets and Radar Sensor Connectivity
Moving forward, my immediate priority is the compilation of datasets to develop a robust model for object recognition. The objective is to train the system to accurately identify and localize objects from images captured by the Pi Camera. Given that the model's performance will significantly influence the overall system effectiveness, this stage is a crucial part of the project.
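To give a sense of where these datasets will feed in, here is a minimal sketch of a chest-localization model framed as coordinate regression on camera frames. The architecture, input size, and bounding-box output format are illustrative assumptions, not the final design.

import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(224, 224, 3)),  # RGB frames from the Pi Camera
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(4),  # Predicted bounding box: (x, y, width, height)
])
model.compile(optimizer="adam", loss="mse")
# model.fit(frames, boxes, epochs=...) once the dataset is compiled

A compact network along these lines should keep inference feasible on the Raspberry Pi 4, though the final architecture will depend on how the compiled dataset turns out.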
In parallel, I'm dedicating efforts to integrating the Raspberry Pi with the radar sensor for heart rate detection. This task presents a unique set of challenges that I'm eager to explore and solve.
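As a starting point for that integration, here is a minimal sketch assuming the mmWave sensor streams its readings over UART to the Pi, a common interface for such modules. The port name, baud rate, and raw read are illustrative; decoding the sensor's actual frame format into a heart rate is precisely the challenge ahead.

import serial  # pyserial, for reading the sensor over a serial port

def read_radar_raw(port="/dev/ttyS0", baudrate=115200):
    # Open the UART port the sensor is wired to and pull one raw line
    with serial.Serial(port, baudrate, timeout=1) as ser:
        return ser.readline()  # Parsing this into a heart rate comes next

if __name__ == "__main__":
    print(read_radar_raw())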
To sum up, this first week has been immensely productive. I've successfully established the foundational elements, and the project's architecture is progressively taking shape.