Key Features • Wiki • Instructions • Approach • Strategy • Team • License
The team's main goal is to develop computational algorithms for autonomous robots using artificial intelligence and computer vision techniques. This code was developed for the ROSI CHALLENGE 2019.
1. Create a catkin workspace (if you do not already have one):

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace

2. Clone this repository into your ROS workspace src folder (../catkin_ws/src) with the name rosi_defy_forros:
$ git clone https://github.com/raphaellmsousa/ForROS rosi_defy_forros
3. Make the main script executable and build the workspace:

cd ~/catkin_ws/src/rosi_defy_forros/script
chmod +x rosi_forros.py
cd ~/catkin_ws
catkin build

Obs.: as we are using Python 2.7, you must use pip2 to install the following dependencies.
- $ sudo apt install python-pip # pip2 install
- $ pip2 install "numpy<1.17" # Numpy version<1.17
- $ pip2 install https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-1.14.0-cp27-none-linux_x86_64.whl # Tensorflow version 1.14.0
- $ pip2 install keras==2.2.5 # Keras version 2.2.5
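As a quick sanity check, the pinned versions can be confirmed from Python 2.7 (a minimal sketch, not part of the original setup steps):

```python
# check_versions.py -- quick sanity check of the pinned dependencies (run with python2)
import numpy
import tensorflow
import keras

print("numpy:      %s" % numpy.__version__)       # expected < 1.17
print("tensorflow: %s" % tensorflow.__version__)  # expected 1.14.0
print("keras:      %s" % keras.__version__)       # expected 2.2.5
```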
We used the following ROSI simulation parameters:

# rosi simulation parameters
rosi_simulation:
# Simulation Rendering Flag
# disable it for faster simulation (but no visualization)
# possible values: true, false.
simulation_rendering: true
# Velodyne processing flag
# disable it for faster simulation (but no Velodyne data)
# possible values: true, false.
velodyne_processing: false
# Kinect processing flag
# disable it for faster simulation (but no kinect data)
# possible values: true, false.
kinect_processing: true
# Hokuyo processing flag
# disable it for faster simulation (but no hokuyo data)
# possible values: true, false.
hokuyo_processing: false
# Hokuyo drawing lines
# enable it for seeing hokuyo lines
# possible values: true, false.
hokuyo_lines: false
# Camera on UR5 tool processing flag
# enable it for processing/publishing the ur5 tool cam
# possible values: true, false
ur5toolCam_processing: true
# Fire enabling
# allows to enable/disable the fire
fire_rendering: true
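Once the simulation is launched, these flags sit on the ROS parameter server, so a node can adapt to them at runtime. Below is a minimal sketch, assuming the YAML above is loaded under the /rosi_simulation namespace (as its top-level key suggests); the node name is only illustrative:

```python
#!/usr/bin/env python
# param_check.py -- sketch: read the simulation flags from the ROS parameter server
import rospy

if __name__ == "__main__":
    rospy.init_node("forros_param_check")

    # parameter names assume the YAML above is loaded under /rosi_simulation
    kinect_on = rospy.get_param("/rosi_simulation/kinect_processing", False)
    camera_on = rospy.get_param("/rosi_simulation/ur5toolCam_processing", False)
    fire_on = rospy.get_param("/rosi_simulation/fire_rendering", False)

    rospy.loginfo("kinect=%s ur5toolCam=%s fire=%s" % (kinect_on, camera_on, fire_on))
```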
To run the simulation:

$ roscore # start a ROS master
$ vrep # open the V-REP simulator
$ cd ~/catkin_ws # open your catkin workspace
$ source devel/setup.bash # source the workspace
$ roslaunch rosi_defy_forros rosi_joy_forros.launch # start the Rosi node
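For reference, a minimal rospy node that consumes the UR5 tool camera could look like the sketch below. This is not an excerpt of rosi_forros.py; the /sensor/ur5toolCam topic name is assumed from the rosi_defy simulation and may differ:

```python
#!/usr/bin/env python
# camera_listener.py -- minimal sketch of a node that consumes the UR5 tool camera
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def image_callback(msg):
    # convert the ROS image to an OpenCV BGR frame
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    rospy.loginfo("got frame %dx%d" % (frame.shape[1], frame.shape[0]))
    # ... run the vision / neural network pipeline on `frame` here ...

if __name__ == "__main__":
    rospy.init_node("forros_camera_listener")
    # topic name assumed from the rosi_defy simulation; adjust if it differs
    rospy.Subscriber("/sensor/ur5toolCam", Image, image_callback)
    rospy.spin()
```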
A detailed description of our team's approach is provided in the Jupyter notebook "valeNeuralNetwork.ipynb".
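At run time, a trained Keras model is loaded once and then queried frame by frame. The sketch below only illustrates that pattern; the file name model.h5, the 160x120 input size, and the normalization are placeholders, so see the notebook for the actual network and preprocessing:

```python
# inference_sketch.py -- load a trained Keras model and predict from a camera frame (sketch)
import cv2
import numpy as np
from keras.models import load_model

# "model.h5" and the 120x160x3 input shape are placeholders; see valeNeuralNetwork.ipynb
model = load_model("model.h5")

def predict_from_frame(frame_bgr):
    x = cv2.resize(frame_bgr, (160, 120))           # width x height expected by the network
    x = cv2.cvtColor(x, cv2.COLOR_BGR2RGB) / 255.0  # normalize to [0, 1]
    x = np.expand_dims(x, axis=0)                   # add the batch dimension
    return model.predict(x)[0]                      # e.g. steering command / class scores
```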
Our team divided the mission into two laps. In the first lap, we drive around the treadmills, avoiding obstacles and passing through the restricted region. While the robot is running, we also detect fire; the GPS trajectory and the positions of the detected fires are shown on a real-time map (see "valeNeuralNetwork.ipynb" for more details).
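The real-time map can be reproduced with a simple interactive matplotlib figure. The sketch below only illustrates the idea; the coordinates are placeholders standing in for the GPS readings and the fire detections:

```python
# live_map_sketch.py -- illustrative real-time map of the GPS track and detected fires
import matplotlib.pyplot as plt

plt.ion()                                   # interactive mode: update the figure as data arrives
fig, ax = plt.subplots()
ax.set_title("Robot GPS track and detected fires")
ax.set_xlabel("x [m]")
ax.set_ylabel("y [m]")

track_x, track_y = [], []

def update_map(robot_xy, fire_xy=None):
    # robot_xy: current GPS position; fire_xy: position of a newly detected fire (optional)
    track_x.append(robot_xy[0])
    track_y.append(robot_xy[1])
    ax.plot(track_x, track_y, "b-", linewidth=1)
    if fire_xy is not None:
        ax.plot(fire_xy[0], fire_xy[1], "r^", markersize=10)
    plt.pause(0.001)                        # let the GUI redraw

# example usage with placeholder coordinates
update_map((0.0, 0.0))
update_map((1.0, 0.5), fire_xy=(2.0, 3.0))
```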
In the second lap, the robot tries to climb the ladders and touch the rolls. It first touches the suspended-platform rolls that are on fire; then, after descending the stairs, it detects and touches the base of a roll without fire. After that, the mission is concluded.
Institutions: Federal Institute of Paraíba - IFPB (Cajazeiras) and Federal Institute of Bahia - IFBA (Vitória da Conquista).
- Raphaell Maciel de Sousa (team leader/IFPB)
- Gerberson Felix da Silva (IFPB)
- Jean Carlos Palácio Santos (IFBA)
- Rafael Silva Nogueira Pacheco (IFBA)
- Michael Botelho Santana (IFBA)
- Sérgio Ricardo Ferreira Andrade Júnior (IFBA)
- Lucas dos Santos Ribeiro (IFBA)
- Félix Santana Brito (IFBA)
- José Alberto Diaz Amado (IFBA)
- Matheus Vilela Novaes (IFBA)
Copyright (c) ForROS, 2019.
