Programming of Robotics Systems course at the University of Aveiro, Portugal, 2021-2022.

Overview

Programação de Sistemas Robóticos

Miguel Riem Oliveira

Universidade de Aveiro

2021-2022

Videos

LAR1 (Projeto AtlasCar)

MVI 3502 (Projecto RACE)

iros2014 final mpeg4 (IROS 2014)

set initial estimate (AtlasCar2)

join hand base optimization (ATOM)

mov10 (IROS 2014)

collect data agrob (AgrobV2)

SEV final rt (Projecto SEV)

PR2ObjectPerception (Projecto RACE)

final edited (SmObEx)

jarvis lab scene01 scene02 ex2 (Projeto Stamina)

OptimizationWithRViz (ATOM)

All the documentation for the classes will be placed in this repository. Nevertheless, you should also periodically check the course's elearning page. You can enter each of the folders to see the guides. Always use the README.adoc view to make sure you are reading the most recent version of the document.

The link to the Zoom sessions for the classes is on the elearning page.

Miguel Riem Oliveira

Comments
  • issue with custom msg

    Good afternoon professor @miguelriemoliveira @danifpdra

    I have been redoing aula/parte08; unfortunately I cannot create the ROS msg (exercise 4), even following the steps meticulously.

    It runs without errors when I run catkin_make, but nothing shows up when I run rosmsg list | grep Dog.
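    For reference, a sketch of the pieces catkin needs before a new message shows up in rosmsg list (the package layout is an assumption, since the actual package is not shown in the issue):

```cmake
# CMakeLists.txt of the package that owns msg/Dog.msg
find_package(catkin REQUIRED COMPONENTS message_generation std_msgs)

add_message_files(FILES Dog.msg)          # declare the .msg file
generate_messages(DEPENDENCIES std_msgs)  # generate the language bindings

catkin_package(CATKIN_DEPENDS message_runtime std_msgs)
```

    The package.xml additionally needs a build dependency on message_generation and an exec dependency on message_runtime; after catkin_make, re-sourcing devel/setup.bash is also required before rosmsg can see the new type.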

    opened by FelicianoSachilombo 19
  • game_bringup.launch file not working

    Good afternoon Professor @miguelriemoliveira, I have a little problem launching the game_bringup.launch file. I run Gazebo and it works fine, but after launching the game_bringup file the terminal returns this error:

    image

    The program works for my groupmates, and it also worked before today. The only thing I could have done is update Ubuntu, nothing more.

    opened by GabrieleMicheli 13
  • 'rqt' not working

    Good morning. When I run the command roslaunch p_gmicheli_bringup bringup.launch, the terminal returns this error:

    [p_gmicheli/spawn_urdf-2] process has died [pid 5309, exit code 1, cmd /opt/ros/noetic/lib/gazebo_ros/spawn_model -urdf -model p_gmicheli -x 0.0 -y -2.0 -z 0.0 -param p_gmicheli/robot_description __name:=spawn_urdf __log:=/home/gabriele/.ros/log/b62d7dc4-80e8-11ec-8bdc-9da4fbf35463/p_gmicheli-spawn_urdf-2.log]. log file: /home/gabriele/.ros/log/b62d7dc4-80e8-11ec-8bdc-9da4fbf35463/p_gmicheli-spawn_urdf-2.log*

    rvizScreen

    bringup

    But the spawn.launch file seems to work fine.

    Afterwards, if I run the command rqt and open the TF tree, the terminal returns this error:

    AttributeError: module 'tf2_ros' has no attribute 'Buffer'

    rqt_p_gmicheli

    And the same errors appear when I run roslaunch psr_scara_bringup bringup.launch and then rqt.

    bringup_scara

    To fix the problem I searched on google: AttributeError: module 'tf2_ros' has no attribute 'Buffer' and I found this link: https://answers.ros.org/question/306904/tf2-error-attributeerror-module-object-has-no-attribute-buffer/

    Here are the steps that I followed and the screenshots of the terminal; my problem seems very similar to this one.

    which python

    tf2_tools

    grep tf2

    python

    Afterwards, from the forum I found this new link: https://answers.ros.org/question/306956/tf-view_frame-missing-how-to-reinstall/ and I followed its steps; here are some screenshots:

    rosrun_tf_2

    rosrun_tf

    tf_installation
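    In case it helps future readers, a sketch of the checks suggested by answers like the ones linked above (package names assume ROS Noetic):

```shell
# Check which file Python actually imports for tf2_ros; a path inside a
# local workspace (rather than /opt/ros/noetic) suggests a shadowing copy.
python3 -c "import tf2_ros; print(tf2_ros.__file__)"

# Reinstall the tf2 packages in case the installation is broken.
sudo apt install --reinstall ros-noetic-tf2-ros ros-noetic-tf2-tools
```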

    opened by GabrieleMicheli 10
  • Problem with multiple robots

    Good morning teacher @miguelriemoliveira,

    I'm not able to change the color of the robot through the command line, even with the macro apparently working. It keeps giving me the default color. That's my player.urdf.xacro:

    <?xml version="1.0" ?>
    <robot name="p_amartinho" xmlns:xacro="http://ros.org/wiki/xacro">
      <!--  Xacro description of turtlebot robot waffle pi used to play team hunt.-->
      <!-- Analtino Martinho-->
      <!--  PSR, January 2021-->
    
      <!--Args are local, properties are global, so we copy the values of the arguments to the properties-->
      <xacro:arg name="player_name" default="p_amartinho" />
      <xacro:arg name="player_color" default="Yellow" /> <!-- e.g. Red, Orange. Must exist in Gazebo/Colors and be defined in properties.xacro.-->
    
      <!--Properties are global so they can be viewed from inside the included xacros-->
      <!--  args are used with $(arg arg_name), properties are used as ${property_name}-->
      <xacro:property name="player_name" value="$(arg player_name)"/>
      <xacro:property name="player_color" value="$(arg player_color)"/>
    
      <!-- Include other files-->
      <xacro:include filename="$(find ${player_name}_description)/urdf/properties.xacro"/>
    
      <!--  <xacro:include filename="$(find ${player_name}_description)/urdf/player.gazebo.xacro"/>-->
      <xacro:include filename="$(find ${player_name}_description)/urdf/player.gazebo.macro.xacro"/>
      <xacro:gazebo_macro player_color="${player_color}" laser_visual="true" camera_visual="false" imu_visual="true"/>
    
    <!-- (...) -->
      <link name="base_footprint"/>
    
        <joint name="base_joint" type="fixed">
          <parent link="base_footprint"/>
          <child link="base_link" />
          <origin xyz="0 0 0.010" rpy="0 0 0"/>
        </joint>
    
        <link name="base_link">
          <visual>
            <origin xyz="-0.064 0 0.0" rpy="0 0 0"/>
            <geometry>
              <mesh filename="package://turtlebot3_description/meshes/bases/waffle_pi_base.stl" scale="0.001 0.001 0.001"/>
            </geometry>
            <material name="light_black"/>
          </visual>
    
          <collision>
            <origin xyz="-0.064 0 0.047" rpy="0 0 0"/>
            <geometry>
              <box size="0.266 0.266 0.094"/>
            </geometry>
          </collision>
    
          <inertial>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <mass value="1.3729096e+00"/>
            <inertia ixx="8.7002718e-03" ixy="-4.7576583e-05" ixz="1.1160499e-04"
                     iyy="8.6195418e-03" iyz="-3.5422299e-06"
                     izz="1.4612727e-02" />
          </inertial>
        </link>
    
        <joint name="wheel_left_joint" type="continuous">
          <parent link="base_link"/>
          <child link="wheel_left_link"/>
          <origin xyz="0.0 0.144 0.023" rpy="-1.57 0 0"/>
          <axis xyz="0 0 1"/>
        </joint>
    
        <link name="wheel_left_link">
          <visual>
            <origin xyz="0 0 0" rpy="1.57 0 0"/>
            <geometry>
              <mesh filename="package://turtlebot3_description/meshes/wheels/left_tire.stl" scale="0.001 0.001 0.001"/>
            </geometry>
            <material name="dark"/>
          </visual>
    
          <collision>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <geometry>
              <cylinder length="0.018" radius="0.033"/>
            </geometry>
          </collision>
    
          <inertial>
            <origin xyz="0 0 0" />
            <mass value="2.8498940e-02" />
            <inertia ixx="1.1175580e-05" ixy="-4.2369783e-11" ixz="-5.9381719e-09"
                     iyy="1.1192413e-05" iyz="-1.4400107e-11"
                     izz="2.0712558e-05" />
            </inertial>
        </link>
    
        <joint name="wheel_right_joint" type="continuous">
          <parent link="base_link"/>
          <child link="wheel_right_link"/>
          <origin xyz="0.0 -0.144 0.023" rpy="-1.57 0 0"/>
          <axis xyz="0 0 1"/>
        </joint>
    
        <link name="wheel_right_link">
          <visual>
            <origin xyz="0 0 0" rpy="1.57 0 0"/>
            <geometry>
              <mesh filename="package://turtlebot3_description/meshes/wheels/right_tire.stl" scale="0.001 0.001 0.001"/>
            </geometry>
            <material name="dark"/>
          </visual>
    
          <collision>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <geometry>
              <cylinder length="0.018" radius="0.033"/>
            </geometry>
          </collision>
    
          <inertial>
            <origin xyz="0 0 0" />
            <mass value="2.8498940e-02" />
            <inertia ixx="1.1175580e-05" ixy="-4.2369783e-11" ixz="-5.9381719e-09"
                     iyy="1.1192413e-05" iyz="-1.4400107e-11"
                     izz="2.0712558e-05" />
            </inertial>
        </link>
    
        <joint name="caster_back_right_joint" type="fixed">
          <parent link="base_link"/>
          <child link="caster_back_right_link"/>
          <origin xyz="-0.177 -0.064 -0.004" rpy="-1.57 0 0"/>
        </joint>
    
        <link name="caster_back_right_link">
          <collision>
            <origin xyz="0 0.001 0" rpy="0 0 0"/>
            <geometry>
              <box size="0.030 0.009 0.020"/>
            </geometry>
          </collision>
    
          <inertial>
            <origin xyz="0 0 0" />
            <mass value="0.005" />
            <inertia ixx="0.001" ixy="0.0" ixz="0.0"
                     iyy="0.001" iyz="0.0"
                     izz="0.001" />
          </inertial>
        </link>
    
        <joint name="caster_back_left_joint" type="fixed">
          <parent link="base_link"/>
          <child link="caster_back_left_link"/>
          <origin xyz="-0.177 0.064 -0.004" rpy="-1.57 0 0"/>
        </joint>
    
        <link name="caster_back_left_link">
          <collision>
            <origin xyz="0 0.001 0" rpy="0 0 0"/>
            <geometry>
              <box size="0.030 0.009 0.020"/>
            </geometry>
          </collision>
    
          <inertial>
            <origin xyz="0 0 0" />
            <mass value="0.005" />
            <inertia ixx="0.001" ixy="0.0" ixz="0.0"
                     iyy="0.001" iyz="0.0"
                     izz="0.001" />
          </inertial>
        </link>
    
        <joint name="imu_joint" type="fixed">
          <parent link="base_link"/>
          <child link="imu_link"/>
          <origin xyz="0.0 0 0.068" rpy="0 0 0"/>
        </joint>
    
        <link name="imu_link"/>
    
        <joint name="scan_joint" type="fixed">
          <parent link="base_link"/>
          <child link="base_scan"/>
          <origin xyz="-0.064 0 0.122" rpy="0 0 0"/>
        </joint>
    
        <link name="base_scan">
          <visual>
            <origin xyz="0 0 0.0" rpy="0 0 0"/>
            <geometry>
              <mesh filename="package://turtlebot3_description/meshes/sensors/lds.stl" scale="0.001 0.001 0.001"/>
            </geometry>
            <material name="dark"/>
          </visual>
    
          <collision>
            <origin xyz="0.015 0 -0.0065" rpy="0 0 0"/>
            <geometry>
              <cylinder length="0.0315" radius="0.055"/>
            </geometry>
          </collision>
    
          <inertial>
            <mass value="0.114" />
            <origin xyz="0 0 0" />
            <inertia ixx="0.001" ixy="0.0" ixz="0.0"
                     iyy="0.001" iyz="0.0"
                     izz="0.001" />
          </inertial>
        </link>
    
        <joint name="camera_joint" type="fixed">
          <origin xyz="0.073 -0.011 0.084" rpy="0 0 0"/>
          <parent link="base_link"/>
          <child link="camera_link"/>
        </joint>
    
        <link name="camera_link">
          <collision>
            <origin xyz="0.005 0.011 0.013" rpy="0 0 0"/>
            <geometry>
              <box size="0.015 0.030 0.027"/>
            </geometry>
          </collision>
        </link>
    
        <joint name="camera_rgb_joint" type="fixed">
          <origin xyz="0.003 0.011 0.009" rpy="0 0 0"/>
          <parent link="camera_link"/>
          <child link="camera_rgb_frame"/>
        </joint>
        <link name="camera_rgb_frame"/>
    
        <joint name="camera_rgb_optical_joint" type="fixed">
          <origin xyz="0 0 0" rpy="-1.57 0 -1.57"/>
          <parent link="camera_rgb_frame"/>
          <child link="camera_rgb_optical_frame"/>
        </joint>
        <link name="camera_rgb_optical_frame"/>
    
    
    </robot>
    

    That's my player.gazebo.macro.xacro:

    <?xml version="1.0" ?>
    <robot name="p_amartinho" xmlns:xacro="http://ros.org/wiki/xacro">
    <!-- <xacro:macro name="gazebo_macro" params="player_color laser_visual camera_visual imu_visual">-->
    
        <xacro:arg name="laser_visual"  default="false"/>
        <xacro:arg name="camera_visual" default="false"/>
        <xacro:arg name="imu_visual"    default="false"/>
    
        <gazebo reference="base_link">
          <material>Gazebo/DarkGrey</material>
        </gazebo>
    
        <gazebo reference="wheel_left_link">
          <mu1>0.1</mu1>
          <mu2>0.1</mu2>
          <kp>500000.0</kp>
          <kd>10.0</kd>
          <minDepth>0.001</minDepth>
          <maxVel>0.1</maxVel>
          <fdir1>1 0 0</fdir1>
          <material>Gazebo/${player_color}</material>
        </gazebo>
    
        <gazebo reference="wheel_right_link">
          <mu1>0.1</mu1>
          <mu2>0.1</mu2>
          <kp>500000.0</kp>
          <kd>10.0</kd>
          <minDepth>0.001</minDepth>
          <maxVel>0.1</maxVel>
          <fdir1>1 0 0</fdir1>
          <material>Gazebo/${player_color}</material>
        </gazebo>
    
        <gazebo reference="caster_back_right_link">
          <mu1>0.1</mu1>
          <mu2>0.1</mu2>
          <kp>1000000.0</kp>
          <kd>100.0</kd>
          <minDepth>0.001</minDepth>
          <maxVel>1.0</maxVel>
          <material>Gazebo/FlatBlack</material>
        </gazebo>
    
        <gazebo reference="caster_back_left_link">
          <mu1>0.1</mu1>
          <mu2>0.1</mu2>
          <kp>1000000.0</kp>
          <kd>100.0</kd>
          <minDepth>0.001</minDepth>
          <maxVel>1.0</maxVel>
          <material>Gazebo/FlatBlack</material>
        </gazebo>
    
        <gazebo reference="imu_link">
          <sensor type="imu" name="imu">
            <always_on>true</always_on>
            <visualize>$(arg imu_visual)</visualize>
          </sensor>
          <material>Gazebo/Grey</material>
        </gazebo>
    
        <gazebo>
          <plugin name="turtlebot3_waffle_pi_controller" filename="libgazebo_ros_diff_drive.so">
            <commandTopic>cmd_vel</commandTopic>
            <odometryTopic>odom</odometryTopic>
            <odometryFrame>odom</odometryFrame>
            <odometrySource>world</odometrySource>
            <publishOdomTF>true</publishOdomTF>
            <robotBaseFrame>base_footprint</robotBaseFrame>
            <publishWheelTF>false</publishWheelTF>
            <publishTf>true</publishTf>
            <publishWheelJointState>true</publishWheelJointState>
            <legacyMode>false</legacyMode>
            <updateRate>30</updateRate>
            <leftJoint>wheel_left_joint</leftJoint>
            <rightJoint>wheel_right_joint</rightJoint>
            <wheelSeparation>0.287</wheelSeparation>
            <wheelDiameter>0.066</wheelDiameter>
            <wheelAcceleration>1</wheelAcceleration>
            <wheelTorque>10</wheelTorque>
            <rosDebugLevel>na</rosDebugLevel>
          </plugin>
        </gazebo>
    
        <gazebo>
          <plugin name="imu_plugin" filename="libgazebo_ros_imu.so">
            <alwaysOn>true</alwaysOn>
            <bodyName>imu_link</bodyName>
            <frameName>imu_link</frameName>
            <topicName>imu</topicName>
            <serviceName>imu_service</serviceName>
            <gaussianNoise>0.0</gaussianNoise>
            <updateRate>0</updateRate>
            <imu>
              <noise>
                <type>gaussian</type>
                <rate>
                  <mean>0.0</mean>
                  <stddev>2e-4</stddev>
                  <bias_mean>0.0000075</bias_mean>
                  <bias_stddev>0.0000008</bias_stddev>
                </rate>
                <accel>
                  <mean>0.0</mean>
                  <stddev>1.7e-2</stddev>
                  <bias_mean>0.1</bias_mean>
                  <bias_stddev>0.001</bias_stddev>
                </accel>
              </noise>
            </imu>
          </plugin>
        </gazebo>
    
        <gazebo reference="base_scan">
          <material>Gazebo/${player_color}</material>
          <sensor type="ray" name="lds_lfcd_sensor">
            <pose>0 0 0 0 0 0</pose>
            <visualize>$(arg laser_visual)</visualize>
            <update_rate>5</update_rate>
            <ray>
              <scan>
                <horizontal>
                  <samples>360</samples>
                  <resolution>1</resolution>
                  <min_angle>0.0</min_angle>
                  <max_angle>6.28319</max_angle>
                </horizontal>
              </scan>
              <range>
                <min>0.120</min>
                <max>3.5</max>
                <resolution>0.015</resolution>
              </range>
              <noise>
                <type>gaussian</type>
                <mean>0.0</mean>
                <stddev>0.01</stddev>
              </noise>
            </ray>
            <plugin name="gazebo_ros_lds_lfcd_controller" filename="libgazebo_ros_laser.so">
              <topicName>scan</topicName>
              <frameName>base_scan</frameName>
            </plugin>
          </sensor>
        </gazebo>
    
      <!--link : https://www.raspberrypi.org/documentation/hardware/camera/-->
        <gazebo reference="camera_rgb_frame">
          <sensor type="camera" name="Pi Camera">
            <always_on>true</always_on>
            <visualize>$(arg camera_visual)</visualize>
            <camera>
                <horizontal_fov>2.085595</horizontal_fov>
                <image>
                    <width>1200</width>
                    <height>480</height>
                    <format>R8G8B8</format>
                </image>
                <clip>
                    <near>0.03</near>
                    <far>100</far>
                </clip>
            </camera>
            <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
              <alwaysOn>true</alwaysOn>
              <updateRate>30.0</updateRate>
              <cameraName>camera</cameraName>
              <frameName>camera_rgb_optical_frame</frameName>
              <imageTopicName>rgb/image_raw</imageTopicName>
              <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
              <hackBaseline>0.07</hackBaseline>
              <distortionK1>0.0</distortionK1>
              <distortionK2>0.0</distortionK2>
              <distortionK3>0.0</distortionK3>
              <distortionT1>0.0</distortionT1>
              <distortionT2>0.0</distortionT2>
            </plugin>
          </sensor>
        </gazebo>
    <!-- </xacro:macro>-->
    </robot>
    

    and that's my bringup_player.launch:

    <launch>
      <arg name="model" default="$(env TURTLEBOT3_MODEL)" doc="model type [burger, waffle, waffle_pi]"/>
      <arg name="player_name" default = "p_amartinho"/>
      <arg name="x_pos" default="0.0"/>
      <arg name="y_pos" default="-1.5"/>
      <arg name="z_pos" default="0.0"/>
    
      <group ns= "$(arg player_name)">
          <param name="tf_prefix" value="$(arg player_name)"></param>
          <param name="robot_description" command="$(find xacro)/xacro --inorder $(find p_amartinho_description)/urdf/player.urdf.xacro" />
    
          <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher"/>
      </group>
    <!--   include do spawn.launch -->
      <include file="$(find p_amartinho_bringup)/launch/spawn.launch">
        <arg name="player_name" value="$(arg player_name)"/>
        <arg name="x_pos" value="$(arg x_pos)"/>
        <arg name="y_pos" value="$(arg y_pos)"/>
        <arg name="z_pos" value="$(arg z_pos)"/>
      </include>
    
    <!--       include do visualize.launch -->
        <include file="$(find p_amartinho_bringup)/launch/visualize.launch">
        </include>
    
        <!--     include do teleop.launch -->
    <!--     <group ns= "$(arg player_name)"> -->
    <!--         <node pkg="rqt_robot_steering" type="rqt_robot_steering" name="teleop_rqt" output="screem"> -->
    <!--             <param name="default_topic" value="/$(arg player_name)/cmd.vel"></param> -->
    <!--         </node> -->
    
    <!--     </group> -->
    </launch>
    

    and I'm launching it this way:

    roslaunch p_amartinho_bringup bringup_player.launch player_color:=red
    

    the result is:

    Captura de ecrã de 2022-01-21 09-25-39
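    One detail worth checking, offered as a guess from the files above: bringup_player.launch never declares a player_color argument, and the xacro command does not forward one, so player_color:=red on the roslaunch command line has nothing to bind to (also, Gazebo material names are capitalized, e.g. Gazebo/Red). A sketch of how the argument could be forwarded:

```xml
<!-- Sketch only: declare the argument and pass it through to xacro. -->
<arg name="player_color" default="Red"/>
<param name="robot_description"
       command="$(find xacro)/xacro --inorder $(find p_amartinho_description)/urdf/player.urdf.xacro player_color:=$(arg player_color)"/>
```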

    opened by Analtino2021 9
  • Error: Type <class 'geometry_msgs.msg._PoseStamped.PoseStamped'> if not loaded or supported

    Good afternoon,

    I was trying to subscribe to the topic '/move_base_simple/goal', which carries a geometry_msgs.msg PoseStamped message. Then I tried to transform the received message into a new target frame, but it gives an error saying that the type "geometry_msgs.msg._PoseStamped.PoseStamped" is not supported.

    I don't understand why it is converting the type from "PoseStamped" to "_PoseStamped.PoseStamped".

    Reading the message actually works; I can print it and see that everything is okay, but when I try to do the transformation it just doesn't recognize the type of the message.

    Thanks for your help!

    Martin
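    For future readers, a common cause of this exact error (an assumption here, since the code is not shown): tf2 can only transform a message type after the matching conversion module has been imported. A minimal sketch, with the target frame name made up for the example:

```python
import rospy
import tf2_ros
# This import registers PoseStamped (and other geometry_msgs types) with tf2.
# Without it, buffer.transform() fails with "Type ... if not loaded or supported".
import tf2_geometry_msgs  # noqa: F401
from geometry_msgs.msg import PoseStamped

rospy.init_node('goal_transformer')
buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buffer)

def goal_callback(msg):
    # 'odom' is an illustrative target frame, not taken from the issue.
    transformed = buffer.transform(msg, 'odom', rospy.Duration(1.0))
    rospy.loginfo(transformed)

rospy.Subscriber('/move_base_simple/goal', PoseStamped, goal_callback)
rospy.spin()
```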

    opened by MartinRivadeneira 8
  • TP1 Distribution

    Hello @miguelriemoliveira,

    in the TP1 group distribution that was released, my NMecAluno is switched with some other student's... Maybe not only mine?

    Also, it would be helpful if everyone commented on this issue so we could more easily find our group partners.

    Thanks all

    opened by joao-pm-santos96 7
  • Lidar points to image

    Good night @miguelriemoliveira and @danifpdra,

    I am having a problem understanding how to project the lidar points onto the image. So far, I have found my extrinsic and intrinsic values: the intrinsics from camera_info and the extrinsics from the URDF.

    This gave me these values:

            self.cameraIntrinsic = np.array([[1060.9338813153618, 0.0, 640.5, -74.26537169207533],
                                            [0.0, 1060.9338813153618, 360.5, 0.0],
                                            [0.0, 0.0, 1.0, 0.0]])
            # camera extrinsic from the lidar to the camera (back and front have different extrinsic values )
            self.cameraExtrinsicFront = np.array([[1.0, 0.0, 0.0, -0.073],
                                                 [0.0, 1.0, 0.0, 0.011],
                                                 [0.0, 0.0, 1.0, -0.084],
                                                 [0.0, 0.0, 0.0, 1.0]])
    
            self.cameraExtrinsicBack = np.array([[-1.0, 0.0, 0.0, 0.2],
                                                [0.0, -1.0, 0.0, 0.011],
                                                [0.0, 0.0, 1.0, -0.084],
                                                [0.0, 0.0, 0.0, 1.0]])
    

    I also computed the coordinates from the lidar, the way we did in class 10:

    ...
            z = 0
            for idx, range in enumerate(msg.ranges):
                theta = msg.angle_min + idx * msg.angle_increment
                x = range * math.cos(theta)
                y = range * math.sin(theta)
                self.points.append([x, y, z, 0])
    ....
    

    When I multiply these values I get results on the order of 2000 to 2300 most of the time... (my camera has a width of 1280 and a height of 720). I already tried assuming the coordinates are measured from the middle and subtracting 1280/2 and 720/2, but I still got no usable results. I may also have the extrinsic matrix wrong, but I don't know.

    I'm probably thinking about this wrong, but I cannot figure it out.
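    As a general note (a sketch, not a diagnosis of this particular setup): pixel coordinates only come out of a pinhole projection after dividing by the homogeneous coordinate; skipping that division typically produces values far outside the image. With an illustrative 3x4 projection matrix:

```python
import numpy as np

# Illustrative intrinsics (fx, fy, cx, cy are made up for the example).
K = np.array([[1060.9, 0.0, 640.5],
              [0.0, 1060.9, 360.5],
              [0.0, 0.0, 1.0]])
P = np.hstack([K, np.zeros((3, 1))])  # 3x4 projection, identity extrinsics

def project(point_xyz):
    """Project a 3D point (camera frame, z forward) to pixel coordinates."""
    p = P @ np.append(point_xyz, 1.0)  # homogeneous 3-vector [u*w, v*w, w]
    return p[:2] / p[2]                # divide by w to get (u, v) in pixels

u, v = project([0.0, 0.0, 2.0])  # a point straight ahead lands at the center
```

    The same check applies with a real extrinsic matrix in front: only after the division should the (u, v) values be compared against the 1280x720 image bounds.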

    opened by Sarxell 6
  • Robot stationary relative to the RViz grid

    Good morning Professor,

    Since I added the launch file for gmapping, my robot in RViz does not move; that is, everything else (the map and the point cloud) moves relative to the robot, but it stays fixed at the same point. Any idea what may have caused this and how to solve it?

    Screenshot from 2022-01-15 11-23-35

    Screenshot from 2022-01-15 11-23-12

    Thank you

    opened by josepedropinto-pt 6
  • ROS Installation

    Good evening, I am installing ROS Noetic (I am late, I know, but I had many problems...) following the tutorial on the wiki, but I have a problem (see attachment) when I use the sudo apt install ros-noetic-desktop-full command. Has any of you had this problem? Before asking on GitHub I searched a lot on Google but still couldn't solve it. Could you help me, please? terminalError

    opened by GabrieleMicheli 6
  • Part 7 key duplication

    Hello!

    Just a notice: in Part 7 (TP2), the key 'c' is mentioned as the one that should be used for clearing the canvas but, further down, it is also the key for drawing a circle.

    Is this intended or can we use other key for the circle (like 'e' for ellipse)?

    opened by joao-pm-santos96 6
  • TF sync issue

    Hi @miguelriemoliveira, @danifpdra

    We are trying to transform a pose between two existing and connected frames. Yet the following error always occurs.

    tf2.ExtrapolationException: Lookup would require extrapolation 0.003000000s into the past. Requested time 1126.316000000 but the earliest data is at time 1126.319000000, when looking up transform from frame [red1/map] to frame [red1/camera_rgb_optical_frame]

    It's just 3ms... do you know why this small difference is such a big issue?

    @Pinh0 @GoncaloPeixinho @FelicianoSachilombo
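    For reference, a sketch of the two usual workarounds (the frame names mirror the error message; the surrounding code is assumed). The buffer refuses to extrapolate at all by default, so even a 3 ms gap fails; the request simply arrived before the matching TF message did:

```python
import rospy
import tf2_ros

rospy.init_node('tf_lookup_sketch')
buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buffer)
rospy.sleep(1.0)  # give the listener time to fill the buffer

# Option 1: rospy.Time(0) asks for the latest transform available,
# which can never require extrapolation.
t = buffer.lookup_transform('red1/map', 'red1/camera_rgb_optical_frame',
                            rospy.Time(0))

# Option 2: keep the exact stamp but give the buffer time to receive it.
t = buffer.lookup_transform('red1/map', 'red1/camera_rgb_optical_frame',
                            rospy.Time.now(), timeout=rospy.Duration(0.5))
```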

    opened by joao-pm-santos96 5
  • Sensor Fusion

    Good afternoon professor @miguelriemoliveira,

    I am trying to find the transformation matrix between the camera and the lidar; the current matrix is computed as follows:

        self.rpy = (0.0, 0.45, 0.0)
        self.XYZ = (0.073 + 0.064,-0.011,0.584 - 0.122) # Camera joint to base_scan transformation
    
        K = np.array([[373.2663253873802, 0.0, 640.5], [0.0, 373.2663253873802, 240.5], [0.0, 0.0, 1.0]])
        C_T_L = transf.Rotation.from_rotvec(np.asarray(self.rpy)).as_matrix()
        C_T_L_trans = np.vstack(self.XYZ)
    
        P = np.dot(K, C_T_L)
        t = np.dot(K, C_T_L_trans)
    

    Furthermore, I created some restrictions, namely that the lidar points cannot be infinite and that computed points are only kept if they fall within the height and width of the camera image. However, when I visualize these points, they are not in the correct places. So I would like to ask how I can obtain the transformation matrix through a tf listener; in particular, between which links do I perform the operation?

    Thank you,

    Pedro Carvalho
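    As a sketch of one way to answer the last question (the frame names are assumptions based on the URDF shown earlier in this page): look up the transform between base_scan and camera_rgb_optical_frame with a tf2 listener, then convert it to a 4x4 homogeneous matrix. The conversion itself is pure math:

```python
import numpy as np

def quaternion_to_matrix(x, y, z, w, tx=0.0, ty=0.0, tz=0.0):
    """4x4 homogeneous transform from a unit quaternion and a translation."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w),     tx],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w),     ty],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y), tz],
        [0.0,               0.0,               0.0,               1.0]])

# With a tf2 buffer (not shown here), the lookup would be along these lines:
#   ts = buffer.lookup_transform('camera_rgb_optical_frame', 'base_scan',
#                                rospy.Time(0))
#   q, p = ts.transform.rotation, ts.transform.translation
#   T = quaternion_to_matrix(q.x, q.y, q.z, q.w, p.x, p.y, p.z)

identity = quaternion_to_matrix(0.0, 0.0, 0.0, 1.0)
```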

    opened by PedroCarvalho98 6
  • Lidar clustering

    Good afternoon professor,

    Basically, we are trying to cluster the lidar points; the goal is to create markers only for the walls and to exclude the robots' clusters. We did this through the number of points in each marker. The problem is that it is not publishing only the markers larger than the threshold, it is publishing everything. We already tried count=30, =300, or even =10 000 000, so it seems the problem can only be in the code. When I look in RViz, all the clusters have the same color. Any idea how to solve this?

    here is the clustering function:

        thresh = 0.8
        marker_array = MarkerArray()
        marker = self.createMarker(0)
        marker_array.markers.append(marker)
    
        for idx, (range1, range2) in enumerate(zip(msgScan.ranges[:-1], msgScan.ranges[1:])):
            if range1 < 0.1 or range2 < 0.1:
                continue
    
            diff = abs(range2 - range1)
    
            if range1 > 3.5 or range2 > 3.5:
                marker = self.createMarker(idx + 1)
                marker_array.markers.append(marker)
                continue
    
            if diff > thresh:
                marker = self.createMarker(idx + 1)
    
            theta = msgScan.angle_min + idx * msgScan.angle_increment
            x = range1 * cos(theta)
            y = range1 * sin(theta)
    
            point = Point(x=x, y=y, z=0)
            
            last_marker = marker
            last_marker.points.append(point)
            count = len(last_marker.points)
    
            if count > 2000:
                marker_array.markers.append(last_marker)
    
        rospy.loginfo(f'publishing a marker array with {len(marker_array.markers)} markers')
    
        # for marker in marker_array.markers:
        #     if len(marker.points) == 0:
        #         marker_array.markers.remove(marker)
    
    self.clustering_lidar.publish(marker_array)
    

    and here is the function to create the marker:

    def createMarker(self, id):
        marker = Marker()
        marker.id = id
        marker.header.frame_id = self.name + '/base_scan'
        marker.type = marker.CUBE_LIST
        marker.action = marker.ADD
        marker.scale.x = 0.05
        marker.scale.y = 0.05
        marker.scale.z = 0.05
        marker.color.r = random.random()
        marker.color.g = random.random()
        marker.color.b = random.random()
        marker.color.a = 1.0
        marker.pose.orientation.w = 1.0
        marker.pose.position.x = 0
        marker.pose.position.y = 0
        marker.pose.position.z = 0

        return marker
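    As an aside, the splitting rule can be tested in isolation from ROS (a minimal sketch with plain lists; the threshold and minimum size are illustrative): start a new cluster whenever consecutive readings jump by more than the threshold, and only keep clusters with enough points:

```python
def cluster_ranges(ranges, thresh=0.8, min_size=3):
    """Split consecutive range readings into clusters and keep only
    clusters with at least min_size points."""
    clusters = [[ranges[0]]]
    for r1, r2 in zip(ranges[:-1], ranges[1:]):
        if abs(r2 - r1) > thresh:
            clusters.append([])      # jump detected: start a new cluster
        clusters[-1].append(r2)
    return [c for c in clusters if len(c) >= min_size]

walls = cluster_ranges([2.0, 2.1, 2.2, 0.5, 0.6, 2.9, 3.0, 3.1, 3.2])
```

    One thing worth double-checking in the posted callback: markers created in the diff > thresh branch are only appended to marker_array once count > 2000, and after that the same marker is re-appended on every subsequent iteration.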
    
    opened by josepedropinto-pt 1
  • Ros Navigation

    I followed a ROS Navigation tutorial with the turtlebot3 (https://emanual.robotis.com/docs/en/platform/turtlebot3/nav_simulation/) and I am trying to adapt it to my robot's code. I managed to make it recognize the 2D pose estimate, but the Nav goal is not being recognized so the path planner can run. There are no errors or warnings in the terminal, but it shows the INFO message "Requesting the map...". I had previously done the SLAM mapping and saved the generated map. I already checked whether the launch files are fetching that map correctly, and that part seems to be ok. I don't know where else the problem could be...

    @miguelriemoliveira @danifpdra
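    For reference, a sketch of one common cause (the file names and package are assumptions): move_base's costmap blocks on "Requesting the map..." until a map_server is publishing /map, so it is worth confirming one is launched and not hidden inside a robot namespace:

```xml
<!-- Sketch only: launch a map_server with the previously saved map. -->
<arg name="map_file" default="$(find my_robot_bringup)/maps/map.yaml"/>
<node pkg="map_server" name="map_server" type="map_server" args="$(arg map_file)"/>
```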

    opened by ufesvinicius 1
  • Textures in gazebo

    Good afternoon @miguelriemoliveira,

    I'm having a minor issue with the textures in Gazebo. Some of my textures don't load properly; I had this problem before, but it was never an issue. Right now, when I load arena1, I sometimes get the following texture: Screenshot from 2022-02-22 15-35-24

    This is starting to be an issue because the green is too close to the vehicle, which influences the camera mask and the centroid.

    opened by Sarxell 2
  • ImportError: cannot import name '**' from '**' (unknown location)

    Good afternoon!

    Today I was trying to develop some functionality, and for that I cloned some ROS packages from the internet, but some of them weren't working and caused problems. In the end, I erased all those packages, but when I tried to run my code I got a new error.

    I developed a ROS msg called "detection_info.msg"; it is recognized by rosmsg list, and the catkin_ws also compiles when I run catkin_make, but then, when I try to run the node that uses this message, it shows the following error: image

    I've been trying to solve this, but I really don't know how. The message is also recognized by PyCharm: image

    I don't know if the problem has something to do with the fact that I tried to compile some packages that weren't working, but now it does compile and still doesn't recognize the message.

    Thank you for your help!

    Martin

    opened by MartinRivadeneira 3
  • Problem launching multiple robots

    Problem launching multiple robots

    Since the last class, after implementing the different colors for the robots, I started getting this error. Thinking it was my code, I tried the professor's updated code, but when I try to launch the bringup it gives me the same error. That is, even with the professor's code I get the same error. Captura de ecrã de 2022-02-15 16-58-48
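Without the error text it is hard to be sure, but a frequent cause when several robots share one simulation is colliding node names and TF frames. A hedged sketch of isolating each robot in its own namespace; the package, launch file, argument, and robot names are all placeholders for the actual course code:

```xml
<launch>
  <!-- One namespaced group per robot so node names and TF frames do not collide. -->
  <group ns="red1">
    <param name="tf_prefix" value="red1"/>
    <include file="$(find p_bringup)/launch/bringup.launch">
      <arg name="player_name" value="red1"/>
    </include>
  </group>
  <group ns="green1">
    <param name="tf_prefix" value="green1"/>
    <include file="$(find p_bringup)/launch/bringup.launch">
      <arg name="player_name" value="green1"/>
    </include>
  </group>
</launch>
```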

    opened by PedroRibeiro37 10
Owner
Miguel Riem de Oliveira
Miguel Riem de Oliveira
Samples for robotics, node, python, and bash

RaspberryPi Robot Project. Currently designed to act as a programmable sentry.

Martin George 1 May 31, 2022
Real-time Coastal Monitoring at the University of Hawaii at Manoa

Coastal Monitoring at the University of Manoa Source code for Beaglebone/RPi-based data loggers, shore internet gateways, and web server. Software dev

Stanley Lio 7 Dec 7, 2021
A script that publishes power usage data of iDrac enabled servers to an MQTT broker for integration into automation and power monitoring systems

iDracPowerMonitorMQTT This script publishes iDrac power draw data for iDrac 6 enabled servers to an MQTT broker. This can be used to integrate the pow

Lucas Zanchetta 10 Oct 6, 2022
Example Python code for building RPi-controlled robotic systems

RPi Example Code Example Python code for building RPi-controlled robotic systems These python files have been compiled / developed by the Neurobionics

Elliott Rouse 2 Feb 4, 2022
Used python functional programming to make this Ai assistant

Python-based-AI-Assistant I have used python functional programming to make this Ai assistant. Inspiration of project : we have seen in our daily life

Durgesh Kumar 2 Dec 26, 2021
Point Density-Aware Voxels for LiDAR 3D Object Detection (CVPR 2022)

PDV PDV is LiDAR 3D object detection method. This repository is based off [OpenPCDet]. Point Density-Aware Voxels for LiDAR 3D Object Detection Jordan

Toronto Robotics and AI Laboratory 114 Dec 21, 2022
Lenovo Legion 5 Pro 2021 Linux RGB Keyboard Light Controller

Lenovo Legion 5 Pro 2021 Linux RGB Keyboard Light Controller This util allows to drive RGB keyboard light on Lenovo Legion 5 Pro 2021 Laptop Requireme

null 36 Dec 16, 2022
2021 Real Robot Challenge Phase2 attemp

Real_Robot_Challenge_Phase2_AE_attemp We(team name:thriftysnipe) are the first place winner of Phase1 in 2021 Real Robot Challenge. Please see this pa

Qiang Wang 2 Nov 15, 2021
Algorithm and Structured Programming course project for the first semester of the Internet Systems course at IFPB

Algorithm and Structured Programming course project for the first semester of the Internet Systems course at IFPB

Gabriel Macaúbas 3 May 21, 2022
Code for the AI lab course 2021/2022 of the University of Verona

AI-Lab Code for the AI lab course 2021/2022 of the University of Verona Set-Up the environment for the curse Download Anaconda for your System. Instal

Davide Corsi 5 Oct 19, 2022
A course-planning, course-map rendering and GPA-calculation web service, designed for the SFU (Simon Fraser University) student.

SFU Course Planner What is the overall goal of the project (i.e. what does it do, or what problem is it solving)? As the title suggests, this project

Ash Peng 1 Oct 21, 2021
Planning Algorithms in AI and Robotics. MSc course at Skoltech Data Science program

Planning Algorithms in AI and Robotics course T2 2021-22 The Planning Algorithms in AI and Robotics course at Skoltech, MS in Data Science, during T2,

Mobile Robotics Lab. at Skoltech 6 Sep 21, 2022
School of Artificial Intelligence at the Nanjing University (NJU)

F-Principle This is an exercise problem of the digital signal processing (DSP) course at School of Artificial Intelligence at the Nanjing University (

Thyrix 5 Nov 23, 2022
Jittor Medical Segmentation Lib -- The assignment of Pattern Recognition course (2021 Spring) in Tsinghua University

THU模式识别2021春 -- Jittor 医学图像分割 模型列表 本仓库收录了课程作业中同学们采用jittor框架实现的如下模型: UNet SegNet DeepLab V2 DANet EANet HarDNet及其改动HarDNet_alter PSPNet OCNet OCRNet DL

null 48 Dec 26, 2022
Code for Private Recommender Systems: How Can Users Build Their Own Fair Recommender Systems without Log Data? (SDM 2022)

Private Recommender Systems: How Can Users Build Their Own Fair Recommender Systems without Log Data? (SDM 2022) We consider how a user of a web servi

joisino 20 Aug 21, 2022
Group P-11's submission for the University of Waterloo's 2021 Engineering Competition (Programming section).

P-11-WEC2021 Group P-11's submission for the University of Waterloo's 2021 Engineering Competition (Programming section). Part I Compute typing time f

TRISTAN PARRY 1 May 14, 2022
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2020

Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2020

Phillip Lippe 1.1k Jan 7, 2023
This is a repository for the Duke University Cloud Computing course project on Serverless Data Engineering Pipeline. For this project, I recreated the below pipeline.

AWS Data Engineering Pipeline This is a repository for the Duke University Cloud Computing course project on Serverless Data Engineering Pipeline. For

null 15 Jul 28, 2021
Among AIs is a (prototype of) PC Game we developed as part of the Smart Applications course @ University of Pisa.

Among AIs is a PC Game we developed as part of the Smart Applications course @ Department of Computer Science of University of Pisa, under t

Gabriele Pisciotta 5 Dec 5, 2021
Final Project for the CS238: Decision Making Under Uncertainty course at Stanford University in Autumn '21.

Final Project for the CS238: Decision Making Under Uncertainty course at Stanford University in Autumn '21. We optimized wind turbine placement in a wind farm, subject to wake effects, using Q-learning.

Manasi Sharma 2 Sep 27, 2022