
Drive Crew : Pipeline Operator Training

Created by Rowan Dempster, last modified by Anita Hu on Jan 26, 2020

Pipeline Operations covers everything that must be done in order to launch autonomous mode. The operator, usually in the passenger seat, ensures that systems are functional and in a good state to go autonomous. This document provides some basic knowledge of the vehicle, describes the current runbook, and contains tips on how to isolate and debug issues.

Primer on the vehicle's environment

The rugged computer in the back of the car runs Ubuntu 16.04 (the LTS release prior to 18.04) on an Intel Xeon. The only differences in ROS Kinetic behaviour between its environment and other installations (such as your own, or the ones in the bay) should be package versioning differences (e.g. different versions of OpenCV, OpenVINO, Python, C++ toolchains, ...) or CPU-dependent throttling of ROS message publishing rates (much more apparent on CPUs weaker than an i7).
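To confirm you're in the expected environment, a quick sanity check like the following helps (a minimal sketch; exact outputs depend on the machine):

lsb_release -ds    # should report Ubuntu 16.04.x LTS
rosversion -d      # should report kinetic
python --version   # compare against the versions in your own setup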

The car's environment has access to a sensor suite, set up via the local network LiDAR-Network, which is available as long as the ethernet cable from the switch is plugged into the correct port of the computer. If the ethernet cable is plugged into a different port, the rugged computer will not receive any sensor data.
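If sensor data seems to be missing, a quick link check is a good first step (a sketch; the sensor IP below is an assumption, not the car's actual value):

ip addr                   # confirm the interface on LiDAR-Network has an address
ping -c 3 192.168.1.201   # hypothetical LiDAR IP; substitute the sensor's actual address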

From a software perspective, we only need to interface with the drivers from within ROS, since we have each sensor's corresponding ROS drivers downloaded and ready to be launched.

The catkin_ws we use is ~/integration. It contains all ROS code, scripts, drivers, and some miscellaneous things you need for the operation of the vehicle. You should inspect ~/.bashrc, where we source ~/.watoalias, which is a list of commands I've aliased and will be referencing in this document.
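For illustration, ~/.watoalias contains entries along these lines (the alias names appear later in this document, but the definitions shown here are assumptions, so check the file for the real ones):

alias novatel_start='roslaunch novatel_span_driver novatel.launch'             # hypothetical definition
alias cameras_start='roslaunch pointgrey_camera_driver camera.launch'          # hypothetical definition
alias lidar_vlp32_start='roslaunch velodyne_pointcloud VLP-32C_points.launch'  # hypothetical definition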

[NOTE:] These drivers (i.e. novatel_span_driver, pointgrey_camera_driver, ...) are usually third-party and in maintenance mode without much active development. Because ROS releases a new, non-backward-compatible version tied to an Ubuntu stable release every year, the community and many of the driver contributors focus mainly on later versions of ROS. That said, if major bugs are found, we can probably expect them to be fixed in the Kinetic drivers as well.

Prior to launching autonomous mode

A pre-flight check is recommended before every new run. All the items below can and should be done in the bay. The main items to ensure are as follows:


[NOTE:] Depending on what the testing session requires, non-bolded steps can be skipped.

Steps to launch autonomous mode

The launch files are on Gitlab, pulled under ~/integration/src/state-machine.

Start the car in autonomous mode: run ./bin/shadow, run prechecks, then run ./bin/master to engage the CAN.
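Putting that together, a typical engagement sequence looks roughly like this (a sketch; the actual prechecks are covered in the runbook above):

cd ~/integration/src/state-machine
./bin/shadow   # run the stack in shadow mode; verify torques and angles look sane
# run the prechecks here before engaging
./bin/master   # engage the CAN and hand control to the autonomous pipeline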

HD Maps

Change the map name in localization/launch/hd_maps.launch to whatever map you want.

Map files are listed in the README of the localization repo.
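For example, to switch maps from the command line (a sketch; the param layout of hd_maps.launch and the map filename are assumptions):

# Hypothetical: point hd_maps.launch at a different .osm map file
sed -i 's|value=".*\.osm"|value="test_track.osm"|' \
    ~/integration/src/localization/launch/hd_maps.launch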

Creating A Custom OSM HD Map

Using the V1 mapper:

Modify prediction/launch/prediction.launch, changing the value of <param name="processing/use_hd_map" type="bool" value="true" /> from "true" to "false",

then run:

novatel_start                          # bring up the GPS/INS driver
rosrun path_planning v1_map_maker.py   # start recording map points
# drive along your desired lane lines, then press ctrl-c to stop recording data
./bin/shadow                           # check torques and angles
./bin/master

Debugging

Knowledge in this area is spread fairly thin, so you may not get an answer on #tech-support. The more you understand how the components of the car interact with each other (both those that came with the Bolt and our additions), and the more creative you are in your problem solving, the more effective you will be.

Make sure you understand the software pipeline doc: https://docs.google.com/document/d/14kDOO9L3YU2OQEs0tR85pUxXs78bYIPO87eHWYtXIGo/edit#heading=h.g756mi2nxmkt

Perception

Launch cameras: cameras_start
Launch lidar: lidar_vlp32_start
How to run the pipeline: roslaunch perception perception-vehicle-exp.launch
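Once these are up, confirm that data is actually flowing before debugging anything downstream (the topic names below are assumptions; use rostopic list to find the real ones):

rostopic list | grep -i -e camera -e velodyne   # find the sensor topics
rostopic hz /velodyne_points                    # hypothetical lidar topic; should tick at the sensor rate
rostopic hz /camera/image_raw                   # hypothetical camera topic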

Common Problems

Problem: No detections from lidar

Problem: No detections from camera

Processing:

TODO SOMEONE ON PROCESSING

Random notes:

Path Planning:

How to run the pipeline: roslaunch path_planning planner_main.launch

Path planning has an RViz config for visualizing the environment. Start rviz, then load the config (File -> Open): ~/integration/src/decision/rviz_publisher/environment.rviz. This RViz instance will also start whenever you run the path planning pipeline (roslaunch path_planning planner_main.launch).
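The config can also be loaded directly from the command line:

rosrun rviz rviz -d ~/integration/src/decision/rviz_publisher/environment.rviz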

Common Problems

Problem: RViz is empty!

Problem: There are lanes in RViz but no car or path, or the car is frozen in RViz. Also, the car isn't moving IRL!

Problem: Path Planning crashes when we start the autonomous pipelines

Problem: X obstacle isn't showing up in RViz!

General Problems

Debug v1 map maker outputs with debug_map_pts integrations/lanes.txt
Sometimes you just need to remember to source devel/setup.bash after you do catkin_make
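For example, a typical rebuild cycle:

cd ~/integration
catkin_make
source devel/setup.bash   # without this, rosrun/roslaunch may not find newly built packages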

Useful UNIX commands

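A few general-purpose commands that tend to come up during on-car debugging (examples only, not an official list):

grep -rn "use_hd_map" ~/integration/src   # find where a parameter is set
tail -f ~/.ros/log/latest/rosout.log      # follow ROS logs as they are written
htop                                      # watch CPU load (relevant to message throttling above)
ip addr                                   # check network interfaces (see LiDAR-Network above)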

Useful ROS commands

rosbag record -O file_name /topic_name_1 /topic_name_2 can include as many topics as you want, separated by spaces. Beware: rosbags with camera data get very large.

rosbag info file_name to get information on what is in the rosbag (e.g. topics, # of messages, etc.)

rosbag play file_name publishes whatever messages were recorded in the rosbag

record_bag -O file_name is an alias that records most of the required bag topics

rqt_graph shows a graph of all nodes and what topics they're publishing

rostopic [list | hz | echo | info]
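Putting a few of these together, a typical record-and-replay session might look like this (the topic names are assumptions):

rosbag record -O test_run /velodyne_points /novatel_data   # ctrl-c to stop recording
rosbag info test_run.bag   # list topics, message counts, duration
rosbag play test_run.bag   # republish the recorded messages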

Videos

Link to Google folder containing the entire training session with Simon. The folder also contains basic explanations of the wiring on the rugged computer that pipeline operators should know:

https://drive.google.com/open?id=1RXpkvCG5VVUxxJ5e7HprabrhkJPTyDo2

Link to Google Folder containing the training session with Anita
https://drive.google.com/drive/folders/1b1wWStGdhKWb0Y73T-SCGPPqD_x9EGp1
