# Style-Guide & Clean Code

## ROS Packages

Although ROS packages can be compiled as long as they meet the catkin requirements (a `src` folder, `CMakeLists.txt`, and `package.xml`), it is recommended to structure them in the following way (example for the `rbd_controller` package):

* **`config`**
  * `default.yaml` Parameters for the parameter server
  * `config.rviz` Configuration file for Rviz
  * `<others>.yaml, <others>.rviz` Additional configuration files
* **`include`**
  * **`rbd_controller`**
    * `RbdController.hpp` Header file for the controller class
* **`launch`**
  * `rbd_controller.launch` Launch file for the controller node
  * `<others>.launch` Other launch files
* **`src`**
  * `RbdController.cpp` Implementation of the controller class and the corresponding subscribers, publishers, and service servers/clients
  * `rbd_controller_node.cpp` Main, initializes the controller class and creates the `ros::NodeHandle`
* `CMakeLists.txt` Dependencies and build configuration
* `LICENSE` MIT license
* `README.md`
* `package.xml` Package manifest: name, version, and dependencies
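
To show how these pieces fit together, a launch file such as `launch/rbd_controller.launch` would typically load `config/default.yaml` onto the parameter server and start the node built from `rbd_controller_node.cpp`. The file names follow the structure above; the exact contents are a sketch, not the actual project file:

```xml
<launch>
  <!-- Start the controller node from the rbd_controller package -->
  <node name="rbd_controller" pkg="rbd_controller" type="rbd_controller_node" output="screen">
    <!-- Load the default parameters into the node's private namespace -->
    <rosparam command="load" file="$(find rbd_controller)/config/default.yaml" />
  </node>
</launch>
```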

## Coordinate Frames

The coordinate frames of the robot are defined as follows:

<figure><img src="https://1981574695-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FsmNbOWbT7W8uswCfgHan%2Fuploads%2F75MZEVwUC1WFTZtqo5A7%2Fimage.png?alt=media&#x26;token=87f0d56a-aae7-40c7-8db5-62bcf435a039" alt="" width="375"><figcaption></figcaption></figure>

According to [REP 105](https://www.ros.org/reps/rep-0105.html), the robot's base frame is named `base_link`. All of the robot's subsequent coordinate frames are defined with respect to this `base_link`. As a result, the measurements of the LiDAR sensor and the stereo camera are automatically transformed correctly.

<figure><img src="https://1981574695-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FsmNbOWbT7W8uswCfgHan%2Fuploads%2FOqFiPPoAAB61j4ZLw6rx%2Fcoordinate_systems.png?alt=media&#x26;token=64f955da-374f-48a2-8a03-823580f0e7d1" alt=""><figcaption></figcaption></figure>

The coordinate frames used for SLAM are named `odom` and `map` (as defined in [REP 105](https://www.ros.org/reps/rep-0105.html)). The `odom` frame is used to calculate continuous odometry based on the IMU and the stereo camera. The SLAM algorithm periodically corrects this estimate with absolute measurements from the LiDAR sensor. Because the `base_link` pose is ultimately expressed with respect to `map`, the accumulated drift (the deviation between the `odom` and `map` frames) is eliminated.

<figure><img src="https://1981574695-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FsmNbOWbT7W8uswCfgHan%2Fuploads%2FLFHb8O6EMxLC1q5QynAQ%2Flidar_map.png?alt=media&#x26;token=74339c26-9cc5-45e6-b8c6-4a7d8502a6d8" alt=""><figcaption><p>The SLAM algorithm creates a map (gray/black) based on the LaserScan (white) to mitigate the error of the relative state estimation (<code>base_link</code> -> <code>odom</code>). The robot's pose (red arrow) is defined with respect to the <code>map</code> frame.</p></figcaption></figure>
