Commit eea8694

Refactor headers in README
1 parent 14e53cc commit eea8694

File tree: 1 file changed (+37 -41 lines)

README.md

Lines changed: 37 additions & 41 deletions
@@ -1,13 +1,12 @@
-## Low Latency Instance Segmentation by Continuous Clustering for Rotating LiDAR Sensors
+# Low Latency Instance Segmentation by Continuous Clustering for Rotating LiDAR Sensors
 
 [![Basic Build Workflow](https://github.com/UniBwTAS/continuous_clustering/actions/workflows/basic-build-ci.yaml/badge.svg?branch=master)](https://github.com/UniBwTAS/continuous_clustering/actions/workflows/basic-build-ci.yaml)
 [![Publish Docker image](https://github.com/UniBwTAS/continuous_clustering/actions/workflows/publish-docker-image.yaml/badge.svg)](https://github.com/UniBwTAS/continuous_clustering/actions/workflows/publish-docker-image.yaml)
+[![arXiv](https://img.shields.io/badge/arXiv-2311.13976-b31b1b.svg)](https://arxiv.org/abs/2311.13976)
 
-[![forthebadge](https://forthebadge.com/images/badges/made-with-c-plus-plus.svg)](https://forthebadge.com)
+![Continuous Clustering Demo](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo.gif)
 
-![](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo.gif)
-
-### Abstract:
+## Abstract:
 
 Low-latency instance segmentation of LiDAR point clouds is crucial in real-world applications because it serves as an
 initial and frequently-used building block in a robot's perception pipeline, where every task adds further delay.
@@ -23,7 +22,7 @@ incoming data in real time. We explain the importance of a large perceptive fiel
 evaluate important architectural design choices, which could be relevant to design an architecture for deep learning
 based low-latency instance segmentation.
 
-### If you find our work useful in your research, please consider citing our paper:
+## If you find our work useful in your research, please consider citing our paper:
 
 ```
 @misc{reich2023low,
@@ -38,40 +37,40 @@ based low-latency instance segmentation.
 
 Get PDF [here](https://arxiv.org/abs/2311.13976).
 
-### Acknowledgement
+## Acknowledgement
 
 The authors gratefully acknowledge funding by the Federal Office of Bundeswehr Equipment, Information Technology and In-Service Support (BAAINBw).
 
-## Examples:
+# Examples:
 
-### Works with uncommon mounting positions
+## Works with uncommon mounting positions
 
 We mounted two Ouster OS 32 at a tilted angle in order to get rid of the blind spots of our main LiDAR sensor. Our
 clustering also works with these mounting positions. The main challenge here is the ground point segmentation, not the
 clustering. It works ok, but we hope to improve it in the future.
 
-![](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo_ouster.gif)
+![Clustering with Ouster sensor](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo_ouster.gif)
 
-### Works with Fog
+## Works with Fog
 
 There are many clutter points and the camera image is almost useless. But the clustering still works quite well after
 filtering potential fog points.
 
-![](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo_fog.gif)
+![Clustering with clutter points from fog](https://github.com/UniBwTAS/continuous_clustering/blob/master/assets/demo_fog.gif)
 
-### Works on German Highway
+## Works on German Highway
 
 There are often no speed limits on the German Highway. So it is not uncommon to see cars with velocities of 180 km/h or
 much higher. A latency of e.g. 200ms leads to positional errors of `(180 / 3.6) m/s * 0.2s = 10m`. This shows the need
 to keep latencies at a minimum.
 
-[![IMAGE ALT TEXT HERE](https://user-images.githubusercontent.com/74038190/235294007-de441046-823e-4eff-89bf-d4df52858b65.gif)](https://www.youtube.com/watch?v=DZKuAQBngNE&t=98s)
+[![Video GIF](https://user-images.githubusercontent.com/74038190/235294007-de441046-823e-4eff-89bf-d4df52858b65.gif)](https://www.youtube.com/watch?v=DZKuAQBngNE&t=98s)
 
-## Run it yourself:
+# Run it yourself:
 
-### Download Sensor Data
+## Download Sensor Data
 
-#### SemanticKitti
+### SemanticKitti
 
 We use the same folder structure as the SemanticKitti dataset:
 
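A quick aside on the highway example above: the quoted figure is simply speed multiplied by latency. A minimal C++ sketch of that calculation (illustrative only, not code from this repository):

```cpp
#include <iostream>

// Positional error accumulated while a measurement is delayed by `latency_s`:
// an object moving at `speed_kmh` travels speed * latency in the meantime.
double positional_error_m(double speed_kmh, double latency_s)
{
    const double speed_ms = speed_kmh / 3.6; // km/h -> m/s
    return speed_ms * latency_s;
}

int main()
{
    // 180 km/h at 200 ms latency -> 10 m, as stated in the README example.
    std::cout << positional_error_m(180.0, 0.2) << " m\n";
    return 0;
}
```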
@@ -93,7 +92,7 @@ curl -s https://raw.githubusercontent.com/UniBwTAS/continuous_clustering/master/
 export KITTI_SEQUENCES_PATH="$(pwd)/kitti_odometry/dataset/sequences"
 ```
 
-#### Rosbag of our test vehicle VW Touareg
+### Rosbag of our test vehicle VW Touareg
 
 Download the rosbag:
 
@@ -105,11 +104,8 @@ export ROSBAG_PATH=$(pwd)
 
 Alternatively download it manually from
 our [Google Drive](https://drive.google.com/file/d/1zM4xPRahgxdJXJGHNXYUpM_g4-9UrcwC/view?usp=sharing) and set the
-environment variable `ROSBAG_PATH` to download folder:
+environment variable `ROSBAG_PATH` accordingly: `export ROSBAG_PATH=/parent/folder/of/rosbag`
 
-```bash
-export ROSBAG_PATH=/download/folder/of/rosbag/file
-```
 Available bags:
 - `gdown 1zM4xPRahgxdJXJGHNXYUpM_g4-9UrcwC` (3.9GB, [Manual Download](https://drive.google.com/file/d/1zM4xPRahgxdJXJGHNXYUpM_g4-9UrcwC/view?usp=sharing))
   - Long recording in urban scenario (no camera to reduce file size, no Ouster sensors)
@@ -118,9 +114,9 @@ Available bags:
 - `gdown 146IaBdEmkfBWdIgGV5HzrEYDTol84a1H` (0.7GB, [Manual Download](https://drive.google.com/file/d/146IaBdEmkfBWdIgGV5HzrEYDTol84a1H/view?usp=sharing))
   - Short recording of German Highway (blurred camera for privacy reasons)
 
-### Setup Environment
+## Setup Environment
 
-#### Option 1: Docker + GUI (VNC):
+### Option 1: Docker + GUI (VNC):
 
 This option is the fastest to set up. However, due to missing hardware acceleration in the VNC Docker container for RVIZ,
 the rosbag is played at 1/10 speed.
@@ -138,7 +134,7 @@ docker run -d -p 6080:80 -v /dev/shm:/dev/shm -v ${KITTI_SEQUENCES_PATH}:/mnt/ki
 6. Continue with step "Run Continuous Clustering" (see below) in the terminal opened in step 2. (There you can use the
    clipboard feature of noVNC; tiny arrow on the left of the screen)
 
-#### Option 2: Locally on Ubuntu 20.04 (Focal) and ROS Noetic
+### Option 2: Locally on Ubuntu 20.04 (Focal) and ROS Noetic
 
 ```bash
 # install ROS (if not already installed)
@@ -158,7 +154,7 @@ bash /tmp/clone_repositories_and_install_dependencies.sh
 catkin build
 ```
 
-### Run Continuous Clustering
+## Run Continuous Clustering
 
 ```bash
 # run on kitti odometry dataset
@@ -177,7 +173,7 @@ between two transforms. The size of a slice depends on the update rate of the tr
 to batches/slices of 1/5 rotation for a LiDAR rotating with 10Hz). So for a nice visualization where the columns are
 published one by one, like in the GIF at the top of the page, you should disable this flag.
 
-## Evaluation on SemanticKITTI Dataset
+# Evaluation on SemanticKITTI Dataset
 
 We evaluate our clustering algorithm with the same metrics as described in the paper _TRAVEL: Traversable Ground and
 Above-Ground Object Segmentation Using Graph Representation of 3D LiDAR
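Regarding the slice-size remark at the beginning of the hunk above: if a slice is published whenever a new transform arrives, the fraction of a rotation covered by one slice is simply the ratio of the LiDAR rotation rate to the transform update rate. A small C++ sketch under that assumption (function and parameter names are illustrative, not taken from the code base):

```cpp
#include <iostream>

// Fraction of a full LiDAR rotation covered by one published slice when the
// slice boundaries are given by consecutive transform updates.
double slice_fraction_of_rotation(double lidar_rotation_hz, double transform_rate_hz)
{
    return lidar_rotation_hz / transform_rate_hz;
}

int main()
{
    // 10 Hz LiDAR with a 50 Hz transform -> 1/5 of a rotation per slice,
    // matching the example in the README.
    std::cout << slice_fraction_of_rotation(10.0, 50.0) << '\n'; // prints 0.2
    return 0;
}
```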
@@ -186,12 +182,12 @@ Scans_ ([arXiv](https://arxiv.org/abs/2206.03190), [GitHub](https://github.com/u
 Under-Segmentation Entropy (USE) for clustering performance and precision / recall / accuracy / F1-Score for ground
 point segmentation.
 
-### Results
+## Results
 
 The following results were obtained at Commit
 SHA [fa3c53b](https://github.com/UniBwTAS/continuous_clustering/commit/fa3c53bab51975b06ae5ec3a9e56567729149e4f)
 
-#### Clustering
+### Clustering
 
 | Sequence | USE μ ↓ / σ ↓ | OSE μ ↓ / σ ↓ |
 | :---: | :---: | :---: |
@@ -209,7 +205,7 @@ SHA [fa3c53b](https://github.com/UniBwTAS/continuous_clustering/commit/fa3c53bab
 | 9 | 18.45 / 6.25 | 39.62 / 11.86 |
 | 10 | 20.10 / 8.70 | 34.33 / 12.37 |
 
-#### Ground Point Segmentation:
+### Ground Point Segmentation:
 
 | Sequence | Recall μ ↑ / σ ↓ | Precision μ ↑ / σ ↓ | F1-Score μ ↑ / σ ↓ | Accuracy μ ↑ / σ ↓ |
 | :---: | :---: | :---: | :---: | :---: |
@@ -227,14 +223,14 @@ SHA [fa3c53b](https://github.com/UniBwTAS/continuous_clustering/commit/fa3c53bab
 | 9 | 95.31 / 4.03 | 88.22 / 5.70 | 91.45 / 3.37 | 91.74 / 3.20 |
 | 10 | 91.62 / 6.79 | 85.76 / 7.22 | 88.33 / 5.45 | 91.83 / 3.63 |
 
-### Download/Generate Ground Truth Data
+## Download/Generate Ground Truth Data
 
 In order to evaluate OSE and USE for clustering performance, additional labels are required, which are generated from
 the SemanticKITTI labels using euclidean distance-based clustering.
 See [Issue](https://github.com/url-kaist/TRAVEL/issues/6) in the TRAVEL GitHub repository
 and [src/evaluation/kitti_evaluation.cpp](src/evaluation/kitti_evaluation.cpp) for more details.
 
-#### Option 1: Download pre-generated labels
+### Option 1: Download pre-generated labels
 
 ```bash
 cd /tmp
@@ -247,7 +243,7 @@ Alternatively download it manually from
 our [Google Drive](https://drive.google.com/file/d/1MOfLbUQcwRMLhRca0bxJMLVriU3G8Tg3/view?usp=sharing) and unzip it to
 the correct location (in parent directory of `dataset` folder).
 
-#### Option 2: Generate with GUI & ROS setup (assumes prepared ROS setup, see above, useful for debugging etc.)
+### Option 2: Generate with GUI & ROS setup (assumes prepared ROS setup, see above, useful for debugging etc.)
 
 Generate labels, which are saved to `${KITTI_SEQUENCES_PATH}/<sequence>/labels_euclidean_clustering/`.
 If you want to visualize the generated ground truth labels in ROS, then remove the `--no-ros` flag and use just one
@@ -257,7 +253,7 @@ thread (default).
 rosrun continuous_clustering gt_label_generator_tool ${KITTI_SEQUENCES_PATH} --no-ros --num-threads 8
 ```
 
-#### Option 3: Generate without GUI or ROS within Minimal Docker Container
+### Option 3: Generate without GUI or ROS within Minimal Docker Container
 
 ```bash
 # build docker image
@@ -272,9 +268,9 @@ docker run --rm -v ${KITTI_SEQUENCES_PATH}:/mnt/kitti_sequences --name build_no_
 docker stop build_no_ros
 ```
 
-### Run Evaluation
+## Run Evaluation
 
-#### Option 1: Evaluate with GUI & ROS setup (assumes prepared ROS setup, see above, useful for debugging)
+### Option 1: Evaluate with GUI & ROS setup (assumes prepared ROS setup, see above, useful for debugging)
 
 ```bash
 # run evaluation slowly with visual output
@@ -285,7 +281,7 @@ roslaunch continuous_clustering demo_kitti_folder.launch path:=${KITTI_SEQUENCES
 roslaunch continuous_clustering demo_kitti_folder.launch path:=${KITTI_SEQUENCES_PATH} evaluate-fast:=true
 ```
 
-#### Option 2: Evaluate without GUI or ROS within Minimal Docker Container
+### Option 2: Evaluate without GUI or ROS within Minimal Docker Container
 
 ```bash
 # build docker image (if not already done)
@@ -299,18 +295,18 @@ docker run --rm -v ${KITTI_SEQUENCES_PATH}:/mnt/kitti_sequences --name build_no_
 docker stop build_no_ros
 ```
 
-## Tips for Rviz Visualization:
+# Tips for Rviz Visualization:
 
 TODO
 
-## Info about our LiDAR Drivers
+# Info about our LiDAR Drivers
 
 Our clustering algorithm is able to process the UDP packets from the LiDAR sensor, so the firings can be processed
 immediately. We directly process the raw UDP packets from the corresponding sensor, which makes the input
 manufacturer/sensor specific. Luckily, we can use external libraries, so it is not necessary to reimplement the decoding
 part (UDP Packet -> Euclidean Point Positions). Currently, we support the following sensor manufacturers:
 
-### Velodyne
+## Velodyne
 
 - Tested Sensors: VLS 128
 - All other rotating Velodyne LiDARs should work, too
@@ -340,7 +336,7 @@ part (UDP Packet -> Euclidean Point Positions). Currently, we support following
   from [ros-drivers/velodyne](https://github.com/ros-drivers/velodyne) to decode packets to euclidean points
 - See source code at: [velodyne_input.hpp](include/continuous_clustering/ros/velodyne_input.hpp)
 
-### Ouster
+## Ouster
 
 - Tested Sensors: OS 32
 - All other rotating Ouster LiDARs should work, too
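To illustrate the decoding step mentioned in the LiDAR driver section above: the manufacturer-specific part is parsing the UDP packet layout; once a firing's range, azimuth, and ring elevation are known, the conversion to Euclidean coordinates is generic. A hypothetical, library-agnostic sketch of that last step (all types and names are invented for illustration and do not reflect this repository's or any driver's API):

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// Hypothetical decoded firing: one range measurement of one laser ring.
struct Firing
{
    double range_m;       // measured distance
    double azimuth_rad;   // horizontal angle of the rotating head
    double elevation_rad; // fixed vertical angle of this ring (from calibration)
};

struct PointXYZ
{
    double x, y, z;
};

// Generic spherical-to-Cartesian conversion performed after the
// manufacturer-specific packet parsing.
PointXYZ toCartesian(const Firing& f)
{
    const double xy = f.range_m * std::cos(f.elevation_rad);
    return {xy * std::cos(f.azimuth_rad), xy * std::sin(f.azimuth_rad),
            f.range_m * std::sin(f.elevation_rad)};
}

int main()
{
    const double kPi = 3.14159265358979323846;
    const std::vector<Firing> firings = {{10.0, 0.0, 0.0}, {5.0, kPi / 2.0, 0.1}};
    for (const Firing& f : firings)
    {
        const PointXYZ p = toCartesian(f);
        std::cout << p.x << ' ' << p.y << ' ' << p.z << '\n';
    }
    return 0;
}
```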
