# Installation steps of the pipeline
The installation involves conda.
For Windows you can check: https://stackoverflow.com/questions/3701646/how-to-a
## Running a whole pipeline
In order to run the whole pipeline, run the following command: `./run_all.sh folder`.
It may take a very long time depending on your machine, on the number of files, and on whether you change the point-density parameter (by default it is 150) to some lower number. The lower the number, the bigger the point clouds to be processed and the more time it may take. Keep in mind that at some point (for too low a number) the pipeline may break.
The default model, available in the repo at `fsct\model\model.path`, was trained on the NIBIO data with 1 cm sampling (0.01 m); the validation accuracy was approx. 0.92.
Make sure that you put the data in `*.las` format into this folder. If your files are in a different format, e.g. `*.laz`, you can use `python nibio_preprocessing/convert_files_in_folder.py --input_folder input_folder_name --output_folder output_folder las` to convert them to `*.las` format.
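The format convention above can be checked up front. A minimal sketch (the folder and file names are made up for illustration) that flags files still needing conversion:

```shell
# Toy setup: a folder with one *.las and one *.laz file (names are made up).
input_folder=demo_input
mkdir -p "$input_folder"
touch "$input_folder/plot_a.las" "$input_folder/plot_b.laz"

# Flag anything that is not *.las and would need converting first,
# e.g. with nibio_preprocessing/convert_files_in_folder.py.
for f in "$input_folder"/*; do
  case "$f" in
    *.las) ;;                                # already in the expected format
    *)     echo "needs conversion: $f" ;;
  esac
done
```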
The pipeline is composed of several steps, with input parameters in `run_bash_scripts/sem_seg_sean.sh` and `run_bash_scripts/tls.sh` that can be set before the run. A subset of the default parameters is as follows:
```
CLEAR_INPUT_FOLDER=1 # 1: clear input folder, 0: not clear input folder
CONDA_ENV="pdal-env-1" # conda environment for running the pipeline
GRAPH_MAXIMUM_CUMULATIVE_GAP=3
ADD_LEAVES_VOXEL_LENGTH=0.5
FIND_STEMS_MIN_POINTS=50
```
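The values above are plain shell variable assignments at the top of the scripts, so they can be inspected or tweaked with standard tools before a run. A minimal sketch using a toy stand-in file (the real file is `run_bash_scripts/sem_seg_sean.sh`):

```shell
# Toy stand-in for the parameter block in run_bash_scripts/sem_seg_sean.sh.
cat > toy_params.sh <<'EOF'
CLEAR_INPUT_FOLDER=1 # 1: clear input folder, 0: not clear input folder
CONDA_ENV="pdal-env-1" # conda environment for running the pipeline
EOF

# Keep the input folder between runs by flipping CLEAR_INPUT_FOLDER to 0.
sed -i 's/^CLEAR_INPUT_FOLDER=1/CLEAR_INPUT_FOLDER=0/' toy_params.sh
grep '^CLEAR_INPUT_FOLDER=' toy_params.sh
```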
## Running semantic segmentation
Semantic segmentation should be run before instance segmentation, since the latter requires the results of the semantic segmentation.
To run semantic segmentation:
```
bash run_bash_scripts/sem_seg_sean.sh -d folder_name
```
Make sure that you put the data in `*.las` or `*.laz` format into this folder.
This is a basic run of the command; more parameters can be set. Take a look at `run_bash_scripts/sem_seg_sean.sh` to check them.
## Running instance segmentation
To run instance segmentation:
```
bash run_bash_scripts/tls.sh -d folder_name
```
This is a basic run of the command; more parameters can be set. Take a look at `run_bash_scripts/tls.sh` to check them.
# The stages executed in the pipeline are as follows:
* reduction of the point cloud size to the point where it has a density of 150 points / square meter
* mapping to `*.ply` format: all the reduced `*.las` files are mapped and the original files are removed (the files converted to `*.ply` are kept)
* semantic segmentation,
Folder `input_folder/results` contains three subfolders:
```
+--segmented_point_clouds
```
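After a run, a quick way to see how many files each results subfolder holds. The layout below is a toy stand-in mimicking only the `segmented_point_clouds` subfolder shown above; the other subfolder names are not assumed:

```shell
# Toy results layout mimicking input_folder/results/segmented_point_clouds.
results=input_folder/results
mkdir -p "$results/segmented_point_clouds"
touch "$results/segmented_point_clouds/tree_1.ply" \
      "$results/segmented_point_clouds/tree_2.ply"

# Count the files in each subfolder of results/.
for d in "$results"/*/; do
  printf '%s: %s file(s)\n' "$d" "$(find "$d" -type f | wc -l)"
done
```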
## Running with sample files
The repo comes with sample files. You can use them to test your setup. To run them, do:
```
chmod +x run_sample.sh
./run_sample.sh
```
# The paper
[Maciej Wielgosz, Stefano Puliti, Phil Wilkes, Rasmus Astrup. (2023). Point2Tree(P2T). arXiv preprint. arXiv:2305.02651.](https://arxiv.org/abs/2305.02651)
# Original repo
For the original repo, please take a look at: https://github.com/philwilkes/FSCT and https://github.com/philwilkes/TLS2trees
## Running with your files
You have to provide a `folder_path`. This is the location of your `*.las` files.
```
chmod +x run_all_fine_grained.sh
./run_all_fine_grained.sh folder_path
```