The default model which is available in the repo in `fsct\model\model.path` was ...
Make sure that you put the data in `*.las` format into this folder. If your files are in a different format, e.g. `*.laz`, you can use `python nibio_preprocessing/convert_files_in_folder.py --input_folder input_folder_name --output_folder output_folder las` to convert your files to `*.las` format.
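For example, a conversion run could look like the sketch below; `raw_laz` and `input_folder` are placeholder folder names, not paths defined by the repo:
```
# Convert every *.laz file in raw_laz/ to *.las and write the results to input_folder/
# (both folder names are placeholders - substitute your own paths).
python nibio_preprocessing/convert_files_in_folder.py --input_folder raw_laz --output_folder input_folder las
```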
The pipeline is composed of several steps, and the input parameters in `run_bash_scripts/sem_seg_sean.sh` and `run_bash_scripts/tls.sh` should be set before the run. The default parameters are as follows:
```
CLEAR_INPUT_FOLDER=1 # 1: clear the input folder, 0: do not clear the input folder
CONDA_ENV="pdal-env-1" # conda environment for running the pipeline
```
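Only two of the defaults are shown above. A quick, non-authoritative way to see the rest is to inspect the top of each script, assuming the defaults are plain `VAR=value` assignments near the head of the file:
```
# List the upper-case variable assignments that act as defaults in both scripts.
# This assumes the defaults are simple VAR=value lines, which is typical for bash scripts.
grep -E '^[A-Z_]+=' run_bash_scripts/sem_seg_sean.sh run_bash_scripts/tls.sh
```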
Folder `input_folder/results` contains three subfolders:
```
+--segmented_point_clouds
```
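After a run you can verify that these folders were created, for example (assuming your data folder is literally called `input_folder`):
```
# List the result subfolders produced by the pipeline.
ls input_folder/results
```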
## Running semantic segmentation
Semantic segmentation should be run before instance segmentation, since the latter requires the results of the former.
To run semantic segmentation, use:
```
bash run_bash_scripts/sem_seg_sean.sh -d folder_name
```
This is a basic run of the command. There are more parameters that can be set; take a look at `run_bash_scripts/sem_seg_sean.sh` to check them.
## Running instance segmentation
To run instance segmentation, use:
```
bash run_bash_scripts/tls.sh -d folder_name
```
This is a basic run of the command. There are more parameters that can be set; take a look at `run_bash_scripts/tls.sh` to check them.
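Putting the steps together, an end-to-end run on a hypothetical folder of raw `*.laz` scans could look like the sketch below; `raw_scans` and `my_plots` are placeholder names, and all other parameters are left at the defaults inside the two scripts:
```
# 1. Convert raw *.laz scans to *.las (folder names are placeholders).
python nibio_preprocessing/convert_files_in_folder.py --input_folder raw_scans --output_folder my_plots las

# 2. Semantic segmentation - this has to run first.
bash run_bash_scripts/sem_seg_sean.sh -d my_plots

# 3. Instance segmentation - uses the semantic segmentation results.
bash run_bash_scripts/tls.sh -d my_plots

# 4. Inspect the outputs.
ls my_plots/results
```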