From baf435e3a755de3fc547f7fcc64a4cc68364da45 Mon Sep 17 00:00:00 2001
From: Yang You <neilyou@qq.com>
Date: Thu, 11 Mar 2021 13:46:07 +0800
Subject: [PATCH] Update README.md

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index b6b3bd6..9bac2b6 100644
--- a/README.md
+++ b/README.md
@@ -13,7 +13,7 @@ Change which method to use in `config/config.yaml` and run
 python train.py
 ```
 ### Results
-Using Adam with learning rate decay 0.3 for every 50 epochs, train for 200 epochs; data augmentation follows [this repo](https://github.com/yanx27/Pointnet_Pointnet2_pytorch). For Hengshuang and Nico, initial LR is 1e-3 (maybe fine-tuned later); for Menghao, initial LR is 1e-4, as suggested by the [author](https://github.com/MenghaoGuo). ModelNet40 classification results (instance average) are listed below:
+Using Adam with learning rate decay 0.3 for every 50 epochs, train for 200 epochs; data augmentation follows [this repo](https://github.com/yanx27/Pointnet_Pointnet2_pytorch). For Hengshuang and Nico, initial LR is 1e-3 (I would appreciate it if someone could fine-tune these hyper-parameters); for Menghao, initial LR is 1e-4, as suggested by the [author](https://github.com/MenghaoGuo). ModelNet40 classification results (instance average) are listed below:
 | Model | Accuracy |
 |--|--|
 | Hengshuang | 89.6|
@@ -23,3 +23,4 @@ Using Adam with learning rate decay 0.3 for every 50 epoch
 ### Miscellaneous
 Some code and training settings are borrowed from https://github.com/yanx27/Pointnet_Pointnet2_pytorch.
 Code for [PCT: Point Cloud Transformer (Meng-Hao Guo et al.)](https://arxiv.org/abs/2012.09688) is adapted from the author's Jittor implementation https://github.com/MenghaoGuo/PCT.
+
-- 
GitLab
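
For reference, below is a minimal PyTorch sketch of the training schedule described in the updated README line (Adam, initial LR 1e-3 or 1e-4, decayed by 0.3 every 50 epochs, 200 epochs total). The placeholder model, random batch, and loss are illustrative stand-ins, not code from this repository.

```python
# Sketch only: the schedule from the README, with placeholder model/data.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(1024 * 3, 40))  # placeholder 40-class classifier
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)      # 1e-4 for the Menghao variant
# Decay the learning rate by a factor of 0.3 every 50 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.3)
criterion = nn.CrossEntropyLoss()

for epoch in range(200):
    points = torch.randn(16, 1024, 3)        # stand-in for a batch of point clouds
    labels = torch.randint(0, 40, (16,))     # stand-in for ModelNet40 labels
    optimizer.zero_grad()
    loss = criterion(model(points), labels)
    loss.backward()
    optimizer.step()
    scheduler.step()                         # advance the LR schedule once per epoch
```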