ODFNet: Using orientation distribution functions to characterize 3D point clouds

8 Dec 2020  ·  Yusuf H. Sahin, Alican Mertan, Gozde Unal ·

Learning new representations of 3D point clouds is an active research area in 3D vision, as the order-invariant structure of point clouds still presents challenges to the design of neural network architectures. Recent works explored learning global features, local features, or both for point clouds; however, none of the earlier methods focused on capturing contextual shape information by analyzing the local orientation distribution of points. In this paper, we leverage the distribution of point orientations around a point to obtain an expressive local neighborhood representation for point clouds. We achieve this by dividing the spherical neighborhood of a given point into predefined cone volumes and using the statistics inside each volume as point features. In this way, a local patch is represented not only by the selected point's nearest neighbors, but also by a point density distribution defined along multiple orientations around the point. We then construct an orientation distribution function (ODF) neural network that involves an ODFBlock, which relies on MLP (multi-layer perceptron) layers. The new ODFNet model achieves state-of-the-art accuracy for object classification on the ModelNet40 and ScanObjectNN datasets, and for segmentation on the ShapeNet and S3DIS datasets.
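The cone-volume idea described above can be illustrated with a small sketch: for a query point, take all neighbors inside a spherical neighborhood, compute the unit direction to each, and count how many directions fall within a fixed half-angle of each predefined cone axis. The function name, the six axis-aligned cones, the half-angle, and the radius below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cone_odf_features(points, query_idx, cone_axes, half_angle_deg=30.0, radius=0.5):
    """Sketch of per-cone orientation statistics for one query point.

    Counts neighbors whose direction from the query lies inside each
    predefined cone, yielding a crude orientation-density histogram.
    (Hypothetical helper; parameters are assumptions for illustration.)
    """
    query = points[query_idx]
    offsets = points - query
    dists = np.linalg.norm(offsets, axis=1)
    # keep neighbors inside the spherical neighborhood, excluding the query itself
    mask = (dists > 1e-9) & (dists <= radius)
    dirs = offsets[mask] / dists[mask, None]   # unit directions to neighbors
    cos_thresh = np.cos(np.deg2rad(half_angle_deg))
    cos_sim = dirs @ cone_axes.T               # (n_neighbors, n_cones) cosines
    # a neighbor contributes to every cone whose axis is within the half-angle
    counts = (cos_sim >= cos_thresh).sum(axis=0)
    total = max(int(mask.sum()), 1)
    return counts / total                      # normalized per-cone density

# toy usage: six axis-aligned unit cone axes around a synthetic cluster
axes = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                 [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
rng = np.random.default_rng(0)
cloud = np.vstack([[0.0, 0.0, 0.0], rng.normal(scale=0.2, size=(100, 3))])
feat = cone_odf_features(cloud, 0, axes)       # one density value per cone
```

In the paper's pipeline such per-cone statistics would serve as input features to the ODFBlock's MLP layers rather than being used directly.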

Benchmark results

Task: 3D Part Segmentation · Dataset: ShapeNet-Part · Model: ODFNet
  Class Average IoU: 83.3 (global rank #22)
  Instance Average IoU: 86.5 (global rank #18)
