Search Results for author: Liang Dong

Found 5 papers, 2 papers with code

Patched Line Segment Learning for Vector Road Mapping

no code implementations · 6 Sep 2023 · Jiakun Xu, Bowen Xu, Gui-Song Xia, Liang Dong, Nan Xue

In our experiments, we demonstrate how an effective representation of a road graph significantly enhances the performance of vector road mapping on established benchmarks, without requiring extensive modifications to the neural network architecture.

NEAT: Distilling 3D Wireframes from Neural Attraction Fields

1 code implementation · 14 Jul 2023 · Nan Xue, Bin Tan, Yuxi Xiao, Liang Dong, Gui-Song Xia, Tianfu Wu, Yujun Shen

Instead of leveraging matching-based solutions from 2D wireframes (or line segments) for 3D wireframe reconstruction, as done in prior art, we present NEAT, a rendering-distilling formulation that uses neural fields to represent 3D line segments from 2D observations and bipartite matching to perceive and distill a sparse set of 3D global junctions.

3D Wireframe Reconstruction · Novel View Synthesis
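
The NEAT entry above mentions bipartite matching for distilling a sparse set of 3D global junctions. As a rough, hedged illustration only (not the authors' implementation), the sketch below matches predicted junction proposals to target junctions with the Hungarian algorithm; the Euclidean-distance cost and all names are illustrative assumptions.

```python
# Hedged sketch: bipartite matching of predicted junction proposals to
# target junctions, loosely in the spirit of the NEAT abstract above.
# The cost (plain Euclidean distance) and all names are illustrative
# assumptions, not the authors' actual formulation.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_junctions(pred_junctions: np.ndarray, target_junctions: np.ndarray):
    """Assign each target junction to one predicted proposal.

    pred_junctions:   (N, 3) predicted 3D junction coordinates.
    target_junctions: (M, 3) target junction coordinates, M <= N.
    Returns (pred_idx, target_idx) index arrays of the optimal assignment.
    """
    # Pairwise Euclidean distances form the (N x M) assignment cost matrix.
    cost = np.linalg.norm(
        pred_junctions[:, None, :] - target_junctions[None, :, :], axis=-1
    )
    pred_idx, target_idx = linear_sum_assignment(cost)
    return pred_idx, target_idx

# Toy usage: 5 proposals, 3 slightly perturbed targets.
preds = np.random.rand(5, 3)
targets = preds[[0, 2, 4]] + 0.01 * np.random.randn(3, 3)
print(match_junctions(preds, targets))
```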

Self-supervised Deep Unrolled Reconstruction Using Regularization by Denoising

no code implementations · 7 May 2022 · Peizhou Huang, Chaoyi Zhang, Xiaoliang Zhang, Xiaojuan Li, Liang Dong, Leslie Ying

Experimental results demonstrate that the proposed method requires a reduced amount of training data to achieve high reconstruction quality, compared with state-of-the-art MR reconstruction methods that utilize the Noise2Noise approach.

Denoising · MRI Reconstruction · +1
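
The entry above builds on Regularization by Denoising (RED) inside an unrolled reconstruction network. As a generic, hedged sketch of a plain RED gradient step for undersampled MRI reconstruction (not the paper's self-supervised training scheme), the Gaussian-blur denoiser, step size, and random sampling mask below are placeholders.

```python
# Hedged sketch: textbook-style Regularization-by-Denoising (RED) iterations
# for undersampled MRI reconstruction. Not the paper's self-supervised
# unrolled network; the denoiser, step size, and mask are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def forward(x, mask):
    """Undersampled Fourier forward operator A."""
    return mask * np.fft.fft2(x, norm="ortho")

def adjoint(k, mask):
    """Adjoint operator A^H (zero-filled inverse FFT)."""
    return np.fft.ifft2(mask * k, norm="ortho")

def red_step(x, y, mask, denoiser, eta=0.5, lam=0.1):
    """One RED update: x <- x - eta * (A^H(Ax - y) + lam * (x - D(x)))."""
    data_grad = adjoint(forward(x, mask) - y, mask)
    red_grad = x - denoiser(x)
    return x - eta * (data_grad + lam * red_grad)

# Toy usage: random "image", 50% random k-space mask, Gaussian blur as denoiser.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
mask = rng.random((64, 64)) < 0.5
y = forward(img, mask)
x = adjoint(y, mask)  # zero-filled initialization
denoise = lambda z: (gaussian_filter(z.real, sigma=1.0)
                     + 1j * gaussian_filter(z.imag, sigma=1.0))
for _ in range(10):
    x = red_step(x, y, mask, denoise)
```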

Turbulence-based load alleviation control for wind turbine in extreme turbulence situation

no code implementations · 16 Apr 2021 · Liang Dong, Wai Hou Lio

Based on this insight, this work proposes a turbulence-based load alleviation control strategy that adapts the controller to changes in wind conditions.

Conceptualized Representation Learning for Chinese Biomedical Text Mining

1 code implementation · 25 Aug 2020 · Ningyu Zhang, Qianghuai Jia, Kangping Yin, Liang Dong, Feng Gao, Nengwei Hua

In this paper, we investigate how the recently introduced pre-trained language model BERT can be adapted for Chinese biomedical corpora and propose a novel conceptualized representation learning approach.

Language Modelling · Representation Learning
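
The entry above adapts a pre-trained BERT to Chinese biomedical corpora. As a hedged sketch of that general workflow (not the paper's conceptualized masking strategy or its released checkpoint), the snippet below fine-tunes a generic Chinese BERT on a toy biomedical sentence-classification task; the "bert-base-chinese" checkpoint, example sentences, and labels are stand-ins.

```python
# Hedged sketch: fine-tuning a generic pre-trained Chinese BERT on a toy
# biomedical sentence-classification task. "bert-base-chinese" is a stand-in
# checkpoint and the example texts/labels are invented; this does not
# reproduce the paper's conceptualized representation learning approach.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=2
)

texts = ["患者主诉头痛三天", "本品用于缓解轻至中度疼痛"]  # toy clinical note vs. drug-label text
labels = torch.tensor([0, 1])

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few toy optimization steps
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```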
