Search Results for author: Lu Ren

Found 6 papers, 2 papers with code

DmADs-Net: Dense multiscale attention and depth-supervised network for medical image segmentation

no code implementations • 1 May 2024 • Zhaojin Fu, Zheng Chen, Jinjiang Li, Lu Ren

In addition, in the feature fusion phase, a Feature Refinement and Fusion Block is created to enhance the fusion of different semantic information. We validated the performance of the network using five datasets of varying sizes and types.

Image Segmentation • Medical Image Segmentation • +1

Leveraging Cross-Modal Neighbor Representation for Improved CLIP Classification

1 code implementation • 27 Apr 2024 • Chao Yi, Lu Ren, De-Chuan Zhan, Han-Jia Ye

Despite this, some studies have directly used CLIP's image encoder for tasks like few-shot classification, introducing a misalignment between its pre-training objectives and feature extraction methods.
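The misalignment the snippet describes arises when frozen image-encoder features are used directly for few-shot classification. A common baseline for that setup is nearest-prototype classification over cosine similarity; the following is a minimal, generic sketch of that baseline (not the paper's proposed method), with random arrays standing in for real CLIP image features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for frozen image-encoder features (e.g. from CLIP's image
# encoder); in practice these would come from a pre-trained model.
dim, n_classes, n_shots = 512, 3, 5
support = rng.normal(size=(n_classes, n_shots, dim))  # labelled few-shot examples
query = support[1].mean(axis=0) + 0.01 * rng.normal(size=dim)  # near class 1


def l2_normalize(x, axis=-1):
    """Scale vectors to unit length so dot products equal cosine similarity."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)


# Class prototypes: mean of the normalized support features, re-normalized.
prototypes = l2_normalize(l2_normalize(support).mean(axis=1))

# Classify the query by cosine similarity to each class prototype.
scores = prototypes @ l2_normalize(query)
pred = int(np.argmax(scores))
print(pred)  # → 1
```

The design choice here (normalize, average, re-normalize) keeps each support example's contribution to the prototype equal regardless of its feature norm, which matters for encoders whose output norms vary across inputs.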

Contrastive Learning • Few-Shot Image Classification

ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse

1 code implementation • 17 Aug 2023 • Yi-Kai Zhang, Lu Ren, Chao Yi, Qi-Wei Wang, De-Chuan Zhan, Han-Jia Ye

The rapid expansion of foundation pre-trained models and their fine-tuned counterparts has significantly contributed to the advancement of machine learning.

The Kernel Beta Process

no code implementations • NeurIPS 2011 • Lu Ren, Yingjian Wang, Lawrence Carin, David B. Dunson

A new Lévy process prior is proposed for an uncountable collection of covariate-dependent feature-learning measures; the model is called the kernel beta process (KBP).

A Bayesian Model for Simultaneous Image Clustering, Annotation and Object Segmentation

no code implementations • NeurIPS 2009 • Lan Du, Lu Ren, Lawrence Carin, David B. Dunson

The model clusters the images into classes, and each image is segmented into a set of objects, also allowing the opportunity to assign a word to each object (localized labeling).

Clustering • Image Clustering • +2
