Search Results for author: Ji Li

Found 20 papers, 6 papers with code

Data-driven Energy Consumption Modelling for Electric Micromobility using an Open Dataset

no code implementations26 Mar 2024 Yue Ding, Sen Yan, Maqsood Hussain Shah, Hongyuan Fang, Ji Li, Mingming Liu

Furthermore, we provide a comprehensive analysis of energy consumption modelling based on the dataset using a set of representative machine learning algorithms, and compare their performance against contemporary mathematical models as a baseline.

DesignEdit: Multi-Layered Latent Decomposition and Fusion for Unified & Accurate Image Editing

no code implementations21 Mar 2024 Yueru Jia, Yuhui Yuan, Aosong Cheng, Chuke Wang, Ji Li, Huizhu Jia, Shanghang Zhang

Second, we propose an instruction-guided latent fusion that pastes the multi-layered latent representations onto a canvas latent.

Text-to-Image Generation

Glyph-ByT5: A Customized Text Encoder for Accurate Visual Text Rendering

no code implementations14 Mar 2024 Zeyu Liu, Weicong Liang, Zhanhao Liang, Chong Luo, Ji Li, Gao Huang, Yuhui Yuan

Visual text rendering poses a fundamental challenge for contemporary text-to-image generation models, with the core problem lying in text encoder deficiencies.

Text-to-Image Generation

Privacy-Aware Energy Consumption Modeling of Connected Battery Electric Vehicles using Federated Learning

1 code implementation12 Dec 2023 Sen Yan, Hongyuan Fang, Ji Li, Tomas Ward, Noel O'Connor, Mingming Liu

Our findings show that FL methods can effectively improve the performance of BEV energy consumption prediction while maintaining user privacy.

Federated Learning

COLE: A Hierarchical Generation Framework for Multi-Layered and Editable Graphic Design

no code implementations28 Nov 2023 Peidong Jia, Chenxuan Li, Yuhui Yuan, Zeyu Liu, Yichao Shen, Bohan Chen, Xingru Chen, Yinglin Zheng, Dong Chen, Ji Li, Xiaodong Xie, Shanghang Zhang, Baining Guo

Our COLE system comprises multiple fine-tuned Large Language Models (LLMs), Large Multimodal Models (LMMs), and Diffusion Models (DMs), each specifically tailored for design-aware layer-wise captioning, layout planning, reasoning, and the task of generating images and text.

Image Generation

A Review on AI Algorithms for Energy Management in E-Mobility Services

no code implementations26 Sep 2023 Sen Yan, Maqsood Hussain Shah, Ji Li, Noel O'Connor, Mingming Liu

E-mobility, or electric mobility, has emerged as a pivotal solution to address pressing environmental and sustainability concerns in the transportation sector.

Energy Management

Planning Automated Driving with Accident Experience Referencing and Common-sense Inferencing

no code implementations26 Jan 2023 Shaobo Qiu, Ji Li, Guoxi Chen, Hong Wang, Boqi Li

In this work, we present the concept of an Automated Driving Strategical Brain (ADSB): a framework of a scene perception and scene safety evaluation system that works at a higher abstraction level, incorporating experience referencing, common-sense inferring and goal-and-value judging capabilities, to provide a contextual perspective for decision making within automated driving planning.

Common Sense Reasoning, Decision Making

Self-Supervised Blind Motion Deblurring With Deep Expectation Maximization

1 code implementation CVPR 2023 Ji Li, Weixi Wang, Yuesong Nan, Hui Ji

In contrast, this paper presents a dataset-free deep learning method for removing uniform and non-uniform blur effects from images of static scenes.

Deblurring

Self-Supervised Deep Image Restoration via Adaptive Stochastic Gradient Langevin Dynamics

1 code implementation CVPR 2022 Weixi Wang, Ji Li, Hui Ji

While supervised deep learning has been a prominent tool for solving many image restoration problems, there is an increasing interest in studying self-supervised or unsupervised methods to address the challenges and costs of collecting ground-truth images.

Image Restoration, Retrieval

TAG: Gradient Attack on Transformer-based Language Models

1 code implementation Findings (EMNLP) 2021 Jieren Deng, Yijue Wang, Ji Li, Chao Shang, Cao Qin, Hang Liu, Sanguthevar Rajasekaran, Caiwen Ding

In this paper, as the first attempt, we formulate the gradient attack problem on the Transformer-based language models and propose a gradient attack algorithm, TAG, to reconstruct the local training data.

Federated Learning, Cryptography and Security

FTRANS: Energy-Efficient Acceleration of Transformers using FPGA

no code implementations16 Jul 2020 Bingbing Li, Santosh Pandey, Haowen Fang, Yanjun Lyv, Ji Li, Jieyang Chen, Mimi Xie, Lipeng Wan, Hang Liu, Caiwen Ding

In natural language processing (NLP), the "Transformer" architecture was proposed as the first transduction model relying entirely on self-attention mechanisms, without using sequence-aligned recurrent neural networks (RNNs) or convolution, and it achieved significant improvements on sequence-to-sequence tasks.

Model Compression

Differential Diagnosis for Pancreatic Cysts in CT Scans Using Densely-Connected Convolutional Networks

no code implementations4 Jun 2018 Hongwei Li, Kanru Lin, Maximilian Reichert, Lina Xu, Rickmer Braren, Deliang Fu, Roland Schmid, Ji Li, Bjoern Menze, Kuangyu Shi

The lethal nature of pancreatic ductal adenocarcinoma (PDAC) calls for early differential diagnosis of pancreatic cysts, which are identified in up to 16% of normal subjects, and some of which may develop into PDAC.

Towards Budget-Driven Hardware Optimization for Deep Convolutional Neural Networks using Stochastic Computing

no code implementations10 May 2018 Zhe Li, Ji Li, Ao Ren, Caiwen Ding, Jeffrey Draper, Qinru Qiu, Bo Yuan, Yanzhi Wang

Recently, Deep Convolutional Neural Network (DCNN) has achieved tremendous success in many machine learning applications.

Indirect Supervision for Relation Extraction using Question-Answer Pairs

2 code implementations30 Oct 2017 Zeqiu Wu, Xiang Ren, Frank F. Xu, Ji Li, Jiawei Han

However, due to the incompleteness of knowledge bases and the context-agnostic labeling, the training data collected via distant supervision (DS) can be very noisy.

Question Answering, Relation

Hardware-Driven Nonlinear Activation for Stochastic Computing Based Deep Convolutional Neural Networks

no code implementations12 Mar 2017 Ji Li, Zihao Yuan, Zhe Li, Caiwen Ding, Ao Ren, Qinru Qiu, Jeffrey Draper, Yanzhi Wang

Recently, Deep Convolutional Neural Networks (DCNNs) have made unprecedented progress, achieving the accuracy close to, or even better than human-level perception in various tasks.

SC-DCNN: Highly-Scalable Deep Convolutional Neural Network using Stochastic Computing

no code implementations18 Nov 2016 Ao Ren, Ji Li, Zhe Li, Caiwen Ding, Xuehai Qian, Qinru Qiu, Bo Yuan, Yanzhi Wang

Stochastic Computing (SC), which uses a bit-stream to represent a number within [-1, 1] by counting the number of ones in the bit-stream, has a high potential for implementing DCNNs with high scalability and an ultra-low hardware footprint.
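The bipolar SC encoding described in the abstract can be sketched in a few lines of Python. This is an illustrative software toy, not the paper's hardware design: the function and variable names are my own, and the XNOR multiplier shown is the standard bipolar-SC multiplication circuit, included to show why the representation suits ultra-low-footprint hardware.

```python
import random

def encode_bipolar(x, length=1024, rng=None):
    """Encode x in [-1, 1] as a bit-stream where P(bit = 1) = (x + 1) / 2."""
    rng = rng or random.Random()
    p = (x + 1) / 2
    return [1 if rng.random() < p else 0 for _ in range(length)]

def decode_bipolar(bits):
    """Recover the value by counting ones: x = 2 * (#ones / length) - 1."""
    return 2 * sum(bits) / len(bits) - 1

def sc_multiply(a_bits, b_bits):
    """Bipolar SC multiplication costs a single XNOR gate per bit pair."""
    return [1 - (a ^ b) for a, b in zip(a_bits, b_bits)]

# Independent seeds keep the two streams uncorrelated, which the
# XNOR multiplier requires for an unbiased product estimate.
a = encode_bipolar(0.5, length=10000, rng=random.Random(1))
b = encode_bipolar(-0.5, length=10000, rng=random.Random(2))
print(round(decode_bipolar(a), 2))                  # close to 0.5
print(round(decode_bipolar(sc_multiply(a, b)), 2))  # close to 0.5 * -0.5 = -0.25
```

Longer streams shrink the estimation error (roughly as 1/sqrt(length)), which is the scalability-versus-accuracy trade-off SC-based DCNN accelerators tune.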
