Search Results for author: Yingfang Yuan

Found 6 papers, 1 paper with code

SAIS: A Novel Bio-Inspired Artificial Immune System Based on Symbiotic Paradigm

1 code implementation11 Feb 2024 Junhao Song, Yingfang Yuan, Wei Pang

We propose a novel type of Artificial Immune System (AIS): Symbiotic Artificial Immune Systems (SAIS), drawing inspiration from symbiotic relationships in biology.

Evolutionary Algorithms

Which Hyperparameters to Optimise? An Investigation of Evolutionary Hyperparameter Optimisation in Graph Neural Network For Molecular Property Prediction

no code implementations13 Apr 2021 Yingfang Yuan, Wenjun Wang, Wei Pang

In this research, we focus on the impact of selecting two types of GNN hyperparameters, those belonging to graph-related layers and those of task-specific layers, on GNN performance for molecular property prediction.

Hyperparameter Optimization Molecular Property Prediction +1

A Genetic Algorithm with Tree-structured Mutation for Hyperparameter Optimisation of Graph Neural Networks

no code implementations24 Feb 2021 Yingfang Yuan, Wenjun Wang, Wei Pang

In particular, the genetic algorithm (GA) has been explored for HPO; it treats the GNN as a black-box model whose outputs alone can be observed for a given set of hyperparameters.
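The black-box setting described above can be sketched in a few lines: a toy genetic algorithm that only ever queries an objective function with hyperparameter settings and observes the returned score. This is an illustrative sketch, not the paper's method; the search space, the quadratic stand-in objective, and all function names (`black_box_objective`, `mutate`, `ga`) are hypothetical placeholders for a real GNN training-and-evaluation loop.

```python
import random

# Hypothetical stand-in for training a GNN and returning its validation
# error; in the black-box view, only this output is observable.
def black_box_objective(hp):
    return (hp["lr"] - 0.01) ** 2 + (hp["layers"] - 3) ** 2 * 1e-4

# A tiny, made-up discrete search space over two hyperparameters.
SPACE = {"lr": [0.1, 0.01, 0.001], "layers": [2, 3, 4, 5]}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def mutate(hp):
    # Re-sample one hyperparameter at random.
    child = dict(hp)
    key = random.choice(list(SPACE))
    child[key] = random.choice(SPACE[key])
    return child

def ga(generations=20, pop_size=8, seed=0):
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=black_box_objective)        # evaluate the black box
        parents = pop[: pop_size // 2]           # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return min(pop, key=black_box_objective)

best = ga()
```

Note that nothing in the loop inspects the model's internals: selection and mutation are driven purely by the scores the black box returns, which is the property the snippet above is illustrating.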

A Systematic Comparison Study on Hyperparameter Optimisation of Graph Neural Networks for Molecular Property Prediction

no code implementations8 Feb 2021 Yingfang Yuan, Wenjun Wang, Wei Pang

In this paper, we conducted a theoretical analysis of the common and distinctive features of two state-of-the-art, popular HPO algorithms, TPE and CMA-ES, and compared them with random search (RS) as a baseline.

Hyperparameter Optimization Molecular Property Prediction +1

A Novel Genetic Algorithm with Hierarchical Evaluation Strategy for Hyperparameter Optimisation of Graph Neural Networks

no code implementations22 Jan 2021 Yingfang Yuan, Wenjun Wang, George M. Coghill, Wei Pang

In the proposed fast evaluation process, training is interrupted at an early stage, and the difference in RMSE between the starting and interrupted epochs is used as a fast score, which indicates the potential of the GNN under consideration.
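The fast-score idea above can be shown with a minimal sketch: compute the RMSE drop between the starting epoch and the interruption point, and rank candidates by that drop. The RMSE curves below are made up for illustration, and `fast_score` is a hypothetical helper name, not an identifier from the paper.

```python
def fast_score(rmse_by_epoch, interrupt_epoch):
    """Difference between the starting and interrupted-epoch RMSE:
    a larger drop suggests a more promising hyperparameter setting."""
    return rmse_by_epoch[0] - rmse_by_epoch[interrupt_epoch]

# Two candidate GNN configurations with illustrative (made-up) RMSE curves,
# interrupted after epoch 2 instead of being trained to convergence.
curve_a = [1.00, 0.70, 0.55]   # improves quickly
curve_b = [1.00, 0.95, 0.92]   # improves slowly

# curve_a earns the higher fast score, so it would be prioritised
# for full evaluation.
```

The appeal of this proxy is cost: each candidate needs only a couple of epochs rather than a full training run, at the price of trusting that early improvement predicts final quality.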

Hyperparameter Optimization
