Generating Infinite-Size Textures using GANs with Patch-by-Patch Paradigm

5 Sep 2023 · Alhasan Abdellatif, Ahmed H. Elsheikh

In this paper, we introduce a novel approach for generating texture images of infinite size using Generative Adversarial Networks (GANs) based on a patch-by-patch paradigm. Existing texture synthesis techniques generate large-scale textures in a single forward pass through the generative model, which limits the scalability and flexibility of the produced images. In contrast, the proposed approach trains a GAN model on a single texture image to generate relatively small patches that are locally correlated and can be seamlessly concatenated into a larger image. The method relies on local padding in the generator to ensure consistency between the generated patches, and it uses spatial stochastic modulation to allow for local variations and to improve pattern alignment in the large-scale image. The trained models learn the local texture structure and can generate images of arbitrary size while maintaining coherence and diversity. Experimental results demonstrate constant GPU memory usage with respect to the generated image size, in contrast to existing approaches whose memory footprint grows proportionally with image size.
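To make the patch-by-patch idea concrete, here is a minimal, hypothetical NumPy sketch, not the authors' implementation. The `toy_generator`, `PATCH`, and `PAD` names are invented stand-ins: a random patch plays the role of the trained GAN generator, and the "local padding" is modelled by forcing each new patch's left border to match its neighbour's right border before stitching, so memory per step stays constant regardless of the final image width.

```python
import numpy as np

PATCH = 32  # side length of each generated patch (assumed value)
PAD = 4     # width of the border region shared between neighbouring patches

def toy_generator(latent, left_border):
    # Stand-in for the trained generator: emits a PATCH x PATCH patch.
    # When a left neighbour exists, its right border is copied into the
    # patch's left PAD columns, mimicking local padding for seamlessness.
    rng = np.random.default_rng(latent)
    patch = rng.random((PATCH, PATCH))
    if left_border is not None:
        patch[:, :PAD] = left_border
    return patch

def generate_row(num_patches):
    # Patch-by-patch synthesis: each step only keeps the previous patch's
    # rightmost PAD columns, so GPU/CPU memory does not grow with width.
    patches, border = [], None
    for i in range(num_patches):
        p = toy_generator(i, border)
        border = p[:, -PAD:].copy()
        # Drop the overlapping columns of every patch after the first.
        patches.append(p if i == 0 else p[:, PAD:])
    return np.concatenate(patches, axis=1)

row = generate_row(4)
# Final width: PATCH + (num_patches - 1) * (PATCH - PAD) = 32 + 3 * 28 = 116
```

In the actual method the border information would condition the generator through padding inside its convolutional layers rather than being copied post hoc, but the constant-memory, left-to-right stitching pattern is the same.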
