Style Transfer
652 papers with code • 2 benchmarks • 17 datasets
Style Transfer is a technique in computer vision and graphics that generates a new image by combining the content of one image with the visual style of another. The goal is to produce an image that preserves the content of the original while rendering it in the style of the reference image.
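As a rough illustration of how the classic formulation works, here is a minimal NumPy sketch of the content and style losses from the Gatys-style approach: content is compared directly in feature space, while style is compared through Gram matrices (channel-wise feature correlations). The feature maps below are random toy arrays standing in for CNN activations, and the `alpha`/`beta` weights are illustrative assumptions, not values from any specific paper.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, pixels) feature map: channel-wise
    correlations that summarize texture/style statistics."""
    c, n = features.shape
    return features @ features.T / n

def style_content_loss(f_gen, f_content, f_style, alpha=1.0, beta=1e3):
    """Weighted sum of content loss (feature MSE) and style loss
    (Gram-matrix MSE); alpha and beta are illustrative weights."""
    content_loss = np.mean((f_gen - f_content) ** 2)
    style_loss = np.mean((gram_matrix(f_gen) - gram_matrix(f_style)) ** 2)
    return alpha * content_loss + beta * style_loss

# Toy feature maps (channels x pixels); in practice these would be
# activations extracted from several layers of a pretrained CNN.
rng = np.random.default_rng(0)
f_content = rng.standard_normal((8, 64))
f_style = rng.standard_normal((8, 64))

# The loss vanishes only when the generated features match the content
# features and the style image's Gram matrix at the same time.
print(style_content_loss(f_content, f_content, f_content))  # → 0.0
```

In the full method, the generated image's pixels are optimized (or a feed-forward network is trained) to minimize this combined objective across multiple CNN layers.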
(Image credit: A Neural Algorithm of Artistic Style)
Libraries
Use these libraries to find Style Transfer models and implementations.

Datasets
Subtasks
Most implemented papers
Multi-style Generative Network for Real-time Transfer
Despite the rapid progress in style transfer, existing approaches using feed-forward generative networks for multi-style or arbitrary-style transfer usually compromise image quality and model flexibility.
Adversarially Regularized Autoencoders
This adversarially regularized autoencoder (ARAE) allows us to generate natural textual outputs as well as perform manipulations in the latent space to induce change in the output space.
Multi-Content GAN for Few-Shot Font Style Transfer
In this work, we focus on the challenge of taking partial observations of highly-stylized text and generalizing the observations to generate unobserved glyphs in the ornamented typeface.
Delete, Retrieve, Generate: A Simple Approach to Sentiment and Style Transfer
We consider the task of text attribute transfer: transforming a sentence to alter a specific attribute (e.g., sentiment) while preserving its attribute-independent content (e.g., changing "screen is just the right size" to "screen is too small").
Style Transfer by Relaxed Optimal Transport and Self-Similarity
Style transfer algorithms strive to render the content of one image using the style of another.
Unsupervised Speech Decomposition via Triple Information Bottleneck
Speech information can be roughly decomposed into four components: language content, timbre, pitch, and rhythm.
Composer: Creative and Controllable Image Synthesis with Composable Conditions
Recent large-scale generative models learned on big data are capable of synthesizing incredible images yet suffer from limited controllability.
Stable and Controllable Neural Texture Synthesis and Style Transfer Using Histogram Losses
These losses can improve the quality of large features, improve the separation of content and style, and offer artistic controls such as paint by numbers.
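To make the idea of a histogram loss concrete, here is a hedged sketch using exact 1-D histogram matching of feature activations: the activations are remapped so their sorted values equal those of a target distribution, and the loss penalizes the distance to that matched version. This is a simplified stand-in, not the paper's exact formulation; `match_histogram` and `histogram_loss` are illustrative helpers.

```python
import numpy as np

def match_histogram(source, target):
    """Remap `source` so its sorted values take on the sorted values
    of `target` (exact 1-D histogram matching)."""
    result = np.empty_like(source)
    result[np.argsort(source)] = np.sort(target)
    return result

def histogram_loss(activations, target):
    """L2 distance between activations and their histogram-matched
    version -- a simplified analogue of a histogram loss."""
    return np.mean((activations - match_histogram(activations, target)) ** 2)

rng = np.random.default_rng(1)
a = rng.standard_normal(100)

# Matching activations to their own distribution is the identity,
# so the loss is exactly zero.
print(histogram_loss(a, a))  # → 0.0
```

Penalizing full activation histograms, rather than only means and covariances, is what helps stabilize texture statistics across the synthesized image.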
Symbolic Music Genre Transfer with CycleGAN
In this paper we apply such a model to symbolic music and show the feasibility of our approach for music genre transfer.
Arbitrary Style Transfer with Style-Attentional Networks
Arbitrary style transfer aims to synthesize a content image with the style of an image to create a third image that has never been seen before.