Search Results for author: Christian Osendorfer

Found 13 papers, 6 papers with code

Separator-Transducer-Segmenter: Streaming Recognition and Segmentation of Multi-party Speech

no code implementations · 10 May 2022 · Ilya Sklyar, Anna Piunova, Christian Osendorfer

Finally, we establish a novel framework for segmentation analysis of multi-party conversations through emission latency metrics.

Segmentation · speech-recognition +3

No Representation without Transformation

no code implementations · 9 Dec 2019 · Giorgio Giannone, Saeed Saremi, Jonathan Masci, Christian Osendorfer

To explicitly demonstrate the effect of these higher order objects, we show that the inferred latent transformations reflect interpretable properties in the observation space.

Recurrent Neural Processes

2 code implementations · 13 Jun 2019 · Timon Willi, Jonathan Masci, Jürgen Schmidhuber, Christian Osendorfer

We extend Neural Processes (NPs) to sequential data through Recurrent NPs or RNPs, a family of conditional state space models.

Gaussian Processes · Time Series +1

Two-Stage Peer-Regularized Feature Recombination for Arbitrary Image Style Transfer

1 code implementation · CVPR 2020 · Jan Svoboda, Asha Anoosheh, Christian Osendorfer, Jonathan Masci

This paper introduces a neural style transfer model to generate a stylized image conditioning on a set of examples describing the desired style.

Image Generation · Style Transfer +1

Deep Iterative Surface Normal Estimation

2 code implementations · CVPR 2020 · Jan Eric Lenssen, Christian Osendorfer, Jonathan Masci

This results in a state-of-the-art surface normal estimator that is robust to noise, outliers, and point density variation, preserves sharp features through anisotropic kernels, and achieves equivariance through a local quaternion-based spatial transformer.

Surface Normal Estimation · Surface Normals Estimation

NAIS-Net: Stable Deep Networks from Non-Autonomous Differential Equations

1 code implementation · NeurIPS 2018 · Marco Ciccone, Marco Gallieri, Jonathan Masci, Christian Osendorfer, Faustino Gomez

This paper introduces the Non-Autonomous Input-Output Stable Network (NAIS-Net), a very deep architecture where each stacked processing block is derived from a time-invariant non-autonomous dynamical system.
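The block structure described above can be sketched as an unrolled iteration in which the block input is re-injected at every step (non-autonomy) and the weights are shared across steps (time-invariance). This is a minimal illustration, not the paper's exact construction: the dimensions, step size, and the negative-definite reparametrisation of the state matrix are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def nais_block(u, n_iter=20, h=0.1):
    """Sketch of one NAIS-Net-style block: unroll the time-invariant
    non-autonomous dynamics x_{k+1} = x_k + h * tanh(A x_k + B u + b).
    The block input u is fed back in at every iteration, and A, B, b
    are shared across iterations. All values are illustrative."""
    d = u.shape[0]
    R = rng.standard_normal((d, d)) / np.sqrt(d)
    A = -(R.T @ R + 0.1 * np.eye(d))   # negative definite, so the latent dynamics stay stable
    B = rng.standard_normal((d, d)) / np.sqrt(d)
    b = np.zeros(d)
    x = u.copy()                       # initialise the state from the block input
    for _ in range(n_iter):
        x = x + h * np.tanh(A @ x + B @ u + b)  # residual step driven by u
    return x
```

Because the dynamics are stable, increasing the unroll depth keeps the state bounded rather than blowing up, which is the property the paper exploits to build very deep networks.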

Improving approximate RPCA with a k-sparsity prior

no code implementations · 29 Dec 2014 · Maximilian Karl, Christian Osendorfer

A process-centric view of robust PCA (RPCA) allows its fast approximate implementation based on a special form of a deep neural network with weights shared across all layers.

General Classification

Learning Stochastic Recurrent Networks

1 code implementation · 27 Nov 2014 · Justin Bayer, Christian Osendorfer

Leveraging advances in variational inference, we propose to enhance recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs).

Variational Inference
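The generative side of the idea above (an RNN whose transition is driven by per-step latent variables) can be sketched as follows. The weight shapes, the standard-normal prior, and the linear readout are illustrative assumptions; training would additionally fit a variational posterior over the latents with the reparameterisation trick, which is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

def storn_generate(T, x_dim=3, h_dim=8, z_dim=4):
    """Sketch of a STORN-style generative pass: at every time step a
    latent z_t is sampled and injected into the recurrent transition
    alongside the previous output. Weights are random, for shape
    illustration only."""
    Wh = rng.standard_normal((h_dim, h_dim)) / np.sqrt(h_dim)
    Wx = rng.standard_normal((h_dim, x_dim)) / np.sqrt(x_dim)
    Wz = rng.standard_normal((h_dim, z_dim)) / np.sqrt(z_dim)
    Wo = rng.standard_normal((x_dim, h_dim)) / np.sqrt(h_dim)
    h = np.zeros(h_dim)
    x = np.zeros(x_dim)
    xs = []
    for _ in range(T):
        z = rng.standard_normal(z_dim)          # per-step latent, here from the prior
        h = np.tanh(Wh @ h + Wx @ x + Wz @ z)   # stochastic recurrent transition
        x = Wo @ h                              # emit the next observation
        xs.append(x)
    return np.stack(xs)
```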

Variational inference of latent state sequences using Recurrent Networks

no code implementations · 6 Jun 2014 · Justin Bayer, Christian Osendorfer

Recent advances in the estimation of deep directed graphical models and recurrent networks let us contribute to the removal of a blind spot in the area of probabilistic modelling of time series.

Denoising · Imputation +3

Convolutional Neural Networks learn compact local image descriptors

no code implementations · 30 Apr 2013 · Christian Osendorfer, Justin Bayer, Patrick van der Smagt

A standard deep convolutional neural network paired with a suitable loss function learns compact local image descriptors that perform comparably to state-of-the-art approaches.

Unsupervised Feature Learning for low-level Local Image Descriptors

no code implementations · 14 Jan 2013 · Christian Osendorfer, Justin Bayer, Sebastian Urban, Patrick van der Smagt

Unsupervised feature learning has shown impressive results for a wide range of input modalities, in particular for object classification tasks in computer vision.

Binarization · General Classification

Learning Sequence Neighbourhood Metrics

no code implementations · 9 Sep 2011 · Justin Bayer, Christian Osendorfer, Patrick van der Smagt

Recurrent neural networks (RNNs) in combination with a pooling operator and the neighbourhood components analysis (NCA) objective function are able to detect the characterizing dynamics of sequences and embed them into a fixed-length vector space of arbitrary dimensionality.

General Classification · Metric Learning
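The encoder described above (an RNN followed by a pooling operator, yielding a fixed-length embedding for a variable-length sequence) can be sketched as below. The weights are random and the NCA objective that would shape the embedding space is not shown; hidden size and pooling choice are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

def embed_sequence(seq, h_dim=16):
    """Sketch of the RNN + pooling encoder: run a simple recurrent
    network over a (T, x_dim) sequence and mean-pool the hidden
    states, producing a fixed-length vector regardless of T. In the
    paper this embedding would be trained with the NCA objective so
    that same-class sequences cluster; weights here are random."""
    x_dim = seq.shape[1]
    Wh = rng.standard_normal((h_dim, h_dim)) / np.sqrt(h_dim)
    Wx = rng.standard_normal((h_dim, x_dim)) / np.sqrt(x_dim)
    h = np.zeros(h_dim)
    states = []
    for x in seq:
        h = np.tanh(Wh @ h + Wx @ x)  # recurrent update
        states.append(h)
    return np.mean(states, axis=0)    # pooling removes the length dependence
```

Sequences of different lengths map to vectors of the same dimensionality, which is what makes a neighbourhood-based metric over sequences possible.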
