no code implementations • ICCV 2023 • Chen Naveh, Yacov Hel-Or
This paper describes a new technique for finding disentangled semantic directions in the latent space of StyleGAN.
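As a hedged illustration of what a "semantic direction" means in practice (all names here are illustrative, not the paper's API): editing a latent code amounts to moving it along a fixed direction vector, and a disentangled direction changes only one attribute.

```python
import numpy as np

# Illustrative sketch (not the paper's method): editing a StyleGAN-style
# latent code w by moving it along a unit-norm semantic direction d.
rng = np.random.default_rng(0)

latent_dim = 512
w = rng.standard_normal(latent_dim)     # a latent code
d = rng.standard_normal(latent_dim)
d /= np.linalg.norm(d)                  # unit-norm semantic direction

def edit(w, d, alpha):
    """Move the latent code along direction d by strength alpha."""
    return w + alpha * d

w_edit = edit(w, d, 3.0)
# The edit shifts w by exactly alpha along d and nothing orthogonal to it.
print(np.allclose(w_edit - w, 3.0 * d))   # True
```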
1 code implementation • 30 Mar 2022 • David Dadon, Ohad Fried, Yacov Hel-Or
We present depth distribution neural radiance field (DDNeRF), a new method that significantly increases sampling efficiency along rays during training while achieving superior results for a given sampling budget.
1 code implementation • 28 Mar 2022 • Asaf Karnieli, Ohad Fried, Yacov Hel-Or
We show that self and cast shadows not only do not hinder 3D reconstruction, but can be used on their own, as a strong learning signal, to recover the depth map and surface normals.
1 code implementation • 9 Oct 2021 • Berry Weinstein, Shai Fine, Yacov Hel-Or
The weight decay regularization term is widely used during training to constrain expressivity, avoid overfitting, and improve generalization.
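As a generic illustration of the weight decay term this entry refers to (a textbook sketch, not the paper's formulation): with decay coefficient `lam`, each SGD step adds `lam * w` to the gradient, shrinking the weights toward zero even when the data gradient vanishes.

```python
import numpy as np

# Minimal sketch of weight decay (L2 regularization) inside plain SGD.
# Update rule: w <- w - lr * (grad + lam * w)

def sgd_step(w, grad, lr=0.1, lam=0.01):
    """One SGD step with an L2 weight decay term added to the gradient."""
    return w - lr * (grad + lam * w)

w = np.array([1.0, -2.0, 3.0])
grad = np.zeros_like(w)       # even with zero gradient, decay shrinks w
w_next = sgd_step(w, grad)
print(w_next)                 # each weight scaled by (1 - lr*lam) = 0.999
```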
no code implementations • 13 Sep 2020 • Berry Weinstein, Shai Fine, Yacov Hel-Or
We derive a new margin-based regularization formulation, termed multi-margin regularization (MMR), for deep neural networks (DNNs).
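MMR's exact formulation is defined in the paper; as a hedged illustration of the quantity any margin-based regularizer acts on, the sketch below computes the per-example classification margin (correct logit minus the strongest competing logit), which such a regularizer would encourage to be large.

```python
import numpy as np

# Illustrative only, not MMR itself: the multi-class classification margin.
def margins(logits, labels):
    """Margin of the correct class over the strongest competitor."""
    n = len(labels)
    correct = logits[np.arange(n), labels]
    masked = logits.copy()
    masked[np.arange(n), labels] = -np.inf   # exclude the correct class
    return correct - masked.max(axis=1)

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.2,  1.0]])
labels = np.array([0, 2])
print(margins(logits, labels))   # -> [1.5, -0.2]; negative = misclassified
```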
no code implementations • 4 Aug 2020 • Alon Oring, Zohar Yakhini, Yacov Hel-Or
We argue that these incongruities arise from the structure of the latent space and from the fact that such naively interpolated latent vectors deviate from the data manifold.
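A hedged numerical illustration of this deviation (a generic fact about high-dimensional Gaussian priors, not the paper's experiment): the linear midpoint of two latent samples has a markedly smaller norm than the prior's typical radius `sqrt(d)`, i.e. it leaves the high-density shell, whereas spherical interpolation stays near it.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512
z0, z1 = rng.standard_normal(d), rng.standard_normal(d)

# Linear interpolation: the midpoint's norm drops well below sqrt(d),
# so the interpolated vector falls off the prior's high-density shell.
mid = 0.5 * (z0 + z1)
print(np.linalg.norm(z0), np.linalg.norm(mid), np.sqrt(d))

def slerp(z0, z1, t):
    """Spherical interpolation, which stays near the prior's shell."""
    omega = np.arccos(np.dot(z0, z1) / (np.linalg.norm(z0) * np.linalg.norm(z1)))
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

print(np.linalg.norm(slerp(z0, z1, 0.5)))   # close to sqrt(d) ~ 22.6
```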
no code implementations • 5 Feb 2020 • Inbal Lav, Shai Avidan, Yoram Singer, Yacov Hel-Or
We show that the proposed approximation is superior to the commonly used spectral methods with respect to both accuracy and complexity.
1 code implementation • 16 Nov 2019 • Berry Weinstein, Shai Fine, Yacov Hel-Or
We present a selective sampling method designed to accelerate the training of deep neural networks.
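As a hedged sketch of the general idea behind selective sampling (an assumption about the family of methods, not this paper's exact criterion): each round, train only on the fraction of examples the network currently finds hardest, e.g. those with the highest loss.

```python
import numpy as np

# Generic loss-based selective sampling sketch (illustrative criterion).
rng = np.random.default_rng(0)

losses = rng.random(10)              # per-example losses from a forward pass
keep_frac = 0.3
k = int(len(losses) * keep_frac)
selected = np.argsort(losses)[-k:]   # indices of the k hardest examples
print(np.sort(losses[selected]))     # the k largest losses
```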
no code implementations • CVPR 2013 • Elhanan Elboer, Michael Werman, Yacov Hel-Or
The graph Laplacian operator, which originated in spectral graph theory, is commonly used for learning applications such as spectral clustering and embedding.