Search Results for author: Ziyi Xu

Found 9 papers, 1 paper with code

Mean Field Game of High-Frequency Anticipatory Trading

no code implementations • 28 Apr 2024 • Xue Cheng, Meng Wang, Ziyi Xu

The interactions between a large population of high-frequency traders (HFTs) and a large trader (LT) who executes a certain amount of assets at discrete time points are studied.

Trading Large Orders in the Presence of Multiple High-Frequency Anticipatory Traders

no code implementations • 13 Mar 2024 • Ziyi Xu, Xue Cheng

We investigate a market with a normal-speed informed trader (IT), who may employ a mixed strategy, and multiple anticipatory high-frequency traders (HFTs) who are under different inventory pressures, in a three-period Kyle's model.

Employing Real Training Data for Deep Noise Suppression

no code implementations • 5 Sep 2023 • Ziyi Xu, Marvin Sach, Jan Pirklbauer, Tim Fingscheidt

It provides a reference-free perceptual loss for employing real data during DNS training, maximizing the PESQ scores.

The Effects of High-frequency Anticipatory Trading: Small Informed Trader vs. Round-Tripper

no code implementations • 27 Apr 2023 • Ziyi Xu, Xue Cheng

In an extended Kyle's model, the interactions between a large informed trader and a high-frequency trader (HFT) who can anticipate the former's incoming order are studied.

Coded Speech Quality Measurement by a Non-Intrusive PESQ-DNN

1 code implementation • 18 Apr 2023 • Ziyi Xu, Ziyue Zhao, Tim Fingscheidt

We illustrate the potential of this model by predicting the PESQ scores of wideband-coded speech obtained from AMR-WB or EVS codecs operating at different bitrates in noisy, tandeming, and error-prone transmission conditions.

Are Large Traders Harmed by Front-running HFTs?

no code implementations • 11 Nov 2022 • Ziyi Xu, Xue Cheng

This paper studies the influences of a high-frequency trader (HFT) on a large trader whose future trading is predicted by the former.

Does a PESQNet (Loss) Require a Clean Reference Input? The Original PESQ Does, But ACR Listening Tests Don't

no code implementations • 4 May 2022 • Ziyi Xu, Maximilian Strake, Tim Fingscheidt

Detailed analyses show that the DNS trained with the MF-intrusive PESQNet outperforms the Interspeech 2021 DNS Challenge baseline and the same DNS trained with an MSE loss by 0.23 and 0.12 PESQ points, respectively.

Deep Noise Suppression With Non-Intrusive PESQNet Supervision Enabling the Use of Real Training Data

no code implementations • 31 Mar 2021 • Ziyi Xu, Maximilian Strake, Tim Fingscheidt

Most speech enhancement neural networks are trained in a fully supervised way, with losses requiring the noisy speech to be synthesized from clean speech and additive noise.
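The synthesis step described above is typically done by scaling a noise signal so that the mixture reaches a target signal-to-noise ratio. A minimal sketch of such a mixing routine is shown below; the function name `mix_at_snr` and the use of NumPy are illustrative assumptions, not part of the papers' published code.

```python
import numpy as np

def mix_at_snr(clean, noise, snr_db):
    """Illustrative sketch: add noise to clean speech at a target SNR in dB."""
    # Tile/truncate the noise so it matches the clean signal's length.
    if len(noise) < len(clean):
        noise = np.tile(noise, int(np.ceil(len(clean) / len(noise))))
    noise = noise[: len(clean)]
    # Average power (mean square) of each signal.
    p_clean = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2)
    # Gain that places the noise snr_db decibels below the speech power.
    gain = np.sqrt(p_clean / (p_noise * 10.0 ** (snr_db / 10.0)))
    return clean + gain * noise
```

Because the additive noise is known exactly, the clean reference is available for intrusive losses such as MSE; the papers above instead aim at reference-free (non-intrusive) PESQNet losses so that real recorded noisy speech, which has no such reference, can also be used for training.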

Tasks: Denoising, Speech Enhancement
