1 code implementation • 28 Sep 2022 • Artit Wangperawong
Multilingual search can be achieved with subword tokenization.
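A minimal sketch of the idea, using overlapping character n-grams as a stand-in for a learned subword vocabulary such as BPE (the function names and scoring are ours, not the paper's): shared subwords let a query match morphological variants that whole-word tokens would miss.

```python
# Minimal sketch: character n-gram subwords as a stand-in for a learned
# subword vocabulary (e.g. BPE). Shared subwords let a query match
# morphological variants that whole-word matching would miss.

def subword_tokens(word, n=3):
    """Split a word into overlapping character n-grams plus the word itself."""
    padded = f"<{word}>"  # boundary markers, fastText-style
    grams = {padded[i:i + n] for i in range(len(padded) - n + 1)}
    grams.add(word)
    return grams

def overlap_score(query, document):
    """Fraction of the query's subwords that appear in the document."""
    q = set().union(*(subword_tokens(w) for w in query.lower().split()))
    d = set().union(*(subword_tokens(w) for w in document.lower().split()))
    return len(q & d) / len(q)

# "tokenizer" shares many subwords with "tokenization", so the query
# still scores well even though the whole words differ.
print(overlap_score("tokenizer", "subword tokenization improves search"))
```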
1 code implementation • 9 Sep 2019 • Kettip Kriangchaivech, Artit Wangperawong
A machine learning model was developed to automatically generate questions from Wikipedia passages using transformers, an attention-based architecture that eschews the recurrence of existing recurrent neural networks (RNNs).
1 code implementation • 8 Sep 2019 • Xinyi Liu, Artit Wangperawong
This study compares the effectiveness and robustness of multi-class categorization of Amazon product data using transfer learning on pre-trained contextualized language models.
no code implementations • 30 Jun 2019 • Pasawee Wirojwatanakul, Artit Wangperawong
In this study, we investigated multi-modal approaches using images, descriptions, and titles to categorize e-commerce products on Amazon.
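A simplified illustration of one common multi-modal approach, late fusion (our sketch, not necessarily the paper's exact method): per-modality feature vectors for the image, description, and title are concatenated before a shared classifier head.

```python
import numpy as np

# Hedged sketch of late-fusion multi-modal classification: concatenate
# per-modality feature vectors (image, description, title) into one
# vector that a downstream classifier would consume.

def fuse(image_feat, desc_feat, title_feat):
    """Concatenate modality features into a single fused vector."""
    return np.concatenate([image_feat, desc_feat, title_feat])

# Hypothetical feature dimensions for illustration.
fused = fuse(np.ones(4), np.zeros(3), np.ones(2))
print(fused.shape)  # (9,)
```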
3 code implementations • 5 Dec 2018 • Artit Wangperawong
Mathematical expressions were generated, evaluated and used to train neural network models based on the transformer architecture.
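The data-generation step can be sketched as follows (a minimal stand-in, not the paper's exact generator): random arithmetic expressions are produced, evaluated, and emitted as (expression, answer) string pairs suitable as source/target sequences for a transformer.

```python
import random

# Minimal sketch of the training-data pipeline: generate random
# arithmetic expressions, evaluate them, and keep (expression, answer)
# string pairs as source/target sequences.

def random_expression(depth=2, rng=random):
    """Recursively build a fully parenthesized arithmetic expression."""
    if depth == 0:
        return str(rng.randint(0, 9))
    op = rng.choice(["+", "-", "*"])
    left = random_expression(depth - 1, rng)
    right = random_expression(depth - 1, rng)
    return f"({left}{op}{right})"

def make_pairs(n, seed=0):
    """Return n (expression, answer) pairs from a seeded generator."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        expr = random_expression(rng=rng)
        pairs.append((expr, str(eval(expr))))  # safe: only digits, + - * ( )
    return pairs

for expr, answer in make_pairs(3):
    print(expr, "=", answer)
```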
no code implementations • 26 Jul 2018 • Xinyi Liu, Artit Wangperawong
With a similar model, the same dataset was used to generate investor recommendations for companies undergoing fundraising, achieving a highest prediction accuracy of 11.1%.
no code implementations • 9 Jan 2018 • Artit Wangperawong, Kettip Kriangchaivech, Austin Lanari, Supui Lam, Panthong Wangperawong
The model was used to match news articles with videos: the inputs were term vectors, and the activation functions were cosine similarity measures between the weighted structural components.
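The matching signal described above can be sketched as follows (an illustrative simplification; the variable names and example texts are ours): an article and a video's metadata are represented as term-count vectors and scored with cosine similarity.

```python
import math
from collections import Counter

# Illustrative sketch: term-count vectors for an article and a video's
# metadata, scored with cosine similarity.

def term_vector(text):
    """Bag-of-words term counts for a whitespace-tokenized text."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(u[t] * v[t] for t in u.keys() & v.keys())
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0

article = term_vector("central bank raises interest rates again")
video = term_vector("interest rates raised by central bank")
print(round(cosine(article, video), 3))
```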
no code implementations • 18 Apr 2016 • Artit Wangperawong, Cyrille Brun, Olav Laudy, Rujikorn Pavasuthipaisit
Customer temporal behavioral data was represented as images in order to perform churn prediction by leveraging deep learning architectures prominent in image classification.
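The representation can be sketched like this (the axes and event encoding here are invented for illustration): one customer's event counts are laid out as a 2D grid of event type by week, producing an image-like array that image-classification architectures can consume.

```python
import numpy as np

# Hedged sketch of the representation: lay out one customer's event
# counts as a 2D grid (event type x week), normalized to [0, 1] like
# pixel intensities, so a CNN can consume behavioral history.

def behavior_image(events, n_types, n_weeks):
    """events: iterable of (event_type, week) pairs -> float32 'image'."""
    img = np.zeros((n_types, n_weeks), dtype=np.float32)
    for etype, week in events:
        img[etype, week] += 1.0
    if img.max() > 0:
        img /= img.max()  # scale counts to [0, 1]
    return img

# Hypothetical event log: (event_type, week) pairs for one customer.
events = [(0, 0), (0, 1), (1, 1), (2, 3), (0, 1)]
img = behavior_image(events, n_types=3, n_weeks=4)
print(img.shape)  # (3, 4)
```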