no code implementations • 1 Feb 2024 • Jung-Mei Chu, Hao-Cheng Lo, Jieh Hsiang, Chun-Chieh Cho
These systems are designed to enhance the efficiency of patent attorneys in handling Office Action (OA) responses by collaborating with AI.
no code implementations • 19 Sep 2020 • Jieh-Sheng Lee, Jieh Hsiang
The steps of reranking are: (1) search for the most similar text in the training data of GPT-2 using a bag-of-words ranking approach (BM25), (2) convert the search results from text format to BERT embeddings, and (3) produce the final result by ranking the BERT embeddings according to their similarity with the patent text generated by GPT-2.
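The reranking pipeline above can be sketched in miniature. This is an illustrative, self-contained version: BM25 is implemented directly, and cosine similarity over plain vectors stands in for real BERT embeddings (the actual system would embed the BM25 hits with a BERT model). All function names and toy data here are assumptions, not the authors' code.

```python
import math
from collections import Counter

def bm25_scores(query_tokens, docs_tokens, k1=1.5, b=0.75):
    """Step (1): score each tokenized document against the query with BM25."""
    N = len(docs_tokens)
    avgdl = sum(len(d) for d in docs_tokens) / N
    # document frequency of each query term
    df = {t: sum(1 for d in docs_tokens if t in d) for t in set(query_tokens)}
    scores = []
    for doc in docs_tokens:
        tf = Counter(doc)
        s = 0.0
        for t in query_tokens:
            if df.get(t, 0) == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(doc) / avgdl))
        scores.append(s)
    return scores

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rerank(generated_vec, candidate_vecs, top_ids):
    """Steps (2)-(3): re-order BM25 hits by embedding similarity to the
    generated patent text (here plain vectors stand in for BERT embeddings)."""
    return sorted(top_ids, key=lambda i: cosine(generated_vec, candidate_vecs[i]),
                  reverse=True)
```

In the real pipeline the candidate vectors would come from a BERT encoder; the two-stage shape (cheap lexical retrieval, then semantic re-ranking) is the point of the sketch.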
no code implementations • 11 Jan 2020 • Jieh-Sheng Lee, Jieh Hsiang
PatentTransformer is our codename for patent text generation based on Transformer-based models.
no code implementations • 26 Aug 2019 • Jieh-Sheng Lee, Jieh Hsiang
Specifically, we fine-tune a pre-trained Google BERT model to measure the patent claim spans generated by a fine-tuned OpenAI GPT-2 model.
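The measure-then-filter loop this describes can be sketched with a stand-in scorer; the paper's actual measurer is a fine-tuned BERT classifier, which needs the trained checkpoint, so the toy scorer below is purely hypothetical.

```python
def filter_spans(spans, score_fn, threshold=0.5):
    """Keep generated claim spans whose quality score clears the threshold.

    In the paper's setup, score_fn would be a fine-tuned BERT classifier;
    here any callable returning a score in [0, 1] can stand in.
    """
    scored = [(s, score_fn(s)) for s in spans]
    return [(s, sc) for s, sc in scored if sc >= threshold]

# Hypothetical stand-in scorer: longer spans score higher, capped at 1.0.
toy_score = lambda s: min(len(s.split()) / 10.0, 1.0)
```

Swapping `toy_score` for a real classifier's probability output leaves the filtering logic unchanged.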
no code implementations • 1 Jul 2019 • Jieh-Sheng Lee, Jieh Hsiang
Our contributions include: (1) being the first to generate patent claims by machine and the first to apply GPT-2 to patent claim generation, (2) providing various experimental results for qualitative analysis and future research, (3) proposing a new sampling approach for text generation, and (4) building an e-mail bot for future researchers to explore the fine-tuned GPT-2 model further.
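The paper's specific sampling approach is detailed in the full text; as a reference point for what sampling means in this setting, here is a minimal top-k sampler, a standard decoding strategy for GPT-2-style models (this is a common baseline, not the authors' proposed method):

```python
import math
import random

def top_k_sample(logits, k, rng=None):
    """Sample one token id from the k highest-scoring logits,
    after softmax-renormalizing over just those k candidates."""
    rng = rng or random.Random()
    # indices of the k largest logits
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # subtract the max before exponentiating for numerical stability
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    return rng.choices(top, weights=weights, k=1)[0]
```

With k=1 this degenerates to greedy decoding; larger k trades determinism for diversity in the generated claim text.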
1 code implementation • 14 May 2019 • Jieh-Sheng Lee, Jieh Hsiang
In this work, we focus on fine-tuning a pre-trained BERT model and applying it to patent classification.
Ranked #1 on Multi-Label Text Classification on USPTO-3M
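Patent classification is multi-label (one document can carry several classification codes), so the decision rule differs from ordinary softmax classification: each label gets an independent sigmoid decision. A minimal sketch of that final step, with hypothetical logits standing in for the fine-tuned BERT model's output:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, labels, threshold=0.5):
    """Multi-label prediction: each label is an independent yes/no
    decision via a sigmoid, rather than a single softmax choice."""
    return [lab for lab, z in zip(labels, logits) if sigmoid(z) >= threshold]
```

In the actual system the logits come from the fine-tuned BERT head over the patent-class vocabulary; the label strings below are illustrative placeholders.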