Search Results for author: Jieh-Sheng Lee

Found 9 papers, 2 papers with code

LexGPT 0.1: pre-trained GPT-J models with Pile of Law

1 code implementation • 5 Jun 2023 • Jieh-Sheng Lee

By fine-tuning models with specialized data and without modifying any source code, legal professionals can create custom language models for downstream tasks with minimal effort and technical knowledge.
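
A minimal sketch of that workflow, using off-the-shelf Hugging Face components to fine-tune a pre-trained causal language model on a domain text file; the model ID, data path ("pile_of_law_subset.txt"), and hyperparameters are illustrative placeholders rather than the paper's exact setup:

```python
# Sketch: fine-tune a pre-trained causal LM on domain text without touching
# any model source code. Model ID, file path, and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "EleutherAI/gpt-j-6B"          # any smaller causal LM works for testing
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Plain-text legal corpus, one document per line (placeholder path).
dataset = load_dataset("text", data_files={"train": "pile_of_law_subset.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lexgpt-ft", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```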

The Effectiveness of Bidirectional Generative Patent Language Models

no code implementations • 4 Sep 2022 • Jieh-Sheng Lee

Since text generation is bidirectional, the calculation of autocomplete effectiveness can be bidirectional and start from anywhere in the text.

Text Generation
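
A loose illustration of that idea: evaluate autocomplete from an arbitrary split point in both directions. The forward and backward models and the score_completion helper are hypothetical stand-ins, not the paper's implementation; one possible score is the keystroke-savings ratio sketched further below.

```python
# Sketch: bidirectional autocomplete evaluation starting from an arbitrary
# split point. score_completion(model, prefix, target) is a hypothetical
# helper that scores how well `model` completes `target` given `prefix`.

def bidirectional_effectiveness(tokens, split, score_completion,
                                forward_model, backward_model):
    right = score_completion(forward_model,
                             prefix=tokens[:split], target=tokens[split:])
    # The backward direction works on reversed text with a reversed target.
    left = score_completion(backward_model,
                            prefix=tokens[split:][::-1], target=tokens[:split][::-1])
    return left, right
```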

Evaluating Generative Patent Language Models

no code implementations • 23 Jun 2022 • Jieh-Sheng Lee

The perspective is to measure the ratio of keystrokes that can be saved by autocompletion based on generative patent language models.

Language Modelling
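
One way to picture such a metric: count a single keystroke whenever the model's top suggestion matches the next token, and full typing otherwise. The sketch below is only an illustration; the exact definition used in the paper may differ.

```python
# Sketch of a keystroke-savings metric for autocompletion (illustrative only).
# Whenever the model's top suggestion matches the next token, the user
# "accepts" it with one keystroke instead of typing the token out.

def keystroke_savings(reference_tokens, suggest_next):
    """suggest_next(prefix_tokens) -> predicted next token (hypothetical callable)."""
    typed, baseline = 0, 0
    for i, token in enumerate(reference_tokens):
        baseline += len(token)                 # keystrokes without autocomplete
        if suggest_next(reference_tokens[:i]) == token:
            typed += 1                         # one keystroke to accept the suggestion
        else:
            typed += len(token)                # token typed manually
    return 1.0 - typed / baseline              # fraction of keystrokes saved

# Toy usage with a trivial "model" that always suggests the word "claim".
ratio = keystroke_savings("a patent claim comprising".split(), lambda prefix: "claim")
print(f"keystrokes saved: {ratio:.2%}")
```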

Prior Art Search and Reranking for Generated Patent Text

no code implementations • 19 Sep 2020 • Jieh-Sheng Lee, Jieh Hsiang

The steps of reranking are: (1) search the most similar text in the training data of GPT-2 by taking a bag-of-words ranking approach (BM25), (2) convert the search results in text format to BERT embeddings, and (3) provide the final result by ranking the BERT embeddings based on their similarity to the patent text generated by GPT-2.
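
A minimal sketch of those three steps, assuming rank_bm25 for the BM25 stage and Hugging Face transformers with mean pooling for the embedding stage; library choices and model names are assumptions, not necessarily the authors' implementation:

```python
import numpy as np
import torch
from rank_bm25 import BM25Okapi
from transformers import AutoModel, AutoTokenizer

corpus = ["a method for wireless charging ...", "a battery pack comprising ..."]  # training texts
generated = "a wireless charging apparatus comprising a coil ..."                 # GPT-2 output

# (1) BM25 retrieval of the most similar training texts.
bm25 = BM25Okapi([doc.split() for doc in corpus])
candidates = bm25.get_top_n(generated.split(), corpus, n=2)

# (2) Encode candidates and the generated text as BERT embeddings (mean pooling).
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    inputs = tok(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state
    return hidden.mean(dim=1).squeeze(0).numpy()

# (3) Rerank candidates by cosine similarity to the generated patent text.
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embed(generated)
reranked = sorted(candidates, key=lambda c: cosine(embed(c), query), reverse=True)
print(reranked[0])
```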

PatentTransformer-2: Controlling Patent Text Generation by Structural Metadata

no code implementations • 11 Jan 2020 • Jieh-Sheng Lee, Jieh Hsiang

PatentTransformer is our codename for patent text generation based on Transformer models.

Relation • Sentence +1

Measuring Patent Claim Generation by Span Relevancy

no code implementations • 26 Aug 2019 • Jieh-Sheng Lee, Jieh Hsiang

Specifically, we fine-tune a pre-trained Google BERT model to measure the relevancy of patent claim spans generated by a fine-tuned OpenAI GPT-2 model.

Language Modelling • Natural Language Inference +1
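
A rough sketch of that setup: score a generated claim span against its preceding context with a BERT sequence-pair classifier. The checkpoint name "span-relevancy-bert" stands in for a fine-tuned model and the label meaning is illustrative; the paper's actual training details may differ.

```python
# Sketch: score a generated claim span against its context with a BERT
# sequence-pair classifier. "span-relevancy-bert" is a placeholder for a
# fine-tuned checkpoint; labels and threshold are illustrative.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "span-relevancy-bert"   # hypothetical fine-tuned BERT checkpoint
tok = AutoTokenizer.from_pretrained(checkpoint)
clf = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

context = "1. A battery pack comprising:"
generated_span = "a plurality of cells connected in series;"

inputs = tok(context, generated_span, return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = torch.softmax(clf(**inputs).logits, dim=-1)
print(f"P(relevant span) = {probs[0, 1].item():.3f}")
```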

Patent Claim Generation by Fine-Tuning OpenAI GPT-2

no code implementations • 1 Jul 2019 • Jieh-Sheng Lee, Jieh Hsiang

Our contributions include: (1) being the first to generate patent claims by machines and being the first to apply GPT-2 to patent claim generation, (2) providing various experiment results for qualitative analysis and future research, (3) proposing a new sampling approach for text generation, and (4) building an e-mail bot for future researchers to explore the fine-tuned GPT-2 model further.

Text Generation
