7 Sep 2021 • David Noever, Ryerson Burdick
Applying a Generative Pre-trained Transformer (GPT-2) to learn text-archived game notation provides a model environment for exploring sparse-reward gameplay.
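The core idea, treating archived game notation as a text corpus a language model can consume, can be illustrated with a minimal sketch. The tokenizer, vocabulary builder, and example move sequences below (in standard Singmaster cube notation) are illustrative assumptions, not the paper's actual pipeline:

```python
# Illustrative sketch only: encode Rubik's Cube move sequences, archived as
# plain text, into integer token ids the way a language-model tokenizer would.
# Function names and example data are assumptions, not from the paper.

def tokenize_moves(solution: str) -> list[str]:
    """Split a space-separated move string into individual move tokens."""
    return solution.split()

def build_vocab(solutions: list[str]) -> dict[str, int]:
    """Assign each distinct move token an integer id in order of appearance."""
    vocab: dict[str, int] = {}
    for sol in solutions:
        for tok in tokenize_moves(sol):
            vocab.setdefault(tok, len(vocab))
    return vocab

# Two short example sequences in Singmaster notation (illustrative only).
corpus = ["R U R' U'", "F R U R' U' F'"]
vocab = build_vocab(corpus)
encoded = [[vocab[t] for t in tokenize_moves(s)] for s in corpus]
print(vocab)    # token -> id mapping
print(encoded)  # each solution as a sequence of ids
```

Once game records are serialized this way, training reduces to ordinary next-token prediction over move sequences, which is what lets a text model like GPT-2 be applied to gameplay at all.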
Rubik's Cube