Search Results for author: Andy Yang

Found 2 papers, 0 papers with code

Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers

no code implementations · 5 Apr 2024 · Andy Yang, David Chiang

Deriving formal bounds on the expressivity of transformers, as well as studying transformers that are constructed to implement known algorithms, are both effective methods for better understanding the computational power of transformers.

Masked Hard-Attention Transformers and Boolean RASP Recognize Exactly the Star-Free Languages

no code implementations · 21 Oct 2023 · Dana Angluin, David Chiang, Andy Yang

We consider transformer encoders with hard attention (in which all attention is focused on exactly one position) and strict future masking (in which each position only attends to positions strictly to its left), and prove that the class of languages recognized by these networks is exactly the star-free languages.
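The two attention constraints described above can be sketched in a few lines. This is an illustrative toy, not the paper's formal construction: each position's "attention" is just the argmax of its scores, restricted to positions strictly to its left, with position 0 attending to nothing (marked `-1` here). The function name and score layout are assumptions for the sketch.

```python
def masked_hard_attention(scores):
    # scores[i][j]: attention score of query position i for key position j.
    # Hard attention: each position attends to exactly one position (the argmax).
    # Strict future masking: position i may only attend to j < i, so position 0
    # has nothing to attend to and is marked -1.
    n = len(scores)
    attended = [-1] * n
    for i in range(1, n):
        # only positions strictly to the left of i are visible;
        # Python's max breaks ties toward the leftmost index.
        attended[i] = max(range(i), key=lambda j: scores[i][j])
    return attended
```

Note that with hard attention, tie-breaking (here, leftmost) is part of the model definition rather than a numerical detail, since all attention weight goes to the single selected position.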

