compact-multi-head-self-attention-pytorch

A PyTorch implementation of the Compact Multi-Head Self-Attention Mechanism from the paper: "Low Rank Factorization for Compact Multi-Head Self-Attention"
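Below is a minimal illustrative sketch (an assumption, not the repository's actual code or the paper's exact formulation) of one way to make multi-head self-attention more compact: the query/key/value projection matrices are each factorized into two low-rank linear maps (d_model → rank → d_model with rank ≪ d_model), cutting the projection parameter count while keeping the standard scaled dot-product attention computation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class LowRankMultiHeadSelfAttention(nn.Module):
    """Hypothetical sketch: multi-head self-attention with low-rank
    factorized Q/K/V projections (W ≈ U V, rank << d_model)."""

    def __init__(self, d_model: int, num_heads: int, rank: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        # Each full projection is replaced by a pair of thin linear layers.
        self.q_down = nn.Linear(d_model, rank, bias=False)
        self.q_up = nn.Linear(rank, d_model, bias=False)
        self.k_down = nn.Linear(d_model, rank, bias=False)
        self.k_up = nn.Linear(rank, d_model, bias=False)
        self.v_down = nn.Linear(d_model, rank, bias=False)
        self.v_up = nn.Linear(rank, d_model, bias=False)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, n, _ = x.shape
        q = self.q_up(self.q_down(x))
        k = self.k_up(self.k_down(x))
        v = self.v_up(self.v_down(x))

        def split_heads(t):
            # (batch, seq_len, d_model) -> (batch, heads, seq_len, head_dim)
            return t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split_heads(q), split_heads(k), split_heads(v)
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)


if __name__ == "__main__":
    layer = LowRankMultiHeadSelfAttention(d_model=256, num_heads=8, rank=32)
    x = torch.randn(2, 10, 256)
    print(layer(x).shape)  # torch.Size([2, 10, 256])
```

With rank=32 and d_model=256, each factorized projection uses 2·256·32 parameters instead of 256·256, roughly a 4x reduction per projection; the exact mechanism in the paper and this repository may differ.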

Stars: 16
Forks: 3
Language: Python
Last Updated: Aug 11, 2022
