- PyTorch Module of the kan_gpt
- Deployed to PyPI
- MIT License
- Test Cases to ensure forward-backward passes work as expected
- Training script
I am currently training it on the WebText dataset to compare it against the original GPT-2, but I'm running into out-of-memory issues. Perhaps the vocabulary size (50,257) is too large?
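As a rough sanity check on that hypothesis, here is a back-of-the-envelope estimate of how much memory the vocabulary-dependent layers alone consume. This sketch assumes GPT-2-small dimensions (embedding size 768) and an untied output head; the actual KAN-GPT configuration may differ.

```python
# Estimate the float32 memory footprint of the token-embedding matrix
# and the output projection (lm_head) for a GPT-2-small-sized model.
# n_embd = 768 is the GPT-2 small value (an assumption here).
vocab_size = 50257
n_embd = 768
bytes_per_param = 4  # float32

embedding_params = vocab_size * n_embd  # token-embedding matrix
lm_head_params = vocab_size * n_embd    # output projection, if untied

total_mb = (embedding_params + lm_head_params) * bytes_per_param / 1e6
print(f"embedding + lm_head: {total_mb:.1f} MB")  # ~308.8 MB
```

Note that during training with Adam, each parameter also carries a gradient plus two optimizer moments, roughly quadrupling this figure before activations are even counted, so the vocabulary size can plausibly dominate memory for small models.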
I'm open to contributions and would love to hear your thoughts!