OPT

Open Pre-trained Transformer.

https://github.com

About OPT

OPT is a suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters.
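The released checkpoints can be loaded through the Hugging Face `transformers` library, which hosts the OPT weights under the `facebook/` namespace. The sketch below is an assumption based on the public model hub naming scheme (e.g. `facebook/opt-125m`); note the 175B checkpoint is access-gated and not downloadable this way.

```python
# Sketch: loading an OPT checkpoint via Hugging Face `transformers`
# (assumed checkpoint names from the public model hub).

# Publicly listed OPT sizes on the model hub.
OPT_CHECKPOINTS = [
    "facebook/opt-125m",
    "facebook/opt-350m",
    "facebook/opt-1.3b",
    "facebook/opt-2.7b",
    "facebook/opt-6.7b",
    "facebook/opt-13b",
    "facebook/opt-30b",
    "facebook/opt-66b",
]

def load_opt(checkpoint: str = "facebook/opt-125m"):
    """Download and return (tokenizer, model) for one OPT checkpoint."""
    # Imported lazily so the sketch can be read without `transformers` installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(checkpoint)
    return tokenizer, model
```

The smallest checkpoint (`opt-125m`) is a practical default for experimentation; larger sizes need correspondingly more GPU memory.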
