Simple Transformers: Open-source for All
| Authors | |
|---|---|
| Publication date | 2024 |
| Book title | SIGIR-AP '24 |
| Book subtitle | Proceedings of the 2024 Annual International ACM SIGIR Conference on Research and Development in Information Retrieval in the Asia Pacific Region : December 9-12, 2024, Tokyo, Japan |
| ISBN (electronic) | |
| Event | SIGIR-AP 2024 |
| Pages (from-to) | 209-215 |
| Publisher | New York, NY: Association for Computing Machinery |
| Organisations | |
| Abstract | Language technology, particularly information retrieval, is poised to have a profound impact on society. We believe that technology with such far-reaching potential should be accessible to everyone, not just the technologically privileged. Therefore, we advocate for open-source for all, ensuring that individuals from diverse research areas, societal sectors, and backgrounds have access to information retrieval and language technology tools with low barriers to entry. In this paper, we describe Simple Transformers, a library created with these goals in mind. It is designed to simplify the training, evaluation, and usage of transformer models. As of 2024, the library has garnered over 4,000 stars on GitHub and has been downloaded over 3 million times. These metrics reflect its wide acceptance and usage across different sectors. We describe the design and implementation of the library and provide examples of its usage and adoption. Finally, we reflect on how Simple Transformers contributes to the goal of "open-source for all." |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1145/3673791.3698412 |
| Downloads | 3673791.3698412 (Final published version) |
| Permalink to this page | |
