---
language: en
datasets:
- c4
- wikipedia
license: apache-2.0
---
Google's T5 for Closed Book Question Answering.
The model was pre-trained using T5's denoising objective on C4, and then additionally pre-trained using REALM's salient span masking objective on Wikipedia.
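To illustrate how the salient span masking (SSM) objective differs from T5's random-span denoising: instead of masking arbitrary spans, SSM masks an entire "salient" span (a named entity or date) and asks the model to reconstruct it, forcing the model to store factual knowledge in its parameters. The sketch below is a toy illustration in T5's sentinel-token format; in REALM the salient span is chosen by a real NER/date tagger, which is hard-coded here.

```python
# Toy sketch of salient span masking (SSM). The span indices stand in for
# the output of a named-entity tagger, which this sketch does not include.
def salient_span_mask(tokens, span):
    """Replace tokens in [start, end) with a T5-style sentinel token.

    Returns (inputs, targets) in T5's span-corruption format: the input
    contains the sentinel in place of the span, and the target spells out
    the masked span between sentinels.
    """
    start, end = span
    inputs = tokens[:start] + ["<extra_id_0>"] + tokens[end:]
    targets = ["<extra_id_0>"] + tokens[start:end] + ["<extra_id_1>"]
    return inputs, targets

tokens = "Franklin D. Roosevelt was born in January 1882".split()
# Assume an entity tagger flagged "January 1882" (token indices 6..8) as salient.
inputs, targets = salient_span_mask(tokens, (6, 8))
print(" ".join(inputs))   # Franklin D. Roosevelt was born in <extra_id_0>
print(" ".join(targets))  # <extra_id_0> January 1882 <extra_id_1>
```

The model is trained to emit the target given the input, so answering the masked question requires recalling the fact from its parameters rather than copying it from context.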
Note: This model should be fine-tuned on a question answering downstream task before it is usable for closed book question answering.
Other Community Checkpoints: here
Paper: How Much Knowledge Can You Pack Into the Parameters of a Language Model?
Authors: Adam Roberts, Colin Raffel, Noam Shazeer
It has recently been observed that neural language models trained on unstructured text can implicitly store and retrieve knowledge using natural language queries. In this short paper, we measure the practical utility of this approach by fine-tuning pre-trained models to answer questions without access to any external context or knowledge. We show that this approach scales with model size and performs competitively with open-domain systems that explicitly retrieve answers from an external knowledge source when answering questions. To facilitate reproducibility and future work, we release our code and trained models at https://goo.gle/t5-cbqa.