---
language: en
datasets:
- c4
- wikipedia
license: apache-2.0
---

Google's T5 for Closed Book Question Answering.

The model was pre-trained using T5's denoising objective on C4, then further pre-trained using REALM's salient span masking objective on Wikipedia.
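To make the second objective concrete, here is a minimal sketch of salient span masking in T5's text-to-text format: salient spans (named entities and dates) are replaced with sentinel tokens in the input, and the target reconstructs them. The example sentence and hand-picked span offsets are illustrative; in the actual pipeline, spans come from a named-entity and date tagger.

```python
def mask_salient_spans(text, spans):
    """Replace each (start, end) span with a T5 sentinel token.

    Returns (masked_input, target) in T5's text-to-text format:
    the input contains <extra_id_i> sentinels where spans were removed,
    and the target lists each sentinel followed by the removed span.
    """
    spans = sorted(spans)
    input_parts, target_parts = [], []
    cursor = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        input_parts.append(text[cursor:start])  # unmasked text before the span
        input_parts.append(sentinel)            # sentinel replaces the span
        target_parts.append(f"{sentinel} {text[start:end]}")
        cursor = end
    input_parts.append(text[cursor:])           # trailing unmasked text
    target_parts.append(f"<extra_id_{len(spans)}>")  # closing sentinel
    return "".join(input_parts), " ".join(target_parts)

text = "Franklin D. Roosevelt was born in January 1882."
# Salient spans: the entity "Franklin D. Roosevelt" and the date "January 1882".
masked, target = mask_salient_spans(text, [(0, 21), (34, 46)])
print(masked)  # <extra_id_0> was born in <extra_id_1>.
print(target)  # <extra_id_0> Franklin D. Roosevelt <extra_id_1> January 1882 <extra_id_2>
```

Masking whole entity and date spans (rather than random tokens) forces the model to store the facts themselves, which is what makes the checkpoint useful for closed book QA after fine-tuning.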

Note: This model should be fine-tuned on a question answering downstream task before it is usable for closed book question answering.
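The upstream checkpoint can be loaded with the Hugging Face transformers library; a minimal sketch (the question and generation settings are illustrative, and since the model is not yet fine-tuned for QA, the raw generation reflects the pre-training objectives rather than a reliable answer):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# "google/t5-large-ssm" is the upstream Hugging Face model ID mirrored here.
tokenizer = AutoTokenizer.from_pretrained("google/t5-large-ssm")
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-large-ssm")

question = "When was Franklin D. Roosevelt born?"  # illustrative query
inputs = tokenizer(question, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=16)
answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(answer)
```

For actual closed book QA, fine-tune this checkpoint on question–answer pairs (e.g. with `Trainer` or a custom seq2seq training loop) before generating answers.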

Other Community Checkpoints: here

Paper: How Much Knowledge Can You Pack Into the Parameters of a Language Model?

Authors: Adam Roberts, Colin Raffel, Noam Shazeer

Abstract

It has recently been observed that neural language models trained on unstructured text can implicitly store and retrieve knowledge using natural language queries. In this short paper, we measure the practical utility of this approach by fine-tuning pre-trained models to answer questions without access to any external context or knowledge. We show that this approach scales with model size and performs competitively with open-domain systems that explicitly retrieve answers from an external knowledge source when answering questions. To facilitate reproducibility and future work, we release our code and trained models at https://goo.gle/t5-cbqa.

