This paper presents a unified framework for pre-training models that are universally effective across datasets and setups.