This paper presents a unified framework for pre-training models that are universally effective across datasets and setups.