Sebastian Ruder
Research Scientist, Cohere
An overview of gradient descent optimization algorithms
S Ruder
arXiv preprint arXiv:1609.04747, 2016
Universal Language Model Fine-tuning for Text Classification
J Howard*, S Ruder*
Proceedings of ACL 2018, 2018
An overview of multi-task learning in deep neural networks
S Ruder
arXiv preprint arXiv:1706.05098, 2017
NusaCrowd: Open Source Initiative for Indonesian NLP Resources
S Cahyawijaya, H Lovenia, AF Aji, GI Winata, B Wilie, R Mahendra, ...
Findings of ACL 2023, 2023
PaLM 2 technical report
R Anil, AM Dai, O Firat, M Johnson, D Lepikhin, A Passos, S Shakeri, ...
arXiv preprint arXiv:2305.10403, 2023
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization
J Hu*, S Ruder*, A Siddhant, G Neubig, O Firat, M Johnson
Proceedings of ICML 2020, 2020
A Survey of Cross-lingual Word Embedding Models
S Ruder, I Vulić, A Søgaard
Journal of Artificial Intelligence Research 65, 569-631, 2019
Transfer learning in natural language processing
S Ruder, ME Peters, S Swayamdipta, T Wolf
Proceedings of the 2019 conference of the North American chapter of the …, 2019
On the cross-lingual transferability of monolingual representations
M Artetxe, S Ruder, D Yogatama
Proceedings of ACL 2020, 2020
AdapterHub: A Framework for Adapting Transformers
J Pfeiffer, A Rücklé, C Poth, A Kamath, I Vulić, S Ruder, K Cho, I Gurevych
Proceedings of EMNLP 2020: System demonstrations, 2020
Long Range Arena: A Benchmark for Efficient Transformers
Y Tay, M Dehghani, S Abnar, Y Shen, D Bahri, P Pham, J Rao, L Yang, ...
Proceedings of ICLR 2021, 2021
MAD-X: An Adapter-based Framework for Multi-task Cross-lingual Transfer
J Pfeiffer, I Vulić, I Gurevych, S Ruder
Proceedings of EMNLP 2020, 2020
To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks
ME Peters*, S Ruder*, NA Smith
Proceedings of the 4th Workshop on Representation Learning for NLP, 2019
Neural Transfer Learning for Natural Language Processing
S Ruder
National University of Ireland, Galway, 2019
Compacter: Efficient Low-Rank Hypercomplex Adapter Layers
RK Mahabadi, J Henderson, S Ruder
Proceedings of NeurIPS 2021, 2021
A Hierarchical Model of Reviews for Aspect-based Sentiment Analysis
S Ruder, P Ghaffari, JG Breslin
Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016
Latent Multi-task Architecture Learning
S Ruder, J Bingel, I Augenstein, A Søgaard
Proceedings of AAAI 2019, 2019
On the Limitations of Unsupervised Bilingual Dictionary Induction
A Søgaard, S Ruder, I Vulić
Proceedings of ACL 2018, 2018
A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks
V Sanh, T Wolf, S Ruder
Proceedings of AAAI 2019, 2019