Llion Jones
Verified email at google.com
Title
Cited by
Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
63674  2017
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
1161  2019
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
515  2018
Attention is all you need. arXiv 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
455  2017
Prottrans: Toward understanding the language of life through self-supervised learning
A Elnaggar, M Heinzinger, C Dallago, G Rehawi, Y Wang, L Jones, ...
IEEE transactions on pattern analysis and machine intelligence 44 (10), 7112 …, 2021
427  2021
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
425  2018
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
329  2017
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI conference on artificial intelligence 33 (01), 3159-3166, 2019
300  2019
Attention is all you need (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2019
243  2019
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
157  2019
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
154  2017
Wikireading: A novel large-scale language understanding task over wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
146  2016
Attention is all you need
N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, ...
Advances in neural information processing systems 30, 6000-6010, 2017
91  2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ..., I Polosukhin
In: Advances in neural information processing …, 2017
53  2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, A Gomez, ...
NIPS, 2017
52  2017
Attention is all you need. CoRR
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
38  2017
Accurate supervised and semi-supervised machine reading for long documents
D Hewlett, L Jones, A Lacoste, I Gür
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
24  2017
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing
A Elnaggar, W Ding, L Jones, T Gibbs, T Feher, C Angerer, S Severini, ...
arXiv preprint arXiv:2104.02443, 2021
22  2021
DF-Conformer: Integrated architecture of Conv-TasNet and Conformer using linear complexity self-attention for speech enhancement
Y Koizumi, S Karita, S Wisdom, H Erdogan, JR Hershey, L Jones, ...
2021 IEEE Workshop on Applications of Signal Processing to Audio and …, 2021
21  2021
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,452,978, 2019
15  2019