Aditya Barua
Verified email at google.com
Title · Cited by · Year
mT5: A massively multilingual pre-trained text-to-text transformer
L Xue, N Constant, A Roberts, M Kale, R Al-Rfou, A Siddhant, A Barua, ...
arXiv preprint arXiv:2010.11934, 2020
Cited by 700
ByT5: Towards a token-free future with pre-trained byte-to-byte models
L Xue, A Barua, N Constant, R Al-Rfou, S Narang, M Kale, A Roberts, ...
Transactions of the Association for Computational Linguistics 10, 291-306, 2022
Cited by 116
LAReQA: Language-agnostic answer retrieval from a multilingual pool
U Roy, N Constant, R Al-Rfou, A Barua, A Phillips, Y Yang
arXiv preprint arXiv:2004.05484, 2020
Cited by 31
A finite volume method for stochastic integrate-and-fire models
F Marpeau, A Barua, K Josić
Journal of computational neuroscience 26, 445-457, 2009
Cited by 16
Finite volume and asymptotic methods for stochastic neuron models with correlated inputs
R Rosenbaum, F Marpeau, J Ma, A Barua, K Josić
Journal of Mathematical Biology 65, 1-34, 2012
Cited by 13
Overcoming catastrophic forgetting in zero-shot cross-lingual generation
T Vu, A Barua, B Lester, D Cer, M Iyyer, N Constant
arXiv preprint arXiv:2205.12647, 2022
Cited by 4
Rigorously collecting commonsense judgments for complex question-answer content
M Sameki, A Barua, P Paritosh
Third AAAI Conference on Human Computation and Crowdsourcing, 2016
Cited by 3
Using commonsense for deeper understanding of complex question answer content
A Barua, P Paritosh
WebQA, SIGIR 2015, 2015
Cited by 2
Coreferent Mention Detection using Deep Learning
A Barua, P Sharma, K Clark
Correlation transfer in neuronal populations
J Ma, K Josić, A Barua, R Rosenbaum, F Marpeau
Frontiers in Systems Neuroscience, 311