Jai Gupta
Verified email at google.com
Title · Cited by · Year
ExT5: Towards extreme multi-task scaling for transfer learning
V Aribandi, Y Tay, T Schuster, J Rao, HS Zheng, SV Mehta, H Zhuang, ...
arXiv preprint arXiv:2111.10952, 2021
Cited by 91 · 2021
Charformer: Fast character transformers via gradient-based subword tokenization
Y Tay, VQ Tran, S Ruder, J Gupta, HW Chung, D Bahri, Z Qin, ...
arXiv preprint arXiv:2106.12672, 2021
Cited by 65 · 2021
Are Pre-trained Convolutions Better than Pre-trained Transformers?
Y Tay, M Dehghani, J Gupta, D Bahri, V Aribandi, Z Qin, D Metzler
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
Cited by 54 · 2021
Transformer memory as a differentiable search index
Y Tay, V Tran, M Dehghani, J Ni, D Bahri, H Mehta, Z Qin, K Hui, Z Zhao, ...
Advances in Neural Information Processing Systems 35, 21831-21843, 2022
Cited by 53 · 2022
Recursive ant colony optimization for estimation of parameters of a function
DK Gupta, Y Arora, UK Singh, JP Gupta
2012 1st International Conference on Recent Advances in Information …, 2012
Cited by 29 · 2012
OmniNet: Omnidirectional representations from transformers
Y Tay, M Dehghani, V Aribandi, J Gupta, PM Pham, Z Qin, D Bahri, ...
International Conference on Machine Learning, 10193-10202, 2021
Cited by 24 · 2021
HyperPrompt: Prompt-based task-conditioning of transformers
Y He, S Zheng, Y Tay, J Gupta, Y Du, V Aribandi, Z Zhao, YG Li, Z Chen, ...
International Conference on Machine Learning, 8678-8690, 2022
Cited by 22 · 2022
Confident adaptive language modeling
T Schuster, A Fisch, J Gupta, M Dehghani, D Bahri, V Tran, Y Tay, ...
Advances in Neural Information Processing Systems 35, 17456-17472, 2022
Cited by 18 · 2022
A new generation of Perspective API: Efficient multilingual character-level transformers
A Lees, VQ Tran, Y Tay, J Sorensen, J Gupta, D Metzler, L Vasserman
arXiv preprint arXiv:2202.11176, 2022
Cited by 17 · 2022
Personalized online spell correction for personal search
J Gupta, Z Qin, M Bendersky, D Metzler
The World Wide Web Conference, 2785-2791, 2019
Cited by 13 · 2019
Recursive ant colony optimization: a new technique for the estimation of function parameters from geophysical field data
DK Gupta, JP Gupta, Y Arora, U Shankar
Near Surface Geophysics 11 (3), 325-340, 2013
Cited by 8 · 2013
ED2LM: Encoder-decoder to language model for faster document re-ranking inference
K Hui, H Zhuang, T Chen, Z Qin, J Lu, D Bahri, J Ma, JP Gupta, CN Santos, ...
arXiv preprint arXiv:2204.11458, 2022
Cited by 7 · 2022
DSI++: Updating Transformer Memory with New Documents
SV Mehta, J Gupta, Y Tay, M Dehghani, VQ Tran, J Rao, M Najork, ...
arXiv preprint arXiv:2212.09744, 2022
Cited by 3 · 2022
Google COVID-19 Vaccination Search Insights: Anonymization Process Description
A Boulanger, A Kumok, A Patankar, B Ghazi, B Miller, C Kamath, ...
Cited by 3* · 2021
Inversion of 1D VES data using a new technique called Recursive Ant Colony Optimization (RACO)
Y Arora, DK Gupta, JP Gupta, UK Singh
5th EAGE St. Petersburg International Conference and Exhibition on …, 2012
Cited by 1 · 2012
SLAM using relational trees and semantics
A Sarkar, R Reiger, S Roy, R Chaterjee, A Datta, JP Gupta, A Sowmyan
Advanced Materials Research 452, 648-653, 2012
Cited by 1 · 2012
How Does Generative Retrieval Scale to Millions of Passages?
R Pradeep, K Hui, J Gupta, AD Lelkes, H Zhuang, J Lin, D Metzler, ...
arXiv preprint arXiv:2305.11841, 2023
2023
Machine-Learned Attention Models Featuring Omnidirectional Processing
Y Tay, DC Juan, D Bahri, DA Metzler, JP Gupta, M Dehghani, P Pham, ...
US Patent App. 17/592,796, 2022
2022
Covid Vaccine Search Classification with Pretrained Transformers and Dense Feature Memory
C Kamath, D Metzler, E Gabrilovich, J Gupta, S Bavadekar, V Tran, Y Tay
2022