Mehdi Rezagholizadeh
Principal Research Scientist, Noah's Ark Lab, Huawei Technologies
Verified email at mail.mcgill.ca
Title · Cited by · Year
EditNTS: An neural programmer-interpreter model for sentence simplification through explicit editing
Y Dong, Z Li, M Rezagholizadeh, JCK Cheung
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
165 · 2019
A computer tracking system of solar dish with two-axis degree freedoms based on picture processing of bar shadow
H Arbab, B Jazi, M Rezagholizadeh
Renewable Energy 34 (4), 1114-1118, 2009
115 · 2009
Fully quantized transformer for machine translation
G Prato, E Charlaix, M Rezagholizadeh
arXiv preprint arXiv:1910.10485, 2019
85* · 2019
Alp-kd: Attention-based layer projection for knowledge distillation
P Passban, Y Wu, M Rezagholizadeh, Q Liu
Proceedings of the AAAI Conference on artificial intelligence 35 (15), 13657 …, 2021
82 · 2021
Semi-supervised regression with generative adversarial networks for end to end learning in autonomous driving
M Rezagholizadeh, MA Haidar
64* · 2018
Textkd-gan: Text generation using knowledge distillation and generative adversarial networks
MA Haidar, M Rezagholizadeh
Canadian Conference on Artificial Intelligence, 107-118, 2019
58 · 2019
Annealing knowledge distillation
A Jafari, M Rezagholizadeh, P Sharma, A Ghodsi
arXiv preprint arXiv:2104.07163, 2021
51 · 2021
Systems and methods for multilingual text generation field
M Rezagholizadeh, MA Haidar, A Do-Omri, A Rashid
US Patent 11,151,334, 2021
35* · 2021
Context-aware adversarial training for name regularity bias in named entity recognition
A Ghaddar, P Langlais, A Rashid, M Rezagholizadeh
Transactions of the Association for Computational Linguistics 9, 586-604, 2021
32 · 2021
A simplified fully quantized transformer for end-to-end speech recognition
A Bie, B Venkitesh, J Monteiro, MA Haidar, M Rezagholizadeh
arXiv preprint arXiv:1911.03604, 2019
31* · 2019
Mate-kd: Masked adversarial text, a companion to knowledge distillation
A Rashid, V Lioutas, M Rezagholizadeh
arXiv preprint arXiv:2105.05912, 2021
30 · 2021
Why skip if you can combine: A simple knowledge distillation technique for intermediate layers
Y Wu, P Passban, M Rezagholizadeh, Q Liu
arXiv preprint arXiv:2010.03034, 2020
28 · 2020
End-to-end self-debiasing framework for robust NLU training
A Ghaddar, P Langlais, M Rezagholizadeh, A Rashid
arXiv preprint arXiv:2109.02071, 2021
23 · 2021
KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation
M Tahaei, E Charlaix, V Nia, A Ghodsi, M Rezagholizadeh
Proceedings of the 2022 Conference of the North American Chapter of the …, 2022
22* · 2022
Making a MIRACL: Multilingual information retrieval across a continuum of languages
X Zhang, N Thakur, O Ogundepo, E Kamalloo, D Alfonso-Hermelo, X Li, ...
arXiv preprint arXiv:2210.09984, 2022
21 · 2022
A Retargeting Approach for Mesopic Vision: Simulation and Compensation
M Rezagholizadeh, T Akhavan, A Soudi, H Kaufmann, JJ Clark
Journal of Imaging Science and Technology 60 (1), 10410-1-10410, 2016
21* · 2016
Latent code and text-based generative adversarial networks for soft-text generation
M Haidar, M Rezagholizadeh, A Do-Omri, A Rashid
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
20 · 2019
Kronecker decomposition for gpt compression
A Edalati, M Tahaei, A Rashid, VP Nia, JJ Clark, M Rezagholizadeh
arXiv preprint arXiv:2110.08152, 2021
19 · 2021
Towards zero-shot knowledge distillation for natural language processing
A Rashid, V Lioutas, A Ghaddar, M Rezagholizadeh
arXiv preprint arXiv:2012.15495, 2020
19 · 2020
Salsa-text: self attentive latent space based adversarial text generation
J Gagnon-Marchand, H Sadeghi, MA Haidar, M Rezagholizadeh
Advances in Artificial Intelligence: 32nd Canadian Conference on Artificial …, 2019
18 · 2019