Character-Word LSTM Language Models. L Verwimp, J Pelemans, H Van hamme, P Wambacq. Proceedings of the European Chapter of the Association for Computational Linguistics (EACL), 2017. Cited by 69.
Definite il y a-clefts in spoken French. L Verwimp, K Lahousse. Journal of French Language Studies, 2016. Cited by 22.
Improving the translation environment for professional translators. V Vandeghinste, T Vanallemeersch, L Augustinus, B Bulté, F Van Eynde, et al. Informatics 6 (2), 24, 2019. Cited by 12.
Error-driven pruning of language models for virtual assistants. S Gondala, L Verwimp, E Pusateri, M Tsagkias, C Van Gysel. ICASSP 2021, IEEE International Conference on Acoustics, Speech and Signal Processing, 2021. Cited by 11.
A comparison of different punctuation prediction approaches in a translation context. V Vandeghinste, L Verwimp, J Pelemans, P Wambacq. European Association for Machine Translation, 2018. Cited by 11.
Analyzing the Contribution of Top-Down Lexical and Bottom-Up Acoustic Cues in the Detection of Sentence Prominence. S Kakouros, J Pelemans, L Verwimp, P Wambacq, O Räsänen. Interspeech 2016, 1074-1078. Cited by 11.
State Gradients for Analyzing Memory in LSTM Language Models. L Verwimp, H Van hamme, P Wambacq. Computer Speech & Language, article 101034, 2019. Cited by 9.
TF-LM: TensorFlow-based Language Modeling Toolkit. L Verwimp, H Van hamme, P Wambacq. Proceedings of the Language Resources and Evaluation Conference (LREC), 2018. Cited by 9.
Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models? L Verwimp, JR Bellegarda. Interspeech 2019. Cited by 5.
Optimizing bilingual neural transducer with synthetic code-switching text generation. T Nguyen, N Tran, L Deng, TF da Silva, M Radzihovsky, R Hsiao, et al. arXiv preprint arXiv:2210.12214, 2022. Cited by 4.
State Gradients for RNN Memory Analysis. L Verwimp, H Van hamme, V Renkens, P Wambacq. Interspeech 2018, 1467-1471. Cited by 4.
Language model adaptation for ASR of spoken translations using phrase-based translation models and named entity models. J Pelemans, T Vanallemeersch, K Demuynck, L Verwimp, P Wambacq. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016. Cited by 4.
STON: Efficient Subtitling in Dutch Using State-of-the-Art Tools. L Verwimp, B Desplanques, K Demuynck, J Pelemans, M Lycke, et al. Interspeech 2016, 780-781. Cited by 4.
Information-Weighted Neural Cache Language Models for ASR. L Verwimp, J Pelemans, H Van hamme, P Wambacq. IEEE Workshop on Spoken Language Technology (SLT), 2018. Cited by 3.
Domain adaptation for LSTM language models. W Boes, R Van Rompaey, J Pelemans, L Verwimp, P Wambacq. Book of Abstracts CLIN27, p. 57, 2017. Cited by 3.
Expanding n-gram training data for language models based on morpho-syntactic transformations. L Verwimp, J Pelemans, H Van hamme, P Wambacq. Computational Linguistics in the Netherlands Journal 5, 49-64, 2015. Cited by 3.
Application-agnostic language modeling for on-device ASR. M Nußbaum-Thom, L Verwimp, Y Oualil. arXiv preprint arXiv:2305.09764, 2023. Cited by 2.
Smart Computer-Aided Translation Environment (SCATE): Highlights. V Vandeghinste, T Vanallemeersch, B Bulté, L Augustinus, F Van Eynde, et al. 2018. Cited by 2.
Language Models of Spoken Dutch. L Verwimp, J Pelemans, M Lycke, H Van hamme, P Wambacq. arXiv preprint arXiv:1709.03759, 2017. Cited by 1.