Sheng Zhang
Microsoft Research
BioGPT: generative pre-trained transformer for biomedical text generation and mining
R Luo, L Sun, Y Xia, T Qin, S Zhang, H Poon, TY Liu
Briefings in bioinformatics 23 (6), bbac409, 2022
ReCoRD: Bridging the gap between human and machine commonsense reading comprehension
S Zhang, X Liu, J Liu, J Gao, K Duh, B Van Durme
arXiv preprint arXiv:1810.12885, 2018
LLaVA-Med: Training a large language-and-vision assistant for biomedicine in one day
C Li, C Wong, S Zhang, N Usuyama, H Liu, J Yang, T Naumann, H Poon, ...
arXiv preprint arXiv:2306.00890, 2023
AMR parsing as sequence-to-graph transduction
S Zhang, X Ma, K Duh, B Van Durme
arXiv preprint arXiv:1905.08704, 2019
Universal decompositional semantics on universal dependencies
AS White, D Reisinger, K Sakaguchi, T Vieira, S Zhang, R Rudinger, ...
Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016
Deep generalized canonical correlation analysis
A Benton, H Khayrallah, B Gujral, DA Reisinger, S Zhang, R Arora
arXiv preprint arXiv:1702.02519, 2017
Ordinal common-sense inference
S Zhang, R Rudinger, K Duh, B Van Durme
Transactions of the Association for Computational Linguistics, 2017
Can generalist foundation models outcompete special-purpose tuning? case study in medicine
H Nori, YT Lee, S Zhang, D Carignan, R Edgar, N Fusi, N King, J Larson, ...
arXiv preprint arXiv:2311.16452, 2023
Answering natural language questions via phrasal semantic parsing
K Xu, S Zhang, Y Feng, D Zhao
CCF International Conference on Natural Language Processing and Chinese …, 2014
BiomedCLIP: a multimodal biomedical foundation model pretrained from fifteen million scientific image-text pairs
S Zhang, Y Xu, N Usuyama, H Xu, J Bagga, R Tinn, S Preston, R Rao, ...
arXiv preprint arXiv:2303.00915, 2023
Broad-coverage semantic parsing as transduction
S Zhang, X Ma, K Duh, B Van Durme
arXiv preprint arXiv:1909.02607, 2019
Context-faithful prompting for large language models
W Zhou, S Zhang, H Poon, M Chen
arXiv preprint arXiv:2303.11315, 2023
UniversalNER: Targeted distillation from large language models for open named entity recognition
W Zhou, S Zhang, Y Gu, M Chen, H Poon
arXiv preprint arXiv:2308.03279, 2023
An Evaluation of PredPatt and Open IE via Stage 1 Semantic Role Labeling
S Zhang, R Rudinger, B Van Durme
IWCS 2017—12th International Conference on Computational Semantics—Short …, 2017
MT/IE: Cross-lingual open information extraction with neural sequence-to-sequence models
S Zhang, K Duh, B Van Durme
Proceedings of the 15th Conference of the European Chapter of the …, 2017
Optimizing bi-encoder for named entity recognition via contrastive learning
S Zhang, H Cheng, J Gao, H Poon
arXiv preprint arXiv:2208.14565, 2022
Knowledge-rich self-supervision for biomedical entity linking
S Zhang, H Cheng, S Vashishth, C Wong, J Xiao, X Liu, T Naumann, ...
arXiv preprint arXiv:2112.07887, 2021
The universal decompositional semantics dataset and decomp toolkit
AS White, E Stengel-Eskin, S Vashishtha, V Govindarajan, DA Reisinger, ...
arXiv preprint arXiv:1909.13851, 2019
Neural-Davidsonian semantic proto-role labeling
R Rudinger, A Teichert, R Culkin, S Zhang, B Van Durme
arXiv preprint arXiv:1804.07976, 2018
Cross-lingual decompositional semantic parsing
S Zhang, X Ma, R Rudinger, K Duh, B Van Durme
Proceedings of the 2018 Conference on Empirical Methods in Natural Language …, 2018