Shanda Li
Verified email at cs.cmu.edu - Homepage
Title · Cited by · Year
Is $L^2$ Physics Informed Loss Always Suitable for Training Physics Informed Neural Network?
C Wang, S Li, D He, L Wang
Advances in Neural Information Processing Systems 35, 8278-8290, 2022
Cited by 72 · 2022
Your transformer may not be as powerful as you expect
S Luo, S Li, S Zheng, TY Liu, L Wang, D He
Advances in Neural Information Processing Systems 35, 4301-4315, 2022
Cited by 49 · 2022
Stable, fast and accurate: Kernelized attention with relative positional encoding
S Luo, S Li, T Cai, D He, D Peng, S Zheng, G Ke, L Wang, TY Liu
NeurIPS 2021
Cited by 49 · 2021
Functional Interpolation for Relative Positions Improves Long Context Transformers
S Li, C You, G Guruganesh, J Ainslie, S Ontanon, M Zaheer, S Sanghai, ...
arXiv preprint arXiv:2310.04418, 2023
Cited by 30 · 2023
Learning physics-informed neural networks without stacked back-propagation
D He, S Li, W Shi, X Gao, J Zhang, J Bian, L Wang, TY Liu
International Conference on Artificial Intelligence and Statistics, 3034-3047, 2023
Cited by 23 · 2023
Can vision transformers perform convolution?
S Li, X Chen, D He, CJ Hsieh
arXiv preprint arXiv:2111.01353, 2021
Cited by 21 · 2021
Inference scaling laws: An empirical analysis of compute-optimal inference for problem-solving with language models
Y Wu, Z Sun, S Li, S Welleck, Y Yang
arXiv preprint arXiv:2408.00724, 2024
Cited by 19* · 2024
Learning a fourier transform for linear relative positional encodings in transformers
K Choromanski, S Li, V Likhosherstov, KA Dubey, S Luo, D He, Y Yang, ...
International Conference on Artificial Intelligence and Statistics, 2278-2286, 2024
Cited by 6 · 2024