Xiaodong Liu
Microsoft Research, Redmond
Verified email at microsoft.com
Title · Cited by · Year
MS MARCO: A human generated machine reading comprehension dataset
T Nguyen, M Rosenberg, X Song, J Gao, S Tiwary, R Majumder, L Deng
CoCo@NIPS, 2016
Cited by: 496 · Year: 2016
Multi-task deep neural networks for natural language understanding
X Liu, P He, W Chen, J Gao
arXiv preprint arXiv:1901.11504, 2019
Cited by: 439 · Year: 2019
On the variance of the adaptive learning rate and beyond
L Liu, H Jiang, P He, W Chen, X Liu, J Gao, J Han
arXiv preprint arXiv:1908.03265, 2019
Cited by: 417 · Year: 2019
Unified Language Model Pre-training for Natural Language Understanding and Generation
L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon
arXiv preprint arXiv:1905.03197, 2019
Cited by: 338 · Year: 2019
Representation learning using multi-task deep neural networks for semantic classification and information retrieval
X Liu, J Gao, X He, L Deng, K Duh, YY Wang
Cited by: 319 · Year: 2015
Stochastic answer networks for machine reading comprehension
X Liu, Y Shen, K Duh, J Gao
arXiv preprint arXiv:1712.03556, 2017
Cited by: 146 · Year: 2017
MS MARCO: A human generated machine reading comprehension dataset
P Bajaj, D Campos, N Craswell, L Deng, J Gao, X Liu, R Majumder, ...
arXiv preprint arXiv:1611.09268, 2016
Cited by: 137 · Year: 2016
Cyclical annealing schedule: A simple approach to mitigating KL vanishing
H Fu, C Li, X Liu, J Gao, A Celikyilmaz, L Carin
arXiv preprint arXiv:1903.10145, 2019
Cited by: 79 · Year: 2019
SMART: Robust and efficient fine-tuning for pre-trained natural language models through principled regularized optimization
H Jiang, P He, W Chen, X Liu, J Gao, T Zhao
arXiv preprint arXiv:1911.03437, 2019
Cited by: 77* · Year: 2019
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding
X Liu, P He, W Chen, J Gao
arXiv preprint arXiv:1904.09482, 2019
Cited by: 76 · Year: 2019
ReCoRD: Bridging the gap between human and machine commonsense reading comprehension
S Zhang, X Liu, J Liu, J Gao, K Duh, B Van Durme
arXiv preprint arXiv:1810.12885, 2018
Cited by: 63 · Year: 2018
RAT-SQL: Relation-aware schema encoding and linking for text-to-SQL parsers
B Wang, R Shin, X Liu, O Polozov, M Richardson
arXiv preprint arXiv:1911.04942, 2019
Cited by: 50 · Year: 2019
UniLMv2: Pseudo-masked language models for unified language model pre-training
H Bao, L Dong, F Wei, W Wang, N Yang, X Liu, Y Wang, J Gao, S Piao, ...
International Conference on Machine Learning, 642-652, 2020
Cited by: 44 · Year: 2020
Conversing by reading: Contentful neural conversation with on-demand machine reading
L Qin, M Galley, C Brockett, X Liu, X Gao, B Dolan, Y Choi, J Gao
arXiv preprint arXiv:1906.02738, 2019
Cited by: 42 · Year: 2019
Stochastic answer networks for natural language inference
X Liu, K Duh, J Gao
arXiv preprint arXiv:1804.07888, 2018
Cited by: 34 · Year: 2018
Language-based image editing with recurrent attentive models
J Chen, Y Shen, J Gao, J Liu, X Liu
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2018
Cited by: 34 · Year: 2018
Towards human-level machine reading comprehension: Reasoning and inference with multiple strategies
Y Xu, J Liu, J Gao, Y Shen, X Liu
arXiv preprint arXiv:1711.04964, 2017
Cited by: 34* · Year: 2017
Multi-task learning with sample re-weighting for machine reading comprehension
Y Xu, X Liu, Y Shen, J Liu, J Gao
arXiv preprint arXiv:1809.06963, 2018
Cited by: 32 · Year: 2018
Representation learning using multi-task deep neural networks
J Gao, L Deng, X He, Y Wang, K Duh, X Liu
US Patent 10,089,576, 2018
Cited by: 31 · Year: 2018
A hybrid retrieval-generation neural conversation model
L Yang, J Hu, M Qiu, C Qu, J Gao, WB Croft, X Liu, Y Shen, J Liu
Proceedings of the 28th ACM international conference on information and …, 2019
Cited by: 28 · Year: 2019
Articles 1–20