Graham Neubig
Carnegie Mellon University, Inspired Cognition
Verified email at cs.cmu.edu - Homepage
Title
Cited by
Year
Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing
P Liu, W Yuan, J Fu, Z Jiang, H Hayashi, G Neubig
ACM Computing Surveys, 2021
896 · 2021
Are Sixteen Heads Really Better than One?
P Michel, O Levy, G Neubig
NeurIPS 2019, 2019
586 · 2019
A Syntactic Neural Model for General-Purpose Code Generation
P Yin, G Neubig
ACL 2017, 2017
584 · 2017
XTREME: A massively multilingual multi-task benchmark for evaluating cross-lingual generalization
J Hu, S Ruder, A Siddhant, G Neubig, O Firat, M Johnson
ICML 2020, 2020
565 · 2020
How can we know what language models know?
Z Jiang, FF Xu, J Araki, G Neubig
TACL 8, 423-438, 2020
516 · 2020
Dynet: The dynamic neural network toolkit
G Neubig, C Dyer, Y Goldberg, A Matthews, W Ammar, A Anastasopoulos, ...
arXiv preprint arXiv:1701.03980, 2017
422* · 2017
Pointwise prediction for robust, adaptable Japanese morphological analysis
G Neubig, Y Nakata, S Mori
ACL 2011, 529-533, 2011
319 · 2011
When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?
Y Qi, DS Sachan, M Felix, SJ Padmanabhan, G Neubig
NAACL 2018, 2018
304 · 2018
Learning to generate pseudo-code from source code using statistical machine translation (t)
Y Oda, H Fudaba, G Neubig, H Hata, S Sakti, T Toda, S Nakamura
ASE 2015, 574-584, 2015
290 · 2015
TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data
P Yin, G Neubig, W Yih, S Riedel
ACL 2020, 2020
277 · 2020
Stress Test Evaluation for Natural Language Inference
A Naik, A Ravichander, N Sadeh, C Rose, G Neubig
COLING 2018, 2018
264 · 2018
Lagging Inference Networks and Posterior Collapse in Variational Autoencoders
J He, D Spokoyny, G Neubig, T Berg-Kirkpatrick
ICLR 2019, 2019
263 · 2019
Controllable Invariance through Adversarial Feature Learning
Q Xie, Z Dai, Y Du, E Hovy, G Neubig
NIPS 2017, 2017
247 · 2017
Controlling output length in neural encoder-decoders
Y Kikuchi, G Neubig, R Sasano, H Takamura, M Okumura
EMNLP 2016, 2016
219 · 2016
Competence-based Curriculum Learning for Neural Machine Translation
EA Platanios, O Stretcu, G Neubig, B Poczos, TM Mitchell
NAACL 2019, 2019
218 · 2019
Neural machine translation and sequence-to-sequence models: A tutorial
G Neubig
arXiv preprint arXiv:1703.01619, 2017
211 · 2017
Incorporating discrete translation lexicons into neural machine translation
P Arthur, G Neubig, S Nakamura
EMNLP 2016, 2016
201 · 2016
Weight Poisoning Attacks on Pre-trained Models
K Kurita, P Michel, G Neubig
ACL 2020, 2020
186 · 2020
Stack-Pointer Networks for Dependency Parsing
X Ma, Z Hu, J Liu, N Peng, G Neubig, E Hovy
ACL 2018, 2018
182 · 2018
Learning to translate in real-time with neural machine translation
J Gu, G Neubig, K Cho, VOK Li
EACL 2017, 2016
180 · 2016
Articles 1–20