Graham Neubig
Associate Professor of Computer Science, Carnegie Mellon University
Verified email at cs.cmu.edu - Homepage
Title / Cited by / Year
Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing
P Liu, W Yuan, J Fu, Z Jiang, H Hayashi, G Neubig
ACM Computing Surveys, 2021
Cited by 562 (2021)
A Syntactic Neural Model for General-Purpose Code Generation
P Yin, G Neubig
ACL 2017, 2017
Cited by 538 (2017)
Are Sixteen Heads Really Better than One?
P Michel, O Levy, G Neubig
NeurIPS 2019, 2019
Cited by 503 (2019)
XTREME: A massively multilingual multi-task benchmark for evaluating cross-lingual generalization
J Hu, S Ruder, A Siddhant, G Neubig, O Firat, M Johnson
ICML 2020, 2020
Cited by 491 (2020)
Dynet: The dynamic neural network toolkit
G Neubig, C Dyer, Y Goldberg, A Matthews, W Ammar, A Anastasopoulos, ...
arXiv preprint arXiv:1701.03980, 2017
Cited by 413* (2017)
How can we know what language models know?
Z Jiang, FF Xu, J Araki, G Neubig
TACL 8, 423-438, 2020
Cited by 403 (2020)
Pointwise prediction for robust, adaptable Japanese morphological analysis
G Neubig, Y Nakata, S Mori
ACL 2011, 529-533, 2011
Cited by 312 (2011)
When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?
Y Qi, DS Sachan, M Felix, SJ Padmanabhan, G Neubig
NAACL 2018, 2018
Cited by 284 (2018)
Learning to generate pseudo-code from source code using statistical machine translation (t)
Y Oda, H Fudaba, G Neubig, H Hata, S Sakti, T Toda, S Nakamura
ASE 2015, 574-584, 2015
Cited by 266 (2015)
Lagging Inference Networks and Posterior Collapse in Variational Autoencoders
J He, D Spokoyny, G Neubig, T Berg-Kirkpatrick
ICLR 2019, 2019
Cited by 250 (2019)
Stress Test Evaluation for Natural Language Inference
A Naik, A Ravichander, N Sadeh, C Rose, G Neubig
COLING 2018, 2018
Cited by 240 (2018)
Controllable Invariance through Adversarial Feature Learning
Q Xie, Z Dai, Y Du, E Hovy, G Neubig
NIPS 2017, 2017
Cited by 234 (2017)
TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data
P Yin, G Neubig, W Yih, S Riedel
ACL 2020, 2020
Cited by 231 (2020)
Controlling output length in neural encoder-decoders
Y Kikuchi, G Neubig, R Sasano, H Takamura, M Okumura
EMNLP 2016, 2016
Cited by 205 (2016)
Neural machine translation and sequence-to-sequence models: A tutorial
G Neubig
arXiv preprint arXiv:1703.01619, 2017
Cited by 199 (2017)
Incorporating discrete translation lexicons into neural machine translation
P Arthur, G Neubig, S Nakamura
EMNLP 2016, 2016
Cited by 193 (2016)
Competence-based Curriculum Learning for Neural Machine Translation
EA Platanios, O Stretcu, G Neubig, B Poczos, TM Mitchell
NAACL 2019, 2019
Cited by 184 (2019)
Stack-Pointer Networks for Dependency Parsing
X Ma, Z Hu, J Liu, N Peng, G Neubig, E Hovy
ACL 2018, 2018
Cited by 170 (2018)
Learning to translate in real-time with neural machine translation
J Gu, G Neubig, K Cho, VOK Li
EACL 2017, 2016
Cited by 169 (2016)
SwitchOut: an Efficient Data Augmentation Algorithm for Neural Machine Translation
X Wang, H Pham, Z Dai, G Neubig
EMNLP 2018, 2018
Cited by 155 (2018)
Articles 1–20