Marjan Ghazvininejad
Research Scientist, FAIR (Facebook AI Research)
Verified email at fb.com
Title · Cited by · Year
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Cited by 11426 · 2019
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, ...
arXiv preprint arXiv:2001.08210, 2020
Cited by 1870 · 2020
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
Cited by 662 · 2018
Mask-predict: Parallel decoding of conditional masked language models
M Ghazvininejad, O Levy, Y Liu, L Zettlemoyer
arXiv preprint arXiv:1904.09324, 2019
Cited by 603 · 2019
Generating Topical Poetry
M Ghazvininejad, X Shi, Y Choi, K Knight
Empirical Methods in Natural Language Processing (EMNLP), 2016
Cited by 200 · 2016
Detecting hallucinated content in conditional neural sequence generation
C Zhou, G Neubig, J Gu, M Diab, P Guzman, L Zettlemoyer, ...
arXiv preprint arXiv:2011.02593, 2020
Cited by 196 · 2020
Hafez: An Interactive Poetry Generation System
M Ghazvininejad, X Shi, J Priyadarshi, K Knight
Proceedings of the ACL Demo Track, 2017
Cited by 195 · 2017
Towards controllable story generation
N Peng, M Ghazvininejad, J May, K Knight
Proceedings of the First Workshop on Storytelling, 43-49, 2018
Cited by 180 · 2018
In-context examples selection for machine translation
S Agrawal, C Zhou, M Lewis, L Zettlemoyer, M Ghazvininejad
arXiv preprint arXiv:2212.02437, 2022
Cited by 178 · 2022
A review on language models as knowledge bases
B AlKhamissi, M Li, A Celikyilmaz, M Diab, M Ghazvininejad
arXiv preprint arXiv:2204.06031, 2022
Cited by 168 · 2022
Pre-training via paraphrasing
M Lewis, M Ghazvininejad, G Ghosh, A Aghajanyan, S Wang, ...
Advances in Neural Information Processing Systems 33, 18470-18481, 2020
Cited by 164 · 2020
DeLighT: Deep and light-weight transformer
S Mehta, M Ghazvininejad, S Iyer, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2008.00623, 2020
Cited by 137 · 2020
Non-autoregressive machine translation with disentangled context transformer
J Kasai, J Cross, M Ghazvininejad, J Gu
International Conference on Machine Learning, 5144-5155, 2020
Cited by 122* · 2020
Aligned cross entropy for non-autoregressive machine translation
M Ghazvininejad, V Karpukhin, L Zettlemoyer, O Levy
International Conference on Machine Learning, 3515-3523, 2020
Cited by 112 · 2020
Training on synthetic noise improves robustness to natural noise in machine translation
V Karpukhin, O Levy, J Eisenstein, M Ghazvininejad
Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), 42-47, 2019
Cited by 112 · 2019
Improving zero and few-shot abstractive summarization with intermediate fine-tuning and data augmentation
AR Fabbri, S Han, H Li, H Li, M Ghazvininejad, S Joty, D Radev, ...
arXiv preprint arXiv:2010.12836, 2020
Cited by 109 · 2020
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension, 2019
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Cited by 98 · 2019
Natural language to code translation with execution
F Shi, D Fried, M Ghazvininejad, L Zettlemoyer, SI Wang
arXiv preprint arXiv:2204.11454, 2022
Cited by 97 · 2022
Prompting contrastive explanations for commonsense reasoning tasks
B Paranjape, J Michael, M Ghazvininejad, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2106.06823, 2021
Cited by 75 · 2021
Semi-autoregressive training improves mask-predict decoding
M Ghazvininejad, O Levy, L Zettlemoyer
arXiv preprint arXiv:2001.08785, 2020
Cited by 66 · 2020
Articles 1–20