A survey on in-context learning Q Dong, L Li, D Dai, C Zheng, J Ma, R Li, H Xia, J Xu, Z Wu, T Liu, ... arXiv preprint arXiv:2301.00234, 2022 | 1221 | 2022 |
Gated Self-Matching Networks for Reading Comprehension and Question Answering W Wang, N Yang, F Wei, B Chang, M Zhou Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics …, 2017 | 858 | 2017 |
Knowledge neurons in pretrained transformers D Dai, L Dong, Y Hao, Z Sui, B Chang, F Wei arXiv preprint arXiv:2104.08696, 2021 | 481 | 2021 |
Table-to-text generation by structure-aware seq2seq learning T Liu, K Wang, L Sha, B Chang, Z Sui Proceedings of the AAAI conference on artificial intelligence 32 (1), 2018 | 306 | 2018 |
Jointly extracting event triggers and arguments by dependency-bridge RNN and tensor-based argument interaction L Sha, F Qian, B Chang, Z Sui Proceedings of the AAAI conference on artificial intelligence 32 (1), 2018 | 292 | 2018 |
Double Graph Based Reasoning for Document-level Relation Extraction S Zeng, R Xu, B Chang, L Li The 2020 Conference on Empirical Methods in Natural Language Processing …, 2020 | 248 | 2020 |
Max-margin tensor neural network for Chinese word segmentation W Pei, T Ge, B Chang Proceedings of the 52nd Annual Meeting of the Association for Computational …, 2014 | 205 | 2014 |
CBLUE: A Chinese biomedical language understanding evaluation benchmark N Zhang, M Chen, Z Bi, X Liang, L Li, X Shang, K Yin, C Tan, J Xu, ... arXiv preprint arXiv:2106.08087, 2021 | 191 | 2021 |
A dual reinforcement learning framework for unsupervised text style transfer F Luo, P Li, J Zhou, P Yang, B Chang, Z Sui, X Sun arXiv preprint arXiv:1905.10060, 2019 | 190 | 2019 |
Towards time-aware knowledge graph completion T Jiang, T Liu, T Ge, L Sha, B Chang, S Li, Z Sui Proceedings of COLING 2016, the 26th International Conference on …, 2016 | 186 | 2016 |
Graph-based Dependency Parsing with Bidirectional LSTM W Wang, B Chang Proceedings of ACL, 2016 | 185 | 2016 |
Raise a child in large language model: Towards effective and generalizable fine-tuning R Xu, F Luo, Z Zhang, C Tan, B Chang, S Huang, F Huang arXiv preprint arXiv:2109.05687, 2021 | 170 | 2021 |
A soft-label method for noise-tolerant distantly supervised relation extraction T Liu, K Wang, B Chang, Z Sui Proceedings of the 2017 conference on empirical methods in natural language …, 2017 | 163 | 2017 |
Encoding temporal information for time-aware link prediction T Jiang, T Liu, T Ge, L Sha, S Li, B Chang, Z Sui Proceedings of the 2016 conference on empirical methods in natural language …, 2016 | 140 | 2016 |
Order-planning neural text generation from structured data L Sha, L Mou, T Liu, P Poupart, S Li, B Chang, Z Sui Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018 | 126 | 2018 |
Can we edit factual knowledge by in-context learning? C Zheng, L Li, Q Dong, Y Fan, Z Wu, J Xu, B Chang arXiv preprint arXiv:2305.12740, 2023 | 124 | 2023 |
Discourse parsing with attention-based hierarchical neural networks Q Li, T Li, B Chang Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016 | 119 | 2016 |
Incorporating glosses into neural word sense disambiguation F Luo, T Liu, Q Xia, B Chang, Z Sui arXiv preprint arXiv:1805.08028, 2018 | 116 | 2018 |
Document-level event extraction via heterogeneous graph-based interaction model with a tracker R Xu, T Liu, L Li, B Chang arXiv preprint arXiv:2105.14924, 2021 | 105 | 2021 |
Introduction to Computational Linguistics [计算语言学概论] S Yu, B Chang, W Zhan Beijing: The Commercial Press 322, 323, 2003 | 99 | 2003 |