Hongyuan Mei
Toyota Technological Institute at Chicago, Johns Hopkins University, The University of Chicago
Verified email at ttic.edu
Title
Cited by
Year
The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process
H Mei, J Eisner
arXiv, 2016
Cited by 445 · 2016
What to talk about and how? Selective Generation using LSTMs with Coarse-to-Fine Alignment
H Mei, M Bansal, MR Walter
NAACL, 2016
Cited by 285 · 2016
Listen, attend, and walk: Neural mapping of navigational instructions to action sequences
H Mei, M Bansal, MR Walter
AAAI, 2016
Cited by 232 · 2016
Coherent Dialogue with Attention-based Language Models
H Mei, M Bansal, MR Walter
AAAI, 2017
Cited by 107 · 2017
Imputing missing events in continuous-time event streams
H Mei, G Qin, J Eisner
International Conference on Machine Learning, 4475-4485, 2019
Cited by 23 · 2019
Neural Datalog through time: Informed temporal modeling via logical specification
H Mei, G Qin, M Xu, J Eisner
International Conference on Machine Learning, 6808-6819, 2020
Cited by 14 · 2020
Noise-contrastive estimation for multivariate point processes
H Mei, T Wan, J Eisner
Advances in neural information processing systems 33, 5204-5214, 2020
Cited by 11 · 2020
Accurate Vision-based Vehicle Localization using Satellite Imagery
H Chu, H Mei, M Bansal, MR Walter
NIPS 2015 Transfer and Multi-Task Learning workshop, 2015
Cited by 10 · 2015
Personalized Dynamic Treatment Regimes in Continuous Time: A Bayesian Approach for Optimizing Clinical Decisions with Timing
W Hua, H Mei, S Zohar, M Giral, Y Xu
Bayesian Analysis 17 (3), 849-878, 2022
Cited by 4 · 2022
Transformer embeddings of irregularly spaced events and their participants
H Mei, C Yang, J Eisner
International Conference on Learning Representations, 2021
Cited by 3 · 2021
Halo: Learning semantics-aware representations for cross-lingual information extraction
H Mei, S Zhang, K Duh, B Van Durme
arXiv preprint arXiv:1805.08271, 2018
Cited by 3 · 2018
On the idiosyncrasies of the Mandarin Chinese classifier system
S Liu, H Mei, A Williams, R Cotterell
arXiv preprint arXiv:1902.10193, 2019
Cited by 2 · 2019
Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning
S Xie, J Qiu, A Pasad, L Du, Q Qu, H Mei
arXiv preprint arXiv:2210.10041, 2022
Cited by 1 · 2022
Transformer Embeddings of Irregularly Spaced Events and Their Participants
C Yang, H Mei, J Eisner
arXiv preprint arXiv:2201.00044, 2021
Cited by 1 · 2021
Tiny-Attention Adapter: Contexts Are More Important Than the Number of Parameters
H Zhao, H Tan, H Mei
arXiv preprint arXiv:2211.01979, 2022
2022
HYPRO: A Hybridly Normalized Probabilistic Model for Long-Horizon Prediction of Event Sequences
S Xue, X Shi, JY Zhang, H Mei
arXiv preprint arXiv:2210.01753, 2022
2022
Bellman Meets Hawkes: Model-Based Reinforcement Learning via Temporal Point Processes
C Qu, X Tan, S Xue, X Shi, J Zhang, H Mei
arXiv preprint arXiv:2201.12569, 2022
2022
Neural Probabilistic Methods for Event Sequence Modeling
H Mei
Johns Hopkins University, 2021
2021
Informed Temporal Modeling via Logical Specification of Factorial LSTMs
H Mei, G Qin, M Xu, J Eisner
2019
Inference of unobserved event streams with neural Hawkes particle smoothing
H Mei, G Qin, J Eisner
2018