Kris Cao
DeepMind
Verified email at deepmind.com
Title · Cited by · Year
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, JB Alayrac, J Yu, R Soricut, J Schalkwyk, ...
arXiv preprint arXiv:2312.11805, 2023
Cited by 2146 · 2023
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
G Team, P Georgiev, VI Lei, R Burnell, L Bai, A Gulati, G Tanzer, ...
arXiv preprint arXiv:2403.05530, 2024
Cited by 671* · 2024
Emergent communication through negotiation
K Cao, A Lazaridou, M Lanctot, JZ Leibo, K Tuyls, S Clark
arXiv preprint arXiv:1804.03980, 2018
Cited by 196 · 2018
Mind the gap: Assessing temporal generalization in neural language models
A Lazaridou, A Kuncoro, E Gribovskaya, D Agrawal, A Liska, T Terzi, ...
Advances in Neural Information Processing Systems 34, 29348-29363, 2021
Cited by 171* · 2021
A joint model for word embedding and word morphology
K Cao, M Rei
arXiv preprint arXiv:1606.02601, 2016
Cited by 119 · 2016
Game Plan: What AI can do for Football, and What Football can do for AI
K Tuyls, S Omidshafiei, P Muller, Z Wang, J Connor, D Hennes, I Graham, ...
Journal of Artificial Intelligence Research 71, 41-88, 2021
Cited by 110 · 2021
Latent variable dialogue models and their diversity
K Cao, S Clark
arXiv preprint arXiv:1702.05962, 2017
Cited by 88 · 2017
Control prefixes for parameter-efficient text generation
J Clive, K Cao, M Rei
arXiv preprint arXiv:2110.08329, 2021
Cited by 69 · 2021
Factorising AMR generation through syntax
K Cao, S Clark
arXiv preprint arXiv:1804.07707, 2018
Cited by 27 · 2018
Multiagent off-screen behavior prediction in football
S Omidshafiei, D Hennes, M Garnelo, Z Wang, A Recasens, E Tarassov, ...
Scientific Reports 12 (1), 8638, 2022
Cited by 18 · 2022
You should evaluate your language model on marginal likelihood over tokenisations
K Cao, L Rimell
arXiv preprint arXiv:2109.02550, 2021
Cited by 18 · 2021
Learning meaning representations for text generation with deep generative models
K Cao
Cited by 14 · 2020
Towards coherent and consistent use of entities in narrative generation
P Papalampidi, K Cao, T Kocisky
International Conference on Machine Learning, 17278-17294, 2022
Cited by 10 · 2022
Unpacking Tokenization: Evaluating Text Compression and its Correlation with Model Performance
O Goldman, A Caciularu, M Eyal, K Cao, I Szpektor, R Tsarfaty
arXiv preprint arXiv:2403.06265, 2024
Cited by 7 · 2024
Modelling latent skills for multitask language generation
K Cao, D Yogatama
arXiv preprint arXiv:2002.09543, 2020
Cited by 4 · 2020
What is the best recipe for character-level encoder-only modelling?
K Cao
arXiv preprint arXiv:2305.05461, 2023
Cited by 3 · 2023
Dynamic entity representations for sequence generation
KY Cao, T Kocisky, P Papalampidi
US Patent App. 17/960,775, 2023
2023
Factorising AMR generation through syntax
K Cao, S Clark
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics, 2019
2019
Proceedings of the Third Workshop on Representation Learning for NLP
I Augenstein, K Cao, H He, F Hill, S Gella, J Kiros, H Mei, D Misra
Proceedings of the Third Workshop on Representation Learning for NLP, 2018
2018
CPGS First Year Report
K Cao
2015