Ruoyu Sun
Other names: Ruo-Yu Sun
Chinese University of Hong Kong (Shenzhen); Shenzhen Institute of Big Data
Verified email at cuhk.edu.cn - Homepage
Title | Cited by | Year
Guaranteed matrix completion via non-convex factorization
R Sun, ZQ Luo
IEEE Transactions on Information Theory 62 (11), 6535-6579, 2016
Cited by 533 · 2016
Optimization for deep learning: An overview
RY Sun
Journal of the Operations Research Society of China 8 (2), 249-294, 2020
Cited by 390* · 2020
On the convergence of a class of Adam-type algorithms for non-convex optimization
X Chen, S Liu, R Sun, M Hong
arXiv preprint arXiv:1808.02941, 2018
Cited by 373 · 2018
Joint base station clustering and beamformer design for partial coordinated transmission in heterogeneous networks
M Hong, R Sun, H Baligh, ZQ Luo
IEEE Journal on Selected Areas in Communications 31 (2), 226-240, 2013
Cited by 343 · 2013
Max-sliced Wasserstein distance and its use for GANs
I Deshpande, YT Hu, R Sun, A Pyrros, N Siddiqui, S Koyejo, Z Zhao, ...
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2019
Cited by 221 · 2019
Joint downlink base station association and power control for max-min fairness: Computation and complexity
R Sun, M Hong, ZQ Luo
IEEE Journal on Selected Areas in Communications 33 (6), 1040-1054, 2015
Cited by 117* · 2015
A single-loop smoothed gradient descent-ascent algorithm for nonconvex-concave min-max problems
J Zhang, P Xiao, R Sun, Z Luo
Advances in neural information processing systems 33, 7377-7389, 2020
Cited by 111 · 2020
Adding one neuron can eliminate all bad local minima
S Liang, R Sun, JD Lee, R Srikant
Advances in Neural Information Processing Systems 31, 2018
Cited by 101 · 2018
The global landscape of neural networks: An overview
R Sun, D Li, S Liang, T Ding, R Srikant
IEEE Signal Processing Magazine 37 (5), 95-108, 2020
Cited by 99 · 2020
Robust SINR-constrained MISO downlink beamforming: When is semidefinite programming relaxation tight?
E Song, Q Shi, M Sanjabi, RY Sun, ZQ Luo
EURASIP Journal on Wireless Communications and Networking 2012, 1-11, 2012
Cited by 97 · 2012
Understanding the loss surface of neural networks for binary classification
S Liang, R Sun, Y Li, R Srikant
ICML, 2018
Cited by 90 · 2018
On the efficiency of random permutation for ADMM and coordinate descent
R Sun, ZQ Luo, Y Ye
Mathematics of Operations Research 45 (1), 233-271, 2020
Cited by 79* · 2020
RMSprop converges with proper hyperparameter
N Shi, D Li, R Sun
International Conference on Learning Representations, 2021
Cited by 76 · 2021
Adam can converge without any modification on update rules
Y Zhang, C Chen, N Shi, R Sun, ZQ Luo
Advances in neural information processing systems 35, 28386-28399, 2022
Cited by 71 · 2022
Worst-case complexity of cyclic coordinate descent: gap with randomized version
R Sun, Y Ye
Mathematical Programming, 1-34, 2019
Cited by 66 · 2019
Global convergence of MAML and theory-inspired neural architecture search for few-shot learning
H Wang, Y Wang, R Sun, B Li
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2022
Cited by 64* · 2022
Improved Iteration Complexity Bounds of Cyclic Block Coordinate Descent for Convex Problems
R Sun, M Hong
NIPS, 2015
Cited by 59 · 2015
On the benefit of width for neural networks: Disappearance of bad basins
D Li, T Ding, R Sun
arXiv preprint arXiv:1812.11039, 2018
Cited by 58* · 2018
Cross-Layer Provision of Future Cellular Networks: A WMMSE-based approach
H Baligh, M Hong, WC Liao, ZQ Luo, M Razaviyayn, R Sun, ...
IEEE Signal Processing Magazine 31 (6), 2014
Cited by 56* · 2014
Suboptimal local minima exist for wide neural networks with smooth activations
T Ding, D Li, R Sun
Mathematics of Operations Research 47 (4), 2784-2814, 2022
Cited by 48* · 2022
Articles 1–20