Dominic Richards
Verified email at spc.ox.ac.uk - Homepage
Title · Cited by · Year
Asymptotics of ridge(less) regression under general source condition
D Richards, J Mourtada, L Rosasco
International Conference on Artificial Intelligence and Statistics, 3889-3897, 2021
Cited by 90 · 2021
Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent
D Richards, P Rebeschini
Journal of Machine Learning Research 21 (34), 1-44, 2020
Cited by 28 · 2020
Decentralised learning with distributed gradient descent and random features
D Richards, P Rebeschini, L Rosasco
Proceedings of Machine Learning Research, 2020
Cited by 26* · 2020
Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel
D Richards, I Kuzborskij
Advances in Neural Information Processing Systems 34, 2021
Cited by 21 · 2021
Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up
D Richards, P Rebeschini
NeurIPS 2019, 2019
Cited by 19 · 2019
Distributed Machine Learning with Sparse Heterogeneous Data
D Richards, S Negahban, P Rebeschini
Advances in Neural Information Processing Systems 34, 2021
Cited by 11* · 2021
Learning with Gradient Descent and Weakly Convex Losses
D Richards, M Rabbat
International Conference on Artificial Intelligence and Statistics, 1990-1998, 2021
Cited by 10 · 2021
Comparing Classes of Estimators: When does Gradient Descent Beat Ridge Regression in Linear Models?
D Richards, E Dobriban, P Rebeschini
arXiv preprint arXiv:2108.11872, 2021
Cited by 2 · 2021
Articles 1–8