Head of MATH+ Junior Research Group "Mathematical Foundations of Data Science"
Institute of Mathematics, Sekr. MA 5-4
10623 Berlin, Germany
Phone: +49 (030) 314 25758
Fax: +49 (030) 314 21604
I am currently working in statistical learning theory, in particular deep learning, the efficiency of kernel methods, stochastic approximation methods (SGD), and regularization. My research interests also cover statistical inverse problems and adaptivity.
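As a minimal illustrative sketch of the stochastic approximation methods mentioned above (not drawn from any particular paper below; all names, step sizes, and data here are hypothetical choices), plain SGD on a least-squares objective looks like this:

```python
import numpy as np

# Illustrative sketch: SGD for least-squares regression on synthetic data.
# The model, step size, and iteration count are arbitrary choices for
# demonstration, not parameters from any published result.
rng = np.random.default_rng(0)

n, d = 500, 5
w_true = rng.normal(size=d)                # hypothetical ground-truth weights
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)  # noisy linear observations

w = np.zeros(d)
step = 0.01  # constant step size (illustrative)
for t in range(5000):
    i = rng.integers(n)                 # sample one data point uniformly
    grad = (X[i] @ w - y[i]) * X[i]     # gradient of 0.5 * (x_i^T w - y_i)^2
    w -= step * grad                    # stochastic gradient step

error = np.linalg.norm(w - w_true)      # shrinks toward a noise-level floor
```

With a constant step size, the iterate contracts toward the minimizer and then fluctuates in a neighborhood whose size scales with the step size and the noise level; decaying step sizes or averaging (as in tail-averaging, see below) remove this floor.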
Nicole Mücke, Stochastic Gradient Descent Meets Distribution Regression, Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS 2021), San Diego, California, USA, PMLR Volume 130.
Nicole Mücke, Enrico Reiss, Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping, arXiv:2006.10840.
Nicole Mücke, Ingo Steinwart, Global Minima of DNNs: The Plenty Pantry, https://arxiv.org/abs/1905.10686.
Ernesto De Vito, Nicole Mücke, Lorenzo Rosasco, Reproducing kernel Hilbert spaces on manifolds: Sobolev and Diffusion spaces, Analysis and Applications (2020), https://www.worldscientific.com/doi/abs/10.1142/S0219530520400114.
Nicole Mücke, Gergely Neu, Lorenzo Rosasco, Beating SGD Saturation with Tail-Averaging and Minibatching, 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, arXiv:1902.08668.
Nicole Mücke, Reducing training time by efficient localized kernel regression, Proceedings of Machine Learning Research, PMLR 89:2603-2610, 2019.
Nicole Mücke, Gilles Blanchard, Parallelizing Spectrally Regularized Kernel Algorithms, Journal of Machine Learning Research (2018).
Nicole Mücke, Adaptivity for Regularized Kernel Methods by Lepskii's Principle.
Gilles Blanchard, Nicole Mücke, Optimal Rates for Regularization of Statistical Inverse Learning Problems, Foundations of Computational Mathematics (2017).
Gilles Blanchard, Nicole Mücke, Kernel regression, minimax rates and effective dimensionality: beyond the regular case, Analysis and Applications (2019).