Peter Richtarik


Peter Richtarik is a Slovak mathematician and computer scientist[1] working in the area of big data optimization and machine learning, known for his work on randomized coordinate descent algorithms, stochastic gradient descent and federated learning. He is currently a Professor of Computer Science at the King Abdullah University of Science and Technology.

Peter Richtarik
Nationality: Slovak
Alma mater: Comenius University, Cornell University
Scientific career
Fields: Mathematics, Computer Science, Machine Learning
Institutions: KAUST
Thesis: Some algorithms for large-scale convex and linear minimization in relative scale (2007)
Academic advisors: Yurii Nesterov
Website: https://richtarik.org

Education

Richtarik earned a master's degree in mathematics from Comenius University, Slovakia, in 2001, graduating summa cum laude.[2] In 2007, he obtained a PhD in operations research from Cornell University, advised by Michael Jeremy Todd.[3][4]

Career

Between 2007 and 2009, he was a postdoctoral scholar in the Center for Operations Research and Econometrics and the Department of Mathematical Engineering at the Université catholique de Louvain, Belgium, working with Yurii Nesterov.[5][6] Between 2009 and 2019, Richtarik was a Lecturer and later Reader in the School of Mathematics at the University of Edinburgh. He is a Turing Fellow.[7] Richtarik founded and organizes a conference series entitled "Optimization and Big Data".[8][9]

Academic work

Richtarik's early research concerned gradient-type methods, optimization in relative scale, sparse principal component analysis, and algorithms for optimal design. Since his appointment at Edinburgh, he has worked extensively on the algorithmic foundations of randomized methods in convex optimization, especially randomized coordinate descent and stochastic gradient descent methods. These methods are well suited to optimization problems arising from big data and have applications in fields such as machine learning, signal processing and data science.[10][11] Richtarik co-invented an algorithm generalizing the randomized Kaczmarz method for solving systems of linear equations, contributed to the invention of federated learning, and co-developed a stochastic variant of Newton's method.
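To illustrate the kind of randomized iterative method mentioned above, the following is a minimal sketch of the classical randomized Kaczmarz algorithm (the method that Richtarik's work with Gower generalizes, not the generalization itself): at each step, one equation of the system is sampled with probability proportional to its squared row norm, and the current iterate is projected onto the hyperplane defined by that equation. The function name and parameters here are illustrative, not from any particular library.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Sketch of classical randomized Kaczmarz for a consistent system Ax = b.

    Each iteration samples row i with probability proportional to
    ||A_i||^2 and projects the iterate onto the hyperplane A_i x = b_i.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)  # squared row norms
    probs = row_norms / row_norms.sum()      # row-norm sampling distribution
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)           # pick one random equation
        # Orthogonal projection onto the hyperplane of equation i
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Small consistent system with a known solution
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x = randomized_kaczmarz(A, b)
print(np.linalg.norm(x - x_true))  # error shrinks toward zero
```

Each update touches only a single row of A, which is what makes this family of methods attractive for very large systems; randomized coordinate descent applies the same idea column-wise to the variables of an optimization problem.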

Awards and distinctions

Bibliography

  • Peter Richtarik & Martin Takac (2012). "Efficient serial and parallel coordinate descent methods for huge-scale truss topology design". Operations Research Proceedings 2011. Springer-Verlag. pp. 27–32. doi:10.1007/978-3-642-29210-1_5.
  • Peter Richtarik & Martin Takac (2014). "Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function". Mathematical Programming. 144 (1): 1–38. doi:10.1007/s10107-012-0614-z.
  • Olivier Fercoq & Peter Richtarik (2015). "Accelerated, parallel and proximal coordinate descent". SIAM Journal on Optimization. 25 (4): 1997–2023. doi:10.1137/130949993.
  • Dominik Csiba; Zheng Qu; Peter Richtarik (2015). "Stochastic Dual Coordinate Ascent with Adaptive Probabilities" (PDF). Proceedings of the 32nd International Conference on Machine Learning. pp. 674–683.
  • Robert M. Gower & Peter Richtarik (2015). "Randomized Iterative Methods for Linear Systems". SIAM Journal on Matrix Analysis and Applications. 36 (4): 1660–1690. doi:10.1137/15M1025487.
  • Peter Richtarik & Martin Takac (2016). "Parallel coordinate descent methods for big data optimization". Mathematical Programming. 156 (1): 433–484. doi:10.1007/s10107-015-0901-6.
  • Zheng Qu & Peter Richtarik (2016). "Coordinate descent with arbitrary sampling I: algorithms and complexity". Optimization Methods and Software. 31 (5): 829–857. arXiv:1412.8060. doi:10.1080/10556788.2016.1190360.
  • Zheng Qu & Peter Richtarik (2016). "Coordinate descent with arbitrary sampling II: expected separable overapproximation". Optimization Methods and Software. 31 (5): 858–884. arXiv:1412.8063. doi:10.1080/10556788.2016.1190361.
  • Zheng Qu; Peter Richtarik; Martin Takac; Olivier Fercoq (2016). "SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization" (PDF). Proceedings of the 33rd International Conference on Machine Learning. pp. 1823–1832.
  • Zeyuan Allen-Zhu; Zheng Qu; Peter Richtarik; Yang Yuan (2016). "Even faster accelerated coordinate descent using non-uniform sampling" (PDF). Proceedings of the 33rd International Conference on Machine Learning. pp. 1110–1119.
  • Dominik Csiba & Peter Richtarik (2016). "Importance sampling for minibatches". arXiv:1602.02283 [cs.LG].
  • Dominik Csiba & Peter Richtarik (2016). "Coordinate descent face-off: primal or dual?". arXiv:1605.08982 [math.OC].

References

  1. "Richtarik's DBLP profile". Retrieved December 23, 2020.
  2. "Richtarik's CV" (PDF). Retrieved August 21, 2016.
  3. "Mathematics Genealogy Project". Retrieved August 20, 2016.
  4. "Cornell PhD Thesis". Retrieved August 22, 2016.
  5. "Postdoctoral Fellows at CORE". Retrieved August 22, 2016.
  6. "Simons Institute for the Theory of Computing, UC Berkeley". Retrieved August 22, 2016.
  7. "Alan Turing Institute Faculty Fellows". Retrieved August 22, 2016.
  8. "Optimization and Big Data 2012". Retrieved August 20, 2016.
  9. "Optimization and Big Data 2015". Retrieved August 20, 2016.
  10. Cathy O'Neil & Rachel Schutt (2013). "Modeling and Algorithms at Scale". Doing Data Science: Straight Talk from the Frontline. O'Reilly. ISBN 9781449358655. Retrieved August 21, 2016.
  11. Sebastien Bubeck (2015). Convex Optimization: Algorithms and Complexity. Foundations and Trends in Machine Learning. Now Publishers. ISBN 978-1601988607.
  12. "Google Scholar". Retrieved December 28, 2020.
  13. "The h Index for Computer Science". Retrieved December 28, 2020.
  14. "SIGEST Award". Retrieved August 20, 2016.
  15. "EPSRC Fellowship". Retrieved August 21, 2016.
  16. "EUSA Awards 2015". Retrieved August 20, 2016.
  17. "46th Conference of Slovak Mathematicians". Retrieved August 22, 2016.