Prof. Carlo Fischione (Royal Institute of Technology – KTH, Sweden) will give two classes on Monday 15 and Tuesday 16, from 2 pm to 6 pm, in Aula Seminari, DISIM (Alan Turing Building), on the following topic:
Title: Fundamentals of Machine Learning over Networks
Lecturer: Prof. Carlo Fischione
Abstract: This course covers the fundamentals of machine learning over networks, from convex and stochastic optimization to distributed and communication-efficient methods (see the bibliography below).
Bibliography
[1] S. Bubeck, “Convex optimization: Algorithms and complexity,” Foundations and Trends in Machine Learning, vol. 8, no. 3-4, pp. 231-357, 2015.
[2] L. Bottou, F. E. Curtis, and J. Nocedal, “Optimization methods for large-scale machine learning,” SIAM Review, vol. 60, no. 2, pp. 223-311, 2018.
[3] S. Boyd et al., “Distributed optimization and statistical learning via the alternating direction method of multipliers,” Foundations and Trends in Machine Learning, vol. 3, no. 1, pp. 1-122, 2011.
[4] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning, MIT Press, 2016.
[5] M. I. Jordan, J. D. Lee, and Y. Yang, “Communication-efficient distributed statistical inference,” Journal of the American Statistical Association, 2018.
[6] V. Smith et al., “CoCoA: A general framework for communication-efficient distributed optimization,” Journal of Machine Learning Research, vol. 18, no. 230, 2018.
[7] D. Alistarh et al., “QSGD: Communication-efficient SGD via gradient quantization and encoding,” Advances in Neural Information Processing Systems, 2017.
[8] M. Schmidt, N. Le Roux, and F. Bach, “Minimizing finite sums with the stochastic average gradient,” Mathematical Programming, vol. 162, no. 1-2, pp. 83-112, 2017.
[9] S. Boyd et al., “Randomized gossip algorithms,” IEEE Transactions on Information Theory, 2006.
[10] K. Scaman et al., “Optimal algorithms for smooth and strongly convex distributed optimization in networks,” ICML, 2017.
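As a small taste of the course material, one of the simplest algorithms from the bibliography is randomized gossip averaging [9]: nodes of a network repeatedly average their value with a random neighbor, and all values converge to the global mean. The sketch below is an illustrative toy implementation (the function name, graph, and parameters are my own choices, not from the references):

```python
import random

def randomized_gossip(values, edges, rounds=5000, seed=0):
    """Toy randomized gossip averaging (in the spirit of Boyd et al. [9]).

    At each round one edge (i, j) is activated uniformly at random and
    both endpoints replace their values with the pairwise average.
    On a connected graph, all entries converge to the global mean.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible run
    x = list(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        avg = (x[i] + x[j]) / 2.0
        x[i] = x[j] = avg  # pairwise averaging preserves the global sum
    return x

# Example: 4 nodes on a ring graph; every entry approaches the mean (4.0).
vals = [1.0, 2.0, 3.0, 10.0]
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
out = randomized_gossip(vals, ring)
```

Because each pairwise average conserves the sum of the two updated entries, the global mean is an invariant of the dynamics; the course references analyze how fast the disagreement contracts as a function of the graph's spectral properties.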