Reposted from: 爱可可-
The Numerical Tours of Data Sciences, by Gabriel Peyré, gather Matlab, Python and Julia experiments to explore modern mathematical data sciences. They cover data sciences in a broad sense, including imaging, machine learning, computer vision and computer graphics. They showcase applications of numerical and mathematical methods such as convex optimization, PDEs, optimal transport, inverse problems, sparsity, etc. The tours are complemented by course slides detailing the theory and the algorithms.
You can retrieve the draft of the book:
Gabriel Peyré, Mathematical Foundations of Data Sciences.
The LaTeX sources of the book are available.
It should serve as the mathematical companion to the Numerical Tours of Data Sciences, which present detailed Matlab/Python/Julia/R implementations of all the concepts covered here.
This book draft presents an overview of important mathematical and numerical foundations for modern data sciences. It covers in particular the basics of signal and image processing (Fourier, wavelets, and their applications to denoising and compression), imaging sciences (inverse problems, sparsity, compressed sensing) and machine learning (linear regression, logistic classification, deep learning). The focus is on a mathematically sound exposition of the methodological tools (in particular linear operators, non-linear approximation, convex optimization, optimal transport) and how they can be mapped to efficient computational algorithms.
Shannon Theory
Fourier Transforms
Linear Mesh Processing
Wavelets
Multiresolution Mesh Processing
Linear and Non-linear Approximation
Compression
Denoising
Variational Priors and Regularization
Inverse Problems
Sparse Regularization
Convex Optimization
Convex Duality
Compressed Sensing
Machine Learning
Deep-Learning
Optimal Transport
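To give a flavor of how these methodological tools translate into short numerical experiments, here is a minimal, illustrative sketch (not taken from the book or the tours; all names and parameter values are my own) of sparse regularization of an under-determined linear inverse problem, solved by iterative soft-thresholding (ISTA), using only numpy:

```python
# Illustrative sketch: recover a sparse vector x0 from noisy measurements
# y = A x0 + noise by solving min_x 0.5*||A x - y||^2 + lam*||x||_1
# with iterative soft-thresholding (ISTA).
import numpy as np

rng = np.random.default_rng(0)

n, p, s = 80, 200, 8                 # measurements, unknowns, sparsity level
A = rng.standard_normal((n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[rng.choice(p, s, replace=False)] = rng.standard_normal(s)
y = A @ x0 + 0.01 * rng.standard_normal(n)

lam = 0.05                           # regularization strength
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
step = 1.0 / L

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(p)
for _ in range(500):
    grad = A.T @ (A @ x - y)         # gradient of the data-fidelity term
    x = soft_threshold(x - step * grad, step * lam)

print("recovered support:", np.flatnonzero(np.abs(x) > 1e-3))
print("true support:     ", np.sort(np.flatnonzero(x0)))
```

The same prox-gradient pattern (gradient step on the smooth data-fidelity term, proximal step on the non-smooth prior) underlies many of the denoising, inverse-problem and compressed-sensing experiments in the tours.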
Link:
https://mathematical-tours.github.io/book/
Original post:
https://m.weibo.cn/1402400261/4157999205227590
When I started out, I had a strong quantitative background (a chemical engineering undergrad degree; I was taking PhD courses in chemical engineering) and some functional programming skills. From there, I first dove deep into one type of machine learning (Gaussian processes) along with general ML practice (how to set up ML experiments in order to evaluate your models), because that was what I needed for my project. I learned mostly online and by reading papers, but I also took one class on data analysis for biologists that wasn't ML-focused but did cover programming and statistical thinking. Later, I took a linear algebra class, an ML survey class, and an advanced topics class on structured learning at Caltech. Those helped me build a broad knowledge of ML, and since then I've gained a deeper understanding of some subfields that interest me or are especially relevant by reading papers closely (chasing down references and anything I don't understand, and/or implementing the core algorithms myself).