- What is it?

- Who is this course for?

- What is the format?

- What makes this course unique?

- Why constrain to open source tools?

- What are my qualifications?

- Why is this free?

- What are the prerequisites?

https://t.co/xmMm9XGK9j

- Product

- Data

- Modeling

- Scripting

- API

- Production

More details (lessons, task, etc.) here: https://t.co/xmMm9XGK9j

Thread 👇


Putting ML in Production: a guide and code-driven case study on MLOps. We will be developing and deploying Made With ML's first ML service, from Product → ML → Production, with open source tools.

- ML developers looking to become end-to-end ML developers.

- Software engineers looking to learn how to responsibly deploy and monitor ML systems.

- Product managers who want to have a comprehensive understanding of the different stages of ML dev.

- Intuition: high level overview of the concepts.

- Code: simple code examples to illustrate the concept.

- Application: applying the concept to our specific task.

- Extensions: brief look at other tools and techniques that will be useful.

1. Hands-on

2. Intuition-first

3. Software engineering

4. Focused yet holistic

5. Open source tools

If you search production ML or MLOps online, you'll find great blog posts and tweets. But in order to really understand these concepts, you need to implement them.

We will never jump straight to code. In every lesson, we will develop intuition for the concepts and think about them from a product perspective.

This course isn't just about ML. In fact, it's mostly about clean software engineering! We'll cover important concepts like versioning, testing, logging, etc. that really make this a production-grade product.
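As a toy illustration of the kind of hygiene this refers to (the function, names, and threshold here are hypothetical, not from the course), a prediction helper with logging plus a unit-style test might look like:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def predict_label(score: float, threshold: float = 0.5) -> str:
    """Map a model's probability score to a label, logging the decision."""
    if not 0.0 <= score <= 1.0:
        raise ValueError(f"score must be in [0, 1], got {score}")
    label = "relevant" if score >= threshold else "irrelevant"
    logger.info("score=%.3f threshold=%.2f -> label=%s", score, threshold, label)
    return label

def test_predict_label():
    # Run with pytest; tests like this guard behavior as the code evolves.
    assert predict_label(0.9) == "relevant"
    assert predict_label(0.1) == "irrelevant"
```

The log line records the inputs alongside the decision, which is what makes production failures debuggable after the fact.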

For every concept, we'll not only cover what's most important for our specific task (this is the case study aspect) but we'll also cover related methods (this is the guide aspect) which may prove to be useful in other situations.

We will be using only open source tools for this project, with the exception of @googlecloud for storage and compute (free credit will be plenty).

1. We can focus on the fundamentals, everyone can participate ("single-player mode," as my friend @eugeneyan coined it), and you will have a much better understanding when you finally do use a paid tool at work (if you want to).

1. I've deployed large scale ML systems at @Apple as well as smaller systems with constraints at startups and want to share the common principles I've learned along the way.

LinkedIn: https://t.co/xWmPKz53vw

Personal website: https://t.co/NpLSczadPn

1. Personal reason: Every day, people explore the amazing work on @madewithml to learn from and contribute themselves. To stay consistent with this free spirit, I'm releasing this free course to pass on the lessons I've learned from my mentors and experiences.

- You should have some familiarity with Python and basic ML algorithms. While we will be experimenting with deep learning (w.r.t compute/performance tradeoffs), you can easily apply the lessons to any class of ML models.

https://t.co/V35zXocadQ

https://t.co/cmTTeALWz1

2/ In this gif, narrow ReLU networks have a high probability of initializing near the zero function (because of ReLU) and getting stuck. This causes the function distribution to become multi-modal over time. For wide ReLU networks, however, this is not an issue.

3/ This time-evolving GP depends on two kernels: the kernel describing the GP at init, and the kernel describing the linear evolution of this GP. The former is the NNGP kernel, and the latter is the Neural Tangent Kernel (NTK).

4/ Once we have these two kernels, we can derive the GP mean and covariance at any time t via straightforward linear algebra.
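For concreteness, one standard form of this result (for MSE loss with learning rate η, training inputs X, targets Y, NNGP kernel K, and NTK Θ; this notation is assumed, not spelled out in the thread) gives the predictive mean at time t as:

```latex
\mu_t(x) = \Theta(x, X)\,\Theta(X, X)^{-1}\left(I - e^{-\eta\,\Theta(X, X)\,t}\right) Y
```

As t → ∞ the exponential vanishes, so the mean converges to the kernel-regression solution Θ(x, X) Θ(X, X)⁻¹ Y, and the covariance at time t follows by applying the same linear maps to the initial NNGP covariance K.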

5/ So it remains to calculate the NNGP kernel and NT kernel for any given architecture. The first is described in https://t.co/cFWfNC5ALC and in this thread

Here is a compilation of resources (books, videos & papers) to get you going.

(Note: It's not an exhaustive list but I have carefully curated it based on my experience and observations)

Mathematics for Machine Learning

by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong

https://t.co/zSpp67kJSg

Note: this is probably the place you want to start. Start slowly and work on some examples. Pay close attention to the notation and get comfortable with it.

Pattern Recognition and Machine Learning

by Christopher Bishop

Note: Prior to the book above, this is the book that I used to recommend to get familiar with math-related concepts used in machine learning. A very solid book in my view and it's heavily referenced in academia.

The Elements of Statistical Learning

by Jerome H. Friedman, Robert Tibshirani, and Trevor Hastie

Note: machine learning deals with data and, in turn, uncertainty, which is what statistics teaches. Get comfortable with topics like estimators, statistical significance, ...

Probability Theory: The Logic of Science

by E. T. Jaynes

Note: In machine learning, we are interested in building probabilistic models and thus you will come across concepts from probability theory like conditional probability and different probability distributions.

This New York Times feature shows China with a Gini Index of less than 30, which would make it more equal than Canada, France, or the Netherlands. https://t.co/g3Sv6DZTDE

That's weird. Income inequality in China is legendary.

Let's check this number.
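As a refresher on what is being checked (the sample data below is made up for illustration), the Gini coefficient can be computed from the mean-absolute-difference definition:

```python
def gini(incomes: list) -> float:
    """Gini coefficient via the mean absolute difference definition:
    G = sum_ij |x_i - x_j| / (2 * n^2 * mean(x))."""
    n = len(incomes)
    mean = sum(incomes) / n
    total_abs_diff = sum(abs(xi - xj) for xi in incomes for xj in incomes)
    return total_abs_diff / (2 * n * n * mean)

# Perfect equality -> 0; extreme concentration -> close to 1.
print(round(gini([1.0, 1.0, 1.0, 1.0]), 3))    # 0.0
print(round(gini([0.0, 0.0, 0.0, 100.0]), 3))  # 0.75
```

Note that the World Bank reports the *index* on a 0–100 scale, so a coefficient of 0.422 corresponds to the 42.2 discussed below.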

2/The New York Times cites the World Bank's recent report, "Fair Progress? Economic Mobility across Generations Around the World".

The report is available here:

3/The World Bank report has a graph in which it appears to show the same value for China's Gini - under 0.3.

The graph cites the World Development Indicators as its source for the income inequality data.

4/The World Development Indicators are available at the World Bank's website.

Here's the Gini index: https://t.co/MvylQzpX6A

It looks as if the latest estimate for China's Gini is 42.2.

That estimate is from 2012.

5/A Gini of 42.2 would put China in the same neighborhood as the U.S., whose Gini was estimated at 41 in 2013.

I can't find the <30 number anywhere. The only other estimate in the tables for China is from 2008, when it was estimated at 42.8.

Copyright © 2021 Buzz Chronicles - All rights reserved