Here's an overview of key adoption metrics for deep learning frameworks over 2020: downloads, developer surveys, job posts, scientific publications, Colab usage, Kaggle notebook usage, and GitHub data.

TensorFlow/Keras = #1 deep learning solution.

Note that we benchmark adoption against Facebook's PyTorch because it is the only TF alternative that registers at this scale. Another option would have been sklearn, which has massive adoption, but it isn't really a TF alternative. In the future, I hope we can add JAX.
TensorFlow has seen 115M downloads in 2020, which nearly doubles its lifetime downloads. Note that this does *not* include downloads of TF-adjacent packages, like tf-nightly, the old tensorflow-gpu, etc.
Also note that most of these downloads aren't from humans but from automated CI systems (none are from Google's own systems, though, as Google doesn't use PyPI).

In a way, this metric reflects usage in production.
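
If you want to pull numbers like these yourself, the public pypistats.org JSON API exposes per-package download counts. Here's a minimal sketch (the /api/packages/<pkg>/recent endpoint and the response shape are assumed from the pypistats.org docs; this is illustrative, not the exact pipeline behind the figures above):

```python
# Minimal sketch: query recent PyPI download counts via the public
# pypistats.org JSON API (endpoint and response shape assumed from its docs).
import requests

def recent_downloads(package: str) -> dict:
    """Return last_day / last_week / last_month download counts for a package."""
    url = f"https://pypistats.org/api/packages/{package}/recent"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json()["data"]

for pkg in ("tensorflow", "torch"):
    counts = recent_downloads(pkg)
    print(f"{pkg}: {counts['last_month']:,} downloads in the last month")
```

For longer histories (like the lifetime total above), the public PyPI download stats dataset on BigQuery is the usual route.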
There were two worldwide developer surveys in 2020 that measured adoption of various frameworks: the one from StackOverflow, targeting all developers, and the one from Kaggle, targeting data scientists and ML practitioners.
Note that the StackOverflow survey listed both TF and Keras as separate options; Keras had very strong numbers, and I suspect many people checked Keras without checking TF. If "TF/Keras" had been a single choice, it would have shown significantly higher numbers here (probably around 15% overall usage).
Mentions in LinkedIn job posts are a metric that I'm not quite sure is meaningful, unfortunately. It doesn't reflect the stacks of the companies that are hiring, only the keywords tracked by recruiters.
We can track usage in the research community in two categories: arXiv, which represents "pure deep learning" research, and Google Scholar, which represents all publications, including applications of deep learning to biology, medicine, etc.
Deep learning research is an important but small niche (on the order of 20k researchers out of several million deep learning users in total), and it is the only niche where PyTorch is neck and neck with TensorFlow.
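
The arXiv side can be roughly reproduced with the public arXiv API. A sketch, assuming the API's opensearch:totalResults field; note that it searches metadata (titles/abstracts), not full text, so it undercounts papers that only mention a framework in the body:

```python
# Rough sketch: count arXiv papers whose metadata mentions a framework,
# using the public arXiv Atom API. Metadata search only (titles/abstracts).
# (A 2020-only count would additionally need a submittedDate range filter.)
import urllib.request
import xml.etree.ElementTree as ET

API = "http://export.arxiv.org/api/query?search_query=all:{query}&max_results=0"
TOTAL = "{http://a9.com/-/spec/opensearch/1.1/}totalResults"

def arxiv_mentions(term: str) -> int:
    with urllib.request.urlopen(API.format(query=term)) as resp:
        feed = ET.fromstring(resp.read())
    return int(feed.find(TOTAL).text)

for framework in ("tensorflow", "pytorch"):
    print(f"{framework}: ~{arxiv_mentions(framework)} arXiv papers")
```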
Finally, GitHub metrics. GitHub makes it possible to track new commits over the past year, but not new stars/forks/watchers, which is why I'm displaying total numbers for these rather than 2020 increases.
Note that the GitHub metrics are only for the main TensorFlow repo, not the dozens of large TensorFlow-adjacent repos (like the Keras repo, etc.).
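
These totals come straight from GitHub's public REST API, which only exposes current counts, not per-year history. A minimal sketch:

```python
# Minimal sketch: fetch current star/fork/watcher totals for the main
# TensorFlow repo from GitHub's public REST API. The API reports totals
# only, which is why year-over-year increases can't be read off directly.
import requests

def repo_totals(full_name: str) -> dict:
    resp = requests.get(f"https://api.github.com/repos/{full_name}", timeout=10)
    resp.raise_for_status()
    repo = resp.json()
    return {
        "stars": repo["stargazers_count"],
        "forks": repo["forks_count"],
        # Note: "watchers_count" mirrors stars in GitHub's API;
        # "subscribers_count" is the actual watcher count.
        "watchers": repo["subscribers_count"],
    }

print(repo_totals("tensorflow/tensorflow"))
```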
Overall: 2020 has been a difficult year, one during which many businesses cut their exploratory investments in deep learning because of Covid, causing a slump from March to November. On balance, though, TF/Keras still saw modest growth over the year.
Our current growth rate is solid, and our prospects for 2021 are looking bright! I'll post an update to these metrics in 2021. Here's to another year of improvement, growth, and delighting our users :)
