The vast majority of images are JPEGs, which are internally 4:2:0 YUV, but they get converted to 32-bit RGB for use in apps. Using native YUV formats would save half the memory and rendering bandwidth, speed up loading, and provide a tiny quality improvement. It would also be …
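As a rough back-of-the-envelope check on that memory claim (my own sketch, not from the tweet): a 4:2:0 YUV buffer holds a full-resolution luma plane plus two quarter-resolution chroma planes, about 1.5 bytes per pixel, versus 4 bytes per pixel for 32-bit RGB.

```python
# Back-of-the-envelope buffer sizes; the 12 MP resolution is just an example.

def rgb32_bytes(width: int, height: int) -> int:
    # 32-bit RGB(A): 4 bytes for every pixel.
    return width * height * 4

def yuv420_bytes(width: int, height: int) -> int:
    # 4:2:0 YUV: full-resolution Y plane plus U and V planes that are
    # half resolution in each dimension (a quarter of the pixels each).
    y = width * height
    uv = 2 * (width // 2) * (height // 2)
    return y + uv

w, h = 4032, 3024  # a typical 12 MP photo
print(f"32-bit RGB: {rgb32_bytes(w, h) / 2**20:.1f} MiB")   # ~46.5 MiB
print(f"4:2:0 YUV:  {yuv420_bytes(w, h) / 2**20:.1f} MiB")  # ~17.4 MiB
```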
These past few days I've been experimenting with something new that I want to use myself.
Interestingly, the thread below was written with it.
Let me show you what it looks like. 👇🏻
When you see localhost up there, you know it's truly an experiment! 😀
It's a dead-simple thread writer that posts a series of tweets, a.k.a. a tweetstorm. ⚡️
I've personally wanted it for a few months now, but intentionally held off to make sure it's something I genuinely need.
So why is that important to me? 🙂
I'm a believer in stories. I tell stories all the time, whether in the real world or online like this. Our society is moved by them.
If you're interested in the stories that move us, read Sapiens!
One of the stories I've told was about the launch of Poster.
It has launched multiple times this year, and Twitter has been my go-to place to tell the world about it.
Here comes my frustration... 😤
Recently I refunded all of Poster's sales on Gumroad. That said, I've decided not to use that service anymore.
— Wilbert Liu 👨🏻‍🎨 (@wilbertliu) November 19, 2018
Here's a little story 👇🏻
The entire discussion around Facebook’s disclosures of what happened in 2016 is very frustrating. No exec stopped any investigations, but there were a lot of heated discussions about what to publish and when.
In the spring and summer of 2016, as reported by the Times, activity we traced to GRU was reported to the FBI. This was the standard model of interaction companies used for nation-state attacks against likely US targets.
In the spring of 2017, after a deep dive into the Fake News phenomenon, the security team wanted to publish an update that covered what we had learned. At this point, we didn’t have any advertising content or the big IRA cluster, but we did know about the GRU model.
This report went through dozens of edits as different equities were represented. I did not have any meetings with Sheryl on the paper, but I can’t speak to whether she was in the loop with my higher-ups.
In the end, the difficult question of attribution was settled by us pointing to the DNI report instead of saying Russia or GRU directly. In my pre-briefs with members of Congress, I made it clear that we believed this action was GRU.
The story doesn’t say you were told not to... it says you did so without approval and they tried to obfuscate what you found. Is that true?
— Sarah Frier (@sarahfrier) November 15, 2018
THREAD: How is it possible to train a well-performing, advanced Computer Vision model 𝗼𝗻 𝘁𝗵𝗲 𝗖𝗣𝗨? 🤔
At the heart of this lies the most important technique in modern deep learning - transfer learning.
Let's analyze how it works.
2/ For starters, let's look at what a neural network (NN for short) does.
An NN is like a stack of pancakes, with computation flowing up when we make predictions.
How does it all work?
3/ We show an image to our model.
An image is a collection of pixels. Each pixel is just a bunch of numbers describing its color.
Here is what it might look like for a black and white image
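A minimal illustration of that idea (the numbers below are made up, not the image from the original tweet): a black-and-white image is just a grid of brightness values.

```python
import numpy as np

# A tiny 5x5 black-and-white "image": each pixel is a single number,
# 0 = black, 255 = white. The values are invented for illustration.
image = np.array([
    [  0,   0,  34,   0,   0],
    [  0, 120, 255, 120,   0],
    [ 34, 255, 255, 255,  34],
    [  0, 120, 255, 120,   0],
    [  0,   0,  34,   0,   0],
], dtype=np.uint8)

print(image.shape)    # (5, 5) -> height x width
print(image / 255.0)  # models typically see the pixels rescaled to 0..1
```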
4/ The picture goes into the layer at the bottom.
Each layer performs computation on the image, transforming it and passing it upwards.
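Here's a toy version of that "stack of pancakes" in PyTorch (a sketch with made-up layer sizes, not the model from the thread): the image enters at the bottom, and each layer transforms its input and passes the result up.

```python
import torch
import torch.nn as nn

# A made-up stack of layers; a real vision model would use convolutions.
model = nn.Sequential(
    nn.Flatten(),            # bottom of the stack: the image goes in here
    nn.Linear(28 * 28, 64),  # transform 784 pixel values into 64 activations
    nn.ReLU(),
    nn.Linear(64, 2),        # top of the stack: only two numbers come out
)

image = torch.rand(1, 1, 28, 28)  # a fake 28x28 grayscale image
x = image
for layer in model:
    x = layer(x)              # each layer transforms and passes it upwards
    print(layer.__class__.__name__, tuple(x.shape))
```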
5/ By the time the image reaches the uppermost layer, it has been transformed to the point that it now consists of two numbers only.
The outputs of a layer are called activations, and the outputs of the last layer have a special meaning... they are the predictions!
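Continuing the sketch above: suppose the last layer produced two activations for an image from a hypothetical cat-vs-dog dataset. Turning them into probabilities makes the "two numbers are the predictions" point concrete.

```python
import torch
import torch.nn.functional as F

# Hypothetical activations from the last layer for one image.
last_layer_activations = torch.tensor([1.3, -0.2])  # [cat score, dog score]

probs = F.softmax(last_layer_activations, dim=0)  # turn scores into probabilities
predicted = ["cat", "dog"][probs.argmax().item()]
print(probs, "->", predicted)                     # the model's prediction
```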
THREAD: Can you start learning cutting-edge deep learning without specialized hardware? 🤖
— Radek Osmulski (@radekosmulski) February 11, 2021
In this thread, we will train an advanced Computer Vision model on a challenging dataset. 🐕🐈 Training completes in 25 minutes on my 3yrs old Ryzen 5 CPU.
Let me show you how...
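To tie the thread together, here is a minimal transfer-learning sketch of the kind of setup described above. It is my own illustration, not the author's code: it assumes a torchvision ResNet pretrained on ImageNet (torchvision ≥ 0.13 weights API) and a hypothetical data/train folder with one subfolder per class; only the small new head is trained, which is what keeps CPU training practical.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tfms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical folder layout: data/train/<class_name>/*.jpg
train_ds = datasets.ImageFolder("data/train", transform=tfms)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Reuse a network pretrained on ImageNet and freeze its layers...
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False
# ...then replace the top of the stack with a small, trainable head.
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for xb, yb in train_dl:       # one pass over the data, entirely on the CPU
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()
```

Freezing the pretrained layers means only the final linear layer's weights are updated, which is what makes a single CPU pass over a modest dataset feasible in minutes rather than hours.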