
I could create an entire twitter feed of things Facebook has tried to cover up since 2015. Where do you want to start, Mark and Sheryl? https://t.co/1trgupQEH9


Answer "Facebook has over 30,000 employees. Senior management does not participate in day-today hiring decisions."

Answer: "He did not become aware of allegations CA may not have deleted data about FB users obtained through Dr. Kogan's app until March of 2018, when
these issues were raised in the media."

A company as powerful as @facebook should be subject to proper scrutiny. Mike Schroepfer, its CTO, told us that the buck stops with Mark Zuckerberg on the Cambridge Analytica scandal, which is why he should come and answer our questions @DamianCollins @IanCLucas pic.twitter.com/0H4VMhtIFu
— Digital, Culture, Media and Sport Committee (@CommonsCMS) May 23, 2018
A brief analysis and comparison of the CSS for Twitter's PWA vs Twitter's legacy desktop website. The difference is dramatic and I'll touch on some reasons why.
Legacy site *downloads* ~630 KB CSS per theme and writing direction.
6,769 rules
9,252 selectors
16.7k declarations
3,370 unique declarations
44 media queries
36 unique colors
50 unique background colors
46 unique font sizes
39 unique z-indices
https://t.co/qyl4Bt1i5x
PWA *incrementally generates* ~30 KB CSS that handles all themes and writing directions.
735 rules
740 selectors
757 declarations
730 unique declarations
0 media queries
11 unique colors
32 unique background colors
15 unique font sizes
7 unique z-indices
https://t.co/w7oNG5KUkJ
The legacy site's CSS is what happens when hundreds of people directly write CSS over many years. Specificity wars, redundancy, a house of cards that can't be fixed. The result is extremely inefficient and error-prone styling that punishes users and developers.
The PWA's CSS is generated on-demand by a JS framework that manages styles and outputs "atomic CSS". The framework can enforce strict constraints and perform optimisations, which is why the CSS is so much smaller and safer. Style conflicts and unbounded CSS growth are avoided.
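To make the "atomic CSS" idea concrete, here is a minimal sketch (written in Python purely for illustration; it is not the framework the PWA actually uses) of how a style generator can deduplicate declarations: every unique property/value pair is emitted exactly once as a tiny single-purpose class, so the stylesheet grows with the number of unique declarations rather than with the number of components.

```python
# Minimal sketch of the "atomic CSS" idea: each unique declaration
# (property: value pair) becomes exactly one single-purpose class,
# and components are assembled from those class names.
# Illustrative only -- not the framework the PWA actually uses.

class AtomicStyleSheet:
    def __init__(self):
        self._classes = {}  # "property:value" -> generated class name

    def add(self, declarations):
        """Register a component's declarations; return the class names it should use."""
        names = []
        for prop, value in declarations.items():
            key = f"{prop}:{value}"
            if key not in self._classes:
                self._classes[key] = f"a{len(self._classes)}"  # short generated name
            names.append(self._classes[key])
        return " ".join(names)

    def render(self):
        """Emit the deduplicated stylesheet: one single-purpose rule per declaration."""
        return "\n".join(
            f".{name} {{ {key.replace(':', ': ', 1)}; }}"
            for key, name in self._classes.items()
        )


sheet = AtomicStyleSheet()
button = sheet.add({"color": "#fff", "font-size": "14px"})
link = sheet.add({"color": "#fff", "font-size": "12px"})  # reuses the color rule
print(button)          # e.g. "a0 a1"
print(sheet.render())  # three rules in total, however many components share them
```

In the usage at the bottom, the second component reuses the already-emitted color rule; that reuse is the mechanism that keeps real atomic CSS output small and prevents conflicting, ever-growing rules.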
The first area to focus on is diversity. This has become a dogma in the tech world, and despite the fact that tech is one of the most meritocratic industries in the world, there are constant efforts to promote diversity at the expense of fairness, merit and competency. Examples:
USC's Interactive Media & Games Division cancels all-star panel that included top-tier game developers who were invited to share their experiences with students. Why? Because there were no women on the panel.
ElectronConf is a conf which chooses presenters based on blind auditions; the identity, gender, and race of each speaker are not known to the selection team. The result of that merit-based approach was an all-male panel. So they cancelled the conference.
Apple's head of diversity (a black woman) got in trouble for promoting a vision of diversity that is at odds with contemporary progressive dogma. (She left the company shortly after this.)
Also in the name of diversity, there is unabashed discrimination against men (especially white men) in tech, in both hiring policies and in other arenas. One such example is this, a developer workshop that specifically excluded men: https://t.co/N0SkH4hR35

THREAD: How is it possible to train a well-performing, advanced Computer Vision model 𝗼𝗻 𝘁𝗵𝗲 𝗖𝗣𝗨? 🤔
At the heart of this lies the most important technique in modern deep learning - transfer learning.
Let's analyze how it works.
THREAD: Can you start learning cutting-edge deep learning without specialized hardware? 🤖
— Radek Osmulski (@radekosmulski) February 11, 2021
In this thread, we will train an advanced Computer Vision model on a challenging dataset. 🐕🐈 Training completes in 25 minutes on my 3-year-old Ryzen 5 CPU.
Let me show you how...
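Before the step-by-step explanation, here is a hedged sketch of what CPU-friendly transfer learning can look like in code. It uses torchvision rather than whatever setup the thread itself uses; the model choice, data path, and hyperparameters are assumptions for illustration. The key move is freezing the pretrained backbone and training only a small new head, which is what keeps training tractable without a GPU.

```python
# Hedged sketch of CPU-friendly transfer learning (recent torchvision assumed).
# The pretrained backbone is frozen; only the small replacement head is trained,
# which is why training stays tractable without a GPU.
# Model, dataset path, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cpu")

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # ImageNet backbone
for param in model.parameters():
    param.requires_grad = False                    # freeze every pretrained weight
model.fc = nn.Linear(model.fc.in_features, 2)      # new head: e.g. dog vs cat
model.to(device)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=preprocess)  # hypothetical path
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)  # only the head learns
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in train_dl:
        optimizer.zero_grad()
        loss = loss_fn(model(images.to(device)), labels.to(device))
        loss.backward()
        optimizer.step()
```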
2/ For starters, let's look at what a neural network (NN for short) does.
An NN is like a stack of pancakes, with computation flowing up when we make predictions.
How does it all work?
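The "stack of pancakes" picture maps directly onto how models are written in code. A minimal sketch (PyTorch assumed, layer sizes made up), where each layer hands its output to the one above it:

```python
# Minimal sketch of the layer stack (PyTorch assumed, sizes made up):
# data enters at the bottom and each layer passes its output to the next one up.
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),             # bottom: unroll the image into a flat list of pixel values
    nn.Linear(28 * 28, 128),  # a hidden layer
    nn.ReLU(),
    nn.Linear(128, 2),        # top layer: two output numbers
)
```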

3/ We show an image to our model.
An image is a collection of pixels. Each pixel is just a bunch of numbers describing its color.
Here is what it might look like for a black-and-white image:
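For instance, a tiny 3x3 black-and-white image is nothing more than a grid of brightness values (a NumPy sketch; the numbers are made up):

```python
# A made-up 3x3 black-and-white image: one brightness number per pixel,
# 0.0 = black, 1.0 = white (NumPy assumed).
import numpy as np

image = np.array([
    [0.0, 0.8, 0.0],
    [0.8, 1.0, 0.8],
    [0.0, 0.8, 0.0],
])
print(image.shape)  # (3, 3): nine pixels, nine numbers
```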

4/ The picture goes into the layer at the bottom.
Each layer performs computation on the image, transforming it and passing it upwards.

5/ By the time the image reaches the uppermost layer, it has been transformed to the point that it now consists of two numbers only.
The outputs of a layer are called activations, and the outputs of the last layer have a special meaning... they are the predictions!
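Putting steps 4 and 5 together as a sketch (PyTorch assumed, same tiny stack as before): feeding an image into the bottom of the stack produces activations at every layer, and the two activations coming out of the top layer are read off as the prediction.

```python
# Sketch of a forward pass (PyTorch assumed; the stack mirrors the earlier sketch).
# The activations of the last layer are the predictions -- here, two numbers.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 2),              # last layer: exactly two activations
)

image = torch.rand(1, 1, 28, 28)    # one made-up 28x28 black-and-white image
activations = model(image)          # shape (1, 2): the predictions
print(activations)
print(activations.argmax(dim=1))    # index of the larger number = predicted class
```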
