How can we use language supervision to learn better visual representations for robotics?

Introducing Voltron: Language-Driven Representation Learning for Robotics!

Paper: https://t.co/gIsRPtSjKz
Models: https://t.co/NOB3cpATYG
Evaluation: https://t.co/aOzQu95J8z

🧵👇 (1/12)

Videos of humans performing everyday tasks (Something-Something-v2, Ego4D) offer a rich and diverse resource for learning representations for robotic manipulation.

Yet the rich, natural language annotations accompanying each video remain an underused part of these datasets. (2/12)
The Voltron framework offers a simple way to use language supervision to shape representation learning, building on prior work in representations for robotics like MVP (https://t.co/Pb0mk9hb4i) and R3M (https://t.co/o2Fkc3fP0e).

The secret is *balance* (3/12)
Start with a masked autoencoder over frames from these video clips, then make a choice:

1) Condition on language and improve our ability to reconstruct the scene.

2) Generate language given the visual representation and improve our ability to describe what's happening. (4/12)
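To make the two objectives concrete, here is a minimal PyTorch sketch. Everything in it (the class name, layer sizes, heads) is an illustrative assumption, not the actual Voltron implementation; see the paper and repo above for the real architecture.

```python
import torch
import torch.nn as nn

class DualObjectiveSketch(nn.Module):
    """Illustrative only: a masked autoencoder over video frames with the two
    language objectives described above (not the actual Voltron code)."""

    def __init__(self, dim=384, patch_pixels=16 * 16 * 3, vocab_size=32000):
        super().__init__()
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=6, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.pixel_head = nn.Linear(dim, patch_pixels)        # reconstruct patch pixels
        self.tok_embed = nn.Embedding(vocab_size, dim)
        dec_layer = nn.TransformerDecoderLayer(d_model=dim, nhead=6, batch_first=True)
        self.lang_decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        self.lm_head = nn.Linear(dim, vocab_size)

    def conditioning_loss(self, patch_tokens, caption_ids, target_pixels):
        # (1) Condition on language: prepend caption embeddings, reconstruct the scene.
        fused = torch.cat([self.tok_embed(caption_ids), patch_tokens], dim=1)
        h = self.encoder(fused)[:, caption_ids.size(1):]      # keep only patch positions
        return nn.functional.mse_loss(self.pixel_head(h), target_pixels)

    def generation_loss(self, patch_tokens, caption_ids):
        # (2) Generate language: next-token prediction conditioned on visual features.
        memory = self.encoder(patch_tokens)
        tgt = self.tok_embed(caption_ids[:, :-1])
        causal = torch.triu(torch.full((tgt.size(1), tgt.size(1)), float("-inf")), diagonal=1)
        logits = self.lm_head(self.lang_decoder(tgt, memory, tgt_mask=causal))
        return nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)), caption_ids[:, 1:].reshape(-1)
        )
```

Training mixes (or alternates between) the two losses; how that mixture is weighted is the *balance* this thread keeps coming back to.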
By trading off *conditioning* and *generation*, we show that we can 1) learn better representations than prior methods and 2) explicitly shape the balance of low- and high-level features captured.

Why is the ability to shape this balance important? (5/12)
Because robotics isn't a single thing! While prior work focuses on learning for control, there are so many problems we care about – problems that require different features!

How do we know?

Because we build an evaluation suite of 5 diverse robotics problem domains! (6/12)
Problems like grasp affordance prediction (per-pixel segmentation) tend to require more *low-level* spatial features: edges, object boundaries, textures.

Evaluation: the ARC Grasping dataset (https://t.co/rRI4ya84DL) – CC @andyzengtweets @SongShuran. (7/12)
Learning for control benefits from representations that mix low- and high-level features.

Modeling *multi-frame* contexts (easy with Voltron) is also high-impact!

Evaluation: Franka Kitchen & Adroit Manipulation domains from R3M – CC @aravindr93 @Vikashplus. (8/12)
One really cool result: we can use the generative language model zero-shot, with no extra data.

Given a video & language intent, we can score – in real time – how well the behavior in the video captures the intent.

This transfers to *robot data* – no robots were seen during pretraining! (9/12)
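Under the hood, this scoring can be sketched as the generation head's log-likelihood of the intent given the encoded frames. The function below reuses the hypothetical `DualObjectiveSketch` from earlier in the thread; it is an assumption about the mechanism, not the exact Voltron API.

```python
import torch

@torch.no_grad()
def intent_alignment_score(model, patch_tokens, intent_ids):
    # Higher is better: average log-likelihood of the intent tokens under the
    # language-generation head, conditioned on the video's visual features.
    return -model.generation_loss(patch_tokens, intent_ids).item()
```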
But don't take our word for it – try out our representations yourself... or evaluate your own!

Models & Pretraining: https://t.co/NOB3cpATYG
Evaluation Suite: https://t.co/aOzQu95J8z

Use our models: `pip install voltron-robotics` (10/12)
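A rough sense of what usage might look like (the entry point `load`, the model ID `"v-cond"`, and the `mode` keyword are assumptions from memory of the package README; the repo links above are authoritative):

```python
import torch
from voltron import load  # assumed entry point; check the repo README for the exact API

# Load a pretrained language-conditioned model and its image preprocessor (names assumed).
model, preprocess = load("v-cond", device="cpu", freeze=True)

# Encode a dummy frame (optionally alongside a language prompt) into dense features.
img = preprocess(torch.zeros(1, 3, 224, 224))
features = model(img, mode="visual")  # keyword argument is an assumption
print(features.shape)
```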
This project was a huge endeavor; one that would not have been possible without amazing collaborators and mentors – @SurajNair_1 @_anniechen_ @tkollar @chelseabfinn @DorsaSadigh and @percyliang.

Further thanks to @ToyotaResearch, @stanfordnlp, and the @StanfordAILab! (11/12)
I'm really excited to see the impact of language on shaping representations for robotics... but this isn't the end. The hard parts of robotics remain hard.

Voltron is a building block – a tool. I can't wait to see how y'all use it. Thanks folks – and stay tuned 🤖🚀! (12/12)
