For people curious about the Roam API and confused by the syntax, or interested in why Conor went with Datomic/Datascript and not a traditional database, this older talk by Roam developer @mark_bastian is a great overview.

Using Spider-Man as an example, he shows how modeling even something fairly trivial in SQL is much more complex than in Datomic. But the real kicker comes when you try to interrogate the data for recursive relationships.
Right now the Roam data model (at least the one exposed to developers) is just pages, blocks, and children, with tags. Already you can see how useful it is to find, say, the page containing a block with a certain tag. https://t.co/jWJnuKu1RG
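To make that concrete, here's a toy sketch in plain Python (not the real Roam API; the attribute names are invented) of the entity-attribute-value triples Datascript is built on. Finding the page behind a tagged block is just pattern matching over facts:

```python
# Toy entity-attribute-value store, mimicking the shape of Roam's
# Datascript model. Attribute names are illustrative, not Roam's schema.
facts = [
    (1, "page/title", "Operational Efficiency"),
    (2, "block/page", 1),            # block 2 lives on page 1
    (2, "block/refs", "military"),   # block 2 contains the tag #military
    (3, "block/page", 1),            # an untagged sibling block
]

def match(attr, value):
    """All entities e with a fact (e, attr, value)."""
    return {e for (e, a, v) in facts if a == attr and v == value}

def lookup(entity, attr):
    """All values v with a fact (entity, attr, v)."""
    return [v for (e, a, v) in facts if e == entity and a == attr]

# "Find the title of every page containing a block tagged #military."
for block in match("block/refs", "military"):
    for page in lookup(block, "block/page"):
        print(lookup(page, "page/title"))  # ['Operational Efficiency']
```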
But imagine when attribute relationships are fully represented.

You should be able to model the entire Spider-Man story in Roam.
Page title: Peter Parker
Child of:: [[Richard Parker]] [[Mary Parker]]
Aliases:: [[Spidey]]

etc., and run these kinds of queries:
"Show me quotes about operational efficiency in books by authors who used to be in the military"
"Show me companies in Boise, Idaho, founded by women, whose evaluation is lower than 10X ARR"
Of course, this data can also be used as input to timelines, graphs, etc:

"Show me a graph of my sleep quality versus days in which I ate foods that had gluten in them or not" (where [[bread]] has a page with ingredients::).
One thing I'm curious about is how the Datalog system differs from Wikidata and SPARQL. The modeling seems similar: you have triples of entities, like :Oslo :is-capital-of :Norway (where all three entities have an id), and you can do graph queries.
So you can ask "Largest city with female mayors", but you can also visualize data in all kinds of ways, like dimensions of elements, children of Genghis Khan, or lighthouses in Norway https://t.co/XvxmEVO9vB
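For a taste of the SPARQL side, the female-mayors question is close to a stock Wikidata example. A sketch using the SPARQLWrapper library (entity and property ids quoted from memory, so double-check them):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Roughly the classic "largest cities with female mayors" Wikidata query.
# P31 = instance of, Q515 = city, P6 = head of government,
# P21 = sex or gender, Q6581072 = female, P1082 = population.
query = """
SELECT ?cityLabel ?mayorLabel ?population WHERE {
  ?city wdt:P31/wdt:P279* wd:Q515 ;
        wdt:P6 ?mayor ;
        wdt:P1082 ?population .
  ?mayor wdt:P21 wd:Q6581072 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
ORDER BY DESC(?population)
LIMIT 10
"""

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["cityLabel"]["value"], row["mayorLabel"]["value"])
```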
What would it look like to integrate Wikidata with Roam in the future, being able to easily pull in and reference data about entities (cities, authors, scientific concepts)... And build our own Wikidata through inter-Roaming... As well as citations (https://t.co/aP7RSyaGl0) ...

The entire discussion around Facebook’s disclosures of what happened in 2016 is very frustrating. No exec stopped any investigations, but there were a lot of heated discussions about what to publish and when.


In the spring and summer of 2016, as reported by the Times, activity we traced to the GRU was reported to the FBI. This was the standard model of interaction companies used for nation-state attacks against likely US targets.

In the spring of 2017, after a deep dive into the fake-news phenomenon, the security team wanted to publish an update covering what we had learned. At that point we didn't have any advertising content or the big IRA cluster, but we did know about the GRU model.

This report went through dozens of edits as different equities were represented. I did not have any meetings with Sheryl on the paper, but I can't speak to whether she was in the loop with my higher-ups.

In the end, the difficult question of attribution was settled by pointing to the DNI report instead of naming Russia or the GRU directly. In my pre-briefs with members of Congress, I made it clear that we believed this action was GRU.
THREAD: How is it possible to train a well-performing, advanced Computer Vision model 𝗼𝗻 𝘁𝗵𝗲 𝗖𝗣𝗨? 🤔

At the heart of this lies the most important technique in modern deep learning - transfer learning.

Let's analyze how it works.
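(To anchor the punchline up front: a minimal transfer-learning sketch in PyTorch, assuming a pretrained ResNet-18. The frozen backbone does the heavy lifting; only the tiny new head trains, which is what makes a CPU workable.)

```python
import torch.nn as nn
from torchvision import models

# Grab a backbone pretrained on ImageNet and freeze all of its weights.
model = models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False

# Swap in a fresh two-class head; it's the only part that will train.
model.fc = nn.Linear(model.fc.in_features, 2)
```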


2/ For starters, let's look at what a neural network (NN for short) does.

An NN is like a stack of pancakes, with computation flowing up when we make predictions.

How does it all work?


3/ We show an image to our model.

An image is a collection of pixels. Each pixel is just a bunch of numbers describing its color.

Here is what it might look like for a black-and-white image:
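For instance, as a NumPy array (illustrative values; 0 is black, 255 is white):

```python
import numpy as np

# A tiny 5x5 grayscale "image": one brightness number per pixel.
image = np.array([
    [  0,   0, 255,   0,   0],
    [  0, 255, 255, 255,   0],
    [255, 255, 255, 255, 255],
    [  0, 255, 255, 255,   0],
    [  0,   0, 255,   0,   0],
])
print(image.shape)  # (5, 5) -- height x width, no color channels
```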


4/ The picture goes into the layer at the bottom.

Each layer performs computation on the image, transforming it and passing it upwards.
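A hypothetical pancake stack in PyTorch makes "transform and pass upwards" concrete; note how the shape changes layer by layer:

```python
import torch
import torch.nn as nn

# A made-up stack of layers for 28x28 single-channel images.
layers = nn.Sequential(
    nn.Flatten(),         # 28x28 image -> 784 numbers
    nn.Linear(784, 128),  # first "pancake"
    nn.ReLU(),
    nn.Linear(128, 2),    # top layer: just two numbers
)

x = torch.rand(1, 28, 28)  # a fake image
for layer in layers:
    x = layer(x)
    print(type(layer).__name__, tuple(x.shape))
# Flatten (1, 784) -> Linear (1, 128) -> ReLU (1, 128) -> Linear (1, 2)
```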


5/ By the time the image reaches the uppermost layer, it has been transformed to the point that it now consists of two numbers only.

The outputs of a layer are called activations, and the outputs of the last layer have a special meaning... they are the predictions!
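A common last step (not yet mentioned in the thread) is a softmax that turns those two activations into class probabilities:

```python
import torch

logits = torch.tensor([2.0, -1.0])    # raw outputs of the last layer
probs = torch.softmax(logits, dim=0)  # rescale into probabilities
print(probs)  # tensor([0.9526, 0.0474]) -> "95% confident it's class 0"
```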
