We have a different combination of hosts for this episode as we continue the series on the types of database systems available and why you might choose one over another. Michael continues to impress by recalling everything we've ever said across our 500+ hours of podcasts, Allen enjoys learning about a database system he'd never come across, and Joe is loaded up and ready for his trek to Georgia, USA.
Reviews
- iTunes: Calum55555
- Spotify: Ian Neethling, Ghostmerc, Xuraith
- Audible: Wood2prog
News
Orlando Code Camp
https://orlandocodecamp.com/
Object Oriented DBMS
Wide Column Stores
- Popular: 12. Cassandra, 26. HBase, 27. Azure Cosmos DB
- Also known as extensible record stores
https://static.googleusercontent.com/media/research.google.com/en//archive/bigtable-osdi06.pdf
- Can hold extremely large numbers of dynamic columns
- How large is “extremely large”? A single record can have billions of columns, which is why wide column stores are also described as two-dimensional key/value stores
- Schema on read
- Wide column stores should not be confused with columnar storage in an RDBMS – the latter is an implementation detail inside a relational database system that improves OLAP-style query performance by storing data column by column rather than record by record
- Using Cassandra as the reference for this section – https://cassandra.apache.org/_/cassandra-basics.html
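The “two-dimensional key/value store” idea above can be sketched in a few lines of Python. This is a toy model for illustration only (the class name and shape are our own invention, not how Cassandra is actually implemented): each row key maps to its own independent set of columns, so two rows need not share any columns at all, and the meaning of the data is interpreted when you read it (schema on read).

```python
from collections import defaultdict

class WideColumnTable:
    """Toy two-dimensional key/value store: row key -> column name -> value."""

    def __init__(self):
        # Columns are dynamic: every row carries its own column set,
        # so no fixed schema is enforced on write (schema on read).
        self.rows = defaultdict(dict)

    def put(self, row_key, column, value):
        self.rows[row_key][column] = value

    def get(self, row_key, column, default=None):
        return self.rows.get(row_key, {}).get(column, default)

table = WideColumnTable()
table.put("user:1", "name", "Alice")
table.put("user:1", "last_login", "2024-01-15")
table.put("user:2", "name", "Bob")
table.put("user:2", "favorite_color", "green")  # a column user:1 never has
```

Reading `favorite_color` for `user:1` simply returns the default – the store never complains that a column is “missing,” which is exactly what makes billions of sparse columns cheap to represent.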
Vector DBMS
- Popular: 52. Kdb, 103. Pinecone, 139. Chroma
- A database system that specializes in storing vector embeddings and being able to retrieve them quickly
- What is a vector embedding?
- https://www.pinecone.io/learn/vector-embeddings-for-developers/
- What is a vector? A mathematical structure with a magnitude and a direction
- Think of it as a point in space (on a graph) with the direction being the arrow from (0,0,0) to the vector point
- They say that for developers, it’s easier to think of a vector as an array of numbers
- When you look at the vectors in space, some will be floating by themselves while others might be clustered closely to each other
- Vectors are very useful in machine learning algorithms because CPUs and GPUs are very good at doing the math involved
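The “clustered closely together” idea above can be made concrete with a tiny, self-contained sketch (the vectors below are made up for illustration). Cosine similarity is one common way to measure how close two vectors point in the same direction: values near 1 mean the vectors are clustered together, values near 0 mean they are unrelated.

```python
import math

def cosine_similarity(a, b):
    """Return how closely two vectors point in the same direction (max 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 3-dimensional vectors standing in for real embeddings
cat = [0.9, 0.8, 0.1]
kitten = [0.85, 0.75, 0.2]
car = [0.1, 0.2, 0.95]

print(cosine_similarity(cat, kitten))  # close to 1: clustered together
print(cosine_similarity(cat, car))     # much lower: far apart in space
```

This pairwise comparison is also why GPUs matter here – comparing one query vector against millions of stored vectors is just a lot of multiply-and-add operations, which is exactly what GPUs are built for.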
- Vector embedding is the process of converting virtually any data structure into vectors
- It’s not as simple as just a straight conversion
- You don’t want to lose the original data’s “meaning”
- An example they used was comparing two sentences – you wouldn’t just compare the words; you’d want to compare whether the two sentences have the same meaning
- Keeping the meaning and producing vectors with relationships that make sense requires embedding models
- Nowadays, many embedding models are created by passing large sets of “labeled” data to neural networks
https://en.wikipedia.org/wiki/Neural_network
- Neural networks are usually trained using supervised learning, though they can also be trained with self-supervised or unsupervised learning
- Using a supervised model, you pass in large sets of data as pairs of inputs and labeled outputs
- The values are transformed in each layer of the neural network
- With each training pass, the weights that drive the activations at each layer are adjusted
- The goal is that eventually the neural network will be able to provide an output for any given input, even if it hasn’t seen that specific input before
- The embedding model is essentially those layers of the neural network minus the last one that produced the labels – instead of a label, you get back a vector embedding
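That “network minus the last layer” idea can be illustrated with a deliberately oversimplified sketch. Everything below is made up for the sake of the picture – the “weights” are hand-picked constants, not a trained network – but it shows the structural point: the full network runs the input through hidden layers and then a final labeling layer, while the embedding model stops just before that final layer and hands you the internal representation instead.

```python
def hidden_layers(x):
    # Stand-in for the trained hidden layers: transforms the raw input
    # into an internal representation (the values here are arbitrary)
    return [x * 0.5, x * -0.25, x + 1.0]

def output_layer(hidden):
    # The final layer that would produce a label during training
    return "positive" if sum(hidden) > 1.0 else "negative"

def classify(x):
    # The full network: input -> hidden layers -> label
    return output_layer(hidden_layers(x))

def embed(x):
    # The embedding model: same network, but stop before the last layer
    return hidden_layers(x)

print(classify(2.0))  # a label
print(embed(2.0))     # a vector embedding instead of a label
```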
- They have a great visualization on the Pinecone page showing the output of a word2vec embedding model, which shows how words would appear in this 3D vector space
- This is what an embedding model does – it can take inputs and know where to place them in “vector space”
- Items placed closer together are more related; items placed further apart are less related
- OK, so now that we know what vector embeddings are, what can we do with them?
- Semantic search – rather than search engines only matching words similar to what you entered, they can now search for content whose meaning is similar to what you searched for
- Question answering applications
- Audio search
- Check out the page of sample applications – https://docs.pinecone.io/page/examples
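The retrieval at the heart of those use cases can be sketched with a brute-force nearest-neighbor search (the document names and vectors below are made up; real systems like Pinecone use approximate nearest-neighbor indexes rather than scanning every stored vector):

```python
import math

def distance(a, b):
    """Euclidean distance: smaller means 'more related' in vector space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Made-up embeddings standing in for documents already stored in the DBMS
store = {
    "doc-cats":    [0.9, 0.8, 0.1],
    "doc-kittens": [0.85, 0.75, 0.2],
    "doc-trucks":  [0.1, 0.2, 0.95],
}

def nearest(query, k=2):
    # Rank every stored vector by distance to the query and keep the top k
    return sorted(store, key=lambda key: distance(store[key], query))[:k]

print(nearest([0.88, 0.79, 0.15]))  # the two animal docs rank first
```

In a semantic search application, the query vector would come from running the user’s search text through the same embedding model used to store the documents, so “close in vector space” means “close in meaning.”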
Resources
Tips of the Week