Session: Embeddings Everywhere: Not Magic, Just Math!
In the era of Large Language Models (LLMs) and vector databases, embeddings have become the industry’s favorite buzzword. We’re told they “capture meaning,” but for many they remain a mathematical mystery: a black box where text goes in and magic numbers come out.
It’s time to pull back the curtain and demystify the magic. This talk strips away the hype to reveal the surprisingly elegant math powering modern semantic search, recommendation engines, and Retrieval-Augmented Generation (RAG). We will journey from the basics of one-hot encoding to the high-dimensional geometry of Transformer-based vectors. You’ll learn how computers turn “context” into “coordinates” and how “similarity” becomes “distance.”
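One way to make the “similarity becomes distance” idea concrete is a small sketch comparing one-hot vectors with dense embeddings. The embedding values below are invented purely for illustration, not taken from any real model:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|) -- 1.0 means "same direction",
    # 0.0 means orthogonal (no overlap at all).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# One-hot vectors: every pair of distinct words is orthogonal,
# so "cat" and "dog" look exactly as unrelated as "cat" and "car".
cat_onehot = [1, 0, 0]
dog_onehot = [0, 1, 0]
print(cosine_similarity(cat_onehot, dog_onehot))  # 0.0

# Hypothetical dense embeddings: related words get nearby vectors,
# so similarity in meaning shows up as closeness in space.
cat_dense = [0.8, 0.6, 0.1]
dog_dense = [0.7, 0.7, 0.2]
car_dense = [0.1, 0.2, 0.9]
print(cosine_similarity(cat_dense, dog_dense))  # high (close in space)
print(cosine_similarity(cat_dense, car_dense))  # much lower
```

The talk builds up to how real models learn such coordinates; this toy example only shows why geometry is the right lens once they exist.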
Whether you’re a developer building AI-powered apps or a curious mind wanting to understand the “why” behind the “wow,” you’ll walk away with a clear intuition of what embeddings are, how they are calculated, and how to use them effectively, without hand-waving or blind trust in the model.
Bio
Shalmali Kulkarni is a Sr. Lead Data Scientist on the GSMT team, where she drives Sales and Marketing AI initiatives that translate data into measurable business impact. With over a decade of experience building and deploying AI-driven solutions at scale, she is the architect behind the GAIN models in Zone and currently leads Marketing AI for Zone, powering smarter targeting, personalization, and growth.
She holds a master’s degree in Applied Data Science from New York University.
Shalmali is both a professionally trained architect and a data scientist. Her career has moved from design to data and now to agents, blending creativity and engineering to design AI that thinks, acts, and delivers results.
Based in New Jersey, Shalmali enjoys dancing, cooking, and collaborating across teams.