Pinned · Daniel Tunkelang
Ranking vs. Relevance: 2 Pitfalls and How to Avoid Them
Relevance takes priority over desirability, but desirability dominates small differences in probability of relevance.
3 min read · Apr 29, 2024
Pinned
LLMs and RAG are Great, But Don’t Throw Away Your Inverted Index
It is tempting to believe that we can dispense with the inverted index in favor of embedding-based retrieval. But there are a few…
3 min read · Mar 29, 2024
Pinned
Sparse and Dense Representations
AI-powered search moves from sparse bags of words to dense embedding-based representations. But sparse vs. dense is a false dichotomy.
6 min read · Apr 15, 2024
Pinned
AI-Powered Search: Embedding-Based Retrieval and Retrieval-Augmented Generation (RAG)
Replacing traditional search with AI-powered search means embedding-based retrieval and possibly retrieval-augmented generation (RAG).
9 min read · Apr 8, 2024
Pinned
Semantic Equivalence of e-Commerce Queries
Semantic Equivalence of e-Commerce Queries, by Aritra Mandal, Daniel Tunkelang, and Zhe Wu. KDD 2023 Workshop on E-Commerce and NLP (ECNLP).
1 min read · Aug 7, 2023
Is Similarity Objective?
Many search problems involve content and query similarity. Is similarity objective? In theory, no. In practice, it’s often close enough.
5 min read · 1 day ago
Bags of Documents and the Cluster Hypothesis
The bag-of-documents model is a corollary to the cluster hypothesis. The model is likely to fail if a query violates the cluster…
5 min read · Jun 10, 2024
Bags of Queries as Sparse Document Representations
The bag-of-queries model provides a sparse document representation that can be useful as either a positive or negative relevance signal.
4 min read · May 28, 2024
Is Targeted Advertising Ethical?
Users have — or should have — two choices: pay cash or accept targeted advertising. There is no free lunch.
4 min read · May 7, 2024
LLMs and RAG Are Great. What’s Next?
In the next few years, I believe that we will see LLMs focus less on size and more on function calling and tool use.
7 min read · Apr 18, 2024