newsletter-archive

107 posts

CT No.249: Weathering the sea change

Where I'm placing my bets and where I'm walking away

CT No.248: Robots read text, not subtext

And other insights for developing content in the new era of AI-powered search

CT No.247: For the rage monkeys and the print nerds

A holiday gift guide for the content professionals in your life

CT No.245: Billy Joel mathletics

Query fan-out is a new term for what's long been a bedrock of keyword-based language processing: the nouns are central to understanding the user's intent.
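A minimal sketch of the idea (my illustration, not the newsletter's code): "query fan-out" expands one query into several sub-queries, each anchored on a noun, because the nouns carry most of the user's intent. The stopword list and the "overview" suffix are made-up stand-ins for real part-of-speech tagging and query rewriting.

```python
# Toy query fan-out: one sub-query per noun-like word.
STOPWORDS = {"the", "a", "best", "for", "how", "to", "in"}

def extract_nouns(query: str) -> list[str]:
    # Crude stand-in for real POS tagging: keep the non-stopwords,
    # which in short search queries are usually the nouns.
    return [w for w in query.lower().split() if w not in STOPWORDS]

def fan_out(query: str) -> list[str]:
    # The original query plus one noun-anchored sub-query each.
    return [query] + [f"{noun} overview" for noun in extract_nouns(query)]

print(fan_out("best crm for newsletters"))
# → ['best crm for newsletters', 'crm overview', 'newsletters overview']
```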

CT No.244: Explicating "attention" in LLMs

The words we publish and hold up for peer review remain the best representation of our brains at work in the digital world. A published paper is the best way to look closely at the foundational assumptions of LLMs. And those begin with pop culture.

CT No.243: Disambiguation, sliding doors, hallucinations, and madeleines

Transformers take static vector embeddings, which assign a single fixed vector to each token, and expand them with context, weighing every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
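A toy sketch of that contrast (my illustration, with made-up 3-dimensional embeddings): a static table gives "bank" the same vector everywhere, but one self-attention pass mixes in the neighboring words' vectors all at once, so "bank" comes out different next to "river" than next to "money".

```python
import numpy as np

# Made-up static embeddings: each token maps to one fixed vector.
EMB = {
    "bank":  np.array([1.0, 0.0, 0.0]),
    "river": np.array([0.0, 1.0, 0.0]),
    "money": np.array([0.0, 0.0, 1.0]),
}

def contextualize(words):
    X = np.stack([EMB[w] for w in words])
    scores = X @ X.T                # attention scores for every word pair at once
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ X             # each row: a context-mixed embedding

bank_by_river = contextualize(["river", "bank"])[1]
bank_by_money = contextualize(["money", "bank"])[1]
# Same static vector in, two different contextual vectors out.
```

The single matrix multiply `X @ X.T` is the "nearly simultaneously" part: every word attends to every other word in one operation, rather than reading left to right.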
