CT No.233: Adamantium skeleton
The storytelling tradition of SEO strategies and annual reports
In many email clients, that Spotify embed didn't work (although the playlists work fine if you view in browser). But if you don't want to work even that hard, here are the playlists as boring regular links instead. * Best songs of 2024 * Christmas Goodies Have a lovely...
We must remember: the internet is optimized to make us angry. But, begrudgingly, I kinda like LinkedIn.
A step-by-step method for proving the value of your 2025 content plans
How to use a measurement model to understand the effectiveness of content
How to prove your content's value to your business
The words we publish and hold up for peer review remain the best representation of our brains at work in the digital world. A published paper is the best way to look closely at the foundational assumptions of LLMs. And those begin with pop culture.
Transformers take static vector embeddings, which assign a single vector to every token, and expand them with context, processing the relationship of every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
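For word people who like to see the moving parts: the static-vs-contextual distinction can be sketched in a few lines of toy Python. The vectors and the "average your neighbors" step below are made-up stand-ins (real models learn embeddings and use attention), but the shape of the idea is the same: a static embedding gives "bank" one fixed vector everywhere, while a contextual model produces a different vector for "bank" depending on the words around it.

```python
# Toy sketch with hypothetical 2-dimensional vectors; real embeddings
# have hundreds of dimensions and are learned, not hand-written.
static = {
    "river": [0.9, 0.1],
    "money": [0.1, 0.9],
    "bank":  [0.5, 0.5],  # static: the SAME vector in every sentence
}

def static_embed(sentence):
    """Look up one fixed vector per token, ignoring context."""
    return [static[tok] for tok in sentence.split()]

a = static_embed("river bank")
b = static_embed("money bank")
assert a[1] == b[1]  # static "bank" is identical in both sentences

def contextual_embed(sentence):
    """Blend each token's vector with its neighbors' vectors.
    (A crude stand-in for attention, which weights ALL tokens.)"""
    vecs = static_embed(sentence)
    out = []
    for i in range(len(vecs)):
        window = vecs[max(0, i - 1):i + 2]  # the token plus its neighbors
        out.append([sum(dim) / len(window) for dim in zip(*window)])
    return out

ca = contextual_embed("river bank")
cb = contextual_embed("money bank")
assert ca[1] != cb[1]  # contextual "bank" now differs by sentence
```

The point of the sketch: context is injected by mixing information from surrounding tokens into each token's vector, which is, in spirit, what transformer attention layers do at scale.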
How to understand tokens and vector embeddings, for word people.
Even in the face of "black box" algorithms, the history of artificial intelligence—natural language processing, more specifically—has left plenty of clues. While we can't understand the full equation, we can see how building blocks create common patterns in how current algorithms process language.