CT No.203: WYSIWYG until the next feature release
Closing out content administration month, the chum bucket takes LinkedIn, and a content analytics job worth one million dollars
Events next week! An intro to content administration! And the difference between the tech and media approaches to the content business
Bottle episode: Funeral for a friend
What if keywords aren't what you think they are?
Content powers many businesses, but decision-making about content strategy is rarely elevated to the leadership level. These five continua will directly connect content strategy decisions with budget fluctuations and tangible results.
A team-oriented thought-starter, brand new courses, and oodles of links and resources for your team
The words we publish and hold up for peer review remain the best representation of our brains at work in the digital world. A published paper is the best way to look closely at the foundational assumptions of LLMs. And those begin with pop culture.
Transformers take static vector embeddings, which assign a single fixed vector to every token, and expand them with context, processing each word against every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
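A minimal sketch of that idea, for word people who also read code: the vocabulary, the vectors, and the single simplified attention pass below are all made up for illustration (real models use learned query, key, and value matrices and many stacked layers). The point is the contrast: a static embedding table hands every token one fixed vector no matter the sentence, and the attention step then mixes each token's vector with every other token's, nearly simultaneously, to produce context-dependent representations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical static embedding table: one fixed 4-dimensional vector per token,
# the same vector every time the token appears, in any sentence.
vocab = {"content": 0, "powers": 1, "business": 2}
embedding_table = rng.normal(size=(len(vocab), 4))

def embed(tokens):
    """Static lookup: the same token always gets the same vector."""
    return embedding_table[[vocab[t] for t in tokens]]

def self_attention(x):
    """One simplified attention pass: every token attends to every other token."""
    scores = x @ x.T / np.sqrt(x.shape[1])                                # similarity between all token pairs
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # softmax over each row
    return weights @ x                                                    # each output is a weighted mix of all tokens

sentence = ["content", "powers", "business"]
static = embed(sentence)              # identical for "content" in any sentence
contextual = self_attention(static)   # now shaped by the words around it

print(static[0])       # the fixed vector for "content"
print(contextual[0])   # "content" after mixing with "powers" and "business"
```

Running it shows the before-and-after: the static vector for "content" never changes, while the contextual vector does, because it now carries a weighted share of its neighbors.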
How to understand tokens and vector embeddings, for word people.
Even in the face of "black box" algorithms, the history of artificial intelligence, and of natural language processing more specifically, has left plenty of clues. We can't see the full equation, but we can see how its building blocks create common patterns in the way current algorithms process language.