CT No. 198: Content strategy is not content planning
How can we better elevate content strategy at the business level so that our roles, our passions, our work, and our brains are not considered replaceable?
Happy new year! It's four days into 2024, and if you are like me, your body is sore from all the new workouts you've been adding to your routine. The Content Technologist is on our new biweekly publishing schedule—that means every other week, not twice...
A special offer for Content Technologist subscribers
Everything loved and learned in 2023 and a preview of 2024
With data science and algorithms being flattened into binary, good-versus-bad narratives, where can we find clarity and room to experiment?
Is it science or is it fiction?
The words we publish and hold up for peer review remain the best representation of our brains at work in the digital world. A published paper is the best way to look closely at the foundational assumptions of LLMs. And those begin with pop culture.
Transformers take static vector embeddings, which assign a single fixed vector to every token, and expand them with context, processing each token's relationship to every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
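To make that contrast concrete, here is a minimal sketch in plain NumPy. The vocabulary, dimensions, and example sentences are all hypothetical, and a real transformer adds learned projections, multiple heads, and many stacked layers on top of this single attention pass:

```python
# A minimal sketch (hypothetical vocabulary and sizes): a static embedding
# table assigns one fixed vector per token, while one self-attention pass
# re-mixes each token's vector with every other token's vector at once.
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "bank", "of", "river", "money"]
dim = 4
embedding_table = {tok: rng.normal(size=dim) for tok in vocab}  # static: one vector per token

def embed(tokens):
    """Static lookup: 'bank' gets the same vector in every sentence."""
    return np.stack([embedding_table[t] for t in tokens])

def self_attention(x):
    """One attention pass: every token attends to every other token simultaneously."""
    scores = x @ x.T / np.sqrt(x.shape[1])         # pairwise similarity between tokens
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over the sentence
    return weights @ x                             # each row is now a context-aware blend

s1 = embed(["the", "bank", "of", "river"])  # two hypothetical sentences
s2 = embed(["the", "bank", "of", "money"])

print(np.allclose(s1[1], s2[1]))  # True: static vectors ignore context
print(np.allclose(self_attention(s1)[1], self_attention(s2)[1]))  # False: context changes "bank"
```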
How to understand tokens and vector embeddings, for word people.
Even in the face of "black box" algorithms, the history of artificial intelligence—natural language processing, more specifically—has left plenty of clues. While we can't understand the full equation, we can see how its building blocks create common patterns in the way current algorithms process language.
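For word people who want to see a token before reading about one, here is a minimal sketch using the open-source tiktoken library; picking that particular tokenizer is an assumption for illustration, and any subword tokenizer shows the same building-block pattern:

```python
# A minimal sketch: a subword tokenizer turns text into integer IDs,
# the raw units that embeddings and transformers actually operate on.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # one common encoding; choice is illustrative

ids = enc.encode("Content strategy is not content planning")
print(ids)                             # integer IDs, one per token
print([enc.decode([i]) for i in ids])  # the subword pieces behind those IDs
```

Notice that the pieces don't always line up with dictionary words: frequent words survive whole, while rarer ones split into fragments, which is exactly the kind of common pattern the issue unpacks.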