CT No. 71: Analytics that illuminate your audience of lurkers
1/28: How to use Google Analytics to create reports on your audience, a review of Mangools, and links about good jobs and bad jobs
1/21: First-party data collection 101, a robust free-to-use survey tool, and great stats on news consumption and audience trust.
1/14/21: Data acquisition for content marketers, annual survey, a cookie widget tool review, and new tools from Google.
Our culture needs to reconsider -- deeply, honestly -- the economics and standards of the content and media environment.
This year's most read content and my favorite posts.
Website redesign nitty gritty and technical details, a review of a proximity chat app, and links of the week
The words we publish and hold up for peer review remain the best representation of our brains at work in the digital world. A published paper is the best way to look closely at the foundational assumptions of LLMs. And those begin with pop culture.
Transformers take static vector embeddings, which assign a single fixed vector to every token, and expand them with context, processing every other word in the sentence nearly simultaneously. But who cares, let's listen to a pop song!
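The idea above can be sketched in a few lines of toy code. This is not how any real transformer is implemented; it's a minimal illustration, with an invented six-word vocabulary and random vectors, of the difference between a static embedding (one fixed vector per token, no matter the sentence) and a contextual one (the same token's vector changes once its neighbors are mixed in via attention-style weighting).

```python
import numpy as np

# Toy vocabulary: one fixed ("static") vector per token, chosen at random.
rng = np.random.default_rng(0)
vocab = ["the", "river", "bank", "opened", "an", "account"]
static = {tok: rng.normal(size=4) for tok in vocab}

def contextual(tokens):
    """Toy self-attention: each token's new vector is a softmax-weighted
    mix of every token's static vector, so context changes the result."""
    E = np.stack([static[t] for t in tokens])      # (n_tokens, dim)
    scores = E @ E.T / np.sqrt(E.shape[1])         # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over each row
    return weights @ E                             # contextual vectors

s1 = ["the", "river", "bank"]
s2 = ["the", "bank", "opened", "an", "account"]

# The static vector for "bank" is the same in both sentences,
# but its contextual vector differs once neighbors are mixed in.
v1 = contextual(s1)[s1.index("bank")]
v2 = contextual(s2)[s2.index("bank")]
print(np.allclose(v1, v2))  # False: context changed the representation
```

In other words, "bank" means one thing next to "river" and another next to "account", and the contextual vectors reflect that, while the static lookup table cannot.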
How to understand tokens and vector embeddings, for word people.
Even in the face of "black box" algorithms, the history of artificial intelligence—natural language processing, more specifically—has left plenty of clues. While we can't understand the full equation, we can see how building blocks create common patterns in how current algorithms process language.