Hello, new subscribers, and welcome to Q4. We're a B2B newsletter, and we know the urgency that seeps into the water at the end of the year. Let's take some big swings before closing out 2023 with a bang.
We're spending this quarter exploring red herrings and taking some of our own experimental medicine.
In this week's issue:
- On "content" and its discontents: When the meaning of words matters less than the function
- Don't let the chum bucket block your airflow
- Tech links of the week: Was that traffic drop from an algo change?
PROMOS AND SPONSORS
Learn about digital publishing trends in AI, audience growth, and editorial strategy
Do you work in the digital publishing and media space? If so, you must've had a whirlwind of a year!
Between trying to keep up with the pace of AI's impact on the industry, setting up new monetization channels, fighting competitors for relevant organic traffic, exploring unfamiliar audience engagement channels and strategies, and a dozen other developments in the space, you may feel like there's just too much you have to stay on top of.
Join PubTech2023 – a free online event for digital publishing and media professionals that will help you cut through the noise and understand what really matters.
Interested in seeing your ad above? Sponsor The Content Technologist.
“Content” isn’t the enemy: Keeping content creative, not generative
Publisher Deborah Carver steps into the ring to weigh in on recent media criticism of the word "content," suggesting that our attention belongs on the budgets allocated to content production and on the possibilities of generating "form" with AI.
Words have meaning, as we know, and those meanings affect audiences. If I say, "That outfit looks good," you'll likely tell me about the circumstances of its purchase. If it's a skirt, whether it has pockets. If I modify the compliment slightly to say, "That outfit looks good on you," you'll most likely blush a little and say thank you. Not only have I complimented your fashion sense, but I've also hinted that either your physical body or your je ne sais quoi enhances your choice of clothes. I've called you out specifically.
In digital business, meaning is more slippery. There, interpretation is almost entirely dependent on experience and context. If I were to tell a client who doesn't understand their data, "Engagement rates are up and to the right," they'll likely believe their project is skyrocketing toward wild success. In reality, their business could have acquired one more extremely engaged user who is skewing the analytics. Meanwhile, their audience has not grown at all.
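To make that skew concrete, here's a toy sketch in Python. The numbers are invented for illustration: ten regular readers, then one new power user whose activity drags the average "up and to the right" while the audience has barely grown at all.

```python
# Hypothetical sketch: one hyper-engaged user inflates average engagement
# while the audience itself has barely grown. All numbers are invented.

sessions_per_user = [2, 3, 2, 1, 3, 2, 2, 1, 3, 2]  # ten regular readers

before = sum(sessions_per_user) / len(sessions_per_user)

# One new power user logs 200 sessions.
sessions_per_user.append(200)
after = sum(sessions_per_user) / len(sessions_per_user)

print(f"avg sessions/user before: {before:.1f}")  # 2.1
print(f"avg sessions/user after:  {after:.1f}")   # 20.1
print(f"new readers acquired: 1")                 # that's the whole story
```

A tenfold jump in average engagement, from exactly one new reader. This is why a median, or a distribution, tells the truth a mean hides.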
If I say, "I'd like to develop a content strategy for your business," that could be interpreted in a variety of ways, none more accurate than the other. The potential client may hear "I'd like to write every word of your marketing copy." Or "I will create a bunch of social media content for you." Or "I'd like to create a replicable production workflow that allows your business to bypass the usual uncertainty of the publishing business and create a new revenue stream." (By the way, my offering is closest to that last option.)
Meaning is so slippery in digital business because software, promising higher salaries and a more stable lifestyle, has been valued over language and communication. If college students want a high-paying job after graduation, they are advised to major in computer science — not English literature. Ten years ago, a job title like "content strategist" in corporate marketing meant you were compensated far more than a "managing editor," even when the strategist gig required a skillset that was fundamentally the same across both roles. When a young slacker sought a more profitable career, no one advised, "Learn to write a really great sentence!" For as long as I've been paying attention, it has always been "learn to code."
Recent pieces in The New York Times and Slate offer up the term "content" as sacrifice, arguing that craft and art get washed away when anything described as content is seen as interchangeable. "The term doesn't just de-professionalize the creation of art and culture by implying that it's all just more chum to feed to customers too hungry to say no," Aaron Bady wrote in Slate last month. I don't disagree with Bady's assessment, but I'd wager the attitude he describes comes more from paternalistic 20th-century conceptions of the consumer as a dull receptive brain with no ability to learn, create, or change. Whether media and tech executives call it "content" or "writing," the attitude toward both creators and audiences is the same, and the budget line item is seen as expendable or replaceable.
Why “content” isn’t the root of all business evil
Fixating only on the term "content" is a red herring. In my view, as someone who has been working in digital business for more than a decade, the word "content" gave "writing" more credibility. Writing meant lofty thinking, gatekeeping, red-line edits, massive manuscript print-outs, moody individualists who hate compromise, audiences who don't know any better, and the reputational and business risk inherent in publication.
Content was more predictable and friendlier, free of the snobbery of the arts, more amenable to the teamwork and maker-thinking that characterized digital startups in the late aughts and early teens. In digital industries, content is a give-and-take rather than a passive read-out. For me, it was tomato, tomahto: if I could get paid to make meaning on a screen, then you could call the output whatever you want.
In general, whether working in content or consuming it, correcting individual word usage is a fool's errand. Writers should be wary of asking people to alter their vocabularies to serve ideological aims, as that's a karmic career investment that never yields positive returns. It reeks of spin doctoring and grates on the audience's nerves, effectively accomplishing no actual change except adding that not-so-creative "scold" veneer.
How would avoiding a word like "content" make the larger machinations of digital business better? And why are we using language to scapegoat the more worrisome trend of endless replication that plagues corporate thinking? Especially when the word in question is one that we use to convey a box to be filled with something more specific later. Why no beef with words like "thing" or "stuff," let alone "form" or "structure"?
Since 2020, I've seen companies obsess over the words "blacklist" and "whitelist," while actual political progress toward ending structural racism stagnates. I love you, writers, and I am one of you, but let's get real: If businesses are already using the term "content" as shorthand for quality writing, then one writer arguing for calling their work "episodic serial dramas" or "tv shows" instead of "content" is not going to accomplish a damn thing.
Form, content and engineering language beyond meaning: Ceci n’est pas une pipe
Clearly I'm fine with the word "content," and my acceptance of its casual use tracks with experience. When I studied poetry for my English literature major, we were asked to write papers that analyzed form and content with equal concern. Form considers the poem's structure — villanelle, sonnet, ballad, epic, what have you — and the content describes the words that fill that structure and adhere to its rules. Form is the physical; content is the metaphysical. The concepts and separation of form and content are foundational across Western education, from logic to art criticism to, yes, software engineering. Yes, form and content may be polar ends of a continuum, but once separated they're no longer useful.
Engineers build pipes. Whether they hold water or sewage doesn't much matter if the pipe's a good pipe. Software engineers have, over the years, developed a series of tubes* called the internet, which is, as far as audiences are concerned, a giant form that pipes information from one place to another. Literally, websites are designed to be forms.
With email software, we still input text into forms. Search engines like Google and Bing give the user a single input form and some control mechanisms to bring them just about all the information they can find on the non-dark side of the web. Content management systems are forms. Social media platforms use forms. Even when we add an AI interface on top to make the software look less like a form, all software is a fucking form. And what goes in the form?
Content. Content goes into forms.
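That separation can be made literal in a few lines of code. Here's a toy sketch (the template and names are invented for illustration) where a template plays the form and the words play the content:

```python
from string import Template

# The "form": a fixed structure with slots, like any web form or CMS template.
form = Template("<article><h1>$title</h1><p>$body</p></article>")

# The "content": the words that fill the structure and follow its rules.
page = form.substitute(
    title="Websites are still cool",
    body="Form and content, together again.",
)

print(page)
# <article><h1>Websites are still cool</h1><p>Form and content, together again.</p></article>
```

Swap in different content and the form doesn't care; delete the content and the form is an empty box. Neither is worth much on its own.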
In the machinations of business, form is inherently considered more valuable than its content. The binary between the two interrelated concepts has become extreme. Software engineers with undergraduate degrees make far more than their liberal arts counterparts who write for a living. Because "scale"? Because "economics"? Some Marxist explanation about superstructure I learned once and forgot? Because fewer humans = less conflict = efficiency = profits? Probably all of the above, but what matters now is: across industries, the content budget line item is almost always undervalued, under-resourced and misunderstood.
Words have meaning, except when they don't change the high-cost line-item on the budget. So what can we do to fix that?
*It's not the worst metaphor. The internet is closer to a series of tubes than, say, a print magazine, from a business ops perspective.
The content opportunity: Maybe now we can build new forms
First, we have to remember that the internet isn't a stagnant text like the content of a poem; it's culture, a seething reflection of human emotion and material circumstances, closer to fecal matter than to spring water.
The makers of the form, the tech companies subject to our loudest complaints, are so in love with "democratizing" that they only see simple majorities, the outweighing of good effects over bad effects. But digital culture doesn't work like that: with an open form, a little bit of shit can contaminate the clearest water and make everyone sick. Meanwhile, the longtime keepers of the "good" content, legacy news websites and media companies, have minimal interest in repairing rusty pipelines. They also remain among the worst offenders when it comes to violating audience privacy with ad tech and scammy digital marketing practices.
Unlike broadcast media and coastal entertainment hubs, where wide reach rests in the hands of very few power players, quality digital content is networked and dependent on its users. Whether you're on a contained platform environment like Substack, Instagram or TikTok, or navigating the still-very-much-breathing open web, you can't just watch passively and expect the work to be good. It's garbage in, garbage out, and if you don't like what you're seeing, the medium is designed to be flexible.
But right now, the people who have time and energy to make good content, by and large, aren't making it to the best of their abilities. Probably because we're creative people in a highly systematized world, our salaries don't support even modest lifestyles, and we're all forced to use so much software to accomplish basic tasks. It's headache-inducing at best, life-draining at worst.
What if we remember that writing, the kind we studied in school while we received our less-valuable degrees, still comprises both form and content, no matter what the language of industry dictates? What if the generative tools can help us build scaffolding for more lasting creative projects, rather than the addiction to immediacy and rapid change that media and tech companies alike manufacture?
Once more, with feeling: Websites are still cool
If you care deeply about having quality digital content in the future, if you are posting about not wanting to post anymore or not having enough good posts to read anymore, might I suggest: build a website?
Websites are not dead. A website is form and content together, and you don't have to follow anyone's rules if you don't want to. You do have to maintain the site, like a private room or a garden or a social media presence, but you don't have to use Google's ad software or anyone's ad software if you don't want to.
On a website, you don't have to swim among "generated" content if that's not your bag, baby. And hey, websites with original, clearly intentional content stand out. More importantly, as long as they're not creating individual pages "for SEO," they outperform websites that juice up with generative content steroids.
Or you could create the entire structure of your website using generative AI — and fill in the no-code web app you just built with brilliant content you wrote yourself. The software developers who create generative AI build their products for content generation because they are experts in form. You, as the expert in content, can use the generator to combat the tedium of coding and structuring the form. After all, no one says you have to ask the robot for a good story when the machine is better at building bridges.
Whatever it is, think outside the prescribed limitations of "content." Consider what we can do with the forms in front of us, the feeds where we have control, and new options for creating, rather than generating.
In the last quarter of 2023, in this publication, we're experimenting with both "form" and "content." It's part of a quest to become more profitable from the media we create, rather than the clients thought leadership attracts. We'll still be publishing long-form essays like this one, but we're experimenting with shorter pieces and new pathways into The Content Technologist.
And we're experimenting with AI-generated content in ways that augment, not replace, our creative brains. For a digital business, that means using generative AI to solve the most annoying problems of maintaining a website. We want to see what form improvements we can build with the code machines to make our content shine.
Stay tuned. And let us know what you think.
Interested in learning more about how to improve your form? Check out our upcoming salon and find new ways of working with software that enhance and improve your content.
From the bottom of the chum bucket
This week we're featuring another chum bucket from our production assistant, M.E. Gray, whose feed consistently has the greatest automated ads.
Do you have choking awareness? If not, it's on sale! Shop shop shop!
Content tech links of the week
- What does success look like for large language models (LLMs), and are we evaluating them in the best way? At AI Snake Oil, some academics tackle recent coverage of AI and why our current methods for making LLMs good may leave something to be desired.
- Documentation from Google's antitrust lawsuit confirms that it uses click data in ranking — something the company has long denied, and something everyone who works with search extensively has long suspected. Kevin Indig describes what Google's use of click data means for SEO. Do I agree with his conclusions? Not all of them!
- Did your search traffic drop because of an algorithm update? In this awesome new guide, Google describes four different patterns of traffic drops from organic search and the reasons for each.
- In fact, what would a world without Google look like for digital publishing? Joshua Benton at Nieman Lab has some realistic answers.
- Reporting on the content of TikTok trends can be a bit much—you're telling me isolated people with a lot of time on their hands want to tell the world about the cutesy dopey language they invented with their partners? Shocker—but I like this piece in Hell Gate about TikTokkers pretending to be non-player characters on the streets of lower Manhattan. If there's any place that can fold in a cultural trend like human Pokemon, it's lower Manhattan. Like, what if these kids are just today's beatniks? (They're probably not, but a girl can dream.)
- October 19 | How to scope and sell content projects, a virtual skill workshop
- November 3 | Practical AI for content professionals: A live salon about how to work with the robots, a virtual salon
- December 1 | Never go out of style: A live salon about content sophistication, a virtual salon
- December 7 | How to plan change with 1st party data, a virtual skill workshop
The Content Technologist is a company based in Minneapolis, Los Angeles and around the world. We publish weekly and monthly newsletters.
- Publisher: Deborah Carver
- Managing editor: Wyatt Coday
- Production assistant: M.E. Gray
Did you read? is the assorted content at the very bottom of the email. Cultural recommendations, off-kilter thoughts, and quotes from foundational works of media theory we first read in college — all fair game for this section.
Since graduate school, I've been extremely skeptical of behavioral research — I fundamentally believe that you can't replicate real human decision-making in the controlled, experimental environment that the scientific method demands — so Data Colada's points are moot in my view because those studies aren't replicable in the real world anyway.
But what strikes me about this story is how much Harvard Business School takes zero responsibility for the pace and quality of research it's been funding and putting its brand weight behind. The mass-market book publishers aren't stepping up to the plate to discuss how they don't fact-check pop science books, either. When we talk about misinformation or folk economics logic, we have to remember how much institutions are happy to take credit for publishing something new and shiny, only to revoke their support the second it's questioned publicly.