This essay originally appeared in CT No. 134: Editorial via algorithms: What is Helpful Content?, which was sent to paid subscribers on August 25, 2022.

Google's updates are programmed algorithmically, and they aren't as prescriptive as those in a traditional style guide. Although they're created by technologists, linguists, and engineers at one company, they're developed with an editorial ideal of a website in mind. As in much of the tech industry, they're iterative and outcome-focused rather than the one-and-done process of traditional content publishing.

These newest updates reverse engineer standards that are similar to established book or periodical publishing principles. If we format and publish information on the web while following consistent norms, that information becomes more credible.

Yes, it's problematic that Google exclusively holds the keys to information quality through search. I'm tired of writing about Google and following only their ideas of what's best for the web. I wish our system provided better options for a comprehensive, searchable index of web content.

However, if you're in the business of facts and you want to reach global audiences of all ages, search optimization is necessary to distribute information and ideas. Digital search is a deeply established information-seeking behavior, like watching TV news or reading before bedtime. It requires as much editorial effort as publishing in other media, if not more.

Conferring consensus: The advantages of slow media

In last week's links, I shared Google's recent assertion that its search engine now "understands consensus." My postmodern brain wonders whether understanding consensus is even achievable for a human brain... but for Google, the phrase means that when multiple high-quality websites make the same claim, the search engine displays that claim as factual. Compared with real-time social media, Google is better equipped to handle consensus on facts because its algorithms value:

  • website longevity and relative permanence
  • established social proof and topical authority through linking and content
  • trust signals like context, citations, and authorship

If you're a disinformation actor, or even a brand-new source of high-quality, reliable information, it's challenging to spin up a website and immediately have your shiny new facts appear on the first page of search results for an established query. The web, which Google has made synonymous with SEO, is a medium of credibility. That means your brand or publication's reputation depends on both links from others and similarity to other sources covering the same topic.

It takes six months to a year of regular content creation to begin appearing in search results, even for the most marginal topics. Once you've established a reputation for expertise, it's easier to rank for brand-new queries, but organic search algos are engineered to be hard to hack. Google's newly announced updates are designed to highlight facts and improve information quality.

If you're building a website without focusing on organic search/SEO/Google... your very expensive website is a money pit. Organic social traffic continues to diminish, paid promotion costs keep rising, and word-of-mouth/traditional PR is chancy, expensive and time consuming. Organic search, on the other hand, reliably and regularly brings audiences who are actively looking for your content directly to your website. When websites focus on search optimization, their traffic and the ROI of creating organic content grow every year.

SEO is a long-term, slow investment in developing and maintaining expertise. Creating a website can be faster than publishing and distributing a book with new ideas and research... but not that much faster.* The slow speed is also the advantage: website content has a far longer shelf life than any periodical, broadcast, or social medium, with a higher long-term return on investment, financial or influential.

*It's not much cheaper, either. We all need to get over the idea that "websites are free" because they are quite expensive and on par with printing costs.

Pruning the imperfect index

In the same post with the consensus news, Google also acknowledged that users rely on the search engine to confirm facts.** Glimpse a controversial new statement on a preferred social discovery medium, corroborate that info through search. Sure, marketing and shopping are far more profitable behaviors, and the search engine has honed them over the past two decades. But the tool's unique strength is the same as when it was a research project known as PageRank: Google is a searchable index of information.

The index has never been perfect. Its ability to process, understand and serve quality, factual information for new or rapidly changing queries remains Google's biggest challenge. The consensus found in early search results has too often reflected the worldview of the Silicon Valley hacker-engineer stereotype who built the internet, parroting the basement creep who has time, means and energy to screed.** But Google is now making changes like media literacy notifications and actively monitoring viral disinformation and emerging queries. That gives me hope for the stability and quality of the index because, hey, 90% of the world uses the search engine multiple times daily.

If you manage a website, the best thing you can do (in Google's eyes, right now) is to make content more helpful.

**For more on how Google's algorithms reinforce established biases, check out Safiya Umoja Noble's Algorithms of Oppression. Worth noting that shitposting is not a new problem in publishing. Rich trolls and graduate students have distributed bile — along with valid and legitimate new ideas — since the printing press was invented.

What Google's helpful content update means for your website

The imminently scheduled Helpful Content Update is the first major content-focused Google algorithm update in about a decade. Previous algo updates, nicknamed Panda (2011) and Hummingbird (2013), significantly improved the quality of search results, and many of us on the content side hope Helpful Content does the same.

If you've been following Google's guidelines for search optimization all along, your website will see improvement or no change at all. If you've been trying to hack search results because "it works, no matter what Google says," or because you learned SEO once from a website/journalism school/marketing company in 2013 and think it's all about keywords and meta descriptions, you might be in trouble.

Google identified three specific areas that the Helpful Content Update will impact:

  • Misleading content or information, such as pages devoted to new seasons of TV shows that do not yet exist
  • Content that is designed primarily for search engines or quick-turn traffic, aka over-optimized, badly written or AI-generated content
  • Websites that cover broad, unrelated topics on a surface level to attract "traffic" or "eyeballs"

Here are Google's two updates on the issue:

  • More content by people, for people in Search: "We’re improving Search so you’re less likely to find content made primarily for search engines, and more likely to find helpful, authentic information." (High-level summary)
  • What creators should know about Google’s helpful content update | Google Search Central Blog | Google Developers (In-depth information with website requirements)

At Amsive, Lily Ray notes that the helpful update is sitewide, so it's not just the bad landing pages that will be affected. If you have shitty "SEO content" published alongside the quality, legitimate information about your brand, your entire website may decline in rankings.

So what should you do to ensure your website content is helpful?

  1. Focus on building topic authority across your whole website, rather than on individual landing pages. Sites, not pages, has been best practice for a while, but Google is amping up its focus on topic authority. Decide on authoritative topics when developing strategies. Bake keyword, topic, and audience research into your information architecture. Connect your topics through contextual internal links, preferably chosen by humans. Make it so your readers want to dive deep and understand context.
  2. Ensure every page has a next step, preferably one chosen by human editors. Unlike print periodicals with a clear beginning and end, web content should always be presented in some sort of linked, deeper context. If you have the resources, do not rely on automated tagging systems to surface your best content. People make better editorial decisions than computers every time.
  3. Audit your old "SEO pages" and liberally retire and redirect that content if it's not serving your mission or your business. Get rid of pages with outdated information, shallow content, or content disconnected from the rest of your approach.
  4. While we're at it, let's all stop creating "SEO content" on websites altogether. If you are creating content for an audience that uses search, every piece of content should be optimized for the medium.

    If your primary medium is your website, digital optimization is a core part of any editorial or content creation role, not just for siloed "audience editors" or outside "SEO experts."†

    Bake query research and topic modeling in from the beginning; it's every bit as important as interviewing subject matter experts and more effective than social listening at understanding your audience and the language they use.
  5. Create lasting, permanent meaning through linked subject matter expertise, rather than one-and-done stories served up in a "blog." We're not selling individual issues on a newsstand; we're building authority at the topic level.
  6. If you haven't yet, audit your trust signals. Ensure authorship, mastheads, editorial policies, and citations are in plain sight and referenced in structured data where possible.
  7. Check your facts, and keep them up to date. Websites are the 21st century's reference materials, and I'd argue that they should be updated and checked more rigorously than books (most of which are not extensively fact-checked beyond preliminary edits).
  8. Monitor your direct and organic traffic numbers in your analytics platform. Both should grow steadily over time and together should comprise at least 80% of traffic on a healthy website.
  9. Link out! Link to trustworthy sources wherever possible, and trustworthy sources will link back to you. The point of the whole web thing is community knowledge sharing, which means linking. "We want people to stay on our website and never look at another source of info" doesn't speak to the contemporary content economy. Just open the link in a new tab if you're worried about folks losing their spot in your content.
  10. Support your work with automation, but default to humanity if you have a choice. Only humans can ideate and create appropriate context for an audience.††
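Trust signals like authorship and citations (step 6) can be referenced in structured data via schema.org's Article type. Here's a minimal Python sketch; the headline, author name, and citation URL are hypothetical placeholders, and real implementations should follow Google's structured data documentation for the full set of supported fields:

```python
import json

def article_structured_data(headline, author_name, date_published, citations):
    """Build a schema.org Article JSON-LD block exposing basic trust signals:
    authorship, publication date, and cited sources."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author_name},
        "datePublished": date_published,  # ISO 8601 date string
        "citation": citations,            # URLs of sources referenced in the piece
    }

# Hypothetical example article — all values illustrative
jsonld = article_structured_data(
    headline="What is Helpful Content?",
    author_name="Jane Editor",
    date_published="2022-08-25",
    citations=["https://developers.google.com/search/blog"],
)
print(json.dumps(jsonld, indent=2))
```

The resulting JSON-LD would be embedded in the page inside a `<script type="application/ld+json">` tag so crawlers can read the trust signals alongside the visible masthead and byline.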
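The 80% benchmark in step 8 is easy to check against a channel breakdown exported from your analytics platform. A minimal sketch, assuming hypothetical session counts and channel labels (your platform's names for the direct and organic channels will differ):

```python
def owned_traffic_share(sessions_by_channel):
    """Fraction of total sessions arriving via direct or organic search."""
    total = sum(sessions_by_channel.values())
    owned = (sessions_by_channel.get("direct", 0)
             + sessions_by_channel.get("organic_search", 0))
    return owned / total if total else 0.0

# Hypothetical monthly channel breakdown from an analytics export
sessions = {"direct": 4200, "organic_search": 9300, "social": 1100, "referral": 900}
share = owned_traffic_share(sessions)
print(f"Owned traffic share: {share:.0%} (healthy benchmark: 80%+)")
```

Tracking this ratio month over month shows whether search optimization work is compounding or whether the site still leans on paid and social channels.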

The role of websites in digital information ecosystems is solidifying: they are the source of truth, an alternative to the flurry of real-time, user-generated speculation.

†Pro tip: Use automated grammar checkers for all your comma-related nitpicks and other proofreading tasks. Edit at the idea and context level, and leave the automatable tasks to the computers.

††Once more for the back row: GPT-3 and DALL-E 2 cannot generate ideas! They can draw connections between two disparate topics, especially when prompted, but they are consensus copy machines.


Related reading from The Content Technologist:

  • Google MUM & the ideal SERP: What if you could have more control over the results of what an algorithm serves up for you? What does the ideal result look like?
  • How do keyword research tools generate their data? How do Google’s keyword planner and other keyword research tools generate their data? Here’s what I’ve learned in the past decade of SEO research.
  • How to read Google Search Console Insights: GSC Insights is an automated dashboard with content-focused data from both Search Console and Google Analytics, along with some brand new data that Google has never before offered as a default calculation.