Everyone else, also

Everyone else also thinks it’s about them.

Everyone else is in a hurry.

Everyone else is afraid.

Everyone else wonders if they’re being left behind.

Everyone else is tired.

Everyone else isn’t sure, either.

The good news is that everyone else also has unused potential and the ability to make an impact.

Online marketing vs. marketing online

Online marketing has become a messy mix of direct marketing, SEO, tricks, tips, code and guesswork. It’s an always-moving target, and it’s mostly focused on tactics, not strategy, because tactics are easy to measure.

Marketing online, on the other hand, is what happens when the work to serve our audience arrives in an electronic form. Marketing online is simply marketing, the act of making things better by making things, aided by a mouse and a keyboard.

Be careful not to get stuck focusing on the wrong one. You need both, but one drives the other.

How to Do a Quarterly Business Review (QBR)

A question that comes up a lot at Portent and in agency circles is, “What defines a good quarterly business review?”

To be sure, it’s a very subjective question, but at its heart, the answer can be pretty simple: a good QBR is one that shows value. Value in the work you’ve done. Value in the relationship. Value in the time and money that you’ve spent! Your teams have dedicated countless hours working hard to deliver on the client promise, and now is your time to highlight the fruits of your labor. However, don’t get caught up in thinking you can throw out some numbers that, in a vacuum, impress an SEM specialist or a content writer. Align everything in your presentation back to your client’s goals.


Your Quarterly Business Review Should be Tied to Client Goals

You Have Client Goals, Right?

Don’t laugh here, because I’ve seen instances where a client can’t or won’t provide them. Perhaps they were only interested in the overall health of their website campaigns. Maybe they’ve been led to believe that when their CTA, CPA, CTR, DA, and other acronyms are all looking better, then they’re winning! Don’t get me wrong; showing a client why these metrics are important, and how your work impacted them, is essential, though perhaps better suited for a monthly recap conversation.

Early in a client engagement, drive a conversation that bubbles up these small successes into a larger purpose: your client’s primary goals. Whether those goals are qualitative or quantitative, your QBR should tell the story of how your work, and any underlying challenges and successes, affected them. Examples here could be growing revenue by 20% year-over-year, increasing cart completions, doubling traffic from paid sources to targeted landing pages, or driving up site engagement. Once you have these, you can frame your QBR around them.

Quarterly Business Review Best Practices

What Makes a Good QBR?

Over the years, I’ve seen various ways that you can run a QBR. The best ones, however, center on the client’s goals and speak to the ways that we have been working to achieve them. What are we going to start working on this quarter? Did something not go well last quarter? What will we carry over? In other words, Start, Stop, and Continue (though around here, we prefer to call them Wins, Challenges, and Opportunities).

When you highlight the wins, challenges, and opportunities from the past quarter, you’re able to run a quick retrospective (or as some may call it, a post-mortem). This thoughtful look back gives you and the client a way to pat yourselves on the back, critically examine misses, and explain how your team is going to roll the insight from the two into next quarter’s opportunities. The following topics lay the groundwork for a thoughtful retrospective.

Start with KPIs

Key Performance Indicators (KPIs) are a great way to evaluate your success against established goals. It’s nice to begin by showing the big numbers to which you and your audience are beholden. You will want to show progress to-date, projections for the rest of the year, and comparisons to previous periods.

Wins


Maybe I’m just a sucker for leading with the good news, but I love setting the tone of a QBR by highlighting some wins! What kicked ass? Why did it kick ass? How did the ass-kicking impact that client’s goal?

Challenges


Not every strategy or tactic is going to produce a win. What didn’t work? Why didn’t it work? What did we learn? Most failures come with learnings and lead to the next topic: opportunities.

Opportunities


Moving from challenges to opportunities provides a natural segue for a QBR conversation. If your retrospective unveiled a glaring gap, there must be a way forward; highlight it here. Don’t forget to promote the ways that you’re going to further leverage or expand on your wins.

Roadmap For the Next Quarter and Beyond

In this section, it’s time to lay out a plan. Using your existing roadmaps and items from your wins, challenges, and opportunities, let your client know what you’ll be working on for the upcoming quarter(s). Be sure to get buy-in for your plans. The beauty of this section is that you can reference it during your next QBR using the above format.

How do I Prepare for a Quarterly Business Review?

While there are many ways to prepare for a QBR, at Portent, we encourage our teams to be proactive in planning for these meaningful discussions. Depending on the relationship with your client and direct contacts, they might even appreciate an opportunity to preview the final document that you’ll be presenting. So with that in mind, we keep to the following schedule.

Three Weeks Out

Assign your tasks. Start by creating all of the assignments for the upcoming weeks. For me, that means ensuring that initial drafts of our presentation deck get started, revised, and completed on time.

Internal QBR prep. Next, we schedule two meetings. The first is a 30-minute internal meeting for QBR prep where our team will review our client’s goals. The meeting organizer should prompt their team to contribute their wins, challenges, and opportunities from the last quarter.

QBR brainstorm/work session. The second internal meeting is a QBR brainstorm/work session. Your account specialists need to come to this hour-long session prepared with specific metrics that detail how their work impacted the client’s KPIs and quarterly goals. With that data in hand, the team will collaborate on a 30-60-90 day roadmap for the client based on the opportunities that were uncovered during this brainstorm session.

Two Weeks Out

Schedule internal slide delivery. We typically ask that our teams deliver their slides to the account manager two weeks before the final presentation. Each team member should include wins, challenges, and opportunities sections. These must match the template’s formatting and include high-resolution images and charts. Ensure that your team’s slides are rooted in the client’s vernacular, and tie tightly back to your client’s goals and KPIs. Don’t just add sexy numbers. Show your client the “so what?”

Compile your presentation. Once your account manager has the slides they need, it’s time to compile the information into one comprehensive presentation. If you’re using PowerPoint, gather your slides and create the master deck. BONUS TIP: Use an online version of a presentation from the outset, and you’ll save time compiling the final version later. Not only that, your contributors will benefit from additional insight as their peers gather their thoughts. We’re all about breaking down silos, and we find that this process furthers the collaborative spirit. Some examples of online presentation resources include Google Slides and Microsoft’s Office Live.

Internal team presentation. An internal presentation is your chance to give your QBR a dress-rehearsal. Your team should be prepared to speak to their sections (if applicable). We also highly suggest involving senior leadership or some other stakeholder from your agency or company. These additional eyeballs provide an objective perspective and ensure that your presentation is delivering on your company’s promise. Any final updates should be implemented following this meeting.

One to Two Days Before the QBR

Send your deck to the client contact. Sending it early serves two purposes. First, it allows you to ensure that you have covered all angles. Second, should you be presenting remotely, they’ll have it in case your connectivity goes down.

Day of the Presentation

You will likely want to schedule 90 minutes for the presentation. The first 60 will be for the QBR itself while the remaining 30 will allow plenty of time for Q&A.

That’s All Folks…?

You’re not done just yet. Be sure to see if your client has any edits to the presentation. Within a week, it would be helpful for you to update your deck and send them the final PDF for their records. Don’t forget to assign all tasks out from your roadmap.

Final Thoughts

In my experience, I’ve found that sticking to this schedule and format has made for some transcendent QBR conversations. The challenge and opportunity sections allow you and your client to discuss what fine-tuning your relationship needs, and you’ll both be better for it. Moreover, don’t forget to take credit for the beautiful work you’ve been doing. Your Quarterly Business Review is the perfect time to show the value you bring to your client.

Ready to up your quarterly business review game? Check out Portent’s QBR Cheat Sheet to get started!

The post How to Do a Quarterly Business Review (QBR) appeared first on Portent.

Google Podcasts Search Results and SEO

Google has podcasts in search results!

Neat! Google has yet another feature to push down the ten* blue links, encourage people to never click through to your website, and take credit for your content! Alright, that’s a little more cynical than I usually am, but I’m not wrong.

*your number of blue links may vary

These rich results for podcasts first launched in 2017. Initially, they displayed as a list of playable episodes under an organic listing. Now, podcast search results show up as a card carousel high up on the results page.

[Screenshot: Google search results for "stuff you should know" podcasts]

My Chrome browser isn’t the only place you will see these Google Podcasts search results either; you can also find them in several Google-owned and operated properties and services:

  • Google Search in any browser
  • The Google Search app for Android
  • The Google Podcasts app
  • Google Home
  • Content Action for the Google Assistant
  • Android Auto

So, How Do I Get My Podcast in Google Search Results?

The first step, after “get yourself a podcast,” is to publish your podcast to Google Play Music. It’s a straightforward process that includes creating an RSS feed and having a website that links to the RSS feed.

  • Create an RSS feed for your podcast and follow the episode-level requirements
  • Have a website for your podcast and link to your RSS feed from the homepage
  • Don’t block any users or crawlers from your website with passwords or robots.txt, etc.

Once you have verified and published your podcast, it will be available in the Google Podcasts app and eligible for all the search display opportunities Google has to offer podcasts.
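
The RSS feed carries most of what Google needs. A minimal sketch, using the standard iTunes podcast namespace, might look like this (the show name, URLs, and file sizes are all made up; check Google's current feed requirements for the exact required tags):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>Hypothetical Podcast</title>
    <link>https://www.example.com/podcast/</link>
    <description>A show about examples.</description>
    <itunes:image href="https://www.example.com/podcast/cover.jpg"/>
    <item>
      <title>Episode 1: Getting Started</title>
      <description>What this show is about and who it's for.</description>
      <!-- The enclosure is the playable audio file itself -->
      <enclosure url="https://www.example.com/audio/ep1.mp3"
                 type="audio/mpeg" length="12345678"/>
      <guid>https://www.example.com/podcast/ep1</guid>
      <pubDate>Tue, 03 Sep 2019 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Each `<item>` is one playable episode, and the homepage requirement above means the page at `<link>` should point back to this feed.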

Is There Such a Thing as Podcast SEO?

Anytime a website and Google are involved, there is going to be an opportunity for optimization. And, from what I can tell, there are not a lot of podcasts out there that are optimized for organic search. I mean, two of the three podcast results for “productivity podcasts” use an episode title of “Productivity.”

[Screenshot: Google search results for "productivity" podcasts]

It also looks like Google is still trying to interpret search intent with a lot of “podcast” related searches; probably because there are podcasts out there for practically everything!

For example, trying to find out which type of schema markup to use on podcasts brings up a handful of pages that don’t actually answer the query and a Google Podcasts result with podcasts talking about structured data. The eighth organic result is the info I needed:

[Screenshot: Google search results for "podcast schema markup"]

Podcast SEO Best Practices

So, with all of that in mind, here are some fundamentals for podcast SEO:

  • Get a dedicated website for your podcast
  • Don’t gate or block your content
  • Create a unique page for each episode
  • Use your target keywords in all the right places like the URL, title tag, and meta description
  • Your podcast episode title and page title tag do not have to be the same and should pass the Blank Sheet of Paper Test
  • Have a summary introduction on the page
  • If you have episode transcripts available, get those on the page
  • Use the schema markup for OnDemandEvent on each podcast episode page
  • And submit your podcast to Google Play Music along with everywhere else
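
To sketch that markup point: the article names OnDemandEvent, so a hypothetical episode page might embed it as JSON-LD like this (the names, dates, and URLs are invented; verify the current recommended type and required properties against schema.org and Google's documentation before shipping):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "OnDemandEvent",
  "name": "Episode 12: Podcast SEO Basics",
  "description": "How to get a podcast into Google search results.",
  "startDate": "2019-09-03",
  "url": "https://www.example.com/podcast/episode-12"
}
</script>
```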

There you have it. As the art of podcasting continues to grow and gain more momentum, it will become even more necessary to make sure yours is well positioned and easily accessible through Google search results. By following these best practices, you’ll be set up for success. Happy podcasting!

The post Google Podcasts Search Results and SEO appeared first on Portent.

Politics vs. governance

“It’s just politics.”

No one ever says, “it’s just governance.”

Politics is organized sparring about power, without much regard for efficacy or right or wrong.

Governance is the serious business of taking responsibility for leadership.

Over the last twenty years, the mass media has shifted, from “here’s the news,” to, “hey, it’s just media.” As a result, a system has been built in which situations, emergencies and bad news have been packaged and promoted twenty-four hours a day.

In the face of that maelstrom of noise, it’s easy to come to the conclusion that the world is more dangerous and unstable than it has ever been.

When we have a chance to speak up for governance, we can strike a blow against politics.

Because even though it doesn’t make compelling TV, the long-term challenges ahead of us aren’t going to respond to politics.

Dedication, resilience and concerted effort have saved us before and they can save us again. Except once again, it’s on us to speak up and do something about it.

A Developer’s Guide To SEO


Developers don’t do SEO. They make sure sites are SEO-ready.

That means developers hold the key to SEO. It’s true. If you’re a developer and you’re reading this, laugh maniacally. You’re in control.

You control three things: viability, visibility, and site flexibility.

This post provides guidelines for all three.

  1. Viability
    1. Generate And Store HTTP Server Logs
    2. Don’t Turn On Analytics. Configure It.
    3. Consider Robots.txt
    4. Set The Correct Response Codes
    5. Configure Headers
    6. Other Random Things
  2. Visibility
    1. Get Canonicalization Right
    2. Pay Attention To Performance
    3. Engineer Away ‘Thin’ Content
    4. Use Standard Page Structure
    5. Put Videos On Their Own Pages
    6. Generate Readable URLs
    7. Use Subfolders, Not Subdomains
    8. Don’t Use Nofollow
    9. Make Navigation Clickable
    10. Link All Content
    11. Don’t Hide Content (If You Want To Rank For It)
    12. JavaScript & Frameworks
  3. Flexibility
    1. Have One, Editable Title Tag On Each Page
    2. Make Meta Tags Editable In The CMS
    3. Make Image ALTs Editable In The CMS

What’s A Developer?

This isn’t a navel-gazing philosophical question.

For this article’s purposes, a developer connects site to database (or whatever passes for a database, don’t get all anal-retentive on me), builds pages using the design provided, and does all the work those two jobs require.

A developer does not design. They do not write content. If you do all three jobs, tell the designer/content parts of your brain to take a break. This post isn’t for them.


Viability: Stuff you do on the server and in early software configuration that readies a site for ongoing SEO.

Mostly I chose this word because the other two ended with “ility,” and it just works.

Generate And Store HTTP Server Logs

Server logs are an SEO source of truth. Log file analysis can reveal all manner of crawler hijinx.

Every web server on the planet has some kind of HTTP log file.

And now someone’s going to tweet me their platform that, in defiance of all logic, doesn’t generate log files. OK, fine.

99% of web servers on the planet have some kind of log file.

Happy? Great. Now go make sure your server generates and saves HTTP logs.

Most servers are set up correctly out of the box, but just in case, make sure log files include:

  • The referring domain and URL, date, time, response code, user agent, requested resource, file size, and request type
  • IP address helps, too
  • Relevant errors

Also make sure that:

  • The server doesn’t delete log files. At some point, someone’s going to need to do a year-over-year analysis. If your server wipes log files every 72 hours or similar silliness, they can’t do that. Archive logs instead. If they’re gigantic, make the SEO team pay for an Amazon Glacier account
  • The logs are easily retrieved. If you don’t want your SEOs mucking around the server, I understand. But make it easy for you and the rest of the development team to retrieve HTTP logs. It’ll save you time later, and ensure your replacement can find them after you win the lottery

Log files, folks. Love ’em. Keep ’em. Share ’em.
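
Most of the fields in that list are covered by the stock “combined” log format. On Apache, for example, the directive looks like this (other servers have equivalents; the log path is an assumption):

```apache
# The "combined" format: client IP, identity, user, date/time, request line,
# final response code, response size, referrer, and user agent
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog logs/access_log combined
```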

Don’t “Turn On” Analytics. Configure It.

Why does everyone treat analytics like a light switch? Paste the script, walk away, boom, you’ve got data.


Before you add that JavaScript, make sure your analytics toolset—Google, Adobe, whatever—can:

  • Track onsite search. People use that little magnifying glass buried in your site navigation. Your SEO (and UX) teams can learn a lot by reviewing onsite query data. Store it now, avoid apologizing later
  • Track across domains and subdomains. If your company operates multiple domains or splits content across subdomains, creepily stalk users across all of those properties. Your SEO team can then see how organic traffic flows from site to site
  • Filter by IP. Exclude users from your company, from competitors, or from your pesky neighbor who keeps asking you for a job. One IP filter your SEO will appreciate: users in your office. Set it up, and they’ll buy you the beverage of your choice, except Southern Comfort, which gave me the worst hangover of my life and is banned from our entire industry, forever
  • Track on-page events. If your Analytics team is ready for you, put the “hooks” in place now, saving everyone precious time later

Is this all SEO stuff? Not exactly. But it all helps the SEO team. Is this your job? Maybe not. But you’re on the Dev team. You know you’re the top of the escalation tree for everything from analytics data to printer malfunctions. When they can’t find the data they need, the SEO team will end up at your door.
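
As a sketch of the cross-domain point, assuming Google Analytics installed via the gtag.js snippet (the property ID and domains here are hypothetical):

```html
<script>
  // Hypothetical sketch: link sessions across two owned domains
  // so organic traffic can be followed from site to site
  gtag('config', 'UA-XXXXXXX-1', {
    'linker': {
      'domains': ['foo.com', 'store.foo.com']
    }
  });
</script>
```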

Consider Robots.txt

Hopefully, you already know all about robots.txt. If not, read this guide.

Even if you do, keep in mind:

  • Robots.txt tells bots not to crawl a URL or page. The page might remain in the search index if it was previously crawled (at least, in my experience)
  • Robots.txt noindex probably won’t work much longer
  • The meta robots tag tells bots not to index a page, and/or not follow links from that page. The bot has to crawl the page to find the tag
  • When you launch the site, remember to remove the robots disallow and noindex meta tags, please gods please I beg you
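
As a quick sketch (the blocked paths are hypothetical), a robots.txt that keeps crawlers out of low-value sections looks like this:

```txt
# robots.txt at the site root — this blocks crawling, not indexing
User-agent: *
Disallow: /cart/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

To keep an already-crawlable page out of the index, use the meta tag instead: `<meta name="robots" content="noindex">`.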

Set The Correct Response Codes

Use the right response codes:

200: Everything’s OK, and the resource exists

301: The resource you requested has moved permanently. Poof. Look at this other one instead

302: The resource you requested has moved, but it might be back. Look at this other one for now

40x: The resource you requested can’t be found. Oops

50x: Gaaaahhhhh help gremlins are tearing my insides out in a very not-cute way. Nothing’s working. Everything’s hosed. We’re doomed. Check back later just in case

Some servers use 200 or 30x responses for missing resources. This makes Sir Tim Berners-Lee cry. It also makes me cry, but I don’t matter. Change it.

Even worse, some CMSes and carts come configured to deliver a 200 response for broken links and missing resources. The visiting web browser tries to load a missing page. Instead of a 404 response, the server delivers a 200 ‘OK’ response and keeps you on that page.

That page then displays a ‘page not found’ message. Crawlers then index every instance of that message, creating massive duplication. Which becomes a canonicalization issue (see below) but starts as a response code problem.

Yes, Google says they’ll eventually figure out whether you meant to use a 302 or a 301. Keyword: eventually. Never wait for Google. Do it right in the first place.
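
To make “do it right” concrete, here’s a sketch for nginx (the page paths are hypothetical); Apache and most CMSes have equivalents:

```nginx
# Serve a real 404 page with a real 404 status for missing resources,
# instead of a 200 "page not found" soft error
error_page 404 /404.html;

# A permanent move gets a true 301, not a 200 or a meta refresh
location = /old-page.html {
    return 301 /new-page.html;
}
```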

Configure Headers

I make no judgments regarding the pluses or minuses of these. But plan ahead and configure them before you launch:

  • last-modified
  • rel canonical
  • hreflang
  • X-Robots-Tag
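
Once configured, the relevant raw response headers for a page might look something like this (URLs and dates are placeholders):

```http
Last-Modified: Tue, 03 Sep 2019 10:00:00 GMT
Link: <https://www.example.com/page.html>; rel="canonical"
Link: <https://www.example.com/en/page.html>; rel="alternate"; hreflang="en"
X-Robots-Tag: noindex
```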

Other Random Things

Check ’em off now, so you don’t have to deal with them later:

  • Put your site on a server with solid-state drives (SSDs). Read/write is a lot faster. Argue if you want, but a faster server means a faster site, which makes ranking easier. More about this when I get to Performance
  • Virtual servers. Call me old-fashioned, but putting my site on a server with 900 others gives me hives. I’m not worried about shared IPs or search reputation. I’m worried about what happens when some bozo creates an endless loop and crashes my site

Viability: It’s Like Good Cholesterol

I just found out that I have high cholesterol, which is irritating because I eat carefully and bike 50–100 miles/week. But whatever.

MY POINT HERE is that server viability fights potential blockages by making sure your SEO team can get straight to…

This is a horrible analogy. Moving on.


Visibility: How you build a site impacts search engines’ ability to find, crawl, and index content. This is what everyone thinks about, and it’s all about the software.

Get Canonicalization Right

Every resource on your site should have a single valid address. One. Address. Every page, every image.

Canonicalization problems can cause duplicate content that, in turn, wastes crawl budget, reduces authority, and hurts relevance. Don’t take my word for it. Read Google’s recommendation. If you follow these recommendations, you’ll avoid 90% of canonicalization problems:

Home Page Has a Single URL

If your domain is www.foo.com, then your home page should “live” at www.foo.com.

It shouldn’t be www.foo.com/index.html, a bare foo.com, or anything else. Those variations are all canonically different from www.foo.com. Make sure all links back to the home page are canonically correct.

Don’t depend on rel=canonical or 301 redirects for this. Make sure all internal site links point to the same canonical home page address. No site should ever require a 301 redirect from internal links to its own home page.
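
Belt and suspenders, not a substitute: a rel=canonical on the home page confirms the one true address even when every internal link already points there:

```html
<link rel="canonical" href="https://www.foo.com/" />
```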

Pagination Has One Start Page

Make sure that the link to page one of a pagination tunnel always links to the untagged URL. For example: If you have paginated content that starts at /tag/foo.html, make sure that clicking ‘1’ in the pagination links takes me back to /tag/foo.html, not /tag/foo.html?page=1.
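
In markup, that means the pagination links look something like this:

```html
<!-- Page one links to the untagged URL, not ?page=1 -->
<a href="/tag/foo.html">1</a>
<a href="/tag/foo.html?page=2">2</a>
<a href="/tag/foo.html?page=3">3</a>
```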

Friends don’t let friends create links like this:

<a href=‘~’>

Those can create infinitely-expanding URLs, because crawlers resolve the relative path against whichever page they happen to be on, and a permissive server will serve every new variation. Never hard-code relative links, unless you want to be the comic relief in an SEO presentation.

No Query Attributes For Analytics

Don’t use query attributes to tag and track navigation. Say you have three different links to /foo.html. You want to track which links get clicked. It’s tempting to add ?loc=value to each link. Then you can look for that attribute in your analytics reports and figure out which links get clicked most.

You don’t need to do that. Instead, use a tool like Hotjar. It records where people click, then generates scroll, click and heat maps of your page.

If you absolutely must use tags, then use /# instead of ? and change your analytics software to interpret that, so that ?loc=value becomes /#loc=value. Web crawlers ignore everything after the hash sign.
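
So if you must tag, the three links to /foo.html from the example above would look like this, and crawlers would treat all three as one URL:

```html
<a href="/foo.html#loc=header">Foo</a>
<a href="/foo.html#loc=sidebar">Foo</a>
<a href="/foo.html#loc=footer">Foo</a>
```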

Things to Do

Whether you have canonicalization issues or not, make sure you:

  • Set the preferred domain in Google Search Console and Bing Webmaster Tools (last time I checked, you could do this in both)
  • Set rel=canonical for all pages. Might as well handle it ahead of time
  • Set the canonical HTTP header link

Quick Fixes

It’s best to fix canonicalization issues by doing it right: build your site to have a single address for every page.

If you can’t do that, though, use these:

  • rel=canonical points search engines at the preferred page. It doesn’t fix crawl budget issues, but it’s something. Make sure you use it right! Incorrect rel=canonical setups can hurt more than help
  • Use the URL Parameters Tool in Google Search Console to filter out parameters that cause duplication. Be careful. This tool is fraught with peril

Get Canonicalization Right From The Start

Please don’t do these things:

  • Use robots.txt or meta robots to hide duplicate content. This completely screws up the site’s link structure, doesn’t hide the content, and costs you authority
  • Point rel=canonical for one set of duplicates at different target pages
  • Use either Google Search Console or Bing Webmaster Tools to remove the URLs of duplicate pages

In other words, no funny business. Do it right from the start.

Pay Attention To Performance

Performance is done to death, so I’m going to keep it short. First, a brief sermon: page speed is an easy upgrade that gets you multiple wins. Faster load time means higher rankings, sure. It also means higher conversion rates and better UX.

First, run Lighthouse. Sample several pages. Use the command line to batch things. The Lighthouse Github repository has everything you need.

Lighthouse isn’t perfect, but it’s a helpful optimization checklist. It also tests accessibility for a nice 2-in-1.

Do all the stuff.

Regardless of the test results:

  • Use HTTP/2 if you’re allowed. It has all sorts of performance benefits
  • Use hosted libraries. You don’t have to use Google’s, but here they are
  • Unless you look at code coverage, in which case I suggest you trim the heck out of your included files and work from there
  • Compress images. Teach your team to use squoosh. They’ll remember to use it for about a day. After that, either flog them regularly or use something like Gulp to automatically compress before upload
  • Defer blocking CSS and JavaScript. Because I said so

You can also consider installing page speed modules. I’d never do this. I don’t want Google software running directly on my server. But they do a lot of work for you. You decide.

A few other quick tips:

Third-Party Scripts

Chances are, someone else will add a bunch of third-party scripts and clobber site performance. You can get off to a good start:

  • Defer loading of third-party scripts, where you can
  • Ask the service provider for the compressed version of the script. They often have one
  • Use CDN versions wherever possible. For example, you can use the Google CDN version of jquery

Use DNS Prefetch

If you’re loading assets from a separate site, consider using DNS prefetch. That handles the DNS lookup ahead of time, shaving lookup time off the request later: <link rel="dns-prefetch" href="//foo.com" />

Use Prefetch

Find the most popular resources on your site and use prefetch (not to be confused with DNS prefetch, above). That loads the asset when the browser is idle, reducing load time later: <link rel="prefetch" href="fonts.woff" /> Be careful with prefetch. Too much will slow down the client. Pick the most-accessed pages and other resources and prefetch those.

Engineer Away ‘Thin’ Content

Build your site to avoid ‘thin’ content: pages with very little content and little unique information.

Avoid these things. Don’t laugh. I still find this kind of stuff in audits all the time:

  • Send-to-a-friend links with unique query attributes
  • Member pages with blank bios and/or no other useful content
  • Blank or low-value “more reviews” pages. Some sites have links to separate review pages for each product. That’s helpful, unless there are no reviews, or the text for most reviews is terribly helpful like “great product”
  • Empty, paginated photo galleries. I honestly don’t know how sites manage this, but they do
  • Tag pages for tags with a single piece of content

Don’t wait for an SEO to make you go back and fix it. Build to prevent this kind of stuff:

  • If you must have send-to-a-friend links, use fragments plus window.location or something similar. Crawlers will ignore everything after the hash
  • Require a minimum length bio, or hide member profiles with short or nonexistent bios
  • Don’t display separate review pages unless you have a minimum number of reviews
  • Don’t generate or link to tag pages unless the tags have more than N pieces of content. You can choose “N.” Just please make sure it’s not “1”
  • Use rel=canonical for multiple SKUs, request forms or anything else that might end up generating thin content. This is not a fix. It’s a lousy workaround. But it’s better than nothing, and it’ll catch stuff you miss
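
For the send-to-a-friend bullet above, one sketch of the fragment approach (the element IDs and paths are hypothetical):

```html
<!-- One crawlable URL; the share state lives in the fragment -->
<a href="/product/123#share">Send to a friend</a>

<script>
  // Crawlers ignore everything after the hash, so this never
  // creates a new, thin, indexable URL
  if (window.location.hash === '#share') {
    document.getElementById('share-dialog').hidden = false;
  }
</script>
```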

Use Standard Page Structure

We’ve already dealt with title elements and such, so this is a lot easier. Every page should:

Have a Single H1

While heading tags don’t necessarily affect rankings, page structure as evidenced by rendering does. H1 is the easiest way to represent the top level in the page hierarchy.

Have a single H1 that automatically uses the page headline, whether that’s a product description, an article title, or some other unique page heading. Do not put the logo, images or content that repeats from page to page in an H1 element.

Make H2, H3, H4 Available to Content Creators

Allow multiple H2, H3, and H4 elements on the page. Let content creators use H2, H3, and H4. You can let them drill down even further, but I’ve found that leads to some, er, creative page structures.

Use <p> Elements for Paragraph Content, Not Hard Breaks or DIVs

Any developer knows this. Content creators sometimes don’t. I still see many writers insert double line breaks. It’s not easy, but if you can somehow enforce the use of <p> elements for paragraphs, it will make later tweaks to styles a lot easier.

Use Relevant Structured Data

At a minimum, generate structured markup for:

  • Places
  • Products
  • Reviews
  • People

See schema.org for more information. Right now, JSON-LD is the most popular way to add structured data. It’s easiest, and if you (properly) use a tag manager, you can add structured data to the page without changing code.
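A sketch of JSON-LD generation for a product page. The `@context`, `@type`, and property names are real schema.org vocabulary; the `productJsonLd` helper and its input fields are hypothetical:

```javascript
// Build a JSON-LD payload for a product page using schema.org's
// Product and Offer types.
function productJsonLd(product) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    name: product.name,
    description: product.description,
    offers: {
      "@type": "Offer",
      price: product.price,
      priceCurrency: product.currency
    }
  });
}

// Emitted into the page (or injected via a tag manager) as:
// <script type="application/ld+json">{...}</script>
```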

Oh, Come On Ian

I can hear you. No need to mutter. You’re saying, “None of this impacts rankings.”

It may. It may not. But using standard page structure improves consistency across the site for every content manager and designer who will work on it. That leads to good habits that make for a better site. It leads to less hacky HTML code pasted into the WordPress editor. That means a more consistent user experience. Which is good for rankings.

So there.

Put Videos On Their Own Pages

Video libraries are great, but having all of your videos on a single page makes search engines cry. Put each video on its own page. Include a description and, if you can, a transcript. Link to each video from the library. That gives search engines something to rank.

Generate Readable URLs

Where possible, create URLs that make sense. /products/shoes/running is better than /products?blah–1231323

Readable URLs may not directly impact rankings. But they improve clickthrough because people are more likely to click on readable URLs.

Also, Google bolds keywords in URLs.

Finally, what are you more likely to link to: /products/shoes/running/ or /asdf/shoes/?
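Generating readable URLs usually comes down to a slug function that turns a human-readable name into a URL path segment instead of exposing a database ID. A minimal sketch:

```javascript
// Build a readable URL slug from a product or page name.
function slugify(name) {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")  // runs of non-alphanumerics become one hyphen
    .replace(/^-+|-+$/g, "");     // trim stray leading/trailing hyphens
}

console.log("/products/shoes/" + slugify("Trail Running (Waterproof)"));
// "/products/shoes/trail-running-waterproof"
```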

Use Subfolders, Not Subdomains

Yeah, yeah, go ahead and hurl insults. I’ve heard it all before. If you want to argue about it, go read this post first.

All quality content should ‘live’ on the same domain. Use subfolders. The blog should live at /blog. The store should live at /store or similar. I always get pushback on this one. Google has said in the past that subdomains are OK. Yes, they’re OK. They’re not the best. Google says subdomains are sometimes just as good. Not always.

When Googlebot comes across a subdomain, it decides whether to treat it as a subfolder or not. Like many things Google does and says, they’re unclear about it and results differ. I have no test data. I can say this: in most cases, moving content to a subfolder helps, if by ‘most’ we mean ‘every site I’ve ever worked on.’

So why leave it to chance? Use a subfolder now, and you won’t have to deal with subdomains and unhappy marketers later.

There are two exceptions to the rule:

  • If you’re doing reputation management, you need to control as many listings on the first page of a search result as possible. Google often separately ranks subdomain content. A subdomain can help you eat up an additional spot
  • If you’re having trouble with a large amount of low-quality content or thin content, move that to a subdomain, and you may see rankings improvements

The most common reason folks use subdomains is the blog: The CMS, or server, or something else doesn’t support a blog. So you set up a WordPress.com site.

That ends up being blog.something.com. If you have to do that, consider using a reverse proxy to put it all under one domain. Of course, if you have no choice, use a subdomain. It’s better than nothing.

Don’t Use Nofollow

Just don’t. Nofollow is meant to prevent penalties for links from comments and advertising. It doesn’t help channel PageRank around a site. It does burn PageRank. It’s a bad idea.

The only time to use nofollow is to avoid a penalty because you’re linking to another site via ads or other paid space on your site. A good rule of thumb: If you’re doing something ‘just’ for SEO, think carefully. Nofollow is a good example.

Make Navigation Clickable

Clicking the top-level navigation should take me somewhere other than '/#'.

Top-level nav that expands subnav but isn’t clickable creates three problems:

  • The site’s primary navigation is a hidden rollover. Google and Bing will attribute less importance to it
  • You lose what could be a top-level link to a single page on your site from every other page on your site. That’s scads of internal authority gone to waste
  • Users will click on ‘Dropdown’ and get frustrated

Make sure clicking any visible navigation takes me somewhere.

Link All Content

If you want a page indexed, I need to be able to reach it by clicking on links. Forms, JavaScript maps, etc. aren’t enough. For example: If you have a stores directory, keep the map and ZIP code search.

Just make sure there’s also a clickable index I can use to find stores. That means I can link to it, too. This rule is particularly important when you work with JavaScript frameworks. See the next chapter for more about that.

Don’t Hide Content (If You Want To Rank for It)

Until, oh, last week (seriously, Google just changed this last week), Google said they wouldn’t consider content that only appeared after user interaction. Content behind tabs, loaded via AJAX when the user clicks, etc. got zero attention.

Last week, the big G said they do examine this content, and they do consider it when determining relevance. I believe them, but as always, they’ve left out some details:

  • Do they assign the same weight to content that requires user interaction?
  • Do they differentiate between hidden content (like tabs) and content that doesn’t load without user interaction (like asynchronous content)?

Oh, also: The old tiny-content-at-the-bottom-of-the-page trick still doesn’t work. That’s not what they meant.

JavaScript & Frameworks

JavaScript isn’t bad for indexing or crawling. JavaScript is bad for SEO.

Instead of typing yet another diatribe about the evils of JavaScript, I’ll link to mine and add a few quick notes:

Ask Yourself Why

First, before you get into complicated ways to mitigate the SEO problems caused by many frameworks and JavaScript widgets, ask yourself, ‘Why am I building my site this way?’ If there’s no compelling argument–if using a framework doesn’t offer essential features–consider doing something else.

Only Hide Content When Essential

This is the easy part: if you’ve got content on the page for which you want to rank, don’t hide it behind a tab, an accordion, or whatever else. On a well-designed page, people who want to see everything will scroll down. If they don’t want to see it, they weren’t going to click the tab anyway.

Don’t Deliver Content Based on User Events

If you want content indexed, don’t deliver it based on a user event. Yes, Google says they now index content revealed after user interaction. Play it safe, though, if you can.

Show Content Before the Load Event

Look at your site’s HAR file. Anything that appears after the ‘load’ event is probably not going to get indexed.

Make sure whatever you want indexed appears before then.

Use Indexable URLs

See ‘Link All Content’ above. URLs with /#! and similar won’t get crawled. Google deprecated that as an indexing method.


If you must use JavaScript content delivery, try to mitigate the damage.


Editability

No one thinks about this. No. One. SEO requires non-stop tweaks and changes by content managers, analysts, designers, and lots of other non-developers. If they can’t do the work, they bury the resource-strapped development team in requests.

SEO grinds to a halt, and organic performance falls.

I mean, if you have infinite dev resources no worries. Skip the rest of this article. Go back to feeding your pet rainbow-crapping unicorn.

Otherwise, keep reading this relatively brief section.

Have One Editable Title Tag on Each Page

The title element is a strong on-page organic ranking signal.

  • There must be one <title></title> element on each page
  • It must be a separate, editable field. Have the title element default to the page headline, but make it separately editable
  • As I write this, the ideal title tag is 60-ish characters in length, but don’t set a limit. It changes all the time. Your users should be using the Portent SERP Preview Tool because it’s the best thing since Nestle KitKats. Right? Right???

Make Meta Tags Editable in the CMS

First: the meta keywords tag is utterly useless and has been since, oh, 2004. Remove it. If your SEO protests, find a new SEO. With that out of the way, make sure each page has the following editable META tags:


Description

Every page should have an editable description meta tag. The description tag doesn’t affect rankings. It does, however, affect clickthrough rate, which can mean organic traffic growth even if rankings don’t improve. Like the title tag, make the description tag a separate, editable field.

If the page is a product page, have the description tag default to the short product description. If the page is a longer descriptive page, have the description tag default to the first 150 characters of the page content. Never have a blank meta description! If you do, Google and Bing will choose what they think is best. Don’t rely on them.
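The fallback logic above can be sketched in a few lines; the field names on `page` are hypothetical CMS fields:

```javascript
// Default the meta description: explicit value first, then the short
// product description, then the first 150 characters of page content.
// Never leave it blank.
function defaultMetaDescription(page) {
  if (page.metaDescription) return page.metaDescription;
  if (page.shortProductDescription) return page.shortProductDescription;
  return (page.content || "").slice(0, 150);
}
```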

Open Graph Protocol (OGP)

Facebook uses OGP tags to build the text, image, and title of shared content. Without it, Facebook may use the title and meta description tag and pick an image. It may pick something else. OGP tags let the content creator control what will appear on Facebook and, like the meta description tag, they can boost clickthrough.

Have the OGP tags default to the page’s title, meta description and featured image. Then let the author edit them. At a minimum, include og:title, og:type, og:image and og:url. You can read more about OGP tags at http://ogp.me/.
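A sketch of that defaulting behavior. The `og:` property names are real OGP properties (see ogp.me); the page fields are hypothetical CMS fields:

```javascript
// Build OGP meta tags, defaulting to existing page fields when the
// author hasn't overridden them.
function ogpTags(page) {
  var tags = {
    "og:title": page.ogTitle || page.title,
    "og:type": page.ogType || "article",
    "og:image": page.ogImage || page.featuredImage,
    "og:url": page.url
  };
  return Object.keys(tags)
    .map(function (k) {
      return '<meta property="' + k + '" content="' + tags[k] + '">';
    })
    .join("\n");
}
```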

Twitter Card Markup

Twitter cards are more niche. Twitter will use OGP tags as a fallback, so these aren’t required. If you can add them, though, it gives content creators even more control over what Twitter shows for shared content.

Twitter cards can double clickthrough and other engagement. They’re worth the effort. See https://dev.twitter.com/cards/overview for more information.

Make Image ALT Attributes Editable in the CMS

The ALT attribute is another strong ranking signal. Every image uploaded as part of page content must be editable when the user uploads it. If they do not enter an ALT attribute, default to:

  • “Image:” + product name, if this is a product page
  • “Image:” + image caption, if entered
  • “Image:” + file name

I recommend including “Image:” so that screen readers and other assistive devices identify the snippet of code as an ALT attribute.
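The fallback chain above, as a sketch; the field names are hypothetical:

```javascript
// Default the ALT attribute when the uploader doesn't provide one:
// product name first, then caption, then file name.
function defaultAlt(image, page) {
  if (image.alt) return image.alt;
  if (page.isProductPage) return "Image: " + page.productName;
  if (image.caption) return "Image: " + image.caption;
  return "Image: " + image.fileName;
}
```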

That’s It

Last updated 2019. Things change. Check back for new stuff.

The post A Developer’s Guide To SEO appeared first on Portent.

Arithmetic true

Arithmetic is true. It’s true because

1. we accept the terms for what they mean

2. it’s timeless, past and present and future are the same

3. it’s testable

In every fourth-grade classroom, the statement, “9 is bigger than 7” is clearly true. We can count out nine marbles. We have a mutual understanding of what “bigger” means in this context. From this shared understanding of the axioms and vocabulary, we can build useful and complex outcomes.

On the other hand, “Cheryl is a better candidate than Tracy” might be true for some people, but it presents all sorts of trouble if we look at it through the same lens of “truth” as a term we learned in arithmetic. We know who Cheryl is and we know who Tracy is, but it’s not clear what “better” means in this case. Are we describing who will win an election in two weeks? That’s awfully hard to test in advance.

And ‘words as building blocks of truth’ gets even more complicated when the ideas intersect with both science and culture. The statement, “The theory of evolution is our best explanation for how we all got here,” is demonstrably true in the realm of science, but for people with a certain worldview who value cultural alignment more than verifiable and testable evidence, this statement isn’t true at all.

The words matter. It matters whether we’re talking about ‘arithmetic true’ or simply an accurate description of what works for part of our culture.

Lottery logic

Someone has to win the lottery; it might as well be you.

Buying a lottery ticket is economically irrational and emotionally rewarding for some. Because while someone has to win, it’s probably not going to be you.

There are examples of lottery logic in our daily work as well. It’s clear that someone is going to be the next Taylor Swift, the next George Clooney or the next Will Smith. But it’s probably not going to be you. Someone is going to raise a $40 million seed round, or get picked to be the next big thing. But it’s probably not going to be you.

It’s tempting to decide to follow the path that leads to mass-market stardom, the top of the charts, the fame and fortune that comes to the person who wins a media lottery. It’s tempting to build a mass-market podcast or a general-audience news site. It’s tempting to be the sort of vanilla-but-attractive actor who can play just about any role…

But it’s far more productive to focus on stepwise progress for the smallest viable audience instead. It might not make headlines, but it’s far more likely to work and more rewarding in the long run.

The Critical Rendering Path Explained

What is the Critical Rendering Path?

The critical rendering path is the series of steps a browser takes between receiving an HTML response from a server and painting the requested web page. In this post, I’ll break down each step of the process and offer some tips for optimizing them.

DOM Tree

The DOM, or Document Object Model, is an object-based representation of the parsed HTML. For example:

Screenshot of the DOM code tree

As the HTML is parsed, it will construct what is called the DOM Tree. The DOM Tree is made up of the objects that are parsed via HTML and XML. For example:

Screenshot showing the DOM tree

While this is only one part of the critical render path, making sure you are writing clean semantic markup will help to ensure your HTML is parsed quickly for optimum performance.


CSSOM Tree

Similar to the DOM, the CSSOM is also object-based. The CSS Object Model represents the styles associated with each node that lives in the DOM. Styles can be declared or inherited.

Screenshot of the CSSOM code tree

The above CSS would create the following:

Screenshot showing the CSSOM tree

CSS is a “render-blocking” resource, which means the render tree (more on this later in the post) cannot be built until the CSS is loaded. In past years, CSS was typically served as one file, style.css. Now, developers use techniques that split files and serve critical styles first, which can reduce or eliminate the render-blocking resources you might be loading.

How to Reduce Render-Blocking Resources

As developers, we have some techniques we use that can help with render-blocking resources. Here are a few ways you can ensure you are not blocking the render tree from loading.

Start From the Beginning

If you have the resources, the best time to avoid blocking the render tree is during the initial build of your website, or during a scheduled maintenance period.

  1. Map your modules/components/layouts so the header, hero, and content that typically show up above the fold (before the initial scroll point) live in a critical-styles.css file. This file will usually be much smaller than your entire style.css because it only contains the above-the-fold styles. Load the critical styles first; the rest of the styles load after the page does. This technique can drastically increase the speed of your website and remove unwanted render-blocking CSS.
  2. Make sure you are using base styles. If you are using Bootstrap, Foundation, or other frameworks, these are typically imported in automatically, although it is good to double-check.
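One common way to implement step 1 is to load critical-styles.css normally (render-blocking, since it’s needed above the fold) and defer everything else with a preload link that switches itself to a stylesheet once it downloads. A sketch of that pattern, assuming the file names above:

```javascript
// Build a <link> tag that loads a stylesheet without blocking render:
// it downloads as a preload, then swaps itself to rel="stylesheet".
function deferredStylesheet(href) {
  return '<link rel="preload" href="' + href + '" as="style" ' +
         'onload="this.onload=null;this.rel=\'stylesheet\'">';
}

// critical-styles.css stays a normal, blocking stylesheet;
// the full style.css loads deferred:
console.log(deferredStylesheet("style.css"));
```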

Use Autoptimize

For WordPress users, you can prevent render tree blocking by using the Autoptimize plugin, along with the Autoptimize Critical CSS addition. Both plugins offer free and paid versions; you can find the plugin in the WordPress Plugin Repository.


Understand Inheritance

As you create your styles, it is vital to understand inheritance and the role it plays in CSS.

  1. Plan BEFORE you write! It is always beneficial for developers and webmasters alike to create a roadmap before you begin writing styles. Note any similarities between components, create utility classes for each similarity, and keep it simple. Far too often I see new developers (including my “Jr.” self) building from the inside out, and that can get you into trouble. This can lead to you breaking a fundamental development rule, DRY (Don’t Repeat Yourself).
  2. Make sure you aren’t overriding styles.
  3. Utilize base styles as much as possible.

For more information on how you can optimize your page speed and reduce render-blocking resources, check out Portent’s Ultimate Guide to Page Speed.

JavaScript Execution

JavaScript is a dynamic language that allows you to manipulate the DOM. One of the more popular ways of doing this is by adding interactivity to your websites, like a carousel slider or popup module, for example. The problem with adding these types of interactions is that they are costly to website load times—this is because JavaScript is a “parser-blocking” resource. While your browser is reading the document, a JavaScript file is encountered and construction of the DOM Tree is paused. Once the script has executed, construction continues.

Here is an example of loading your script in the footer:

Screenshot showing JavaScript loaded in a website footer

Here is an example of loading your script asynchronously:

Screenshot showing JavaScript loaded asynchronously on a website

JavaScript can be a costly resource if done incorrectly. Here are a few tips that can help your JavaScript run more efficiently, reducing any/all parser-blocking resources:

  1. Write Vanilla JavaScript rather than employing jQuery.
  2. Dynamically load your scripts based on whether a specific ID lives on the page. That way, a browser only has to make one quick check to see if the ID exists, rather than running the entire script. (For WordPress users, wp_enqueue_script function will do this.)
  3. If possible, serve your JavaScript with the components that the JS is going to be manipulating. That way, you are only loading the JS if the element lives on the page.
  4. HERO SLIDERS ARE BAD, no ifs, ands, or buts about it. If you must use a slider, try to keep it lower on the page, preferably below the fold.
  5. If you have third-party scripts loading on your page, there isn’t a whole lot you can do. But there are a couple of things to try:
    • Add an “async” attribute to the “script tag.”
    • If available, host the script within your files. That way you aren’t relying on a third-party host.
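Tips 2 and 5 can be sketched together; the element ID and script URL are hypothetical:

```javascript
// Tip 5: build a script tag with the async attribute so the parser
// is not blocked while the script downloads.
function asyncScriptTag(src) {
  return '<script src="' + src + '" async></script>';
}

// Tip 2 (browser-only, shown as a comment): only load the carousel
// script when the carousel element actually exists on the page.
//   if (document.getElementById('hero-carousel')) {
//     var s = document.createElement('script');
//     s.src = '/js/carousel.js';
//     s.async = true;
//     document.head.appendChild(s);
//   }

console.log(asyncScriptTag("https://example.com/widget.js"));
```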

The Render Tree

Combining the DOM and the CSSOM results in the Render Tree. The tree represents the computed layout of each visible element, which is then handed to the paint process that renders the pixels to your screen.

Constructing the Layout

Now that we have a fully constructed Render Tree, we can illustrate layout construction. This step establishes the location and placement of elements on the page, taking into account the size of the viewport, the width and height of elements, and where elements sit in relation to one another. By default, block-level elements have a width of 100% of their parent element. The parent element, in this case, would be the viewport or screen size.

As we create the markup, it is essential to think responsively. One way to make sure our app or website responds to the viewport is to use the viewport meta tag:

Screenshot showing how to use the meta tag to make sure an app or website is responsive in relation to the viewport
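For reference, the standard viewport meta tag (presumably what the screenshot shows) is:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Here, width=device-width matches the layout viewport to the device’s screen width, and initial-scale=1 sets the starting zoom level.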

By doing this, you are ensuring that your app or website is visible within the current viewport. There are other steps to ensure your site is fully responsive, but that is a topic for another blog post.

Painting the Picture

The final step of the critical rendering path is painting the picture. Once the DOM and CSSOM have been fully parsed, the JavaScript has executed, the Render Tree has been computed, and the layout constructed, the browser begins painting the website to your screen. This process converts each node of the Render Tree into visible pixels on your screen.

Illustration showing the concept of a web page being painted as it loads

Optimizing the Critical Rendering Path

If you have ever run a site speed test, you have likely seen “First Contentful Paint” in the metrics section of Google’s Lighthouse tool. This number is a result of the critical rendering path. If you see your score and aren’t sure whether it is good or bad, Google provides a color scale:

Green = Good
Yellow = Could use improvements
Red = Trouble is lurking

This score can often be misleading, which is one of the primary reasons for this post. As you run audits, start optimizing at the DOM Tree level, then move on to the CSSOM. Often, after optimizing the DOM and CSSOM, you will see fewer warnings, as these are the most critical parts. If you are still getting warnings for JS optimization, try a few of the recommendations in the JavaScript Execution section of this post.

To Recap

While website resources and implementations vary, the critical rendering path is consistent. It is important to understand the ins and outs of the process so you and your team can begin optimizing for the future. To summarize, here are some steps to help your process going forward:

  1. Look for ways to clean up your HTML
  2. Optimize CSS
    • Set base styles for similar elements and components
    • Remove unused CSS
    • Implement critical styles that load above the fold
  3. The JavaScript
    • Use Vanilla JS, rather than entire libraries
    • Dynamically load scripts
    • Use Async, when applicable
    • If you must use sliders or other interactive elements, try to use them below the fold, so they are not render-blocking.
    • Minimize the use of third-party scripts, if possible, if not – try loading asynchronously.

The post The Critical Rendering Path Explained appeared first on Portent.

The Portfolio Theory of Diversification Applied to Career Planning

An important fundamental of investing and financial security is diversification. Spreading your money among different types of assets such as stocks, bonds, real estate, and collectibles means you’re balancing risk and reward. Over time, this approach is much more likely to grow your wealth than investing in just one asset class. In my experience, the same is true of a career: having a diversified set of income streams is likely to serve you better over time than working for one company in one industry.