So how ’bout that lunch, huh? Tacos, fajitas, and churros! Whew! If not for the steady stream of coffee I’ve got going on here—and the risk of embarrassing imprints on my face—I’d be ready for a nap on my keyboard after all that delicious food. On second thought, who needs coffee when we have such an exciting topic coming up? SEO, data, and competitive analysis? Yay!
Despite the title of this session, I will refrain from making any Harry Potter jokes. You’re welcome. The truth is, as magical as technical SEO may seem sometimes, it really comes down to using tools well, analyzing the data you collect, and knowing what to do with it once you have it. You can also mitigate algorithm-induced issues with good planning and preparation. Airbnb’s Head of Global SEO Dennis Goedegebuure will moderate as BlueGlass Director of Strategy Development Selena Narayanasamy and SEOmoz Marketing Scientist Dr. Peter Meyers give us the scoop.
Dr. Pete’s up first. He warned me he was going to talk fast, and about a lot of tech stuff. Challenge accepted!
His presentation is “Big Site SEO Triage.” He mentions SEOmoz’s customer support, and says they have about 15 minutes to help people and answer their questions. Over the years, they’ve gotten better at it, and it’s taught him a lot.
Is Dr. Pete a real doctor? Well, not exactly. He has a PhD in psychology—he’s not Dr. House or Dr. Frasier Crane—he’s like Dr. Pete Venkman from Ghostbusters.
He shows a graph of site visits with a big jump over a period of about five months, April to September. He wanted to understand where it came from. It’s from a client in the education market, so they’re usually pretty slow in the summer. They did no link building and no content generation; they just made some architectural changes and increased traffic 2.6 times over.
He says he can sum up his presentation in two words: keyword density. Nah, he’s just f***ing with us. He goes further.
Is there a dupe content penalty?
After years of arguing about it, he’s come to a conclusion: he doesn’t give a shit. We need to know the difference between lost rankings and a penalty, but ultimately we’re about the client. Our job is to get the job done.
Here are the steps to do that.
Look at the damned site
It starts with the site. You need to know what you’re getting into. Use your top two SEO tools—your eyes, and your brain (and we see a slide with stills from “Young Frankenstein” and “The Man With Two Brains.” Awesome.)
Scope out the architecture.
Start clicking around, take a few minutes to do that. He brings up the REI site as an example. They have a flat architecture. There’s gonna be a ton of pages, and a lot of link juice. He likes to get the lay of the land off the bat.
Check for Pagination
He then brings up BestBuy. Look for search filters and sorts. Are they being parameterized? Look at those folders.
Master the basics. He’s a big believer in learning how to use a tool, and use it well. Use the search operator site:
Specifically: site: + intitle: and site: + inurl:
Take a unique phrase from the home page and throw it into the “intitle” operator, and you’ll immediately see dupe content issues, as well as canonicalization problems. You may also find out if anyone’s scraping you. It’s a very useful combination of queries.
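To make that concrete, here are the kinds of query combinations he means (example.com and the phrases are stand-ins for your own site):

```
site:example.com intitle:"unique phrase from your home page"
site:example.com inurl:widgets
```

The first surfaces pages sharing (or scraping) your home page title; the second shows everything indexed under a given folder or URL pattern.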
If you have a blog or whatever, find out how much real estate it’s occupying. He dealt with a client sort of like Etsy, but not really. They had a big index, but not a ton of authority. He found that 8 million of their 11 million pages were the individual shop pages. All the search filters were being crawled. He realized about 7 million of their indexed pages were coming from user-generated search filters.
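A quick way to check that on your own site: pull a list of crawled or indexed URLs and count how many carry query parameters. Here’s a minimal Python sketch (the URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical sample of indexed URLs (e.g. from a crawl or log export).
urls = [
    "https://example.com/shop/blue-widgets",
    "https://example.com/shop/red-widgets?sort=price&color=red",
    "https://example.com/shop/red-widgets?sort=rating",
    "https://example.com/blog/widget-trends",
]

# Count how many URLs carry each query parameter -- a quick way to
# spot user-generated search filters bloating the index.
param_counts = Counter()
parameterized = 0
for url in urls:
    qs = parse_qs(urlparse(url).query)
    if qs:
        parameterized += 1
    for param in qs:
        param_counts[param] += 1

print(parameterized, "of", len(urls), "URLs are parameterized")
print(param_counts.most_common())
```

Run this against a real export and the parameter counts tell you which filters are eating your index.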
How reliable is site:?
He did an experiment, because that’s what he does. The experiment produced widely varied results, so you have to perform your own experiments to find the data you need.
Don’t forget the basics
Bring in the Firepower
Now you can bring in the tools and get some easy wins.
HTML Improvements page from Google Webmaster Tools
Gives you a quick drilldown, and you can see where your errors are right away
Index Status (Advanced)
You get the “not selected,” which are the pages Google is crawling, but choosing not to index
It’s a year at a glance, so it’s great because it’s often hard to get that information from clients
Tiered XML sitemaps
Put all your main pages into one sitemap. Put all your main categories into a sitemap. Put all your sub-categories into a sitemap. If you think you should have 10k pages indexed, but Google only has 6k indexed, tiered sitemaps show you exactly which tier is falling out of the index.
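A tiered setup starts with a sitemap index file pointing at one sitemap per tier, something like this sketch (example.com and the file names are stand-ins), so Webmaster Tools can report submitted vs. indexed counts for each tier separately:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-main-pages.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-subcategories.xml</loc></sitemap>
</sitemapindex>
```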
Screaming Frog is a great tool, easier to use than Xenu. It helps solve one really important problem: we’re seeing all these links to pages, but we don’t know where they’re coming from. Do your crawl with Screaming Frog.
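If you want to slice a crawl export yourself, here’s a minimal Python sketch, assuming an “All Inlinks”-style CSV with Source and Destination columns (the URLs are made up):

```python
import csv
import io
from collections import defaultdict

# Hypothetical snippet of an inlinks export; real exports have more
# columns, but Source and Destination are all we need here.
export = io.StringIO("""Source,Destination
https://example.com/,https://example.com/shop/
https://example.com/blog/post-1,https://example.com/shop/
https://example.com/blog/post-2,https://example.com/old-page
""")

# Group internal links by destination so you can see, for any page,
# exactly where its inlinks are coming from.
inlinks = defaultdict(list)
for row in csv.DictReader(export):
    inlinks[row["Destination"]].append(row["Source"])

for dest, sources in inlinks.items():
    print(dest, "<-", len(sources), "inlinks:", sources)
```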
Choose Your Weapon
We have all these tools to deal with the index, and I get asked a lot: what’s the best one? There is no one answer to that question. You can do it by the book and do what Google says, and sometimes it still won’t work, so you’ll have to adapt.
- don’t need dev access
- good for prevention
- still can be dangerous
- bad for removal
- extremely flexible
- fairly easy to reverse
- occasionally ignored
- still very powerful
- impact users + search
- difficult to reverse
- Google can get suspicious
- fast and powerful
- very flexible
- easy to screw up
The canonical URL is the canonical URL. Don’t use something different in the canonical tag. If you do, you don’t know what canonical means. It’s not a band-aid for bad site architecture. Use canonical to fix little messes.
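In practice that looks like this (URLs hypothetical): on a parameterized version of a page, the tag points at the clean URL of that same page, and nothing else:

```html
<!-- On https://example.com/shop/red-widgets?sort=price -->
<!-- Point the tag at the clean version of THIS page, nothing else: -->
<link rel="canonical" href="https://example.com/shop/red-widgets" />
```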
- low-risk option
- allows pages to rank
- difficult to implement
Rule #1 for page-level cues:
They can’t see what they don’t crawl. If they don’t recrawl that page, no header-level cue will get picked up. I’ve seen people use 301 redirects, canonical, and other tools all at once, but it doesn’t work. Go slowly. Take your time. You’ll get better results.
GWT parameter handling
- requires no site access
- inconsistently applied
Google has pushed a lot of the problems back on us. They’ve upped the stakes and the penalties. As soon as you have 1,000 pages, you need to worry about technical SEO. The index costs Google billions of dollars every year. If they can make us clean up the crap, they come out ahead. Give them the best content, don’t dilute it, and you’re making more money for them. He encourages it. Check out Dr. Pete’s slide deck here:
Next up, our own Selena Narayanasamy! She’s happy because she’s short, but can actually reach the mike on the podium. I feel your pain, Selena.
Her presentation is “Leveraging Data for Insight.” She’s an even faster talker than Dr. Pete, so I grabbed as much as I could here. I’m sure you’ll cut me a little slack. ;-)
Tools can show you a lot of things—except how to make a strategic decision.
- Hard numbers
- obscure terminology
You have to use your own analysis, draw your own conclusions, and create an actionable plan from those conclusions. Technical SEO is a lot of work.
She’s going to cover several tools and tactics, and what they’ll do for you:
Quick On-Site Strategy Check
- what are their priority categories and topics?
- are they segmenting with subdomains?
- what’s their topical dilution?
- how are they segmenting according to intent?
Cross-Data Check – SEMRush
- sort by directory hierarchy
- break out categories 1-2 levels down
- cross check with tool for categorical volumes
- look for open opportunities and weaknesses
- where are they consolidating blog posts?
- are they consolidating into static pages?
- are they using an internal canonical strategy?
You can see how they’re optimizing for Open Graph, as well as opportunities where they’re not optimizing.
What does this tell me?
leveraging “verbs” and “actions” to increase sharing
are they fully optimizing their content?
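For illustration (the page and copy here are made up), Open Graph markup with an action-oriented title looks something like this:

```html
<head>
  <!-- Hypothetical example; og:title leads with an action verb to encourage sharing -->
  <meta property="og:title" content="Build Your Own Widget in 10 Minutes" />
  <meta property="og:type" content="article" />
  <meta property="og:url" content="https://example.com/blog/build-a-widget" />
  <meta property="og:image" content="https://example.com/img/widget.jpg" />
  <meta property="og:description" content="Grab a kit and start building." />
</head>
```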
Semantics, Concepts and Entities
- what’s the topical strength of category pages?
- what user base are they targeting?
- what are the pages intended for?
Concepts and entities
- overall makeup vs. density
- Who are they?
- What are they doing?
- More importantly, how are YOU viewed
- Majestic SEO
- Analytics SEO
What are we looking at?
- promotional timelines
- potential link quality
- what’s your natural backlink cadence?
- what’s your competitors’ cadence?
- low points in backlink generation
- high points in backlink generation
Anchor Text Distribution
The obvious: What are they building for?
The not-so-obvious: How does that tie into the overall strategy?
- does the heavy lifting for you
- pulls phrases
- drills down to individual words
- helps show relationships between phrases
- You can see their skeletons and low quality links
- you can identify which competitor has high quality links
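Even without a dedicated tool, you can sketch the phrase-level and word-level breakdown yourself. Here’s a minimal Python example with made-up anchor texts:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink export
# (e.g. Majestic SEO or Analytics SEO).
anchors = [
    "cheap blue widgets",
    "blue widgets",
    "example.com",
    "click here",
    "blue widgets",
]

# Phrase-level distribution: what are they building for?
phrase_dist = Counter(anchors)

# Drill down to individual words to surface relationships between phrases.
word_dist = Counter(word for a in anchors for word in a.split())

print(phrase_dist.most_common(3))
print(word_dist.most_common(3))
```

Heavy repetition of one phrase or word in a real export is exactly the kind of skeleton this surfaces.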
Various Tracking Graphs
- track specific competitor URLs that are performing better than yours
- track your pages that are struggling
- track the drop off of referring domains
Use your best judgment. What tactics are helping vs. hurting strategy? Use your brain.
See Selena’s presentation here:
And there you go! You should be ready to implement technical SEO on your site! Ok, maybe not completely—but this should give you a good idea of where to start. Stay with us! More coverage coming up!