Resolving rogue robots directives

Video Link: https://www.youtube.com/watch?v=5nK-WtNJqQ0
Duration: 15:34


In this episode of SEO Fairy Tales, Martin Splitt talks with Jason Stevens, a Senior Performance Media Manager at Google, who shares how his team audited a site with bad search snippets and discovered that a robots.txt file was preventing the site from being crawled. Learn the steps you can take to investigate changes in website traffic or snippets using Search Console and Google's suite of tools.

Chapters
0:00 - Intro
0:27 - Bad snippets
1:18 - Researching the root cause
2:11 - Understanding the updated directive
3:30 - It isn't always that simple
4:17 - What to look for?
5:36 - Other things to look out for
6:55 - Reports and tools
8:56 - Did it work?
9:43 - Setting up for sustainable success
11:05 - robots.txt - friend or foe?
14:01 - Wrap up

Watch more episodes of SEO Fairy Tales → https://goo.gle/SEOFairyTales
Subscribe to Google Search Central Channel → https://goo.gle/SearchCentral

#TechnicalSEO #Snippets #Crawling

Tags:
Bad snippets
crawling
search crawling
what is causing bad snippets
Search console
GA
mobile friendliness
lighthouse
Google search tooling
robots.txt
technical seo problems
technical seo tools
google search console
common seo problems
google search central
google search
technical SEO
SEO Fairytales