Unsolved: Blog archive pages in Crawl Error Report
-
Hi there,
I'm new to Moz Pro and have a question. My scan shows archive pages as having crawl issues, but this is because Yoast is set up to block robots on these pages.
Should I be allowing search engines to crawl these pages, or am I fine to leave them as I have it set up already?
Any advice is greatly appreciated.
Marc -
@fcevey
If blog archive pages are showing up in the crawl error report, it indicates that search engine bots are encountering issues while attempting to crawl and index those pages. To address this:
Check URL Structure: Ensure that the URLs for your blog archive pages are correctly formatted and follow best practices. Avoid special characters, and use a logical and organized structure.
Update Sitemap: Make sure that the blog archive pages are included in your website's XML sitemap. Submit the updated sitemap to search engines using their respective webmaster tools.
Robots.txt File: Review your website's robots.txt file to ensure it's not blocking search engine bots from crawling your blog archive pages. Adjust the file if needed.
HTTP Status Codes: Check if the archive pages return the correct HTTP status codes (e.g., 200 OK). Crawl errors might be triggered if pages return 4xx or 5xx status codes.
Internal Linking: Ensure that there are internal links pointing to your blog archive pages. This helps search engines discover and index these pages more effectively.
Redirects: If you've recently changed the URL structure or migrated your website, implement proper redirects from old URLs to new ones to maintain SEO authority.
Server Issues: Investigate if there are any server-related issues causing intermittent errors when search engine bots try to access the blog archive pages.
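The robots.txt check above can be scripted rather than done by eye. Below is a minimal Python sketch that tests whether given archive paths are blocked by a robots.txt file; the sample rules and paths are hypothetical, and "rogerbot" (Moz's crawler) is used as the user agent:

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt, paths, user_agent="rogerbot"):
    """Return the subset of paths that robots_txt disallows for user_agent."""
    parser = RobotFileParser()
    parser.modified()  # mark rules as loaded so can_fetch() gives real answers
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch(user_agent, p)]

# Hypothetical rules blocking date and author archives:
robots = """User-agent: *
Disallow: /2023/
Disallow: /author/
"""
print(blocked_paths(robots, ["/2023/05/", "/blog/my-post/", "/author/marc/"]))
# → ['/2023/05/', '/author/marc/']
```

Note that Yoast usually keeps archives out of search with a noindex meta tag rather than a robots.txt rule, so if this kind of check turns up nothing, look at the page source for the meta tag instead.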
-
Blog archive pages in the crawl error report indicate problems encountered while crawling or accessing the archive pages of a blog or website. These errors need attention to ensure that all content remains accessible to users and search engines.
-
@mhenshall The decision to allow search engines to crawl archive pages in Yoast SEO, or to leave them as they are currently configured, depends on your specific goals and needs.
If the archive pages contain valuable and relevant content for both search engines and users, allowing them to be crawled could enhance the visibility of that content in search results. However, if the content on the archive pages is not important or is duplicated from other pages, blocking crawling could be a valid option to prevent indexing issues and improve the user experience.
I would recommend evaluating the content on the archive pages and considering how their visibility in search engines will be affected by allowing or blocking crawling. You can use tools like Google Search Console to monitor how Google is indexing those pages and make informed decisions based on the data.
Keep in mind that configuring Yoast SEO is a strategic decision that should align with your SEO goals and your website's overall strategy.
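For context on what "blocking" means here: when an archive is set to no-index, Yoast typically emits a robots meta tag in the page head rather than a robots.txt rule. A sketch of what that looks like (illustrative markup, not taken from the asker's site):

```html
<!-- Emitted in the <head> of an archive page set to no-index -->
<meta name="robots" content="noindex, follow" />
```

The "follow" part lets crawlers continue through the page's links even though the page itself stays out of the index; a robots.txt Disallow, by contrast, stops crawling entirely, which is why crawl reports flag the two situations differently.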
Related Questions
-
What Tools Should I Use To Investigate Damage to my website
I would like to know what tools I should use and how to investigate damage to my website in2town.co.uk. I hired a person to do some work on my website, but they damaged it. That person was on a freelance platform and was removed because of all the complaints made about them. They also put backdoors on websites, including mine, and added content. I also had a second problem where my content was being stolen. My site always did well and had lots of keywords in the top five and ten, but now they are not even in the top 200. This happened in January and February. When I write unique articles, they are not showing in Google, and I need to find what the problem is and how to fix it. Can anyone please help?
Technical SEO | | blogwoman10 -
SEO Drop
Over the last few months my rank has dropped by around half and for the life of me I can’t see why. There are no warnings on Google Console. Am I missing something? Website: thespacecollective.com
Intermediate & Advanced SEO | | moon-boots0 -
Solved: How to do an SEO audit
Hello. I run the parsp website, and I really need to know how to run a site audit to keep my work clean and my site working well. I need help; I am a newbie at this job. Thanks, Moz!
Moz Tools | | valigholami13862 -
How do you report SEO audit findings?
Hello, Mozzers! I'm curious to know how you report SEO audit findings. Do you use a spreadsheet? A presentation? A formal report? Or maybe something else. If you have a favourite audit template, I'd love to see it. A second question: what things do you report in an audit? I currently report crawl findings, authority and trust, link profiles, and competitive analysis. I also investigate a site's security—that's not usually part of an audit, but site owners need to know about it. What do you report to your audit customers? Thanks for sharing your auditing wisdom!
Intermediate & Advanced SEO | | AndyKubrin0 -
Should I set blog category/tag pages as "noindex"? If so, how do I prevent "meta noindex" Moz crawl errors for those pages?
From what I can tell, SEO experts recommend setting blog category and tag pages (ie. "http://site.com/blog/tag/some-product") as "noindex, follow" in order to keep the page quality of indexable pages high. However, I just received a slew of critical crawl warnings from Moz for having these pages set to "noindex." Should the pages be indexed? If not, why am I receiving critical crawl warnings from Moz and how do I prevent this?
Moz Pro | | NichGunn0 -
Site Crawl Error
In moz crawling error this message is appears: MOST COMMON ISSUES 1Search Engine Blocked by robots.txt Error Code 612: Error response for robots.txt i asked help staff but they crawled again and nothing changed. there's only robots.XML (not TXT) in root of my webpage it contains: User-agent: *
Moz Pro | | nopsts
Allow: /
Allow: /sitemap.htm anyone please help me? thank you0 -
Raven Reporting Alternative
Does anyone know of a tool that has similar capacity to Raven's Reports? I'm doing some white labeling of reports for another agency that is used to sending their clients a monthly Raven report that is basically pulling all the Google Analytics data of top referrers, keywords, landing pages, etc. and putting it into a nice format. I've used the Moz reporting and at times just combined it with a PDF of the actual GA equivalent. That's always seemed a bit clunky, but I don't want to have to take on a Raven subscription just for that. Anyone have any good alternatives, or know whether, when the new dashboard launches, it will also up its game when it comes to the generated reports?
Moz Pro | | BCutrer0 -
On-Page Summary (Report Cards) automation?
Hi everyone, Under the "On-Page" tab which shows your report cards, is there a way of getting it to grade your entire site? One of my sites is only ~20 pages, so it's no big deal to manually enter each URL and set each one to update weekly. But what if I have a site that has ~1,000 pages and I want to optimise each and every page for my main keyword using the report cards feature? Thanks in advance! 🙂 Ash
Moz Pro | | AshSEO20110