Unsolved Blog archive pages in Crawl Error Report
-
Hi there,
I'm new to MOZ Pro and have a question. My scan shows Archive pages as having crawl issues, but this is because Yoast is set up to block robots on these pages.
Should I be allowing search engines to crawl these pages, or am I fine to leave them as I have it set up already?
Any advice is greatly appreciated.
Marc -
@fcevey
If blog archive pages are showing up in the crawl error report, it indicates that search engine bots are encountering issues while attempting to crawl and index those pages. To address this:
Check URL Structure: Ensure that the URLs for your blog archive pages are correctly formatted and follow best practices. Avoid special characters, and use a logical and organized structure.
Update Sitemap: Make sure that the blog archive pages are included in your website's XML sitemap. Submit the updated sitemap to search engines using their respective webmaster tools.
Robots.txt File: Review your website's robots.txt file to ensure it's not blocking search engine bots from crawling your blog archive pages. Adjust the file if needed.
HTTP Status Codes: Check if the archive pages return the correct HTTP status codes (e.g., 200 OK). Crawl errors might be triggered if pages return 4xx or 5xx status codes.
Internal Linking: Ensure that there are internal links pointing to your blog archive pages. This helps search engines discover and index these pages more effectively.
Redirects: If you've recently changed the URL structure or migrated your website, implement proper redirects from old URLs to new ones to maintain SEO authority.
Server Issues: Investigate if there are any server-related issues causing intermittent errors when search engine bots try to access the blog archive pages.
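One quick way to act on the robots.txt point above is to test specific archive URLs against your rules before changing anything. Here is a minimal sketch using only Python's standard library; the `example.com` URLs and the `Disallow` rules are hypothetical stand-ins for your own robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt directives blocking date-based archive pages;
# substitute the real contents of your site's robots.txt.
robots_txt = [
    "User-agent: *",
    "Disallow: /2023/",
    "Disallow: /author/",
]

parser = RobotFileParser()
parser.parse(robots_txt)

# An archive URL matching a Disallow rule is not fetchable by crawlers.
print(parser.can_fetch("*", "https://example.com/2023/05/"))       # False
# A regular post URL with no matching rule is fetchable.
print(parser.can_fetch("*", "https://example.com/blog/my-post/"))  # True
```

Running this against the URLs Moz flags tells you immediately whether robots.txt (as opposed to a Yoast meta tag) is the reason those pages show up in the report.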
-
Blog archive pages in the crawl error report indicate problems encountered while crawling or accessing the archive pages of a blog or website. These errors need attention to ensure that all content remains accessible to users and search engines.
-
@mhenshall The decision to allow search engines to crawl archive pages in Yoast SEO, or to leave them as they are currently configured, depends on your specific goals and needs.
If the archive pages contain valuable and relevant content for both search engines and users, allowing them to be crawled could enhance the visibility of that content in search results. However, if the content on the archive pages is not important or is duplicated from other pages, blocking crawling could be a valid option to prevent indexing issues and improve the user experience.
I would recommend evaluating the content on the archive pages and considering how their visibility in search engines will be affected by allowing or blocking crawling. You can use tools like Google Search Console to monitor how Google is indexing those pages and make informed decisions based on the data.
Keep in mind that configuring Yoast SEO is a strategic decision that should align with your SEO goals and your website's broader strategy.
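Note that Yoast's "noindex archives" setting works through a meta robots tag in the page's HTML rather than through robots.txt, so it helps to confirm what a flagged page is actually sending. Below is a minimal sketch using only Python's standard library; the sample markup is a hypothetical example of what Yoast might emit:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "robots":
                self.directives.append(attr_map.get("content", ""))

# Hypothetical archive-page markup, as Yoast might emit it.
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'

parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)  # ['noindex, follow']
```

If the flagged archive pages return directives like `noindex, follow`, the crawl report is simply reflecting your deliberate configuration, and no change is needed.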