Why is my old URL format still being crawled by Rogerbot?
-
Hi,
In the early days of my blog, I used permalinks with the following format:
http://www.mysitesamp.com/2009/02/04/heidi-cortez-photo-shoot/
I then used .htaccess to change the permalinks to this format:
http://www.mysitesamp.com//heidi-cortez-photo-shoot/
My question is: why does Rogerbot still crawl my old URL format, since those URLs no longer exist on my website or blog?
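For reference, a permalink change like the one described above is usually done with mod_rewrite. A minimal sketch, assuming Apache with mod_rewrite enabled and that the post slug is unchanged between the two formats (the exact rule is not from the original post):

```apache
# Redirect old date-based permalinks to the new slug-only format, e.g.
#   /2009/02/04/heidi-cortez-photo-shoot/  ->  /heidi-cortez-photo-shoot/
RewriteEngine On
RewriteRule ^[0-9]{4}/[0-9]{2}/[0-9]{2}/(.+)$ /$1 [R=301,L]
```

A 301 here also tells crawlers the old URLs have permanently moved, rather than leaving them to 404.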
-
Thanks Alan,
That solved my problem...
-
Hi Alan,
After disallowing the directory in robots.txt, Rogerbot still reports the non-existent URLs. Here is a sample URL being reported by Rogerbot:
www.lugaluda.com/2009/08/05/chase-online-banking-chase-checking-bonus/
-
If you give me the URL, I can crawl it for you if you like.
-
Thanks Alan, I really appreciate your help. You gave me an idea: since all the old URLs come from a virtual 2009 directory, I added a disallow statement for that directory to my robots.txt. Hopefully this will solve the problem.
I will let you know the results after Rogerbot finishes recrawling my site...
Thanks Dude....
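For anyone following along, the disallow rule described above would look something like this in robots.txt (the `/2009/` path is assumed from the sample URL; note that this only stops compliant bots from crawling those URLs, it does not remove the links pointing to them):

```text
User-agent: *
Disallow: /2009/
```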
-
You need to search your site. Bots start on a page and follow the links, so if they report those URLs, they must have found them somewhere. Bots like Googlebot or Bingbot can find them via other sites, but Rogerbot only crawls within your site.
-
How will I know if they still exist on my site? When I try to access the specific URLs, they are no longer active.
-
The old URL format must still exist somewhere on your site; bots follow links from your home page through the rest of your site.
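One way to find the stray links is to scan your own pages' HTML for hrefs that still use the old date-based permalink pattern. A minimal sketch (the domain and helper name are illustrative, not from the thread):

```python
import re

# Match internal links that still use the old /YYYY/MM/DD/slug/ format.
OLD_FORMAT = re.compile(r'href="(https?://[^"]*/\d{4}/\d{2}/\d{2}/[^"]*)"')

def find_old_format_links(html):
    """Return all date-based permalink URLs linked from the given HTML."""
    return OLD_FORMAT.findall(html)

page = '''<a href="http://www.example.com/2009/08/05/some-post/">old</a>
<a href="http://www.example.com/some-post/">new</a>'''
print(find_old_format_links(page))
# -> ['http://www.example.com/2009/08/05/some-post/']
```

Running this over each page's source (fetched with any HTTP client, or against exported theme/template files) will point you at the sidebar, archive widget, or template that is still emitting the old links.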