We recently transitioned a site to our server, but Google is still showing the old server's URLs. Is there a way to stop Google from showing the old URLs?
-
DeepCrawl is great for large sites.
-
I would recommend running deepcrawl.com on your old domain so you can map out the old domain's URLs and set up rewrites. If the old URLs are redirected, it will help your new website, or at least minimize the damage.
To answer your question directly: yes, and the fix is to 301 redirect. Without redirects you are going to lose any authority your old domain has, and yes, that's bad.
Use archive.org; it might have a copy of your entire site structure. Start from there.
Do you have backups?
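If not, here's a rough sketch (not a polished tool) of how you could pull the archived URL list out of the Wayback Machine's CDX API with Python, to have something to build redirects from. The domain example.com is a placeholder for your old domain:

```python
import requests

# Wayback Machine CDX API: lists every URL it has archived for a domain.
# "example.com" is a placeholder; substitute your old domain.
resp = requests.get(
    "http://web.archive.org/cdx/search/cdx",
    params={
        "url": "example.com/*",  # wildcard matches all paths on the domain
        "output": "json",
        "fl": "original",        # return only the original URL field
        "collapse": "urlkey",    # de-duplicate repeated captures of a URL
    },
    timeout=60,
)
rows = resp.json()                   # first row is a header row
urls = [row[0] for row in rows[1:]]

with open("old_urls.txt", "w") as f:
    f.write("\n".join(urls))

print(f"Recovered {len(urls)} archived URLs")
```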
-
Unfortunately, we did not do 301 redirects for the entire site, and now we don't have the old URLs to create the 301 redirects. Is this going to cause serious problems with Google by not having 301 redirects?
-
I agree that keeping the sitemap up to date is definitely going to lead Googlebot to your site much faster, and you should use Fetch as Googlebot on the entire site.
Be certain that you have done page-by-page 301 redirects for the entire site. After that, you can look into this method of removing data from Google's index and cache.
I recommend not removing anything unless it is doing damage to your site:
https://support.google.com/webmasters/answer/1663691?hl=en
How to remove outdated content
Remove a page that was already deleted from a site from search results
Remove an outdated page description or cache
Follow the instructions below if the short description of the page in search results (the snippet) or the cached version of the page is out of date.
- Go to the Remove outdated content page.
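And if you can rebuild a mapping of old URLs to new ones (the archive.org list mentioned above is one source), generating the page-by-page 301 rules can be scripted. Here's a rough Python sketch that assumes an Apache server and a hypothetical redirects.csv with old_path,new_url rows:

```python
import csv

# Hypothetical input: redirects.csv with rows of "old_path,new_url",
# e.g. "/old-page.html,http://www.example.com/new-page/"
with open("redirects.csv", newline="") as f:
    mappings = [row for row in csv.reader(f) if row]

# Emit one permanent (301) redirect per old URL, in Apache .htaccess syntax.
with open("redirects.htaccess", "w") as out:
    for old_path, new_url in mappings:
        out.write(f"Redirect 301 {old_path} {new_url}\n")

print(f"Wrote {len(mappings)} redirect rules")
```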
-
No problem! Here is a pretty comprehensive list of resources. I personally use Screaming Frog.
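If you'd rather roll your own instead of using a crawler, here's a minimal Python sketch that writes a standard XML sitemap from a plain list of URLs (a hypothetical urls.txt, one absolute URL per line). The sitemap protocol caps each file at 50,000 URLs, so your 19,000 pages fit in a single file:

```python
from xml.sax.saxutils import escape

# Hypothetical input: urls.txt with one absolute URL per line.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# The sitemap protocol allows up to 50,000 URLs per file, so ~19,000
# pages fit in a single sitemap.xml.
assert len(urls) <= 50000, "split into multiple sitemaps plus an index file"

with open("sitemap.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
    out.write("</urlset>\n")

print(f"Wrote sitemap.xml with {len(urls)} URLs")
```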
Good luck!
-
Perfect sense. Thank you. Do you know of any good tools that will create an XML sitemap of at least 19,000 pages?
-
Hi again!
Every page should be on the sitemap, as long as it isn't behind a login or meant to be hidden from search engines or users. I would update the sitemap and make sure those pages aren't noindexed or blocked in your robots.txt; it shouldn't be limited to just your top navigation. Search engines will still crawl and discover those deeper (non-top-nav) pages, but including them in the sitemap will help expedite indexing.
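If it helps, here's a quick-and-dirty Python sketch for auditing exactly that: it checks whether each sitemap URL is blocked by robots.txt or appears to carry a noindex meta tag. The domain and the urls.txt file are placeholders:

```python
import urllib.robotparser
import requests

SITE = "https://www.example.com"  # placeholder; use your own domain

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

with open("urls.txt") as f:  # hypothetical list of your sitemap URLs
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Flag URLs that robots.txt blocks for Google's crawler.
    if not rp.can_fetch("Googlebot", url):
        print(f"Blocked by robots.txt: {url}")
        continue
    # Crude string check for a robots meta tag carrying "noindex";
    # a real audit would parse the HTML properly.
    html = requests.get(url, timeout=30).text.lower()
    if 'name="robots"' in html and "noindex" in html:
        print(f"Possible noindex tag: {url}")
```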
Does that make sense?
-
Thanks for getting back to me. It's the same domain, so no change of address needed. We did upload a new sitemap, but the new sitemap only has 100 pages on it where the old sitemap had 19,000. Does the sitemap need every page on it, or just the top navigation pages?
-
Hi Stamats,
Did you update your XML sitemap and submit it to Webmaster Tools? If you changed your domain name, you should look into filing a change of address as well.
Keep in mind that it can take Google a little while to notice these changes, so do your best to speed that up with the steps above.
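One extra trick: besides submitting in the Webmaster Tools UI, Google has accepted sitemap pings as a plain GET request, so you can script a resubmission whenever the sitemap changes. A rough sketch (the sitemap URL is a placeholder):

```python
import requests

sitemap_url = "https://www.example.com/sitemap.xml"  # placeholder

# Google's sitemap "ping" endpoint accepts a simple GET request.
resp = requests.get(
    "https://www.google.com/ping",
    params={"sitemap": sitemap_url},
    timeout=30,
)
print(resp.status_code)  # 200 means the ping was received
```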
Hope this helps! Let me know if you need anything else!