We recently transitioned a site to our server, but Google is still showing the old server's URLs. Is there a way to stop Google from showing the old URLs?
-
Deep Crawl is great for large sites
-
I would recommend using deepcrawl.com on your old domain so you can remap / rewrite the old domain's URLs. If the old URLs are rewritten to point at the new site, it will help your new website, or at least minimize the damage.
To answer your question directly: yes, do the 301 redirects; otherwise you are going to lose any authority your old domain has, and yes, that's bad.
Use archive.org; it might have a copy of your entire site structure, so start from there.
Do you have backups?
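If you go the archive.org route, here is a rough sketch (just one way to do it, not something anyone in this thread described) that pulls the old site's URL list out of the Wayback Machine's CDX API so you have something to build 301 redirects and a sitemap from; www.oldsite.com and old-urls.txt are placeholders:

    import requests

    CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"
    params = {
        "url": "www.oldsite.com/*",     # everything under the old domain (placeholder)
        "output": "json",               # JSON rows instead of plain text
        "fl": "original",               # only the captured URL itself
        "collapse": "urlkey",           # de-duplicate repeated captures
        "filter": "statuscode:200",     # skip captures of error pages
    }

    rows = requests.get(CDX_ENDPOINT, params=params, timeout=60).json()
    old_urls = [row[0] for row in rows[1:]]   # first row is the header row

    with open("old-urls.txt", "w") as fh:
        fh.write("\n".join(old_urls))
    print("Recovered %d URLs from the Wayback Machine" % len(old_urls))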
-
Unfortunately, we did not do 301 redirects for the entire site and now we don't have the old urls to create the 301 redirects. Is this going to cause serious problems with Google by not having 301 redirects?
-
I agree that keeping the sitemap up to date is definitely going to lead Googlebot to your site much faster, and you should also run Fetch as Googlebot on the entire site.
Be certain that you have done page-by-page 301 redirects for the entire site (a quick way to spot-check them is sketched at the end of this post). After that, you can look into this method of removing data from Google's index and cache, though I recommend not removing anything this way unless it is doing damage to your site:
https://support.google.com/webmasters/answer/1663691?hl=en
How to remove outdated content
<a class="zippy index1 goog-zippy-header goog-zippy-collapsed" tabindex="0">Remove a page that was already deleted from a site from search results</a><a class="zippy index2 goog-zippy-header goog-zippy-expanded" tabindex="0">Remove an outdated page description or cache</a>
Follow the instructions below if the short description of the page in search results (the snippet) or the cached version of the page is out of date.
- Go to the Remove outdated content page.
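Coming back to the page-by-page 301 point above, here is a minimal sketch for spot-checking that each old URL really returns a 301 to its new home; it assumes a plain-text list of old URLs in old-urls.txt (a placeholder name, e.g. the list recovered from archive.org):

    import requests

    with open("old-urls.txt") as fh:
        old_urls = [line.strip() for line in fh if line.strip()]

    for url in old_urls:
        resp = requests.get(url, allow_redirects=False, timeout=30)
        if resp.status_code in (301, 308):
            print("OK  %s -> %s" % (url, resp.headers.get("Location")))
        else:
            print("FIX %s returned %s" % (url, resp.status_code))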
-
No problem! Here is a pretty comprehensive list of resources. I personally use ScreamingFrog.
Good luck!
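If the crawler gives you a plain URL export rather than a finished sitemap, a short script can do the conversion; this is only a sketch (the file names are placeholders), and since the sitemap protocol allows up to 50,000 URLs per file, roughly 19,000 pages fit comfortably in a single sitemap.xml:

    from xml.sax.saxutils import escape

    with open("crawl-urls.txt") as fh:                 # one URL per line
        urls = [line.strip() for line in fh if line.strip()]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")

    with open("sitemap.xml", "w", encoding="utf-8") as fh:
        fh.write("\n".join(lines))
    print("Wrote %d URLs to sitemap.xml" % len(urls))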
-
Perfect sense. Thank you. Do you know of any good tools that will create an xml site map of at least 19,000 pages?
-
Hi again!
Every page should be on the sitemap as long as it isn't behind a login or otherwise not meant to be seen by search engines or users. I would update it and make sure the pages you include aren't noindexed or blocked in your robots.txt. It shouldn't be limited to just your top navigation pages. Search engines will still crawl and discover those deeper (non-top-nav) pages, but including them in the sitemap will help expedite the indexing process.
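As a rough illustration of that filtering (the domain and file names are placeholders, and the meta-tag check is deliberately crude), something like this can test whether a URL is blocked by robots.txt or carries a noindex before you put it in the sitemap:

    import re
    import requests
    from urllib import robotparser

    robots = robotparser.RobotFileParser("https://www.example.com/robots.txt")
    robots.read()

    # crude check for <meta name="robots" ... content="...noindex...">
    NOINDEX_RE = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex', re.I)

    def sitemap_worthy(url):
        if not robots.can_fetch("Googlebot", url):
            return False                          # blocked in robots.txt
        resp = requests.get(url, allow_redirects=False, timeout=30)
        if resp.status_code != 200:
            return False                          # redirects or broken pages
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            return False                          # noindexed via HTTP header
        return not NOINDEX_RE.search(resp.text)   # noindexed via meta tag

    print(sitemap_worthy("https://www.example.com/some-page/"))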
Does that make sense?
-
Thanks for getting back to me. It's the same domain so no change of address needed. We did upload a new site map, but the new site map only has 100 pages on it where the old site map had 19,000. Does the site map need every page on it or just the top navigation pages?
-
Hi Stamats
Did you update your XML sitemap and also submit it to Webmaster Tools? You should also look into filing a change of address, but only if you actually changed your domain name.
Keep in mind that it could take Google a little while to notice these changes, so do your best to speed things along with the steps above.
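One extra nudge, besides resubmitting the sitemap in Webmaster Tools, is the sitemap ping endpoint Google has historically supported; a minimal sketch, with the sitemap URL as a placeholder:

    import requests

    sitemap_url = "https://www.example.com/sitemap.xml"
    resp = requests.get("http://www.google.com/ping",
                        params={"sitemap": sitemap_url}, timeout=30)
    print(resp.status_code)   # 200 means the ping was received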
Hope this helps! Let me know if you need anything else!
Related Questions
-
Do you think profanity in the content can harm a site's rankings?
In my early 20's I authored an ebook that provides men with natural ways to improve their ahem... "bedroom performance". I'm now in my mid 30s, and while it's not such an enthralling topic, the thing makes me 80 or so bucks a day on good days, and it actually works. I update the blog from time to time and build links to it on occasion from good sources. I've carried my SEO knowledge to a more "reputable" business, but this project is still interesting to me, because it's fully mine. I am more interested in getting it to rank and convert than anything, but following the same techniques that are working to grow the other business, this one continues to tank. Disavow bad links, prune thin content.. no difference. However, one thing I just noticed now are my search queries in the reports. When I first started blogging on this, I was real loose with my tongue, and spoke quite frankly (and dirty to various degrees). I'm much more refined and professional in how I write now. However, the queries I'm ranking for... a lot of d words, c words (in the sex sense)... sounds almost pornographic. Think Google may be seeing this, and putting me lower in rankings or in some sort of lower level category because of it? Heard anything about google penalizing for profanity? I guess in this time of authority and trust, that can hurt both of those... but I wonder if anyone's heard any actual confirmation of this or has any experience with this? Thanks!
Algorithm Updates | DavidCapital
-
Google Webmaster Tools shows an error under Manual Actions, while the Structured Data Testing Tool shows no errors.
It is showing the error below: "Spammy structured markup. Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's spammy structured markup guidelines." Meanwhile, the Structured Data Testing Tool doesn't show any error.
Algorithm Updates | infinitemlm
-
The evolution of Google's 'Quality' filters - Do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO. I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On this basis I'd expect that the filters are now less heavy handed and recognise that product pages with little or no product description can still be a quality user experience for people who want to buy that product. Therefore my question is this:
Do thin product pages still need noindex, given that more often than not they are a quality search result for those using a product-specific search query? Has anyone experienced a penalty recently (in the last 12 months) on an ecommerce site because of a high number of thin product pages?
Algorithm Updates | QubaSEO
-
Where has Google found the £1.00 value for the penny black? Is it Google moving beyond the mark-ups too?
Hi guys, I am curious about something in the Penny Black SERPs.
Apparently Google shows a value of £1.00 [Penny Black SERP]. Where does it come from? It's not the value [Penny Black Value SERP]. The Wikipedia page doesn't have any mark-up for it; actually it has a Price value mark-up of 1 penny [Penny Black Wiki Markup]. Among the rare stamps, the Inverted Jenny also shows a value [Inverted Jenny SERP], but that is clearly taken from USPS and it's the cost of a new version of this rare stamp [USPS Inverted Jenny]; indeed, the mark-up matches that value [USPS Inverted Jenny Mark-up]. I've been looking online for a new version of the Penny Black, but couldn't find anything.
The only small piece of information that I've found to correlate one pound with the Penny Black is on the Wikipedia page, but the point is: is Google able to extract that information from that passage? It's not a mark-up, it's not a bare number, and it's not a simple sentence like "The Penny Black cost £1.00"; it reads "One full sheet cost 240 pennies or one pound sterling" [Penny Black Wikipedia particular]. Is Google moving beyond the mark-ups too? Thanks, Pierpaolo
Algorithm Updates | madcow78
-
Any way to tell if a link has been devalued?
I have some listings in lawyer directories, some of which have very high PR, links, traffic, etc., for example www.nolo.com. I know that Google has more or less recently devalued a lot of directory links. I would assume that a monster site like Nolo would not be one of those, but does anyone know any way to tell? Paul
Algorithm Updates | diogenes
-
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems fixed against this site's purpose. I need some advice on what I'm planning and what could be done.
First, the issues:
Content Length: The site is legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages, by virtue of the content, are thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't too much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric: There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people that visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.
My strategy so far:
Noindex some Pages: Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to the pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
Create more click incentives: We already started with related terms and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.
Expand Content (of course): The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. Still won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign: Looking to lighten up the code and boilerplate content shortly. We were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.
What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
Algorithm Updates | sprynewmedia
-
Does Google index Wordpress pages with frames
Does Google or other search engines index Wordpress pages that use frames? Here is the site in question: http://www.source-nutrition.com/son/
Algorithm Updates | BradBorst
-
Google said that low-quality pages on your site may affect rankings on other parts of the site
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. Problem is, my site is a six-year-old, heavily linked, popular Wordpress blog. I do not know why Google believes that it is low quality. The only reason I came up with is the statement that low-quality pages on a site may affect other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing Wordpress tag, archive and category pages from the Google index? Or would you suggest waiting a bit more before doing something that drastic? Or do you have another idea of what I could do? I invite you to take a look at the site www.ghacks.net
Algorithm Updates | badabing