We recently transitioned a site to our server, but Google is still showing the old server's URLs. Is there a way to stop Google from showing the old URLs?
-
-
DeepCrawl is great for large sites
-
I would recommend running deepcrawl.com on your old domain so you can remap / rewrite the old domain and its URLs; if the old URLs are redirected to the new ones, it will help your new website, or at least minimize the damage.
To answer your question directly: yes, without 301 redirects you are going to lose any authority your old domain has, and yes, that's bad.
Use archive.org; it might have a copy of your entire site structure. Start from there.
Do you have backups?
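If archive.org does have your old structure, one way to recover the URL list and turn it into redirects is the Wayback Machine's CDX API. A minimal sketch, assuming the new site keeps the same paths as the old one (the domain names below are hypothetical, and the output uses Apache's `Redirect 301` directive; nginx or other servers need a different format):

```python
# Sketch: recover old URLs from the Wayback Machine CDX API and emit
# Apache "Redirect 301" rules. Assumes paths are unchanged on the new
# server; adjust the mapping if the new site structure differs.
import urllib.parse
import urllib.request

CDX_API = "http://web.archive.org/cdx/search/cdx"

def fetch_archived_urls(domain):
    """Ask the CDX API for every distinct URL it captured for a domain."""
    query = urllib.parse.urlencode({
        "url": domain + "/*",
        "fl": "original",       # return only the original URL column
        "collapse": "urlkey",   # de-duplicate repeated captures
    })
    with urllib.request.urlopen(f"{CDX_API}?{query}") as resp:
        return resp.read().decode().splitlines()

def redirect_rules(old_urls, new_domain):
    """Turn each old URL into an Apache 'Redirect 301' line."""
    rules = []
    for url in old_urls:
        path = urllib.parse.urlsplit(url).path or "/"
        rules.append(f"Redirect 301 {path} https://{new_domain}{path}")
    return rules
```

The resulting lines can be dropped into the old server's `.htaccess` (or vhost config) so each legacy URL passes its authority to the matching page on the new server.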
-
Unfortunately, we did not do 301 redirects for the entire site, and now we don't have the old URLs to create them. Is the lack of 301 redirects going to cause serious problems with Google?
-
I agree that keeping the sitemap up to date will lead Googlebot to your site much faster, and you should use Fetch as Googlebot on the entire site.
Be certain that you have done a page-by-page 301 redirect for the entire site. After that, you can look into this method of removing data from Google's index and cache.
I recommend not removing content this way unless it is doing damage to your site:
https://support.google.com/webmasters/answer/1663691?hl=en
How to remove outdated content
- Remove a page that was already deleted from a site from search results
- Remove an outdated page description or cache
Follow the instructions below if the short description of the page in search results (the snippet) or the cached version of the page is out of date.
- Go to the Remove outdated content page.
-
No problem! Here is a pretty comprehensive list of resources. I personally use ScreamingFrog.
Good luck!
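For what it's worth, the sitemap format itself is simple XML, so if a crawler like ScreamingFrog gives you the URL list, generating the file is straightforward. A minimal sketch (the URLs below are hypothetical); note one sitemap file may hold up to 50,000 URLs, so 19,000 pages fit in a single file:

```python
# Sketch: build a sitemap.xml string from a list of absolute URLs.
# Sites over 50,000 URLs would need multiple files plus a sitemap index.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) for the given absolute URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        # Each URL gets a <url><loc>…</loc></url> entry.
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```

Write the returned string to `sitemap.xml` at the site root and submit it in Webmaster Tools.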
-
Perfect sense. Thank you. Do you know of any good tools that will create an XML sitemap of at least 19,000 pages?
-
Hi again!
Every page should be on the sitemap so long as it's not behind a login or not supposed to be seen by search engines or users. I would update it and make sure pages aren't noindexed or blocked in your robots.txt. It shouldn't be limited to just your top navigation. Search engines will still crawl and see those deeper pages (not top nav) exist, but uploading them to the sitemap will help expedite the indexing process.
Does that make sense?
-
Thanks for getting back to me. It's the same domain, so no change of address is needed. We did upload a new sitemap, but it only has 100 pages on it, where the old sitemap had 19,000. Does the sitemap need every page on it, or just the top navigation pages?
-
Hi Stamats
Did you update your XML sitemap and also submit it to Webmaster Tools? If you changed your domain name, you should look into a change of address as well.
Keep in mind that it could take Google a little while to notice these changes, so do your best to help them along with the steps above.
Hope this helps! Let me know if you need anything else!