Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Staging website got indexed by Google
-
Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and make it noindex?
Note: we already added a meta NOINDEX tag in the head.
-
It's good that you have already added the meta NOINDEX tag.
Next, you can ask Google to remove the staging URLs from its index. Open Google Search Console and use the Removals tool to request removal of the specific URLs.
To use the Removals tool, you can:
- Open the Removals tool.
- Select the Temporary Removals tab.
- Click New Request.
- Enter the URL you want removed and select Next to complete the request.
Warm Regards
Rahul Gupta
Suvidit Academy
-
If your staging website has been indexed by Google, it means that Google's web crawlers have discovered and added your staging site's pages to their search index. This is typically not desirable because staging websites are meant for testing and development purposes and often contain incomplete or confidential content.
To address this issue, you can take several steps. First, make sure your staging website has a properly configured robots.txt file. This file tells search engines which parts of your site they may crawl. For a staging site, you can disallow all crawlers site-wide.
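For example, a minimal staging-wide robots.txt is just two lines (a sketch; one caveat worth noting is that blocking crawling also stops Google from re-reading a noindex tag on pages it has already indexed):
User-agent: *
Disallow: /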
Another effective measure is to include a "noindex" meta tag in the HTML of your staging website's pages. This tag instructs search engines not to index the page, adding an extra layer of protection.
Consider password-protecting your staging website using HTTP authentication. This adds an additional layer of security and ensures that only authorized users can access the site.
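As an illustration, HTTP Basic Auth can be enabled on an Apache server with a few lines in the staging site's .htaccess (a sketch; the realm name and file path are placeholders to adapt for your setup):
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /path/to/.htpasswd
Require valid-user
The .htpasswd file itself can be created with the htpasswd utility, for example: htpasswd -c /path/to/.htpasswd staginguser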
To further mitigate indexing issues, you can set up your staging website on a subdomain or a subdirectory instead of a separate domain. Google is less likely to index staging content if it's located in a subdomain or subdirectory.
If your staging site is already indexed, you can request the removal of specific URLs from Google's index using the Google Search Console's URL Removal Tool. This is a more proactive approach to remove already indexed content.
Lastly, regularly monitor your staging website to ensure it remains hidden from search engines and that any changes to the robots.txt file or meta tags are being followed. It's a good practice to implement these measures before you create or launch a staging website to prevent it from being indexed in the first place.
Remember that it may take some time for Google to update its index and remove your staging site's pages. Be patient and continue to monitor the situation closely to ensure the desired results are achieved.
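As a quick spot check while you monitor (a sketch assuming a Unix-like shell and a placeholder staging URL), you can confirm that a staging page is actually serving a noindex directive:
# check the response headers for an X-Robots-Tag
curl -sI https://staging.example.com/some-page/ | grep -i x-robots-tag
# check the returned HTML for a robots meta tag
curl -s https://staging.example.com/some-page/ | grep -i 'name="robots"'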
-
If a staging website (a non-production or testing version) gets indexed by Google, it can lead to privacy, user experience, and SEO issues. To address this, use methods like robots.txt, "noindex" meta tags, or password protection to prevent indexing. If already indexed, request removal through Google Search Console to ensure only the production site is visible in search results.
-
If your staging website has been indexed by Google, it means that Google's search engine has discovered and included your staging site in its search results. This is not an ideal situation since staging websites are usually intended for testing and development purposes, and you may not want them publicly accessible.
To address this issue, you can take a few steps:
Use a robots.txt file: Create a robots.txt file on your staging website and instruct search engines not to crawl it. This file specifies which areas of your site search engines should or should not crawl.
Add a noindex meta tag: Insert a "noindex" meta tag in the head section of your staging website's HTML. This tag tells search engines not to index that specific page.
Password protect your staging website: Implement password protection on your staging environment to ensure that only authorized users can access it. This can be done through various authentication methods, depending on your setup.
Remember that these steps can help prevent further indexing, but they may not immediately remove your staging site from the search results. It might take some time for search engines to re-crawl your site and recognize the changes you made.
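One addition worth mentioning: if editing every template to add the noindex meta tag is impractical, a server-level X-Robots-Tag response header achieves the same effect. A minimal sketch for Apache (assuming mod_headers is enabled), placed in the staging vhost or .htaccess:
Header set X-Robots-Tag "noindex, nofollow"
This applies the directive to every response from the staging site, including non-HTML files such as PDFs and images.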
-
If your staging website gets indexed by Google, you should take these steps:
Use a robots.txt file to disallow indexing.
Request removal of indexed pages via Google Search Console.
Add a "noindex, nofollow" meta tag to staging pages.
Consider password protecting the staging site.
Ensure canonical URLs point to the production site.
These actions will help prevent your incomplete or sensitive staging content from appearing in Google search results.
-
If your staging website has been indexed by Google, it means that Google's search engine has crawled and added your staging site's pages to its search index. This is typically not desired because staging websites are not meant for public access and may contain incomplete or sensitive content.
To address this issue, you should take the following steps:
Disallow indexing: Use a robots.txt file to instruct search engines not to crawl and index your staging website. You can add the following lines to your robots.txt file to disallow all search engines:
User-agent: *
Disallow: /
Place this robots.txt file in the root directory of your staging website.
Remove indexed pages: You can request Google to remove indexed pages from its search results by using the Google Search Console's "Remove URLs" tool. Log in to your Google Search Console account, select your property, go to the "Index" section, and choose "Removals." From there, you can temporarily hide specific URLs from Google search results.
Use noindex meta tags: On your staging website's pages, you can add a meta tag to indicate that the page should not be indexed. Add the following meta tag within the HTML <head> section of each page you want to exclude:
<meta name="robots" content="noindex, nofollow">
This tag tells search engines not to index the page or follow any links on it.
Password protection: Consider adding password protection to your staging website, so only authorized users can access it. This adds an additional layer of security and privacy.
Update canonical URLs: Ensure that your staging website's canonical URLs (if used) point to the production website, not the staging one. This helps search engines understand the preferred version of your content.
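For example (placeholder domains), a page at https://staging.example.com/widgets/ could declare its production counterpart as canonical:
<link rel="canonical" href="https://www.example.com/widgets/">
Note that a canonical is a hint rather than a directive, so it works best alongside the other measures above.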
After taking these steps, monitor your staging website to ensure it's no longer being indexed by Google. Keep in mind that it may take some time for changes to take effect and for Google to de-index your staging content.
-
@Asmi-Ta said in Staging website got indexed by Google:
Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and make it noindex?
Note: we already added a meta NOINDEX tag in the head.
To remove indexed staging-site links and prevent further indexing, take these steps: add a "Disallow" rule for the staging site in your robots.txt file, use 301 redirects to point indexed staging URLs to their production counterparts, update all internal links to production URLs, request URL removals through Google Search Console's URL Removal Tool, submit an updated production sitemap, and monitor Search Console for progress. Be patient, as it may take time for search engines to de-index the staging URLs and re-crawl your site. Also ensure the staging site keeps a "noindex" tag in its <head> section.
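As a sketch of the 301-redirect step (assuming the staging site lives on a placeholder hostname such as staging.example.com and runs on Apache), a single mod_alias directive in the staging vhost or .htaccess sends every staging URL to the matching production URL:
Redirect permanent / https://www.example.com/
Bear in mind that once this redirect is live the staging site is no longer browsable for testing, so it is typically applied only while cleaning up the index.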
Related Questions
-
Website is not getting indexed
Hi,
Hope you are all doing great!
I created a dog blog a few weeks back which covers all things dogs (http://pawspulse.com/). I am publishing a couple of articles every day, each more than 5k words long and based on proper keyword research, but Google is still not indexing my content. My content is systematically organized into proper categories related to dog guides, nutrition, accessories, dog breeds, etc.
Can anyone help me get the website indexed faster and fully? Any help will be much appreciated. Thanks!
Content Development | Aman0022 -
Should I "no-index" two exact pages on Google results?
Hello everyone, I recently started a new WordPress website and created a static homepage. I noticed that in Google search results there are two different URLs landing on the same content page. I've attached an image (Google url.JPG) to explain what I saw. In the picture, the first result is the homepage, which is the page I am trying to rank. The last result lands on the same content under a different URL. So, should I "no-index" the last result shown in the image?
Technical SEO | amanda59640 -
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but then they won't be crawled. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt disallow and add the noindex? Or just add the noindex to what I already have?
Intermediate & Advanced SEO | Tylerj0 -
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people who set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012. Therefore: my dynamic sitemap now contains only articles with a release date between 1st June 2012 and now, and any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article. The question is how I can remove from the Google index, as fast as possible, all this junk that is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal here:
https://www.google.com/webmasters/tools/removals
The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems because search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in the webmaster tools. Should I submit a DELETED ITEMS sitemap using the <expires> tag? I think this is for custom search engines only, not for the generic Google search engine:
https://developers.google.com/custom-search/docs/indexing#on-demand-indexing
The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles) are of the form http://www.example.com/docid=123456. So, how can I bulk remove all the junk from the Google index relatively fast?
Intermediate & Advanced SEO | ioannisa -
Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
One of our web pages will not rank on Google. The website as a whole ranks fine, except this one section. We have tested and it looks fine: Google can crawl the page no problem, there are no spurious redirects in place, the content is fine, and there is no duplicate page content issue. The page has a dozen product images (photos), but the load time of the page is absolutely fine. We have submitted the page via Webmaster Tools and it's fine. It gets listed but then a few hours later disappears! The site has not been penalised, as we get good rankings with other pages. Can anyone help? Know about this problem?
Intermediate & Advanced SEO | CayenneRed890 -
Google Not Indexing XML Sitemap Images
Hi Mozzers, We are having an issue with our XML sitemap images not being indexed. The site has over 39,000 pages and 17,500 images submitted in GWT. If you take a look at the attached screenshot, 'GWT Images - Not Indexed', you can see that the majority of the pages are being indexed - but none of the images are. The first thing you should know about the images is that they are hosted on a content delivery network (CDN), rather than on the site itself. However, Google advice suggests hosting on a CDN is fine - see second screenshot, 'Google CDN Advice'. That advice says to either (i) ensure the hosting site is verified in GWT or (ii) submit in robots.txt. As we can't verify the hosting site in GWT, we had opted to submit via robots.txt. There are 3 sitemap indexes: 1) http://www.greenplantswap.co.uk/sitemap_index.xml, 2) http://www.greenplantswap.co.uk/sitemap/plant_genera/listings.xml and 3) http://www.greenplantswap.co.uk/sitemap/plant_genera/plants.xml. Each sitemap index is split up into often hundreds or thousands of smaller XML sitemaps. This is necessary due to the size of the site and how we have decided to pull URLs in. Essentially, if we did it another way, it may have involved some of the sitemaps being massive and thus taking upwards of a minute to load. To give you an idea of what is being submitted to Google in one of the sitemaps, please see view-source:http://www.greenplantswap.co.uk/sitemap/plant_genera/4/listings.xml?page=1. Originally, the images were SSL, so we decided to revert to non-SSL URLs as that was an easy change. But over a week later, that seems to have had no impact. The image URLs are ugly... but should this prevent them from being indexed? The strange thing is that a very small number of images have been indexed - see http://goo.gl/P8GMn. I don't know if this is an anomaly or whether it suggests no issue with how the images have been set up - thus, there may be another issue. Sorry for the long message but I would be extremely grateful for any insight into this. I have tried to offer as much information as I can, however please do let me know if this is not enough. Thank you for taking the time to read and help. Regards, Mark
Intermediate & Advanced SEO | edlondon0 -
Yoast SEO Plugin: To Index or Not to index Categories?
Taking a poll out there... In most cases, would you want to index or NOT index your category pages using the Yoast SEO plugin?
Intermediate & Advanced SEO | webestate0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google Guidelines: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.” http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster Notification: “Googlebot found an extremely high number of URLs on your site” with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I’m looking forward to your answer! Edit: Google have currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen0