Issue with site not being properly found in Google
-
We have a website [domain name removed] that is not being properly found in Google. When we run it through Screaming Frog, it indicates that there is a problem with the robots.txt file.
However, I am unsure exactly what this problem is, or why the site is no longer being found properly.
Any help here on how to resolve this would be appreciated!
-
Note: We've edited and removed select links and images in this thread as requested by the OP for privacy.
-
Hi Thomas,
Thanks for all your help here. You've been fantastic!
We have had an issue generating a sitemap for our website using our usual sitemap creation tools. Could you explain why this is?
-
Moderator's Note: Attached images, along with select links in this thread, have been edited and/or removed for privacy at the request of the OP.
--
I noticed your robots.txt is fixed, but I would recommend two things to get your site back into the index faster. Based on the screenshots below, I am suggesting fetching your site as Googlebot as well as adding your XML sitemap to Webmaster Tools.
Please do not forget to add all four versions of your website to Webmaster Tools if they have not been added already. By that I mean add every version of the URL to Google Webmaster Tools, with and without www (and with both HTTP and HTTPS).
Then target the site to the preferred (canonical) URL. Choose the one with www.
Here is a reference from Google:
https://support.google.com/webmasters/answer/34592?hl=en&ref_topic=4564315
I would do two things. First, I would add my sitemap to my robots.txt file, because if you're going to use search tools it's going to help you.
You should set up your robots.txt just like this (note the User-agent line, which the file needs to be valid):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.website.com/sitemap_index.xml
```
You can reference:
https://yoast.com/ultimate-guide-robots-txt/
> **Allow directive**
>
> While not in the original "specification", there was talk of an `allow` directive very early on. Most search engines seem to understand it, and it allows for simple, and very readable directives like this:
>
> ```
> Disallow: /wp-admin/
> Allow: /wp-admin/admin-ajax.php
> ```
>
> The only other way of achieving the same result without an `allow` directive would have been to specifically `disallow` every single file in the `wp-admin` folder.

This is because you don't want your login to be showing up in Google.
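If you want to sanity-check that this Allow/Disallow pair behaves the way you expect, here is a minimal sketch using Python's built-in urllib.robotparser (the domain is the placeholder used in this thread). One caveat: Python's parser applies rules in file order (first match wins), so the Allow line goes first in this test; Googlebot itself uses the most specific matching rule, so the order in your real robots.txt does not matter to Google.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the recommendation above; www.website.com is a placeholder.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The admin/login area stays blocked...
print(rp.can_fetch("Googlebot", "https://www.website.com/wp-admin/"))                # False
# ...while the AJAX endpoint many WordPress themes need stays crawlable...
print(rp.can_fetch("Googlebot", "https://www.website.com/wp-admin/admin-ajax.php"))  # True
# ...and ordinary pages are unaffected.
print(rp.can_fetch("Googlebot", "https://www.website.com/some-page/"))               # True
```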
After that, I would go into Webmaster Tools / Search Console and fetch as Googlebot.
Ask Google to re-crawl your URLs
If you’ve recently made changes to a URL on your site, you can update your web page in Google Search with the _Submit to Index_ function of the Fetch as Google tool. This function allows you to ask Google to crawl and index your URL.
See
http://searchengineland.com/how-to-use-fetch-as-googlebot-like-seo-samurai-214292
https://support.google.com/webmasters/answer/6066468?hl=en
Ask Google to crawl and index your URL
- Click Submit to Index, shown next to the status of a recent, successful fetch in the Fetches Table.
- Select **Crawl only this URL** to submit one individual URL to Google for re-crawling. You can submit up to 500 individual URLs in this way within a 30-day period.
- Select **Crawl this URL and its direct links** to submit the URL as well as all the other pages that URL links to for re-crawling. You can submit up to 10 requests of this kind within a 30-day period.
- Click Submit to let Google know that your request is ready to be processed.
Adding your XML sitemap to Google Webmaster Tools
(https://www.website.com/sitemap_index.xml)
will help Google determine that you are back online, and you should not see any real fallout from this. Submitting a complete XML sitemap also gets a lot of images into Google Images.
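Before submitting it, it can be worth confirming that the sitemap index actually resolves and lists the URLs you expect. Here is a rough sketch in Python; it assumes the placeholder sitemap URL above is live and that the files use the standard sitemaps.org namespace.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder URL from this thread; swap in your real sitemap index.
INDEX_URL = "https://www.website.com/sitemap_index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_xml(url):
    """Download a URL and parse the response body as XML."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return ET.fromstring(resp.read())

index = fetch_xml(INDEX_URL)
for loc in index.findall("sm:sitemap/sm:loc", NS):
    child_url = loc.text.strip()
    url_count = len(fetch_xml(child_url).findall("sm:url/sm:loc", NS))
    print(f"{child_url}: {url_count} URLs")
```

If every child sitemap comes back with a sensible URL count, the index is safe to submit in Webmaster Tools.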
I hope this helps,
Tom
-
Hi, it seems your robots.txt file is blocking Google and all the other bots that crawl the web and obey robots.txt (basically, the good ones). So if you would like your site to be seen and indexed by Google and other search engines, you need to remove the forward slash "/" shown here in your robots.txt file:
```
# Block all web crawlers from all content
User-agent: *
Disallow: /
```
Go here to see it: https://www.website.com/robots.txt
Please read https://moz.com/learn/seo/robotstxt
-
You can use this to make the file: http://tools.seobook.com/robots-txt/generator/
It looks like you're using WordPress, so if you're using Apache or Yoast SEO you can go in and set it to use this. I added your XML sitemap (https://www.website.com/sitemap_index.xml):
```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.website.com/sitemap_index.xml
```
You can use tools like this to analyze and fix robots.txt, and you can always see it yourself by adding /robots.txt after the .com or TLD.
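If you prefer to check it from a script rather than a browser, here is a quick sketch (again using the placeholder domain from this thread) that reads the live robots.txt and reports whether Googlebot may crawl a few key paths:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.website.com"  # placeholder domain, as used elsewhere in this thread

# Download and parse the live robots.txt file.
rp = RobotFileParser(SITE + "/robots.txt")
rp.read()

# Report whether Googlebot may crawl a few key paths.
for path in ("/", "/wp-admin/", "/wp-admin/admin-ajax.php"):
    verdict = "allowed" if rp.can_fetch("Googlebot", SITE + path) else "BLOCKED"
    print(f"Googlebot {verdict}: {path}")
```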
I hope that helps,
Tom