Google Search Console Block
-
I'm new to SEO.
My client's site was completed using Yoast Premium, and I then used Google Search Console to initiate the crawl.
Initially I set up an http:// property and all seemed good. Then I removed that under Search Console and created an https:// property, did the render, and it appears Google has put a block in place and applied their own robots.txt file, which has basically rendered the site useless.
Feedback most appreciated.
-
What is interesting is that all the individual pages display correctly in the browser except the "home" page.
-
No problem, good luck! Moz has plenty of great resources to help you along the way. Be sure to check out the Beginner's Guide to SEO.
-
Ok looks like I have work to do so will focus on these things now...
I was trying to create a rather flat layout, as there are only a few pages; however, I do have a "services" page, so I will put internal links between the home page and services and incorporate that page into the process.
I believe it could be a wise investment at this stage to step back, get Yoast further involved, and do a "Gold Review" on the site... this should fill in the gaps and raise my SEO knowledge.
Really appreciate the feedback...
-
Responses to the first 3 questions:
- HTTPS is in place, but a redirect is not in place to push HTTP to HTTPS
- Ok good, keep all Search Console profiles intact; they're a good way to identify problems specifically as they relate to HTTP and HTTPS indexing (you don't want both to show)
- This search, site:albertaautosales.com. As you can see when you click that link, you've only got a few URLs indexed, 2 for the homepage, with and without HTTPS.
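On the redirect point: a common way to force HTTPS on an Apache host is a 301 rewrite rule. This is a generic sketch, assuming mod_rewrite is enabled and .htaccess overrides are allowed; it is not taken from the site's actual configuration:

```apache
# .htaccess — permanently redirect every HTTP request to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The R=301 flag matters for SEO: it tells crawlers the move is permanent so link equity consolidates on the HTTPS URLs.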
Now that I have the domain, I see a few problems.
- You have no internal linking; Screaming Frog will not go beyond the homepage. Upon further inspection, the only internal link I saw on the homepage was to a dead URL.
- Google isn't creating a robots.txt file for you; there's just nothing for them to crawl, as a result of my previous point.
- I cannot view your source code, and if I can't see it, chances are Google can't either.
If this currently live version of the site is a placeholder for development, I'd recommend putting the old site back out there and working on the new site in a development environment.
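One quick sanity check on the source-code point is to request the page the way a crawler would and compare what comes back with what a browser shows. A minimal Python sketch using only the standard library; the user-agent string is illustrative:

```python
import urllib.request

# Illustrative Googlebot user-agent string
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def crawler_request(url):
    """Build a request that identifies itself as Googlebot,
    so the response can be compared with a normal browser fetch."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = crawler_request("https://albertaautosales.com/")
print(req.get_header("User-agent"))
# urllib.request.urlopen(req).read() would then fetch the HTML the crawler sees
```

If the HTML returned to this request is empty or script-only while the browser renders a full page, that gap is what a search engine is (or isn't) seeing.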
-
Hi Logan;
Thanks for reply...
the site is -- https://albertaautosales.com
-
Yes the HTTPS has been setup correctly and is active with no issues on all pages.
-
Yes, I realize now that I could have left the HTTP profile. It actually had a complete status and was ranking for my keyword phrases (I also set up a campaign in Moz). I did activate it again; however, it now shows blank pages even though the status is complete.
-
Not sure if I get your question 3. Prior to removing the HTTP profile and setting up the HTTPS profile, the site was fine and the Google ranking process was occurring...
I have created a help ticket with Google under Search Console, but I have no idea how prompt they are in responding. The site is simply down, just showing some images. From what I can see, Google blocked it by applying a very restrictive robots.txt file... but I'm not sure, as I am new to this.
Appreciate
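For reference, you can test how any given robots.txt behaves with Python's standard library. This sketch parses a fully restrictive file (the "block everything" form), not the site's actual one, to show what such a block looks like to a crawler:

```python
from urllib.robotparser import RobotFileParser

# A "disallow everything" robots.txt — the most restrictive possible form
restrictive = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(restrictive)

# If a file like this were served, no crawler could fetch any page
print(rp.can_fetch("Googlebot", "https://albertaautosales.com/"))
```

You can swap in the lines of your live robots.txt to confirm whether it is actually what's blocking Googlebot.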
-
-
Hi David,
I've got a few questions before I can provide any advice.
- Is the site using HTTPS everywhere?
- Why shut down the HTTP Search Console profile? You should always have all four versions of your domain set up in SC - http/https and www/non-www.
- Have you done a site:domain.com search in Google to verify indexation?
Related Questions
-
How Does Google View Hidden Content?
I have a website which contains a lot of content behind a show/hide toggle; does Google crawl the "hidden" copy?
Web Design | jasongmcmahon
I am using a <noscript> tag on all webpages and Google does not crawl my site automatically; any solution? The <noscript> block contains only a meta refresh: <meta http-equiv="refresh" content="0;url=errorPages/content-blocked.jsp?reason=js">. Please tell me whether or not this affects SEO.
Web Design | ahtisham2018
Any risks involved in removing a sub-domain from search index or completely taking down? Ranking impact?
Hi all, One of our sub-domains has thousands of indexed pages, but traffic is very low and irrelevant. There are links between this sub-domain and our other sub-domains. We are planning to take this sub-domain down completely. What happens if we do? Will Google respond with a ranking change? Thanks
Web Design | vtmoz
Hiding content until user scrolls - Will Google penalize me?
I've used "opacity:0;" to hide sections of my content, which are triggered to show (using JavaScript) once the user scrolls over these sections. I remember reading a while back that Google essentially ignores content which is hidden on your page (it mentioned they don't index it, so it's close to impossible to rank for it). Is this still the case? Thanks, Sam
Web Design | Sam.at.Moz
Do you know any tool(s) to check if Google can crawl a URL?
Our site is currently blocking search bots, which is why I can't use Google Webmaster Tools' URL fetch tool. In Screaming Frog, there are dynamic pages that can't be found if I crawl the homepage. Thanks in advance!
Web Design | esiow2013
Given the latest Google update, should I rewrite my Flash site or try to present an alternative HTML/CSS site?
I have a site that was created using Flash. The reasoning behind this was, at the time, that I didn't care if the site ranked or not (portfolio site). Now I would like to drive traffic to the site from SEs. Given the Penguin update, should I rewrite my Flash site in HTML/CSS or present an alternative site for bots and browsers that don't support Flash? My concern is that by presenting an alternative site to bots and non-Flash-supporting browsers, the SEs will potentially see this as cloaking. Thoughts and advice would be much appreciated.
Web Design | mj775
Should /dev folder be blocked?
I have been experiencing a ranking drop every two months, so I came upon a new theory this morning... Does Google do a deep crawl of your site, say, every 60-90 days, and would they penalize a site if they crawled into your /dev area, which would contain pretty much the exact same URLs and content as your production environment, and therefore flag you for duplicate content? The only issue I see with this theory is that I have been penalized only for specific keywords on specific pages, not necessarily across the board. Thoughts? What would be the best way to block out your /dev area?
Web Design | BoulderJoe
Why is site not being indexed by Google, and not showing on a crawl test??
On a site we developed, where the .com is forwarded to the .net domain, we quit getting crawled by Google on about the 20th of Feb. Now when we try to run a crawl test on either URL, we get: "There was an error fetching this page. Error description: For some reason the page returned did not describe itself as an HTML page. It could be possible that the URL is serving an image, RSS feed, PDF, or XML file of some sort. The crawl tool does not currently report metrics on this type of data." Our other sites are fine, and this one was too up until that date. We took out noodp and noydir today, as that was the only thing we could think of. The site is on the WordPress CMS.
Web Design | RobertFisher