Indexing issue or just time?
-
Hey guys,
When I publish a post on our blog, I notice that it barely shows up in the SERPs, even if I copy and paste the title verbatim into Google. As far as I can tell, all my Yoast settings are correct.
Is this just Google slowly getting around to crawling our site, or is something else wrong here? We shut down and relaunched our site about 3 weeks ago.
Here is the site URL: The Tech Block
-
Yep, I also spotted them. And do all the canonical tags point to the right pages?
If everything is in place and configured correctly, then yes, it should just be a matter of time. I would, however, try to speed up the process by doing some link building, for instance via social media.
-
Actually, Steven, I just looked at my source code and I do see the canonicals.
We also have sitemaps that are being indexed by Google, and I can track that in Webmaster Tools. I guess it's just a matter of time?
-
I would recommend reading the Google Help section on this. It's quite complete: http://support.google.com/webmasters/bin/answer.py?hl=en&hlrm=nl&answer=139394.
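For reference, a canonical tag is just a `link` element in the `<head>` of each page, pointing at the URL you consider the preferred version of that page (the URL below is purely illustrative):

```html
<head>
  <!-- Tell search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="http://thetechblock.com/preferred-version-of-this-post" />
</head>
```

A page that is its own preferred version simply points at itself; duplicate variants (e.g. URLs with tracking parameters) should point at the clean URL.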
-
Steven,
Thanks for your help. What is the proper way to set up canonicals?
-
Yes, even with new posts. How are the search engines supposed to find them?
If you don't have the measures I described in place, it will take them longer to find and index the content. As things stand, you're waiting for the search engines to stumble on the content (new and old) by themselves; with sitemaps and proper canonicals in place, you're handing it to them.
Good luck!
-
Even with new posts? I can understand old posts taking a while to crawl, but new posts too? If I write something and post it, I'll check hours later and Google still hasn't indexed it. Is this normal?
-
Hi Abdel,
As CleverPhD already pointed out: shutting down and relaunching (with a changed information architecture) can cause search engines to take a while to get back up to speed indexing your website. It will take some time, and quite a bit of content and links, to speed that process up. In the meantime you can of course help the search engines by:
- Having an HTML sitemap
- Having an XML sitemap (be sure to link the XML sitemap in the HTML sitemap, add the XML sitemap to the robots.txt, and submit the XML sitemap and RSS feed in Google Webmaster Tools)
- Having proper canonical tagging and 301-redirects (if possible)
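The robots.txt part of the second point is a one-liner; the `Sitemap` directive takes an absolute URL (the path below is illustrative, not Abdel's actual sitemap location):

```
# robots.txt — point crawlers at the XML sitemap
Sitemap: http://thetechblock.com/sitemap.xml
```

This lets any crawler that fetches robots.txt discover the sitemap without you submitting it anywhere.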
It's important to get crawled, but it is also important to let the search engines crawl the right pages. Why are you linking to those tag pages in the bottom right of your website? Perhaps it's better to create category pages for that (better URL structure).
Good luck and keep us posted!
-
Say you shut down and relaunched your site 3 weeks ago, and you also changed your URL structure and title tags, and on top of that you don't have 301 redirects from old to new content and you don't have a sitemap.
All of those things, even if you did them all "correctly", can cause Google to take a while to re-spider and re-index your pages. Generally speaking, Google ranks "pages" rather than "sites", so that can impact rankings.
It looks like you canonical all the pages to themselves? For example:
<link rel="canonical" href="http://thetechblock.com/the-ios-interface-concept" />
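If you want to audit this across pages rather than eyeballing view-source, a few lines of stdlib Python can pull the canonical out of a page's HTML. This is just a sketch; fetch the page source however you like and pass it in as a string:

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag it sees."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel", "").lower() == "canonical" and attr_map.get("href"):
                self.canonicals.append(attr_map["href"])


def find_canonicals(page_source):
    """Return a list of canonical URLs found in an HTML string."""
    finder = CanonicalFinder()
    finder.feed(page_source)
    return finder.canonicals
```

A page that canonicals to itself will return its own URL; an empty list means the tag is missing entirely.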
Were you on www.thetechblock.com before, and are you now trying to change to the non-www version? If you did change to non-www, Google would see this as a new site, so the rankings would start over.
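If the move to non-www is intentional, the usual fix is a site-wide 301 so the old www URLs pass their signals to the new ones. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled (adapt the hostname to your own setup):

```apache
RewriteEngine On
# Redirect every www request to the non-www equivalent with a permanent 301
RewriteCond %{HTTP_HOST} ^www\.thetechblock\.com$ [NC]
RewriteRule ^(.*)$ http://thetechblock.com/$1 [R=301,L]
```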
If you want to look at crawl rate, you should be able to go into Google Webmaster Tools and see how often they are spidering your site. Similarly, you can submit a sitemap and see how many of its pages are indexed.