Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How long does Google take to show results in the SERPs once the pages are indexed?
-
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions.
A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed the URLs in it.
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
2. Does Google give more preference to websites with more pages than to those with fewer pages when displaying results in the SERPs? (I have just 50 pages.) Does the NUMBER of pages really matter?
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts would be highly appreciated. Thanks!
-
No problem, my friend; you are most welcome. As most of your site is served over https, you need to have the http versions of your URLs redirected to their https equivalents. I repeat: HTTP to HTTPS. Make sure that the redirect returns an HTTP 301 status and not anything else. If you do so, you will not lose any of the effort you have put into building links to the https version.
You can check the HTTP header status for your URLs using any of the usual tools, like the one found here: http://web-sniffer.net
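For illustration, here is a minimal Python sketch that does a similar job from the command line: it reports the status code and Location header of the first response, without following redirects (the URL at the bottom is just an example).

```python
# Minimal sketch: show the first-response status and Location header
# for a URL; http.client never follows redirects, so we see the raw reply.
import http.client
from urllib.parse import urlparse

def first_response(url):
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

# A correct setup should answer 301 with the https equivalent in Location.
print(first_response("http://www.peprismine.com/"))
```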
Best regards,
Devanur Rafi.
-
Hey thanks Moosa.
-
Hello Devanur,
Thanks for the prompt reply. I never knew that http and https would be so much trouble. Will get this one resolved. Btw, I just wanted to know: after making these changes (https to http), will the link value be passed/redirected from https to http, or will I lose the entire effort made on the https pages? Thanks again. Awaiting your reply.
Regards,
PepMoBot
-
Sorry, but I am a little lazy at writing, so I will try to keep it short and simple.
There is no fixed time for it... but your website should appear for branded terms. For example, if your website is www.exampleABC.com, it should at least appear for “example ABC”. If you want to target more keywords and have your website appear for them, then besides optimized pages you need some targeted links pointing back to your website.
-
Hi,
There is no fixed time after which, or within which, an indexed page starts appearing in the SERPs.
I just checked your sitemap.xml file and it has only the https versions of the URLs. In the index, I saw that non-https versions of the URLs are also listed, so there is no consistency: you have decided to serve the entire site over https, yet parts of it are still non-https. Also, serving pages over https puts an overhead on your server, which might result in poor page loading times; if you have good resources on the server side, this should not be a problem.
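To illustrate the consistency check, a minimal Python sketch that lists any non-https URLs in a sitemap (the sitemap path below is a guess; substitute the real one):

```python
# Minimal sketch: fetch a sitemap.xml and flag any <loc> URL
# that is not served over https.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def non_https_urls(sitemap_url):
    with urllib.request.urlopen(sitemap_url, timeout=10) as resp:
        tree = ET.parse(resp)
    locs = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
    return [u for u in locs if not u.startswith("https://")]

# Hypothetical sitemap location; adjust to the actual file.
for url in non_https_urls("https://www.peprismine.com/sitemap.xml"):
    print("non-https URL in sitemap:", url)
```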
Though the folks at Google say they don't care whether URLs are https or http when it comes to ranking, I would point out that since site loading time is an official ranking factor, when Google comes across two similarly capable and eligible pages competing for the same keyword, the one with the better loading time will be favored. By the way, can you let me know the reason behind serving the entire site over https?
Your link profile is not at all consistent: you have built links to both http://www.peprismine.com and https://www.peprismine.com.
Please be aware that although http://www.peprismine.com takes you to https://www.peprismine.com, it does not return an HTTP 301 status; it returns a 200. This should be fixed immediately. If you get this fixed, I think you should be fine technically, but be careful with pages served over SSL, as this sometimes hurts page loading times; you might want to look into it. Don't blindly go by page speed test scores; instead, look at the actual page loading times. You can run a test here: http://www.urivalet.com and also perform a test at webpagetest.org and check out the performance review section.
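As a rough illustration of checking actual load times rather than scores, a small Python sketch that times full-page downloads (this measures only server response and transfer, not browser rendering, which tools like webpagetest.org also capture):

```python
# Rough sketch: time several full downloads of a page and report
# the best and average wall-clock durations.
import time
import urllib.request

def fetch_times(url, runs=3):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()  # pull the entire response body
        timings.append(time.perf_counter() - start)
    return min(timings), sum(timings) / len(timings)

best, avg = fetch_times("https://www.peprismine.com/")
print(f"best: {best:.2f}s, average: {avg:.2f}s")
```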
Best regards,
Devanur Rafi.
-
Hi Devanur,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know what you think.
-
Hi Moosa,
Thanks for the reply. I have posted a query below (in continuation of my previous query). It would be good if you could let me know what you think.
-
Hi Moosa & Devanur,
Thanks for your responses. However, I would like some more information on my 1st query.
After making the necessary changes to our web pages, how long will it usually take to rank for a particular keyword or keywords (assuming we have optimized these pages as required)? I read on some websites that it takes a minimum of 1 month after indexing is done. Is this really true, or a myth? What have been your experiences?
P.S.: I'm unable to see my URL for any of my keywords yet (not even on the last page).
Regards,
PepMozBot
-
Hi there,
Straight into the meat:
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
A. Once the pages are in the index, they become eligible to appear in the SERPs; but where they appear, on which page, and in which position depends on a lot of factors: the competition for the search term, your content, the backlinks that you have, and the list goes on.
2. Does Google give more preference to websites with more pages than to those with fewer pages when displaying results in the SERPs? (I have just 50 pages.) Does the NUMBER of pages really matter?
A. To a small extent, and in some cases, yes; but this again depends on the quality of the content on a website (in terms of relevance, uniqueness, originality, etc.), the quality of its link popularity, and all the other 200+ factors that Google considers before positioning a website in the SERPs. To put it straight: you do not need to worry about the number of pages if your content is of pristine quality and highly relevant as per Google.
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
A. If the URLs being removed had duplicate content, then their removal will not have any negative effect.
Over time, gradually and on an as-needed basis, keep adding pages that target one search term per page with relevant, unique, and up-to-date content. This will result in a positive change in your organic traffic numbers. And, very importantly, do not build links desperately from all over the place. Earn links, that is what I would say; you have to earn links by giving visitors a reason to visit your website.
1. Try to earn links from authority sites in your niche. Links like this fall in the tier 1 category.
2. Get links from generic authority websites (like Wikipedia) by posting quality content. This would be your tier 2.
3. Get links from websites with a similar theme (sites that operate in your niche). These links can be your tier 3.
4. Finally, earn links from generic web properties like forums, blogs, social networking sites, social bookmarking sites etc. These would be your tier 4 links.
A very important thing to keep in mind while doing the above is the quality of the content being posted. Be specific and try to address an issue or provide a solution in your posts. Never engage in low-quality link exchanges or bulk link building. Above all, keep asking yourself: "Why should anyone visit my website?", "What can I do to make a visitor's visit to my website worthwhile?" and "What should I do to make my website give a better user experience, or a better advantage, than my competitors'?"
With questions like these, you will be able to secure a good, enduring position in the SERPs for your website.
Also, be an active participant on social sites to attract good social buzz. Social signals can give a real boost to your search engine optimization efforts.
Wish you good luck.
Best regards,
Devanur Rafi.
-
OK, when you say the URLs are indexed, this simply means they can appear in the SERPs; you can type your exact URL into the Google search bar to see whether the page is appearing or not. Ranking for a keyword is a completely different topic; it depends on much more than indexing alone.
It's good to have more pages, but if those extra pages are not producing any value and are dragging down the overall value of your website, then you should prefer fewer pages with more value.
Removal or change of a URL can have an impact on rankings. For instance, if one of your URLs is ranking on the first page for some “XYZ” keyword and you change or remove that URL, it is obviously going to lose its rankings.
It is always recommended to add a 301 redirect from the old URL to the new one when changing or removing a URL.
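To illustrate, a small Python sketch (the URL pair below is hypothetical) that verifies each old URL answers with a 301 pointing at its intended new location:

```python
# Sketch: for each (old, new) pair from a URL migration, confirm the
# old URL's first response is a 301 whose Location is the new URL.
import http.client
from urllib.parse import urlparse

def first_hop(url):
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

# Hypothetical mapping; replace with your own removed/changed URLs.
pairs = [("http://www.example.com/old-page",
          "https://www.example.com/new-page")]
for old, new in pairs:
    status, location = first_hop(old)
    ok = (status == 301 and location == new)
    print(old, "->", status, location, "OK" if ok else "CHECK")
```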
Hope this helps...