Tactic to get 2000+ pages indexed (fast)
-
Dear SEOmozzers,
Soon I'll be launching my new project: a website with 2,000+ pages, each with roughly 150 words (simple instructions, so I can't make them longer).
It is vital that every page gets indexed and appears in the SERPs.
Which tactic would you recommend:
- Put every page online at once (with a good sitemap), or
- Feed the Google sitemap with, say, 30 pages a day, so the crawler comes by every day and the site hopefully gets better indexation and rankings over time.
- Another tactic, or does it not matter?
Many thanks for your help.
Gr Menno
-
I echo what Ryan said 100%. Another suggestion, especially because it sounds like you're going to start with a whole bunch of info, is to add a blog. When you're building a site, especially one that has a whole bunch of info go live at once, the key is to stay focused on fresh content.
With my businesses' sites, I've found that pushing content all at once at launch gets me indexed, but doesn't necessarily get me the SERP position I want. I try to write two articles a week per website at a minimum. It keeps the crawlers coming back and increases my site-wide keyword density and my potential for catching long-tail searches.
-
Thanks for the advice. I think I'll go with it and redesign the structure to get more info onto one page, so I can also put more effort into unique articles (only around 700 then). That saves me time and makes my website better for SEO.
-
I'm with Ryan on this one. If you can use fewer pages with more information on them, then do so.
I'd also recommend reading up on the Panda update.
-
Without thoroughly understanding your niche and the products, services, and companies involved, it is very difficult to offer meaningful advice.
In brief, you can drop the "generic product" pages and instead make a single, rich page for Company A which offers all the details readers need.
You are welcome to operate your site however you see fit, but Google and Bing will operate their search results how they see fit, and they have determined the tactic you are using is not in the best interest of users.
If you feel compelled to present the site in the manner you described, you can add the canonical tag to all the Generic Product pages, indicating the Company A page as the primary page to be indexed.
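As a rough sketch of what that could look like (the domain and URLs here are hypothetical placeholders, not your actual site), each Generic Product page for Company A would carry a canonical link in its head pointing at the main Company A page:

  <!-- on a hypothetical page such as https://www.example.com/company-a/cancel-product-b -->
  <head>
    <title>Cancel product B at Company A</title>
    <!-- tells search engines the Company A page is the preferred version to index -->
    <link rel="canonical" href="https://www.example.com/company-a/" />
  </head>

The Generic Product pages stay available to visitors, but indexing and ranking signals are consolidated on the Company A page.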
-
I'll try to explain what my problem is, because what you're telling me is true; I found that out myself once too.
The problem is that every page NEEDS to be there, because the small differences in info are vital.
It's a website with info about how to cancel subscriptions. Most of the services offered are the same across all companies; only the address differs.
It's built up like this:
Company A - info page
Generic product a - cancellation address for Company A - info page
Generic product b - cancellation address for Company A - info page
Generic product c - cancellation address for Company A - info page
Company B - info page
Generic product a - cancellation address for Company B - info page
Generic product b - cancellation address for Company B - info page
Generic product c - cancellation address for Company B - info page
The difference in content is no more than 15%, but that 15% makes the difference and is vital. Any idea for a solution to this problem?
-
The second choice would be recommended.
It is common for site owners to publish more pages in an attempt to rank for more keywords. An example I can think of related to directions:
Article 1 - How to clear cache in Firefox 13
Article 2 - How to clear cache in Firefox 12
Article 3 - How to clear cache in Firefox 11
...and so forth. The directions are all the same but in an effort to target individual keywords the site owner generates numerous pages. Search engines view the pages as duplicate content.
Next, site owners attempt what you are suggesting: hiring writers to change a few words around to make each article appear unique. This tactic does not help improve the quality of your pages and therefore does not help users. It is simply an attempt to manipulate search engines. It often does not work. If it does work, it may stop working after a time as search engines get better at filtering such techniques.
The suggestion I would make is to forget search engines exist and write the clearest, best directions ever written. Offer images, details about things that might go wrong, etc.
-
Thanks for the list. I think everything is fine except the content issue you mentioned. I think I need a few good text writers to write 2,000 x 200 words of unique articles.
To tackle the unique content problem I have 2 solutions. Which one do you think is best?
- Publish the site with 75% possibly duplicate content, and then rewrite it over time.
- Publish only unique articles, and take the time needed for that?
Gr
-
Your site size really is not a factor in determining how quickly the site is indexed. A few steps you can take to achieve the goal of having all 2k pages indexed fast:
-
ensure your site's navigation is solid. All pages should be reachable within a maximum of 3 mouse clicks from the home page.
-
for the most part, your site should be HTML based. You can use JavaScript, Flash, and so forth, but the HTML support needs to be there as well. Try turning off JavaScript and Flash, then navigating your site.
-
for pages you do not wish to be indexed, add the "noindex" tag to them rather than blocking them in robots.txt when possible (a short example appears at the end of this reply).
-
review your sitemap to ensure it is solid. Ensure all 2k pages you want indexed are included in the sitemap. Also ensure there are no pages blocked by robots.txt or marked "noindex" in your sitemap (a minimal sitemap sketch also appears at the end of this reply).
-
review your content to ensure each page is unique. With only 150 words per page, there is a high likelihood many pages will be viewed as duplicate content and therefore not indexed.
-
review your site code (validator.w3.org) to ensure it is fairly clean. Some errors can impact a search engine's ability to crawl your site.
My biggest concern is the duplicate content point. If you simply change the title and a couple of keywords, the other pages will be viewed as duplicates and not indexed, or, even if they are indexed, they won't rank well.
I should also clarify that the above applies mostly to Google.com. Bing is much pickier about the pages it will index.
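To illustrate the "noindex" and sitemap points above with a minimal sketch (the URLs and dates are hypothetical placeholders): a page you do not want indexed would carry a robots meta tag in its head,

  <!-- in the <head> of a page you do not want in the index -->
  <meta name="robots" content="noindex" />

and the XML sitemap would list every page you do want indexed, and nothing that is blocked or marked "noindex":

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/company-a/</loc>
      <lastmod>2012-07-01</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/company-a/cancel-product-b</loc>
      <lastmod>2012-07-01</lastmod>
    </url>
    <!-- ...one <url> entry for each page you want indexed -->
  </urlset>

Listing a page in the sitemap while blocking it in robots.txt (or marking it "noindex") sends the crawler mixed signals, which is why the two should be kept consistent.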