Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content keeps changing automatically as new coupons are added and expired deals are removed. I want to create a sitemap, but I realised there is not much point in including all pages, as they will be removed sooner or later and/or are canonical.
I have about 8-9 static pages, and hence I can include those in the sitemap.
Now the question is....
If I create a sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages?
NOTE: I need the sitemap in order to get expanded sitelinks.
-
Hi Anuj -
I think you are operating from a very false assumption that is going to hurt your organic traffic (I suspect it has already).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. By leaving your new coupons out of the sitemap, you are withholding one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site, and therefore the processing time, you could regenerate it hourly, every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should set how frequently each URL is updated (the sitemap's changefreq field), and keep static URLs even for your coupons if possible. This will be a big win for you.
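To make the automation idea concrete, here is a minimal sketch of a sitemap generator using only the Python standard library. The page list is illustrative; in practice you would pull the URLs, last-modified times, and update frequencies from your coupon database.

```python
from datetime import datetime, timezone
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from (loc, lastmod, changefreq) tuples."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(root, "url")
        ET.SubElement(url, "loc").text = loc
        # W3C date format, as the sitemap protocol expects.
        ET.SubElement(url, "lastmod").text = lastmod.strftime("%Y-%m-%d")
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(root, encoding="unicode")

# Illustrative data -- replace with a query against your coupon store.
now = datetime.now(timezone.utc)
pages = [
    ("http://couponeasy.com/", now, "hourly"),
    ("http://couponeasy.com/about", now, "monthly"),
]
xml = build_sitemap(pages)
```

You would run something like this on a cron schedule (hourly, every 4 hours, etc.) and write the output to sitemap.xml at your web root.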
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
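The segmentation above boils down to a sitemap index file that points at the per-type sitemaps. A minimal sketch, continuing the standard-library approach (the file names mirror the examples in the answer):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index XML referencing child sitemaps."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for loc in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = loc
    return ET.tostring(root, encoding="unicode")

index_xml = build_sitemap_index([
    "http://couponeasy.com/static-pages-sitemap.xml",
    "http://couponeasy.com/type-1-sitemap.xml",
])
```

Submit the index as your main sitemap; search engine tools then report indexation per child sitemap, which is what makes the per-type monitoring possible.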
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop until they encounter something they cannot read or are told not to continue beyond a certain point. So your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to place all removed deals on an archive page and make them unavailable for purchase/collection. That sounds like a solution that would minimize future issues surrounding 404s, etc.
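The archive idea can be sketched as a tiny route handler. Everything here is illustrative: the in-memory `DEALS` dict stands in for whatever database the site actually uses.

```python
# Sketch: keep expired deal URLs resolvable instead of letting them 404.
# DEALS stands in for a real database lookup (illustrative data only).
DEALS = {
    "summer-sale": {"active": False, "title": "Summer Sale 20% Off"},
    "new-deal": {"active": True, "title": "New Deal"},
}

def deal_response(slug):
    """Return an (http_status, body) pair for a deal page."""
    deal = DEALS.get(slug)
    if deal is None:
        return 404, "Not found"
    if not deal["active"]:
        # Expired: serve an archived version, clearly marked and
        # unavailable for collection, rather than a 404.
        return 200, f"[EXPIRED] {deal['title']}"
    return 200, deal["title"]
```

The point of the design is that crawlers revisiting an expired coupon URL get a valid page (which you could also mark noindex or canonicalise) instead of accumulating 404 errors.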
Hope this helps!
Rob