Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is always changing automatically (new coupons are added and expired deals are removed). I wish to create a sitemap, but I realised there is not much point in including every page, as they will be removed sooner or later and/or are canonicalized to other URLs.
I have about 8-9 static pages, which I can include in a sitemap.
Now the question is....
If I create a sitemap for these 9 pages and submit it to Google Webmaster Tools, will Google's crawlers stop indexing the other pages?
NOTE: I need to create the sitemap to get expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. Therefore, by not putting your new coupons in the sitemap, you are denying the search engines one of the strongest possible signals that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site and the resulting processing time, you could regenerate it hourly, every 4 hours, or on a similar schedule. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify how frequently the URLs are updated (the <changefreq> element), and keep stable, static URLs even for your coupons if possible. This will be a big win for you.
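To make the regeneration idea concrete, here is a minimal sketch of an automated generator using Python's standard library. The URLs, dates, and change frequencies below are purely illustrative, not taken from the actual site; in practice the page list would come from your coupon database on each scheduled run:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string from (loc, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; in a real setup these are queried fresh each run.
pages = [
    ("http://couponeasy.com/", "2015-06-01", "hourly"),
    ("http://couponeasy.com/store/example-store", "2015-06-01", "daily"),
]
xml = build_sitemap(pages)
print(xml)
```

A cron job (or equivalent scheduler) running a script like this hourly and writing the output to your web root is usually all the "automation" a small site needs.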
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps referenced from your main sitemap.xml (a sitemap index) and segment them by type: static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
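A sitemap index along these lines would tie the segmented sitemaps together (the file names and dates here are just illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://couponeasy.com/static-pages-sitemap.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://couponeasy.com/type-1-sitemap.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

Google Webmaster Tools reports indexation counts per submitted sitemap, which is what makes this segmentation useful for monitoring.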
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop at the pages in your sitemap; they keep following links until they encounter something they cannot read or are explicitly told not to continue (for example, via robots.txt or a noindex tag). Submitting a sitemap with only your 9 static pages will not prevent the rest of your site from being crawled, and your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to move all expired deals to an archive page and mark them unavailable for purchase/collection. This sounds like a solution that would minimize future issues surrounding 404s, etc.
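As a rough sketch of that archiving idea (the record shape, helper name, and statuses are hypothetical, not from any particular framework): keep expired-deal URLs alive as archive pages, and reserve a "gone" status for deals you delete outright:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Coupon:
    slug: str
    expires: date

def response_for(coupon, today):
    """Decide how to serve a coupon URL so expired deals don't become 404s."""
    if coupon is None:
        # Deal deleted outright: 410 tells crawlers it is gone permanently.
        return 410, None
    if coupon.expires < today:
        # Expired: keep the URL alive, but render it as an archived deal.
        return 200, f"archive/{coupon.slug}"
    return 200, f"deal/{coupon.slug}"

status, template = response_for(Coupon("10-off-shoes", date(2015, 1, 1)), date(2015, 6, 1))
print(status, template)  # prints: 200 archive/10-off-shoes
```

The design choice here is that expired deals keep their URLs (and any links pointing at them), while only truly removed deals return an error status, so crawlers see far fewer dead ends.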
Hope this helps!
Rob