XML Sitemap and unwanted URL parameters
-
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it would require time and effort I currently don't have. I also think that having a sitemap could help us on Bing.
So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong?
Thanks!
-
Our IT department is on a big project and we won't have any support for almost a year, which is why I was looking at other solutions.
We currently add about 10 to 20 pages a month, so I could probably redo the sitemap once a month, right after the new content is published.
-
Glad I could help
The only other issue I see with this is that your sitemap will get outdated quickly if a lot of content or pages are being added to your site. Additional work or development may be needed to create a dynamic sitemap that auto-updates alongside the website.
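As a rough stopgap while development resources are unavailable, even a small script run on a schedule can rebuild the sitemap. Below is a minimal, hypothetical Python sketch: the urls.txt and sitemap.xml file names are placeholders, and note that it stamps every URL with the generation date rather than a true last-modified date.

```python
# Minimal sitemap generator: reads one URL per line from urls.txt
# and writes sitemap.xml. File names are placeholders.
from datetime import date
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(url_file="urls.txt", out_file="sitemap.xml"):
    today = date.today().isoformat()  # generation date used as <lastmod>
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
        for u in urls
    )
    with open(out_file, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>\n')

if __name__ == "__main__":
    build_sitemap()
```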
-
Thanks, I really like your answer.
I should have thought about cleaning it in Excel. I will get right on it!
-
Hi Jean-Francois,
I would try to keep your sitemap as clean as possible, but you could export all the URLs into a CSV and clean them up with a formula. If you have a full list of your URLs in column A in Excel, use the following formula:
=IFERROR(LEFT(A1,FIND("?ref=",A1)-1),A1)
Put this formula into cell B1 and drag it down all the rows. FIND locates the start of the tracking parameter and LEFT keeps everything before it (dropping the "?" as well); the IFERROR wrapper passes through any URL that has no ref= parameter, where FIND would otherwise return a #VALUE! error. Note this assumes ref= is always the first query parameter; if it can follow another parameter, repeat the same approach with "&ref=". Then simply remove the duplicates and you have your list of URLs to create a clean sitemap.
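If you'd rather script the cleanup, here is a minimal, hypothetical Python sketch that does the same thing with only the standard library; unlike the formula above, it strips a ref= parameter wherever it appears in the query string and de-duplicates the result. The urls.txt input file is a placeholder name.

```python
# Strip a tracking parameter (here "ref") from a list of URLs and
# de-duplicate the result. "urls.txt" is a placeholder: one URL per line.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_param(url, param="ref"):
    parts = urlsplit(url)
    # Keep every query pair except the tracking parameter.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

with open("urls.txt") as f:
    cleaned = {strip_param(line.strip()) for line in f if line.strip()}

for url in sorted(cleaned):
    print(url)
```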
Let me know if this helps.
Related Questions
-
URL has caps, but canonical does not. Now what?
Hi, I just started working with a site that has the occasional URL containing capital letters, while the URL in its canonical tag is lower case. Neither, when entered in a browser, resolves to the other. It's a Shopify site. What do you think I should do?
Technical SEO | 94501
-
Include or exclude noindex URLs in sitemap?
We just added noindex tags to our pages with thin content. Should we include or exclude those URLs in our sitemap.xml file? I've read conflicting recommendations.
Technical SEO | vcj
-
Resubmit sitemaps on every change?
Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the XML sitemap. My question is: should we resubmit the sitemaps every time their content changes, or, since they were submitted once, can we assume the crawlers will re-download the sitemaps by themselves? (I don't like to assume.) What are best practices here? Thanks!
Technical SEO | yacpro13
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain, and the paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find anything that isn't tailored toward managing a single domain, or that can crawl a huge volume of content.
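Absent a ready-made tool, a bare-bones crawler along these lines can be scripted. The sketch below is hypothetical: it needs the third-party requests and beautifulsoup4 packages, crawls one domain per call, and records (but does not parse) non-HTML URLs such as .pdf and .doc files.

```python
# Minimal same-domain crawler that lists every URL it discovers.
# Hypothetical sketch; requires: pip install requests beautifulsoup4
from urllib.parse import urljoin, urldefrag, urlparse
import requests
from bs4 import BeautifulSoup

def crawl(start_url):
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        # Only parse HTML for further links; .pdf/.doc URLs are
        # recorded in `seen` but not parsed.
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link, _frag = urldefrag(urljoin(url, a["href"]))
            if urlparse(link).netloc == domain:
                queue.append(link)
    return sorted(seen)

for site in ["https://www.example.com/"]:  # placeholder: list all 5 domains
    for url in crawl(site):
        print(url)
```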
Technical SEO | timfrick
-
Sitemap indexed pages dropping
About a month ago I noticed the pages indexed from my sitemap are dropping. There are 134 pages in my sitemap and only 11 are indexed; it used to be 117 pages and it just died off quickly. I still seem to be getting consistent search traffic, but I'm just not sure what's causing this. There are no warnings or manual actions required in GWT that I can find.
Technical SEO | zenstorageunits
-
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much, if not most, of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI, both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
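For reference, the htaccess-based host-level 301 described above typically looks something like the following sketch; mod_rewrite is assumed, and the domain names are placeholders rather than the poster's actual sites.

```apache
# Hypothetical example: 301-redirect every request on the old
# hyphenated domain to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?my-old-site\.com$ [NC]
RewriteRule ^(.*)$ https://myoldsite.com/$1 [R=301,L]
```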
Technical SEO | kaje
-
Products with discrete URLs for each color
Here is the issue: I have an ecommerce site where, on a category page, each individual color of each product sold is shown, and there is a distinct URL for each color. Each product page shares the same content, with the only potentially differentiating factor being customer reviews (not nearly enough of these to differentiate anything). So we have URLs like: www.domain.com/product-green, www.domain.com/product-yellow, www.domain.com/product-red, and so on. I am looking for a way to consolidate these URLs while still showing all colors on the category page. The first solution I am considering is using the hash fragment: we would create www.domain.com/product#green, www.domain.com/product#yellow, www.domain.com/product#red, and, if possible, I would set the canonical tag as www.domain.com/product. The second solution would be to use the canonical tag and keep the URLs as is. The issue I see here is that we would need to create www.domain.com/product and show that page somewhere, since it would be the URL that the above color URLs canonicalize to. What would be the preferred solution? Or is there something else?
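For the second solution, the canonical tag on each color variant would look something like this hypothetical snippet (the URLs are the placeholders from the question):

```html
<!-- In the <head> of /product-green, /product-yellow, /product-red -->
<link rel="canonical" href="https://www.domain.com/product" />
```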
Technical SEO | rakesh_patel
-
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap indicating that every page in the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap generators I use for other client sites can be set to take the modification date from the server response or not. What is the best way to generate the sitemap: lastmod from the actual time modified, or all set at one date and time?
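One way to take lastmod from the actual time modified, at least for statically served pages, is to read each file's modification time rather than stamping a single generation date. A minimal, hypothetical Python sketch follows; the site root and domain are placeholders.

```python
# Hypothetical sketch: derive <lastmod> from each HTML file's mtime.
import os
from datetime import datetime, timezone

SITE_ROOT = "/var/www/site"           # placeholder path
BASE_URL = "https://www.example.com"  # placeholder domain

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for dirpath, _dirs, files in os.walk(SITE_ROOT):
    for name in files:
        if not name.endswith(".html"):
            continue
        path = os.path.join(dirpath, name)
        stamp = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
        rel = os.path.relpath(path, SITE_ROOT).replace(os.sep, "/")
        print(f"  <url><loc>{BASE_URL}/{rel}</loc>"
              f"<lastmod>{stamp.date().isoformat()}</lastmod></url>")
print("</urlset>")
```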
Technical SEO | ShaMenz