Guys & gals, does anyone know if urllist.txt is still used?
-
I'm using a tool which generates urllist.txt, and from the SEO forums it seems that Yahoo used to use this file. What I'd like to know is whether it's still used anywhere, and whether we should have it on the site.
-
Thanks for the advice, we already create and submit the XML sitemap to Google, that wasn't the question. Would there be any benefit in creating the urllist.txt file?
-
I would just use a sitemap.xml file instead for Google, Bing and Yahoo. You can then submit the sitemap.xml file within Google Webmaster Tools and Bing Webmaster Tools (which includes Yahoo). You can easily create an XML sitemap at http://www.xml-sitemaps.com/
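For context, urllist.txt (the format the old Yahoo Site Explorer consumed) was just a plain text file with one URL per line, while sitemap.xml wraps the same URL list in the sitemaps.org XML format. A minimal sketch of generating both from one list — the domain and paths here are made up for illustration:

```python
# Hypothetical URL list for illustration.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/contact",
]

# urllist.txt: the old Yahoo format -- simply one URL per line.
urllist_txt = "\n".join(urls) + "\n"

# sitemap.xml: the same URLs in the sitemaps.org XML format.
entries = "".join(f"  <url><loc>{u}</loc></url>\n" for u in urls)
sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}"
    "</urlset>\n"
)

print(urllist_txt)
print(sitemap_xml)
```

Since every major engine reads sitemap.xml and, as far as I know, none still fetch urllist.txt, the XML file alone covers you; keeping urllist.txt around shouldn't hurt anything, it's just dead weight.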
Related Questions
-
Issue with GA tracking and Native AMP
Hi everyone, We recently pushed a new version of our site (winefolly.com), which is completely AMP native on WordPress (using the official AMP for WordPress plugin). As part of the update, we also switched over to HTTPS. In hindsight, we probably should have pushed the AMP and HTTPS changes as separate updates.
As a result of the update, traffic in GA has dropped significantly despite the tracking code being added properly. I'm also having a hard time getting the previous views in GA working properly. The three views are: Sitewide (shop.winefolly.com and winefolly.com), Content only (winefolly.com), and Shop only (shop.winefolly.com).
The sitewide view seems to be working, though it's hard to know for sure, as the traffic seems pretty low (around 10 users at any given time) and I suspect it's mostly just picking up the shop traffic. The content-only view shows maybe one or two users and often none at all. I tried a bunch of different filters to track only the main site's content views, but in one instance the filter would work, then half an hour later it would revert to showing no traffic. The filter is set to custom > exclude > request URI with the following regex pattern:
^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.
Testing the filter, it strips out anything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there is a delay in filters being applied and that only a subset of the available data is used, but I just want to be sure I'm adding the filters correctly. I also tried setting the filter to predefined > exclude > hostname equal to shop.winefolly.com, but that didn't work either. The shop view seems to be working, but its tracking code is added via Shopify, so it makes sense that it would continue working as before.
The first thing I noticed when I checked the views is that they were still set to http, so I updated the URLs to https. I then checked the GA tracking code, which is added as a JSON object in the Analytics setting of the WordPress plugin. Unfortunately, while GA seems to be recording traffic, none of the GA validators seem to pick up the AMP tracking code (added using the amp-analytics tag), despite the JSON being confirmed as valid by the plugin. This morning I decided to try a different approach and add the tracking code via Google Tag Manager, as well as adding the new HTTPS domain to Google Search Console, but alas, no change. I spent the whole day yesterday reading every post I could find on the topic but was not able to find a solution, so I'm really hoping someone on Moz will be able to shed some light on what I'm doing wrong. Any suggestions or input would be very much appreciated. Cheers,
Technical SEO | winefolly
Chris (on behalf of WineFolly.com)
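One thing worth sanity-checking in a question like this is the filter regex itself, locally, before saving it in GA. A rough Python sketch using the exclude pattern from the question verbatim (assuming GA applies it as an unanchored search against the selected field; note that a Request URI filter sees only the path plus query string, so hostname alternatives like ^shop.winefolly.com$ can never match there — hostnames need a Hostname filter):

```python
import re

# The GA exclude pattern from the question, copied verbatim.
pattern = re.compile(
    r"^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|"
    r"/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|"
    r"/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/."
)

def excluded(request_uri: str) -> bool:
    """True if the exclude filter would drop a hit with this request URI."""
    return pattern.search(request_uri) is not None

print(excluded("/products/wine-poster"))  # True  -- shop path, dropped as intended
print(excluded("/wine-basics-guide"))     # False -- content page, kept
# Pitfall: in "/?u=." the "?" is a regex quantifier (optional slash), not a
# literal "?", so any URI containing "u=" plus one more character is excluded:
print(excluded("/search?qu=wine"))        # True  -- probably not intended
```

The unescaped dots and question marks in the original pattern are regex metacharacters, so the filter excludes more than the literal strings suggest — escaping them (`\.`, `\?`) would make it match only what's written.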
Using rel=canonical
I have a set of static pages which were created to target long-tail keywords. That has resulted in some Domain Authority dilution. I am now in the process of creating one page which will serve the same results, but only after the user selects the fields in a drop-down. I am planning to use rel=canonical on the multiple pages, pointing back to the new page. Will it serve the purpose?
Technical SEO | glitterbug
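For reference, rel=canonical is just a link element in the head of each long-tail page pointing at the consolidated page. A small sketch of the tag each old page would carry — the URLs here are hypothetical:

```python
# Hypothetical URLs: many long-tail landing pages consolidated into one.
canonical_target = "https://www.example.com/search-tool"
longtail_pages = [
    "https://www.example.com/red-wine-gifts-under-50",
    "https://www.example.com/red-wine-gifts-for-dad",
]

def canonical_tag(target: str) -> str:
    """The link element each long-tail page should carry in its <head>."""
    return f'<link rel="canonical" href="{target}" />'

for page in longtail_pages:
    print(page, "->", canonical_tag(canonical_target))
```

One caveat worth keeping in mind: search engines treat rel=canonical as a hint, not a directive, and are more likely to honor it when the canonicalized pages are near-duplicates of the target.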
Using both .co.uk and .com
Hello, a client has launched a website on both the .com and the .co.uk domains. The content is identical. I understand that you should add rel="alternate" hreflang="x" to the code. However, will there be a problem with the identical content? It would be hard to localise the content to one country. I understand why the client has got both domains, particularly the UK one, but the actual content is not specific to one country; it is really written for English-speaking customers. Also, what about links? In this case, do you need to build two sets of links to make them both rank? Thanks for any help.
Technical SEO | AL123al
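The hreflang annotation mentioned above is a set of link elements that every page carries, each version pointing at itself and its alternates. A sketch generating them for one path — the domains follow the client's setup, but the exact language codes are my assumption (en-GB for the .co.uk site, generic en plus an x-default for the .com):

```python
# Assumed mapping: .co.uk -> en-GB audience, .com -> generic English.
versions = {
    "en-GB": "https://www.example.co.uk",
    "en": "https://www.example.com",
}

def hreflang_tags(path: str) -> list[str]:
    """All tags, to be placed in the <head> of BOTH versions of `path`."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{domain}{path}" />'
        for lang, domain in versions.items()
    ]
    # x-default tells search engines which version to serve everyone else.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{versions["en"]}{path}" />'
    )
    return tags

for tag in hreflang_tags("/pricing"):
    print(tag)
```

Done consistently on both sites, this also addresses the duplicate-content worry: the engines treat the two URLs as alternates of one document rather than competing duplicates.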
2 sites using 1 CMS... issues?
Hi, We are working with a client that has two sites in the same sector. They are currently on separate servers, with separate blogs, image galleries etc. Combined, both sites rank for over 200 terms. If we were to combine the sites on one CMS, with one IP, two separate front ends, one blog stream, and one image gallery, what do you think the SEO impact would be? We had an issue with another client whose sites were too similar, and we had to separate them in order to get them both to rank. Further to this, we want both sites to now have their own HTTPS certificate; however, this wouldn't be possible if combined. Interested to hear thoughts on this. Thanks
Technical SEO | lauratagdigital
Sitemap & noindex inconsistency?
Hey Moz Community! On the CMS in question, the sitemap and robots file are locked down and can't be edited or modified whatsoever. If I noindex a page but it is still in the XML sitemap, will it get indexed? Thoughts, comments and experience greatly appreciated and welcome.
Technical SEO | paul-bold
Use of Multiple Tags
Hi, I have been monitoring some of the authority sites and I noticed something with one of them. This high-authority site suddenly started using multiple tags for each post. And I mean loads of tags, not just three or four. I see that each post comes with at least 10-20 tags, and these tags don't always make sense either. Let's say there is a video for "Bourne Legacy"; they list tags like bourne, bourne legacy, bourne series, bourne videos, videos, crime movies, movies, crime etc. They don't even seem to care about duplicate content issues. Let's say the movie is named The Dragon: they would include dragon and the-dragon in the tags list, and despite those two category pages (/dragon and /the-dragon) being exactly the same, they still wouldn't mind listing both tags underneath the article. And no, they don't use a canonical tag. (There isn't even a canonical meta on any page of that site.) So I am curious. Do they just know they have a very high DA, so they don't need to worry about duplicate content issues? Or am I missing something here? Maybe the extra tags are doing more good than harm?
Technical SEO | Gamer07
Microsite & Duplicate Content Concern
I have a client that wants to put up a micro-site. It's not really even a niche micro-site; it's his whole site less a category and a few other pages. He is a plastic surgeon that offers cosmetic surgery services for the Face, Breast, and Body at his private practice in City A. He has partnered with another surgeon in City B whose surgical services are limited to only the Face. City B is nearby, but not so close that they consider themselves competitors for facial surgery. The doctors' agreement is that my client will perform only Breast and Body surgery at the City B location. He can market himself in City B (which he currently is not doing on his main site) but only for Breast and Body procedures, and is not to compete for facial surgery. Therefore, he needs this second site to not include content about facial surgery. My concern is duplicate content. His proposed plan: the micro-site will be on a different domain and C-block, and the content, location keywords and meta data will be completely rewritten to target City B. However, he wants to use the same theme as his main site: same source code, same HTML/CSS, same top-level navigation, same sub-navigation less the Face section, same images/graphics, same forms, etc. Is it okay to have the same exact site built on a different domain with rewritten copy (less a few pages), targeting the same base keywords with only a different location? The site is intended for a different user group in City B, but I'm concerned the search engines won't like this and will trigger the filters. I've read a bunch of duplicate content articles, including this post-Panda one by Dr. Pete. Great post, but it doesn't really answer this particular issue of duplicating code for a related site. Can anyone make a case for or against this? Thanks in advance!
Technical SEO | cmosnod
Does anyone know how to automatically record Google Cache dates?
I haven't heard of such a tool, but I would have thought it would be pretty useful for measuring changes etc. Does anyone know of one?
Technical SEO | CraigAddyman