Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Guys & Gals, anyone know if urllist.txt is still used?
-
I'm using a tool which generates urllist.txt and looking on the SEO Forums it seems that Yahoo used to use this. What I'd like to know is is it still used anywhere and should we have it on the site?
-
Thanks for the advice, but we already create and submit the XML sitemap to Google, so that wasn't the question. Would there be any benefit in also creating the urllist.txt file?
-
I would just use a sitemap.xml file instead for Google, Bing and Yahoo. You can then submit the sitemap.xml file within Google Webmaster Tools and Bing Webmaster Tools (which includes Yahoo). You can easily create an XML sitemap at http://www.xml-sitemaps.com/
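For reference, urllist.txt was simply a plain text file with one URL per line, which Yahoo Site Explorer accepted before it was folded into Bing Webmaster Tools, so an XML sitemap now covers all three engines on its own. A minimal sitemap.xml looks something like this (a sketch; the example.com URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is the only required child element -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```

Keeping a urllist.txt on the server alongside the sitemap does no harm, but the major engines have long since standardised on the sitemap protocol.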
Related Questions
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget? My understanding is that by doing so, search engines can avoid spending resources on indexing redundant variations of the same content.

Additionally, I would welcome any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. If you require any further information or clarification, please let me know. Cheers!
Technical SEO | | williamhuynh
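For context on the question above, the two mechanisms being weighed behave differently: a robots.txt Disallow stops the pages from being crawled at all (so the canonical tag on them is never seen), while a noindex directive still requires crawling but keeps the pages out of the index. A rough sketch of each, using hypothetical patterns based on the URLs in the question:

```text
# robots.txt option – stops crawling of the filtered variants
# (pattern is illustrative; Google supports * wildcards in paths)
User-agent: *
Disallow: /collections/*/quick-ship+

# noindex option – added to the <head> of each filtered page,
# which must remain crawlable for the directive to be seen
<meta name="robots" content="noindex, follow">
```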
Does anyone know if the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Anyone got any experience please?
For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | | Mediaholix
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain that were indexed in Google. We have now disabled those URLs (they return a 404; for other reasons we cannot redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot set up Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
Technical SEO | | zeepartner
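A sketch of the robots.txt the poster above describes, served at the root of the www1 subdomain (the hostname is hypothetical). Worth noting: a blanket Disallow also stops Googlebot from recrawling those URLs, so it never sees the new 404 responses, which can keep already-indexed pages hanging around in the index for a while.

```text
# https://www1.example.com/robots.txt  (hypothetical hostname)
# Applies only to this host, not to the main www domain
User-agent: *
Disallow: /
```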
Shopping Carts & Sub Domains
I was hoping someone could guide me in making the correct decision regarding integrating my existing domain with a hosted shopping cart. I have an existing website to promote my bricks and mortar retail operation and am expanding into web retailing. I will be using one of the major hosted shopping carts.

What is the best way to join the two components from an SEO perspective? Have the cart as a subdomain of my main site, or move my existing domain name to be hosted by the cart provider and have both components operate under the same general domain?

I have read arguments that putting your cart within a subdomain is not a good idea because any clout of the pre-existing domain will not be shared with the subdomain; they will be treated as two separate sites. I have also read that using a subdomain is a good idea, since the content focus of the main domain (marketing and blogs) is different from the focus of the subdomain (product sales), and the two components would benefit from earning their own rankings undiluted by each other. And I have also read that search engines are getting good at deducing that an eCommerce subdomain is a legitimate extension of a content-intensive main domain, and that they treat the two components as a combined whole.

What is the truth? Which is the better way to go? Any guidance would be appreciated.
Technical SEO | | MEI152
Does anyone use Pingler, and is it any good?
Hi, I have joined Pingler and pay per month to use it, but I have not seen any difference in traffic or Google rankings. I would like to know if anyone else is using the paid version of pingler.com and whether they find it a good service.
Technical SEO | | ClaireH-184886
Using a non-visible H1
I have a developer that wants to use style="text-indent:-9999px" to make the H1 non-visible to the user. Being the conservative person I am, I've never tried this before and worry that Search Engines may think this is a form of cloaking. Am I worrying about nothing? And apologies if it's already been covered here. I couldn't find it. Thanks in advance!!!!
Technical SEO | | elytical
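For reference, the pattern the developer in the question above is proposing looks like this (a sketch; the heading text is made up). The heading stays in the markup for crawlers but is rendered 9999px off-screen, so users never see it, which is why it raises the hidden-text/cloaking worry mentioned in the question.

```html
<!-- Sketch of the proposed technique; heading text is hypothetical -->
<h1 style="text-indent: -9999px;">Example Widgets Online Store</h1>
```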
Does anyone use BuzzFeed to create links, gain traffic and increase brand awareness?
Hi, I would like to know if anyone uses http://www.buzzfeed.com to create links, gain traffic and increase brand awareness. I have signed up for an account but cannot get it to work and would like some help. I can get my content on there but cannot manage to get the links to work. I signed up for this account a while back and a friend showed me how to use it, but I have forgotten. Here is my page: http://www.buzzfeed.com/lifestylemagazine. Some links work and some links do not. What I am trying to do is publish stories from my site as well as other sites and have the link included, so that when you press the title it goes to the site. Any help would be great.
Technical SEO | | ClaireH-184886
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU), but what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number (?v=1.1... 1.2... 1.3... etc.), and the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | | AndreVanKets
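For reference, the blanket block being weighed up in the question above is a two-line robots.txt rule (sketch below, using the /js/ path from the question). Note that it keeps Googlebot out of everything under /js/, current versions included, not just the stale ?v= URLs, which is exactly the trade-off against the advice in the Matt Cutts video.

```text
# robots.txt at the site root – the rule the question is considering
User-agent: *
Disallow: /js/
```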