Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
RSS feeds: What are the secrets to getting them, and the links inside them, indexed and counted for SEO purposes?
-
RSS feeds, at least on paper, should be a great way to build backlinks and boost rankings. They are also very seductive from a link-builder's point of view: free, easy to create, and they let you specify anchor text. There are even several SEO articles, and a few products, extolling the virtues of RSS for SEO purposes.
However, I hear anecdotally that they are extremely ineffective at getting their internal links indexed. And my success rate has been abysmal: perhaps 15% have ever been indexed, and so far I have never seen Google show an RSS feed as a source for a backlink. I have even thrown some token backlinks at RSS feeds to see if that helped in getting them indexed, but even that has a very low success rate.
I recently read a blog post saying that Google "hates RSS feeds" and "rarely spiders perhaps the first link or two." Yet there are many SEO advocates who claim that RSS feeds are a great untapped resource for SEO. I am rather befuddled.
Has anyone "cracked the code" on how to get them, and the links that they contain, indexed and helping rankings?
-
Actually, RSS feeds are also used as a defensive method of link building. Yoast makes a plugin for WordPress that everyone should use (if they use WordPress); one of its features is inserting text and links into your RSS feed.
Obnoxious scraper sites use RSS feeds to populate their websites; they do not monitor the content, it's all automated. By putting links and a citation in your RSS feed, you at least get a little benefit from their theft of your content.
Link Explorer shows Feedburner and a couple of other RSS aggregator sites as high-value referring sites.
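A rough sketch of that citation idea (not the Yoast plugin itself, just an illustration with placeholder names): when building each feed item, append a line with a link back to the original post, so any scraper that republishes the feed also republishes the link.
```python
# Sketch only: append a source citation to each item's description while
# building the feed. SITE_NAME, SITE_URL and the function name are placeholders.
from xml.sax.saxutils import escape

SITE_NAME = "Example Site"
SITE_URL = "https://www.example.com"

def item_description(summary_html: str, post_url: str, post_title: str) -> str:
    citation = (
        f'<p>The post <a href="{post_url}">{escape(post_title)}</a> '
        f'appeared first on <a href="{SITE_URL}">{SITE_NAME}</a>.</p>'
    )
    # The description holds HTML, so wrap it in CDATA or entity-escape it
    # when it is written into the feed XML.
    return summary_html + citation
```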
-
Why would anyone need this service? I believe the original question was about RSS feeds from the site owner being indexed. RSS feeds should be submitted to Google Webmaster Tools to be indexed by Google, and Bing offers a similar service to webmasters. After the initial submission, the webmaster never has to submit again.
If I wanted to push my content using RSS feeds, I would use Ping.fm to push my content and links to third-party sites and social media.
I am at a loss as to why a webmaster would use the Linklicious site.
-
Really detailed overview. Nice job touching on everything.
-
If I understand the question correctly, you would like your content to be spread to other sites through RSS feeds and then be indexed there with a backlink to your site?
Number 1: there must be a reason for the other site to index and create a backlink to your site.
Number 2: these links are almost always nofollow and therefore need to reach a very high number to be of any real use to you if you want to affect the SERPs.
E.g.: you submit your site to several "ping" sites of your choosing that index certain content; then, when you publish a new story, these sites get pinged by your CMS and a nofollow backlink is created for you on that site.
Just make certain that the sites you ping actually have good content and fulfil a purpose for their visitors.
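For what it's worth, the ping itself is just a tiny XML-RPC call (the weblogUpdates protocol) that most CMSs fire automatically on publish. A rough sketch, with placeholder site details and one example aggregator endpoint:
```python
# Sketch of a weblogUpdates ping over XML-RPC; the endpoint and site details
# are placeholders - swap in whichever ping services you actually use.
import xmlrpc.client

PING_ENDPOINT = "http://rpc.pingomatic.com/"       # example aggregator
SITE_NAME = "My Shoe Blog"                         # hypothetical
SITE_URL = "https://www.mydomain.co.uk/"
FEED_URL = "https://www.mydomain.co.uk/feed.xml"

server = xmlrpc.client.ServerProxy(PING_ENDPOINT)

# Basic ping: site name and URL.
result = server.weblogUpdates.ping(SITE_NAME, SITE_URL)

# Many services also accept extendedPing, which includes the feed URL:
# result = server.weblogUpdates.extendedPing(SITE_NAME, SITE_URL, SITE_URL, FEED_URL)

print(result)  # usually a struct with 'flerror' and 'message' fields
```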
A better way, though, to keep control over the material is to create your own site running WordPress where you write about your primary site as a blog. Just put a news section in a sidebar and put your RSS feed in there. WordPress sites are indexed extremely fast, and when you own the site you can choose to use followed links in that section of the blog.
This should lead to faster indexing, and you create backlinks that serve a function; furthermore, you own the site linking to your primary site.
A short summary:
RSS feeds are good for spreading content and attracting visitors. They're not a quick way to get backlinks.
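A rough sketch of that sidebar news section, assuming the feedparser library and a placeholder feed URL; the point is simply that the links you render on a site you own can be plain, followed links:
```python
# Sketch: render a sidebar "news" block from the primary site's RSS feed,
# using plain (followed) links. Feed URL and item limit are placeholders.
import html

import feedparser  # third-party: pip install feedparser

PRIMARY_FEED = "https://www.example.com/feed.xml"
MAX_ITEMS = 5

def sidebar_news_html() -> str:
    feed = feedparser.parse(PRIMARY_FEED)
    links = [
        f'<li><a href="{entry.link}">{html.escape(entry.title)}</a></li>'
        for entry in feed.entries[:MAX_ITEMS]
    ]
    return '<ul class="news">' + "".join(links) + "</ul>"

print(sidebar_news_html())
```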
-
We use an RSS feed for new product lists. We may have some lag time before a new product gets put into a category and can be browsed to on our site. The RSS feed gives these new products a few days' head start getting into the search engines. We redirect all RSS links back to the main site links, which include canonical tags for the main product pages.
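Roughly how that kind of redirect could be wired up; a minimal sketch assuming a Flask app and a hypothetical slug-to-URL table (in practice the mapping would come from the product database):
```python
# Sketch: 301 the URLs used in the RSS items back to the canonical product
# pages. The route pattern and lookup table are hypothetical.
from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical mapping from the slug used in the feed link to the canonical
# product URL (the page that carries the rel="canonical" tag).
CANONICAL_URLS = {
    "blue-widget": "https://www.example.com/widgets/blue-widget",
}

@app.route("/rss/product/<slug>")
def rss_product(slug):
    canonical = CANONICAL_URLS.get(slug)
    if canonical is None:
        abort(404)
    # Permanent redirect so anything pointing at the feed URL is consolidated
    # on the canonical product page.
    return redirect(canonical, code=301)
```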
-
RSS should be designed primarily for your users and secondarily to syndicate parts of your content (headlines and URLs) out through RSS aggregators.
Be careful about how much of the article content you include within the RSS feed itself. Whilst it is good for the user to include the full article within the feed, by doing so you are also giving scrapers an easy time reproducing your content, and you might end up being penalised for duplicate content even though you are the original source (I've seen this happen).
I've used two techniques in the past. The first was to publish a short article stub containing a call to action to follow the link to the original article. I then switched to publishing the full content within the feed just for my users, but I am thinking about changing it again: publishing part of the content within the feed, with a call to action for the reader to visit my site for the full article, which will hopefully increase CTR on the feed whilst reducing the content duplication issue.
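In case it helps, the partial-content version is easy to script when the feed is generated; a rough sketch with an arbitrary word limit and hypothetical names:
```python
# Sketch: build a feed item body from a teaser plus a call-to-action link.
# The word limit and function name are arbitrary illustrations.
import html

TEASER_WORDS = 60

def feed_body(full_text: str, article_url: str) -> str:
    words = full_text.split()
    teaser = " ".join(words[:TEASER_WORDS])
    if len(words) > TEASER_WORDS:
        teaser += "..."
    call_to_action = (
        f'<p><a href="{article_url}">Read the full article on the site</a></p>'
    )
    return f"<p>{html.escape(teaser)}</p>{call_to_action}"
```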
-
The link-building power of RSS feeds is simply in getting other sites to feature and link to your content via RSS. There would be no utility for a bot to crawl your feed standalone; it would rather just look at the content itself. Try submitting your feed to RSS directories or having other webmasters feature your feed on their site. I believe several web 2.0 sites like Squidoo allow for feed publishing as well. Hope that helps.
-
Sorry, I'm a little confused as well. Why would you want people linking to your RSS feed instead of your original posts? Why would you even want the RSS feed to be indexed and returned in search results rather than the original posts? Wouldn't Google want to link people to the original post rather than the RSS feed? Aren't RSS feeds supposed to be a feed of content already on your site? I don't see why Google would have much incentive to spider them or return them in search results.
-
You load links into it, and it then creates an RSS feed on their end that gets pinged. You can load any kind of link into it and it'll ping them.
-
Thanks, but Linklicious turns links into RSS feeds; it doesn't help get the RSS feeds, or their internal links, indexed, as far as I know. Am I not understanding the service correctly?
-
This service works well; I've personally tested it: http://linklicious.me/
Try that or another pinging service; there are a ton of them out there.
Good luck!
Related Questions
-
Trying to get Google to stop indexing an old site!
Howdy, I have a small dilemma. We built a new site for a client, but the old site is still ranking/indexed and we can't seem to get rid of it. We set up a 301 from the old site to the new one, as we have done many times before, but even though the old site is no longer live and the hosting package has been cancelled, the old site is still indexed. (The new site is at a completely different host.) We never had access to the old site, so we weren't able to request URL removal through GSC. Any guidance on how to get rid of the old site would be much appreciated. BTW, it's been about 60 days since we took these steps. Thanks, Kirk
Intermediate & Advanced SEO | | kbates0 -
How do internal search results get indexed by Google?
Hi all, Most of the URLs that are created by using the internal search function of a website/web shop shouldn't be indexed, since they create duplicate content or waste crawl budget. The standard way to go is to 'noindex, follow' these pages or sometimes to use robots.txt to disallow crawling of these pages. The first question I have is how these pages would actually get indexed in the first place if you didn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL. There are no links leading to this URL, it is not in a sitemap, it can't be found through navigating on the website... so how can search engines index these URLs that were generated by using an internal search function? Second question: let's say somebody embeds a link on their website pointing to a URL from your website that was created by an internal search. Now let's assume you used robots.txt to make sure these URLs weren't indexed. This means Google won't even crawl those pages. Is it possible then that the link that was used on another website will show an empty page after a while, since Google doesn't even crawl this page? Thanks for your thoughts guys.
Intermediate & Advanced SEO | | Mat_C0 -
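On the internal-search question above, a minimal sketch of the 'noindex, follow' option, assuming a Flask app with a placeholder route; note this only works if the same URLs are not also blocked in robots.txt, since a blocked page's directives are never seen.
```python
# Sketch: mark internal search results as 'noindex, follow' via an HTTP
# header. The route and markup are placeholders.
import html

from flask import Flask, request

app = Flask(__name__)

@app.route("/search")
def search():
    query = request.args.get("q", "")
    body = f"<h1>Results for {html.escape(query)}</h1>"  # placeholder markup
    response = app.make_response(body)
    # Equivalent to <meta name="robots" content="noindex, follow"> but sent
    # as a header. Unlike a robots.txt disallow, the page can still be
    # crawled, so the directive is seen and its links can still be followed.
    response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```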
My url disappeared from Google but Search Console shows indexed. This url has been indexed for more than a year. Please help!
Super weird problem that I haven't been able to solve for the last 5 hours. One of my URLs, https://www.dcacar.com/lax-car-service.html, has been indexed for more than a year and also has an AMP version. A few hours ago I realized that it had disappeared from the SERPs. We were ranking on page 1 for several key terms. When I perform a search "site:dcacar.com", the URL is nowhere to be found on all 5 pages. But when I check my Google Search Console it shows as indexed. I requested indexing again but nothing changed. All other 50 or so URLs are not affected at all; this is the only URL that has gone missing. Can someone solve this mystery for me please? Thanks a lot in advance.
Intermediate & Advanced SEO | | Davit19850 -
If I nofollow outbound external links to minimize link juice loss > is it a good/bad thing?
OK, imagine you have a blog, and you want to make each blog post authoritative, so you link out to authoritative, relevant websites for reference. In this case it is two external links per blog post: one to an authority website for reference and one to Flickr for photo credit. And one internal link to another part of the website, like the buy-now page or a related internal blog post. Now tell me if this is a good or bad idea. What if you nofollow the external links and leave the internal link untouched so all internal links are dofollow? The thinking is that this minimizes loss of link juice from external links and keeps it flowing through internal links to pages within the website. Would it be a good idea to lay off the nofollow tag and leave all as dofollow? Or would this be a good way to link out to authority sites but keep the link juice internal? Your thoughts are welcome. Thanks.
Intermediate & Advanced SEO | | Rich_Coffman0 -
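On the nofollow question above, a rough sketch of the mechanics (assuming BeautifulSoup and a hypothetical hostname): add rel="nofollow" to external anchors only and leave internal links followed.
```python
# Sketch: nofollow only the external links in a post's HTML.
# MY_HOST and the sample markup are illustrative.
from urllib.parse import urlparse

from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

MY_HOST = "www.example.com"

def nofollow_external_links(post_html: str) -> str:
    soup = BeautifulSoup(post_html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc
        # Relative links and links to our own host stay followed;
        # everything else gets rel="nofollow".
        if host and host != MY_HOST:
            anchor["rel"] = ["nofollow"]
    return str(soup)

print(nofollow_external_links(
    '<p><a href="/buy-now">Buy now</a> or see the '
    '<a href="https://en.wikipedia.org/wiki/Widget">background</a>.</p>'
))
```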
Do internal links from non-indexed pages matter?
Hi everybody! Here's my question. After a site migration, a client has seen a big drop in rankings. We're trying to narrow down the issue. It seems that they have lost around 15,000 links following the switch, but these came from pages that were blocked in the robots.txt file. I was wondering if there was any research that has been done on the impact of internal links from no-indexed pages. Would be great to hear your thoughts! Sam
Intermediate & Advanced SEO | | Blink-SEO0 -
SEO Impact of High Volume Vertical and Horizontal Internal Linking
Hello Everyone - I maintain a site with over a million distinct pages of content. Each piece of content can be thought of like a node in a graph database, or an entity. While there is a bit of natural hierarchy, every single entity can be related to one or more other entities. The conceptual structure of the entities is like so:
Agency - a top-level business unit (~100 pages/URLs)
Office - a lower-level business unit, part of an Agency (~5,000 pages/URLs)
Person - someone who works in one or more Offices (~80,000 pages/URLs)
Project - a thing one or more People are managing (~750,000 pages/URLs)
Vendor - a company that is working on one or more Projects (~250,000 pages/URLs)
Category - a descriptive entity, defining one or more Projects (~1,000 pages/URLs)
Each of these six entities has a unique URL and content. For each page/URL, there are internal links to each of the related entity pages. For example, if a user is looking at a Project page/URL, there will be internal links to one or more Agencies, Offices, People, Vendors, and Categories. Also, a Project will have links to similar Projects. This same theory holds true for all other entities as well: People pages link to their related Agencies, Offices, Projects, Vendors, etc. If you start to do the math, there are tons of internal links leading to pages with tons of internal links leading to pages with tons of internal links. While our users enjoy the ability to navigate this world according to these relationships, I am curious if we should force a stricter hierarchy for SEO purposes. Essentially, does it make sense to "nofollow" all of the horizontal internal links for a given entity page/URL? For search-engine indexing purposes, we have legit sitemaps that give a simple vertical hierarchy... but I am curious if all of this internal linking should be hidden via nofollow? Thanks in advance!
Intermediate & Advanced SEO | | jhariani2 -
Link Juice + multiple links pointing to the same page
Scenario:
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example, we have 2 instances of the same link pointing to the same URL location.
We have 4 unique links.
In total we have 5 on-page links.
Question:
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact?
Any other advice or best practice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | | Mark_Ch0 -
Increasing Internal Links But Avoiding a Link Farm
I'm looking to create a page about Widgets and all of the more specific names for the Widgets we sell: ABC Brand Widgets, XYZ Brand Widgets, Big Widgets, Small Widgets, Green Widgets, Blue Widgets, etc. I'd like my Widget page to give a brief explanation of each kind of Widget, with a link deeper into my site that gives more detail and allows you to purchase. The problem is I have a lot of Widgets and this could get messy: ABC Green Widgets, Small XYZ Widgets, many combinations. I can see my Widget page teetering on being a link farm if I start throwing in all of these combos. So where should I stop? How much do I do? I've read that more than 100 links on a page can be considered a link farm; is that a hard-line number or a general guideline?
Intermediate & Advanced SEO | | rball10