Can hotlinking images from multiple sites be bad for SEO?
-
Hi,
There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person.
I'm interested in whether hotlinking images from multiple sites can be bad for SEO.
The issue is that one of our bloggers has been hotlinking all the images he uses, sometimes 3 or 4 images per post from different domains.
We know that hotlinking is frowned upon, but can it affect us in the SERPs?
Thanks,
James
-
Sorry, hotlinking was the wrong word to use, we're actually just embedding the images.
Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images, and therefore uses it as an indicator of spam?
Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites?
For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise this. If there's a potential SEO impact, we'll sort it quickly; if not, we'll do more pressing things first.
-
Okay, so hotlinking is the wrong terminology to use. Do you think embedding images is taken into account by Google?
For example, would Google see spammy sites embedding lots of images, and therefore use it as an indicator of spam?
-
That's confused me too! Embedding an image from another site is hotlinking - an <a href> link doesn't have anything to do with it.
-
Excuse me, it's late in the day. Embedding is still referencing the site's image URL, right?
Also, what if the site changes its directory structure or something and all the images on your site now 404?
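To illustrate the distinction being debated here - a minimal sketch with hypothetical URLs. An <img> whose src points at another domain is hotlinked (embedded from their server), a self-hosted copy is served from your own domain, and a plain <a href> just links out without fetching anything:

```html
<!-- Hotlinked / embedded: the browser fetches the file from the other
     site's server on every page view (hypothetical URL). -->
<img src="https://other-blog.example.com/images/photo.jpg" alt="Photo">

<!-- Self-hosted: the same image uploaded (with permission) to your own server. -->
<img src="/images/photo.jpg" alt="Photo">

<!-- A plain link: points the reader to the source but doesn't pull
     their image into your page at all. -->
<a href="https://other-blog.example.com/original-post/">Image credit</a>
```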
-
Another thing to consider is that requesting images from multiple sites can create a lag in load times. Each extra host means another DNS lookup and connection for the browser to set up, and your page speed ends up depending on how quickly those third-party servers respond - something that's completely out of your control.
Hope this helps!
Dan
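If the third-party images have to stay in place while the clean-up is being prioritised, one partial mitigation - a hedged suggestion, not something raised in the thread, and the hostname below is hypothetical - is to give the browser resource hints so it sets up the extra connections early:

```html
<!-- Resource hints for an external image host (hypothetical hostname).
     preconnect opens the connection early; dns-prefetch is a cheaper
     fallback that only resolves the hostname in advance. -->
<link rel="preconnect" href="https://images.otherblog.example.com">
<link rel="dns-prefetch" href="https://images.otherblog.example.com">
```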
-
Sorry, I assumed you meant you were hotlinking images, rather than just embedding them. If you're just using <img> tags with no href defined (so just embedding, not hotlinking), then you're right - this won't cause a problem.
-
Create and host your own image or use a royalty-free image so you won't suffer from someone claiming copyright - that should be your biggest concern here.
-
Takeshi is right. Bandwidth can cost money, so there's that as well as the copyright theft. You could also fall victim to a 'switcheroo': http://www.deuceofclubs.com/switcheroo/index.html - I've done this myself before, swapping a hotlinked image for a polite message asking the other site not to hotlink.
Google doesn't include hotlinked images in Google News, so it is something they may take into account when ranking a page in their general search.
-
Surely that only works if it's an actual link, right? Simply using the <img> tag shouldn't be regarded as a link by Google?
-
You are definitely missing out on image traffic by not hosting your own images. Plus, hotlinking is poor netiquette, since you are using someone else's bandwidth without their permission. If the images are copyrighted, then you could be hit by DMCA requests, which can negatively impact your SEO.
-
Hi James
A lot of this will depend on the site you're linking to.
It's long been a part of the ranking algorithm that if you link to sites that are seen negatively by Google - due to spam, malware, etc. - then your site may be viewed negatively itself. Without knowing which sites your blogger has been pulling images from, it's hard to say, but it's worth running a check just in case.
Related Questions
-
Multiple CMS on one website / domain & SEO
For a client we would like to work with a content hub, but their website is built on a custom CMS, so we are limited in our options, and if we ask their web developers, they ask crazy prices to help us. So now we have the idea to build the content hub with WordPress and implement it next to their current CMS, for example on www.website.com/contenthub/. As far as I know this is technically possible and there are no negative effects regarding SEO as long as we link the two sitemaps together. Am I right, or am I missing something here?
Technical SEO | Siphoplait0
-
Query string parameters always bad for SEO?
I've recently put some query string parameters into links leading to a 'request a quote' form, which auto-fill the 'product' field with the name of the product on the referring product page. E.g. Red Bicycle product page >>> link to RFQ form contains '?productname=Red-Bicycle' >>> form's product field's default value becomes 'Red-Bicycle'. I know URL parameters can lead to keyword cannibalisation and duplicate content; we use sub-domains for our language changer. But for something like this, am I potentially damaging our SEO? Appreciate I've not explained this very well. We're using Kentico, by the way, so K# macros are a possibility (I use a simple one to fill the form's default field).
Technical SEO | landport0
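For the query-string question above, one common safeguard - offered as a hedged suggestion, with a hypothetical URL - is a canonical link element on the form page, so the parameterised variants consolidate to the one clean URL:

```html
<!-- Placed in the <head> of the 'request a quote' page (hypothetical URL):
     /request-a-quote/?productname=Red-Bicycle and other parameterised
     variants all point search engines back to the same clean page. -->
<link rel="canonical" href="https://www.example.com/request-a-quote/">
```
-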
Image Sitemap
I currently use a program to create our sitemap (XML). It doesn't offer creating an image sitemap. Can someone suggest a program that would create an image sitemap? Thanks.
Technical SEO | Kdruckenbrod0
-
Best Web-site Structure/ SEO Strategy for an online travel agency?
Dear Experts! I need your help pointing me in the right direction. So far I have found scattered tips around the Internet, but it's hard to make a full picture from all these bits and pieces of information without professional advice. My primary goal is to understand how I should build my online travel agency website's (https://qualistay.com) structure, so that I target my keywords on the correct pages and do not create duplicate content.

In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them - not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know if Google sees this as duplicate content). From what I have read so far, it's better to target archive pages rather than every single property. At the moment my archive pages are: all properties (includes all property types and locations), and a page for each location (includes all property types).

Does it make sense to add archive pages by property type in addition to, or instead of, the location ones if I, for instance, target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment, the title of the respective archive page, "Properties to rent in costa adeje: villas, apartments", in principle targets both keywords. Does using the same keyword in a single property listing cannibalise the ranking of the archive page it links back to? Or only if Google specifically identifies it as duplicate content (which one can see in Google Search Console under HTML Improvements) and/or the archive page has more incoming links than a single property?

If targeting only archive pages, how should I optimise them in such a way that they stay user-friendly? I have created (though not yet fully optimised) descriptions for each archive page just below the main header, but I have them partially hidden (collapsible) using JS in order to keep visitors' focus on the properties. I know that Google does not rank hidden content highly, at least at the moment, but since the new mobile-first algorithm is coming in the near future, they promise not to punish mobile sites for collapsible content and will use the mobile version to rate the desktop one. Does this mean I should not worry about hidden content any more, or should I move the description to the bottom of the page and make it fully visible?

Your feedback will be highly appreciated! Thank you! Dmitry
Technical SEO | qualistay1
-
One robots.txt file for multiple sites?
I have 2 sites hosted with Blue Host and was told to put the robots.txt in the root folder and just use the one robots.txt for both sites. Is this right? It seems wrong. I want to block certain things on one site. Thanks for the help, Rena
Technical SEO | renalynd270
-
URL Structure On Site - Currently it's domain/product-name NOT domain/category/product-name - is this bad?
I have an eCommerce site and the site structure is domain/product-name rather than domain/product-category/product-name. Do you think this will have a negative impact SEO-wise? I have seen that some of my individual product pages get better rankings than my categories.
Technical SEO | the-gate-films0
-
Staging site and "live" site have both been indexed by Google
While creating a site, we forgot to password-protect the staging site while it was being built. Now that the site has been moved to the new domain, it has come to my attention that both the staging site (site.staging.com) and the "live" site (site.com) are being indexed. What is the best way to solve this problem? I was thinking about adding a 301 redirect from the staging site to the live site via .htaccess. Any recommendations?
Technical SEO | melen0
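For the staging-site question above, alongside the 301 the asker proposes, one complementary option sometimes used - a hedged suggestion, and the set-up described here is hypothetical - is to make sure the staging templates carry a robots meta tag so the duplicate copies drop out of the index:

```html
<!-- Added to the <head> of every staging-site template (hypothetical setup):
     tells crawlers not to index the staging copy or follow its links. -->
<meta name="robots" content="noindex, nofollow">
```
-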
How should I structure a site with multiple addresses to optimize for local search??
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords, has the address properly marked up, is HTML5 and schema.org compliant, near the top of the page, etc. It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current set-up will wreak havoc with our local search results, which we quite frankly currently rock.

My questions: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders to keep Google from becoming confused? 2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match location-based URLs will make up for the duplicate-content hit and will overcome the difficulty of building a brand from multiple pages?

I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to really find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
Technical SEO | LMDNYC0
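For the local-search question above, the "address properly marked up" part usually looks something like the sketch below - a hedged example with made-up business details, using the kind of schema.org LocalBusiness markup the asker describes; each physical location would carry its own block on its own page once the new addresses exist:

```html
<!-- schema.org LocalBusiness microdata (illustrative, made-up details).
     Repeat one block per location, each on that location's own page. -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Computer Repair - Midtown</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St</span>,
    <span itemprop="addressLocality">New York</span>,
    <span itemprop="addressRegion">NY</span>
    <span itemprop="postalCode">10001</span>
  </div>
  <span itemprop="telephone">(212) 555-0100</span>
</div>
```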