Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Can hotlinking images from multiple sites be bad for SEO?
-
Hi,
There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person.
I'm interested whether hotlinking images from multiple sites can be bad for SEO.
The issue is that one of our bloggers has been hotlinking all the images he uses - sometimes 3 or 4 images per blog post, each from a different domain.
We know that hotlinking is frowned upon, but can it affect us in the SERPs?
Thanks,
James
-
Sorry, hotlinking was the wrong word to use, we're actually just embedding the images.
Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images and therefore use it as an indicator of spam?
Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites?
For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise this. If there's a potential SEO impact, we'll sort it quickly; if not, then we'll do more pressing things first.
-
Okay, so hotlinking is the wrong terminology to use. Do you think embedding images is taken into account by Google?
For example, would Google see spammy sites embedding lots of images, and therefore use it as an indicator of spam?
-
That's confused me too! Embedding an image from another site is hotlinking. Whether there's an a href around it has nothing to do with it.
-
Excuse me, it's late in the day. Embedding still references the other site's image URL, right?
Also, what if that site changes its directory structure and all the images on your site suddenly 404?
-
Another thing to consider is that requesting images from multiple sites can slow your load times. Browsers download several files in parallel from a single host, but each additional host adds its own DNS lookup and connection setup before any downloads can start, so spreading images across many domains tends to make the page load slower.
Hope this helps!
Dan
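To see how many extra hosts a post is actually pulling images from, a quick audit script helps. This is a minimal sketch using only the Python standard library; the sample HTML and domain names are made-up examples, and in practice you'd feed it the fetched HTML of each blog post:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ImgHostCollector(HTMLParser):
    """Collects the hostnames referenced by <img src> attributes."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host:  # relative srcs (self-hosted images) have no netloc
                self.hosts.add(host)

def external_image_hosts(html, own_host):
    """Return the set of image hosts that are not our own domain."""
    collector = ImgHostCollector()
    collector.feed(html)
    return collector.hosts - {own_host}

sample = (
    '<img src="/images/own.png">'
    '<img src="http://cdn-a.example.com/pic1.jpg">'
    '<img src="http://cdn-b.example.net/pic2.jpg">'
)
print(sorted(external_image_hosts(sample, "www.example.org")))
# ['cdn-a.example.com', 'cdn-b.example.net']
```

Each distinct host in the output is an extra DNS lookup and connection on every page load, on top of the copyright and bandwidth concerns discussed above.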
-
Sorry, I assumed you meant you were hotlinking images, rather than just embedding them. If you're just using img tags with no href defined (so just embedding, not hotlinking), then you're right - this won't cause a problem.
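For clarity, here is the distinction the thread keeps circling around, as a markup sketch (the domain below is a made-up example). An img tag fetches the file from the other site's server on every page load, regardless of whether any link is present; an a href is just a hyperlink and uses no bandwidth until clicked:

```html
<!-- Embedding/hotlinking: the browser requests the file from the
     other site's server every time this page loads -->
<img src="http://other-site.example.com/photo.jpg" alt="Photo">

<!-- Linking: a plain hyperlink; the other server is only hit
     if a visitor clicks it -->
<a href="http://other-site.example.com/photo.jpg">View the photo</a>
```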
-
Create and host your own image, or use a royalty-free image, so you won't suffer from someone claiming copyright infringement - that should be your biggest concern here.
-
Takeshi is right. Bandwidth costs money, so there's that as well as the copyright problem. You could also fall victim to a 'switcheroo': http://www.deuceofclubs.com/switcheroo/index.html - I've done this myself, swapping a hotlinked image for a polite message asking the other site not to hotlink.
Google doesn't include hotlinked images in Google News, so it's something they may take into account when ranking a page in general search too.
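For anyone curious how sites pull off that kind of switcheroo: it's typically done with referer-based rules on the image host's server. A hedged sketch as Apache mod_rewrite rules in .htaccess (the domain and image paths are made-up examples, and exact syntax may vary by server setup):

```apache
RewriteEngine On
# Don't rewrite the replacement image itself, or we'd loop forever
RewriteCond %{REQUEST_URI} !/images/dont-hotlink\.png$
# Allow empty referers (direct requests, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests coming from our own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everyone else asking for an image gets the polite-message image instead
RewriteRule \.(jpe?g|png|gif)$ /images/dont-hotlink.png [R,L]
```

This is exactly why hotlinked images can change out from under you at any time: the serving site controls what your img tag displays.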
-
Surely that only works if it's an actual link, right? Simply using the img tag shouldn't be regarded as a link by Google?
-
You are definitely missing out on image search traffic by not hosting your own images. Plus, hotlinking is poor netiquette, since you are using someone else's bandwidth without their permission. And if the images are copyrighted, you could be hit with DMCA requests, which can negatively impact your SEO.
-
Hi James
A lot of this will depend on the sites you're pulling images from.
It's long been part of the ranking algorithm that if you link to sites Google views negatively, due to spam/malware/etc., then your own site may be viewed negatively too. Without knowing which sites your blogger has been sourcing images from, it's hard to say - but it's worth running a check just in case.
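That check can be partly automated. A minimal sketch of the idea: flag any image host that is on, or is a subdomain of, a blocklisted domain. The blocklist itself is an assumption here - in practice you'd populate it from a reputation service such as Google Safe Browsing rather than maintain it by hand:

```python
def flagged_hosts(image_hosts, blocklist):
    """Return hosts that match, or are subdomains of, a blocklisted domain."""
    flagged = set()
    for host in image_hosts:
        for bad in blocklist:
            # exact match, or a dot-separated subdomain of the bad domain
            if host == bad or host.endswith("." + bad):
                flagged.add(host)
    return flagged

hosts = {"cdn.good-site.com", "img.spammy-example.net"}
blocklist = {"spammy-example.net"}
print(flagged_hosts(hosts, blocklist))  # {'img.spammy-example.net'}
```

Any flagged host is a domain worth removing images from first when you prioritise the cleanup.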