Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Bulk reverse image search?
-
Hi, I have a couple of fashion clients who run very active blogs and post lots of fashion content and images - 50+ images weekly.
I want to check in bulk whether these images have been used by other sources. Are there any good reverse image search tools that can do this?
Or any recommended ways to do this efficiently for a large number of images?
Cheers
-
Hey there,
A great tool I use is TinEye Reverse Image Search (it doesn't work in bulk, though - you'll have to check images one by one).
Hope it helps. Cheers, Martin
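If one-by-one checks get tedious at 50+ images a week, the lookups can be scripted against a reverse-image-search API (TinEye sells API access, for example). The sketch below is only illustrative: the endpoint, parameter names, auth header, and response shape are all placeholders, so check your provider's current API documentation before relying on it.

```python
# Hypothetical sketch: batch reverse-image lookups over a list of image
# URLs. Endpoint, auth header, parameter names, and response shape are
# placeholders - consult your reverse-image-search provider's API docs.
import time
import requests

API_ENDPOINT = "https://api.example-reverse-search.com/search"  # placeholder
API_KEY = "your-api-key-here"                                   # placeholder

image_urls = [
    "https://client-blog.example.com/images/look-001.jpg",
    "https://client-blog.example.com/images/look-002.jpg",
    # ... the rest of the week's 50+ images
]

for url in image_urls:
    resp = requests.get(
        API_ENDPOINT,
        params={"image_url": url},
        headers={"x-api-key": API_KEY},
        timeout=30,
    )
    matches = resp.json().get("matches", [])  # assumed response shape
    print(f"{url}: {len(matches)} possible uses elsewhere")
    time.sleep(1)  # stay polite to the API's rate limits
```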
-
http://www.pixsy.com/
https://www.digimarc.com/products/guardian
Both provide solutions for copyright protection, so they should fit your needs.
Related Questions
-
Do bulk 301 redirects hurt SEO value?
We are working with a content-based startup that needs to 301 redirect a lot of its pages to other websites. I'll give you an example to help you understand. Assume this is the startup's domain and URL structure: www.ourcompany.com/brand1/article. What they want to do is 301 redirect www.ourcompany.com/brand1/ to www.brand1.com. I have never seen a 301 as a problem for SEO or link juice, but in this case, where all the major URLs are getting redirected to other sites, I was wondering if it would have a negative effect. Right now they have just 20-30 brands, but they are planning to hit a couple of hundred this year.
Intermediate & Advanced SEO | aaronfernandez
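As a concrete picture of the pattern described in this question, here is a minimal Flask sketch of per-brand 301s pointing at external domains. The framework and the brand map are assumptions for illustration only - not the startup's actual stack - and the sketch says nothing about the SEO impact being asked about.

```python
# Minimal sketch of per-brand 301 redirects to external domains.
# Framework choice, brand names, and destinations are hypothetical.
from flask import Flask, redirect

app = Flask(__name__)

# Map each /brandX/ URL prefix to its external destination.
BRAND_DOMAINS = {
    "brand1": "https://www.brand1.com",
    "brand2": "https://www.brand2.com",
}

@app.route("/<brand>/", defaults={"path": ""})
@app.route("/<brand>/<path:path>")
def brand_redirect(brand, path):
    target = BRAND_DOMAINS.get(brand)
    if target is None:
        return "Not found", 404
    # Permanent (301) redirect; preserve the deep path if one exists.
    return redirect(f"{target}/{path}" if path else target, code=301)
```
-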
Redirect old image that has backlinks
Hi Moz Community! I'm doing an audit of a website and did a backlink analysis. In the backlink analysis, there is an image that has 66 backlinks but the image doesn't exist on the website anymore (it was on a website that was created in 2011 - 2 web launches ago). I don't believe a 301 redirect will work for an image that doesn't exist anymore. How would I redirect the image URL (it's WordPress so we have a specific URL that other websites are linking to but get 404 errors) without going to each individual website and requesting they change the URL link? Any advice or recommendations would be great. Thanks!
Intermediate & Advanced SEO | BradChandler
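One note on the premise above: a 301 works for any URL, including an image whose file no longer exists - the redirect rule lives in the server config or a WordPress redirect plugin, not in the file itself. Once a rule is in place, a quick check like this confirms the backlinked URL no longer 404s (the URL below is a placeholder for the real image path):

```python
# Verify an old, backlinked image URL now returns a 301 rather than a
# 404. The URL is a placeholder for the real image path.
import requests

OLD_IMAGE_URL = "https://example.com/wp-content/uploads/2011/old-image.jpg"

resp = requests.get(OLD_IMAGE_URL, allow_redirects=False, timeout=10)
print(resp.status_code)               # expect 301 once the rule is live
print(resp.headers.get("Location"))   # where the 66 backlinks now point
```
-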
Website Snippet Update in Search Console?
I have a company that I started working with that has an outdated and inaccurate snippet coming up - see the link below. They changed their name from DK on Pittsburgh Sports to just DK Pittsburgh Sports several years ago, but the snippet still shows the old info, including an outdated and incorrect description. I'm not seeing that title or description anywhere on the site or in a schema plugin. How can we get it updated? I have updated titles, etc. for the home page and done a Fetch to get re-indexed. Does the snippet have a different type of refresh that I can submit or edit? Thanks in advance. https://g.co/kgs/qZAnAC
Intermediate & Advanced SEO | jeremyskillings
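When a stale snippet can't be traced, it helps to confirm what the page actually serves, since a cached template or plugin sometimes emits old strings that never show in the editor. A rough sketch (the URL is a placeholder, and the regexes assume conventional attribute order):

```python
# Rough check of the <title> and meta description a page actually
# serves. URL is a placeholder; regexes assume name= precedes content=.
import re
import requests

html = requests.get("https://www.example.com/", timeout=10).text

title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
desc = re.search(
    r'<meta[^>]+name=["\']description["\'][^>]*content=["\'](.*?)["\']',
    html, re.I | re.S,
)
print(title.group(1).strip() if title else "no <title> found")
print(desc.group(1) if desc else "no meta description found")
```
-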
Crawled page count in Search Console
Hi guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindex the filter pages; exclude those filter pages in Search Console and robots.txt; canonical the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high. I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses, and flavors) and three of them can be combined. This results in around 250 extra pages. Meta titles, descriptions, h1s, and texts are unique as well. Questions:
- Excluding pages in robots.txt should result in Google not crawling them, right?
- Is this number of crawled pages normal for a website with around 1,000 unique pages?
- What am I missing?
Intermediate & Advanced SEO | Bob_van_Biezen
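One caveat worth stating for the setup above: a robots.txt disallow stops crawling, which also stops Google from ever seeing the noindex or canonical on those pages, so the three measures can work against each other. To at least verify the disallow itself, urllib.robotparser can check sample URLs (the domain is from the question; the paths are invented examples):

```python
# Check whether sample filter-page URLs are disallowed for Googlebot.
# The domain is from the question; the paths are invented examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.premium-hookahs.nl/robots.txt")
rp.read()

for url in [
    "https://www.premium-hookahs.nl/shishas?color=red",  # filter page
    "https://www.premium-hookahs.nl/shishas",            # category page
]:
    print(url, "crawlable:", rp.can_fetch("Googlebot", url))
```
-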
How Many Images on 1 Page Are Acceptable
Example: I have a page with a slideshow of 35 pictures. They are all unique pictures, relevant to the page, and have unique alt text, though no captions or descriptions surround the images. The page also has a lot of unique written content. Question: is this large number of pictures potentially overwhelming for search engines? Might they think it is spammy, and would it be a safer bet to keep only the top 10 pictures on such a page? I did review this great Whiteboard Friday - http://moz.com/blog/image-seo-basics-whiteboard-friday - and I noticed this at the very end: "The other part, and I see this happen a lot especially with bigger clients, is when you put lots and lots of images on one page, like an image gallery, those pages tend to be very hard to get indexed. The reason for that is there's not a lot of unique textual content. A lot of times it's just overwhelming to users. It doesn't provide a lot of benefit in a search result." My page has been indexed, but could ranking potentially be hurt, and should I play it safe and reduce the number of pictures? I do understand the "do what is best for the user" scenario, and that is what I am doing with a lot of amazing original pictures not found on any other website. However, with search engines we obviously have to consider how they operate as well. Thank you
Intermediate & Advanced SEO | khi5
-
My website (non-adult) is not appearing in Google search results when I have SafeSearch settings on. How can I fix this?
Hi, I have this issue where my website does not appear in Google search results when I have the SafeSearch setting on. If I turn SafeSearch off, my site appears no problem. I'm guessing Google is categorizing my website as adult, which it definitely is not. Has anyone had this issue before? Or does anyone know how to resolve it? Any help would be much appreciated. Thanks
Intermediate & Advanced SEO | CupidTeam
-
Hosting images on multiple domains
I'm taking the following from http://developer.yahoo.com/performance/rules.html: "Splitting components allows you to maximize parallel downloads. Make sure you're using not more than 2-4 domains because of the DNS lookup penalty. For example, you can host your HTML and dynamic content on www.example.org and split static components between static1.example.org and static2.example.org." What I want to do is load page images (it's an eCommerce site) from multiple subdomains to reduce load times. I'm assuming this is perfectly OK to do - I cannot think of any reason why this wouldn't be a good tactic. Does anyone know of (or can anyone think of) a reason why taking this approach could be in any way detrimental? Cheers, mozzers.
Intermediate & Advanced SEO | eventurerob
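A key design point when splitting static assets this way is that each image must always map to the same subdomain; assigning hosts randomly per page view would defeat browser caching. A small sketch of deterministic sharding, using the example.org hostnames from the quoted Yahoo guidance:

```python
# Deterministically assign each image path to one static subdomain so
# the same image always resolves to the same host (cache-friendly).
import hashlib

SHARDS = ["static1.example.org", "static2.example.org"]

def shard_url(path: str) -> str:
    digest = hashlib.md5(path.encode("utf-8")).hexdigest()
    host = SHARDS[int(digest, 16) % len(SHARDS)]
    return f"https://{host}{path}"

print(shard_url("/images/products/blue-shirt.jpg"))
print(shard_url("/images/products/blue-shirt.jpg"))  # same host every time
```
-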
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
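On the specific question above: per Google's documented matching rules, Googlebot-Image falls back to the Googlebot group when it has no group of its own, so disallowing googlebot alone would block image crawling too; an explicit Googlebot-Image group is needed. urllib.robotparser is only a rough stand-in for Google's matcher, but it illustrates the effect:

```python
# Illustrate the grouping: an explicit Googlebot-Image group allows the
# photo folder while Googlebot stays blocked. (robotparser honors the
# first matching group, so the more specific group is listed first.)
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot-Image
Allow: /community/photos/

User-agent: Googlebot
Disallow: /community/photos/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

url = "https://domain.com/community/photos/user/picture111111.aspx"
print(rp.can_fetch("Googlebot", url))        # False - pages stay out
print(rp.can_fetch("Googlebot-Image", url))  # True - images still crawlable
```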