Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Why have bots (including Googlebot) categorized my website as adult?
-
How do bots decide whether a website is adult? I run a gifting portal, yet strangely it has been categorized as 'Adult'.
Also, my Google AdSense application to run ads on the site was rejected - I suspect this is because Googlebot categorized the site as adult. There's a good chance other bots also treat it as an adult website rather than a gifting website.
Can anyone please go through the site and tell me why this is happening? Thanks in advance.
-
Thanks for your responses!
-
I think you have a good question, but you may be making too many assumptions. SimilarWeb has its own algorithm for placing sites in a given category (here is the list they use: http://www.similarweb.com/category). When I look at your site in SimilarWeb, there is an option for you to change the category, so I would suggest you do that (no algorithm is perfect).
Your assumption is that if SimilarWeb categorized you as "adult" then maybe Google did too, but that may not be the case. Here are a few references on what you should do when applying to AdSense, plus the AdSense guidelines:
http://allbloggingtips.com/2012/08/27/applying-for-google-adsense-program/
http://www.techrez.com/2013/09/how-to-get-approved-google-adsense-account.html
https://support.google.com/adsense/answer/48182?hl=en
The guidelines mention some other requirements besides avoiding adult content: you need a privacy policy and an About Us page. You have those links at the bottom of your page, but because many of your pages use infinite scrolling that keeps loading more products (http://www.wishpicker.com/gifts-for/boyfriend), I can't reach the bottom of the page to click on them.
It just seems like you may be focusing on the wrong thing, IMHO. Is there any other reason you can see why someone might categorize your site as adult? It did not look that way to me when I visited. I also ran your site through OSE, and among your top 100 linking domains I did not see any adult sites linking to you.
Good luck!
-
Have you checked Google Webmaster Tools for any notices about your site - for example, to make sure it hasn't been hacked and injected with porn terms?
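If you want to spot-check this yourself, here is a rough sketch of fetching a few pages with a normal browser user-agent and a Googlebot-style user-agent, then scanning for terms you would never expect on a gifting site. This is not a substitute for the Webmaster Tools security report, and the URLs, terms, and user-agent strings below are placeholders, not anything taken from the site in question.

```python
# Rough sketch only: fetch a few pages as a browser and as a crawler,
# then scan for injected spam terms. All URLs, terms, and user-agent
# strings are placeholders.
from urllib.request import Request, urlopen

PAGES = ["http://www.example.com/", "http://www.example.com/gifts-for/boyfriend"]
SPAM_TERMS = ["viagra", "casino", "porn"]  # words you would never expect on a gifting site

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in PAGES:
    for label, ua in USER_AGENTS.items():
        req = Request(url, headers={"User-Agent": ua})
        html = urlopen(req).read().decode("utf-8", "ignore").lower()
        hits = [term for term in SPAM_TERMS if term in html]
        print(f"{url} [{label}]: {'flagged: ' + ', '.join(hits) if hits else 'clean'}")
```

If the crawler view flags terms that the browser view does not, that points at hacked or cloaked content rather than a simple categorization mistake.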
For AdSense, what reason was given? Looking at a couple of your pages, it appears the site mainly redirects to other sites for people to buy from, likely with an affiliate ID or something similar. What value does your site add for users? In the content guidelines, Google says to create sites with unique and relevant content. https://support.google.com/adsense/answer/23921?hl=en
How many people are webmasters for this site? Could there be too many cooks in the kitchen? I've seen at least four different Moz accounts ask about this domain in the past year. You want to make sure you're all communicating with each other about the site (and if it's just one person, feel free to reach out to us for any account issues, so we can help you with using just one account and you can have the benefit of your account history over time).
Related Questions
-
Hiding ad code from bots
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing data, the company would like to prevent any bots from seeing any ads - and, of course, that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen, I'm wondering how big of a problem it could be if they do this. This change isn't meant to manipulate Googlebot's understanding of the page (ads don't affect rankings, etc.) and it will have only a minimal impact on the page overall. So, if they go down this road and hide ads from bots, I'm trying to determine how big a risk it could be. I found some old articles discussing this, with some suggesting it was a problem and others saying it might be okay in some cases (links below), but I couldn't find any recent articles on it. Has anybody seen anything new, or does anyone have a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads? https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY
White Hat / Black Hat SEO | Matthew_Edgar
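For what it's worth, here is a minimal sketch of the kind of conditional serving the question describes. The bot list, ad markup, and function names are all hypothetical - it is not a recommendation, it just makes the cloaking concern concrete: the same URL returns different HTML depending on who requests it.

```python
# Hypothetical sketch of hiding ad markup from crawlers by user-agent.
# Everything here (bot signatures, ad snippet, function names) is made up
# for illustration only.
BOT_SIGNATURES = ("googlebot", "bingbot", "adsbot")
AD_SNIPPET = '<div class="ad"><!-- ad network tag would go here --></div>'

def looks_like_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def render_page(body_html: str, user_agent: str) -> str:
    """Return page HTML, omitting the ad markup when the requester looks like a bot."""
    ad = "" if looks_like_bot(user_agent) else AD_SNIPPET
    return f"<html><body>{body_html}{ad}</body></html>"

# The same page differs depending on who asks for it - which is the cloaking risk.
print(render_page("<p>Article text</p>", "Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(render_page("<p>Article text</p>", "Mozilla/5.0 (Windows NT 10.0) Chrome/120"))
```
-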
Excluding Googlebot From AB Test - Acceptable Sample Size To Negate Cloaking Risk?
My company uses a proprietary AB testing platform. We are testing out an entirely new experience on our product pages, but it is not optimized for SEO. The testing framework will not show the challenger recipe to search bots. With that being said, to avoid any risks of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
White Hat / Black Hat SEO | edmundsseo
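One way to picture the setup being described - purely a sketch under assumed details (hash-based bucketing on a stable visitor id, and a made-up 5% figure that is not an answer to the sample-size question) - is deterministic assignment where search bots are always held out of the test:

```python
# Sketch only: deterministic A/B bucketing with search bots always held out.
# The 5% share is a placeholder, not a recommended sample size.
import hashlib

TEST_TRAFFIC_PCT = 5  # hypothetical share of human traffic sent to the challenger recipe

def is_search_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in ("googlebot", "bingbot", "duckduckbot"))

def assign_recipe(visitor_id: str, user_agent: str) -> str:
    if is_search_bot(user_agent):
        return "control"  # bots never see the challenger - which is exactly the cloaking worry
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 100
    return "challenger" if bucket < TEST_TRAFFIC_PCT else "control"

print(assign_recipe("visitor-123", "Mozilla/5.0 Chrome/120"))
print(assign_recipe("visitor-123", "Mozilla/5.0 (compatible; Googlebot/2.1)"))
```
-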
Advice needed! How to clear a website of a Wordpress Spam Link Injection Google penalty?
Hi guys, I am currently working on a website that has been penalised by Google for spam link injection. The website was hacked and 17,000 hidden links were injected. All the links have been removed and the site has subsequently been redesigned and rebuilt. That was the easy part 🙂 The problem comes when I look in Webmaster Tools: Google is showing thousands of internal spam links to the homepage and other pages within the site. These pages do not actually exist, as they were cleared along with all the other spam links, but I do believe this is causing problems with the website's rankings. Certain pages are not ranking on Google, and the homepage keyword rankings are fluctuating massively. I have reviewed the website's external links and these are all fine. Does anyone have experience of this and can provide any recommendations or advice for clearing the site of the Google penalty? Thanks, Duncan
White Hat / Black Hat SEO | CayenneRed89
-
Pages mirrored on unknown websites (not just content, all the HTML)... a blackhat technique I've never seen before.
Perhaps someone more expert than me can help - I am not a pro, just doing research on a website. Google Search Console shows many backlinks from pages under unknown domains. These pages mirror the pages of the linked website, and clicking a link on a mirror page leads to a page full of link spam. The homepages of these unknown domains appear just fine, so it looks like the domains have been partially hijacked. WTF?! Have you ever seen something like this? Could it be the outcome of previous blackhat activity?
White Hat / Black Hat SEO | 2mlab
-
How authentic is a dynamic footer from bots' perspective?
I have a very meta-level question. I have been working on a dynamic footer for the website http://www.askme.com/ - you can check it in the footer. If you refresh the page and look at the content, you'll see a different combination of links in every section. I'm calling it a dynamic footer because the values are entirely dynamic. **Why are we doing this?** For every section in the footer we have X links, but we can show only 25 links per section. The value of X can be greater than 25 (let's say X=50), so I randomize the list of entries for a section and pick 25 elements from it - i.e. 25 random elements from the list every time you refresh the page. **Benefits from an SEO perspective?** This helps expose all the URLs to bots (across multiple crawls) and adds a page-freshness element as well. **What's the problem, if there is one?** I'm wondering how bots will treat this, since at any time a bot might see us showing it different content than we show users. Will a bot consider this cloaking (a black hat technique)? Or will it not treat it as black hat, since I'm refreshing the data every single time - even if it's a bot hitting me twice in a row to work out what I'm doing?
White Hat / Black Hat SEO | _nitman
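A minimal sketch of the rotation being described - the section name, URLs, and counts are invented for illustration, not taken from the site:

```python
# Sketch of the described footer rotation: each section has X candidate links
# (X can exceed 25) and every page load shows a random 25 of them.
# Section names and URLs here are made up.
import random

SECTION_LINKS = {
    "popular-gifts": ["/gifts/item-{}".format(i) for i in range(50)],  # X = 50
}
LINKS_PER_SECTION = 25

def footer_links_for(section: str) -> list:
    """Pick a fresh random subset on every render, so repeat visits (or crawls) see different links."""
    candidates = SECTION_LINKS[section]
    return random.sample(candidates, min(LINKS_PER_SECTION, len(candidates)))

print(footer_links_for("popular-gifts"))
```

Because the sample is re-drawn on every request, two consecutive fetches of the same URL will usually return different link sets - which is exactly the behaviour the question is asking bots' opinion of.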
Bad for SEO to have two very similar websites on the same server?
Is it bad for SEO to have two very similar sites on the same server? What's the best way to set this up?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Should I delete Meta Keywords from a website?
Hi guys, I've been reading various posts in the Q&A section here at Moz about meta keywords. I understand that meta keywords are not relevant to Google and that Bing treats them as a spam signal. I'm optimising existing websites which already have meta keywords in the HTML. My question is: if I delete ALL meta keyword code, will this have any negative impact whatsoever? Thanks Mozers, Jason 🙂
White Hat / Black Hat SEO | Grant-Westfield
-
Is it negative to put a backlink in the footer of our clients' websites?
Hello there! Everything is in the subject of this post, but here is the context: we are a web agency and, among other things, we build websites for our clients (most of them are shops). Until now we have put a link in their footer, like "developed by MyWebShop", but we don't know whether this is bad or not. From a single client website we can gain hundreds of backlinks at once, but is that good for SEO or not? Will Google penalize us for what it considers blackhat practices? Is it better to put our link in the "legal notices" or "disclaimer" part of the websites? What is the best practice for lasting SEO? I hope you understand my question. Thank you in advance!
White Hat / Black Hat SEO | mywebshop