What is triggering Google account suspensions?
-
Over the past 24 hours, many of our clients have had their Google accounts suspended. The explanation given was:
"After reviewing your profile, we determined that it has been used to impersonate another individual or mislead other users. This violates the Google+ User Content and Conduct Policy."
We are NOT impersonating our clients; we have their permission. We are not misleading anyone, simply setting up profiles for our clients on Google+.
This has not affected all of our clients, but it has hit a significant number of them. We cannot find a common variable between the clients that have been suspended and those who have not:
- Some have had other Google+ profiles in the past, in another account, some have not.
- Some have been previously verified via SMS, others by phone.
- Some have posts in their profile, others have only the profile info filled out.
Again, we are not trying to game Google; we are simply setting up authorship for them, with their permission.
I have not seen much in the SEO community about this today, and this is NOT related to fake reviews. We do not partake in that kind of activity.
We have written a post on the topic, and no matter how this shakes out, I think our take is solid. Authorship is changing the game, content is changing the game, trust is changing the game… and Google is getting serious about it.
We have also seen this happen to our clients, to our competitors' clients, and to other marketing firms' clients, outside of our vertical.
Does anyone know more about the topic, especially in regards to the suspensions over the past 24 hours?
-
Another point. It doesn't have to be the business owner who does the verification and uses his own IP address. It can be the receptionist or anyone else. All I need is for the business owner to tell the employee to expect my call and follow my instructions. On one occasion, the employee even called me from home because she had no web access at work.
-
I am completely with Daniel here: IPs can trigger this. I know you say you don't think this is IP-related, but there are many other instances of this happening. Why Google might favour some accounts over others is anyone's guess, but this is where I would place my bets.
-
Thanks for your feedback, Daniel. We've tried this before, but with our volume and limited access to the actual business owners, it's very difficult. Again, I don't believe this is an IP issue. The same thing happened to competing companies' clients in our vertical that are smaller than ours (including some individual consultants who wouldn't trip an IP filter), within the same 48-hour period.
-
FWIW, I try to have my clients set up their own Google+ accounts, then share the login and password with me.
This is generally done during one of my update calls or web meetings, with me talking them through the process.
-
Boy, the thought has crossed my mind that the number used for verification might get flagged. Going to have to start creating Skype accounts or something?
-
Thanks for your response. I know they can track IPs, but this appears to be something different. It hit our clients, and our competitors' clients, on the same day.
I am thinking they targeted our vertical for some reason.
Also, many but not all accounts were affected.
Smells like there's more to this story than just "IP volume tracking," since we deal with that and similar volume issues often (like using the same phone number repeatedly for verification).
-
Hi, even though you are setting up accounts for your clients, creating too many from the same IP address can trigger a suspension. This is nothing new; I remember hearing about this last year. Mash
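For context, the kind of trigger described above is essentially a velocity check: count recent signups per IP address (or per verification phone number) inside a sliding time window and flag anything over a threshold. Here is a minimal sketch of that idea; all thresholds, names, and the 48-hour window are my own guesses for illustration, not anything Google has published:

```python
# Hypothetical sketch of an abuse "velocity filter" that flags accounts
# created in bulk from one IP or verified with one reused phone number.
# All thresholds and field names are invented for illustration.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(hours=48)        # look-back window (assumed)
MAX_SIGNUPS_PER_IP = 5              # assumed threshold
MAX_VERIFICATIONS_PER_PHONE = 3     # assumed threshold

class SignupVelocityFilter:
    def __init__(self):
        self.by_ip = defaultdict(list)     # ip -> list of signup timestamps
        self.by_phone = defaultdict(list)  # phone -> list of timestamps

    def _prune(self, events, now):
        # Keep only events that fall inside the sliding window.
        return [t for t in events if now - t <= WINDOW]

    def record_signup(self, ip, phone, now=None):
        """Record a signup; return True if it should be flagged for review."""
        now = now or datetime.utcnow()
        self.by_ip[ip] = self._prune(self.by_ip[ip], now) + [now]
        self.by_phone[phone] = self._prune(self.by_phone[phone], now) + [now]
        return (len(self.by_ip[ip]) > MAX_SIGNUPS_PER_IP
                or len(self.by_phone[phone]) > MAX_VERIFICATIONS_PER_PHONE)
```

Under this model, a handful of individual consultants each working from their own IP would stay under every threshold, which is why the pattern reported in this thread (many unrelated firms hit in the same 48 hours) does look like something other than a simple per-IP filter.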