Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title, I know, but let me explain.
We are in the middle of programming a lot of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We are thinking about adding a "Ping" tool written in PHP so users can ping their domain and hopefully get some extra attention/speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically the tool needs to send out the ping request, and that would come from the same IP address that our main site is hosted on. If we end up with over 1,000 users all trying to send ping requests, I don't want to jeopardize our IP.
Thoughts?
-
We are not using WordPress for any of the tools, and everything will be handled using PHP. The tools live in a separate directory from our main site, used strictly for that purpose. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the ping is being sent out from and how many times.
-
First of all, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, RSS feeds, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow down response time and therefore slow page load speed for your site, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that would seem like the best course of action, because if it gets blacklisted you can just start using a different one. Don't risk getting your domain's IP blacklisted.
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from https://en.support.wordpress.com/comments/pingbacks/:
Granted I am using WordPress so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a “ping” from you to know you’ve updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, a service that pings several different search providers all at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
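For context, the "ping" these services exchange is just a small XML-RPC call (`weblogUpdates.ping`) POSTed to an endpoint such as Ping-o-Matic's rpc.pingomatic.com. A minimal PHP sketch of what a self-hosted tool would build and send (the site name, URL, and timeout values here are only illustrative):

```php
<?php
// Build the XML-RPC payload for a weblogUpdates.ping call.
function build_ping_payload(string $siteName, string $siteUrl): string
{
    // Escape user-supplied values so they are safe inside XML.
    $name = htmlspecialchars($siteName, ENT_XML1);
    $url  = htmlspecialchars($siteUrl, ENT_XML1);
    return <<<XML
<?xml version="1.0"?>
<methodCall>
  <methodName>weblogUpdates.ping</methodName>
  <params>
    <param><value><string>{$name}</string></value></param>
    <param><value><string>{$url}</string></value></param>
  </params>
</methodCall>
XML;
}

// POST the payload to a ping endpoint with cURL; returns the raw
// XML-RPC response, or false on a transport error.
function send_ping(string $endpoint, string $siteName, string $siteUrl)
{
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => build_ping_payload($siteName, $siteUrl),
        CURLOPT_HTTPHEADER     => ['Content-Type: text/xml'],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}

// Example (would send a real ping if uncommented):
// $reply = send_ping('http://rpc.pingomatic.com/', 'Example Site', 'https://example.com/');
```

Note that every `send_ping()` call goes out from the server's IP, which is exactly why volume matters for the question above.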
-
Yup!
Use JavaScript on the client side to do the pinging. Or a Java applet running in the browser. Or Flash.
There are two major problems: JavaScript doesn't support cross-origin POSTs without hacks, and not all computers come with Java installed. The same goes for Flash.
-
Thank you for your response. As to the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if and when it happened in order to take such action. Obviously we don't want to have to do this, and if the main site's IP were blacklisted it could create headaches for our search position until we got it resolved.
We are also planning on adding website submission limits. For example, a subscriber account could only submit mysitehere.com up to 3 times per month. The only way to spam the system would be to create another account and sign up all over again. I doubt anyone would go through that much effort, but I could be wrong.
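A per-account, per-domain limit like that is straightforward to enforce server-side before any ping goes out. A minimal sketch, assuming an in-memory counter for illustration (a real implementation would persist the counts in the database, and all names here are hypothetical):

```php
<?php
// Hypothetical per-account, per-domain monthly ping limiter.
// In-memory sketch only; production code would back this with a DB table.
class PingLimiter
{
    /** @var array<string,int> keyed by "accountId|domain|YYYY-MM" */
    private array $counts = [];

    private int $maxPerMonth;

    public function __construct(int $maxPerMonth = 3)
    {
        $this->maxPerMonth = $maxPerMonth;
    }

    // Records the ping and returns true if the account is still under
    // its monthly limit for this domain; returns false otherwise.
    public function tryPing(string $accountId, string $domain, string $month): bool
    {
        $key  = $accountId . '|' . strtolower($domain) . '|' . $month;
        $used = $this->counts[$key] ?? 0;
        if ($used >= $this->maxPerMonth) {
            return false;
        }
        $this->counts[$key] = $used + 1;
        return true;
    }
}
```

In live code the `$month` argument would just be `date('Y-m')`; it is passed in here so the behavior is deterministic. As noted above, this caps honest users but cannot stop someone who registers fresh accounts.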
Thoughts?
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
Anyone can use it to ping anything, and the bots arrive within a second or two. It works perfectly. The problem comes when you use it to ping many URLs (say 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can't see the bots coming anymore. That means there is some per-IP threshold, and if you pass it you're temporarily blacklisted. I've also heard (though I can't confirm it) that how long the temporary blacklist lasts may vary with previous usage. For my users this isn't a problem, because they only blacklist their own IPs, and they can switch to hotspot Wi-Fi or a VPN to keep pinging.
But on a server this will be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.