Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title, I know, but let me explain.
We are in the middle of programming a number of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We are thinking about adding a PHP-based "ping" tool so users can ping their domain and hopefully get some extra attention and speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted by Google, Bing, etc.? Technically the tool needs to send out the ping requests, and those would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP.
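To make it concrete, here is a rough PHP sketch of the kind of ping the tool would send, using the conventional weblogUpdates.ping XML-RPC call (the endpoint, helper name, and site values are placeholders, not our actual implementation):

```php
<?php
// Hypothetical sketch: one outbound ping, sent with curl.
// Endpoint, function name, and site values are placeholders.
function sendPing(string $siteName, string $siteUrl, string $endpoint): bool
{
    $name = htmlspecialchars($siteName, ENT_XML1);
    $url  = htmlspecialchars($siteUrl, ENT_XML1);

    // Conventional weblogUpdates.ping XML-RPC payload.
    $payload = <<<XML
<?xml version="1.0"?>
<methodCall>
  <methodName>weblogUpdates.ping</methodName>
  <params>
    <param><value><string>$name</string></value></param>
    <param><value><string>$url</string></value></param>
  </params>
</methodCall>
XML;

    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $payload,
        CURLOPT_HTTPHEADER     => ['Content-Type: text/xml'],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    $response = curl_exec($ch);
    $ok = ($response !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200);
    curl_close($ch);

    return $ok;
}

// Example usage (Ping-o-Matic endpoint assumed):
// sendPing('Example Site', 'https://example.com/', 'http://rpc.pingomatic.com/');
```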
Thoughts?
-
We are not using WordPress for any of the tools; everything is handled in PHP, in a directory separate from our main site and used strictly for the tools. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the ping is being sent out from and how many times.
-
First, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the more appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, RSS services, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow down its response time and therefore your site's page load speed, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that seems like the best course of action: if it gets blacklisted you can simply switch to a different one, rather than risking your main domain's IP.
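On the second point, one way to keep pings from slowing page responses is to queue them and send them from a cron job instead of during the web request. A rough sketch, assuming a SQLite-backed queue (table name, file path, and the injected sendPing callback are illustrative only):

```php
<?php
// Sketch: record pings during the web request, send them later from cron,
// so outbound pinging never adds latency to page responses.
$db = new PDO('sqlite:' . __DIR__ . '/ping_queue.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS ping_queue (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    site_url TEXT NOT NULL,
    queued_at INTEGER NOT NULL,
    sent_at INTEGER
)');

// Called from the web request: just record the job and return immediately.
function enqueuePing(PDO $db, string $siteUrl): void
{
    $stmt = $db->prepare('INSERT INTO ping_queue (site_url, queued_at) VALUES (?, ?)');
    $stmt->execute([$siteUrl, time()]);
}

// Called from a cron script: send a small batch per run to spread the load.
function processQueue(PDO $db, callable $sendPing, int $batchSize = 20): void
{
    $rows = $db->query(
        'SELECT id, site_url FROM ping_queue WHERE sent_at IS NULL LIMIT ' . (int)$batchSize
    )->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        $sendPing($row['site_url']); // e.g. a curl-based ping helper
        $db->prepare('UPDATE ping_queue SET sent_at = ? WHERE id = ?')
           ->execute([time(), $row['id']]);
    }
}
```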
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from https://en.support.wordpress.com/comments/pingbacks/:
Granted I am using WordPress so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a "ping" from you to know you've updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, a service that pings several different search providers at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
-
Yup!
Use JavaScript on the client side to do the pinging, or a Java applet running in the browser, or Flash.
There are two major problems: JavaScript can't make cross-domain POST requests without hacks, and not every computer comes with Java installed; the same goes for Flash.
-
Thank you for your response. As to the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if and when the blacklisting happened in order to take such action. Obviously we don't want to have to do this, and if the main site's IP were blacklisted it could create headaches for our search position until we got it resolved.
We are also planning to add website submission limits. For example, you could only submit mysitehere.com up to three times per month per subscriber account. The only way to spam the system would be to create another account and sign up all over again. I doubt anyone would go through that much effort, but I could be wrong.
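Roughly what we have in mind for the limit check, sketched against an assumed database table (names, schema, and the cap itself are placeholders):

```php
<?php
// Sketch of a per-account monthly submission cap (illustrative only).
const MAX_PINGS_PER_MONTH = 3;

function canSubmitPing(PDO $db, int $accountId, string $domain): bool
{
    $stmt = $db->prepare(
        'SELECT COUNT(*) FROM ping_submissions
         WHERE account_id = ? AND domain = ? AND submitted_at >= ?'
    );
    $monthStart = strtotime(date('Y-m-01 00:00:00')); // start of current month
    $stmt->execute([$accountId, $domain, $monthStart]);

    return (int)$stmt->fetchColumn() < MAX_PINGS_PER_MONTH;
}

function recordPing(PDO $db, int $accountId, string $domain): void
{
    $db->prepare('INSERT INTO ping_submissions (account_id, domain, submitted_at) VALUES (?, ?, ?)')
       ->execute([$accountId, $domain, time()]);
}
```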
Thoughts?
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
Anyone can use it to ping anything, and the bots arrive within a second or two; it works perfectly. The problem comes when you use it to ping many URLs (on the order of 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can no longer see the bots coming. This suggests there is a per-IP threshold, and if you pass it your IP is temporarily blacklisted. I have also heard (though I can't confirm it) that the length of this temporary blacklisting may vary with previous usage. For my users this isn't a big problem, because they only blacklist their own IPs and can switch to hotspot Wi-Fi or a VPN to keep pinging.
But on a server this will be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
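If the service has to stay on one server IP, about the only defence is to throttle hard. A crude sketch of a global daily cap (the file path and the limit are assumptions, not known-safe numbers):

```php
<?php
// Crude global throttle: stop pinging for the day once the shared server IP
// has used its quota, since the IP can't be rotated on the fly.
const DAILY_PING_CAP = 500;                        // illustrative number only
const COUNTER_FILE   = __DIR__ . '/ping_counter.json';

function allowPingToday(): bool
{
    $today = date('Y-m-d');
    $state = is_file(COUNTER_FILE)
        ? json_decode(file_get_contents(COUNTER_FILE), true)
        : null;

    // Reset the counter when the day rolls over or the file is missing/corrupt.
    if (!is_array($state) || ($state['date'] ?? '') !== $today) {
        $state = ['date' => $today, 'count' => 0];
    }
    if ($state['count'] >= DAILY_PING_CAP) {
        return false;                              // hold remaining pings until tomorrow
    }

    $state['count']++;
    file_put_contents(COUNTER_FILE, json_encode($state), LOCK_EX);
    return true;
}
```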