About the Google Disavow tool
-
My website has been attacked with spammed links, so should I use the Google Disavow tool to get rid of those links?
I also have a question: if I use the Disavow tool on some backlinks, but I haven't removed them from the pages where they're placed, will Google index those backlinks again, or never?
-
Do you mean they can't hurt me with spam links? (I found that they added my link on adult webpages.) So I think I should remove those from my backlink list.
Yes, I agree that a link exchange page is not a good way to build links, for many reasons, and it's not my main link-building method.
But right now my site is in the Google sandbox, and I need to escape from it before I can build good links.
-
Even if you haven't built the links yourself, that doesn't necessarily mean they're hurting your site.
If the links are coming from spammy sites, it could be hurting your site, but Google's algorithm is pretty sophisticated and if your search rankings have fallen, it may not actually be a penalty but just them adjusting things. I would focus my energy on developing more good links. I see on your site you have a reciprocal link directory. When it comes to SEO, I'm not a huge fan of using reciprocal links as a main link-building strategy. Google loves diversity, so if most of your backlinks are from sites you are also linking to in your directory, that might not be the best SEO strategy, especially when some of the sites are not in the same niche as yours.
-
Thanks Nick!
I found that many of the links to my site weren't built by me. So I think a competitor spammed those links to push me into the Google sandbox. I can't tell exactly which pages the links come from, only the domains.
So I decided to use this tool.
But I've run into a problem: the links come from one web page, but Google indexed them as two (for example, the backlink page is abc.com, but Google indexed backlinks from both abc.com and www.abc.com). Should I disavow one of them or keep both? And if I disavow only one of them (say abc.com), will Google also remove www.abc.com?
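For what it's worth, the disavow file format can sidestep the abc.com vs. www.abc.com question entirely: a `domain:` line covers every URL on that host, subdomains included, so one entry handles both variants. A minimal sketch (the domain names here are just the placeholders from the example above):

```text
# Disavow file sketch — lines starting with # are comments.
# A domain-level entry covers abc.com, www.abc.com, and any
# other subdomain in a single line:
domain:abc.com

# Individual URLs can also be disavowed one at a time:
http://spam-example.net/link-page.html
```

The file is plain UTF-8 text, one entry per line, uploaded through the disavow tool in Search Console (formerly Webmaster Tools).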
Thanks!
-
Are you sure these links are actually hurting your site? A lot of people think their site is being penalized when it actually isn't. If we're talking just a few links from lousy sites, it's probably not hurting your site. If we're talking a large scale thing though, it could have a negative effect.
Your best bet is to try to have the links actually removed from the spammy sites. If that fails, that's where the disavow tool comes in, but it can potentially take quite a long time to see any changes.
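If removal requests fail and you do end up building a disavow file from a backlink export, it helps to collapse duplicate hostnames into unique domain-level entries first. A small sketch of that step, assuming a made-up list of spammy URLs (the input data and output lines are purely illustrative):

```python
# Sketch: turn a list of spammy backlink URLs (e.g. exported from
# Webmaster Tools) into deduplicated domain-level disavow entries.
from urllib.parse import urlparse

def to_disavow_lines(urls):
    """Collapse backlink URLs into unique 'domain:' disavow entries."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Strip a leading 'www.' so www.abc.com and abc.com
        # collapse into a single domain: entry.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return ["domain:%s" % d for d in sorted(domains)]

# Hypothetical export from a backlink report:
spammy_urls = [
    "http://www.abc.com/links.html",
    "http://abc.com/page2.html",
    "http://spam-example.net/dir/",
]
for line in to_disavow_lines(spammy_urls):
    print(line)
```

Review the resulting lines by hand before uploading — disavowing a good domain by accident is worse than leaving a bad link in place.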
-
Yes, that's why it's there.
If Google chooses to take action on your report, they will simply ignore those links for good when calculating your rankings.