Google is somehow linking my two sites that aren't linked! HELP
-
Good Morning...
In my Google Webmaster Tools account, I'm seeing an increase in backlinks from one site I own to the other. This shouldn't be happening, as there are no links between the two sites.
I have thoroughly checked many pages on the new site for a backlink, but I can't find one.
Does anyone know why this is showing up (Google now reports 50,000 links from one site to the other)? Could someone please take a look and see if you can find any link from one to the other?
original site : http://goo.gl/JgK1e
new site : http://goo.gl/Jb4ng
Please let me know why you guys think this is happening, or whether you were actually able to find a link on the new site pointing back to the old site...
thanks a lot
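If the backlink tools and a manual check disagree, a quick script can at least confirm whether any page you fetch actually contains an anchor pointing at the other domain. This is a minimal sketch using only Python's standard library; the sample HTML and the domain names are illustrative, not the poster's actual pages:

```python
from html.parser import HTMLParser

class DomainLinkFinder(HTMLParser):
    """Collects href values on <a> tags that mention a given domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value and self.domain in value:
                self.hits.append(value)

def find_links_to(html, domain):
    """Return every href in the HTML that points at the given domain."""
    parser = DomainLinkFinder(domain)
    parser.feed(html)
    return parser.hits

# A single overlooked footer or template link is enough for Google to report a connection.
sample = '<p>Products</p><a href="http://plasticstorage.com/bins">bins</a>'
print(find_links_to(sample, "plasticstorage.com"))  # → ['http://plasticstorage.com/bins']
```

Run over the HTML of each page (sitemap or crawl output) to rule out a stray template link; note it only sees links literally present in the markup, not ones injected by JavaScript.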
-
I looked at your sites and was not able to find any links from the plasticstorage.com site to the simplastics.co.uk site. I used a couple of backlink tools and was not able to find any links there either.
I did notice a huge recent drop in backlinks. One tool showed you had 30k+ links on the plasticstorage.com site in March, but you have since dropped over 10k links. Is it possible the links you are seeing in Google were present and have just recently been removed? If so, expect it to take Google around 30 days to crawl all of your site and recognize the changes.
-
The screenshot isn't working too well; does this work:
http://simplastics.co.uk/30234-4945
http://simplastics.co.uk/30260-4983
http://simplastics.co.uk/30287-5271
http://simplastics.co.uk/30288-5274
http://simplastics.co.uk/30290-5279
http://simplastics.co.uk/305a4-5087
http://simplastics.co.uk/31162-5228
http://simplastics.co.uk/33023
http://simplastics.co.uk/33061
http://simplastics.co.uk/33105-5369
All via this intermediate link: http://simplastics.co.uk/wr54-1236c-3387
-
Can you share a screenshot of your Google WMT screen where it shows links between the sites? The image may help us better understand the root issue.
Related Questions
-
How does Googlebot see two identical rel canonicals?
Hi, I have a website where all the original URLs have a rel canonical pointing back to themselves. This is kind of a fail-safe: if a parameter occurs, the URL with the parameter will have a canonical back to the original URL.
For example, this URL: https://www.example.com/something/page/1/ has this canonical: https://www.example.com/something/page/1/, which is the same, since it's an original URL. This URL https://www.example.com/something/page/1/?parameter has this canonical https://www.example.com/something/page/1/ because, as I said, parameter URLs have a rel canonical back to their original URLs. So https://www.example.com/something/page/1/?parameter and https://www.example.com/something/page/1/ both have the same canonical, which is https://www.example.com/something/page/1/.
I'm telling you all this because when Rogerbot tried to crawl my website, it reported duplicates. This happened because it read the canonical (https://www.example.com/something/page/1/) of the original URL and the canonical (https://www.example.com/something/page/1/) of the URL with the parameter (https://www.example.com/something/page/1/?parameter) and saw that both were pointing to the same canonical. So I would like to know if Googlebot treats canonicals the same way, because if it does, then I'm full of duplicates 😄 Thanks.
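The canonicalisation the question describes (strip the query string, keep everything else) can be expressed directly. A short sketch of that mapping, using the question's own example.com URLs, which shows why both URLs resolve to the same canonical:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url):
    """Derive the canonical target by dropping the query string and fragment."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

base = "https://www.example.com/something/page/1/"

# The original URL's canonical is itself (the self-referencing fail-safe).
assert canonical_for(base) == base
# The parameter variant collapses to the same canonical.
assert canonical_for(base + "?parameter") == base
print(canonical_for(base + "?parameter"))  # → https://www.example.com/something/page/1/
```

Two URLs sharing one canonical is the intended, correct setup here; a crawler flagging them as "duplicates" is really just observing that consolidation.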
Technical SEO | | dos06590 -
Google UK and the slog of Link building
Background:
Technical SEO | | Brinley
I have a number of sites built using the open-source eCommerce software Zen Cart. One of these sites was penalised by the original Penguin algorithm back on April 24, 2012. The reason for the penalty was that two ecommerce sites in Hong Kong had a link to the above site in the footer of their 2,000- and 4,000-product websites. I have no idea why those sites had these links, and even though I did contact them a few months before the Penguin massacre asking them to remove the footer link, I was technically unaware of the ticking time bomb they presented. The result, as is now engrained in SEO history, was that the site was moved to sit alongside Google's equivalent of the restaurant at the end of the universe, and it stayed there for two years, until April 2014.
As I had never indulged in link building, for the simple reason that I found it laborious, I was obviously infuriated by the resulting loss of revenue, but that was balanced with an understanding that I had not kept pace with the changing landscape of SEO according to Google. The quest I am now on is to raise my three sites' profile on the web without getting another spanking from Google in the near future. The problem I have is that white hat today may well be black hat tomorrow. (I can recall the days when Google said links are good, and everyone went out and asked other websites to link to them, and look where that led.) So do I ignore actively cultivating links, as some suggest, and look to produce good content (which is quite difficult when you make mugs and candles, by the way), or do I go out and intentionally build links by studying competitors' links, reviewing link opportunities, or getting bloggers to review products? For a small lifestyle entrepreneur like myself, the ever-changing SEO landscape and the amount of time and effort it requires is slowly and inevitably pushing us back out to that restaurant mentioned earlier. If only Google had a little brother designed purely for small businesses, like it was in the good old days, before the dinosaur that is big business grunted and thought, hmmm, what's that?
And if there were such a thing, I would add a caveat that it would be illegal to generate pointless amounts of cyber content, because the web is becoming something akin to a landfill. Which leaves me nowhere, really, but I think I am okay with that. Waiter!!0 -
Dev Site Was Indexed By Google
Two of our dev sites (subdomains) were indexed by Google. They have since been made private, once we found the problem. Should we take another step to remove the subdomains through robots.txt, or just let it ride out? From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, then give the subdomain its own robots.txt and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.
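For reference, a robots.txt at the root of the dev subdomain that disallows all crawling would look like this (one caveat worth noting: robots.txt blocks crawling, not indexing, so an already-indexed URL can linger in results; the URL removal tool in GWT is the usual companion step, and a `noindex` approach requires the page to remain crawlable so the directive can be seen):

```
User-agent: *
Disallow: /
```

Serving this from `http://dev.example.com/robots.txt` (a hypothetical subdomain for illustration) affects only that subdomain; the main site's robots.txt is untouched.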
Technical SEO | | ntsupply0 -
Need help with home page on site
Hello! Thanks for reading in advance! I've got a relatively old site (12-year-old domain) that has experienced a drop in rankings, specifically for our home page. One of the key terms I'd assume we would rank well for is "expedite us passport". According to SEOmoz, our on-page optimization receives a C for the term; also, the root domain and page have decent links, etc. However, looking at Google (logged out and in incognito mode in Chrome), a page on our site, http://www.passportsandvisas.com/passport/index.asp, ranks well, and our HOME page isn't listed in the top 50 or 100. This is the case for a lot of keywords we used to rank well for. I would have thought our home page would have at least outranked an internal page. Any thoughts would be very, very helpful!
Technical SEO | | santiago230 -
Blank pages in Google's webcache
Hello all, Is anybody experiencing blank pages in Google's 'Cached' view? I'm seeing just the page background and none of the content for a couple of my pages, but when I click 'View Text Only' all of the content is there. Strange! I'd love to hear if anyone else is experiencing the same. Perhaps this is something to do with the roll-out of Google's updates last week?! Thanks,
Technical SEO | | A_Q
Elias0 -
When is the last time Google crawled my site
How do I tell when Google last crawled my site? I found out it is not the "Cache" date, which I had thought it was.
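One way to answer this without relying on the cache date is to check your own server access logs for Googlebot's user agent. A rough sketch, assuming the common Apache/nginx combined log format (the sample lines below are made up for illustration; adjust the regex if your log format differs):

```python
import re
from datetime import datetime

# Matches the bracketed timestamp on lines whose user agent mentions Googlebot.
LINE = re.compile(r'\[(?P<ts>[^\]]+)\].*Googlebot')

def last_googlebot_visit(log_lines):
    """Return the timestamp of the most recent Googlebot hit, or None."""
    latest = None
    for line in log_lines:
        m = LINE.search(line)
        if m:
            ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            if latest is None or ts > latest:
                latest = ts
    return latest

log = [
    '1.2.3.4 - - [01/Mar/2012:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [02/Mar/2012:09:30:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
print(last_googlebot_visit(log))  # → 2012-03-02 09:30:00+00:00
```

Keep in mind user agents can be spoofed; a reverse-DNS lookup on the requesting IP is the usual way to confirm a hit really came from Google.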
Technical SEO | | digitalops0 -
Way to find how many sites within a given set link to a specific site?
Hi, Does anyone have an idea on how to determine how many sites within a list of 50 sites link to a specific site? Thanks!
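For a fixed list of 50 sites, one rough approach is to fetch each site's pages yourself and check for an anchor to the target. A minimal sketch; the site names and `example-target.com` are hypothetical, and the live-fetch line is only indicated in a comment:

```python
TARGET = "example-target.com"  # hypothetical site whose inbound links you want counted

def links_to_target(html, target=TARGET):
    """Crude check: does the HTML contain an href to the target domain?"""
    return f'href="http://{target}' in html or f'href="https://{target}' in html

def count_linking_sites(site_html):
    """site_html: mapping of site URL -> fetched HTML. Counts sites that link out."""
    return sum(1 for html in site_html.values() if links_to_target(html))

# Live use would be roughly:
#   from urllib.request import urlopen
#   site_html = {s: urlopen(s).read().decode("utf-8", "ignore") for s in my_50_sites}
demo = {
    "http://a.com": '<a href="http://example-target.com/page">x</a>',
    "http://b.com": "<p>no links here</p>",
}
print(count_linking_sites(demo))  # → 1
```

A substring check like this misses relative and protocol-relative URLs, and it only sees the pages you fetch; a backlink index (e.g., Open Site Explorer) filtered to your 50 domains is the more thorough route.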
Technical SEO | | SparkplugDigital0 -
Site 'filtered' by Google in early July.... and still filtered!
Hi, Our site got demoted by Google all of a sudden back in early July. You can view the site here: http://alturl.com/4pfrj and you may read the discussions I posted in Google's forums here: http://www.google.com/support/forum/p/Webmasters/thread?tid=6e8f9aab7e384d88&hl=en http://www.google.com/support/forum/p/Webmasters/thread?tid=276dc6687317641b&hl=en Those discussions chronicle what happened and what we've done since. I don't want to make this a long post by retyping it all here, hence the links. However, we've made various changes (as detailed), such as getting rid of duplicate content (use of noindex on various pages, etc.) and ensuring there is no hidden text (we made an unintentional blunder there through use of a third-party control which used CSS-hidden text to store certain data). We have also filed reconsideration requests with Google and been told that no manual penalty has been applied, so the problem is down to algorithmic filters being applied. So... my reason for posting here is simply to see if anyone here can help us discover whether there is anything we have missed. I'd hope that we've addressed the main issues and that eventually our Google ranking will recover (i.e. the filter removed... it isn't that we 'rank' poorly, but that a filter is bumping us down to, for example, page 50)... but after three months it sure is taking a while! It appears that a 30-day penalty was originally applied, as our ranking recovered in early August. But a few days later it dived down again (so presumably Google analysed the site again, found a problem, and applied another penalty/filter). I'd hope that might have been 30 or 60 days, but 60 days have now passed... so perhaps we have a 90-day penalty now. OR... perhaps there is no time frame this time, simply the need to 'fix' whatever is constantly triggering the filter (that said, I 'feel' like a time frame is there, especially given what happened after 30 days).
Of course, the other aspect that can always be worked on (and is oft-mentioned) is the need for more and more original content. However, we've done a lot to increase this and think our Guide pages are pretty useful now. I've looked at many competitive sites which list in Google, and they really don't offer anything more than we do... so if that is the issue, it sure is puzzling that we're filtered and they aren't. Anyway, I'm getting wordy now, so I'll pause. I'm just asking if anyone would like to have a quick look at the site and see what they can deduce? We have of course run it through SEOmoz's tools and made use of the suggestions. Our target pages generally rate as an A for SEO in the reports. Thanks!
Technical SEO | | Go2Holidays0