External Links from own domain
-
Hi all,
I have a very weird question about external links to our site from our own domain.
According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed that these come from disabled sub-domains like m.jump.co.za.
In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain; for example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/.
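For reference, a catch-all 301 like the one described can be done with a minimal mod_rewrite rule on the sub-domain vhosts. This is an illustrative sketch only; the hostnames are taken from the example above and the exact setup would depend on the server configuration:

```apache
# Hypothetical .htaccess / vhost rule: 301 any request on a
# non-www host to the same path on the primary www domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.jump\.co\.za$ [NC]
RewriteRule ^(.*)$ http://www.jump.co.za/$1 [R=301,L]
```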
The weird part is that the number of external links kept growing and is now sitting at a massive number.
On 8 April 2011 we took a different approach and we created a landing page for m.jump.co.za and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT.
Now, 3 weeks later, the number of external links just keeps growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning up the robots.txt, re-including all the excluded directories in GWMT, and seeing whether Google will be able to get rid of all these links.
What do you think is the best solution to get rid of all these invalid pages?
-
We had 301s in place for about 6 months, and the old URLs did not disappear from Google. That's why we decided to change them to 404s, thinking that Google might remove them more quickly. But the number of links from sub-domains just keeps growing.
I am worried that listing these problem URLs in robots.txt actually prevents Google from crawling them, so it never sees that they now return a 404 and should be removed.
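That worry is well founded: a Disallow rule bars the crawler from fetching the URL at all, so the 404 status is never observed. As an illustration (the actual file contents are not shown in the question), a robots.txt on the disabled sub-domain along these lines would have exactly that effect:

```
# Illustrative robots.txt on m.jump.co.za -- Disallow stops Googlebot
# from re-fetching these URLs, so it never sees the 404 they now return.
User-agent: *
Disallow: /
```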
-
Instead of trying to manage a massive 301 list, can you just customize your 404 page to redirect?
<?php
// Custom 404 handler: 301 everything to the primary domain
$location = "http://www.YourSite.com/";
header("HTTP/1.1 301 Moved Permanently");
header("Location: {$location}");
exit;
-
Update:
There are 2 things that still puzzle me about this:
If you go to http://www.google.co.za/search?q=site:jump.co.za+-www&hl=en&rlz=1C1GPCK_enZA426ZA426&prmd=ivns&filter=0&biw=1920&bih=979 you notice all sorts of weird sub-domains, and all of these are invalid and have been removed from GWMT.
If you manage the domain m.jump.co.za in GWMT, you will also notice that it still reports keywords, queries and all sorts of data, although the site is disabled and all its URLs generate 404 errors.
There are only a few of these weird sub-domains causing the problems:
0www.
iiiiiwww.
iwww.
m.
wtfwww.
www.www.
wwww.
All these domains feel very familiar to me, and I am almost 100% sure that these are domains we used to test with when we found the problem on Apache, meaning Google took the data from the toolbar queries and probably started indexing these sub-domains. But now I can't get rid of them, and Google seems to be out of control with these.
So the main question is probably: should we just serve 404s, or should we add them to robots.txt as well?
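The policy being debated above can be boiled down to a simple host check: only the primary host serves content, and every stale sub-domain answers 404 (without also being blocked in robots.txt, so the crawler can actually see that status). A minimal sketch, assuming only www and the bare domain are valid; the hostnames come from the question, the function itself is hypothetical:

```python
# Hypothetical host policy: the primary hosts serve content (200);
# every other sub-domain should answer 404 so Google can drop it.
VALID_HOSTS = {"www.jump.co.za", "jump.co.za"}

def response_for(host: str) -> int:
    """Return the HTTP status a request on this host should receive."""
    return 200 if host.lower() in VALID_HOSTS else 404
```

With this in place, requests to m.jump.co.za or wtfwww.jump.co.za would get a 404, while www.jump.co.za keeps serving normally.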