Subdirectories, Domain & Page Crawl Depth
-
Hi,
I just bought an old domain with good backlinks and authority; it was formerly a technology product site.
I want to make this domain my money site. The purpose of the website is to serve technology information like WordPress tutorials, free software, and drivers.
I also set up a subdirectory on this domain, like https://maindomain.com/subdirectory/, which I made for free software such as graphics driver downloads (NVIDIA or AMD). What do you think of this website? Does it make sense?
Also, I just added this domain to my campaign at Moz, and the results show my subdirectory at a crawl depth of 6. Is that good for a directory, or do I need to move the subdirectory content to my main site?
Thank you; I hope someone can clear up my confusion.
Best regards, Matthew.
-
Unless the content which earned the authority is very similar to the new content, Google might nuke the site's rankings anyway. I hope you also checked the site's historic SEO performance (traffic, ranking keywords), as authority and links don't necessarily mean the site is worth a damn.
If the site had loads of good links but blocked Google from crawling it, then that authority may never have translated into SEO ranking power. If the site had strong SEO authority, that may have been offset by previous Google penalties and the like, which don't reduce the authority metrics but can entirely nullify the benefit of having them.
The days of just buying a site, putting your own stuff on it, and having it do equally well are pretty much gone. Maybe you will be lucky; I don't know.
If Google does reset your SEO authority, I don't think you'll get it back again with that kind of site. Free driver sites that push driver software (regmechanic, Secunia PSI) are ten a penny, and the best of them are 'iffy' at best (users often get issues like the wrong bit-architecture of a driver being installed for their OS, e.g. a 32-bit driver installed on a system that supports a 64-bit driver).
Some providers developed very powerful scanners, but they mostly evolved into software products rather than more shallow websites. It's unlikely that in 2019, a site without a solid value proposition (what unique value does it add to the web?) will take off or become Google-popular.
If your SEO authority doesn't get nuked you might have a shot, but that will depend upon how similar your new content is to the old content - in mathematical terms, not human terms (e.g. string-similarity comparison stats for whole content pieces).
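To make that "mathematical similarity" idea concrete, here's a minimal sketch (my own illustration, not Google's actual algorithm) using Python's difflib to score how similar two content pieces are:

```python
# Illustrative sketch only - not Google's actual algorithm. It uses
# Python's difflib to score how similar two content pieces are, which
# gives a feel for "mathematical" (rather than human) similarity.
from difflib import SequenceMatcher

def content_similarity(old_text: str, new_text: str) -> float:
    """Return a similarity ratio between 0.0 (unrelated) and 1.0 (identical)."""
    return SequenceMatcher(None, old_text.lower(), new_text.lower()).ratio()

# Placeholder snippets standing in for the old site's content and the new
old_snippet = "Specifications and reviews for consumer graphics cards."
new_snippet = "WordPress tutorials and free graphics driver downloads."
score = content_similarity(old_snippet, new_snippet)
print(f"Similarity: {score:.2f}")  # closer to 1.0 = more similar
```

A real comparison would run across whole pages rather than single sentences, but the principle is the same: the closer the new content scores to the old, the more likely the authority is to carry over.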
It's also kind of redundant when most manufacturers produce software which automatically keeps all their drivers up to date anyway (e.g. GeForce Experience). Those bespoke tools often do a much better job of driver installation too! One tool for all your PC updates has been a holy grail for a decade; IMO it's a bit of a wild goose chase.
Also read this post: https://moz.com/community/q/why-would-my-page-have-a-higher-pa-and-da-links-on-page-grade-still-not-rank
Related Questions
-
Webshop landing pages and product pages
Hi, I am doing extensive keyword research for the SEO of a big webshop. Since the shop sells technical books and software (legal books, tax software and so on), I come across a lot of very specific keywords for separate products. Isn't it better to try to rank in the SERPs with all the separate product pages, instead of with the landing (category) pages?
Intermediate & Advanced SEO | Mat_C0
-
On-Page Content Has an H2 Tag, but Should I Also Use H3 Tags for the Subheadings Within This Body of Content?
Hi Mozzers, my on-page content comes under my H2 tag. I have a few subheadings within my content to help break it up, and currently these are just underlined (not bold or anything). I am wondering, from an SEO perspective, should I be making these subheadings H3 tags? Otherwise, I just have 500-750 words of content under an H2 tag, which is what I am currently doing on my landing pages. Thanks, Pete
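For reference, the structure being asked about would look something like this (an illustrative sketch; the headings and copy are placeholders):

```html
<!-- Illustrative landing-page structure: subheadings marked up as H3s
     under the page's H2, instead of underlined plain text -->
<h2>Main landing page topic</h2>
<p>Opening paragraph of the 500-750 words of content...</p>

<h3>First subheading</h3>
<p>Content for this section...</p>

<h3>Second subheading</h3>
<p>Content for this section...</p>
```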
Intermediate & Advanced SEO | PeteC120
-
Big discrepancies between pages in Google's index and pages in sitemap
Hi, I'm noticing a huge difference between the number of pages in Google's index (using a 'site:' search) and the number of pages indexed by Google in Webmaster Tools (i.e. 20,600 in the 'site:' search vs 5,100 submitted via the dynamic sitemap). Does anyone know the possible causes for this and how I can fix it? It's an ecommerce site, but I can't see any issues with duplicate content - they employ a very good canonical tag strategy. Could it be that Google has decided to ignore the canonical tag? Any help appreciated, Karen
Intermediate & Advanced SEO | Digirank0
-
Using Webmaster Tools to Redirect Domain to Specific Page on Another Domain
Hey everyone, we redirected an entire domain to a specific URL on another domain (not the homepage). We used a 301 redirect, but I'm also wondering if I should use the Google Webmaster Tools "Change of Address" section as well. There is no option to redirect the old domain to a specific URL on the new domain within the "Change of Address" section. Thoughts?
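For reference, a whole-domain-to-one-URL redirect like the one described might look like this on an Apache server (an assumption - the directive placement and target URL below are placeholders; adjust for your own setup):

```apache
# Hypothetical .htaccess on the old domain: send every request,
# regardless of path, to one specific page on the new domain
RedirectMatch 301 ^/(.*)$ https://www.newdomain.com/specific-page/
```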
Intermediate & Advanced SEO | M_D_Golden_Peak0
-
Handling Similar page content on directory site
Hi all, SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and was wondering the best way to go about this. I was thinking I could put rel="nofollow" on all the links to those pages, but I'm not sure that is the correct way to do it. Since the folders are deep within the site and not under one main folder, it would mean a disallow for many folders if I did this through robots.txt. The other option I'm considering is a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this, so I can eliminate these duplicate pages from my SEO report and from the search engine index? Thanks!
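The meta noindex, follow option mentioned would be a tag like this in the `<head>` of each city page (illustrative snippet):

```html
<!-- Keeps the page out of the search index while still letting
     crawlers follow, and pass equity through, its links -->
<meta name="robots" content="noindex, follow">
```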
Intermediate & Advanced SEO | cchhita0
-
Which page to target? Home or /landing-page
I have optimized my home page for the keyword "computer repairs". Would I be better off targeting my links at this page, or at an additional page (which already exists) called /repairs? It's possible to rename and 301 this page to /computer-repairs. The only advantage I can see from targeting /computer-repairs is that the keywords are in the target URL.
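If the rename-and-301 route is taken, the redirect itself is a one-liner, e.g. in Apache (assuming an Apache server; paths are those from the question):

```apache
# Hypothetical .htaccess rule: permanently redirect the old page
# to the keyword-rich URL
Redirect 301 /repairs /computer-repairs
```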
Intermediate & Advanced SEO | SEOKeith0
-
How to Disallow Specific Folders and Sub Folders for Crawling?
Today, I checked the indexing for my website in Google and found a very interesting result, which you can see in the following Google search result: Google Search Result. I'm aware of how to use the robots.txt file and could disallow the images folder to solve this issue, but that may block my images from appearing in Google Image search. So, how can I fix this issue?
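One commonly suggested pattern for this situation is to disallow the folder for crawlers in general while explicitly allowing Google's image crawler (illustrative robots.txt sketch - the folder name is a placeholder for whatever folder your site actually uses):

```
# Hypothetical robots.txt: block the images folder for crawlers
# generally, but let Googlebot-Image fetch images for image search
User-agent: Googlebot-Image
Allow: /images/

User-agent: *
Disallow: /images/
```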
Intermediate & Advanced SEO | CommercePundit0
-
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors (~9,000) on our eCommerce website. The site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content/titles due to URLs like domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is Rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
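A wildcard rule such as `Disallow: /shop/*?` is the usual way to cut off crawling after the ".html?" (Google's crawler honours the `*` wildcard). The sketch below is my own emulation of how such a pattern is matched against the example URLs, purely as an illustration, not Google's actual implementation:

```python
# Illustrative emulation of how a robots.txt wildcard rule such as
# "Disallow: /shop/*?" is matched against URL paths: '*' matches any
# run of characters and '$' anchors the end of the URL.
import re

def rule_matches(disallow_pattern: str, url_path: str) -> bool:
    """Return True if a robots.txt Disallow pattern blocks url_path."""
    regex = re.escape(disallow_pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # honour the end-of-URL anchor
    return re.match(regex, url_path) is not None

# The parameterised URLs are blocked, the clean page stays crawlable:
print(rule_matches("/shop/*?", "/shop/leather-chairs.html?brand=244&cat=16"))  # True
print(rule_matches("/shop/*?", "/shop/leather-chairs.html"))                   # False
```

In practice you'd add the rule to robots.txt and verify it with Google Search Console's robots.txt tester rather than hand-rolled code; canonical tags or parameter handling are gentler alternatives if the parameterised pages carry link equity.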
Intermediate & Advanced SEO | MonsterWeb280