Massive Amount of Pages Deindexed
-
On or about 12/1/17, a massive number of my site's pages were deindexed. I have done the following:
- Ensured all pages are "index,follow"
- Ensured there are no manual penalties
- Ensured the sitemap correlates to all the pages
- Resubmitted to Google
- ALL pages are gone from Bing as well
In the new Search Console interface, there are 661 pages that are Excluded, with 252 being "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help!
The URL is https://www.hkqpc.com
-
The report was run prior to the canonical directives being added.
Anytime. Also, remember to noindex your robots.txt file:
https://yoast.com/x-robots-tag-play/
There are cases in which the robots.txt file itself might show up in search results. By using an alteration of the previous method, you can prevent this from happening to your website:
<FilesMatch "robots.txt">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
**And in Nginx:**
location = /robots.txt { add_header X-Robots-Tag "noindex"; }
-
Looking at the first report, "Redirect Chains": as I understand the table, these are correct.
Column A is the page (source) with the redirecting link
Column B is the link that is redirecting (http://www.hkqlaw.com)
Column C shows 2 redirects happening
Column I shows the first redirect (http://www.hkqlaw.com -> http://www.hkqpc.com) (non-SSL version)
Column N shows the second redirect (http://www.hkqpc.com -> https://www.hkqpc.com) (SSL version)
The original link (hkqlaw.com) is a link in the footer of our news section, so it is common on those pages, which is why it shows up so often. So, like I said, this appears to be correct.
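(For what it's worth, if we ever want to drop the extra hop, the hkqlaw.com forward could point straight at the https host. A rough .htaccess sketch, assuming Apache with mod_rewrite; this is not necessarily how the forward is configured today:)
RewriteEngine On
# send any hkqlaw.com request straight to the https version of hkqpc.com in one hop
RewriteCond %{HTTP_HOST} ^(www\.)?hkqlaw\.com$ [NC]
RewriteRule ^(.*)$ https://www.hkqpc.com/$1 [R=301,L]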
I added the canonical directives to the pages earlier so perhaps that report was run prior to me doing that?
Again, thanks so much for your effort in helping me!
-
Now I'm really baffled. I just ran Screaming Frog and don't see any of the redirects or other stats. Which software are you using that is showing this information? I'm trying to replicate it and figure out if there's something, somewhere else doing this.
-
Wow, I got it.
You're 301 redirecting a ton of URLs back to the homepage:
- Redirect chains: https://bseo.io/cZW0w0
- Internal URLs: https://bseo.io/4sFqUk
- Insecure content: https://bseo.io/YDDKGD
- No canonical: https://bseo.io/fWey1Q
- Crawl overview: https://bseo.io/Zg6bpM
- Canonical errors: https://bseo.io/YtTh7W
-
OK, a canonical is now set for each page (and I fixed the // issue). I used an X-Robots-Tag header to noindex the robots.txt and sitemap.xml files, along with a few other extensions while I was at it.
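For reference, the rule looks roughly like this in .htaccess (a minimal sketch assuming mod_headers is enabled; the exact file pattern on the live server may differ):
<FilesMatch "(robots\.txt|sitemap\.xml)$">
    # tell crawlers not to index these files themselves
    Header set X-Robots-Tag "noindex"
</FilesMatch>
A quick check is curl -I https://www.hkqpc.com/robots.txt and looking for the X-Robots-Tag: noindex response header.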
I'll get the secured cookie header set after this is resolved. We don't store any sensitive data via cookies for this site so it's not of immediate concern but still one I'll address.
EDIT: The https://www.hkqpc.com/attorney/David-Saba.html/ page no longer exists, which was the cause of the errors. I've redirected it to the appropriate page.
-
https://cryptoreport.websecurity.symantec.com/checker/
This server cannot be scanned for these vulnerabilities:
- Heartbleed: server scan unsuccessful (see possible causes).
- POODLE (TLS): server scan unsuccessful (see possible causes).
- BEAST: this server is vulnerable to a BEAST attack.
I am sorry; I said your IP was Network Solutions when it was 1&1. I still strongly recommend changing hosting companies, even though I am German and so is 1&1.
DNS resolves www.hkqpc.com to 74.208.236.66
The SSL certificate used to load resources from https://www.hkqpc.com will be distrusted in M70. Once distrusted, users will be prevented from loading these resources. See https://g.co/chrome/symantecpkicerts for more information.
Look: https://cl.ly/pCY5
Look: https://cl.ly/pAKa
Symantec SSL certificates are now owned by DigiCert:
https://www.digicert.com/help/
https://www.dareboost.com/en/report/5a70b33e0cf28f017576367f
The Set-Cookie HTTP header can be configured with your Apache server. Make sure that the mod_headers module is enabled. Then you can specify the header (in your .htaccess file, for example). Here is an example:
<IfModule mod_headers.c>
    # only for Apache > 2.2.4:
    Header edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
    # lower versions:
    Header set Set-Cookie HttpOnly;Secure
</IfModule>
- Your robots.txt file is showing up in the SERPs (big photo: https://i.imgur.com/cJeDR9t.png)
- Your XML sitemap is showing up in the SERPs and should be noindexed (big photo: https://i.imgur.com/tlx5jc7.png)
Double forward slashes after /verdicts/ serve the same page as the URL without the double forward slashes, so you need to add rel=canonical tags; right now there are zero canonicals on any page whatsoever.
- https://www.hkqpc.com/news/verdicts//hkq-attorneys-win-carbon-county-real-estate-case/
- https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/
The URLs above need a rel=canonical tag; I have created an example below for you, pointing at the page without the double forward slashes. This tells Google which version you'd prefer to have indexed, and it also keeps query-string pages and other junk pages out of Google's index. Please see the resources below and add canonicals to your website. Because I do not know what type of CMS you're using, I cannot recommend a plugin to do it; if you were using something like WordPress, it would be handled automatically by something like Yoast SEO. For a site of this size, it may be a wise move to migrate to something like WordPress: it is a solid platform and makes it a lot easier to implement changes across the entire site quickly.
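Something like this in the <head> of the page (a minimal sketch; adjust the href if you prefer a different version to be indexed):
<link rel="canonical" href="https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/" />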
- https://moz.com/blog/complete-guide-to-rel-canonical-how-to-and-why-not
- https://yoast.com/rel-canonical/
- https://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
You need to add a canonical. There is also a problem on this page:
- Bigger photo of the problem: https://i.imgur.com/1qMMPSM.png
- The page: https://www.hkqpc.com/attorney/David-Saba.html/
- Warning: Creating default object from empty value in /homepages/43/d238880598/htdocs/classes/class.attorneys.php on line 38
- Warning: Invalid argument supplied for foreach() in /homepages/43/d238880598/htdocs/headers/attorney.php on line 15
- **Fixes for this:**
- https://stackoverflow.com/questions/14806959/how-to-fix-creating-default-object-from-empty-value-warning-in-php
- http://thisinterestsme.com/invalid-argument-supplied-for-foreach/
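The usual pattern behind both fixes looks something like this (a hypothetical sketch only; I have not seen class.attorneys.php or attorney.php, so the variable names are illustrative):
// create the object explicitly instead of relying on PHP to build
// a "default object from empty value" when a property is assigned
$attorney = new stdClass();
$attorney->name = $row['name'];

// make sure foreach only ever receives an array
$practiceAreas = isset($attorney->practiceAreas) ? $attorney->practiceAreas : array();
if (is_array($practiceAreas)) {
    foreach ($practiceAreas as $area) {
        // ...
    }
}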
You have a Heartbleed warning from the scanner as well: "An unknown error occurred while scanning for the Heartbleed Bug."
-
Thanks for the great feedback! The hkqlaw.com URL simply forwards (301) to hkqpc.com. The IP address you have is for hkqlaw.com, which is registered through Network Solutions, but hkqpc.com is hosted on 1and1.com. Also, the timeout error you're getting is because there is no SSL cert for hkqlaw.com; again, it's just forwarded to hkqpc.com (which does have an SSL cert attached to it). As far as Search Console goes, everything is set up to index hkqpc.com.
-
Right now I cannot get that site to load in my browser, and when I used https://tools.pingdom.com it was unable to load as well. You could be having some serious server problems, and that could be causing the issue, although I was able to run it through Screaming Frog, which is surprising.
Here is a zip file of your Screaming Frog results; it will show if there are any noindexed pages, of which I found none. It looks to me like you have a server issue. Zip file: http://bseo.io/BXYpZh
I checked your site for malware using https://sitecheck.sucuri.net/results/www.hkqlaw.com/ (please understand this only checks the homepage and a handful of other pages) and found none. However, when I checked your IP address, I noticed a lot of ransomware information tied directly to it:
https://ransomwaretracker.abuse.ch/ip/205.178.189.131/
Here is a large screenshot of when I tried to browse your website: https://i.imgur.com/OzcLhbx.png
Here is Pingdom (remember to test from something outside of your local computer, because caching and other things on your machine could give you incorrect results):
https://tools.pingdom.com/#!/bd6d52/https://www.hkqlaw.com/
In my experience, Network Solutions hosting is terrible. I would strongly suggest doing two things:
1. Get a better hosting company for your site.
Good managed hosts that are not too expensive include Liquid Web, Cloudways, Rackspace, and Pairnic. You can also build out your own system on unmanaged hosting like Linode, DigitalOcean, AWS, Google Cloud, or Microsoft Azure. If you want a high-quality, inexpensive managed host that offers more than one backend like the ones I've listed above, https://www.cloudways.com/en/ will host anything and manage it, and you can use the backends mentioned before. If you want what I think is the best, and price is not a big deal considering you're not running WordPress, https://armor.com is my preferred hosting company. Otherwise, Cloudways or Liquid Web would be where I would host your site.
2. Add a web application firewall/reverse proxy in front of the site. Considering you already have an IP address attached to ransomware, and you're using a hosting company that will not be beneficial to you in security terms, a WAF is worth it. You can do that with https://sucuri.net/website-firewall/, https://incapsula.com, or https://fastly.com; if you want the most basic and least secure option that is still better than what you have, use https://cloudflare.com.
At the very least put Cloudflare on there. But what I'm seeing is a severe problem coming from your web host, and knowing that hosting company, I would strongly advise you to move to a better host.
I hope this was of help,
Thomas
-
Not sure if this is of help to you (I suppose it depends on how many pages you are expecting to be indexed), but according to John Mueller at Google, Google does not necessarily index all pages.
https://www.seroundtable.com/google-index-all-pages-20780.html
-
Not recently. It migrated well over a year ago to HTTPS.
-
First thing to confirm - did you recently migrate to HTTPS?