There is nothing wrong with unlisted pages; keeping content behind login screens or procedurally generated pages like carts unlisted is even recommended. If something is not listed, Google will not see it, so it will add no value to your site from a search perspective.
Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Latest posts made by Tenlo
-
RE: Unlisted (hidden) pages
-
RE: Deep linking with redirects & building SEO
Set up canonical tags and use Google's UTM parameters to track your external links (your internal links will be tracked just fine by Google Analytics without any need for redirects).
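To make that concrete, UTM codes are just standard query-string parameters appended to a URL. A minimal sketch (the source/medium/campaign names here are made up for illustration):

```python
from urllib.parse import urlencode, urlparse, urlunparse

def add_utm(url, source, medium, campaign):
    """Append the standard UTM tracking parameters to a URL."""
    parts = urlparse(url)
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string the URL already has
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

tagged = add_utm("https://example.com/landing", "newsletter", "email", "spring_sale")
print(tagged)
# https://example.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Google Analytics will then attribute visits through that link to the named source, medium, and campaign.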
-
RE: Two divisions, same parent company, identical websites
That sounds rough. What you will want to do is alter the content of your single-city website to reflect that you serve that city; then, when Google is looking for a match for a searcher near that city, it should see that site as the best match due to the weight it puts on geolocation. In the long run, you will want to rewrite all of the content on one site so that your two sites are not hurting each other or looking like copy/paste spam sites.
-
RE: Google Indexed a version of my site w/ MX record subdomain
You should find the locations of those links and correct them to point to the proper URL. I find Screaming Frog's crawler is the easiest tool for this: it can find every link and show you where it is located.
-
RE: Our crawler was not able to access the robots.txt file on your site
There are two parts of your robots.txt that could be causing this, and it all depends on how each bot interprets the wildcard patterns in your robots.txt:
First, your Disallow: /? may not be read the way you intend. Try replacing it with Disallow: /*? - that is, "/" followed by zero-to-infinity characters "*" and one character "?" - so that nothing with a query string gets crawled (which is what I believe you were going for).
Second, you have an empty Disallow line followed by User-agent: rogerbot, and while it should not be read this way, once again it all depends on how each bot parses the directives. To fix this, change the Disallow following your Googlebot-Image line to Disallow: /
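Putting both fixes together, a hypothetical corrected robots.txt might look like this (the user-agent groups are illustrative, not your actual file):

```
# Block image crawling entirely for Googlebot-Image
User-agent: Googlebot-Image
Disallow: /

# Allow rogerbot everywhere (empty Disallow = no restriction)
User-agent: rogerbot
Disallow:

# For everyone else, block any URL containing a query string
User-agent: *
Disallow: /*?
```

Keeping each User-agent group self-contained like this avoids the ambiguity of a dangling Disallow line being attributed to the wrong bot.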
-
RE: Is toggle Good For seo
If your content cannot be seen without logging in or getting past a "gate", then in general Google will not send people to that content. If you are planning to spoof it so that Googlebot can crawl the content but people cannot see it, don't: Google will find out, and at best you will see all of your rankings tank. Google wants to serve people the best, easiest content that gives them what they are looking for, so if you are not openly providing that content, Google will not send you traffic for those queries.
-
RE: Any solution to low search traffic on weekend
This is generally normal for most industries: fewer people are on computers on the weekends, so you see a decrease not just in search but in overall traffic. The same thing occurs any time there is a holiday during the week (Thanksgiving, Christmas).
-
RE: "Get price" vs "Request a Quote"
It will really depend on your area and the language your customers use. I have seen the same CTA wording in the same region perform both amazingly well and far below expectations, and it came down to what the end product was and the language people in the area used when looking for those services. Your best bet is to run an A/B test yourself and see which CTA your potential customers respond to better.
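When you run that A/B test, a two-proportion z-test is one common way to check whether the difference between the two CTAs is real or just noise. A small sketch (the conversion numbers are made up):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant A's conversion rate
    significantly different from variant B's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# "Get price": 120 conversions out of 2,000 views
# "Request a Quote": 90 conversions out of 2,000 views
z = two_proportion_z(120, 2000, 90, 2000)
print(round(z, 2))  # z is about 2.13; |z| > 1.96 suggests significance at the 95% level
```

Until |z| clears that threshold, keep the test running rather than declaring a winner early.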
-
RE: Will shortening down the amount of text on my pages affect it's SEO performance?
It could, but it could also help your rankings. The amount of text matters less than how relevant Google considers it to the search query, so if you are removing superfluous words and redundant passages, you should see an increase as your pages become more user-friendly. If you remove good copy that was considered helpful, you would expect to see decreases.
-
RE: Is Keyword Density Still Relevant?
No. You should not be stuffing keywords or worrying about density; make your content user-friendly and useful, based on the keyword phrases you want to attract visitors with.
Best posts made by Tenlo
-
RE: Can I add external links to my sitemap?
The point of a sitemap is to tell Google what is on your site so it can index it more easily. There is no reason to put external URLs in your sitemap - search engines will generally ignore URLs that are not on your own host anyway.
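For reference, a minimal sitemap.xml only ever lists pages on your own host (example.com is a placeholder here):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
  </url>
</urlset>
```

Every `<loc>` entry should be a canonical URL on the same domain the sitemap is served from.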