Will blank category pages automatically get updated?
-
Hello,
We've got old category pages that are now blank, like:
domain/shoes.html (blank white page, no longer in the menu)
domain/newshoesurl.html (working URL with a link in the menu)
Will the blank pages be automatically deindexed and updated by Google?
-
Thanks Mike,
I just checked using the seobook.com tool; they're 404s.
No, no link equity is lost by not 301ing them.
GWT shows no crawl errors.
-
What HTTP status codes are they returning? If they return a 200 but provide a poor user experience, they'll eventually be deindexed, or at least rank so poorly that no one will run across them. If they return a 404 or 410, they'll eventually be deindexed. If they return a 500, you likely need to fix some other things on your site. Also, are you losing any link equity by not 301ing them?
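A quick way to check status codes in bulk is a small script along these lines; this is a minimal sketch, and the URLs and advice strings are placeholders, not from this thread:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def advice_for_status(status: int) -> str:
    """Map an HTTP status code to the follow-up suggested above."""
    if status == 200:
        return "indexed but thin: improve, redirect, or remove"
    if status in (404, 410):
        return "will be deindexed eventually; 301 if the page has link equity"
    if status in (301, 302, 307, 308):
        return "already redirecting; confirm the target is the right page"
    if status >= 500:
        return "server error: fix the site before worrying about indexing"
    return "check manually"

def check_url(url: str) -> int:
    """Return the HTTP status code for a URL via a HEAD request."""
    req = Request(url, method="HEAD", headers={"User-Agent": "status-audit"})
    try:
        return urlopen(req, timeout=10).status
    except HTTPError as err:
        return err.code

# No-network demo of the classification:
for code in (200, 404, 410, 500):
    print(code, "->", advice_for_status(code))
```

To audit real pages, feed `check_url` the old category URLs (e.g. `check_url("https://example.com/shoes.html")`, a placeholder) and bucket them by the advice returned.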
Related Questions
-
Three-word brand name + SEO: will I be losing out on organic searches with spaces?
Hello, we're starting a new website and the company name has three words. We've decided in the brand guide that the name will be written without spaces whenever it appears in copy. Will we have difficulty ranking for both versions (with and without spaces)? Thanks
White Hat / Black Hat SEO | jessicarechkemmer0
-
Do more or fewer pages help SEO?
Hi all, I have read some articles suggesting that sites with fewer pages are favoured by Google. I'm not sure about this: with limited pages we can only target limited keywords, while with more pages there may be a risk of Google treating them as doorway pages. Yet one of our competitors has many pages, including a dedicated page for every keyword, and their website ranks high for all of those keywords. I can even see three pages created with different phrases for the same keyword. If fewer pages are good, how does this work for our competitor? Thanks
White Hat / Black Hat SEO | vtmoz0
-
How do I optimize pages for content that changes every day?
Hi guys, I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend. However, I'm stuck on how the pages should be structured, and I don't know how to write title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page; you can see today's horoscope here: http://bit.ly/1FV6x0y. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily page for Friday, so duplicate pages appear across Friday, Saturday, and Sunday: if you click on "today", "tomorrow", and "weekend", all the pages shown are duplicates, and this happens for each star sign. My questions are: will I be penalized for this, even though the content changes? And how can I optimize the title tags and meta tags for pages that are constantly changing? I'm really stuck on this one and would appreciate some feedback. Thanks in advance
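One common pattern for title tags on date-driven pages (a sketch of my own, not an answer from this thread; the site name and title format are hypothetical) is to generate the title from the star sign and the date, so every daily page gets a unique, predictable title:

```python
from datetime import date, timedelta

SITE = "ExampleSite"  # hypothetical site name

def daily_title(sign: str, day: date) -> str:
    """Unique title for each daily horoscope page."""
    # e.g. "Aries Horoscope for Monday, 2 March 2015 | ExampleSite"
    return (f"{sign} Horoscope for {day.strftime('%A')}, "
            f"{day.day} {day.strftime('%B %Y')} | {SITE}")

def weekend_title(sign: str, friday: date) -> str:
    """One shared title for the Friday-to-Sunday weekend page."""
    sunday = friday + timedelta(days=2)
    return (f"{sign} Weekend Horoscope, {friday.day} {friday.strftime('%B')}"
            f" to {sunday.day} {sunday.strftime('%B %Y')} | {SITE}")

print(daily_title("Aries", date(2015, 3, 2)))
print(weekend_title("Aries", date(2015, 3, 6)))
```

For the Friday-to-Sunday duplicate issue, pointing the Saturday and Sunday URLs at the weekend page with rel=canonical is one common approach, since the content is the same.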
White Hat / Black Hat SEO | edward-may0
-
Dynamic content boxes: how can I use them without getting a duplicate content penalty?
Hi everybody, I am starting a project with a travel website which has some standard category pages like Last Minute, Offers, Destinations, Vacations, and Fly + Hotel. Every category contains many destinations, each with a corresponding landing page, such as Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc. My question: to simplify my job, I am thinking about writing some dynamic content boxes for Last Minute, Offers, and the other categories, changing only the destination city (Rome, Paris, New York, etc.), repeated X times in X different combinations inside the content box. This would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about getting penalized for duplicate content. Do you think my solution could work? If not, what would you suggest? Is there a rule for classifying content as duplicate (for example, the number of identical words in a row)? Thanks in advance for your help! A.
White Hat / Black Hat SEO | OptimizedGroup0
-
Is it wrong to have the same page represented twice in the nav?
Hi Mozzers, I have a client with three pages represented twice in the nav. They are not duplicates, since both menu entries point to the same URL. It seems odd, but I guess it makes sense for my client since these pages could fall into multiple categories. Is this bad practice for SEO, or is it just a waste to have them in the nav twice? Should I ask them to eliminate the extra entries? Thanks!
White Hat / Black Hat SEO | Ideas-Money-Art0
-
Page Rank is 0
Hi. Can you please point me in the right direction concerning a site whose default page has a PR of 0? There do not appear to be any errors in the robots.txt file (that I can tell). When I ran a duplicate content check by searching for the title tag and first sentence in quotes, it did not return more than two sites. When I ran a site: search, it reported 287,000 results. Does this mean that they purchased links and have now been penalized? Where should I go from here? Thank you for any feedback and assistance.
White Hat / Black Hat SEO | JulB0
-
Does your website get downgraded if you link to a lower quality site?
My site has a PR of 4. My friend's site has a PR of 2, but I think he is using some black hat SEO techniques. I wanted to know whether the search engines would ding me for linking to (i.e., validating) a lower quality site.
White Hat / Black Hat SEO | jamesjd70
-
Farmer Update Case Study. Please question my logic here. (Very long!)
Hi SEOmoz community! I would like to offer a small (well...) case study of a Farmer victim and some logical conclusions of mine, which you are more than welcome to shred to pieces. So, I run MANY sites ranging from low to super quality, and I actually have a few that have been hit by Farmer, but this particular site had me scratching my head as to why it was torched. Quick background: the site is in a very competitive niche and has been around since 2004, initially as a forum site, but since 2005 also as a content-driven site. It is an affiliate site that has been ranking top 5 for many high-value commercial keywords and has a big long tail of informational keywords. The link profile is a mix of natural, good links and purchased links from sources of varying quality. The content is high-quality written articles, how-tos, blog posts, etc. by in-house pro writers, plus UGC from a semi-active forum (20-30 posts a day). Farmer: after Farmer, this site's vertical looks pretty much the same as before, with the biggest exception being my site. I quickly ruled out low-quality content (spider food) and focused instead on technical reasons. I took this approach since this site isn't the most well-kept site I have, and I figured the crappy CMS + phpBB might have caused issues. I didn't want to waste my time crawling the site myself, so I quickly downloaded all the URLs that Majestic had crawled. To my surprise, Majestic's crawler returned over 3 million URLs, when the real number is likely 30-40k and Google has about 20k indexed. After scanning through the URL file I knew I had issues: massive amounts of auto-generated duplicate pages from the forum and so on. By adding around 20 new lines to robots.txt I was able to block millions of pages from being crawled again. My logic: so now I think I've found what caused the drop. Millions of duplicate pages and empty pages could have tripped the Farmer algorithm update into treating the site as low quality or duplicated, or as just trying to feed the spiders with uselessness.
The WEAK point in this logic is that I can't prove that Google even knew about those pages (or was smart enough to ignore them). Google WMT tells me they've crawled an average of around 10k pages over the last 90 days. Given this, I'm doubting my logic and whether I've really found the issue. My next step is to see if this gets resolved algorithmically; if not, I feel I have a legitimate case to submit a reinclusion request, but I'm not sure. Since I haven't been a contributing member of this community I'm not looking for direct help with my site, but hopefully this can spark some discussion about Farmer and maybe some flaming of my logic regarding the update 🙂 So, would any of you have drawn similar conclusions? (Sweet blog bro!)
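For context, the kind of robots.txt additions described above, for a phpBB forum, might look like the following. These paths and patterns are hypothetical examples of typical phpBB duplicate sources, not the actual 20 lines used:

```
User-agent: *
# phpBB utility pages that generate near-duplicate or empty URLs
Disallow: /forum/search.php
Disallow: /forum/memberlist.php
Disallow: /forum/posting.php
Disallow: /forum/viewonline.php
# session-ID and print-view duplicates of real pages
Disallow: /*?sid=
Disallow: /*view=print
```

Note that the `*` wildcard in paths is honored by Googlebot but is not part of the original robots.txt convention, so other crawlers may ignore those last two rules.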
White Hat / Black Hat SEO | YesBaby0