Page content not being recognised?
-
I moved my website from Wix to WordPress in May 2018. Since then, it has disappeared from Google searches. The site and pages are indexed, but no longer ranking.
I've just started a Moz campaign, and most pages are being flagged as having "thin content" (50 words or less), when I know that there are 300+ words on most of the pages.
Looking at the page source I find this bit of code: page contents
Does this mean that Google is finding this and thinks that I have only two words (page contents) on the page? Or is this code to grab the page contents from somewhere else in the code? I'm completely lost with this and would appreciate any insight.
-
That's great! I'm glad to hear that.
-
Solved the duplicate title tags issue. I'm using the Divi theme in WordPress. I couldn't find the code I mentioned above in the header.php file, but when I went to the Divi theme options and clicked the Integration tab, the code was in the field below the Bing verification code. It's removed now and everything is fine.
-
I guess time will tell; meanwhile I have plenty of other issues to work on.
I'm still trying to find out where that extra code is coming from.
-
Sorry I couldn't attach the screenshot. I wouldn't panic too much, though: websites often experience a loss in rankings after a redesign, so I'd give it some time and see if rankings improve. In the meantime, I'd look at updating the code to remove the extra title tag and fix the body tags.
-
Thanks again. The content does render correctly in Search Console, so maybe I'm panicking about nothing. I can't find the extra title tag in the theme code, but I'll keep looking.
-
The issue is that there are duplicate title tags on the site. You could have a developer remove the extra title tag that says "Your SEO optimized title". Then you would need to work with a developer to correct the <body> tag issue as well; currently there isn't any content within the <body> tag, just leftover generic text ("page contents"). I'm not a developer, but it appears to be an issue with the theme, and these look like generic default settings. You can view this on any page of your site by pressing Ctrl+U on a PC, then pressing Ctrl+F and searching for <title> or <body>, and you should be able to see the code issues.
The Moz report could be incorrect. I looked, and you definitely have more than 50 words on certain pages, but the majority of the pages are thin content. I wouldn't pay too much attention to that report.
Also, just because Moz or another tool can't recognize certain content doesn't mean Google isn't actively crawling and indexing it. To double-check, take a page where Moz is telling you there are 50 words or no content and do a "fetch and render" in Google Webmaster Tools. This will show you how Google is viewing the page and will be a more accurate representation of what it sees.
I hope that helps clear up the situation a bit more. Like I said, I'm not a developer, so that's my best guess as to what's going on.
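A quick way to automate the Ctrl+U check described above: a minimal Python sketch (standard library only, my own illustration rather than part of the original answer; the sample HTML is invented to mirror the reported symptoms) that counts `<title>` tags and pulls out the visible body text:

```python
from html.parser import HTMLParser

class TitleAudit(HTMLParser):
    """Counts <title> tags and collects visible <body> text."""
    def __init__(self):
        super().__init__()
        self.title_count = 0
        self.titles = []
        self.body_text = []
        self._in_title = False
        self._in_body = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.title_count += 1
            self._in_title = True
        elif tag == "body":
            self._in_body = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "body":
            self._in_body = False

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self._in_title:
            self.titles.append(text)
        elif self._in_body:
            self.body_text.append(text)

# Invented sample HTML mirroring the symptoms described in this thread:
sample = """<html><head>
<title>Your SEO optimized title</title>
<title>Actual Page Title</title>
</head><body>page contents</body></html>"""

audit = TitleAudit()
audit.feed(sample)
print(audit.title_count)   # 2 -> duplicate <title> tags
print(audit.body_text)     # ['page contents'] -> only placeholder text in the body
```

Run it against a saved copy of any page (view source, save as .html) to confirm whether the duplicate title and the placeholder body text really are what the crawlers see.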
-
Hi Jordan,
Thanks for taking the time to look. Some great suggestions from you there that I've added to my list of things to do.
I have removed the Yoast SEO plugin, which may have been causing the duplicate titles issue, but the issue persists.
About the content: I know my pages are a bit thin, as you point out above, but what concerns me is that Moz reports fewer than 50 words on each page, which is not the case, and another tool reports that each page has only 13 characters of content. I can see this bit of code when I look at the page source, but I don't know where it's coming from.
<title>Your SEO optimized title</title>
…
page contents
I'm sure this is what's causing both issues. I don't want to add more content to the pages at this stage if crawlers can't see it; I'd rather fix this issue first.
-
I crawled your site and you have roughly 30 pages with under 300 words on them and the rest of your pages are under 1000 words or so. I'd recommend building out some of the content on these thinner pages.
You also have some paginated URLs with a /page/ folder in them that you could apply a noindex meta tag to and remove from Google.
I'd also look at disabling your current SEO plugin, because "Your SEO optimized title" appears on all the pages and is causing your duplicate title tag issue. And I know meta descriptions aren't a ranking factor, but all of yours are blank, and they're a great opportunity to drive extra click-throughs from Google search results.
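The noindex tag itself is just `<meta name="robots" content="noindex">` in the page head. As a rough illustration (my own sketch with an invented helper name, not a proper HTML parser; it assumes the name attribute comes before content), here's how you might spot-check that a paginated URL carries it:

```python
import re

# Crude pattern for a robots meta tag whose content includes "noindex".
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """Rough check for a robots noindex meta tag (illustrative only)."""
    return bool(NOINDEX_RE.search(html))

paginated = '<head><meta name="robots" content="noindex, follow"></head>'
normal = '<head><meta name="description" content="Some page."></head>'
print(has_noindex(paginated))  # True
print(has_noindex(normal))     # False
```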
I hope that helps a bit.
-
This is what I mean: if you look at the attached screenshot, you can see the words "page contents" at the top left. I think this is what all the tools are picking up and reporting back as my page content, ignoring everything else. It's not usually visible, but it appeared when I used the Moz toolbar.
You can also see my other problem, duplicate page titles: there is the actual page title, and another which says "your seo title". I don't know what's causing these issues and have no idea how to fix them.
-
Hi Jordan,
Thanks for the reply. I set up 301 redirects only for the pages whose names I'd changed. The domain format has changed from www.domain.co.uk to domain.co.uk (without the www), and although it seems to redirect fine, the page authority has dropped from 26 to 16 since the move, so something's not right there.
The new site was submitted and indexed in search console, so the pages are indexed.
All the tools that I've used report that my page content is only 13 characters. That's what makes me wonder if they're treating the "page contents" text from that bit of code as the actual content, and ignoring everything else on the page.
-
It's kind of hard to understand what's going on without looking at the source code. But my first question is: did you implement 301 redirects from your Wix site to your WordPress site? I believe Moz looks at the content within the <body> tags on your site. Also, did you set up your WordPress site in Search Console and submit your XML sitemap? That will let you see any issues with your content being indexed.
Back to the thin content issue: if you do indeed have thin content on your pages, it's possible you are being penalized. Google is pretty explicit about thin content that provides little or no value. I'd do an audit of your web pages using Screaming Frog or Moz and review the word count for some of your key pages.
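A minimal sketch of that word-count audit (my own standard-library version, not Screaming Frog or Moz; crawling is omitted, so feed it saved page HTML), flagging anything under a 300-word mark:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def word_count(html: str) -> int:
    p = TextExtractor()
    p.feed(html)
    return len(" ".join(p.chunks).split())

THIN_THRESHOLD = 300  # an assumed cutoff for "thin" pages, not an official number

page = "<html><body><p>" + ("word " * 40) + "</p></body></html>"
count = word_count(page)
print(count, "thin" if count < THIN_THRESHOLD else "ok")  # 40 thin
```

Counting the text this way (rather than the raw HTML) is closer to what a crawler reports, since script and style blocks don't count as page content.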
Hope that helps some.