Long title problem
-
I'm getting an incredible number of 4xx errors and long-title warnings from a small website (northstarpad.com): over 13k 4xx errors and almost 20k "title element is too long" warnings. The number keeps climbing, but the site shouldn't have more than a couple hundred pages. When I look at the 4xx errors, they are clearly being generated by some program, since the URLs have multiple, repeating keywords.
Here's an example:
I looked at the FTP files and plugins and couldn't see anything that could cause it, but I'm a beginner, so no surprise there. Any suggestions on where to look or how to fix this?
-
Thanks!
-
I can point you towards the best places online to find WordPress developers:
- https://clarity.fm/browse/technology/wordpress
- https://www.odesk.com/o/profiles/browse/skill/wordpress/
- http://premium.wpmudev.org/blog/find-wordpress-developer-designer/
Try those!
-
Hello Dan,
I've sent your message to a few web developers who apparently don't have a clue. Any suggestions on someone who could fix this?
Thanks!
-
Thanks a bunch Dan! I'm still clearly a neophyte, but am learning as I go so I appreciate your detailed responses. I'll get someone to work on the URLs right away.
It's also good to know Screaming Frog's limitations for the freebie.
-
I just wanted to clarify that the SEO plugin has nothing to do with this, and turning All in One SEO on/off will probably not fix anything.
Either you have the free version of Screaming Frog, which limits crawls to 500 URLs, or you need to adjust your crawl settings; my crawl was definitely heading toward the 57k mark.
-
The root of your issue is that there are links that are coded incorrectly
--> http://screencast.com/t/ndeKw3PL
which results in infinite crawling of pages that don't really exist, and thus the same duplicate/long title tags.
For example this page is a good URL: http://northstarpad.com/category/business-portrait-metro-detroit/
But as shown in my screenshot the "Pet Photography" image links to: http://northstarpad.com/category/business-portrait-metro-detroit/pet-photography// which is a bad URL and NOT http://northstarpad.com/pet-photography/ which is where it should link.
Essentially, your links should be "absolute" URLs (which include the full path), not "relative" ones
--> http://screencast.com/t/koL5QX9B
You'll need to pass this to a web dev who knows how to edit your WordPress theme files.
-
I crawled the site with Screaming Frog and it only showed 475 pages instead of the 57k+ from Moz, and none of them had 404 errors. Is it possible that Moz is crawling our WordPress pages? I looked in our settings for a way to exclude that URL, but couldn't see where to do it.
We'll also try deleting/reinstalling the All in One SEO pack to see if that helps.
-
Thanks a bunch Nakul! I'll let you know if any of your suggestions help me find the problem.
-
Looks like a problem with WordPress. Are you using any SEO plugins like Yoast or something? I'd suggest reaching out to your WordPress person and having them look into it. It's clearly an issue and needs to be fixed. You can also download Screaming Frog SEO Spider and crawl your website with it to see where Google is finding these links. It could also be an issue with your theme or permalinks. These are the only areas I can think of. You are using the latest version of WordPress, so you are good from that point of view. So it has to be a plugin, theme, or permalink settings issue.
Related Questions
-
An immediate and long-term plan for expired Events?
Hello all, I've spent the past day scouring guides, walkthroughs, advice, and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowdsource some advice in case I might be way off base. I'll start by saying that technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):

PROBLEM
I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a little bit of index bloat.

THINGS TO CONSIDER
A spot check revealed that traffic for each Event occurs for about a two-to-four week period, then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page, so the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.

QUESTIONS I'M ASKING
How do we address all these old Events that provide no real value to the user? What should a future process look like to prevent this from happening?

MY SOLUTION
Step 1: Add a noindex to each of the currently expired Event pages. Since some of these pages have link equity (one Event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired.
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.

Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired Events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go. Thanks. Eager to hear all your thoughts.
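For Step 3, if the site runs WordPress, the rule can be a small wp_head hook. This is only a hedged sketch: the 'event' post type and the 'event_end_date' meta key are assumptions about how the Events are stored, so the developers would swap in the real names.

```php
<?php
// Hypothetical sketch: assumes an 'event' custom post type with an
// 'event_end_date' meta field holding a Y-m-d date.
add_action( 'wp_head', function () {
    if ( ! is_singular( 'event' ) ) {
        return;
    }
    $end = get_post_meta( get_the_ID(), 'event_end_date', true );
    if ( $end && strtotime( $end ) < time() ) {
        // Expired: keep the page out of the index but let link equity flow.
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```

Because the rule keys off the stored end date, newly expiring Events pick up the noindex automatically, which covers Step 4 without any further manual work.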
Technical SEO | Alces
-
HTTPS problem in Google results
Hello everyone. My problem is with my SSL certificate. I sent all the links to Google, and when Google shows the https link there is no problem. But a few minutes ago my home page link did not have SSL. Please check this page: https://www.bodrumtransfermarket.com. Where did I make a mistake? Thanks for all.
Technical SEO | dalapayal
-
Google SERPs Show Different Title
Hi guys, can anyone please help with my situation? My domain is www.greedybins.com.au. I set up a different title on every page, and it has been 2 months since I made the changes. I keep checking by using site:www.greedybins.com.au in search. So far, only 1 title has been shown correctly in the SERPs. I used Yoast SEO before and changed to the All in One SEO Pack plugin; the titles are changing, but still not as I wrote them. Somehow Google alters them by itself. I have tried fetching and submitting the sitemap a couple of times from Google Search Console. Could anyone please advise?
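Google can rewrite titles in the SERPs regardless of what the page serves, so the first check is what is actually in the delivered HTML. A minimal sketch for pulling the title element out of a fetched page follows; the regex is a rough illustration rather than a full HTML parser, and the sample title string is made up:

```python
import re

def extract_title(html: str) -> str:
    """Return the text of the first <title> element, or '' if none."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return m.group(1).strip() if m else ""

# Hypothetical served HTML for one page:
page = "<html><head><title> Example Page Title </title></head><body></body></html>"
print(extract_title(page))  # Example Page Title
```

If the served title matches what you set and the SERP still shows something else, the rewrite is happening on Google's side, and resubmitting the sitemap will not change it.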
Technical SEO | ray.soms
-
Duplicate Title/Meta Descriptions
Hi, I had some error messages in my Webmaster Tools account stating I had duplicate titles/meta descriptions. I've since fixed them; typically, how long does a full crawl take once these issues are fixed? Webmaster Tools is still showing problems with various titles/descriptions. I was also wondering if I should block individual pages on a large e-commerce site. Example from a large site: http://www.stubhub.com/chicago-bears-tickets/ (the page is structured and optimized). Then you have all the individual games, such as http://www.stubhub.com/chicago-bears-tickets/bears-vs-lions-soldier-field-4077064/, which have an H1 and a meta description. Should the individual game pages be blocked from Google to concentrate only on the main page? Thanks!
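If you did decide to keep the individual game pages out of crawling while leaving the team page open, a robots.txt fragment like this is one option. The path is hypothetical and relies on robots.txt prefix matching; note that a noindex meta tag is usually the safer tool for pages you want dropped from the index, since Disallow only stops crawling:

```
User-agent: *
# Hypothetical: block individual game pages, keep the team page crawlable
Disallow: /chicago-bears-tickets/bears-vs-
```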
Technical SEO | TP_Marketing
-
Long Domain Name - Subpage URL Question
I have a long domain name, so domainname.com/services/page-title can get pretty lengthy. I have a services page as a summary page, since there are a few of them, with more detail on the actual pages. In this situation, would it be better to do domainname.com/services/service-name, which can exceed the suggested 70 characters, or would it be a better idea to do domainname.com/service-name and just have them under the services menu? Is there any advantage/disadvantage to going 2-3 tiers deep, or to having the subpages of those services off the root domain instead of as children of the services page? Please let me know if any clarification is needed. Thanks!
Technical SEO | tgr0ss
-
Removing robots.txt on WordPress site problem
Hi, I'm a little confused. I ticked the box in WordPress to allow search engines to crawl my site (I had previously asked them not to), but Google Webmaster Tools is telling me I still have a robots.txt blocking them, so I'm unable to submit the sitemap. I checked the source code and the robots instruction has gone, so I'm a little lost. Any ideas, please?
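When Webmaster Tools and your source code disagree, fetch yoursite.com/robots.txt directly in a browser, since a cached or server-level file can differ from what the WordPress setting suggests, and check what it actually permits. A quick sketch using Python's standard-library robots.txt parser (the example rules and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, agent: str, url: str) -> bool:
    """Parse a robots.txt body and report whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# The file WordPress serves while "discourage search engines" is on:
blocking = "User-agent: *\nDisallow: /"
print(is_allowed(blocking, "Googlebot", "https://example.com/sitemap.xml"))  # False

# After the setting is cleared, the served file should allow crawling:
open_txt = "User-agent: *\nDisallow:"
print(is_allowed(open_txt, "Googlebot", "https://example.com/sitemap.xml"))  # True
```

If the live file still shows the blocking rule after you changed the setting, look for a physical robots.txt on the server or a caching layer overriding WordPress's virtual one.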
Technical SEO | Wallander
-
Should I add my brand name to every page title
Currently, for every page I automatically add my brand name, i.e.: product xxx - brand name, product yyy - brand name. Is this considered good or bad practice?
Technical SEO | AsafY
-
E-Commerce Site Crawling Problem
Our website displays all of our products if you attempt to visit a category or page that doesn't exist but conforms to our site's URL structure. Somehow Google crawled these pages and indexed them, and they have TONS of duplicate content that hurts us. How do I deal with this problem?
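The underlying fix is server-side: a URL that matches your structure but names no real category should return a 404, not render the full product list. A hypothetical sketch of that routing decision, where the category slugs and the handle function are made up for illustration:

```python
# Hypothetical set of real category slugs, e.g. loaded from the database.
VALID_CATEGORIES = {"mens-shoes", "womens-shoes", "accessories"}

def handle(path: str) -> int:
    """Return the HTTP status a category-style URL should get."""
    parts = [p for p in path.split("/") if p]
    if len(parts) == 2 and parts[0] == "category" and parts[1] in VALID_CATEGORIES:
        return 200  # real category: render it
    return 404      # structure matches, but the category doesn't exist

print(handle("/category/mens-shoes"))    # 200
print(handle("/category/not-a-thing"))   # 404
```

Once phantom URLs return 404, crawlers stop treating them as real pages, and the ones already indexed drop out over time; you can speed that up by removing any internal links that generate them.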
Technical SEO | 13375auc3