Too many on-page links for WP blog page
-
Hello,
I have set my WP blog to a page, so new posts go to that page, making it the blog. On a SEOmoz campaign crawl, it says there are too many links on one page. Does this mean that, as I post my blog posts to this page, the search engines see it as a single page full of links rather than as blog posts?
I worry that if I continue to add more posts (which obviously I want to), the links will keep increasing, meaning they will be discounted for being too many.
What can I do to rectify this?
Many thanks in advance
-
Ah, OK, I get it... sorry, I am a little slow!
Thanks for your help; I will apply that method to remove the category links.
-
No, there are indeed over 100 <a> tags on the page. I guess I wasn't very clear... sorry for the confusion.
You have 10 blog post summaries listed on the page. Each post summary has a few links, plus there are many other nav links on the page, bringing you up over the 100 mark. If you keep adding posts, that count should stay roughly the same: when you add more posts, the older ones are automatically pushed through to Older Posts, keeping the 10 latest summaries on that page.
As I mentioned earlier, one way to reduce the <a> tag count on the page would be to remove the category links. They don't seem necessary since you are only using the one blog category.
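If you want to verify the count yourself rather than rely on the crawl report, a quick sketch using Python's standard-library html.parser can tally the <a> tags in any page's HTML (the exact number a crawler reports may differ slightly, e.g. if it skips anchors without an href):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href -- roughly what crawlers count as links."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Hypothetical page: 80 nav links plus 10 post stubs with 3 links each
page = ("<a href='/nav'>nav</a>" * 80) + ("<a href='/post'>post</a>" * 30)
print(count_links(page))  # 110 -- just over the ~100 guideline
```

Fetch the blog page's HTML and run it through count_links to see how close you are to the 100 mark.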
-
On the SEOmoz crawl, it says that the blog category page has 106 links on it, so I assumed that was because of the page the posts are collected on. Is that wrong? If it's only 10 or so, I can handle that!
-
How many posts do you want to have on a page? If your count is around 10, you should be in good shape... and you can control how many posts appear on your first (summary) page. It looks like you're at 10 now; any additional posts are viewed through the "view older posts" link at the bottom.
Agreed, though: if you're showing 15+ post stubs in your list, then you'll certainly be over in link count.
-
OK, thanks, that sounds like a good idea. My worry, though, is that as I develop the blog and add more posts, by this time next year I will surely have at least double the number of links? I don't see any way around it, really, just because I am using the page to collect the posts.
-
You have 10 stubs on the page ... that's a decent number and will keep your page's link count reasonable.
What about the idea of removing the "Filed under BLOG" links? If you're just running the single blog category, you wouldn't really need the links. That will save you 10 links right off the bat that can be reserved for links in the short descriptions.
The 100-link limit is a good rule of thumb, but you shouldn't be penalized for going a little over.
-
Good idea... sorry! It's here.
Thanks
-
Do you have a link to check out?
Related Questions
-
Does anyone know whether linking hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls: 'Pages with 404 errors'. Does anyone have experience with this? For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | Mediaholix
-
#Page Jump link sharing
Hi, I'm managing an in-house link building campaign to help with our key search term 'Location Holidays'. We were historically number 1 for this term until a redesign in May, where our web design agency butchered our SEO. With all of the main issues fixed, we're now fluctuating between 3rd and 4th on a daily basis. I'm putting together a social share competition to promote through the press in order to boost our backlink profile. We're nesting the competition within the body of the page whose rankings we want to improve, and I will be including a #page-jump link to quickly access it, as it will be further down the page. My question is: if we get press to link to http://holidaycompany.com/destination/#comp, will http://holidaycompany.com/destination/ receive the link juice, or will http://holidaycompany.com/destination/#comp be treated as a whole new page? Thanks in advance!
Technical SEO | MattHolidays
-
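On the fragment question above: browsers never send the #fragment to the server, and search engines generally index URLs with the fragment stripped, so both forms resolve to the same document. A small sketch with Python's standard-library urllib shows how the fragment separates out:

```python
from urllib.parse import urldefrag

# A fragment ("#comp") is a client-side pointer within one document;
# stripping it yields the URL a search engine would actually index.
url, fragment = urldefrag("http://holidaycompany.com/destination/#comp")
print(url)       # http://holidaycompany.com/destination/
print(fragment)  # comp
```

In other words, a press link to .../destination/#comp should credit .../destination/ itself; the fragment is generally not treated as a separate page.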
Will Adding a Publish Date at the End of Page Titles for Blog Posts Hurt SEO?
I'd like to be able to easily track blog posts by month, but in Google reports, when you set a date range, older blog posts obviously still appear, and with the number of blog posts we generate, without seeing the date in the title it's not obvious what was published when. For example, if a blog title were "/dangers-of-sharing-KM-knowledge-01-11-15", would it hurt SEO? The reason is I'd like a quick way to see how new posts do each month compared to older content.
Technical SEO | inhouseninja
-
Product pages getting no internal links in Magento
Hello, I think I have a serious problem: most of my products are not getting internal links.
Technical SEO | macrovet
I discovered this when I was running a Crawl Test Tool report in Moz. Here is an example of one product.
This product can be reached in the normal way through the navigation structure on my website. The navigation page is http://www.macrovet.nl/scheermachine/scheerapparaat-paard-paardenscheermachine.html
On this page is the product URL: http://www.macrovet.nl/aesculap-econom-equipe-gt674.html
Time Crawled: 2014
Title Tag: Aesculap Econom Equipe GT674 | Macrovet.nl
Meta Description: Bekijk en bestel een Aesculap Econom Equipe GT674 paardenscheermachine voor de scherpste prijs Macrovet.nl
HTTP Status Code: 200
Referrer: http://www.macrovet.nl/sitemap.xml
Link Count: 550
Content-Type Header: text/html; charset=UTF-8
4XX (Client Error): No
5XX (Server Error): No
Title Missing or Empty: No
Duplicate Page Content: No
URLs with Duplicate Page Content (up to 5):
Duplicate Page Title: No
Long URL: No
Overly-Dynamic URL: No
301 (Permanent Redirect): No
302 (Temporary Redirect): No
301/302 Target:
Meta Refresh: No
Meta Refresh Target:
Title Element Too Short: No
Title Element Too Long: No
Too Many On-Page Links: YES
Missing Meta Description Tag: No
Search Engine Blocked by robots.txt: No
Meta-Robots Nofollow: No
Meta Robots Tag: INDEX,FOLLOW
Rel Canonical: Yes
Rel-Canonical Target: http://www.macrovet.nl/aesculap-econom-equipe-gt674.html
Blocking All User Agents: No
Blocking Google: No
Internal Links: 0
Linking Root Domains: 0
External Links: 0
Page Authority: 1
Domain Authority: 30
Do you have an answer as to what is wrong? Thanks for your answers. Regards,
Willem-Johan
-
Should I ask third-party sites to remove their links pointing at my site?
Good morning, SEOmoz fans. Let me explain what is going on: a surfing site has included a link to my site in their footer. Apparently this could be good for my site, but as it has nothing to do with my site, I wonder whether I should ask them to remove it. Site A (surfing site) links to Site B (marketing site) in their footer, so Site B is receiving backlinks from every single page on Site A. But Site B has nothing to do with Site A: different markets. Should I ask them to remove the footer link, given that surfing people will not find my marketing site interesting? Thanks in advance.
Technical SEO | Tintanus
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do or don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
Technical SEO | ICM
-
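For the archive-duplication question above: without root access, the usual route is a meta robots tag in the archive template, which keeps those pages out of the index while still letting crawlers follow their links; robots.txt requires root access and blocks crawling outright, which also stops link discovery. A minimal sketch (the /2011/ path is just a hypothetical archive URL pattern):

```html
<!-- In the <head> of the archive template: keep out of the index, still follow links -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt alternative (needs root access; blocks crawling entirely)
User-agent: *
Disallow: /2011/
```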
Does Google pass link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to NO?
The page in question receives a lot of quality traffic but is only relevant to a small percentage of my users. I want to keep the link juice received from this page, but I do not want it to appear in the SERPs.
Technical SEO | surveygizmo