Ajax Server Snapshot Setup...
-
Hi All,
Having just joined, we have noticed an alarming number of 'Too Many On-Page Links' warnings. We think this is because of our large navigation menu, which can be seen here... www.dirtbikexpress.co.uk
We have had our programmer set up some AJAX filters for the category and subcategory pages, but we are worried about whether the snapshot, i.e. the ugly URL that is returned, is set up correctly. At the moment it is just an ugly-URL duplicate of the same (complete) page; does it instead only need to return the content that has changed?
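For context, here is a simplified sketch of the kind of escaped-fragment handling we believe our programmer has set up (an Express-style illustration only; the route and function names are placeholders, not our actual code):

```typescript
import express from "express";

const app = express();

// Hypothetical server-side renderer: produces the complete HTML a user
// would see after the AJAX filters have run (placeholder name).
async function renderFullSnapshot(category: string, state: string): Promise<string> {
  return `<html><body>Fully rendered ${category} page for filter state "${state}"</body></html>`;
}

app.get("/category/:name", async (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    // Googlebot's "ugly URL" request: at the moment this returns a
    // snapshot of the complete page, which is the part we're unsure about.
    res.send(await renderFullSnapshot(req.params.name, fragment));
  } else {
    // Normal visitors get the AJAX-driven version of the page.
    res.sendFile("category.html", { root: "public" });
  }
});

app.listen(3000);
```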
Basically, we want to ensure our AJAX and navigation menu are set up correctly, as we want the same kind of navigation and filtering that many large sites (John Lewis, Debenhams, etc.) now have.
We are finding that some pages are struggling to get indexed, and products are proving very tricky. The ratio of indexed pages to pages submitted in our sitemap in Webmaster Tools is also poor (roughly 1,000 indexed out of 6,000 submitted).
We would be really grateful for any help.
Kindest regards
Mark
-
First off, the 100 links thing isn't a law written in stone. SEOmoz's tools do yell about it if you go over 100 links. This "100 link lore" comes from a Matt Cutts blog post:
http://www.mattcutts.com/blog/how-many-links-per-page/
If you look closely, you may notice that there are more than 100 links even on the page where Matt wrote about this. It's kind of a loose guideline in my eyes. From my own professional experience, if every page on your site has 500 links, you're going to hurt for it. But if you have 125 links on quite a few pages, or put out a blog post that's just an insane resource linking to a few hundred people, you'll still be just fine.
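If you want to sanity-check a page's link count yourself, one line in the browser console will do it (plain DOM scripting, nothing tool-specific):

```typescript
// Count the anchor elements with an href attribute on the current page.
document.querySelectorAll("a[href]").length;
```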
What it seems like they're going for here is just an indicator of quality and usability. People really do abuse internal linking, and Google needs to make sure those people hurt for it. Without having seen your site, I'd just say: think about what your users really need on every single page of the site (usually it's far fewer than 100 links). Oftentimes, a simpler navigation is the better one. Look at which pages people are actually reaching in Google Analytics, or set up a heatmap to follow their behavior. Tweak it. Annotate it in Google Analytics. See if pageviews or other key goals improve.
As for only seeing 1k pages out of 6k in the index, I'd again take a close look at what is actually of value. If you have a lot of duplicate/thin content, you may be best off just using noindex,follow tags on some of it, to improve Google's perception of your domain's quality as a whole. If the pages are all of value, you could have other information architecture issues. One test site of mine has roughly 1,000,000 pages without noindex,follow, and that's exactly how many appear in the index. If it's really good, useful stuff, you should definitely be able to get it indexed.
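If you go the noindex,follow route, one way to roll it out without touching every template is the X-Robots-Tag HTTP header. A minimal sketch, assuming a Node/Express front end (the thinPaths patterns here are made up; match them to whatever your thin or duplicate sections actually are):

```typescript
import express from "express";

const app = express();

// Hypothetical patterns for thin/duplicate sections to keep out of the
// index while still letting crawlers follow the links on those pages.
const thinPaths = [/^\/filters\//, /^\/tag\//];

app.use((req, res, next) => {
  if (thinPaths.some((pattern) => pattern.test(req.path))) {
    // Same effect as <meta name="robots" content="noindex, follow">,
    // delivered as an HTTP header instead of markup.
    res.setHeader("X-Robots-Tag", "noindex, follow");
  }
  next();
});

app.listen(3000);
```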
Related Questions
-
Getting indexed in Google Scholar
Hi all! We have a client, a highly regarded non-profit, that publishes scholarly research. Their publications aren't being indexed in Google Scholar 50% of the time, and when they are, Google is pulling random content from the PDF rather than from the HTML page. Any advice on best practices is enormously appreciated.
SERP Trends | SimpleSearch1 -
What do we know about the "Shops" SERP Feature?
I came across this SERP feature in a search today on a mobile device. It does not show for the same search query on desktop. What do we know about this "Shops" SERP feature? (Attached screenshot: shops-blur.jpg)
SERP Trends | seoelevated0 -
Google showing different links in SERPs
Google search results are showing my site's links under both URLs, "mydomain.com" and "https://mydomain.com". However, the one with https is showing a favicon and the other one is not. So I want to keep the https one and remove the other. I went to GSC to submit "mydomain.com" for removal, and it said that the URL will be deleted in ALL of its variations. So how do I delete the "mydomain.com" links? Should I just index the https ones again? Would that work? Someone suggested doing a 301 redirect on all pages that are being displayed twice, but I am not sure if I need to do that, since I am using Squarespace and both of the links lead to the same page.
SERP Trends | winter22330 -
Has anyone used YourAmigo, or does anyone know what techniques they use, or an alternative to YourAmigo?
Please share your experience. I don't know much about them, and they approached me about our ecommerce site.
SERP Trends | bizuhodge0 -
My website is being cached with non-www; with www it is not indexed or cached
Hello Team, I have a question: my website is being indexed and cached with the non-www version, but the www version is not being cached; it shows a 404 error, even though every redirection is set up properly. Can you please tell me what issue my site has? Here is the link: https://webcache.googleusercontent.com/search?q=cache:nCH1DvhuQT8J:https://www.canvaschamp.com/+&cd=1&hl=en&ct=clnk&gl=usa
SERP Trends | CommercePundit0 -
Getting indexed by Google Scholar
Often my Google Scholar alerts result in exactly what I expect: scholarly articles published in academic journals. However, today I got this completely non-scholarly article, https://www.t-nation.com/training/the-exact-reps-that-make-you-grow, and I have no idea why Google Scholar is indexing this site. I've read up on how to get indexed by Google Scholar, and this website doesn't seem to meet the necessary requirements. I'm curious: for anyone whose clients or industry need to get indexed by Google Scholar, what has worked for you?
SERP Trends | newwhy2 -
Why have rich snippets disappeared in Google search results?
Hello, a few weeks ago we implemented a rich snippets strategy to improve the ranking of our blog posts. It was successful, and our results were showing up in Google. But today, every single snippet has disappeared. We are back to simple search results, without snippets for us or for our competitors. It seems that Google has removed rich snippets for specific keywords, because for generic keywords (for example "inbound marketing definition" in our case) there is still a snippet result. Do you know if Google has changed snippet parameters for keywords with low search volume? Thank you!
SERP Trends | Laure-Nile0 -
What are the SEO challenges associated with private search engines, like DuckDuckGo?
I read recently that DuckDuckGo doubled in size in 2017. With their search engine, and other alternatives to Google, taking part of the search market away, how can SEO/Marketing/Web pros keep their websites optimized and get traffic from these private search engines? (Also, do any of you have experience with this? What portion of your search traffic is coming from private search engines?)
SERP Trends | searchencrypt1