Over Optimised Magento Pages
-
We are working on a client's Magento site and we've added new copy with a keyword density in line with best practice. When we run it through Moz we get a Keyword Stuffing alert saying the page has 27 keywords, where we can only see about 11.
This is the page https://www.greatbeanbags.com/bean-bag-cushions
The client is pushing back, saying the page must have already been optimised before, since our new copy has triggered the stuffing alert. But my guess is the page was already stuffed by some Magento code we can't see.
Any ideas?
#magento #Keyworddensity
-
We avoided keyword stuffing by having a simple real estate landing page for properties in London. See https://richmondmanorestates.co.uk: it's clear and simple, and leads clients to relevant pages.
-
It's possible that the page you're working on has hidden or latent keywords in the Magento code that are contributing to the keyword stuffing alert in Moz. Check the HTML source of the page for any hidden elements or meta tags that could be adding extra keywords. Also consider running the page through other SEO tools to cross-verify the keyword density and confirm whether there's really an issue. If the keyword density problem persists, you may need to optimise the code or structure of the page so it aligns with best practice and doesn't trigger false alarms for keyword stuffing.
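One quick way to see where the extra occurrences come from is to compare how often the phrase appears in the visible text versus the full HTML source (meta tags, alt attributes, comments, and scripts included). This is a minimal sketch using only the Python standard library; the sample markup and keyword are illustrative, not taken from the page in question:

```python
from html.parser import HTMLParser
import re

class VisibleTextExtractor(HTMLParser):
    """Collects the text a visitor would actually see on the page."""
    SKIP = {"script", "style", "head", "title", "noscript"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside a non-visible element

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.parts.append(data)

def count_keyword(text, keyword):
    """Case-insensitive count of a literal phrase."""
    return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

def compare_counts(html, keyword):
    """Return (visible_count, full_source_count) for one keyword phrase."""
    extractor = VisibleTextExtractor()
    extractor.feed(html)
    visible = count_keyword(" ".join(extractor.parts), keyword)
    total = count_keyword(html, keyword)  # meta tags, alt text, scripts too
    return visible, total
```

If the full-source count is far above the visible count, the surplus is coming from markup that Moz reads but visitors never see.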
-
It's possible that Magento's code or some hidden elements are causing the discrepancy in keyword density on the page. While you can only see about 11 keywords, Moz is detecting 27, likely because of background code or hidden text. To address this:
1. Check for Hidden Elements: Examine the page's source code for hidden elements, comments, or other content that might contain keywords. Keywords can be hidden from the visible page but still picked up by SEO tools.
2. Inspect Magento Templates: Magento's templates and dynamic content generation can insert keywords, meta tags, or other SEO-related elements into the page. Make sure these are not artificially inflating keyword density.
3. Evaluate Moz's Criteria: Review Moz's specific criteria for detecting keyword stuffing. Their algorithms can produce false positives, so confirm the detected keywords are genuinely excessive and not common words or phrases used naturally.
4. Optimize Content: If you confirm that keyword stuffing exists, work with your client to remove unnecessary keywords and ensure the rest are used naturally. The goal is to balance SEO optimisation with providing valuable content to users.
5. Documentation: Keep records of your findings and any changes made, for future reference in case your client questions the issue again.
By investigating these areas, you can address the keyword stuffing alert and ensure that your client's Magento site is optimized for both search engines and user experience.
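To make the hidden-element check above concrete: text stuffed into elements hidden with inline CSS is a classic culprit. Below is a rough stdlib-only sketch that flags text inside elements carrying `display:none` or `visibility:hidden`. It is an assumption-laden illustration: it only catches inline styles (not rules applied from a stylesheet) and expects reasonably well-formed markup.

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collects text sitting inside elements hidden via inline CSS."""

    VOID = {"img", "br", "hr", "meta", "link", "input", "area", "base",
            "col", "embed", "source", "track", "wbr"}

    def __init__(self):
        super().__init__()
        self.stack = []        # True where the enclosing element is hidden
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        if tag in self.VOID:
            return  # void elements never get a closing tag
        style = (dict(attrs).get("style") or "").lower().replace(" ", "")
        hidden = "display:none" in style or "visibility:hidden" in style
        inherited = bool(self.stack) and self.stack[-1]
        self.stack.append(hidden or inherited)  # children of hidden are hidden

    def handle_endtag(self, tag):
        if tag not in self.VOID and self.stack:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and self.stack[-1] and data.strip():
            self.hidden_text.append(data.strip())

def find_hidden_text(html):
    """Return every run of text that a visitor cannot see."""
    finder = HiddenTextFinder()
    finder.feed(html)
    return finder.hidden_text
```

Any keyword-bearing strings this returns are exactly the ones a tool like Moz counts but the client cannot see on the rendered page.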
Related Questions
-
Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
Hello everyone, I am very new to SEO and I wanted to get some input and second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome and appreciated!

For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone code/error under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL.

My planned workaround is to automatically detect when a product has been discontinued and add the NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
Technical SEO | | BakeryTech
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product, followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to a search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only impacts their index. Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the NoIndex meta tag after it has already been indexed?
Is there a better way I should implement this? P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't let me call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
-
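For what it's worth, the two page-level changes the question above describes (a robots noindex tag plus switching the LD+JSON availability) are easy to sketch outside of Shopify; schema.org does define Discontinued as an ItemAvailability value. This is a hedged Python sketch, not Shopify Liquid: the function name and sample product are illustrative.

```python
import json

# The meta tag the asker plans to inject on discontinued product pages.
NOINDEX_TAG = '<meta name="robots" content="noindex">'

def product_jsonld(name, url, price, currency, discontinued=False):
    """Render Product structured data as an ld+json script block.

    availability flips to schema.org/Discontinued for retired products,
    which matches the adjustment described in the question.
    """
    availability = ("https://schema.org/Discontinued" if discontinued
                    else "https://schema.org/InStock")
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
            "availability": availability,
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

In a real Shopify theme both fragments would be emitted from the product template behind a "discontinued" flag; the sketch only shows what the output should look like.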
Shopify SEO - Collection pages or blog posts for ecommerce SEO?
Hi Moz folks, we would love your thoughts on the benefits of Shopify collection pages vs blog posts for ranking secondary shopping keywords not suitable for existing shop pages. All help gratefully received; we are going down a rabbit hole on this one and need some sanity!

So, we're updating our site, which already has a reasonable SEO foundation, and are looking to rank better for key shopping search keywords in our space (D2C sports nutrition). My question is: should we prioritise store collection pages (category pages in Shopify terms) or blog posts for some of the main keywords not already covered by our core in-store collections/categories? Priority keywords already covered are things like protein powders, protein bars, energy drinks, etc.

As context, we have a small product catalogue (10 products) and for easy navigation on site have these grouped into 7 collections/categories in the main menu, available from the homepage. All are quality high-volume, high-intent shopping keywords for our business. The secondary terms we are now looking to add content for are things like marathon nutrition, vegan sports nutrition, etc., so we now need to choose whether we create product collection pages for these or use blog posts to do the work.

The advantage of collections, we believe, is that Google is likely to prioritise these in search. The disadvantage from a UX point of view is that more categories in store could make our simple and clear product range (10 products only) look complex or repetitive. Conversely, a blog post removes any UX confusion from too many categories, but we have a conversion rate issue with our blog: it performs well in search, but conversions are poor. We have addressed this with a new keyword targeting strategy and blog customisation, but we have yet to test this, so while in theory it should work well, we do not know for certain.
In summary: we want to rank for key shopping keywords beyond the core ones we already have. Would you advise we use blog posts or product collection pages? All help gratefully received - thanks! Warren
SEO Tactics | | WP33
-
Page Break
Hey guys, a client has asked us to install a page break function on their WordPress site in the hope of increasing page count. It's a very common practice, but we have never used it as part of a strategy. Any thoughts, pros or cons?
Content Development | | wearehappymedia
-
Where to outsource product pages contents?
We have been told to write good unique content for every product, but we just don't have the skill or the time. (English is not my first language.) Can anyone suggest where to find a good product content writer?
Content Development | | ringochan
-
The same phrase in many different pages of one site
Hi,
Content Development | | webg
Recently I had to add the same phrase, of roughly 15 words, to nearly 700 posts on the same blog. The phrase describes the site's ownership and sometimes includes links to the posts' sources. I thought about creating an image instead, but there will be some variation in the wording (2 or 3 words), so I chose to use text. I'd like to read some comments and opinions about this kind of insertion (the same phrase on many different pages of one site). For example, have you handled this on your site? Problems or benefits (mainly with indexing)? Special code to indicate this case? Any threats?
-
Similar product pages
I have 27,000 products on my website, each shown on its own separate webpage. Google indexes almost all of them (+- 25,000), but the SEOmoz report flags them as duplicate content. Indeed, most of each page is identical; only the description and price of the product change, which is no more than 2% of the total content of the page. At the bottom of the product page the alternatives for the product are shown, mainly other colours. So, within the same family of products, which can have 50 products, the site creates 50 webpages showing the product and its family. That's why nearly everything on the page is identical within a family of products. My guess is that, as Google has indexed them all, I should not worry about duplicate content. Is my guess correct? Thanks in advance for your answer. Rik
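One common remedy for this kind of colour-variant duplication, not raised in the thread itself, is to point every page in a product family at a single representative URL with rel=canonical, so search engines consolidate the near-duplicates instead of treating them as thousands of competing pages. A minimal sketch; the URLs are made up, and the pick-the-first rule is only a deterministic placeholder you would likely replace with a business rule such as the best-selling variant:

```python
def choose_canonical(variant_urls):
    """Pick one stable representative URL for a family of near-duplicate
    variant pages. min() makes the choice deterministic across rebuilds;
    swap in your own preference rule."""
    return min(variant_urls)

def canonical_tag(variant_urls):
    """The <link> element every variant page in the family would carry."""
    return f'<link rel="canonical" href="{choose_canonical(variant_urls)}">'
```

Each of the 50 pages in a family would then emit the same tag in its head, telling crawlers which one page should accumulate the ranking signals.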
Content Development | | noordhout
-
Block Low Quality Pages?
What are your thoughts on blocking (in robots.txt) and/or noindexing low-quality pages to defend against Panda, assuming you can't remove, redirect, or add quality content to them? Also, assume there are no external links pointing to these low-quality pages, no social shares, and zero incoming organic traffic. Has anyone had experience with this as a solution to Panda?
Content Development | | poolguy
-
Help with Duplicate Content Issue for pages...
I have pages with duplicate content. I want to put them on hold while I write unique content, as I do not want to get marked down for it. I also want to keep the URLs and use them again.
Content Development | | pauledwards
There are about 300 pages currently affected by duplicate content. Am I best doing 302 redirects to the original source of the content, as it is temporary, or canonical tags / noindex? The pages are currently indexed and cached by Google; I want to use the URLs in the future for unique content to get them valued by Google. Any advice much appreciated. Kind regards,