Page Rank Worse After Optimization
-
For a long time, we had terrible on-page SEO. No keyword targeting, no meta titles or descriptions. Just a brief 2-4 sentence product description and shipping information. Strangely, we weren't ranking too badly. For one product, we were ranking on page 1 of Google for a certain keyword.
Reaching the top of page 1 would be easy (or so I thought).
I have now optimized this page to rank better for the same keyword. I have a 276-word description with detailed specifications and shipping information. I have a strong title and meta description with keywords and modifiers. I have also included a video demonstration, additional photos and a PDF of the owner's manual.
In my eyes, the page is 100% better than it ever was. In the eyes of Moz, it's better also. I've got an A with the On-Page Grader.
Why is this page now ranking on page 8 of Google? What have I done wrong? What can I do to correct it?
-
Thanks Dirk -
I will try out your suggestions and let you know my results. I appreciate your help.
-
Hi Dustin
I checked the last example a bit more in detail:
WebPagetest indicates that this page is even heavier than the one I tested the first time - results here: http://www.webpagetest.org/result/150501_XQ_Q1H/ (load time is ok, but the page is way too heavy) - the PageSpeed analyser results are not terrific either. Also check the links on the page - a lot of your CSS files seem to be 301-redirected to a new location - it's better to reference the final location directly. Optimising shouldn't be too difficult - compressing images, JS, etc. should really increase your scores.
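To illustrate the compression/caching side, a rough .htaccess sketch could look something like this - just an example that assumes an Apache server with mod_deflate and mod_expires available, so adapt it to your own stack:

```apache
# Hypothetical .htaccess sketch - assumes Apache with mod_deflate and mod_expires enabled
<IfModule mod_deflate.c>
  # Compress text-based assets (HTML, CSS, JS) before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Let browsers cache static assets so repeat visits are lighter
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

For the redirected CSS files, simply update the link tags in the HTML to point straight at the final URL instead of the old location that 301s.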
Also on the technical side, it's probably good to clean up the code a bit. You don't really have to pass the W3C validator test, but it seems your page generates an awful lot of errors.
From a content perspective it looks good. One remark could be that a lot of content is not directly visible when the page loads (read more, tabs). Google announced at the end of 2014 that they don't really like content that is hidden when the page is loaded (https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html). I don't really think this would cause a drop, but it's something you could consider changing if you modify the HTML code.
Hope this helps,
Dirk
-
Hi Dirk -
It's actually another website, not included in my profile.
The page is: http://www.1-800-shaved-ice.com/simply-a-blast-snow-cone-machine.html
I did the same optimization with some other pages, and we are ranked number 1 on the first page. These are the pages that have done extremely well:
http://www.1-800-shaved-ice.com/c-hc-8e.html
http://www.1-800-shaved-ice.com/swanblocshav.html
Thanks for your feedback
-
Hi Dustin,
If the site is the one mentioned in your profile, it's almost certain that it's a speed/performance issue. I checked one product page - the Google grade for desktop is 49/100 and 67/100 for mobile. WebPagetest indicates a page size of 2,700 KB - mainly images, JavaScript & Flash. It should not be too difficult to improve these scores - optimising the images & the JS will help a lot.
rgds,
Dirk
-
This can have several causes - without the actual example it's difficult to assess. Based on the info you give, adding all this content, video and images could mean that your page became a lot heavier to load (check the page on webpagetest.org and with PageSpeed Insights). Maybe you exaggerated a bit with the optimisation and Google is considering the page a bit too spammy.
If possible, can you give the actual url of the page?
rgds,
Dirk
Related Questions
-
Redirecting homepage to internal page (2nd Tier page)
We are planning to experiment with redirecting our homepage to one of the 2nd-tier pages - I mean, example.com to example.com/page. We need this page to rank well, but it doesn't have many internal links or external backlinks, so we opted for this redirect. The advantage of this page is that it has the "keyword" we want to rank for in the URL: "page" in example.com/page. Will this help or hurt us in SEO? I think we are missing the keyword in our root domain, so we are interested in highlighting this page. Thanks, Satish
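Purely as an illustrative sketch (the question doesn't say how the site is served, so Apache and the /page path are assumptions), such a homepage redirect could be expressed like this:

```apache
# Hypothetical sketch: 301-redirect only the homepage to a 2nd-tier page
# Assumes Apache with mod_rewrite enabled; "/page" stands in for the real target URL
RewriteEngine On
RewriteRule ^$ /page [R=301,L]
```

The ^$ pattern matches only the root URL, so the rest of the site keeps its existing URLs.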
Intermediate & Advanced SEO | | vtmoz0 -
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources become available to fill in content. My question is whether an individual page will be able to accrue any page authority for that target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
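For context, one hedged way to apply a temporary noindex to a whole directory of thin pages (assuming Apache with mod_headers and a directory-level .htaccess - the actual URL structure isn't given in the question) would be:

```apache
# Hypothetical sketch: mark every page served from this directory as noindex
# Place in the thin-content directory's .htaccess; assumes Apache with mod_headers enabled
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

A per-page meta robots noindex tag works just as well; the header approach is only handy when the thin pages share a path.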
Intermediate & Advanced SEO | | THandorf0 -
Ranking 2 pages on the same domain in the same SERP
I thought it was generally said that Google will favour 1 page per domain for a particular SERP, but I have seen examples where that is not the case (i.e. Same domain is ranking 2 different pages on the 1st page of the SERPs...) Are there any "tricks" to taking up 2 first page SERP positions, or am I mistaken that this doesn't always happen?
Intermediate & Advanced SEO | | Ullamalm0 -
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog Spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | | Online-Marketing-Guy0 -
Is it a problem to use a 301 redirect to a 404 error page, instead of serving directly a 404 page?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches certain valid patterns, we serve a script which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying longer in the Google index than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | | lcourse0
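A minimal, purely illustrative Apache sketch of answering those invalid combinations with a 404 status directly (rather than via a 301) could look like this - the rule pattern below is a placeholder, since the real URL patterns aren't given:

```apache
# Hypothetical sketch: return a 404 status directly for invalid parameter combinations,
# instead of 301-redirecting to a page that then serves the 404.
# Assumes Apache with mod_rewrite; "invalid-pattern" is a placeholder.
RewriteEngine On
RewriteRule ^invalid-pattern$ - [R=404,L]

# Optional: a custom error page, still delivered with a 404 status code
ErrorDocument 404 /404.html
```

Alternatively, the script that detects the non-existent parameter combination could itself emit the 404 status and render the error page, which avoids the extra redirect hop altogether.
-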
How to combine 2 pages (same domain) that rank for same keyword?
Hi Mozzers, A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP. Always right next to each other (for example, position #7 and #8, or #3 and #4). So in the SERP it looks something like:
1) www.mycompetition1.com
2) www.mycompetition2.com
3) www.mywebsite.com/page1.html
4) www.mywebsite.com/page2.html
5) www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1. Does anybody have any experience in this? Any advice is much appreciated.
Intermediate & Advanced SEO | | rayvensoft0 -
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
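For what it's worth, a hedged robots.txt sketch along the lines described above (an illustration only - check Google's current robots.txt documentation before relying on it) might look like:

```
# Hypothetical robots.txt sketch: keep the regular crawler out of the photo pages,
# while giving Googlebot-Image its own, more specific group that permits the folder.
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```

The idea is that a crawler follows the most specific user-agent group that matches it, so Googlebot-Image would use its own group here rather than inheriting the Googlebot rules.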
Intermediate & Advanced SEO | | HD_Leona0 -
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
The client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
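Purely as a sketch (it assumes Apache with mod_rewrite and should be tested carefully before rolling out across ~2k pages), the usual .htaccess approach looks roughly like this:

```apache
# Hypothetical sketch: serve /about when the real file is /about.php
# Assumes Apache with mod_rewrite enabled
RewriteEngine On

# 301 old .php URLs to the extensionless version so only one URL stays indexed
RewriteCond %{THE_REQUEST} \s/([^?\s]+)\.php[?\s]
RewriteRule ^ /%1 [R=301,L]

# Internally map the extensionless URL back to the existing .php file
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]
```

On a WordPress site, though, pretty permalinks normally remove the .php extensions already, so it's worth checking the permalink settings before adding rewrite rules by hand.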
Intermediate & Advanced SEO | | digisavvy0