Too many SEO changes needed on a page. Create a new page?
-
I've been doing some research on a keyword with Page Optimization.
I'm finding there are a lot of suggested changes. Given how many changes are required, is it better to create an entirely new page from scratch with all the suggestions implemented, or to change the current page?
Thanks,
Chris
-
First, see what Google thinks of the page. On-page SEO checkers require wider experience to use accurately and efficiently; you can't just take insights from online tools and run with them.
If the page has no associated keywords / search queries (in Ahrefs, GSC, Google Analytics, SEMrush, etc.), that would suggest Google isn't really that interested. Because keyword data (even from Google) is always heavily sampled, you'd also want to check whether the page has been receiving any traffic from Google as a landing page (the Organic Search segment in Google Analytics).
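That landing-page check is easy to script once you've exported the data. A minimal sketch, assuming rows exported from Google Analytics with landing page, channel grouping, and session columns (the column names and paths here are placeholders, not from this thread):

```python
# Hypothetical sketch: given rows exported from Google Analytics,
# total the Organic Search sessions for one landing page.
# Column names ("landing_page", "channel", "sessions") are assumptions.

def organic_sessions(rows, landing_page):
    """Sum sessions where the channel is Organic Search for landing_page."""
    return sum(
        r["sessions"]
        for r in rows
        if r["landing_page"] == landing_page
        and r["channel"] == "Organic Search"
    )

sample = [
    {"landing_page": "/widgets", "channel": "Organic Search", "sessions": 40},
    {"landing_page": "/widgets", "channel": "Direct", "sessions": 12},
    {"landing_page": "/about", "channel": "Organic Search", "sessions": 7},
]
print(organic_sessions(sample, "/widgets"))  # -> 40
```

If that number is zero (or near it) over a reasonable date range, the page isn't earning organic traffic regardless of what an on-page checker says.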
If the page is doing well despite what online tools say, rules be damned. If the page isn't performing well (or at all), rebuild it from scratch in line with best practices and 301 redirect the old URL to the new one. If the URL stays the same, rebuilding on the active URL is fine too (but don't publish until the complete rebuild is 100% finished and you are 110% happy with it).
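If you do go the new-URL route, the 301 is the one piece you can't skip. A minimal sketch of generating the redirect rule, assuming an Apache server using .htaccess (the paths and domain are placeholders, not from this thread):

```python
import re

# Hypothetical sketch: build the Apache RewriteRule line that
# 301-redirects an old page path to its rebuilt URL.
# The paths and domain below are placeholders.

def redirect_rule(old_path: str, new_url: str) -> str:
    """Return one .htaccess line 301-redirecting old_path to new_url."""
    # RewriteRule matches the path without its leading slash; escape any
    # regex metacharacters and anchor the pattern so only an exact match fires.
    pattern = "^" + re.escape(old_path.lstrip("/")) + "$"
    return f"RewriteRule {pattern} {new_url} [R=301,L]"

print(redirect_rule("/oldwidgets", "https://example.com/widgets"))
# -> RewriteRule ^oldwidgets$ https://example.com/widgets [R=301,L]
```

The `R=301` flag makes the redirect permanent (so link equity is passed) and `L` stops further rule processing; on nginx or another server the equivalent directive differs, but the one-old-URL-to-one-new-URL mapping is the same.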
-
Chris,
As Jose stated, there is a risk attached to creating a new page; however, if that's the only option, then ensure that you have 301(s) in place.
-
Hi Jose,
Thanks for your reply.
It sounds like creating a new page would be the safer option?
If the URL is changed, would that greatly diminish the value, I take it, even if the new URL is optimised like the competitors' pages?
-
Hi
If the page has already been indexed by Google, you run the risk of losing the positioning you have achieved, as well as the backlinks the page has received. So I'd advise you to weigh the importance and the repercussions beforehand.
Regards