Dynamic URLs Appearing on Google Page 1. Convert to Static URLs or Not?
-
Hi,
I have a client who uses dynamic URLs throughout his site. For SEO purposes, I've advised him to convert dynamic URLs to static URLs whenever possible.
However, the client has a few dynamic URLs that are appearing on Google Page 1 for strategically valuable keywords. For these URLs, is it still worth it to 301 them to static URLs? In this case, what are the potential benefits and/or pitfalls?
-
Thanks for the advice. The website is Teachervoice.com. The site appears at #9 in Google results for the term 'elementary school teacher interview questions' (#7 for 'high school teacher interview questions').
URL looks like this: http://teachervoice.com/ReviewStream.aspx?ios=1&lg=11&hg=13
Change it to static or leave it as-is?
-
I agree with Marie but I will add one twist.
You mentioned the pages are ranking on page 1 of Google, but more information is needed. Are they ranking at #1 or at #10? What are the client's goals?
If a client is not presently #1 but is focused on earning the #1 spot, then I would go through and perform even minor tweaks to the page such as correcting the URL. I am probably in the minority on this topic.
Keep in mind that URL weight in ranking is very low. The primary value comes when a user uses the URL itself as a link to the site. If you find your site's visitors copy and paste links into forums or other articles, then the importance of solid URLs increases.
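If the client does decide to move these ranking pages, the lowest-risk approach is an explicit one-to-one 301 map rather than a pattern rewrite, so each old URL lands on exactly one new page. A minimal sketch in Python (the static slug below is an invented example; only the dynamic URL comes from the thread):

```python
# Hypothetical one-to-one 301 map from the old dynamic URL to a
# keyword-rich static path. The static slug is invented for illustration.
REDIRECT_MAP = {
    "/ReviewStream.aspx?ios=1&lg=11&hg=13": "/elementary-school-teacher-interview-questions",
}

def resolve(path_and_query):
    """Return (status, location) for an incoming request path."""
    target = REDIRECT_MAP.get(path_and_query)
    if target is not None:
        return 301, target       # permanent redirect passes most link equity
    return 200, path_and_query   # no mapping: serve the page as-is

print(resolve("/ReviewStream.aspx?ios=1&lg=11&hg=13"))
```

On an ASP.NET site like this one, the same map would typically live in IIS URL Rewrite rules rather than application code; the logic is identical.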
-
I think you may need to provide more information about the dynamic URLs. In my opinion, if you have something like:
www.example.com/testpage.php?id=3
then that's fine, but if you have a long string of parameters, say:
www.example.com/testpage.php?id=3&cat=7&sess=29&sort=price&ref=home
then that's not good.
With that being said, if the pages are ranking well, why change them?
It might also be a situation where you need a rel=canonical tag. So, for example, if you had two identical pages that were both ranking, such as:
www.example.com/testpage.php?id=1
along with
www.example.com/testpage.php?id=1&productcode=4
then you'd want to add a rel=canonical on those pages pointing to the appropriate one.
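To make that concrete, here is a small sketch that derives one canonical URL by stripping parameters that don't change the page content (the assumption that only `id` is significant is illustrative):

```python
# Illustrative sketch: collapse URL variants onto one canonical URL by
# keeping only the parameters that actually select the content.
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

SIGNIFICANT_PARAMS = {"id"}   # assumption: every other parameter is a variant

def canonical(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in SIGNIFICANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical("http://www.example.com/testpage.php?id=1&productcode=4"))
# both variants collapse to http://www.example.com/testpage.php?id=1
```

The resulting URL is what you'd put in each variant's `<link rel="canonical" href="...">` tag.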
Related Questions
-
Will Google be able to crawl all of the pages, given that the pages displayed (or the info on a page) vary according to the city of a user?
The website I am working for asks for a location before displaying the product pages. There are two cities with multiple warehouses. Based on the user's location, only the product pages available in the warehouse serving that area are shown. If the user skips location, the default warehouse's product pages are shown. The APIs are all location-based.
Intermediate & Advanced SEO | Airlift
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or what new URLs were added; the only sort mechanism is last crawled, and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
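One way to make the coverage report more informative is to submit a sitemap containing only the URLs you actually want indexed, so every hack-related URL stays isolated in the "not submitted in sitemap" bucket as it drops out. A minimal sketch of building such a sitemap (the URLs are placeholders; a real list would come from the CMS or a clean crawl):

```python
# Sketch: build a sitemap limited to known-good URLs.
# The URLs below are placeholders for illustration.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about/"]))
```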
Intermediate & Advanced SEO | rickyporco
-
URL Structure for geo location for specific page
On the hackerearth.com/challenges page, there is an option to select languages. This option is in the footer. Once you select the language, the URL changes. For example, if we select French, the URL changes to hackerearth.com/fr/challenges. In case we decide to change the URL of this page with geo, what should the URL structure be to accommodate languages as well? My research says it would be good to keep the URL like domainname.com/page/language.
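A small sketch of the language-first structure already in use (hackerearth.com/fr/challenges) versus the trailing-language alternative; the supported-language set and the default-language behavior are assumptions for illustration:

```python
# Sketch of a language-prefixed URL builder. SUPPORTED and the
# default-language rule are assumptions, not the site's real config.
SUPPORTED = {"en", "fr", "de"}

def localized_url(domain, path, lang="en"):
    path = path.strip("/")
    if lang == "en":                 # assumption: default language keeps the bare URL
        return f"https://{domain}/{path}"
    if lang not in SUPPORTED:
        raise ValueError(f"unsupported language: {lang}")
    return f"https://{domain}/{lang}/{path}"

print(localized_url("hackerearth.com", "/challenges", "fr"))
```

Whichever structure you pick, each language version should also reference the others with hreflang annotations so the right version ranks in each locale.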
Intermediate & Advanced SEO | Rajnish_HE
-
Can I tell Google to Ignore Parts of a Page?
Hi all, I was wondering if there was some sort of HTML trick that I could use to selectively tell a search engine to ignore text on certain parts of a page. Thanks!
Intermediate & Advanced SEO | Charles_Murdock
-
Google WMT Turning 1 Link into 4,000+ Links
We operate 2 ecommerce sites. The About Us page of our main site links to the homepage of our second site. It's been this way since the second site launched about 5 years ago. The sites sell completely different products and aren't related besides both being owned by us. In Webmaster Tools for site 2, it's picking up ~4,100 links coming to the home page from site 1. But we only link to the home page 1 time in the entire site and that's from the About Us page. I've used Screaming Frog, IT has looked at source, JavaScript, etc., and we're stumped. It doesn't look like WMT has a function to show you on what pages of a domain it finds the links and we're not seeing anything by checking the site itself. Does anyone have experience with a situation like this? Anyone know an easy way to find exactly where Google sees these links coming from?
Intermediate & Advanced SEO | Kingof5
-
Overly-Dynamic URLs & Changing URL Structure w Web Redesign
I have a client that has multiple apartment complexes in different states and metro areas. They get good traffic and pretty good conversions, but the site needs a lot of updating, including the architecture, to implement SEO standards. Right now they rank for "<brand_name> apartments" everywhere but not "<city_name> apartments". Their current architecture displays URLs like:
http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview
http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=floorplans&floorPlanID=121
I know it is said to never change the URL structure, but what about this site? I see this URL structure being bad for SEO and bad for users, and it basically forces us to keep the current architecture. They don't have many links built to their community pages, so will creating a new URL structure and doing 301 redirects to the new URLs drastically drop rankings? Is this something we should bite the bullet on now for future rankings, traffic, and a better architecture?
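If you do bite the bullet, the redirect layer should parse the old query strings rather than match them literally, since parameter order can vary across inbound links. A sketch of that normalization (the new /communities/... path scheme is an invented example, not the client's real structure):

```python
# Illustrative sketch: map the old query-string URLs onto a clean path
# scheme before 301ing. The target scheme is an assumption.
from urllib.parse import urlsplit, parse_qs

def new_path(old_url):
    q = parse_qs(urlsplit(old_url).query)
    if q.get("mainLevelCurrent") == ["communities"] and "communityID" in q:
        base = f"/communities/{q['communityID'][0]}"
        if q.get("secLevelCurrent") == ["floorplans"] and "floorPlanID" in q:
            return f"{base}/floorplans/{q['floorPlanID'][0]}"
        return base
    return None   # no mapping: leave the URL alone

print(new_path("http://example.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview"))
```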
Intermediate & Advanced SEO | JaredDetroit
-
Why will Google not index my pages?
About six weeks ago we moved a subcategory out to become a main category using all the same content. We also removed hundreds of old products and replaced them with new variation listings to remove duplicate content issues. The problem is that Google will not index 12 critical pages, and our rankings have slumped for the keywords in those categories. What can I do to entice Google to index these pages?
Intermediate & Advanced SEO | Towelsrus
-
Export list of URLs in Google's index?
Is there a way to export an exact list of URLs found in Google's index?
Intermediate & Advanced SEO | nicole.healthline