Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Dynamic pages for locations, but SEOmoz counts them as duplicate content. Are there any good workarounds?
-
We have a client that has a list of locations. Instead of having a separate page for each of the 65 locations, the page content is dynamic; hence the duplicate content for each location. Just the 'city' variable changes on the page. Is there a graceful way to fix this problem so I don't get loads of errors in the crawl, or am I approaching this wrong?
Thanks in advance!
- Abe S.
-
Thanks for the quick response! I figured that's what I would have to do, and I do think it's the best practice as well. Just thought there might have been another way. Cheers!
-
If I understand your question correctly, you literally have duplicate content across the entire list of locations ("just the city variable changes on the page.").
If that understanding is correct, the best solution is to ensure that each location page has enough unique content that a location-to-location comparison shows substantially more unique content than duplicate content.
Alternatively, you could combine the least important locations onto a smaller number of pages, but the more locations you try to optimize a single page for, the more difficult the work becomes.
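To make the advice above concrete: rather than one template where only the city name changes, each location record can carry its own unique fields that feed the page. A minimal sketch in Python, where the field names (intro, landmarks) and the sample copy are illustrative assumptions, not from the thread:

```python
# Sketch: render one page per location, pulling unique fields per city
# instead of swapping only the city name into a shared template.
LOCATIONS = {
    "austin": {
        "intro": "Our Austin office has served Travis County since 2004.",
        "landmarks": ["Zilker Park", "South Congress"],
    },
    "dallas": {
        "intro": "The Dallas team specializes in commercial projects.",
        "landmarks": ["Klyde Warren Park", "Deep Ellum"],
    },
}

def render_location_page(city: str) -> str:
    """Build page copy that differs by more than the city variable."""
    data = LOCATIONS[city]
    nearby = ", ".join(data["landmarks"])
    return (
        f"<h1>Services in {city.title()}</h1>\n"
        f"<p>{data['intro']}</p>\n"
        f"<p>Serving neighborhoods near {nearby}.</p>"
    )
```

The point of the structure is that two rendered pages now differ in whole paragraphs, not just one substituted word, which is what the crawl comparison measures.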
Related Questions
-
Getting indexed in Google Scholar
Hi all! We have a client who publishes scholarly research as a highly regarded non-profit. Their publications aren't being indexed in Google Scholar 50% of the time, and when they are, Google is pulling random content from a PDF rather than from the HTML page. Any advice on best practices is enormously appreciated!
SERP Trends | SimpleSearch
Getting indexed by Google scholar
Often my Google Scholar alerts result in exactly what I think they will: scholarly articles published in academic journals. However, today I got this completely non-scholarly article https://www.t-nation.com/training/the-exact-reps-that-make-you-grow and I have no idea why Google Scholar is indexing this site. I've read up on how to get indexed by Google Scholar, and this website doesn't seem to have the necessary requirements. I'm curious for anyone whose clients or industry need to get indexed by Google Scholar, what has worked for you?
SERP Trends | newwhy
Best proxy service to browse the Google from different countries to check the ranking
Hi Moz community, We need to check our website's page rankings for random keywords at random times in different countries. Besides checking in Search Console, we would like to check in a browser, but Google no longer lets us browse the results for another country. I would like to use a good proxy service to browse Google from different locations to check how our pages are ranking and fluctuating. Please advise. Thanks
SERP Trends | vtmoz
Search results vary in chrome vs other browsers even in Incognito mode: Google's stand?
Hi all, We use incognito mode or private browsing to check the actual results, which are not impacted by previous history, location (sometimes), etc. Even when we browse this way, we can see different search results. Why would this happen? What's Google's stance on this? What is the right way to browse to get unbiased results for certain search queries? I have noticed that Chrome will rank our own websites a bit higher compared to other browsers, even in incognito mode. Thanks
SERP Trends | vtmoz
URL Parameter for Limiting Results
We have a category page that lists products. We have parameters, and the default is to display 9 products on the page. If the user wishes, they can view 15 or 30 products on the same page. The parameter is ?limit=9, ?limit=15, and so on. Google is recognizing this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions.

1. What should my goal be? Is it to have Google crawl the page with 9 items, or the page with all items in the category? In Search Console, the first part of setting up a URL parameter asks "Does this parameter change page content seen by the user?". In my opinion, the answer is Yes. Then, when I select how the parameter affects page content, I assume I'd choose Narrows, because it's either narrowing or expanding the number of items displayed on the page.

2. When setting up my URL parameters in Search Console, do I want to select Every URL, or just let Googlebot decide? I'm torn, because when I read about Every URL, it says this setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I began to second-guess the Narrows option. Now I'm at a loss on what to do. Any advice or suggestions will be helpful! Thanks.
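One common approach for this pattern (not suggested in the thread itself) is to point every limit variant at a single canonical version of the category page via a rel="canonical" tag. A minimal sketch of normalizing the URL by stripping the parameter, assuming the default view is treated as canonical:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url: str, param: str = "limit") -> str:
    """Drop the page-size parameter so ?limit=9, ?limit=15 and
    ?limit=30 all resolve to one canonical category URL."""
    parts = urlsplit(url)
    # Keep every query pair except the page-size parameter.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))
```

The resulting URL would go into each variant's canonical link tag; for example, canonical_url("https://example.com/shoes?limit=15") yields https://example.com/shoes while leaving unrelated parameters in place.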
SERP Trends | dkeipper
Ways to fetch search analytics - historical search query data from Google Search Console
Is there any way to fetch all historical search query data from Google Search Console? Google allows us to view only 90 days of reporting at the maximum. Does integrating Google Search Console with Google Analytics solve this problem?
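If the 90-day window described above is the constraint, one workaround is a scheduled export that walks the desired range in slices of at most 90 days and archives each slice. A hedged sketch of just the windowing logic; the actual fetch would go through the Search Console API and is not shown here:

```python
from datetime import date, timedelta

def date_windows(start: date, end: date, days: int = 90):
    """Split the inclusive range [start, end] into consecutive
    windows of at most `days` days, one per export request."""
    windows = []
    cursor = start
    while cursor <= end:
        stop = min(cursor + timedelta(days=days - 1), end)
        windows.append((cursor, stop))
        cursor = stop + timedelta(days=1)
    return windows
```

Running this on a schedule and storing each slice builds a history longer than any single report the interface will show.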
SERP Trends | NortonSupportSEO
Create original content or Copy from several sources?
I own a site that has about 15,000 pages that need some description content. I plan to hire someone to retrieve or create content for each of these pages; each page needs about 500 words. I was thinking there are probably three ways I can go about this:

1. Hire someone cheap and have them copy data from about 20 different sites, using about 5 sources for each page description.
2. Hire someone with some experience writing English content, have them visit a few sites, and then summarize the description in their own words on my site.
3. Hire a great content writer, have them do research on each page, and create completely unique content.

I will probably do some combination of these three things (the great content writer on a few pages, since that's all I can afford) and take the cheaper route for the rest. Is copying sentences from multiple sources a good idea, or does the content really need to be original?
SERP Trends | moneymm22
Can I submit multiple data feed to Google Merchant Center?
I have submitted my product feed to Google Merchant Center. There are 7K+ products in one product feed (data_feed.txt). I am not satisfied with my products' performance on Google Shopping, because I am not able to maintain data quality for all products or to focus on high-selling categories. Can I submit multiple product feeds, separated by category, as follows?

category001_data_feed.txt (with 200 high-selling products)
category002_data_feed.txt (with 100 offer products)
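For illustration, splitting one tab-delimited feed into per-category feeds can be sketched as below. The column name google_product_category is an assumption about the feed layout, and the usual constraint is that each product id should appear in only one of the feeds you register:

```python
import csv
import io
from collections import defaultdict

def split_feed(feed_text: str, category_field: str = "google_product_category"):
    """Split one tab-delimited feed into one feed per category value.
    Returns {category: feed_text_with_header}."""
    reader = csv.DictReader(io.StringIO(feed_text), delimiter="\t")
    groups = defaultdict(list)
    for row in reader:
        groups[row[category_field]].append(row)
    feeds = {}
    for category, rows in groups.items():
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=reader.fieldnames, delimiter="\t")
        writer.writeheader()  # each split feed keeps the original header row
        writer.writerows(rows)
        feeds[category] = out.getvalue()
    return feeds
```

Each returned string could then be saved as its own file (e.g. the hypothetical category001_data_feed.txt) and registered as a separate feed.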
SERP Trends | CommercePundit