Filter Content by State Selection and SEO Considerations
-
I have an insurance client that is represented in three states. They need to present different information to users based on the state they identify. They prefer to have one page with all the information, and then present the information relevant to the state via the user's selection from a pop-up window. Spiders will be able to index all the content; users will only see the content based on their selection.
So, I wanted to ask the Moz community: what SEO implications could this have? The information available on the web about this situation is very thin, so I'd really appreciate any guidance that can be given...thanks,
-
@liamis
As a rule of thumb, it is better to have one strong page instead of three similar pages that Google may treat as duplicates, so it seems you're making the right decision. Regarding the popup, it really depends on the implementation. In general, Google dislikes popups, especially ones that open a new window. Also consider the popup's behaviour on mobile: it may result in a weaker CLS score and failed Core Web Vitals, which will affect rankings. Even worse, you will probably end up with two popups: one for cookie consent and a second for the state-relevant information.
My advice: try to avoid popups and find another way to display the state-specific information. There are several options: use a GET parameter (?state=xx) and/or geotargeting, and mark this parameter as "specifies page content" in Google Search Console; use a drop-down menu at the top to display state-related info; or use three different URLs and specify one of them as canonical, so that Google will not treat these pages as duplicates.
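To illustrate the ?state=xx idea: all three states' sections can stay in the HTML so spiders index everything, while a small script shows only the section matching the parameter. A minimal sketch, assuming hypothetical state codes and `data-state` attributes (none of these names come from the original question):

```javascript
// Hypothetical state codes for illustration only.
const SUPPORTED_STATES = ["tx", "fl", "ga"];

// Pure helper so the logic is easy to test: pick a valid state from the
// query string, or fall back to a default.
function resolveState(queryString, fallback = "tx") {
  const params = new URLSearchParams(queryString);
  const state = (params.get("state") || "").toLowerCase();
  return SUPPORTED_STATES.includes(state) ? state : fallback;
}

// In the browser you would then toggle visibility, e.g.:
// document.querySelectorAll("[data-state]").forEach((el) => {
//   el.hidden = el.dataset.state !== resolveState(location.search);
// });
```

Because the hidden sections remain in the DOM rather than being fetched on selection, crawlers still see the full content of all three states.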
Related Questions
-
Site speed
On-Page Optimization | | zlbvasgabc
I use mid-quality pics and... but my site speed is low. Any suggestions? My site is: https://bandolini.ir/
-
Looking for feedback on review website
Content Development | | Paul-Paquin-Golden-Financial
Hello Moz community. I'm looking for feedback on how to improve this national company review/comparison website. We bring in experts in the industries we are writing reviews on. We'll then conduct extensive keyword research and provide an SEO-optimized content briefing to the writer. The writer creates the content, including the high-search-volume and most relevant keyword phrases people are searching for. We also make sure the keywords match commercial intent so the content fits our audience. For example, we brought in a leading audiologist and created a "top hearing aids for tinnitus" piece. How can we improve the process and site? Any recommendations?
-
How to get rid of bad backlinks
Link Building | | landlwoof4
So I noticed my rankings going down and my spam score going up. Under my spam score there are over 100 links from different websites, but ALL redirect to semalt.com. I researched it and it says they hijacked a bunch of backlinks, but I don't know much more. How can I get rid of all those backlinks? I was told I could use the disavow tool, but apparently that can hurt my ranking as well. The semalt.com site has no backlink to me; it looks like the pages that have/had my backlinks now redirect to them. For instance, this is one of the links: http://www.oxvideos.xyz/indianantyphotoxxx
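If you do decide to disavow, the file format itself is simple: one URL or `domain:` rule per line, with `#` comments. A hypothetical sketch using the domains mentioned above; verify the actual referring pages in your own backlink export before submitting anything:

```text
# Backlinks hijacked via redirects to semalt.com
domain:semalt.com
domain:oxvideos.xyz
```

Google has stated that disavowing only tells it to ignore those links; the main risk of hurting rankings comes from accidentally disavowing good links, which is why auditing the list first matters.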
-
How to best handle search landing pages - that don't exist
On-Page Optimization | | amac7
I have quite a bit of blog information that can be searched, which results in "pages" that don't actually live anywhere. These are scanned by Moz and flagged for poor page quality, speed, etc. How do I get the service to either ignore all of these, or is there a way to treat them as real pages with content? As quite a few have been generated over time, I'd like to be able to capture them somehow. Thanks.
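One common way to keep crawlers, including Moz's rogerbot, out of generated search-result URLs is to disallow the search path or parameter in robots.txt. A sketch assuming a hypothetical /search/ path and ?s= parameter; adjust to whatever URL pattern Moz actually reports for these pages:

```text
User-agent: *
# Hypothetical paths: replace with your site's real search URL pattern
Disallow: /search/
Disallow: /*?s=
```

Both rogerbot and Googlebot respect robots.txt, and Google supports the `*` wildcard shown here, so this stops the thin generated pages from being crawled and flagged in the first place.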
-
International SEO and duplicate content: what should I do when hreflangs are not enough?
Intermediate & Advanced SEO | | GhillC
Hi, a follow-up question from another one I had a couple of months ago. It has now been almost two months since my hreflangs went in place. Google recognises them well and GSC is clean (no hreflang errors). Though I've seen some positive changes, I'm quite far from sorting out the duplicate content issue completely, and some entire sub-folders remain hidden from the SERP. I believe this happens for two reasons:
1. Fully mirrored content: as per the link to my previous question above, some parts of the site I'm working on are 100% similar. Quite a "gravity issue" here, as there is nothing I can do to fix the site architecture or to get bespoke content in place.
2. Sub-folder "authority": I'm guessing that Google prefers some sub-folders over others due to their legacy traffic/history, meaning that even with hreflangs in place, the older sub-folder ranks over the right one because Google believes it provides better results to its users.
Two questions from these reasons:
1. Is the latter correct? Am I guessing correctly about "sub-folder authority" (if such a thing exists), or am I simply wrong?
2. Can I solve this using canonical tags? Instead of trying to fix and "promote" hidden sub-folders, I'm thinking of actually reinforcing the results I'm getting from stronger sub-folders. I.e., if a user based in Belgium Googles something relating to my site, the site.com/fr/ sub-folder shows up instead of the site.com/be/fr/ sub-sub-folder. Or if someone based in Belgium uses Dutch, they would get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder. Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for the second case. I'd prefer traffic coming to the right part of the site for tracking and analytics reasons. However, instead of trying to move mountains by changing Google's behaviour (if I ever could), I'm thinking of encouraging the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what). That second question is the main reason I'm looking for the Moz community's advice: am I going to damage the site badly by using canonical tags that way? Thank you so much!
G
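For reference, the canonical approach described would be a single tag in the head of each /be/fr/ page pointing at its /fr/ equivalent. A hypothetical sketch (URLs illustrative):

```html
<!-- On https://site.com/be/fr/some-page: -->
<link rel="canonical" href="https://site.com/fr/some-page" />
```

One caution: Google's hreflang documentation says annotations should point at canonical URLs, and canonicalising /be/fr/ to /fr/ effectively asks Google to drop /be/fr/ from the index, so the hreflang set and the canonical tags need to agree or Google may ignore one of the signals.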
-
Does my website have enough content on it to rank?
Intermediate & Advanced SEO | | Green.landon
I have little content on my website; is this okay, or do I need to add more content to my pages? The website is brandstenmedia.com.au. Any other suggestions for the website?
-
What are your thoughts on content automation?
Intermediate & Advanced SEO | | Shamima
Hi, I want to ask forum members' opinions on content automation. And before I raise the eyebrows of many of you with this question, I'd like to state that I am creating content and doing SEO for my own website, so I'm not looking to cut corners with spammy tactics that could hurt my website from an organic search perspective. The goal is to automate pages in the areas of headings, meta titles, meta descriptions, and perhaps a paragraph of content. More importantly, I'd like these pages to add value to the user's experience, so the question is: how do I go about automating the pages, and more specifically, how are meta titles, meta descriptions, etc. automated? I'd also like to hear from people who recommend steering clear of any form of content automation. I hope my question isn't too vague, and I look forward to hearing from other Mozzers. Regards, Russell in South Africa
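As one answer to the "how are meta titles and descriptions automated" part: typically the pages are generated from structured records, and the meta fields are filled from templates. A minimal sketch, assuming hypothetical field names and a made-up brand; the character limits are rough display conventions, not guarantees:

```javascript
// Hypothetical template-based meta generation from structured page data.
function buildMeta(topic, city) {
  const title = `${topic} in ${city} | Your Brand`;
  const description =
    `Compare ${topic.toLowerCase()} options in ${city}. ` +
    `Pricing, reviews and expert advice, updated regularly.`;
  // Trim to rough SERP display limits so snippets aren't badly truncated.
  return { title: title.slice(0, 60), description: description.slice(0, 155) };
}

console.log(buildMeta("Office Chairs", "Cape Town").title);
// → "Office Chairs in Cape Town | Your Brand"
```

The value-add question then comes down to the quality of the underlying data: templated meta tags over genuinely distinct records read fine, while the same template over near-identical records just produces thin duplicates at scale.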
-
Duplicate content clarity required
Intermediate & Advanced SEO | | jayderby
Hi, I have access to a massive resource of journals, and we have been given the all-clear to use the abstracts on our site and link back to the journals. These will be really useful links for our visitors. E.g. http://www.springerlink.com/content/59210832213382K2 Simply put: if we copy the abstract and then link back to the journal source, will this be treated as duplicate content and damage the site, or is the link to the source enough for search engines to realise we aren't trying anything untoward? Would it help if we added an introduction, so that in effect we are following the curated content model? We are thinking of linking back internally to a relevant page using a keyword too. Will this approach give any benefit to our site at all, or will the content be ignored as duplicate, rendering the internal links useless? Thanks, Jason