SEO: Overly-Dynamic URLs on a Website with Thousands of Pages
-
Hello,
I have a new client who has a Diablo 3 database. They have created a very interesting site in which every "build" is its own URL. Every page is a list of weapons and gear for the gamer. Readers may love this, but it's a nightmare for SEO. I have pushed for a blog to help generate inbound links and traffic, but overall I feel the main feature of their site is a headache to optimize.
They have thousands of pages indexed in Google, but none really stands on its own. There is no strong content, no H-tags, no real substance at all.
With no definition for each page, Google sees the site as one huge mess, with duplicate page titles and too many on-page links.
The first thing I did was tell them to add a canonical link, which dropped the errors by about 12K, leaving 2,400... a nice start, but the remaining errors are still a challenge.
I'm thinking about seeing if I can either give each page its own blurb and H-tags, or simply noindex the nav bar and all the links in the database. That way the site is left with only a handful of URLs plus the blog and forum.
Thoughts?
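For reference, the canonical-plus-noindex combination described above could be sketched like this (a rough sketch only; the base domain, URL structure, and the `slug` field are all made up for illustration):

```python
# Rough sketch of per-page head tags: a canonical link for every build page,
# plus an optional robots noindex for thin pages we want out of the index.
# BASE_URL and the "slug" field are hypothetical.
BASE_URL = "https://example.com"

def head_tags(build, noindex=False):
    """Render the canonical link (and optional noindex) for a build page."""
    tags = ['<link rel="canonical" href="%s/builds/%s" />' % (BASE_URL, build["slug"])]
    if noindex:
        # "noindex, follow" keeps the page out of the index but still lets
        # crawlers follow its links to the rest of the site.
        tags.append('<meta name="robots" content="noindex, follow" />')
    return "\n".join(tags)

print(head_tags({"slug": "whirlwind-barb"}, noindex=True))
```

The advantage of generating these from the database rather than hand-editing templates is that the canonical always matches the page's own preferred URL.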
-
You bet. Just to be clear, I was talking about pulling content from the page into the title in some automated fashion: finding elements on each page that should be variables, then inserting them into the title, description, and H1 in a way that makes each page unique.
You would need to make the final call on whether you think the content is unique enough. We had 5,000 locations, and we used data from the database to make 5,000 unique pages, since each location has the business name, city, state, ZIP, address, phone number, etc.
We are now working to add user-generated content/comments/reviews to each page so that every page becomes more unique and more useful over time.
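As a rough sketch of what that automation might look like (the field names and wording here are hypothetical, not from any real system):

```python
def page_meta(loc):
    """Build a unique title, meta description, and H1 from database fields."""
    title = f'{loc["name"]} – {loc["city"]}, {loc["state"]} {loc["zip"]}'
    description = (f'Visit {loc["name"]} at {loc["address"]}, {loc["city"]}, '
                   f'{loc["state"]}. Call {loc["phone"]}.')
    h1 = f'{loc["name"]} in {loc["city"]}, {loc["state"]}'
    return {"title": title, "description": description, "h1": h1}

meta = page_meta({
    "name": "Acme Widgets", "address": "123 Main St", "city": "Springfield",
    "state": "IL", "zip": "62701", "phone": "555-0100",
})
print(meta["title"])
```

Because every field comes from the record itself, no two locations share a title or description as long as the underlying data differs.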
Having an IT guy who appreciates SEO is key for this, and for the URLs. I would talk first about how he organizes the data in his system and how that translates into the URL; you can then just have him rename the URLs using the same logic.
Show him some data on click-through rates for more readable URLs, and how Google prefers not to spider overly dynamic ones. I work to educate the IT guys as much as I can without making it sound like I know it all.
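A minimal sketch of that "same logic in the URL" idea, assuming a made-up class/build hierarchy (the record fields and path layout are invented for illustration):

```python
import re

def slugify(text):
    """Lowercase, strip punctuation, and hyphenate for a readable URL segment."""
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return text.strip("-")

def build_url(record):
    # Mirror the database's own hierarchy (class -> build) in the path,
    # so the IT guy only has to reuse logic he already has.
    return f'/{slugify(record["class"])}/{slugify(record["build_name"])}/'

print(build_url({"class": "Witch Doctor", "build_name": "Zombie Bears (Patch 1.0.8)"}))
# -> /witch-doctor/zombie-bears-patch-1-0-8/
```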
-
Hey CleverphD,
There isn't a strong amount of traffic to sway any decision one direction or another at this point, but I'm sure this is the bread and butter of the site. Their programmer is going to hate me, but I agree there needs to be a way to optimize and, like you say, maximize the long-tail details.
So you think pulling data into the title is the best route? Maybe add a little content... I'm not overly confident that this guy knows how to create pretty, SEO-friendly URLs from the data provided, but I'll try to explain why it's important.
thx
-
Wow! Yep, you've got to lock this one down. This needs heavy automation support. Have you looked at whether the content from each build can be pulled into the title tag, etc., automatically? That can help diversify how the tags look. If users love all the builds, is there a way to use automation to help Google see that?
If you work it right, you have an awesome opportunity for long-tail search. I worked on a site that had a yellow-pages-type setup. All the pages had title tags and descriptions that automatically pulled in the city, state, ZIP, address, and name of each location. We even changed up the order of presentation and had different options for filler/connective words.
It worked pretty well to show off the unique content on each page as best we could automatically. We then paid an intern to go in and optimize page by page from there, starting with the most-viewed pages.
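The order-changing / filler-word idea could be sketched roughly like this (the templates and field names are invented; the key point is picking the variant deterministically per record so a given page always gets the same title between crawls):

```python
import zlib

# A few invented title patterns with different orders and connective words.
TEMPLATES = [
    "{name} | {city}, {state} {zip}",
    "{name} in {city}, {state} – {address}",
    "{city} {state}: {name} at {address}",
]

def title_for(loc):
    # Hash a stable key (not random.choice) so the same record always
    # renders the same title; randomness would churn titles on every build.
    idx = zlib.crc32(loc["name"].encode()) % len(TEMPLATES)
    return TEMPLATES[idx].format(**loc)

loc = {"name": "Acme Widgets", "address": "123 Main St",
       "city": "Springfield", "state": "IL", "zip": "62701"}
print(title_for(loc))
```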
It may be that each page isn't strong enough on its own, as you mention, but you also said that users love the pages, so I wanted to toss that out there.