Latest posts made by GFD_Chris
-
RE: Sitemap use for very large forum-based community site
Agreed, you'll likely want to go with option #2. Dynamic sitemaps are a must when you're dealing with large sites like this. We recommend them for all of our clients with larger sites. If your forum content is important for search, then it's definitely worth including in your sitemaps, since that content likely changes often and may sit naturally deeper in the architecture.
In general, I'd think of sitemaps from a discoverability perspective instead of a ranking one. The primary goal is to give Googlebot an avenue to crawl your site's content regardless of internal linking structure.
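As a rough illustration, here's what generating one of those dynamic sitemap files might look like, using only the Python standard library. This is just a sketch; `thread_rows` is a hypothetical stand-in for whatever your forum's database query returns:

```python
# Minimal sketch of a dynamically generated sitemap file.
# `thread_rows` is a hypothetical stand-in for a database query returning
# (url, last_modified) pairs for your forum threads.
from xml.sax.saxutils import escape

def build_sitemap(thread_rows):
    """Render (url, lastmod) pairs as sitemap XML, capped at the protocol limit."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in thread_rows[:50000]:  # 50,000 URLs max per sitemap file
        lines.append(f"  <url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap([("https://forum.example.com/threads/12345", "2020-01-15")]))
```

On a forum this size, you'd typically regenerate these files on a schedule (or on publish) and list each one in a sitemap index.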
posted in Technical SEO
-
RE: Does data-bind hurt SEO?
No problem! A good rule of thumb for JavaScript SEO is to always server-side render (SSR) where possible. Let me know if you have any other questions!
posted in Moz Bar
-
RE: Does data-bind hurt SEO?
Basically, those tools aren't reading the rendered DOM, but Google can, which is why it can see your site's title tags, H1s, etc. Your site is using client-side rendering, which Google is able to render and crawl. Notice how if you go to a given page and click "View Source", none of the page's content appears.
While it appears Google is reading the content on the pages I looked at, I would definitely look into this more to verify that Google is able to crawl and index the content on all of your site's pages. Client-side rendering is less reliable than SSR, so there might be instances where Google isn't reading sections of your content.
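If it helps, here's a quick way to spot-check individual pages: compare the raw HTML response against what you see in the browser. This is a rough sketch assuming the third-party `requests` library; the URL and phrase are placeholders:

```python
# Quick client-side rendering check: does visible page copy appear in the
# raw (unrendered) HTML source? Requires the third-party `requests` library.
import requests

def in_raw_html(url, phrase):
    """Return True if `phrase` appears in the HTML before JavaScript runs."""
    html = requests.get(url, timeout=10).text
    return phrase in html

# False for text you can see in the browser means that content is injected
# by JavaScript after load, i.e. client-side rendered.
print(in_raw_html("https://example.com/some-page", "Text from your H1"))
```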
posted in Moz Bar
-
RE: My last site crawl shows over 700 404 errors all with void(0 added to the ends of my posts/pages.
In the HTML of your pages, there are links with an href of "javascript:void(0)", and it appears Googlebot is crawling those pseudo-URLs. If possible, remove those links or move the click behavior off of the <a> elements. Otherwise, you should be OK; those pages should 404.
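To illustrate the pattern (this markup and the handler name are hypothetical):

```html
<!-- Problem: a pseudo-URL in an <a> element that crawlers can pick up -->
<a href="javascript:void(0)" onclick="openMenu()">Menu</a>

<!-- One fix: trigger the behavior from an element with no crawlable href -->
<button type="button" onclick="openMenu()">Menu</button>
```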
posted in Intermediate & Advanced SEO
-
RE: Anyone have a collection or list of spammy websites?
Hey there! While I don't have a list of spammy websites, here's a general overview of how I analyze whether or not a website is spammy:
- Does the website have a high Spam Score when running it through Moz's Link Explorer?
- Does the website have an alternative TLD? (ga, rn, cl)
- Does the website have a low domain authority?
- Does the website use similar templates to others that are possibly low quality?
- Does the website cover a topic that's totally unrelated to the linking site?
- Is the website part of a blogspot link network?
- Does the website pass the "eye test" and look like it could be useful for users?
Of course, one or two of these qualities alone doesn't make a site low quality, but it does raise a flag that it could be. For me, if a site has multiple factors that suggest low quality and it doesn't pass the "eye test," then I might consider it a spammy domain. You can then submit all spammy domains to your disavow file.
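For reference, the disavow file itself is just plain text; here's a minimal example with placeholder domains:

```text
# Lines starting with # are comments
# Disavow every link from an entire domain:
domain:spammy-network.example
domain:low-quality-blog.example
# Or disavow links from a single page:
https://another-site.example/spammy-page.html
```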
posted in Content Development
-
RE: "5XX (Server Error)" - How can I fix this?
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages actually return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see if URLs go from being crawled as 200 to returning 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the volume of requests being sent to your site. If not, and your URLs are consistently reported as 200, the 5xx responses might have been a temporary status the crawler found.
- If possible, check your log files to see whether Googlebot is being served a large number of 5xx errors (see the sketch below).
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them, or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back recent development changes. Definitely lean on your developers' insights here.
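For the log file step, here's a rough Python sketch of how you might count 5xx responses served to Googlebot. It assumes a standard combined log format; the log path and field positions are assumptions to adjust for your server:

```python
# Sketch: count 5xx responses served to Googlebot in a combined-format
# access log, e.g. ... "GET /page HTTP/1.1" 503 512 "-" "Googlebot/2.1 ..."
from collections import Counter

def googlebot_5xx(log_path):
    """Count (URL, status) pairs for 5xx responses served to Googlebot."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            parts = line.split('"')          # combined format quotes the request
            if len(parts) < 3:
                continue
            request = parts[1].split()       # e.g. ['GET', '/page', 'HTTP/1.1']
            status = parts[2].split()        # status code follows the request
            if len(request) > 1 and status and status[0].startswith("5"):
                counts[(request[1], status[0])] += 1
    return counts

for (url, code), n in googlebot_5xx("access.log").most_common(10):
    print(f"{n:5d}  {code}  {url}")
```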
posted in Technical SEO
-
RE: Is there a way to download a report showing all meta descriptions for our web pages?
Hey Lydia!
You can simply perform a site crawl using Screaming Frog. From there, select the "Meta Descriptions" tab and click "Export", which should provide you with all of your site's meta descriptions.
posted in Link Explorer
-
RE: 301 Redirects for Multiple Language Sites in htaccess File
You might be able to do this using the "RewriteCond" directive to match the old hosts and rewrite to the new subdomain paths. This guide should give you the basic template: https://www.hostwinds.com/guide/redirect-subdomain-using-htaccess/
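As a rough sketch of the pattern in .htaccess — the hostnames here are placeholders, so test carefully before deploying:

```apache
# Sketch: 301-redirect one language subdomain to another, preserving paths.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^fr\.old-example\.com$ [NC]
RewriteRule ^(.*)$ https://fr.new-example.com/$1 [R=301,L]
```

You'd repeat the RewriteCond/RewriteRule pair for each language subdomain you need to move.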
Let me know if you have any other questions!
posted in Intermediate & Advanced SEO
-
RE: What to do with sold product pages when everything you sell are unique one off items
Without being able to see the site, here are some steps I would recommend taking:
- 404 all sold products
- Require users to enter unique titles, descriptions, and images. Even if they create something similar, these elements can still be unique
- Show "Similar Products" for all sold pieces
Technically, redirecting isn't best practice unless the destination page is very similar, and redirects at scale can also slow down your site's server response times. Leaving sold pages live with 200 status codes, meanwhile, seems likely to create a lot of duplicate content.
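As a minimal sketch of the 404 approach — Flask is used purely for illustration, and the product data and "sold" flag are hypothetical stand-ins for your own data layer:

```python
# Sketch: return a 404 for sold one-off products instead of redirecting.
# Requires the third-party Flask library; the data below is a placeholder.
from flask import Flask, abort, render_template_string

app = Flask(__name__)

PRODUCTS = {"handmade-vase-001": {"name": "Handmade Vase", "sold": True}}

@app.route("/products/<slug>")
def product_page(slug):
    product = PRODUCTS.get(slug)
    if product is None or product["sold"]:
        abort(404)  # sold pieces drop out of the index over time
    return render_template_string("<h1>{{ name }}</h1>", name=product["name"])
```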
If you send the site, I could take a further look here!
posted in Intermediate & Advanced SEO
-
RE: Creating a subdomain or subdirectory for each major city for our main website...
If you're seeing localized content rank for different city/state variants, then you can definitely create location pages! However, you'll want to ensure that these pages are unique from your home page and contain content localized to that specific area.
For example, here is a page that's targeted to "los angeles car insurance": https://www.esurance.com/insurance/car/states/california/los-angeles
Instead of duplicating the home page's content, this page covers:
- Required coverages for LA
- Driving in LA
- Auto repair locations
If you want to create dedicated local pages, this is definitely the best strategy!
posted in Link Building
I am a Senior SEO Manager for the Go Fish Digital team. I work with unique problems and advanced search situations to help clients improve organic traffic through a deep understanding of Google's algorithm and web technology.
I love looking into interesting search problems! Feel free to reach out at chris.long@gofishdigital.com