Hi all,
We have an opening for a Senior SEO Associate and would love to hire someone from the Moz community. Here are the details: Sr SEO Associate https://g.co/kgs/Ucwzp7
Cheers,
Dana
Hi Steve,
This appears to be schema markup used to help search engines understand the nature of content inside objects they have trouble fully parsing on their own. For example, I found "keywords" listed as a possible markup property for a video: http://schema.org/VideoObject
I don't think this plays into rankings at all. Google is so over that kind of easy manipulation. However, I DO think that when these are marked up in conjunction with ALT attributes for images, or transcriptions for videos, they can help Google understand the semantic relevance of that content. For example (and I am totally making this up), imagine a video of a veterinarian administering vaccines to an animal. Throughout the video the vet keeps referring to the animal as "the patient." From the transcript alone, a search engine (or someone who's visually impaired) wouldn't know that this video is about medicine for animals rather than humans. Using the schema.org keywords markup would allow terms like "animal vaccine best practices" to be included, helping search engines better understand what the content is really about.
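To make that concrete, here's a minimal sketch of what such markup might look like in JSON-LD (all URLs, names, and keyword values here are hypothetical):

```html
<!-- Hypothetical VideoObject markup using the "keywords" property -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "VideoObject",
  "name": "Administering Vaccines to Our Patients",
  "description": "A veterinarian demonstrates routine vaccination.",
  "thumbnailUrl": "http://www.example.com/thumbs/vaccine-video.jpg",
  "uploadDate": "2016-01-15",
  "contentUrl": "http://www.example.com/videos/vaccine-video.mp4",
  "keywords": "animal vaccine best practices, veterinary medicine"
}
</script>
```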
That's my 2 cents. Hope it helps!
Dana
Hi Steve,
This is a great question. I think it depends entirely on how much search volume there is for the other variants of "refurbished" parts. If there's a reasonable amount, I'd recommend giving them their own URLs. I know this is harder, because if the variants are substantively similar, producing unique content for each could be more difficult. I believe you can use schema.org markup to indicate "condition" [http://schema.org/OfferItemCondition]. This would help search engines understand that these items are distinct from each other in important ways.
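For illustration, a minimal sketch of how that might look in JSON-LD (the product name and price are hypothetical):

```html
<!-- Hypothetical Product markup flagging the offer as refurbished -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Acme 18V Cordless Drill (Refurbished)",
  "offers": {
    "@type": "Offer",
    "price": "79.99",
    "priceCurrency": "USD",
    "itemCondition": "http://schema.org/RefurbishedCondition",
    "availability": "http://schema.org/InStock"
  }
}
</script>
```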
As long as it was clear to both humans and search engines that these power tools are uniquely different from each other in some way, I'd opt for the separate URLs and optimize them for long-tail terms.
If the only difference were color, say a power tool came in red or black, then maybe I would consider making these attributes that don't necessarily influence the URL. Again, the caveat is search volume: if there were significant search volume for different colors, then having separate URLs and schema markup for each would be the way to go.
This is a similar type of question eCommerce site merchandisers (and SEOs!) ask themselves when strategizing how to handle faceted navigation. What combinations of facets warrant their own URLs and which ones do not? I would let search demand guide your answer.
I agree with Logan. If the total number of URLs in your master sitemap is 50,000 or less, you can use one sitemap. If it's more, split it into multiple sitemaps and reference them from a sitemap index file.
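For reference, the split works by listing each child sitemap in a sitemap index file, something like this (the filenames are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit the index file itself in Search Console.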
You should mark Logan's answer as a good answer.
Sincerely,
Dana
Hi Bob,
I second Paul. His answer is a good one. Hope we helped you.
Sincerely,
Dana
Hi again Bob,
Take a look at this thread on how to remove query strings from static resources...I believe your answer is there.
https://wordpress.org/support/topic/how-to-remove-query-strings-from-static-resources
Dana
P.S. Why is this a problem for SEO? A couple of reasons:
1. It's highly likely your content will get shared both with and without the query parameter, which effectively splits your link equity between two versions of the same page (see the canonical-tag sketch below for a common mitigation).
2. Google Search Console is very bad at understanding that the page without the query string is the same as the page with it, so you'll likely get a lot of duplicate content notifications.
3. From an end-user standpoint, it's just plain ugly...and end-user experience matters to SEO, right? I understand that's somewhat facetious...but it's your business, right? You want it to look like a good, solid, high-quality, professional site. Ugly query parameters scream "I hired my 21-year-old nephew to build me a WordPress site."
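One common mitigation (not a complete fix) is a canonical tag pointing at the clean URL, so both the parameterized and clean versions consolidate to one page. A minimal sketch, with a hypothetical URL:

```html
<!-- Served on both /blog/my-post/ AND /blog/my-post/?utm_source=newsletter,
     so link equity consolidates on the clean version -->
<link rel="canonical" href="http://www.example.com/blog/my-post/" />
```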
Hi Bob,
What CMS are you working with? Once you answer that I might be able to help a little more.
Dana
Hi,
I agree with EGOL that the "100 links" rule is old information.
To answer your question more specifically: yes, links in your global navigation, your footer, and on your category and product pages are all counted as internal links, and all of them (provided you haven't done anything silly like adding "nofollow" attributes) pass link equity throughout your site. For this reason, it's important to be strategic about the architecture of your navigation and internal linking structure. Ideally, your most important pages should be included, if possible, in your navigation and/or footer.
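To illustrate the "nofollow" point, here's the difference in plain HTML (the URLs are hypothetical):

```html
<!-- A normal internal link: crawled and passes link equity -->
<a href="/category/power-tools/">Power Tools</a>

<!-- The rel="nofollow" attribute asks search engines not to pass equity -->
<a href="/category/power-tools/" rel="nofollow">Power Tools</a>
```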
It's not unusual for large eCommerce sites to have significantly more than 100 links on a given page.
For example, Home Depot ranks #2 in Google for the term "flushmount lights" with this page: http://www.homedepot.com/b/Lighting-Ceiling-Fans-Ceiling-Lights-Flushmount-Lights/N-5yc1vZc7nk
As you can see from the attached screenshot, this page has 523 links on it. While it clearly exceeds the "100 links" guideline, it still has no problem ranking very well for a targeted keyword.
For verification that Google dropped the "100 links" rule, check out this Matt Cutts video from November 2013: https://www.youtube.com/watch?v=QHG6BkmzDEM
EGOL is also right that Moz should update their suggested SEO best practices to reflect more current methodology.
Hope that's helpful!
Dana
Thanks Lesley. Yes, I agree. I think the only way we are going to get a definitive answer is to look at the logs. We are working on getting access.
Hi All! What constitutes a normal crawl rate for daily bingbot server requests for large sites? Are any of you noticing spikes in Bingbot crawl activity?
I did find a "mildly" useful thread at Black Hat World containing this quote: "The reason BingBot seems to be terrorizing your site is because of your site's architecture; it has to be misaligned. If you are like most people, you paid no attention to setting up your website to avoid this glitch. In the article referenced by Oxonbeef, the author's issue was that he was engaging in dynamic linking, which pretty much put the BingBot in a constant loop.
You may have the same type or similar issue particularly if you set up a WP blog without setting the parameters for noindex from the get go."
However, my gut instinct says this isn't it and that it's more likely that someone or something is spoofing bingbot.
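For what it's worth, we plan to check for spoofing by running reverse DNS lookups on the requesting IPs; genuine bingbot IPs should resolve to hostnames under search.msn.com. And if the crawler turns out to be legitimate but too aggressive, my understanding is that Bing honors a Crawl-delay directive in robots.txt, something like this (the delay value is just an example):

```
# Ask Bing's crawler to wait N seconds between requests
User-agent: bingbot
Crawl-delay: 5
```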
I'd love to hear what you guys think!
Dana
Great question! Google AdWords Keyword Planner doesn't make this feature obvious or easy to find. Here's how to do what you need to do:
1. Log in to Keyword Planner
2. Click "Get Search Volume for a List of Keywords"
3. Upload your keyword list via the resulting popout screen
4. Run the report
5. On the resulting screen, in the upper right just below the chart, there's a link that says "Download Ideas" - click that
Then you'll have all the data in Excel, provided you're running something newer than Excel 2003, which was limited to 65,536 rows.
Does that answer your question, or was I way off? Let me know. Hope it helps!
I believe the problem here is caused by the fact that you're using relative, rather than absolute, URLs in your canonical tags. I've seen this happen before on a site I was working on. Thanks to awesome suggestions from Moz Q&A community member George Andrews (endorsed by Dr. Pete Meyers), we updated all of our canonical tags to use absolute URLs instead of relative URLs. This completely solved the exact problem you're describing.
Here's a link to that thread: http://moz.com/community/q/what-is-the-proper-syntax-for-rel-canonical
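To illustrate the difference (the URL is hypothetical):

```html
<!-- Relative canonical: ambiguous, can resolve against the wrong host or path -->
<link rel="canonical" href="/products/widget/" />

<!-- Absolute canonical: unambiguous, the safe choice -->
<link rel="canonical" href="http://www.example.com/products/widget/" />
```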
The best news is, it's a very easy, inexpensive and quick SEO win. I love those!
Dana
Hi Garrett,
It really depends on the nature and use of the image, so I'd take it on a case-by-case basis. If you created unique images that received a ton of links or social shares, or that provided something important to your end users, then sure, recreating them at the same URLs might be a great idea. You might want to spot-check a few in Ahrefs, OSE, and Google Analytics to see if they were generating buzz, traffic, links, or all three. Choose some likely candidates, see what those tools show you, and let that guide how you proceed.
Cheers,
Dana
If the images are no longer on the site, it's perfectly acceptable to let them 404. Those 404s aren't going to hurt you. They should eventually drop out of Google's index, although GWT can have a very long memory and hold on to 404 errors for what seems like forever.
If you think there's value for your end users in redirecting to a relevant image on the current site, that's fine too. I just wouldn't let fear that 404 errors are somehow hurting you be your motivation. If a 404 results from content being removed, that's a perfectly legitimate reason for it.
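If you do go the redirect route and you're on Apache, a single line of .htaccess per image would do it (the paths here are hypothetical):

```
# 301-redirect a removed image to the closest relevant current image
Redirect 301 /images/old-product-photo.jpg /images/current-product-photo.jpg
```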
Hope that helps!
Dana