Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
fablau
@fablau
Job Title: SEO
Company: Virtual Sheet Music
Favorite Thing about SEO
Playing with keywords
Latest posts made by fablau
-
RE: Combining images with text as anchor text
Any more thoughts on this?
-
RE: Combining images with text as anchor text
Thank you Samuel for your reply as well.
Yes, what you describe is exactly what I also learned: there's no need to be too redundant with keywords, since search engines will understand from the surrounding context... The fact is, some of our competitors are doing what I am suggesting here, and they dominate the first spot on Google for most of the keywords we compete on. They also have a clearer, "siloed" category/sub-category structure than ours, which suggests this technique, combined with siloing, helps a great deal. (Also note that for most of the category pages we compete on, we have far more external links than they do! Hence my thought that a clearer, siloed structure could help.)
And of course, anything we do is with the user in mind: ALT text is always meant for users first, but I don't see any harm in being a little redundant there if it could help with SEO as well, do you?
Thank you again very much, and please share any additional ideas you may have!
-
RE: Combining images with text as anchor text
Thank you Rob for your extensive reply.
I see what you mean, and I am aware of that. This "link technique" suggestion is part of a bigger plan I am working on, where the goal is to create a more "siloed" structure to increase topical relevancy, as I have discussed in this other thread of mine:
https://moz.com/community/q/panda-rankings-and-other-non-sense-issues
And even though that's a minor thing, everything adds up. For example, we have recently moved from HTTP to HTTPS; that's also a minor thing, but it adds up with all the other improvements we are working on.
As for your suggestion:
"I would consider is replacing the example music videos from your specific instruments pages to your home page so visitors know what kind of quality they are getting if they subscribe."
I don't exactly understand what you mean: are you talking about the Music Expert videos we produce ourselves, or the YouTube videos submitted by users on our product pages?
Thank you again
-
Combining images with text as anchor text
Hello everyone,
I am working to create sub-category pages on our website virtualsheetmusic.com, and I'd like to hear your thoughts on using a combination of images and text as anchor text in order to maximize keyword relevancy.
Here is an example (I'll keep it simple):
Let's take our violin sheet music main category page located at /violin/, which includes the following sub-categories:
-
Christmas
-
Classical
-
Traditional
So, the idea is to list the above sub-categories as links on the main violin sheet music page, and if we had to use simple text links, that would be something like:
Christmas
Classical
Traditional
Now, since what we really would like to target are keywords like:
"christmas violin sheet music"
"classical violin sheet music"
"traditional violin sheet music"
I would be tempted to make the above links as follows:
Christmas violin sheet music
Classical violin sheet music
Traditional violin sheet music
But I am sure that would be too overwhelming for the users, even if the best CSS design were applied to it. So, my idea would be to combine images with text, in a way that puts those long-tail keywords inside the image ALT attribute, so as to have links like these:
Christmas
Classical
Traditional
That would make the UI much easier to work with, and at the same time keep relevancy for each link. I have seen some of our competitors doing that, and they have top-notch results on the SEs.
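For illustration, one way to mark up such an image-plus-text link might be the sketch below (the URLs and file names are hypothetical, not taken from the actual site):

```html
<!-- Sub-category link: the visible label stays short ("Christmas"),
     while the ALT text carries the long-tail keyword. -->
<a href="/violin/christmas/">
  <img src="/images/christmas-violin.jpg"
       alt="Christmas violin sheet music">
  Christmas
</a>
```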
My questions are:
1. Do you see any negative effects of this kind of link from an SEO standpoint?
2. Would you suggest any better way to accomplish what I am trying to do?
I am eager to know your thoughts about this. Thank you in advance to anyone!
-
-
RE: What's the best way to noindex pages but still keep backlink equity?
Thank you Chris for your in-depth answer; you just confirmed what I suspected.
To clarify, though: what I am trying to save here by noindexing those subsequent pages is "indexing budget," not "crawl budget." You know the famous "indexing cap"? I am also trying to tackle possible "duplicate" or "thin" content issues with such "similar but different" pages... The fact is, our website has been hit by Panda several times. We recovered several times as well, but we were hit again by the latest quality update last June, and we are trying to find a way out of it once and for all. Hence my attempt to reduce the number of similar indexed pages as much as we can.
I have just opened a discussion on this "Panda-non-sense" issue, and I'd like to know your opinion about it:
https://moz.com/community/q/panda-rankings-and-other-non-sense-issues
Thank you again.
-
RE: What's the best way to noindex pages but still keep backlink equity?
You are right, it is hard to give advice without the specific context.
Well, here is the problem I am facing: we have an e-commerce website, and each category has several hundred if not thousands of pages... Now, I want just the first page of each category to appear in the index, in order not to waste the index cap and to avoid possible duplicate issues. Therefore I want to noindex all subsequent pages and index just the first page (which is also the richest).
Here is an example from our website, our piano sheet music category page:
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html
I want that first page to be in the index, but not the subsequent ones:
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=3
etc...
After playing with canonicals and rel="next", I have realized that Google still keeps those useless pages in the index, whereas removing them could help with both index cap issues and possible Panda penalties (too many similar, not very useful pages). But is there any way to keep the possible link equity of those subsequent pages while noindexing them? Or is that link equity preserved on those pages, and on the overall domain, anyway? And, better yet, is there a way to move all that possible link equity to the first page in some way?
I hope this makes sense. Thank you for your help!
-
What's the best way to noindex pages but still keep backlink equity?
Hello everyone,
Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep the backlink equity from those noindexed pages?
For example, let's say I have many pages that look similar to a "main" page which I solely want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but, what if I also want to transfer any possible link equity present on the noindexed pages to the main page?
The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
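As a side note on the mechanics: the combination being discussed would sit in the `<head>` of each noindexed page, something like the sketch below (the URL is a placeholder). Be aware that Google representatives have cautioned that pairing noindex with a cross-page canonical sends mixed signals, so this is a sketch of the idea rather than a recommendation:

```html
<!-- On a page to keep out of the index, while pointing
     equity signals at the main page (example URL): -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://www.example.com/main-page/">
```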
-
RE: Can subdomains avoid spam penalizations?
Sure, I understand, that makes sense. Thank you for your help!
-
RE: Can subdomains avoid spam penalizations?
Sorry guys, I wasn't clear enough with my first question above; it was actually too generic.
To cut to the chase, I am talking about our main website:
www.virtualsheetmusic.com (IP 66.29.153.48)
and our affiliate website which is:
affiliates.virtualsheetmusic.com (IP 66.29.153.50)
They have two different IPs, but they are on the same server and the same network, and of course they are on the same IP block.
And I'd like to know to what extent the activity/status of one site can affect the other. From what you are saying, I guess they could affect each other to some extent. I mean, Google could understand that they are part of the same "network" and then associate them anyway... right?
-
RE: Can subdomains avoid spam penalizations?
Thank you Vijay for your extensive answer, but as I wrote above, each sub-domain has its own separate IP address. So, if each sub-domain has its own IP address, are they treated as two completely different websites?
Best posts made by fablau
-
Is it better "nofollow" or "follow" links to external social pages?
Hello,
I have four outbound links on my site's home page taking users to join us on our social network pages (Twitter, FB, YT and Google+).
If you look at my site's home page, you can find those 4 links as 4 large buttons in the right column of the page:
http://www.virtualsheetmusic.com/
Here is my question: do you think it is better to add the rel="nofollow" attribute to those 4 links, or to allow Google to follow them? From a PageRank perspective, I am sure it would be better to apply nofollow. But I would like Google to understand that we have a presence on those 4 social channels, and to clearly make a correlation between our official website and our official social channels (and therefore understand that our social channels are legitimate and related to us). I am afraid the nofollow attribute could prevent that. What's the best move in this case? What do you suggest?
Maybe nofollow is irrelevant to whether Google can correlate our website with our legitimate social channels, but I am not sure about that.
Any suggestions are very welcome.
Thank you in advance!
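For reference, the nofollowed version of one of those buttons would simply carry the rel attribute (the URL and image path here are hypothetical):

```html
<!-- Social button link with nofollow applied -->
<a href="https://twitter.com/examplehandle" rel="nofollow">
  <img src="/images/twitter-button.png" alt="Follow us on Twitter">
</a>
```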
-
RE: Robots.txt: how to exclude sub-directories correctly?
Yes, everything looks good. Webmaster Tools gave me the expected results with the following directives:
allow: /directory/$
disallow: /directory/*
Which allows this URL:
http://www.mysite.com/directory/
But doesn't allow the following one:
http://www.mysite.com/directory/sub-directory2/...
This Google help page also gives an answer similar to mine:
https://support.google.com/webmasters/answer/156449?hl=en
I think I am good! Thanks
-
RE: Is there a way for me to automatically download a website's sitemap.xml every month?
The way I would do it would be to write a simple PHP (or Perl) script that, every day, week or month (as you may need it), archives your sitemap.xml in a specific directory on your server, and possibly zips it. As a PHP programmer myself, I can tell you that's really simple to do. Just ask a PHP programmer; I am sure they can get it done in a couple of hours!
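As a sketch of the same idea in Python rather than PHP (the sitemap URL and archive directory below are placeholders, not the poster's actual setup):

```python
import gzip
import os
import urllib.request
from datetime import date

# Placeholder values - substitute your own sitemap URL and directory.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
ARCHIVE_DIR = "sitemap-archive"

def archive_filename(base_dir, day):
    """Date-stamped, gzipped archive path, e.g. sitemap-archive/sitemap-2016-01-15.xml.gz"""
    return os.path.join(base_dir, "sitemap-%s.xml.gz" % day.isoformat())

def archive_sitemap(url=SITEMAP_URL, base_dir=ARCHIVE_DIR):
    """Download the sitemap and store it compressed under today's dated name."""
    os.makedirs(base_dir, exist_ok=True)
    data = urllib.request.urlopen(url).read()
    path = archive_filename(base_dir, date.today())
    with gzip.open(path, "wb") as f:
        f.write(data)
    return path
```

Scheduled from cron (e.g. a daily entry running the script), this gives you a dated snapshot of the sitemap each day.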
-
RE: Is there a way for me to automatically download a website's sitemap.xml every month?
If you use a MySQL database to store your website data, I think this kind of automatic "archival" work, done via a PHP script, would take between 2 and 5 hours of work. I don't see why it should take more than that.
If someone tells you it is going to take longer, I would be suspicious: either the programmer is not good enough, or they want to cheat you. That unfortunately happens more often than you think!
Be sure to ask for a step-by-step description of how they plan to complete the job. If you have doubts, please feel free to ask me; I am a fairly experienced PHP programmer. I don't work for others, just for myself (I built, and keep tweaking, my own websites virtualsheetmusic.com, musicianspage.com and others with very little help from external programmers).
Good luck!
-
Number of images on Google?
Hello here,
In the past I was able to find out pretty easily how many images from my website were indexed by Google and included in the Google Image Search index. But as of today, it looks like Google no longer gives you any numbers; it just lists the indexed images.
I use the advanced image search, entering my domain name in the "site or domain" field:
http://www.google.com/advanced_image_search
and then Google returns all the images coming from my website.
Is there any way to know the actual number of images indexed? Any ideas are very welcome!
Thank you in advance.
-
Robots.txt: how to exclude sub-directories correctly?
Hello here,
I am trying to figure out the correct way to tell search engines to crawl this:
http://www.mysite.com/directory/
But not this:
http://www.mysite.com/directory/sub-directory/
or this:
http://www.mysite.com/directory/sub-directory2/sub-directory/...
But given that I have thousands of sub-directories with almost infinite combinations, I can't define all of this in a manageable way:
disallow: /directory/sub-directory/
disallow: /directory/sub-directory2/
disallow: /directory/sub-directory/sub-directory/
disallow: /directory/sub-directory2/subdirectory/
etc...
I would end up having thousands of definitions to disallow all the possible sub-directory combinations.
So, is the following a correct, better and shorter way to define what I want above:
allow: /directory/$
disallow: /directory/*
Would the above work?
Any thoughts are very welcome! Thank you in advance.
Best,
Fab.
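For reference, inside an actual robots.txt file the directives are capitalized and grouped under a User-agent line, so the idea above would look like:

```
User-agent: *
Allow: /directory/$
Disallow: /directory/*
```

Note that the `*` and `$` wildcards are extensions honored by Google and Bing, but they are not guaranteed to work for every crawler.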
-
RE: Robots.txt: how to exclude sub-directories correctly?
Thank you Michael. It is my understanding, then, that my idea of doing this:
allow: /directory/$
disallow: /directory/*
should work just fine. I will test it within Google Webmaster Tools and let you know if any problems arise.
In the meantime, if anyone else has more ideas about all this and can confirm, that would be great!
Thank you again.
Fabrizio Ferrari is the founder of Virtual Sheet Music Inc, the leading company for classical sheet music downloads on the web since 1999.