After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Job Title: Online Marketing
Company: In-house
Favorite Thing about SEO: Practical Creativity
It is hard for me to tell, as your template may vary. That does appear to be the correct spot; try it out and see what happens.
Make sure you back up your WordPress database and install before you go changing this stuff.
In the WordPress editor, open archives.php, find the following code block, and change it as needed:
<code><?php $term->Title(); ?></code>
Click on "Export to csv" in the top right hand corner of your crawl report.
When the file downloads, there is a field called "referrer" which will tell you the exact page the broken link was found on.
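If you want to work through that export programmatically rather than by eye, a minimal sketch is below. The column names ("url", "status code", "referrer") are assumptions for illustration; check the headers in your actual crawl export before using it.

```python
import csv

def broken_link_sources(csv_path):
    """Map each broken (404) URL to the pages that link to it,
    using the "referrer" column of a crawl-report CSV export.

    Column names are assumptions -- adjust to match your export's headers.
    """
    sources = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("status code", "").strip() == "404":
                sources.setdefault(row["url"], []).append(row["referrer"])
    return sources
```

Each broken URL then comes back with the full list of pages it was found on, which is handy when the same dead link appears in a shared template.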
I think it all depends on your needs and capacity. If you're asking the question "Are SEO and SEM different enough to merit two different positions?", then the answer is a resounding "Yes". Unfortunately this isn't possible for a lot of businesses so they have to look for someone a little more well rounded. If you can afford it and you have a decent sized business/funding, then yes, you should hire them as separate people.
In our case we are able to have both, and so we do. Both of them are knowledgeable in both areas, but each prefers his own job. It's nice to have two people to bounce ideas and responsibilities off each other; if they are the right people, they will work well together and do much more for your business than one person alone.
How we help users and webmasters with duplicate content
We've designed algorithms to help prevent duplicate content from negatively affecting webmasters and the user experience.
1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.
2. We select what we think is the "best" URL to represent the cluster in search results.
3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.
Consolidating properties from duplicates into one representative URL often provides users with more accurate search results.
If you find you have duplicate content as mentioned above, can you help search engines understand your site?
First, no worries, there are many sites on the web that utilize URL parameters and for valid reasons. But yes, you can help reduce potential problems for search engines by:
1. Removing unnecessary URL parameters -- keep the URL as clean as possible.
2. Submitting a Sitemap with the canonical (i.e. representative) version of each URL. While we can't guarantee that our algorithms will display the Sitemap's URL in search results, it's helpful to indicate the canonical preference.
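The "keep the URL as clean as possible" advice above can be sketched in code: normalize each URL by dropping parameters that don't change the page content, so duplicate variants collapse to one canonical key. The parameter names below are assumptions for illustration; keep any parameters that actually alter what the page displays.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed NOT to change page content -- adjust for your site.
IGNORED_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign", "sort"}

def canonicalize(url):
    """Return a cleaned URL: tracking-style parameters removed,
    remaining parameters sorted, fragment dropped."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(sorted(kept)), ""))
```

Grouping crawled URLs by their canonicalized form approximates the clustering step described above, and the cleaned form is also a reasonable candidate for the Sitemap.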
Hi Jorge,
If I am not mistaken, /product-page would receive link juice from the links with ?afl=XXXXXX appended. No need to worry.
Hi Joshua,
In my opinion the Read More button is not a good idea, but not because of an SEO issue. I don't think it's hurting you, because Googlebot can tell the difference between a div hidden for spam and a div hidden for valid reasons. See this quote:
The goal of our guidelines against hidden text and cloaking are to ensure that a user gets the same information as the Googlebot. However, our definition of webspam is dependent on the webmaster's intent. For example, common sense tells us that not all hidden text means webspam--e.g. hidden DIV tags for drop-down menus are probably not webspam, whereas hidden DIVs stuffed full of unrelated keywords are more likely to indicate webspam.
That being said, I do not think the "Read More" button is a good idea in your particular case, for the following reasons:
If I were in your position I would nix the Read More button, go with a two-column layout, and put the call to action in a very obvious spot in a color that stands out. Maybe a big button with text that says "Learn to surf for only $35" or something.
I hope this helped, and sorry if my advice was not what you were expecting!
Ah, I understand now. SEOmoz does not have anything like this in its arsenal. There are a number of tools you can use to do this; the two I'm familiar with:
Of the two I found Raven Tools much better, but Authority has some features Raven does not, so it is a matter of preference. Best of luck!
Could you provide some clarification on your question? In my mind, tracking the positions of multiple keywords over time is the best way to evaluate the effectiveness of your SEO efforts, along with:
Follow this up with:
Obviously, if the client wants to get further into specifics, you should be knowledgeable and prepared to do so.
Sam Crocker wrote a great YouMoz post on this just last week: Improving Reporting Efficiency and Relevance
This is my opinion and is not backed up by any concrete evidence.
Given the choice, I would opt for the cross-domain rel=canonical. Matt Cutts has said that Google prioritizes the original page in search results (the link references rel=canonical within a domain, not cross-domain), and based on today's Whiteboard Friday and this video from Matt Cutts, I think rel=canonical is the way things are moving, particularly for content syndication.
Edit: It also just occurred to me that there is no reason you can't ask for both. Rel=canonical helps Googlebot determine who the original content creator is, but it offers nothing for the user. It takes little more than the flick of a pen to require your syndication partners to include both rel=canonical and a link back.
Edit #2: Regarding your question about the difference between the Google source attribution tag and cross-domain rel=canonical:
Update 2/11/11:
We've had a lot of interest in these meta tags, particularly in how the syndication-source tag relates to rel=canonical. After evaluating this feedback, we’ve updated our system to use rel=canonical instead of syndication-source, if both are specified.
If you know the full URL, rel=canonical is preferred, and you need not specify syndication-source.
If you know a partial URL, or just the domain name, continue using syndication-source.
We've also had people ask "why metatag instead of linktag"? We actually support both forms for the tag, and you can use either. However, we believe the linktag form is more in line with the spirit of the standard, and encourage new users to implement the linktag form rather than the metatag form we originally proposed.
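For reference, the forms discussed in that update look roughly like this in a syndicated copy's head section. This is a sketch only; the example.com URLs are placeholders standing in for the original article's address.

```html
<!-- rel=canonical: preferred when the full URL of the original is known -->
<link rel="canonical" href="http://example.com/original-article" />

<!-- syndication-source, link-tag form (the form Google encourages) -->
<link rel="syndication-source" href="http://example.com/original-article" />

<!-- syndication-source, meta-tag form (the originally proposed form),
     usable when only a partial URL or the domain is known -->
<meta name="syndication-source" content="http://example.com/" />
```

If both rel=canonical and syndication-source are specified, rel=canonical wins, per the update quoted above.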
Click on "Export to csv" in the top right hand corner of your crawl report.
When the file downloads, there is a field called "referrer" which will tell you the exact page the broken link was found on.
6/10/2009 For a long time now I have wondered what sets apart great search marketing articles (namely, YOUmoz posts) from not so great ones. Unfortunately, there is no short answer to this question. Extensive discourse both online and off makes search marketing a tough subject to get a definite answer on. Fortunately, the YOUmoz thumbs/comment system provides a great place for us to start, and by limiting ourselves to tangible factors, we can draw safe conclusions from its data. I will follow up with another post examining this information from a content-based viewpoint, but for today our discussion points will be strictly visual.
4/17/2008 The following is a gross exaggeration of my reaction to an incident I experienced recently; the form of writing is intended to poke fun at my ridiculous reaction to it. So I'm doing some routine checking in on my inbound link numbers for our site, open up Google Webmaster Tools, and I see this: ...
3/6/2008 Sometime in the last week, it seems that Google has lowered the standards for Sitelinks. I've seen them for the (only) 2 sites I manage, and Ann Smarty (seosmarty.com) has seen sitelinks for a client with whom she has not worked in 3 months. I was talking to someone here at SMX West (name withheld for privacy reasons) and he said th...
2/15/2008 In preparation for a new project I'm starting, I've been reading mounds of information about web design and aesthetics. You all know the drill: blog after blog, sifting through the duplicate content, and article after article separating the useful information from corporate fluff. It was in this that I realized my big mistake, reading information for the sake of reading infor...
12/8/2007 With Rand and the crew AFK at SMX West </jealousy>...I figured it's freed up some time in the spotlight for a novice like myself. For those wondering, I'm not an SEO expert, or a CEO, or anything like that; I'm an IT Manager for a medium-sized company. I've learned and applied concepts from my membership here at SEOmoz and a few other locations, and it is...