How about a temporary un-capping of the limit? You are making your product harder to use, and it's affecting business. Fine if you want to charge more, but can we have it the way it was until you make (and announce) a change?
Posts made by 540SEO
-
RE: KW Difficulty Daily Limit
I think a 400-per-day limit is unreasonable, especially given the cost of a Moz subscription. When doing KW research I usually have 10 or so categories of KWs, each with 50-75 terms. I don't use the tool daily, but when I do, getting hung up by a term limit puts my deadlines at risk.
Why was there no notification of this change to your paying users?
-
Client wants to distribute web content to dealers - iFrame?
I have a client who sells a product through a network of nationwide dealers. He wants to provide update-able content to these dealers so they can create sections on their websites dedicated to the product. For ex., www.dealer.com/product_XYZ. The client is thinking he'd like to provide an iframe solution to the dealers, so he can independently update the content that appears on their sites.
I know iframes are old, but are there any SEO concerns I should know about? Another option is to distribute the content as HTML with a rel=canonical tag included in the code, but then he loses the ability to centrally update all the distributed content.
Are there other solutions he should consider?
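For illustration, here's roughly what the two options would look like on a dealer page (the client-site URLs below are made-up placeholders):

```html
<!-- Option 1: iframe embed on www.dealer.com/product_XYZ -->
<!-- The content lives on the client's server, so he can update it centrally;
     the trade-off is that search engines generally credit framed content to
     the client's domain rather than the dealer's page. -->
<iframe src="http://www.client-example.com/dealer-feed/product_XYZ"
        width="800" height="1200" frameborder="0"></iframe>

<!-- Option 2: distributed static HTML, with a rel=canonical in the dealer
     page's <head> pointing back to the client's master copy. This handles
     the duplicate content, but any update means re-sending HTML to every dealer. -->
<link rel="canonical" href="http://www.client-example.com/product_XYZ">
```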
Thanks --
-
RE: PDF on financial site that duplicates ~50% of site content
Thanks. Anybody want to weigh in on where to rel=canonical to? Home page?
-
RE: PDF on financial site that duplicates ~50% of site content
I thought the idea was to put rel=canonical on the duplicated page, to signal that "hey, this page may look like duplicate content, but please refer to this canonical URL"?
Looks like there is a PDF option for rel=canonical; I guess the question is, what page on the site to make canonical?
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
Indicate the canonical version of a URL by responding with the Link rel="canonical" HTTP header. Adding rel="canonical" to the head section of a page is useful for HTML content, but it can't be used for PDFs and other file types indexed by Google Web Search. In these cases you can indicate a canonical URL by responding with the Link rel="canonical" HTTP header, like this (note that to use this option, you'll need to be able to configure your server):

Link: <http://www.example.com/downloads/white-paper.pdf>; rel="canonical"
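On an Apache server, for example, that header could be attached to a specific PDF from .htaccess with something like this (a sketch only; assumes mod_headers is enabled, and the filenames are placeholders -- here the PDF points at the HTML page it duplicates):

```apache
<Files "white-paper.pdf">
  # Declare the HTML version as canonical for this PDF
  Header add Link "<http://www.example.com/downloads/white-paper.html>; rel=\"canonical\""
</Files>
```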
-
RE: PDF on financial site that duplicates ~50% of site content
Not sure which page I would mark as being canonical, since the pdf contains content from several different pages on the site. I don't think it's possible to assign different rel=canonical tags to separate portions of a pdf, is it?
-
PDF on financial site that duplicates ~50% of site content
I have a financial advisor client who has a downloadable PDF on his site that contains about 9 pages of good info. The problem is that much of the content can also be found on individual pages of his site.
Is it best to noindex/follow the pdf? It would be great to let the few pages of original content be crawlable, but I'm concerned about the duplicate content aspect.
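If the noindex route wins out, note that a PDF can't carry a meta robots tag, so the directive has to be sent as an X-Robots-Tag HTTP header. A sketch for Apache (assumes mod_headers is enabled; the filename is hypothetical):

```apache
<Files "advisor-guide.pdf">
  # Keep the PDF out of the index; links in it are still followed by default
  Header set X-Robots-Tag "noindex"
</Files>
```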
Thanks --
-
RE: Local business with multiple sites
Related question: if location #2 is brought under site 1 as a location page, what's best practice as far as putting the address in the footer sitewide? Put location 1 address in the footer everywhere but the location 2 page(s)? Avoid altogether?
Thanks --
-
RE: Local business with multiple sites
Hi -- thanks for your help. Here is more info. in response to your answer:
-I have picked up duplicate content problems and will be working with the client to fix this
-The locations are in the same metro area (good to know that separate states can be a good reason to keep separate sites)
-The lost rankings/Places shake-up is a bit concerning. Site 1 is well-established and has domain authority of 38 and home page authority of 48 (this is the site I'd likely move everything to). Location 2's site is 2.5 years old but has 23 domain authority and 36 page authority. Site 3 is an online store for spa products and very new (not yet launched).
For queries that trigger a Places result, location #1 outranks location #2 in every instance I can find. Having location #2 disappear for a while wouldn't be great, but from what I see the location #1 site ranks really well Organically (and there is a prominent link to location #2 on the home page) so we may be OK.
Also, there are a few queries where local results are not triggered, and the location #2's site ranks high. I'm not worried about the Organic ranking scenario in this case because a 301 redirect should largely preserve the position, correct?
In any case, I think the potential upside of consolidated ranking power outweighs the costs. I'll keep my fingers crossed that the Places shake-up will be short-lived and advise the client accordingly.
Thanks -- let me know if I missed anything.
-
RE: Local business with multiple sites
Hi -- there are no subdomains in the equation here. I would be moving to subfolder pages (not subdomains), and the current domains are separate domain names altogether.
Thanks --
-
RE: Youtube dofollow link to web site
YouTube links in the video and Channel to your site are all nofollow. I'm not sure if they were ever dofollow, but regardless, it's not a bad link to get to round out a natural looking link profile (plus the links can generate traffic). If your video has a good chance of getting shared outside of YouTube, make sure to put your domain name and/or brand in the video itself so you get attribution and traffic.
-
Local business with multiple sites
I'm auditing a local business' sites (a spa) and I wanted to run my recommendations by everyone.
There are 3 sites:
www.sitename1.com -- main store location, used for Google Places listing #1
www.sitename2.com -- 2nd store location, used for Google Places listing #2
www.sitename3.com -- used for product sales for both locations
Sitename1.com has the most ranking power. I'm going to recommend that they move sitename2.com and sitename3.com to sitename1.com as subfolders, 301 redirecting each page to the corresponding page on sitename1.com/subfolder.
Google Places listing #2 would be changed from www.sitename2.com to www.sitename1.com/location2.
Any risks or problems with this strategy anyone can see?
-
RE: Video thumbnail pages with "sort" feature -- tons of duplicate content?
Thanks for the link. That does help with the paginated pages issue.
Anyone have any thoughts on the sort feature and how that will be viewed by the search engines? Are different "sort" results containing some of the same video thumbs/descriptions considered duplicate content? What's the best way to handle that?
-
Video thumbnail pages with "sort" feature -- tons of duplicate content?
A client has 2 separate pages for video thumbnails. One page is "popular videos" (/videos?sort_by=popularity), with a sort function and over 700 pages of video thumbnails -- 10 thumbnails and short descriptions per page.
The second page is "latest videos" (/videos?sort_by=latest) with over 7,000 pages.
Both pages have a sort function -- including latest, relevance, popularity, time uploaded, etc. Many of the same video thumbnails appear on both pages.
Also, when you click a thumbnail you get a full video page and these pages appear to get indexed well.
There seem to be duplicate content issues between the "popular" and "latest" pages, as well as within the sort results on each of those pages. (A unique URL is generated every time you use the sort function, i.e. /videos?sort_by=latest&uploaded=this_week.)
Before my head explodes, what is the best way to treat this? I was thinking a noindex,follow meta robot on every page of thumbnails since the individual video pages are well indexed, but that seems extreme. Thoughts?
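If it helps frame the discussion, the noindex,follow option is just a meta tag on each sorted/paginated listing URL, e.g.:

```html
<!-- On /videos?sort_by=... listing pages: stay out of the index, but let
     crawlers follow the links through to the individual video pages. -->
<meta name="robots" content="noindex, follow">
```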
-
RE: Press Release for INFOGRAPHICS
I've done a few IG's, and an effective strategy is to develop a hit list of blogs and websites that are relevant to your topic, then contact each one with a brief explanation of the infographic and why it's relevant to their audience. Track responses in a spreadsheet and follow up with folks you haven't heard from 5-7 days later. Keep emails brief, friendly and professional, and craft a compelling headline that isn't spammy.
The most important part of the IG is the embed code. Provide it in the emails you send out so people can easily grab it and post it to their site/blog. Also, create a page on your site that includes the embed code (make sure you post the IG on your site first so Google credits you with the content). When people post using the embed code, the result should be your IG, an anchor text link back to your site, and a link for "Place this graphic on your site". Also, don't forget to register the work on Creative Commons.
Here's a recent example of how it works: http://540seo.com/are-online-reviews-killing-your-business
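For anyone curious, an embed code along those lines is just a snippet like this (the image path below is a made-up placeholder):

```html
<a href="http://540seo.com/are-online-reviews-killing-your-business">
  <img src="http://540seo.com/images/online-reviews-infographic.png"
       alt="Are Online Reviews Killing Your Business? [infographic]" width="600">
</a>
<br>
Infographic by <a href="http://540seo.com">540 SEO</a> |
<a href="http://540seo.com/are-online-reviews-killing-your-business">Place this graphic on your site</a>
```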
-
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right -- that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt command?

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /
-
RE: Best way to address duplicate news sections within site
Good catch on the subdomains! That is a separate issue, and I am recommending they move everything to a clientsite.com/folder setup. The sub-domains do have unique content (except for the news) and they set it up that way because they've seen other sites, like Google, set up sub-domains for maps and their other products.
What's a good explanation to the client for why other large sites like Google set up different content sections as subdomains vs. the folder approach I am recommending?
-
Best way to address duplicate news sections within site
A client has a news section at www.clientsite.com/news and also at subdomain.clientsite.com/news. The stories within each section are identical:
www.clientsite.com/news/story-11-5-2011
subdomain.clientsite.com/news/story-11-5-2011
What's the best way to avoid a duplicate content issue within the site? A 301 redirect doesn't seem appropriate from the user experience point of view.
Is applying a rel=canonical tag pointing to www.clientsite.com/news/story-a-b-c to each story within the subdomain news section the best option? They have 100's of stories -- wondering if there might be an easier way?
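Concretely, each story page on the subdomain would carry a tag like this in its head (using the example URLs above):

```html
<!-- On subdomain.clientsite.com/news/story-11-5-2011 -->
<link rel="canonical" href="http://www.clientsite.com/news/story-11-5-2011">
```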
Also, the news pages list the story headline and the first 3 lines of copy. Do these summaries present duplicate content issues with the full story page?
Thank you!
-
See any issues with this tabbed content page?
When I view source and view as Googlebot, it shows as 1 long page of content = good. However, the developer uses some redirects and dynamic page generation to pull this off. I didn't see any issues from a Search perspective, but would appreciate a second opinion:
Thanks!
-
City targeting on home page
Client has a site that ranks well for "Town_A_KW", "Town_B_KW" and "Town_C_KW". The home page is the page that's ranking. These towns are part of the larger metro area for Portland. They want to start ranking for "Portland_KW" and normally, I'd recommend optimizing the home page for this phrase, and better optimizing the sub-pages for town A, B and C KW's.
The client is understandably nervous about messing with re-targeting the home page since it already ranks well. Is it best to:

1. Add "Portland_KW" to the home page meta titles, content, etc. to try to rank for that phrase? (So the home page would be optimized for Town A, B and C KWs + Portland_KW.)
2. Re-target the home page for "Portland_KW" only, and better optimize sub-pages for towns A, B and C?
3. Leave the home page as is, and create a "Portland KW" sub-page? (Client's original idea.)

Thanks in advance for your insights!
-
RE: Blog page outranks static page for KW -- why?
No worries. If the blog page had gone viral I think that could be an explanation, but this isn't close to approaching that.
-
RE: Blog page outranks static page for KW -- why?
FYI -- checked with a source in another city, and he also saw the blog result outranking the static page. It's an odd one. If anyone has any ideas, let's hear them!
-
RE: Blog page outranks static page for KW -- why?
Weird. I signed out of Google, appended &pws=0, and I still get the blog page ranking #12 and the static page #61. The strange thing is the client is seeing the same dynamic.
Just to be double sure we are on the same page, blog is the one ending with "mid-year-update/".
-
RE: Blog page outranks static page for KW -- why?
Just PM'd you, thanks. They don't have much social activity in general. The static page scores well for the target KW using the Term Target tool, the blog page does not -- although both pages have authoritative and market-appropriate content.
-
RE: Blog page outranks static page for KW -- why?
I ran a crawl test and there's no 302 -- the site does appear to be down right now though. I used Rank Tracker for the rankings. Did you get wildly different rankings?
-
RE: Blog page outranks static page for KW -- why?
Just PM'd you the URLs -- thanks for taking a look. I'm happy to send the URLs to anyone else who wants to investigate. Thank you!
-
RE: Blog page outranks static page for KW -- why?
Static page crawled 7/15, blog page crawled 8/1. It's a client, so not sure when the last time the static page was modified. Looks like the blog page was created around June.
-
Blog page outranks static page for KW -- why?
Blog page ranks 10 in Google, while the static page is on page 7. What makes it more interesting is that the blog page scores an "F" with the Term Target tool while the static page scores an "A".
Static page has more inbound links and mR/mT of 3.89/4.54, vs. 3.71/4.14 for the blog page.
Any ideas on how to approach this one?
-
Strange backlinks
I'm helping out a friend with his legal blog, and in checking OSE I found scores of backlinks that appear paid, but he says they've done no link buying at all. Here are a couple of examples:
http://solidinspiration.com/odhkj/degry.php?b=317022
http://retservilo.com/wzkuz/koex.php?u=391050
They don't look like typical paid link sites, given the number of links on the page and the fact that they're not all hyperlinked anchor text. It almost looks like hacked sites. Stranger still, the links point to the non-www version of the site, and they have non-www redirected to www.
Anyone seen this before?
Thanks --