Featured citations vs. regular citations
-
Do regular citations have the same impact as featured citations for Google Local? Or is it that the higher a company is listed, the more impact it will have on a Google Local page?
-
Great! My pleasure, Donnie.
-
Yes, this helps. Thanks again.
-
Hi Donnie,
Okay, I think I get it. Sorry to be slow on the uptake. I believe you saw this page on Yext:
http://www.yext.com/products.html
Yext's PowerListing product is described like this:
PowerListings
Business listing management made simple. Easily add your business
listings to premier local search sites like Yahoo!, Yelp and WhitePages.

It then goes on to list 9 directories for which they are charging you a certain amount and 30+ directories for which there is no price listed. I found the setup of this page to be very vague, so in the spirit of offering you a really clear answer, I actually phoned Yext just now and spoke with John, a Senior Account Manager.
As I understand it, you've got 2 options with Yext. You can purchase the $499 deal, which will apparently give you enhanced listings in a variety of directories. He described these enhanced listings as ones you could purchase yourself from the listed companies, but the benefit is central management of all of them, plus a quoted 50% cost savings compared with what you'd be charged if you purchased the enhanced listings directly.
Alternatively, you can purchase the Emerging Package from Yext, which costs much less and consists of listings in all the non-priced directories on the page I've linked to. You can also choose a la carte enhanced listings from the priced directories, but you'll be charged the retail price for those. Again, the benefit here is central management.
Now, here's something nice. John asked me to let you know that if you would like to talk to him directly, call him at (917) 210-6709 and he will not only be happy to answer your questions, but he will also give you a discount if you decide to sign up. How about that?
But, to return to your basic question, Donnie: will having an enhanced listing in a third-party directory positively influence your Google Local rank more than a free listing? You know, I would be very surprised if that were the case. I have seen no studies to indicate it, though I can't answer with 100% certainty. So, my reasons for purchasing Yext's service would be that managing the listings would be easier, and the quoted 50% discount off the list price for enhanced listings would intrigue me if my clients wanted enhanced advertising. My clients are, for the most part, too strictly budgeted to warrant this, so that's a factor.
Hope this helps!
Miriam
-
I don't think being placed higher in third-party directories via paid listings will boost your ranking on Places. What matters is that your NAP (name, address, phone) is consistent and crawlable across directories; that's the best practice. A free listing is just as helpful to your local efforts as a premium listing.
However, paying for a listing on a local chamber of commerce site or an industry-related directory might have more impact. Finding and obtaining listings from directories that are local and relevant to your business is key to having the most impact on your Google Local page.
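If you want a quick way to audit that consistency, here is a minimal, hypothetical Python sketch; the business details and directory URLs are placeholders, not real listings. It simply fetches each citation page and spot-checks whether your name, address, and phone appear, after normalizing formatting so things like "(555) 123-4567" vs. "555-123-4567" don't trigger false mismatches:

```python
# Hypothetical NAP consistency spot-check. Business details and URLs below
# are placeholders; swap in your own citations before running.
import re
import requests

BUSINESS = {
    "name": "Example Plumbing Co",
    "address": "123 Main St, Springfield",
    "phone": "555-123-4567",
}

CITATION_URLS = [
    "https://www.example-directory.com/springfield/example-plumbing-co",
    "https://www.another-directory.com/biz/example-plumbing-co",
]

def normalize(text: str) -> str:
    # Lowercase and strip punctuation/whitespace so minor formatting
    # differences (St. vs St, phone punctuation) don't count as mismatches.
    return re.sub(r"[^a-z0-9]+", "", text.lower())

def check_page(url: str) -> dict:
    # Fetch the citation page and report which NAP fields were found in it.
    html = normalize(requests.get(url, timeout=10).text)
    return {field: normalize(value) in html for field, value in BUSINESS.items()}

if __name__ == "__main__":
    for url in CITATION_URLS:
        results = check_page(url)
        missing = [field for field, found in results.items() if not found]
        status = "OK" if not missing else "MISSING: " + ", ".join(missing)
        print(f"{url} -> {status}")
```

It's only a rough sanity check (some directories render listings with JavaScript or block scrapers), but it illustrates the point: the goal is the same name, address, and phone everywhere, whether the listing is free or premium.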
-
Yes, we're having a good talk! I was under the impression that when I signed up for Yext they would automatically get me featured listings in all those directories... Why would anyone pay $450 for a couple hours of work? Crazy... Anyway, if I did manually submit everywhere and decided to get featured in a few directories, would being listed higher on those pages help with my Google Local efforts?
Again, thank you very much. You are awesome!
-
Hey Donnie,
We're having a good chat today, aren't we? So, now, I'm still trying to understand your intent here. You can't get free listings via Yext; you have to pay for them, as Yext's service is a paid service. I just know I'm not catching your meaning here. Do you have an example to share of an offer you've seen, or something like that?
Maybe what you mean is...is there any ranking benefit to paying Yext to get your citations vs. you getting them for yourself, manually. If so, my answer would be: not that I've ever heard of. On an agency level, tools like Yext's are time savers and thus valued, but I still believe that manual citation development is the best way to have the most control over what you are doing.
-
Hi again Miriam
I was referring to a regular free listing vs. a premium listing. If I were to use Yext for all my local citations I would get premium listings... Would these listings have a higher impact on my Google Local efforts?
-
Hi Derek,
Possibly, that is what Donnie means. Thanks for chiming in and we'll have to see what he says when he returns. I'm actually wondering if he means citations from certain sources over others, or something like that. We'll see.
-
I was thinking the "featured" listings meant the paid listings on directories like Yelp and YP.com
-
Hi Donnie,
Can you please define the term 'featured citation'? This is not one I've ever come across. Please describe what you mean by regular and featured citations and I'll be happy to share anything I know.