Is Google Making Life Harder For Aggregators?
There have been a bunch of updates recently which have hurt aggregators:
-
Reducing the number of search results to 7 for branded search queries.
-
The DMCA update, which penalises those with trademark-related takedown requests against them.
-
At least two 'domain diversity' updates, the most recent last week, which seek to reduce the ability of sites to dominate the SERPs, e.g. a site which may have had two search results on page 1 may now have only one.
Plus, it's commonly believed that Google favours big brands over smaller ones, e.g. Marriott over examplehotelaggregator.com.
Is this a deliberate ploy against aggregators in favour of brands, i.e. does Google believe a brand site is a better search result than an aggregator?
A brand site returned above an aggregator for a branded term may be seen by Google as a better fit, a better search result that deserves to rank higher. But is that true? Consumers like to see unbiased reviews and the lowest prices, and those aren't always available on the brand site.
Thoughts please.
-
If Google tried to avoid aggregators and removed the second result from a domain in the SERPs, what would be in the results then?
-
For something along those lines, Google mixes in shopping results to try to bring the best results. But for a hotel-type search, I would much rather get the hotel itself.
-
- Reducing the number of search results to 7 for branded search queries
If a searcher types in a branded query, then there is a higher probability that he is looking for the brand and not Joe Aggregator.
- The DMCA update, which penalises those with trademark-related takedown requests against them.
This is good. Yes!!
- At least two 'domain diversity' updates, the most recent last week, which seek to reduce the ability of sites to dominate the SERPs, e.g. a site which may have had two search results on page 1 may now have only one.
Enjoy this one; it's actually working against the big brands.
Plus, it's commonly believed that Google favours big brands over smaller ones, e.g. Marriott over examplehotelaggregator.com.
Sure... most people want something trusted. They know Marriott. If you want examplehotelaggregator.com to rank, then work on your brand.
Is this a deliberate ploy against aggregators in favour of brands
Only the aggregators are thinking that it is. The brands are not thinking that way and the average searcher is not thinking that way.
does Google believe a brand site is a better search result than an aggregator?
Yes.
-
This. The aggregator's big win could be fulfilling its role as a media publisher and community hub, serving as a nexus for honest and objective opinions. Aggregators are a form of retailer, except that the customer's purchase is an information exchange. When they abandon this opportunity and instead supply subjective info that doesn't put the consumer first, well, that is what makes the organic results such a polluted mess.
-
"When you do a search, would you rather get the official companies or a site with a news feed or aggregation run by a 3rd-party source?"
- As I say, in many cases the branded site isn't going to be your first port of call. E.g. who buys their Wilson tennis racket from Wilson.com when you can get the same product many times cheaper elsewhere (random example, I'm not picking on Wilson!) and with honest reviews, e.g. 'the strings on this racket are poor'?
-
I would say so. Google has been taking a lot of actions to bring back better-quality results. When you do a search, would you rather get the official companies or a site with a news feed or aggregation run by a 3rd-party source?
Related Questions
-
Are WordPress sites being dinged by Google? I've read a few articles regarding this.
I read a couple of "SEO"-related articles claiming that sites built in WordPress are going to be dinged by Google because Google sees WordPress sites as simple to make and as having a higher potential to be "spammy". Is there any truth to this? Your thoughts? I do give "thumbs up" and "best answer" marks and appreciate receiving thumbs up myself... Thanks
Industry News | JChronicle1
-
How do I get a description in my Google local listing?
My site is listed in the SERPs at number one, but where Google used to show my site name with the meta description below it, Google now lists my site title with my address to the right side, and below it says 'Google+ page' instead of listing my meta description, which contained my key search phrase and a call to action to see the video on my site. My click-through was much better with the meta description and call to action below the title. Is there any way I can get the description back under my title in the SERPs? Maybe by deleting my Google+ page? Thanks in advance, Ron
Industry News | Ron100
-
Anyone else know much about the Google Pirate penalty?
The Google 'Pirate' update (no official name) seems to have gone largely undiscussed since it was launched last Friday, 10th August: http://insidesearch.blogspot.co.uk/2012/08/an-update-to-our-search-algorithms.html. The idea of it is to ensure that those 'pirating' content or abusing trademarks (e.g. fake Ugg boot sites and file-sharing sites) do not appear higher in the search results than the genuine websites. Google is using DMCA takedown requests to label sites as 'pirate' and demote their rankings. I'm amazed that not even SEOmoz has covered the subject yet, as far as I can see, yet it is a hugely important new update, albeit one affecting a relatively small number of sites for now, and in some cases (at least one I know first-hand) seemingly without justification (the example I know is not a file-sharing, fake-goods, or trademark-abusive site at all). Google updating its search algorithm based on DMCA takedown requests seems a bit strong; these are takedown requests, not legal proof that a site is infringing a trademark. A real weapon for negative SEO? Anyone else had experience of the Pirate update or know much more about it? Outside Danny Sullivan, I don't see many SEO folk covering it. Here are my own insights into it and what I've learned about what (only innocently) affected sites should do to appeal: http://www.andy-maclean.net/the-google-pirate-dmca-guidance/
Industry News | AndyMacLean0
-
Google guidelines 2011
Guys, I have asked for the leaked SEO guidelines just to fine-tune my SEO campaigns, and it seems no one wanted to send them to me. Can anyone do it here, please?
Industry News | SearchOfficeSpace230
-
Google Panda 2.5 Update?
On Sunday 18th Sept I noticed a huge drop in our rankings for keywords on which we were doing extremely well. The majority of the SERP positions for our main targeted keywords were #1 and #2; these have all dropped to the bottom part of the first page. Other new keywords we were targeting had climbed very well (some hovering just below the top 10 and some in the top 10 of the Google UK SERPs); these have all completely dropped off. Although I have analysed the site thoroughly (both on-page and link profile), it doesn't appear to have any issue significant enough to cause a penalty. From Monday 20th Sept (everybody back to work), the threads here http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en&start=5760 and http://www.webmasterworld.com/google/4364389.htm seem to be buzzing over unexpected SERP drops and increases. From that, I assume Panda 2.5, or at least some form of update, has taken/is taking place? If anybody knows of the recent heavy fluctuations which seem to have started at the weekend, or has experienced unexpected position increases/drops, I would be very interested to hear/read from you. Cheers, Mo Raja
Industry News | MoRaja0
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions, or add some insight, would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement. This agreement rests on the following: the site adopts the AJAX crawling scheme, and for each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why below). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. See more in the Getting Started Guide.

Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab

These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ (step-by-step instructions)
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content

Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Industry News | webbroi
-
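The "pretty URL" to "ugly URL" transformation that Option 2 describes can be sketched roughly like this. This is a hypothetical helper, not Google's actual code, and it assumes the hashbang (#!) convention used in the AJAX crawling scheme's documentation:

```javascript
// Sketch of the AJAX crawling scheme's URL rewrite: a crawler that sees a
// "pretty URL" containing a hashbang fragment requests the corresponding
// "ugly URL", with the fragment moved into an _escaped_fragment_ query
// parameter, so the server knows to return an HTML snapshot instead.
function escapedFragmentUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl; // no hashbang: nothing to rewrite
  const base = prettyUrl.slice(0, i);
  const fragment = prettyUrl.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// e.g. http://www.example.com/index.html#!key=value
// becomes http://www.example.com/index.html?_escaped_fragment_=key%3Dvalue
```

The server would then answer requests carrying `_escaped_fragment_` with the pre-rendered HTML snapshot, while normal browser requests get the regular AJAX page.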
Google APIs
As you may know, Google has APIs: http://code.google.com/more/. I can see ones for Blogs, News, etc., but not for general search; am I being dense? If someone can point me in the right direction, that would be great. Justin
Industry News | GrouchyKids0
-
How to achieve the highest global and local relevance in Google?
Let's say I have a company that has its main business in Europe for the following languages: English, German, Portuguese, French, and Italian. And let's say some other markets (e.g. the Portuguese one in South America) are also important. The question now is how we should structure the domain if we want only one top-level domain (www.company.com)?

a) By using subdomains, targeted at the relevant country with Google Webmaster Tools:
portugal.company.com/pt (same content)
brasil.company.com/pt (same content)
germany.company.com/de
england.company.com/en
etc.

or b) by using virtual folders:
www.company.com/pt
www.company.com/de
www.company.com/en
etc.

or c) something completely different I do not know about?

What do you reckon is best? I appreciate all suggestions!
Industry News | imsi
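For what it's worth, the two structures can be contrasted with a tiny illustrative helper. The company.com domain and the country/language codes are taken straight from the question; the function itself is purely hypothetical, not a recommendation:

```javascript
// Illustrative only: builds the URL a visitor would land on under
// option (a) country subdomains vs option (b) virtual folders.
function urlFor(option, country, lang) {
  if (option === 'subdomain') {
    // option (a): the country lives in the hostname, language in the path
    return 'https://' + country + '.company.com/' + lang;
  }
  // option (b): only the language appears in the URL; the country is not
  // encoded at all, so geotargeting would have to come from elsewhere
  return 'https://www.company.com/' + lang;
}

// urlFor('subdomain', 'portugal', 'pt') gives 'https://portugal.company.com/pt'
// urlFor('folder', 'germany', 'de') gives 'https://www.company.com/de'
```

One visible trade-off: under option (a), the two Portuguese-language markets (Portugal and Brazil) get distinct URLs for the same content, while under option (b) they collapse into a single /pt URL.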