Does Google have the worst site usability?
-
Google tells us to make our sites better for our readers, which we are doing, but do you think Google has horrible site usability?
For example, in Webmaster Tools, I'm constantly confused by their changes and the way they simply drop features.
In the HTML suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check.
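Since Webmaster Tools doesn't expose a last-updated date for the HTML suggestions export, about the only workaround is to keep your previous download and diff it against the new one. A minimal sketch (file names and CSV layout are assumptions, not anything Google documents):

```python
import csv

def export_changed(old_path: str, new_path: str) -> bool:
    """Return True if two HTML-suggestions CSV downloads differ.

    Webmaster Tools gives no "data last updated" timestamp, so
    comparing yesterday's export against today's is the only way
    to tell whether the data has actually been refreshed.
    """
    with open(old_path, newline="") as f:
        old_rows = list(csv.reader(f))
    with open(new_path, newline="") as f:
        new_rows = list(csv.reader(f))
    return old_rows != new_rows
```

If the function returns False, the export is byte-for-byte the same rows and the data hasn't been refreshed since your last download.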
In the URL removals tool, they used to show you the URLs they had removed. Now that's gone, and the only way to check is to try adding one again.
We don't use any URL parameters, so any that appear are the result of some other site tacking extra strings onto the end of our URLs. There is no way to tell Google that we have no parameters and that it should ignore them all.
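One workaround worth noting (my suggestion, not something the URL Parameters tool offers): a self-referencing canonical tag on every page, so URLs with tacked-on junk parameters consolidate back to the clean URL. A sketch, with a hypothetical domain:

```html
<!-- Self-referencing canonical on every page; any request for
     /page.html?foo=bar then consolidates to the clean URL. -->
<link rel="canonical" href="http://www.example.com/page.html" />
```

This doesn't stop other sites appending parameters, but it tells Google which version of the URL to index.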
Also, new parameters they find are added at the end of the list, so the only way to spot them is to click through to the last page of the list.
-
Yes, that was horrible.
If they tested it first, it must have been with a bunch of goth geeks.
-
"Do as we say, not as we do..."
No site is perfect, and Google is far from it. But kudos to them for testing much of the time and backing off of poor designs, like they did a few weeks ago with the extremely flawed nav menu they were testing: http://searchengineland.com/google-moves-away-from-large-navigation-drop-down-menu-111057