Page HTML is great for humans, but seems to be very bad for bots?
-
We recently switched platforms and now use Joomla for our website. Our product page underwent a huge transformation and seems to be user-friendly for a human, but when you look at one of our product pages in SEObrowser it seems that we are doing a horrible job optimizing the page, and our HTML almost makes us look spammy.
Here is an example of a product page on our site:
http://urbanitystudios.com/custom-invitations-and-announcements/shop-by-event/cocktail/beer-mug
And, if you take a look in something like SEObrowser, it makes us look not so good.
For example, all of our footer and header links show up. Our color picker is a bunch of PNGs (over 60, to be exact), and our tabs are the same (except for product description and reviews) on every single product page...
In thinking about the bots:
1 - How do we handle all of the links from the footer and header, and the same content in the tabs?
2 - How do we signal to them that what is important on the page is the description of the product?
3 - We installed schema for price, product image, etc., but can we take it further?
4 - How do we handle the "attribute" section (i.e. our color picker, our text input, etc.)?
Any clarification I need to provide, please let me know.
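On question 3, product schema can usually be taken further than price and image - for example by adding brand, availability, and currency in the Offer. Here is a minimal sketch of building an extended schema.org Product JSON-LD block; all of the product values below are hypothetical placeholders, not your actual data:

```javascript
// Sketch: an extended schema.org Product object (beyond just price/image).
// Every value here is a hypothetical placeholder for illustration.
const productSchema = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Beer Mug Party Invitation",
  description: "Customizable beer mug invitation for cocktail parties.",
  image: "https://example.com/images/beer-mug-invitation.png",
  brand: { "@type": "Brand", name: "Urbanity Studios" },
  offers: {
    "@type": "Offer",
    price: "24.99",
    priceCurrency: "USD",
    availability: "https://schema.org/InStock",
  },
};

// Serialize it for embedding in the page's <head> or body.
function toJsonLdTag(schema) {
  return (
    '<script type="application/ld+json">' +
    JSON.stringify(schema) +
    "</script>"
  );
}

const tag = toJsonLdTag(productSchema);
console.log(tag);
```

The resulting `<script type="application/ld+json">` tag can live anywhere in the page HTML, which keeps the structured data independent of how the visual template is built.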
-
Out of curiosity, what did you think this page was for? Thanks for your insight.
-
Just being honest....
I had absolutely no idea that this was a page for designing an invitation. None at all - until I read your reply.
If this was my site, I would not allow a cool color picker or a coding challenge or whatever to compromise my success by pushing the description down below the fold. I would find a way to make it work, because I bet this will kill the conversion rate.
It's easier to double your income from current traffic than it is to double your traffic.
-
Hi EGOL,
Completely agree on beefing up the content as well as making the product name more relevant. We have run into cannibalization issues before, so we have made our product names less competitive with our category pages and are working on making the page titles incredibly relevant (we haven't done this yet, but "Beer Mug Party Invitation" would be an example of what we'll change the page title to).
We struggle with bringing the product description above the fold because the call to action is to play with the colors and see how customizable and flexible our products really are. We don't want folks to miss that by first seeing the product description.
As far as the HTML of the page goes, however, what are your thoughts? You'll see that the color picker (for example) pulls in 66 PNGs right in a row with a bunch of random numbers... which tells the bot nothing about the page. However, that is how the code is built to make the interface work.
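One common way around the 66-PNG problem is to render swatches as CSS-colored elements built from a small data array, so the crawled HTML carries no meaningless image filenames at all. A rough sketch, assuming each swatch can be expressed as a name plus a hex color (the names and colors below are invented examples, not your actual palette):

```javascript
// Sketch: render color swatches as CSS-colored <button> elements instead of
// dozens of PNG <img> tags. Swatch names/colors are hypothetical examples.
const swatches = [
  { name: "Crimson", hex: "#dc143c" },
  { name: "Navy", hex: "#000080" },
  { name: "Forest", hex: "#228b22" },
];

function renderSwatches(colors) {
  // Each swatch is an accessible button whose color comes from CSS,
  // so the markup contains no image URLs for bots to wade through.
  const items = colors.map(
    (c) =>
      `<button class="swatch" style="background:${c.hex}" ` +
      `aria-label="${c.name}" data-hex="${c.hex}"></button>`
  );
  return `<div class="color-picker">${items.join("")}</div>`;
}

const html = renderSwatches(swatches);
console.log(html);
```

If the picker must stay image-based (e.g. textured swatches), the same idea still applies: inject the picker markup with JavaScript after page load so the initial HTML that bots see stays focused on the product description.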
-
First, I would try to serve visitors by getting the product description up above the fold, immediately visible to the people who visit the website.
Second, I would expand the title tag because "Beer Mug" puts you into generic competition when you want to compete for easier SERPs such as "custom printed beer mug" (or appropriate language for your product).
Third, your description is really, really short. I honestly believe that it has a very good chance of being filtered as trivial content. So, I would take my most important product first and start beefing up the description. As you do that, you will add more relevant words to the page, so in addition to lifting your content above trivial, you will be qualifying for long-tail traffic. Another benefit is that it adds sales appeal and reduces the number of questions that come in by email and phone.
At my office we spend lots of time improving trivial content. I spend a hundred hours a month on that: taking twenty-word pages with one image and improving them to 200-word pages with four images. That is for retail pages. Informative pages go up to over a thousand words, eight images, and a video. (Those numbers are just examples - we don't have word count goals.) The payback in traffic can be very high if you are in a busy niche and have a site with a little authority.