Location of Content within the Code Structure
-
Hi guys,
When working on advanced, modern websites, achieving the required look and feel often means ending up with pages of 1,000 lines of code or more. In some cases that's unavoidable if we are to meet the client's visual and technical specifications. Say the page is 1,000 lines of code and our content only starts at line 450: will that have an impact on Google's crawlability, and hence hurt our SEO and make it harder to rank?
Thoughts?
Dan.
-
Yes, it's most definitely a factor in rankings, but as you say, to achieve visual perfection on a budget (using a theme rather than coding from scratch) you do end up with a lot of code.
I always make sure my sites score as high as possible in speed tests and that the HTML, CSS, and JavaScript are all properly minified (where possible), and that's about all you can do.
If the site scores at least 90/100 in the PageSpeed test, Google is not going to hold back a site that looks good and has great content just because it carries a lot of code.
Most of that code is there for browsers to render the site correctly, but good SEO depends mainly on the content contained within certain tags. I just checked one of my sites: it has 600 lines of code before my H1 tag, thanks to the Revolution Slider, yet it still ranks in the top 3 for many keywords and scores 93/100 in the PageSpeed test.
All things being equal, custom-built flat HTML sites will always rank better than theme-based PHP template sites, but it's quite rare that all things are equal. Those 400 lines of code may be holding you back by one spot or five, but it's nothing that some good links or great content can't fix. I understand your point, though, as it's a painfully slow process to fix that code.
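If you want to check this on your own pages, a quick sketch like the one below counts how many source lines precede the first H1. It's a hypothetical helper, not any official tool; point it at any saved page source.

```python
# Rough sketch: measure how much markup precedes the first <h1> on a page.
import re

def lines_before_h1(html: str) -> int:
    """Return the number of source lines before the first <h1> tag,
    or -1 if the page has no <h1> at all."""
    match = re.search(r"<h1[\s>]", html, flags=re.IGNORECASE)
    if match is None:
        return -1
    # Count newlines up to the match position = lines above the H1.
    return html.count("\n", 0, match.start())

sample = "<html>\n<head>\n</head>\n<body>\n<h1>Title</h1>\n</body>\n</html>"
print(lines_before_h1(sample))  # 4 lines of markup before the H1 here
```

Run it against your page source and against a competitor's to see whether the "code before content" gap is actually as large as it feels.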
-
If you have a high content-to-HTML ratio, the Googlebot will need less time to crawl your site's content.
In my opinion it's an important on-page SEO factor, which is why I'm always trying to:
-
avoid inline JavaScript
-
keep the HTML clean and simple
-
optimize the code by deleting unnecessary characters like extra whitespace.
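To illustrate that last point, here's a deliberately naive sketch of whitespace stripping. Real minifiers (html-minifier, terser, and so on) are far more careful; this toy version would mangle `<pre>` blocks, so treat it only as a demonstration of the idea.

```python
# Naive "delete unnecessary characters" sketch: collapse whitespace in HTML.
import re

def naive_minify(html: str) -> str:
    # Drop whitespace that sits between tags (pure layout indentation).
    html = re.sub(r">\s+<", "><", html)
    # Collapse runs of spaces/tabs inside text down to a single space.
    html = re.sub(r"[ \t]{2,}", " ", html)
    return html.strip()

before = "<ul>\n    <li>one</li>\n    <li>two</li>\n</ul>"
after = naive_minify(before)
print(len(before), "->", len(after))  # fewer bytes for the same rendered page
```

Even on this tiny snippet the byte count drops noticeably; across a full page the savings add up.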
Last December I switched my WordPress theme to one with better, cleaner code, and within 4-5 weeks my rankings improved considerably. Coincidence? I don't think so.
Br
//Oliver
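The content-to-HTML ratio Oliver mentions can be estimated with a short stdlib-only sketch like the one below. There's no published Google threshold here; the measure is just a rough proxy for how much of the page is visible text versus markup.

```python
# Hypothetical sketch: ratio of visible text to total HTML size.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects text nodes, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside script/style elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def text_to_html_ratio(html: str) -> float:
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html) if html else 0.0

page = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(round(text_to_html_ratio(page), 2))
```

Comparing the ratio before and after a theme change (as in Oliver's case) gives you at least a number to point at, even if the ranking effect itself is hard to isolate.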
-