What coding works for SEO and what coding doesn't?
-
Hi:
I recently learned about inline styles and heard that Google started penalizing sites for them in October. I was also told that Wix and Flash don't work (or don't work well) for SEO either, since the engines won't crawl them (I think). Does anyone know of a blog or other resource that goes over everything that doesn't work, so that I could recognize it when I look at someone's code?
Cheers,
Wes
-
Hi Mark:
Thanks very much for your note. I tried something similar, the SEO Browser, which shows you what the search engines see. Very useful.
Cheers, Wes.
-
The general rule of using HTML for everything is one that I would follow. If you're unsure whether something is crawlable, try downloading the Web Developer plugin for Chrome (http://chrispederick.com/work/web-developer/), then disable JavaScript and plugins and refresh the page. Any content that can't be seen at that point probably won't be seen by search engines either.
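To make that check concrete, here is a minimal hedged sketch (the page and its wording are hypothetical, purely for illustration): the first paragraph is plain HTML and stays visible with JavaScript disabled, while the second is injected by a script and disappears, which is roughly what a non-rendering crawler would miss.

<!DOCTYPE html>
<html>
<head><title>Crawlability test page</title></head>
<body>
  <!-- Plain HTML text: still visible with JavaScript and plugins disabled -->
  <p>This product description lives in the HTML itself.</p>

  <!-- Empty placeholder that only the script below fills in -->
  <div id="js-only"></div>
  <script>
    // With JavaScript disabled, this text never appears on the page
    document.getElementById("js-only").textContent =
      "This description is injected by JavaScript and may be invisible to a simple crawler.";
  </script>
</body>
</html>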
-
Hi Wesley,
I would recommend going over a couple of the technical SEO audits you can find on the web. They give clear insight into what SEOs look for when doing a technical audit, and they usually cover far more than just checking for Flash or Wix sites that hide or don't display any content in their HTML.
-
Thanks very much, I'll definitely try the AdWords tool and the cached versions.
Cheers,
wes
-
Basically, a web page has to be crawlable. If an engine can't crawl it, then you don't exist. SEO 101: use HTML for all text. Google has made strides in trying to understand what sites are about, but to keep it easy for crawlers, use HTML.
As for a tool, you can use Google's AdWords tool, add the URL into "landing page", and see how some of the site gets crawled by a simple bot. You can also look at Google's cached versions of sites and use the "text only" option; that should give you a good view of what the crawlers see (there's a small illustration of HTML text versus embedded text after this post).
Hope that helps.
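To make "use HTML for all text" concrete, here is a small hedged illustration (the file names and wording are hypothetical): the first heading is plain HTML text an engine can read directly, while the second buries the same words inside a Flash embed, where a crawler may not be able to read them at all.

<h1>Affordable home insurance quotes</h1>

<!-- The same words locked inside a Flash movie: engines may not reliably see this text -->
<object type="application/x-shockwave-flash" data="/banners/insurance-banner.swf" width="600" height="120">
  <param name="movie" value="/banners/insurance-banner.swf">
  <!-- Fallback HTML shown only when the plugin is unavailable -->
  <p>Affordable home insurance quotes</p>
</object>
-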
Related Questions
-
Product schema GSC Error 'offers, review, or aggregateRating should be specified'
I do not have a SKU, global identifier, rating, or offer for my product. Nonetheless, it is my product. The price is variable (as it's insurance), so it would be inappropriate to provide a high or low price. Therefore, these items were not included in my product schema. The Structured Data Testing Tool showed 2 warnings, for missing SKU and global identifier. Google Search Console gave me an error today that said: 'offers, review, or aggregateRating should be specified'. I don't want to be dishonest in supplying any of these, but I also don't want to have my page deprecated in the search results. BUT I DO want my item to show up as a product. Should I forget the product schema? Advice/suggestions? Thanks in advance.
Technical SEO | RoxBrock1
-
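A note on the error in the question above: Google's Product rich result wants at least one of offers, review, or aggregateRating nested inside the Product. Purely as an illustration of the shape it expects (the name, description, and prices here are hypothetical, and whether quoting a price range is appropriate for a variable-priced insurance product is exactly the judgment call the question raises), the markup might look like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Insurance Product",
  "description": "Hypothetical product used only to show the nested offers block.",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "10.00",
    "highPrice": "100.00"
  }
}
</script>
-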
'duplicate content' on several different pages
Hi, I have a website with 6 pages identified as 'duplicate content' because they are very similar. The pages look similar because they are essentially the same, each showing only a few pictures of the product category; that's why every page looks like the others, even though they are not exactly the same. Is there any way to indicate to Google that the content is not duplicated? I guess they have been marked as duplicate because the code is 90% or more the same across the 6 pages. I've been reviewing the 'canonical' method, but I think it is not appropriate here as the content is not the same. Any advice (other than adding more content)?
Technical SEO | jcobo0
-
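For reference on the 'canonical' method mentioned above: it is a link element placed in the head of the near-duplicate pages, pointing at the version you want indexed. A minimal sketch with a hypothetical URL (whether it fits this case is exactly what the question is weighing):

<head>
  <!-- Tells engines which URL should be treated as the primary version -->
  <link rel="canonical" href="https://www.example.com/product-category/">
</head>
-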
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: 'For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot.' I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup1
-
Test site got indexed in Google - What's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain that our sitebuilder works on got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
Technical SEO | Yarden_Uitvaartorganisatie0
-
Can't for the life of me figure out how this is possible!! Any ideas?
I would imagine it's not all that easy to rank on the 1st page (not going for 1st position here) for https://www.google.com.au/search?q=credi+cards. I am looking at the AU market. For some reason which I can't figure out, Everyday Money Credit Card (https://www.woolworthsmoney.com.au/) ranks number 4. The home page redirects to https://www.woolworthsmoney.com.au/wowm/wps/wcm/connect/wowmoney/wowmoney/home/home/. Why have your homepage in this format? I would love to hear any theories you guys might have. It does not look like they have a strong link profile, and I could not figure out how old the domain was or what other possible reason there is for the site to rank.
Technical SEO | RuchirP0
-
Should I worry about these 404s?
Just wondering what the thinking is on this. We have a site that lets people create user profiles, and once they delete the profile the page then 404s. Our developers told me there is nothing we can do about those, but I was wondering if I should worry about them... I don't think they will affect any of our rankings, but you never know, so I thought I would ask. Thanks
Technical SEO | KateGMaker1
-
SEO Checklist
OK, I know this would be a huge over-simplification, but I am wondering if there is (at least at a bird's-eye view) a checklist of SEO do's and don'ts? I checked to see if something like this existed but could not find one. Any help would be much appreciated. Thanks~
Technical SEO | bobbabuoy0
-
What does 'blocked by Meta Robot' mean? How do I fix this?
When I get my crawl diagnostics, I am getting a 'blocked by Meta Robot' warning, which means that my page is not being indexed in the search engines... obviously this is a major issue for organic traffic!!! What does it actually mean, and how can I fix it?
Technical SEO | rolls1230
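For context on the warning in the question above: 'blocked by meta robots' generally refers to a robots meta tag in the page's head telling engines not to index the page. A minimal hedged illustration of the kind of tag a crawl report flags (not taken from the asker's site) looks like this:

<head>
  <!-- A tag like this tells search engines not to index the page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>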