What is 508 compliance and how do I ensure it gets done?
-
Greetings Mozzers,
I'm completely new to 508 compliance and hadn't heard of it until yesterday. It came up in a conversation about W3C validation (which I know Google doesn't necessarily check for), but I'd never come across 508 before.
So my question is: what is 508, and what needs to be done on the back end to become compliant, or at least to check for compliance? All I know is that 508 refers to a government regulation requiring websites to be accessible to everyone.
Thanks for helping out with a rookie question!
Cheers,
Pedram -
Hi Pedram,
It looks like Section 508 is US legislation that applies to federal agencies and their information technology, so unless you are building for a federal agency (or under a contract that requires it), this may not be something you need to worry about.
Looking at your profile, it looks like your company supplies government clients but isn't itself a federal agency; is that right? (I was looking at this page: http://www.learningtree.com/government/.)
If you are building for a federal body, the act deals mainly with accessibility issues and is part of the Rehabilitation Act (not the Americans with Disabilities Act, as is often assumed). Sorry to link to Wikipedia, but information on the full act (with links to Section 508) is here: http://en.wikipedia.org/wiki/Rehabilitation_Act_of_1973
Hope this helps! It might be worth checking with someone internal at Learning Tree who knows about the compliance issues relating to your government work; I don't see anywhere that contractors to government agencies are explicitly included, but I could be wrong.
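Section 508's technical requirements overlap heavily with general web accessibility guidelines (WCAG), so "checking for compliance" usually starts with the same basics. As a small illustration only, not a substitute for a real audit tool like axe or WAVE, here's a stdlib-Python sketch that flags `<img>` tags with no `alt` attribute, one of the most commonly cited accessibility failures:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute --
    one of the most commonly cited accessibility failures."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if "alt" not in attr_dict:
                self.missing_alt.append(attr_dict.get("src", "(no src)"))

def find_images_missing_alt(html):
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt

page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
print(find_images_missing_alt(page))  # -> ['chart.png']
```

A real audit covers far more (keyboard navigation, color contrast, form labels, captions), but scripted checks like this catch the low-hanging fruit.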
Related Questions
-
How to get visibility in Google Discover?
Hey everyone, I run a website that publishes articles about pets. I have read some great things about Google Discover and the potential traffic it can bring to publishers (Condé Nast reported up to 20% of its US traffic coming from Discover at one point). I am currently trying to get indexed there, and after reading Google's guidelines and an Ahrefs guide, I have made many optimizations to my site: structured data, creating an author page, fixing image sizes and publishing dates... So far, it's not working. I feel the lack of a knowledge graph entry for my business may hurt my chances, so I'm currently building a GMB page to fix this. Do you have other recommendations, or success stories from your own experiments with Discover? An example of an article I tried to get indexed is https://www.lebernard.ca/teletravail-chien-guide-survie/. Obviously, I'm not expecting feedback on the quality of the content since it's in French, but I'm curious whether you see anything from a technical perspective that doesn't work. Thanks a lot for your help! Charles
Intermediate & Advanced SEO | | Cheebee1240 -
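Since structured data comes up in the question above: a minimal sketch of generating a schema.org Article JSON-LD block of the kind Google's article guidelines describe. The exact property set worth including is an assumption to verify against Google's current documentation (Discover specifically favors large images), and the example values are placeholders:

```python
import json

def article_json_ld(headline, author_name, date_published, image_urls):
    """Serialize a minimal schema.org Article block as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author_name},
        "datePublished": date_published,  # ISO 8601 date
        "image": image_urls,              # Discover favors images >= 1200px wide
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

print(article_json_ld("Teletravail avec un chien", "Charles",
                      "2021-03-15", ["https://example.com/dog-1200w.jpg"]))
```

The resulting tag goes in the page `<head>` and can be validated with Google's Rich Results Test.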
In Google search results, is it a sitelink or what? How do I get this?
Hello Experts, When I search in Google for a keyword, one website in the results shows a few extra links under its meta description (image attached). Can you please let me know what this is and how to achieve these types of links? Thanks!
Intermediate & Advanced SEO | | wright3350 -
Will these 301s get me penalized?
Hey everyone, We're redesigning parts of our site and I have a tricky question that I was hoping to get some sound advice about. We have a blog (magazine) with subcategory pages that are quite thin. We are going to restructure the magazine to feature different content and new subcategories, so we are trying to decide where to redirect the existing subcategory pages, e.g. Entertainment, Music, Sports, etc. (www.charged.fm/magazine). Our new ticket category pages (Concert Tickets, NY Yankees Tickets, OKC Thunder Tickets, etc.) are going to feature a tab called 'Latest News', and we are thinking of 301 redirecting the old magazine subcategory pages there. So Sports News from the blog would 301 to Sports Tickets (its Latest News tab); see the attached screenshot for an example. So my question is: Will this look bad in the eyes of the GOOG? Are these closely related enough to redirect? Are there any blatant pitfalls that I'm not seeing? It seems like a win/win, because we are making a rich performer page with News, Bio, Tickets and Schedule, and getting to reallocate the link juice that was being wasted on a pretty much useless page that had been allowed to become too powerful. Gotta keep those pages in check! Thoughts appreciated. Luke
Intermediate & Advanced SEO | | keL.A.xT.o0 -
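The redirect plan described above boils down to a one-to-one map from retired subcategory URLs to the new ticket category pages. A sketch of that mapping in Python (the paths here are hypothetical, not charged.fm's real ones; in production this would live in the web server or framework routing):

```python
# Hypothetical mapping of retired magazine subcategory URLs to the new
# ticket category pages; paths are illustrative only.
REDIRECTS = {
    "/magazine/sports": "/sports-tickets",
    "/magazine/music": "/concert-tickets",
    "/magazine/entertainment": "/concert-tickets",
}

def handle_request(path):
    """Return (status, location): a 301 for retired pages so link equity
    is passed to the new page, or a plain 200 with no redirect otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, None

print(handle_request("/magazine/sports"))  # -> (301, '/sports-tickets')
```

Using a 301 (permanent) rather than a 302 signals that the move is final, which is what passes most of the old page's authority along.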
Can't get auto-generated content de-indexed
Hello and thanks in advance for any help you can offer me! Customgia.com, a costume jewelry e-commerce site, has two types of product pages: public pages that are internally linked, and private pages that are only accessible by accessing the URL directly. Every item on Customgia is created using an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Prior to saving a design, the user is required to enter a product name and choose "public" or "private" for that design. The page title and product description are auto-generated. Since launching in October '11, the number of products grew and grew as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3000. I realized many of these designs were similar to each other, and occasionally exact duplicates. So over the past 8 months, I've made 2300 of these designs "private" and no longer accessible unless the designer logs into their account (these pages can still be linked to directly). When I realized that Google had indexed nearly all 3000 products, I entered URL removal requests in Webmaster Tools for the designs that I had changed to "private". I did this starting about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears that most of these product pages were never removed from the index, or if they were removed, they were added back after the 90 days were up. Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now, but these product pages can be hidden if necessary.
I think the big problem is the 2000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for Ideally, it should return just over 716 results but instead it's returning 2650 results. Most of these 1900 product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google never crawled them. Last week, NOINDEX tags were added to all 1900 "private" designs so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed. One solution I initially thought might work is to re-enter the removal requests because now, with the NOINDEX tags, these pages should be removed permanently. But I can't determine which product pages need to be removed because Google doesn't let me see that deep into the search results. If I look at the removal request history it says "Expired" or "Removed" but these labels don't seem to correspond in any way to whether or not that page is currently indexed. Additionally, Google is unlikely to crawl these "private" pages because they are orphaned and no longer linked to any public pages of the site (and no external links either). Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages - would that be easier than battling with Webmaster tools for months on end? Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813 Thanks for reading through all this!
Intermediate & Advanced SEO | | rja2140 -
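The fix described above (NOINDEX on private designs) reduces to a small piece of template logic: emit a robots meta tag based on the design's visibility flag. A sketch in Python, where the function name and flag are illustrative, not Customgia's actual code:

```python
def robots_meta(is_public):
    """Robots meta tag for a product page: private, user-saved designs get
    noindex (so Google drops them on its next crawl) but keep 'follow';
    public designs stay fully indexable."""
    content = "index, follow" if is_public else "noindex, follow"
    return '<meta name="robots" content="%s">' % content

print(robots_meta(False))  # -> <meta name="robots" content="noindex, follow">
```

One caveat the question itself raises: orphaned pages may rarely be recrawled, so the noindex tag can take a long time to be seen. Temporarily submitting a sitemap listing the private URLs is one way to prompt Google to recrawl them and notice the tag.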
Penalized because of a pharma WordPress hack (now fixed). When can we expect to get out?
Hey guys, so one of our clients hired a web designer to redo his site. Unfortunately, in the process the client got a nasty pharma hack, and we had to completely rebuild his site in Drupal from scratch because the hack was so difficult to remove. In the process he lost all his rankings (now sub-100), and the hack produced super-low-quality links from drug-related sites pointing to his pages. We're 100% certain the hack is gone; we've disavowed every link and used WMT to de-index all the drug pages the hack had created. Still, 2 weeks later he is sub-100. Does anyone know of any way to push this along faster? I wish there were some way to get Google to recognize it's fixed sooner, as his business is destroyed.
Intermediate & Advanced SEO | | iAnalyst.com0 -
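"Disavowed every link" in the post above refers to Google's disavow-file format: a plain-text file with optional `#` comment lines and one `domain:` (or bare URL) line per entry, uploaded through Search Console. A small sketch that assembles such a file from a list of hack-created domains (the domains here are placeholders):

```python
def build_disavow_file(spam_domains, note=""):
    """Assemble a links-disavow file in the plain-text format Google
    accepts: optional '#' comment lines, then one 'domain:' line per
    domain, deduplicated and sorted for readability."""
    lines = []
    if note:
        lines.append("# " + note)
    for domain in sorted(set(spam_domains)):
        lines.append("domain:" + domain)
    return "\n".join(lines) + "\n"

print(build_disavow_file(
    ["cheap-pills.example", "spam-pharma.example", "cheap-pills.example"],
    note="Links created by pharma hack; site since cleaned"))
```

Using `domain:` lines rather than individual URLs covers every page on a spammy domain at once, which is usually what you want for hack-generated links.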
How do you get local SEO results to show up in search results?
I am looking at an example of search results like the image below. I want the local address to appear to the right of my website's listing. How do I get something like this?
Intermediate & Advanced SEO | | herlamba0 -
What is the best approach for getting comments indexed, but also providing a great UX?
The way our in-house comments system was built, it uses AJAX to fetch comments as the page is loaded. I'm working on a set of requirements to convert the system over to be more SEO-friendly. Today, we show the first 20 comments with a "load more comments" button, which calls the server to load more. This is what I'm trying to figure out: should we render all the comments into the page behind the scenes and lazy-load them visually, or keep the same "load more" button but have it reveal comments already present in the page? Or does anyone have a better suggestion about how to make the comments crawlable for Google?
Intermediate & Advanced SEO | | JDatSB0
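One common pattern for the question above (an assumption about the right trade-off, not the only answer): server-render the first page of comments directly into the HTML so crawlers see them without executing JavaScript, and keep the AJAX "load more" for the rest. A minimal sketch in Python, with hypothetical markup:

```python
from html import escape

PAGE_SIZE = 20

def render_comments(comments, page_size=PAGE_SIZE):
    """Server-render the first page of comments so they appear in the raw
    HTML a crawler fetches; later pages stay behind the AJAX button."""
    first_page = comments[:page_size]
    items = "\n".join(
        '<li class="comment">%s</li>' % escape(c) for c in first_page)
    html = "<ol>\n%s\n</ol>" % items
    if len(comments) > page_size:
        # Remaining comments are fetched on click; to make them crawlable
        # too, the button could point at paginated URLs (?comment-page=2).
        html += '\n<button data-next-page="2">Load more comments</button>'
    return html

print(render_comments(["First!", "Nice post"]))
```

This keeps the UX (fast initial load, progressive reveal) while guaranteeing the most important comments are indexable.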