The best way to do interstitial ads
-
Hello,
I want to ask you guys: what's the best way to do interstitials without a penalty?
Feel free to share examples from other major websites. Thanks!
-
And technical aspects?
-
It's the same thing.
Make sure users can exit out of the ad and that it's very clear how they can exit.
-
Hi, I'm not talking about apps; I'm talking about websites.
-
If you are employing full-screen ads that take over a user's entire browser window, then you need to understand a few things from Google's standpoint. Users hate them, so be very careful about when you trigger full-screen floating ads and how often you show them per session. The more people get annoyed by takeover ads, the more of them jump back to the search results. Engagement drops, dwell time is low, and you are sending horrible signals to Google about user happiness.
If you do employ full-screen floating ads, then make sure users can exit the takeover and that it's very clear how they can exit. On some sites I have found myself extremely frustrated at being forced to watch a full-screen ad (which I would never do voluntarily, by the way). Full-screen ads that literally take over my screen, don't let me exit, etc. annoy the heck out of me, and many others feel the same way. High user engagement is king!
If you are using interstitial ads, I can tell you that a distinct portion of your traffic is not enjoying the roadblocks you have in place, and there's a chance that many of those users are popping back to the SERPs. As I've mentioned before, low dwell time is something you want to avoid.
Reference: https://support.google.com/googleplay/android-developer/answer/2986667?hl=en
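To make the "clear exit" and "careful per-session frequency" advice concrete, here is a minimal vanilla-JavaScript sketch. It is only an illustration: the element IDs, the storage key, and the helper names are all hypothetical, not taken from any site mentioned above. It shows the interstitial at most once per session and wires up an obvious close button.

```javascript
// Decide whether to show the interstitial: at most once per session.
// `storage` is anything with getItem/setItem (e.g. window.sessionStorage).
function shouldShowInterstitial(storage) {
  if (storage.getItem('interstitialShown') === '1') {
    return false; // already shown this session; don't annoy the user again
  }
  storage.setItem('interstitialShown', '1');
  return true;
}

// Hypothetical markup: <div id="interstitial"> containing a large,
// clearly labeled <button id="interstitial-close">.
function initInterstitial(doc, storage) {
  const overlay = doc.getElementById('interstitial');
  if (!overlay || !shouldShowInterstitial(storage)) return;
  overlay.style.display = 'block';
  // The exit must be obvious: a visible close control, not a tiny hidden "x".
  doc.getElementById('interstitial-close').addEventListener('click', () => {
    overlay.style.display = 'none';
  });
}
```

In a real site you would call `initInterstitial(document, window.sessionStorage)` after the page loads; swapping `sessionStorage` for `localStorage` with a timestamp would cap the ad per day instead of per session.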