How do I authenticate a script with Search Console API to pull data
-
Regarding this article, https://moz.com/blog/how-to-get-search-console-data-api-python
I've gotten all the way to the part where I need to authenticate the script so it can run. I grant access to GSC and the localhost page with the code comes up. The article says to grab the portion between = and #, but that doesn't seem to be the case anymore. This is what comes up in the browser:
When I put portions of it in, it always comes back with an error.
Help!
-
Hi Jo. I think you want everything after code= and before the &.
In the example you pasted, that would be:
4/igAqIfNQFWkpKyK6c0im0Eop9soZiztnftEcorzcr3vOnad6iyhdo3DnDT1-3YFtvoG3BgHko4n1adndpLqjXEE
If that doesn't work (or rather, it doesn't work when you re-run it and use whatever value comes up next time), let us know and I'll pull in someone who has done this themselves (I'm just reading the same instructions!).
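For reference, the piece between code= and the next & can be pulled out programmatically instead of by eye. A minimal sketch in Python, using the code value from the example above (the rest of the URL is made up for illustration, since the original paste didn't come through):

```python
# Hypothetical example: extracting the authorization code from the
# localhost redirect URL Google sends you to after you grant access.
# Only the code= value is from the thread; the rest is made up.
from urllib.parse import urlparse, parse_qs

redirect_url = (
    "http://localhost/?code="
    "4/igAqIfNQFWkpKyK6c0im0Eop9soZiztnftEcorzcr3vOnad6iyhdo3DnDT1-3YFtvoG3BgHko4n1adndpLqjXEE"
    "&scope=https://www.googleapis.com/auth/webmasters.readonly"
)

# parse_qs handles the splitting on & and the URL-decoding for you,
# so you don't have to eyeball the string in the address bar.
params = parse_qs(urlparse(redirect_url).query)
auth_code = params["code"][0]
print(auth_code)
```

If you paste the full URL from your browser into redirect_url, auth_code should be exactly the value the script expects.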
Good luck
Related Questions
-
Nofollow versus data-href
We have a couple of Tier-1 websites that contain a lot of affiliate links. These outgoing affiliate links currently carry the rel="nofollow" attribute. Yet I see a lot of other websites and competitors use data-href="" instead of nofollow. Is the latter better for SEO purposes, or are they just using data-href for better tracking?
Technical SEO | LoyensT
-
Google Search Console - Sitemap
Hi all, quick question. I'm trying to update my sitemap in Google Search Console using a sitemap.xml file I've created with Screaming Frog. However, when I try to submit it, it seems Google only accepts sitemaps located at a path on your domain (i.e. www.example.com/sitemap.xml), as opposed to letting you upload a sitemap.xml file directly. Is there any way I can easily upload my sitemap.xml file? Or is there an easy way to upload the file to a path on my domain so I can submit it via the URL? Any insight would be much appreciated! Best, Sung
Technical SEO | hdeg
-
"Ghost" errors on blog structured data?
Hi, I'm working on a blog whose Search Console account reports a big batch of structured data errors: Structured data - graphics, Structured data - hentry list, Structured data - detail. But when I go to https://developers.google.com/structured-data/testing-tool/ it tells me everything is OK: Structured data - test. Any clue? Thanks in advance
Technical SEO | Webicultors
-
Need Help On Proper Steps to Take To De-Index Our Search Results Pages
So, I have finally decided to remove our search results pages from Google. This is a big deal, but our traffic has been declining consistently since 2012 and it's the only cause I can think of. The reason these pages got indexed: back in 2012 we put keyword tag links on our product pages, but they linked to our search results pages, so over time we had hundreds of thousands of search results pages indexed. By tag pages I mean keywords like Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies, each linked to our search results pages, i.e. http://oursite.com/Search.html?text=Kitten-Doggies. I really think these indexed pages are causing much of our traffic problems, as many more search pages are indexed than actual product pages. So, my question is: should I remove the links/tags from the product pages (or at least trim them down to the top 8 or so) and add noindex, nofollow to all the search results pages at the same time? Or should I first noindex, nofollow ALL the search results pages and leave the tags on the product pages, to give Google a chance to follow those tags back to all of the search results pages so that it can actually see the noindex, nofollow on them? Otherwise, will Google not be able to find these pages at all? Can someone comment on the best, safest, or fastest route? Thanks so much for any help you might offer me!! Craig
Technical SEO | TheCraig
-
HTTP to HTTPS Transition, Large Drop in Search Traffic
My URL is: https://www.seattlecoffeegear.com. We implemented https across the site on Friday. Saturday and Sunday search traffic was normal or slightly higher than normal (in analytics) and slightly down in GWT. Today it has dropped significantly in both, to about half of normal search traffic. From everything we can see, we implemented this correctly: we 301 redirected all http requests to https (and yes, they go to the correct page and not to the homepage 😉 ), rewrote hardcoded internal links, registered/submitted sitemaps for https in Bing and GWT, used Fetch and Render to ensure Google could reach the site and was redirected appropriately from http to https versions, and ensured robots.txt does not block https or secure. We also use a CDN (though I don't think that impacts anything) and have had no customer issues with accessing or using the website since the transition. Is there anything else I might be missing that could correlate to a drop in search impressions, or is this just a waiting game of a few days while Google sorts through the change and reindexes everything (it dropped to 0 indexed pages for a day and is now up to 1,744 of our 2,180 pages indexed)? Thank you so much for any input! Kaylie
Technical SEO | Marketing.SCG
-
How much damage in search rank will my site suffer during an upcoming cms migration?
Hello, this is my first time on the SEOmoz forums and I hope I can get a real answer from this community of experts. I am migrating an existing site from an older CMS (MODX) to a newer CMS (ExpressionEngine). The domain name isn't changing, and neither are the keywords and keyphrases for existing pages. What is changing, however, is the URL suffix: I am going from www.domain.com/page-name.html to www.domain.com/page-name. I can't seem to replicate the .html suffix in the new URLs, which is the only reason I will be setting up 301 permanent redirects from the old URLs to the new ones. My question is: will the domain suffer a loss in PageRank or a substantial decline in search engine position as a result of this migration process? How bad will it be? When can I expect my rankings to recover?
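Since only the suffix changes in a migration like this, the 301 map is mechanical: each old URL maps to itself minus .html. A hypothetical sketch (the URLs are made up; in practice the list would come from a crawl of the old site):

```python
# Hypothetical sketch: building a 301 redirect map for a migration
# where only the URL suffix changes (page-name.html -> page-name).

def new_url(old_url: str) -> str:
    """Map an old .html URL to its suffix-less equivalent on the new CMS."""
    suffix = ".html"
    if old_url.endswith(suffix):
        return old_url[: -len(suffix)]
    return old_url  # already suffix-less; leave unchanged

# Made-up old URLs; you'd export the real list from a site crawl.
old_urls = [
    "http://www.domain.com/page-name.html",
    "http://www.domain.com/another-page.html",
]

# One permanent (301) redirect per old URL, e.g. for a server
# rewrite map or the CMS's redirect module.
redirect_map = {old: new_url(old) for old in old_urls}
for old, new in sorted(redirect_map.items()):
    print(f"301: {old} -> {new}")
```

With a complete one-to-one map like this in place, the redirects preserve each page's target rather than funneling everything to the homepage.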
Technical SEO | amit2076
-
New information in the description in natural search
Hello, I'm wondering what the highlighted information appearing in the description (see the attached file) is... this is new to me. Thanks
Technical SEO | eder.machado
-
Search optimal Tab structure?
Good day, we are in the process of starting a website redesign/development. We will likely be employing a tab structure on our home page and would like to capitalize on the keyword content found across the various tabs. The tab structure will be similar to how this site achieves tabs: http://ugmo.com/ I've uploaded a screen grab of this page fetched as the Googlebot user agent. The text "Soil Intelligence for professional Turf Managers" clicks through to this page: http://ugmo.com/?quicktabs_1=1#quicktabs-1 So I'm thinking there could be some keyword dilution there. That said, Google is very much aware that the text on the quicktabs-1 page is related to the home page content: http://www.google.com/search?q=Up+your+game+with+precise+soil+moisture%2C+salinity+and+temperature+measurements.+And+in+the+process%2C+save+water%2C+resources%2C+money.+inurl%3Augmo.com&sourceid=ie7&rls=com.microsoft:en-us:IE-SearchBox&ie=&oe= Is this the best search-optimal way to add keyword density on a home page with a tab structure? Or is there a better means of achieving this?
Technical SEO | Hershel.Miller