Generating a signature and expires value in Java
-
Hello,
I am developing a tool for my company to get stats from SeoMoz using your API. During development, I have been using the example signature and expires values that are auto-generated for me. Now that testing is complete, my code will need to generate these values itself. I have been searching for a resource demonstrating how to do this in Java, but I have not found a good example. I was hoping that someone at SeoMoz could share a resource or an example.
The email associated with this account belongs to a non-developer, so if a response is provided via email in addition to the forum, sending it to my email would be much appreciated.
Thank you,
Anthony
-
Never mind, I have come up with a solution:
package com.yourpackage.signature;

import java.io.IOException;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import java.util.Date;

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

import org.apache.geronimo.mail.util.Base64; // can be whichever flavor of Base64 encoder you'd like

public class SignatureGenerator {

    public static final String ACCESS_ID = "member-XXXXXXX";
    public static final String SECRET_KEY = "XXXXXXXXXXXXXXXXXXXXXXXXXXx";

    // expireTime should be in seconds since Jan 1 1970: (new Date().getTime() / 1000) + X
    public static String generateSignature(String data, String key, String expireTime, String algorithm)
            throws InvalidKeyException, NoSuchAlgorithmException, IOException {
        // The string to sign is the data (access ID plus newline) followed by the expiration timestamp
        data += expireTime;
        SecretKeySpec secretKey = new SecretKeySpec(key.getBytes("UTF-8"), algorithm);
        Mac mac = Mac.getInstance(algorithm);
        mac.init(secretKey);
        byte[] hmacData = mac.doFinal(data.getBytes("UTF-8"));
        // Base64-encode the raw HMAC bytes to produce the signature
        return new String(Base64.encode(hmacData));
    }

    public static void main(String[] args) {
        try {
            // Expiration: 60 seconds from now, in Unix seconds
            long longTime = (new Date().getTime() / 1000) + 60;
            System.out.println(longTime);
            String data = ACCESS_ID + "\n";
            System.out.println(generateSignature(data, SECRET_KEY, String.valueOf(longTime), "HMACSHA1"));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
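In case it helps anyone else, here is a rough sketch of how the generated signature and expires values could be attached to a request. The endpoint path and the AccessID / Expires / Signature query-parameter names are my assumptions rather than anything confirmed above, so verify them against the current API documentation; also note that the Base64 signature has to be URL-encoded before it goes into the query string.

import java.net.URLEncoder;
import java.util.Date;

public class RequestUrlExample {
    public static void main(String[] args) throws Exception {
        // Expiration a few minutes in the future, in Unix seconds
        long expires = (new Date().getTime() / 1000) + 300;
        String signature = SignatureGenerator.generateSignature(
                SignatureGenerator.ACCESS_ID + "\n",
                SignatureGenerator.SECRET_KEY,
                String.valueOf(expires),
                "HMACSHA1");

        // Target URL to look up, percent-encoded so it can sit inside the request path
        String target = URLEncoder.encode("www.example.com", "UTF-8");

        // Assumed endpoint and parameter names -- check the API docs for the real ones
        String requestUrl = "http://lsapi.seomoz.com/linkscape/url-metrics/" + target
                + "?AccessID=" + SignatureGenerator.ACCESS_ID
                + "&Expires=" + expires
                + "&Signature=" + URLEncoder.encode(signature, "UTF-8"); // Base64 output must be URL-encoded

        System.out.println(requestUrl);
    }
}

From there, any HTTP client (HttpURLConnection, Apache HttpClient, etc.) can issue the GET request and parse the response.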
-
There has been no response from SeoMoz on this forum or to my email.
Please provide some feedback. I am afraid that if I cannot solve this issue I will be forced to cancel our account, as it is not practical for me to manually load the sample signature and expires values on a daily basis.