Tuesday, May 10, 2011

Google Prediction API helps all apps to adapt and learn

By Travis Green, Product Manager

Now your apps can get smarter with as little as a single line of code. They can learn to continually adapt to changing conditions and to integrate new information. This week at Google I/O, we’re making the Google Prediction API generally available, meaning you can create apps with these capabilities for yourself. Additionally, we’re introducing several significant new features, including:
  • The ability to stream data and tune your predictive models.
  • A forthcoming gallery of user-developed, pre-built models to add smarts even faster.
The Google Prediction API can be used by almost any app to recommend the useful, extract the essential, and automate the repetitive. For example:
  • Recommend a new movie to a customer.
  • Identify your most important customers.
  • Automatically tag posts with relevant flags.
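Because the API is a RESTful service, a prediction call is essentially one HTTP request with a small JSON body. Here is a minimal sketch of how such a request could be assembled; the endpoint path, model name, and feature values are illustrative placeholders rather than exact API details, so consult the API documentation for the real URL scheme and payload format.

```python
import json

# Illustrative base URL; the real path and version may differ.
API_BASE = "https://www.googleapis.com/prediction/v1.2"

def build_predict_request(model_id, features):
    """Assemble the URL and JSON body for a prediction request.

    `features` is the list of input values describing the item
    to classify or score (e.g. attributes of a customer or post).
    """
    url = f"{API_BASE}/training/{model_id}/predict"
    body = {"input": {"csvInstance": features}}
    return url, json.dumps(body)

# Hypothetical model and inputs, just to show the shape of a call.
url, body = build_predict_request("movie-recommender",
                                  ["comedy", 35, "weekend"])
print(url)
print(body)
```

In a real app you would POST `body` to `url` with an OAuth-authorized HTTP client (for example via the Python, Java, or JavaScript client libraries mentioned below) and read the predicted label or score from the JSON response.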
In one real-world case, Ford Motor Co. Research is working to use the Prediction API to optimize plug-in hybrid vehicle fuel efficiency by optionally presenting drivers with likely destinations to choose from and, soon, by optimizing driving controls to conserve fuel. Because the API is a cloud-hosted RESTful service, Ford has been able to tap its computationally intensive machine learning algorithms to find patterns that rank potential destinations based on previous driving paths. Ford will be demonstrating its work at the API’s I/O session.

Here’s a summary of the features we added to the API today:
  • Streaming training data: Continually incorporate feedback for fast-adapting systems (e.g. user-chosen tags vs predicted ones, final purchases vs expected).
  • General availability: Anyone can now sign up to use the API. Paid users also receive a 99.9% SLA with increased quota.
  • New JavaScript library: Now deploy the Prediction API in your JavaScript – in addition to our updated Python and Java libraries.
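The streaming feature means a deployed model can keep learning from outcomes as they arrive, such as the tag a user actually chose versus the one the model predicted. As a rough sketch, feeding back one new example could look like the following; again, the endpoint path and field names are illustrative assumptions, not confirmed API details.

```python
import json

# Illustrative base URL; the real path and version may differ.
API_BASE = "https://www.googleapis.com/prediction/v1.2"

def build_update_request(model_id, label, features):
    """Assemble the URL and JSON body for streaming one new
    training example into an existing model (placeholder schema).

    `label` is the true outcome observed (e.g. the tag the user
    actually picked); `features` describes the example itself.
    """
    url = f"{API_BASE}/training/{model_id}"  # sent as an update call
    body = {"classLabel": label, "csvInstance": features}
    return url, json.dumps(body)

# Feed back the tag a user actually chose for a post.
url, body = build_update_request("post-tagger", "politics",
                                 ["senate vote on budget bill"])
print(url)
print(body)
```

Repeating this for each observed outcome keeps the model continually adapting without retraining from scratch.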
Today, we are also announcing the Prediction API’s forthcoming gallery of pre-trained third party predictive models (try these demo models right now), and we will be adding more constantly (maybe yours – waitlist). Once complete, all Prediction API users will be able to:
  • Subscribe to others’ models: improve your apps with others’ predictive data tools.
  • Sell access to your models (e.g. sentiment analysis on social media).
  • Import customized models through the open-standard PMML encoding.
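PMML (Predictive Model Markup Language) is an open XML standard for describing trained models so they can move between tools. For a sense of what an imported model file looks like, here is a minimal, illustrative fragment; the field names are invented for this example and a complete file would also contain a concrete model element.

```xml
<PMML version="4.0" xmlns="http://www.dmg.org/PMML-4_0">
  <Header description="Example sentiment model (illustrative only)"/>
  <DataDictionary numberOfFields="2">
    <DataField name="text" optype="categorical" dataType="string"/>
    <DataField name="sentiment" optype="categorical" dataType="string"/>
  </DataDictionary>
  <!-- A model element (e.g. TreeModel, RegressionModel) would follow here. -->
</PMML>
```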
See our recent blog post for even more ideas, and get started at the Google APIs Console.

Thanks to our community of preview developers, who have played a crucial role in helping us make the Google Prediction API simpler and more powerful since its announcement last year at I/O 2010. We are thrilled to invite all developers to join them.

Travis Green's favorite part about his job is designing smart applications. In his spare time, he is in the great outdoors (looking for trouble).

Posted by Scott Knaster, Editor


  1. What are the precision and recall of this Prediction API? What kind of data is used to test them? Is there a blog post where I can read about the test approach or strategy and a summary of the results, so I can get a better understanding of its quality before I use it?

  2. This is a pretty cool idea, but what is the car going to do once it knows where I drive every day? Is it going to start driving itself? It feels like it would just be collecting data, unless it learned to soften the suspension when I get to a rough patch of road every day.

  3. Can't believe I'm the first to post: Nice job! Eager to see what people will do with this :)

  4. That's why I love Google: their innovation, their ability to create new services for both users and organizations. On another note, I have a blog that is specifically about search and information retrieval. http://recuperaciondeinformacion-gr21.blogspot.com/

  5. @Rowan if you mixed it with traffic flow it could tell you a better time to drive or a better route to avoid traffic jams. It of course could also provide ads for companies you drive past.

  6. We developed a similar idea at Monash, except the data mining framework actually lives on the device and uses optimized algorithms that consume less power. It is called OMM; here is a brief summary of a seminar we gave: http://www.acs.org.au/vic/index.cfm?action=event&area=9003&temID=eventdetails&eveID=10148598860936

  7. Thanks, Google.
    I am doing the same for the Turkish language in my graduation thesis at Dokuz Eylul University.

  8. Google is essential for the community of librarians and archivists, because with the rapid growth of digital information, it is necessary for information professionals to adopt new tools and take ownership of them. I invite you to review a blog on search and information retrieval at http://recoveryinformation2011.blogspot.com/

  9. Can this API predict when it will be killed off by Google, thereby rendering your code useless?

  10. I am an online seller; how can Google Prediction help me? Can it help me decide which products I should carry in my e-shop?

  11. Would second Snirp. Probably a better approach to this launch and retract spiral of hell, is to provide the date till which any of your APIs will be kept functioning and alive. After which all bets are off. That way I can either not use the API or use it knowing that it will be useful for only that much time.
    So my question here is: Do you guarantee this API will survive through to 2014? If so, It's probably worth looking into. Else, stop wasting my time.