Google Cloud—A View From The Top

From cloud computing for chip design to AI for whales and exoplanet discovery.


I recently had the opportunity to attend a retreat-style event in Napa Valley hosted by the Google Cloud team. Like many such events I’ve attended over the years, the guests were treated to excellent accommodations, fabulous food, a good amount of free time for networking or activities, relevant and on-point presentations, and world-class wine (this is Napa Valley, after all). All the ingredients for a memorable couple of days away from the office.

But this one was very different.

Let’s start with the parts that are familiar. We arrived on a Thursday evening and kicked off the event with some fabulous food and even better wine. From there, the group convened for a panel discussion and the first of two presentations from the Google Cloud team.

The panel featured Palo Alto Networks, Fitbit and eSilicon, each presenting its view of how to use Google Cloud Platform (GCP). There was a lot of good, relevant information presented here. The three companies had very different business models to deploy in the cloud, and that diversity made for some interesting discussions. Much of the discussion centered on data consolidation and analytics: how to get the most out of the information your business creates. eSilicon’s take on using the cloud was a bit different from the rest.

eSilicon, on the other hand, discussed our vision to design and tape out custom integrated circuits entirely in the cloud. We’ve mentioned this vision a few times recently. We have an ambitious goal of moving 100% of our design infrastructure to GCP by the end of the summer. That’s both very exciting and a bit frightening. It appears we’ll be the first semiconductor company to achieve this goal, and I have to say that having the Google Cloud team on our side definitely helps me sleep better at night. With the hard work of our IT and software development teams, along with a group of dedicated software and support partners and, of course, the massive GCP infrastructure and support team, we’ll make history this year.

Up until this point, the event had run as expected; however, when the first Google presenter addressed the room, everything changed. Scott Penberthy, Director of Applied AI at Google Cloud, could have told us how powerful and accessible Google’s AI technology is. He could have promoted the technology in the context of the businesses represented by the attendees. At an event like this, you would kind of expect that.

But that’s not what Scott talked about. He covered a lot of ground in his presentation, all having to do with changing the world as we know it with Google AI, but what sticks in my mind is how Google AI is being used to find new exoplanets.

Of course, the hunt for new planets and potential extraterrestrial life isn’t new. NASA’s Kepler Space Telescope was launched in 2009 to discover Earth-size planets orbiting other stars. The mission was recently retired, but during its nine and a half years in space it collected 678 GB of data and led to the discovery of many new planets.

Recently, however, additional planets have been discovered in this data using a new approach to analyzing it: machine learning techniques developed by Google. The conventional method of finding planets in the Kepler data was for researchers, aided by computer algorithms, to inspect the data for the telltale patterns, periodic dips in a star’s brightness, that indicate a planet passing in front of it. The new approach is to train a model to recognize those patterns itself and present candidate planets back to the researcher. Taking the human out of the data inspection loop makes things run a lot faster. Google has made the results of this research available for all to use. You can learn more about Google’s contributions here.
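To make the idea a little more concrete, here is a minimal sketch of that kind of classifier. It is not Google’s published model, just an illustration: a small 1D convolutional network trained to flag transit-like dips in a light curve. The synthetic light curves, the injected dip, and the network shape are all assumptions made up for this example.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in for Kepler light curves (illustrative only).
# Label 1 means the curve contains a small, box-shaped "transit" dip.
rng = np.random.default_rng(0)
n_curves, length = 1000, 201
X = rng.normal(1.0, 0.01, size=(n_curves, length))   # flat stellar brightness + noise
y = rng.integers(0, 2, size=n_curves)
X[y == 1, 90:110] -= 0.02                             # inject a toy transit dip

# A small 1D CNN that learns to spot the dip, loosely in the spirit of
# training a model to find planet candidates instead of inspecting data by hand.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(length, 1)),
    tf.keras.layers.Conv1D(16, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability of "planet"
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X[..., None], y, epochs=5, validation_split=0.2)
```

Once trained, a model like this can score every light curve in the archive and hand researchers a ranked list of candidates to confirm, which is where the speed-up over manual inspection comes from.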

The following day included several group activities, concluding with a visit to the historic Inglenook Winery. From there, our meeting ended with the second talk from a Google researcher. Once again, it was a “missed opportunity” to promote GCP to the audience for their businesses. Rif A. Saurous is a Google Research Director with a passion for audio, signal processing and machine learning. One could immediately envision someone like this improving the Google Assistant user experience. That is part of what Rif thinks about, but his presentation focused on something else: saving humpback whales.

Knowing where these creatures are and how they migrate helps to protect the species. But finding them isn’t easy. Humpback whales sing in a very distinctive way, and the National Oceanic and Atmospheric Administration (NOAA) has collected a lot of sound from the oceans over the years, about 9 terabytes in all. Sifting through all this data can be daunting. This should start to sound familiar. Enter Google and its AI technology. Once the AI was taught to listen for humpback whale songs, it was able to find a lot of them with very little human effort, just Rif and a few other folks. You can learn more about the humpback whale project here.
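The underlying recipe is much the same as the exoplanet work: turn the raw signal into something a network can classify. For ocean audio, that typically means converting each clip into a spectrogram and training an image-style classifier on it. The sketch below is again a made-up illustration, with synthetic “whale-like” tones standing in for real NOAA recordings and a network shape chosen only for brevity.

```python
import numpy as np
import tensorflow as tf
from scipy import signal

# Hypothetical sample rate and clip length for underwater recordings.
rng = np.random.default_rng(0)
sr, seconds, n_clips = 2000, 5, 400

def synth_clip(has_song: bool) -> np.ndarray:
    """Ocean noise, optionally with a rising tone standing in for a whale call."""
    t = np.arange(sr * seconds) / sr
    clip = rng.normal(0.0, 1.0, size=t.shape)
    if has_song:
        clip += 3.0 * np.sin(2 * np.pi * (100.0 + 40.0 * t) * t)  # toy frequency sweep
    return clip

# Convert each clip into a log spectrogram, the "image" the classifier sees.
labels = rng.integers(0, 2, size=n_clips)
specs = [np.log1p(signal.spectrogram(synth_clip(bool(lab)), fs=sr, nperseg=256)[2])
         for lab in labels]
X = np.stack(specs)[..., None]    # shape: (clips, freq_bins, time_frames, 1)

# A small 2D CNN that flags clips likely to contain a whale song.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=X.shape[1:]),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=3, validation_split=0.2)
```

With a detector like this, terabytes of recordings can be scored automatically and only the most promising clips need a human listener, which is how a handful of people can cover an archive as large as NOAA’s.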

So, our event ended with another story of how Google is changing the world with its technology. The event was memorable, but the approach Google took to explain its mission is noteworthy. The company’s perspective on technology often focuses on projects of planetary (or multi-planetary) scale. How we all use that technology in our everyday businesses is left as an exercise for the user, with assistance from Google Professional Services when needed, of course.


