My internship: integrating Gemini into Data Commons

Google’s internships are an important part of our culture of building for everyone. Internships are designed to be more than just a summer job: they’re an opportunity to tackle real-world challenges and make an impact. Interns work alongside full-time Googlers, contributing to the helpful products and services that people use every day. While they gain hands-on experience and grow their skills under the guidance of dedicated mentors, we benefit from their curiosity and new approaches to problem-solving. To learn more, visit our Google Careers site, set up job alerts, and apply when applications open in the fall.

Q. Meet our intern Javier Vazquez: Tell us about yourself!

I’m Javi, a grad student doing a Master’s in AI at the University of Texas at El Paso. I’m excited to share a glimpse into my summer internship and the impactful feature I had the privilege to work on. It’s been an incredible journey of learning, collaboration, and bringing ideas to life.

Q. What opportunity were you solving for? What is the solution you helped implement? 

A quote from my internship host truly resonated with me throughout the entire internship: “It’s easy to just do a thing, but it is a lot harder to do the thing right.” 

This idea inspired me to deeply consider what makes a feature useful for Data Commons users. With guidance from the team, I developed a new feature that leverages the power of LLMs to suggest follow-up questions for deeper exploration of our data.

The core of the project was a new feature that integrates Gemini directly into Data Commons. To better serve data analysts, I leveraged Gemini’s generation capabilities, which are fantastic at producing engaging content, to upgrade the Explore pages on the Data Commons platform. The feature uses context from an Explore page to automatically generate new, engaging follow-up content, directly meeting data analysts’ data exploration needs.

The “Keep exploring” feature in Data Commons

As we thought about which model to use (Gemma, or Gemini 2.5 Pro vs. Flash), we had to weigh the tradeoff between quality and speed. After extensive testing, we found the optimal configuration: Gemini 2.5 Flash handled follow-up questions best, striking the right balance of quality and latency. I’m very proud that the follow-up questions feature is now fully rolled out, and you can give it a try!
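As a rough illustration of this kind of pipeline (not the team’s actual implementation), here is a minimal sketch in Python using the public google-genai SDK: it assembles a prompt from an Explore page’s context and asks Gemini 2.5 Flash for follow-up questions. The prompt wording, the `build_followup_prompt` and `suggest_followups` helpers, and the page fields are all illustrative assumptions.

```python
# Illustrative sketch only; the prompt text and helper names are assumptions,
# not Data Commons source code.


def build_followup_prompt(page_title: str, variables: list[str]) -> str:
    """Assemble a prompt asking the model for related questions to explore."""
    var_list = "\n".join(f"- {v}" for v in variables)
    return (
        f"A user is viewing a Data Commons Explore page titled "
        f"'{page_title}' showing these statistical variables:\n{var_list}\n"
        "Suggest three concise follow-up questions for deeper exploration."
    )


def suggest_followups(page_title: str, variables: list[str]) -> str:
    """Send the prompt to Gemini 2.5 Flash (chosen here for its
    speed/quality balance) and return the generated suggestions."""
    from google import genai  # pip install google-genai

    client = genai.Client()  # reads GEMINI_API_KEY from the environment
    response = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=build_followup_prompt(page_title, variables),
    )
    return response.text


if __name__ == "__main__":
    print(suggest_followups(
        "Health in California",
        ["Life expectancy", "Obesity rate"],
    ))
```

Keeping the prompt assembly separate from the model call makes the context-to-prompt step easy to test and tune independently of which Gemini model is ultimately selected.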
