Gemini Deep Research can now comb through Workspace files
Google’s Gemini Deep Research tool can now reach deep into Gmail, Drive, and Chat to obtain data that might be useful for answering research questions.
Gemini Deep Research is, at present, Gemini 2.5 Pro deputized as an agent, meaning it embarks on a multi-step process to respond to a directive rather than spitting out an immediate response. Deep Research systems incorporate knowledge discovery, workflow automation, and research orchestration.
Google is not the only provider of such systems. OpenAI and Perplexity offer deep research tools too, and various open source implementations are available.
“After you enter your question, it creates a multi-step research plan for you to either revise or approve,” explained Dave Citron, senior director of product management for Google’s Gemini service, in a blog post last year. “Once you approve, it begins deeply analyzing relevant information from across the web on your behalf.”
Now Gemini Deep Research can, if allowed, access data in your Gmail, Drive (e.g. Docs, Slides, Sheets, and PDFs), and Google Chat for added context. If the data you have stored in Google Workspace might be relevant to your research question, granting Gemini access to that data may lead to better results.
There is precedent for this sort of data access among other AI vendors, since providing AI models with access to personal files and data tends to make them more useful – at the expense of privacy and security. Anthropic’s Claude, for example, has web-based connectors for accessing Google Drive and Slack. Its iOS incarnation can access certain apps like Maps and iMessage. And Claude Desktop supports desktop extensions for access to the local file system.
Nonetheless, it’s worth considering Google’s expansive privacy notice for Gemini Apps. On the linked Google Privacy & Terms page, the company says it uses “publicly available information to help train Google’s AI models and build products and features like Google Translate, Gemini Apps, and Cloud AI capabilities.”
As the wording of that passage says nothing about private data, The Register asked Google to clarify. A company spokesperson confirmed that information available to Gemini via connected apps such as Gmail and Drive is not used to improve the company’s generative AI.
However, the Gemini Deep Research privacy notice does include this noteworthy passage: “Human reviewers (including trained reviewers from our service providers) review some of the data we collect for these purposes. Please don’t enter confidential information that you wouldn’t want a reviewer to see or Google to use to improve our services, including machine-learning technologies.”
And it comes with a caution not to use Deep Research for matters of consequence: “Don’t rely on responses from Gemini Apps as medical, legal, financial, or other professional advice.”
So far, reviews of Gemini Deep Research run the gamut from glowing through cautious approval, meh, and mixed to skeptical, with caveats about the accuracy of source labeling and the lack of access to paywalled research, among other things.
While the quality of the initial prompt has a bearing on the end result, this isn’t just a case of “you’re holding it wrong.”
Earlier this year, education consultant and PhD candidate Leon Furze summarized the utility of deep research models as follows:
“The only conclusion I could arrive at is that it is an application for businesses and individuals whose job it is to produce lengthy, seemingly accurate reports that no one will actually read,” he wrote in February. “Anyone whose role includes the kind of research destined to end up in a PowerPoint. It is designed to produce the appearance of research, without any actual research happening along the way.” ®


