Google can now answer your questions with custom data visualizations and graphs. The first domain is financial data, starting with questions about stocks and mutual funds: you can compare stocks, see prices over a specific period, and more. Google credits “advanced models [that] understand the intent of the question,” with AI Mode drawing on historical and real-time information. It will then “intelligently determine how to present information to help you make sense of it.” You can interact with the generated chart and ask follow-up questions.

Other AI Mode features Google previewed at I/O include Search Live, Deep Search, Personal Context, and agentic capabilities powered by Project Mariner. In other AI Mode tweaks, Google restored Lens and voice input to the Search bar when you’re scrolling through the Discover feed.

Meanwhile, Google Labs announced an experiment that “lets you interact conversationally with AI representations of trusted experts built in partnership with the experts themselves.” You can ask questions of these “Portraits” and get back responses based on the expert’s knowledge and “authentic” content/work, delivered in their voice “via an illustrated avatar.” The first is from “Radical Candor” author Kim Scott; you might ask about “tough workplace situations or practice difficult conversations.” Portraits use “Gemini’s understanding and reasoning capabilities to generate a relevant and insightful response.” Google says it “conducted extensive testing and implemented user feedback mechanisms to proactively identify and address potential problematic scenarios.”