Precisely announced a new strategic technology partnership with Opendatasoft, a data marketplace solution provider. Together, they will deliver an integrated data marketplace designed to simplify access to trusted, AI-ready data across businesses and teams – seamlessly and in compliance with governance requirements. The new data marketplace will integrate with the Precisely Data Integrity Suite, combining the Suite’s robust data management capabilities with the intuitive, self-service experience of Opendatasoft’s data-sharing platform. This combination is intended to ensure that accurate, consistent, and contextual data products are not only well-managed behind the scenes – they are also easy to discover, use, and share across the organization, with partners, or even through public channels. The result is improved accessibility, faster adoption, and a frictionless experience that supports enterprise-wide compliance and data-sharing needs. Franck Carassus, CSO and Co-Founder of Opendatasoft, said: “Together with Precisely, we’re enabling them to support greater data sharing and consumption by business users, unlocking new opportunities for AI and analytics, and maximizing ROI on their data investments.” By creating a flexible foundation for AI, analytics, and automation, the marketplace helps customers streamline operations, reduce the cost of ownership, and accelerate time-to-insight. Precisely says it enables organizations to modernize with intelligence and resilience – empowering them to build the modern data architectures needed to support dynamic data marketplaces and self-service access across the enterprise.
Klaviyo’s enhanced MCP server securely links AI tools to customer data via real-time APIs and remote access, enabling conversational analytics, audience suggestions, and automated campaign execution.
CRM and marketing automation company Klaviyo Inc. announced the general availability of its enhanced Model Context Protocol server, which gives marketers the ability to connect AI tools such as Claude Desktop, Cursor, VS Code, Windsurf and other local or web-based tools directly with Klaviyo. The enhanced MCP server includes improved reporting context and a new remote server for broader accessibility, making it easier for marketers to bring AI into their workflows with more opportunity for speed and automation. The solution assists marketers who want to scale up performance by training AI platforms to deliver better insights, recommendations and content. The remote MCP server offers secure online setup and real-time access to create, read and update data through Klaviyo’s API without adding complexity to the marketing technology stack. The MCP server makes it easy for marketers to accelerate their work in Klaviyo with AI tools, including a conversational chat interface that allows customers to interact with Klaviyo using natural language prompts. Marketers can quickly ask questions such as which campaign is driving the most revenue, how clickthrough rates have changed over time, or how performance compares across accounts. Using the platform, marketers can request AI-generated suggestions for new audience segments, subject lines modeled on top performers, or strategies to improve open rates in key flows. The MCP server also supports AI-driven execution, letting marketers move from idea to action: with simple prompts, users can upload event profiles, draft promotional emails, or add images directly into Klaviyo.
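To make the connection model concrete, here is a minimal sketch of a remote MCP client session using the open-source `mcp` Python SDK. The endpoint URL, tool name, and arguments are hypothetical placeholders, not Klaviyo’s documented interface; they only illustrate the discover-then-call pattern that tools like Claude Desktop run automatically after parsing a natural-language prompt.

```python
# A minimal sketch of a remote MCP client session using the
# open-source `mcp` Python SDK. The URL, tool name, and arguments are
# hypothetical placeholders, not Klaviyo's documented interface.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client  # SSE transport for remote servers

REMOTE_MCP_URL = "https://example.com/mcp"  # placeholder endpoint

async def main() -> None:
    async with sse_client(REMOTE_MCP_URL) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover what the server exposes (reporting, segments, ...).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call a hypothetical reporting tool the way an AI assistant
            # would after interpreting a natural-language question.
            result = await session.call_tool(
                "campaign_report",  # hypothetical tool name
                {"metric": "revenue", "timeframe": "last_30_days"},
            )
            print(result)

asyncio.run(main())
```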
Vast Data’s SyncEngine helps AI agents tap unstructured data from every source; provides unlimited ingest throughput via scale‑out nodes, deep metadata indexing, and real‑time searchable catalogs spanning cloud, on‑prem, and edge
Data storage company Vast Data Inc., which is in the process of transforming itself into an “operating system” for artificial intelligence, announced a new capability called Vast SyncEngine. The company says it acts as a “universal data router,” combining a highly performant onboarding system for unstructured data with a global catalog for building AI data pipelines. Available at no additional cost to existing customers, Vast SyncEngine is designed to eliminate the headaches of discovering and mobilizing distributed, unstructured datasets and software-as-a-service tools, so these data sources can quickly be plugged into customers’ AI applications. Vast SyncEngine combines core distributed computing services, including storage, compute, messaging and reasoning, into a unified data layer that spans cloud, on-premises and edge environments to power AI applications and agents. It works by collapsing data cataloging, migration and transformation into a single, no-cost capability within the Vast AI OS platform, enabling simpler integration and faster time to insights at lower cost. It enables teams to catalog and search across trillions of files and objects leveraging the Vast DataBase, with unlimited ingest throughput that’s bound only by the performance of the source and target systems. Moreover, its massively parallel architecture supports rapid scaling, simply by adding more nodes. By using Vast SyncEngine, companies can quickly build real-time, searchable data catalogs that span everything from traditional file and object systems to enterprise applications such as Google Drive, Salesforce, Microsoft Office and SharePoint, and more besides. It supports deep metadata indexing to make data instantly discoverable. All of this unstructured information, the company says, can be moved without custom scripts or data transformations and fed into Vast’s InsightEngine and DataEngine platforms, which optimize it for AI applications and agentic workloads. As an added benefit, companies should see much lower costs, as Vast SyncEngine essentially replaces existing data transformation and migration tools.
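For readers unfamiliar with the pattern, the sketch below shows the generic catalog-then-search workflow that SyncEngine automates: crawl an object store, index each object’s metadata, and query the catalog instead of the storage system. It uses stock boto3 and SQLite purely for illustration; this is not Vast’s API, which does the same job at far larger scale against the Vast DataBase with no custom scripting.

```python
# Conceptual sketch of catalog-then-search using stock boto3 and
# SQLite. This is not Vast's API; SyncEngine performs the same job at
# scale against the Vast DataBase with no custom scripting.
import sqlite3
import boto3

s3 = boto3.client("s3", endpoint_url="https://s3.example.com")  # placeholder

db = sqlite3.connect("catalog.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS objects "
    "(bucket TEXT, key TEXT, size INTEGER, modified TEXT)"
)

# Ingest: page through a bucket and record metadata for every object.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="research-data"):
    for obj in page.get("Contents", []):
        db.execute(
            "INSERT INTO objects VALUES (?, ?, ?, ?)",
            ("research-data", obj["Key"], obj["Size"],
             obj["LastModified"].isoformat()),
        )
db.commit()

# Query the catalog, not the storage system: e.g. files over 1 GB
# under a given prefix, answered without touching the objects.
rows = db.execute(
    "SELECT key, size FROM objects "
    "WHERE key LIKE 'imaging/%' AND size > 1000000000"
).fetchall()
print(rows)
```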
FICO’s new platform “households” disparate data, catalogs lineage, and turns time‑based feature combinations into predictive lift for risk, fraud, and marketing decisions
FICO is evolving into a data analytics powerhouse, introducing an AI-driven platform designed to streamline financial operations while safeguarding the very data that fuels it. Managing financial data through one platform layer can be like looking at a “spiderweb,” according to Bill Waid, chief product and technology officer. Clients rely on data from different sources while trying to avoid redundancy and create linkage between pieces of the same data. “We have our own proprietary way of actually identifying linkage and householding,” Waid said. “And actually being able to bring together and unify that data so that you have common contexts for each of them, but we also provide an open-source way of doing that where they might be done by the data partner itself.” FICO helps its users create a core data catalog, from which they can derive predictive reasoning through AI features. The goal is collecting data and harnessing AI for a specific, concrete goal, Waid emphasized. “Just collecting data isn’t actually a good business outcome,” he said. “A good business outcome is that I’ve used that data … because very often the predictiveness isn’t in the data itself, it’s in some combination of those data elements over time.”
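Waid’s last point, that lift often comes from combining data elements over time rather than from any single field, is easy to illustrate. The sketch below derives a 90-day rolling spend per household from raw transactions, a typical time-based feature; the column names are hypothetical, and this is generic pandas, not FICO’s platform.

```python
# A minimal sketch of a time-based feature combination with generic
# pandas. Column names ("household_id", "amount") are hypothetical;
# this illustrates why combined-over-time signals can carry more
# predictive lift than raw fields, not FICO's implementation.
import pandas as pd

tx = pd.DataFrame({
    "household_id": [1, 1, 1, 2, 2],
    "date": pd.to_datetime(
        ["2025-01-02", "2025-01-20", "2025-02-15", "2025-01-05", "2025-03-01"]
    ),
    "amount": [120.0, 80.0, 300.0, 50.0, 45.0],
})

# Sort by time and index on the date so time-window rolling works.
tx = tx.sort_values("date").set_index("date")

# 90-day rolling spend per household: a derived feature that combines
# "amount" with time, rather than using any single field directly.
tx["spend_90d"] = (
    tx.groupby("household_id")["amount"]
    .transform(lambda s: s.rolling("90D").sum())
)
print(tx)
```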
Consumer AI agents and zero‑party data will shift loyalty from discounts to context, as brands operationalize per‑app actions and segment‑specific perks such as Chipotle’s student‑authenticated rewards
If there’s one cohort that understands the value of their data, it is Gen Z. And if the value exchange is strong and they trust your brand (emphasis on brand trust), then they will be the first in the pool to share coveted zero-party data with that trusted brand. Zero-party data is the information customers willingly volunteer about their preferences, needs, and context. Gen Z’s hope is that brands will actually listen and enhance their experiences – “think training their algorithms.” Smart programs use onboarding and ongoing touchpoints to ask simple, thoughtful questions—and then act on them immediately. e.l.f. Beauty Squad and My Purina stand out for making the exchange obvious: tell us what you want, and we’ll make the experience better right away. This isn’t about surveys; it’s about creating a dialogue. Brands that do this well build trust faster and generate more relevant offers without relying on guesswork. When programs like Marriott Bonvoy and Starbucks connect memberships and recognize the shared customer while they’re actually on the road, the benefit lands in the moment, not in a quarterly statement. Forrester Principal Analyst John Pedini’s forward look is compelling—consumers will soon bring their own AI agents that assemble itineraries around “non-negotiables” and flex on everything else. Pedini flagged Chipotle Rewards U, a student-authenticated extension calibrated to campus life. It’s segment-specific by design—game days, finals, and perks that feel earned, not generic—and it signals a broader shift: loyalty moving from linear discounts to contextual relevance. In my own coverage of travel and experience brands, the programs that grow faster are the ones that make the next decision feel obvious.
Cisco-Qumulo platform streams OpenTelemetry data directly into Splunk Observability Cloud while consolidating billions of files across edge-core-cloud environments
Qumulo has announced a strategic partnership with Cisco to deliver a unified data platform spanning edge, core, and cloud for enterprise organizations, with a keen focus on enabling enterprise-wide observability. The partnership combines Qumulo’s Data Platform with Cisco Unified Computing Systems (UCS) to help enterprises consolidate decades of stranded file and object data and unlock its value for AI, analytics, and observability. The partnership creates a single, software-defined data fabric that scales from edge environments to exabyte-scale data centers and across all major public clouds. The initial focus is on two critical enterprise workloads:

Unified Platform for Splunk Observability Cloud: Qumulo’s Data Platform natively integrates with Splunk Observability Cloud, streaming OpenTelemetry data directly into observability pipelines. Enterprises can deploy Splunk Observability Cloud at scale on Cisco UCS systems, spanning modular compute, high-performance all-flash, dense hybrid flash, and cost-effective HDD UCS platforms. This turnkey approach simplifies deployment and creates a secure and reliable foundation for cybersecurity, situational awareness, unified visibility, and real-time troubleshooting across all environments.

Unlocking Trapped Data for AI: The joint solution enables the consolidation of billions of files and petabytes-to-exabytes of unstructured data into a single, globally consistent network-attached namespace. This eliminates decades of stranded data, giving AI systems complete and instant access to the information needed for more accurate models and better outcomes. Key workloads and data include medical imaging, signals intelligence, autonomous driving telemetry, life sciences research, geospatial mapping and imagery, video surveillance, and enterprise document management.

When enterprise data is unified onto Qumulo and Cisco UCS, the Qumulo Data Platform natively streams OpenTelemetry data directly into Splunk Observability Cloud, which is also OpenTelemetry-native, helping strengthen enterprise data ownership. This not only creates a turnkey foundation for cybersecurity, situational awareness, and enterprise observability; it also enriches the observability dataset. Organizations benefit from a stronger security posture through Splunk Observability Cloud’s real-time visibility and troubleshooting, driving faster anomaly detection and helping strengthen resilience against threats to sovereign data.
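Because both sides of the integration are OpenTelemetry-native, the plumbing resembles standard OTLP export. The sketch below emits a metric through the OpenTelemetry Python SDK to an OTLP endpoint; the endpoint, meter name, and attributes are placeholders, and in the joint solution the Qumulo platform streams this telemetry itself, typically toward a collector fronting Splunk Observability Cloud.

```python
# A minimal sketch of OTLP metric export with the OpenTelemetry Python
# SDK. Endpoint, meter name, and attributes are placeholders; this is
# generic OpenTelemetry, not Qumulo's internal implementation.
from opentelemetry import metrics
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader

# Export metrics over OTLP/gRPC to a (placeholder) collector endpoint.
exporter = OTLPMetricExporter(endpoint="http://localhost:4317", insecure=True)
reader = PeriodicExportingMetricReader(exporter)
metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))

meter = metrics.get_meter("storage.telemetry")  # hypothetical meter name
files_ingested = meter.create_counter(
    "files.ingested", description="Files consolidated onto the platform"
)

# Record a data point; the reader ships it to the OTLP endpoint
# periodically in the background.
files_ingested.add(1024, {"tier": "all-flash"})
```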
Precisely’s Data Integrity Suite uses AI code generation to automatically convert business users’ natural language descriptions into custom data quality rules; the Suite is also LLM-agnostic
Precisely announced new enhancements to the Precisely Data Integrity Suite, designed to further streamline and advance data quality across the enterprise. The latest release introduces powerful AI-driven features that enable organizations to operationalize high-quality, AI-ready data even more quickly and intelligently. With intuitive no-code interfaces, both technical practitioners and business users can now more easily contribute to data quality efforts. Meanwhile, expanded support for external LLMs allows enterprises to align Precisely’s trusted framework with their preferred AI models, ensuring freedom of choice, operational control, and faster time-to-value. The new capabilities include:

Natural language interfaces: Data quality tasks described in plain language are automatically converted by AI into custom code, making advanced data quality accessible to both technical and business users.

AI-powered rule and description generation: Data quality rules and descriptions created from natural language prompts reduce manual complexity and save time.

AI-generated sample data: Data produced by AI for testing purposes helps to validate logic, accelerate deployment, and increase confidence in results.

LLM integration in pipelines: Large language models (LLMs) integrated directly into transformation pipelines, with support for “bring your own LLM,” enable more intelligent automation while maximizing flexibility.
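To picture what “natural language converted into a custom rule” means in practice, here is a hypothetical rendering of such a rule as plain Python, together with sample records for testing it. Precisely’s Suite generates and executes rules inside its no-code interface, so this code is a stand-in for illustration only; the prompt, field names, and checks are invented.

```python
# Hypothetical illustration only. Prompt: "Flag records where the
# order date is in the future or the email is missing a domain."
# Precisely's Suite generates and runs such rules in its no-code
# interface; this Python rendering is an invented stand-in.
from datetime import date

def order_record_is_valid(record: dict) -> bool:
    """A generated-style rule combining two plain-language checks."""
    order_date_ok = record["order_date"] <= date.today()
    email = record.get("email", "")
    email_ok = "@" in email and "." in email.split("@")[-1]
    return order_date_ok and email_ok

# Sample records for testing the rule (cf. the Suite's AI-generated
# sample data): the first should pass, the second should fail.
samples = [
    {"order_date": date(2025, 1, 15), "email": "ana@example.com"},
    {"order_date": date(2999, 1, 1), "email": "bob@nodomain"},
]
print([order_record_is_valid(r) for r in samples])  # [True, False]
```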
CData’s MCP platform enables AI assistants to query Salesforce, SAP, Google Cloud and 300+ sources with governed access, reducing report-building time from weeks to seconds through semantic understanding and SQL translation
CData Software announced Connect AI, the first managed Model Context Protocol (MCP) platform that integrates AI assistants, agent orchestration platforms, AI workflow automation, and embedded AI applications with more than 300 enterprise data sources. With governed, in-place access to enterprise data, Connect AI preserves data semantics and relationships, giving AI a complete understanding of the context. The solution also inherits user permissions and authentication directly from the source and can be deployed in the cloud or embedded within software products in minutes with point-and-click configuration. Connect AI takes the same enterprise-grade connectivity technology already embedded into the offerings of top technology companies, including Palantir, SAP, Salesforce Data Cloud, and Google Cloud, and reimagines it specifically for AI workloads with real-time semantic integration capabilities. Connect AI solves two core challenges. First, through data-in-place access, it preserves the rich contextual relationships that AI agents need for intelligent decision-making, delivering both immediate data access and meaningful data understanding. Second, it inherits the existing security and authentication protocols set in the source system, ensuring AI access remains aligned with organizational controls. Data access is logged under the identity of the authenticated user or agent for comprehensive governance, and additional AI controls can be layered and managed within Connect AI. Enterprises use Connect AI with AI apps to get contextually aware answers from business data in seconds, replacing work that previously required days or weeks of report building. Its ability to handle complex queries across diverse systems with semantic understanding enables sales teams to use Claude for pipeline insights, marketing teams to prompt ChatGPT for campaign analysis, and finance teams to rely on Copilot for real-time budget updates and financial reports. ISVs embed Connect AI directly within their products to provide their end users with self-service integration between their data sources and the ISV’s agentic capabilities.
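Stripped of the managed-service layer, the underlying pattern is: translate a user’s question into SQL against the live source and execute it under that user’s own credentials. The sketch below shows that flow generically with pyodbc; the DSN name and the translate_to_sql helper are hypothetical, and Connect AI performs these steps, plus governance and audit logging, as a managed service rather than client-side code.

```python
# Generic sketch of question-to-governed-SQL, not CData's API. The DSN
# and translate_to_sql are hypothetical stand-ins.
import pyodbc

def translate_to_sql(question: str) -> str:
    """Hypothetical stand-in for the semantic layer that maps a
    question onto the source's real tables and relationships."""
    # A real system would use an LLM plus source metadata; this toy
    # version handles exactly one question shape.
    assert "pipeline" in question
    return (
        "SELECT StageName, SUM(Amount) AS total "
        "FROM Opportunity GROUP BY StageName"
    )

# Connecting under the user's own identity means the source system's
# permissions still apply to everything the AI asks for.
conn = pyodbc.connect("DSN=Salesforce_Connect_AI")  # hypothetical DSN
cursor = conn.cursor()
cursor.execute(translate_to_sql("What does my pipeline look like by stage?"))
for row in cursor.fetchall():
    print(row.StageName, row.total)
```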
ParadeDB’s open-source Postgres extension facilitates full-text search and analytics directly in Postgres without the need to transfer data to a separate source and can support heavy workloads that require frequent updating
ParadeDB is an open-source Postgres extension that facilitates full-text search and analytics directly in Postgres, without users needing to transfer data to a separate source. The platform integrates with other data infrastructure tools, including Google Cloud SQL, Azure Postgres, and Amazon RDS, among others. “Postgres is becoming the default database of the world, and you still can’t do good search over that information, believe it or not,” said Philippe Noël, co-founder and CEO of ParadeDB. ParadeDB isn’t the first company to try to solve Postgres search. Noël said that Elasticsearch works by moving data back and forth between itself and Postgres, which can work, but that approach isn’t great for heavy workloads or processes that require frequent updating. “That breaks all the time,” Noël said. “The two databases are not meant to work together. There’s a lot of compatibility issues, there’s a lot of latency issues, higher costs, and all of that deteriorates the user experience.” ParadeDB claims to eliminate many of those challenges by building directly on top of Postgres as an extension, with no data transfer required.
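Here is a short sketch of what search-in-Postgres looks like from application code, driven via psycopg2. The table and connection string are illustrative; the BM25 index type and the @@@ search operator come from ParadeDB’s pg_search extension, whose exact syntax may differ by version, so treat this as a sketch rather than copy-paste.

```python
# A sketch of search-in-Postgres with ParadeDB's pg_search extension,
# driven from psycopg2. Table and connection string are illustrative;
# check ParadeDB's docs for the current index/operator syntax.
import psycopg2

conn = psycopg2.connect("dbname=shop")  # assumes pg_search is installed
cur = conn.cursor()

# Build a BM25 full-text index directly on the Postgres table -- no
# copy of the data into a separate search engine.
cur.execute(
    "CREATE INDEX IF NOT EXISTS items_search_idx ON items "
    "USING bm25 (id, description) WITH (key_field='id')"
)
conn.commit()

# Search stays inside Postgres, so freshly updated rows are
# immediately searchable without a sync pipeline.
cur.execute(
    "SELECT id, description FROM items WHERE description @@@ %s",
    ("wireless keyboard",),
)
print(cur.fetchall())
```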
