Snowflake announced the general availability of Openflow, a fully managed data ingestion service that pulls virtually any type of data from virtually any source, streamlining the work of mobilizing information for rapid AI deployment. Powered by Apache NiFi, Openflow uses connectors, prebuilt or custom, together with Snowflake's embedded governance and security. Whether the input is unstructured multimodal content from Box or a real-time event stream, Openflow plugs in, unifies, and makes all data types readily available in Snowflake's AI Data Cloud.

Snowflake has long offered ingestion options such as Snowpipe for streaming and individual connectors, but Openflow is positioned as a "comprehensive, effortless solution for ingesting virtually all enterprise data." Snowpipe and Snowpipe Streaming remain a key foundation for customers bringing data into Snowflake, and they focus on the "load" step of the ETL process. Openflow, by contrast, handles the extraction of data directly from source systems and then performs the transform and load steps. It is also integrated with Snowflake's new Snowpipe Streaming architecture, so data can be streamed into Snowflake as soon as it is extracted. This ultimately unlocks new use cases where AI can analyze a complete picture of enterprise data, including documents, images, and real-time events, directly within Snowflake. Once insights are extracted, they can be written back to the source system through the same connector.

Openflow currently supports 200+ ready-to-use connectors and processors, and creating a new connector takes just a few minutes, speeding up time to value. Users also get security features such as role-based authorization, encryption in transit, and secrets management to keep data protected end to end. As a next step, Snowflake aims to make Openflow the backbone of real-time, intelligent data movement across distributed systems, powering the age of AI agents.
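To make the division of labor concrete, here is a minimal, purely illustrative sketch of the extract-transform-load split described above: an extract stage pulling from a source system, a transform stage normalizing the data, and a load stage appending rows to a destination (the role Snowpipe Streaming plays). None of these names are Openflow or Snowflake APIs; everything below is a hypothetical stand-in for the pattern, not the product's interface.

```python
# Hypothetical ETL sketch; not an Openflow or Snowflake API.
from dataclasses import dataclass


@dataclass
class Record:
    source: str
    payload: dict


def extract(source_events):
    """Extract: pull raw events from a source system (here, an in-memory list)."""
    return [Record(source="demo", payload=e) for e in source_events]


def transform(records):
    """Transform: normalize keys to lowercase and drop empty payloads."""
    return [
        Record(r.source, {k.lower(): v for k, v in r.payload.items()})
        for r in records
        if r.payload
    ]


def load(records, table):
    """Load: append rows to a destination table (stand-in for the streaming load)."""
    table.extend(r.payload for r in records)
    return len(records)


table = []
events = [{"ID": 1, "Type": "click"}, {}, {"ID": 2, "Type": "view"}]
loaded = load(transform(extract(events)), table)
print(loaded)    # 2
print(table[0])  # {'id': 1, 'type': 'click'}
```

In this framing, a load-only service corresponds to just the `load` stage, while an end-to-end ingestion service owns all three stages and streams the result into the destination as soon as extraction completes.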