It's no news to anyone in AI that data is the real prize. If you have solid data foundations, the models and applications they feed will be right on the money.
But that's where things get messy. Building this foundation is no small feat, especially when there are dozens of data sources, each holding valuable information. Teams must build and maintain integration pipelines for every source, a massive engineering burden for data teams juggling disparate ETL tools just to centralize what's needed to feed AI workloads. At scale, these pipelines become rigid bottlenecks: difficult to adapt, extend or evolve.
Snowflake thinks it has an answer.
Today, at its annual Summit, the company announced the general availability of OpenFlow, a fully managed data ingestion service that pulls any type of data from virtually any source, streamlining how information is mobilized for rapid AI deployment.
Powered by Apache NiFi, OpenFlow uses connectors, either prebuilt or custom, with Snowflake's governance and security built in. Whether it's unstructured multimodal content from Box or real-time event streams, OpenFlow plugs in, unifies and makes every data type available in Snowflake's AI Data Cloud.
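Snowflake has not published OpenFlow's connector interfaces as part of the announcement, but the pattern a connector automates, extracting records from a source system and landing them in a governed Snowflake table, can be sketched with the standard snowflake-connector-python client. In the rough example below, the source_records() generator, credentials and table name are illustrative placeholders, not OpenFlow code.

```python
# Minimal sketch of the extract-and-load pattern a connector automates,
# written against the real snowflake-connector-python client. The fake
# source_records() generator, credentials and table name are placeholders.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def source_records():
    """Hypothetical stand-in for a source system (SaaS API, database, file share)."""
    yield {"ID": 1, "DOC_TYPE": "contract", "BODY": "Renewal terms for FY25 ..."}
    yield {"ID": 2, "DOC_TYPE": "ticket", "BODY": "Customer reported sync errors ..."}


conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="INGEST_WH",
    database="RAW",
    schema="DOCS",
)

# Extract, then land the records as a table Snowflake can govern and query.
df = pd.DataFrame(list(source_records()))
write_pandas(conn, df, table_name="SOURCE_DOCUMENTS", auto_create_table=True)
conn.close()
```

OpenFlow's pitch, in essence, is to run, scale and govern hundreds of variations of this loop so teams don't have to hand-write each one.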
“Data engineers have often faced a critical trade-off: if they wanted highly controllable pipelines, they ran into significant complexity and infrastructure management. If they wanted a simple solution, they ran into problems with privacy, flexibility and limited customization,” said Snowflake's Chris Child.
While Snowflake already offered ingestion options such as Snowpipe for streaming and individual connectors, OpenFlow is positioned as a “complete and effortless solution to ingest practically all business data.”
“Snowflake's Snowpipe and Snowpipe Streaming are a key foundation for customers bringing data into Snowflake, and they focus on the ‘load’ of the ETL process. OpenFlow, on the other hand, handles extracting data directly from source systems, then performs the processing and loads the data,” Child explained.
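For context, a Snowpipe covers only that “load” step: it copies files that arrive in a stage into a table, and getting data out of the source system into the stage is left to other tooling. A minimal sketch using real Snowpipe DDL through the Python connector, with every object name assumed:

```python
# A Snowpipe handles the "L": it copies files that land in a stage into a
# table. Getting data out of the source system and into that stage is the
# step OpenFlow claims for itself. Object names here are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS raw.docs.events_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO raw.docs.events
      FROM @raw.docs.events_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")
conn.close()
```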
That end-to-end approach ultimately unlocks new use cases where AI can analyze a complete picture of business data, including real-time documents, images and events, directly inside Snowflake. Once insights are extracted, they can even be written back to the source system through the connector.
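The announcement doesn't spell out what analyzing that data “directly in Snowflake” looks like in practice, but Snowflake's existing Cortex SQL functions give one plausible flavor: once documents have landed, they can be summarized in place without moving them again. The sketch below reuses the hypothetical table from the earlier example; SNOWFLAKE.CORTEX.SUMMARIZE is a real Cortex function, though model availability varies by region.

```python
# Once OpenFlow (or any pipeline) has landed the documents, an LLM can work
# on them in place. The table is the hypothetical one from the first sketch.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="INGEST_WH",
)

cur = conn.cursor()
cur.execute("""
    SELECT id,
           doc_type,
           SNOWFLAKE.CORTEX.SUMMARIZE(body) AS summary
    FROM raw.docs.source_documents
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)
conn.close()
```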
OpenFlow currently supports more than 200 ready-to-use connectors and processors, covering services such as Box, Google Ads, Microsoft SharePoint, Oracle, Salesforce Data Cloud, Workday and Zendesk.
“Box's integration with Snowflake OpenFlow… handles data extraction from Box using Box AI, honors original permissions for secure access and feeds that data into Snowflake for analysis. It also enables a bidirectional flow in which enriched insights or metadata can be written back into Box, making the content smarter over time,” Ben Kus, CTO at Box, told VentureBeat.
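Box has not detailed the connector's internals, so the sketch below only approximates the round trip Kus describes using the public Box Python SDK: list the files the authenticated token is allowed to see (Box enforces its own permissions server-side) and write an enriched description back. The credentials, folder ID and summarize() helper are assumptions; in the scenario above, the enrichment itself would come from analysis run in Snowflake.

```python
# Rough illustration of the bidirectional flow described above, using the
# public boxsdk package rather than the OpenFlow connector itself.
import os

from boxsdk import Client, OAuth2

oauth = OAuth2(
    client_id=os.environ["BOX_CLIENT_ID"],
    client_secret=os.environ["BOX_CLIENT_SECRET"],
    access_token=os.environ["BOX_ACCESS_TOKEN"],
)
client = Client(oauth)


def summarize(raw: bytes) -> str:
    """Placeholder for the enrichment step (in practice, analysis run in Snowflake)."""
    return raw[:120].decode("utf-8", errors="ignore")


# Box only returns items this token is permitted to see, so the original
# permissions are honored without extra work on the caller's side.
for item in client.folder(folder_id="0").get_items(limit=10):
    if item.type == "file":
        summary = summarize(client.file(item.id).content())
        # Write the enriched insight back to Box as the file's description.
        client.file(item.id).update_info(data={"description": summary})
```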
Creating new connectors takes just a few minutes, accelerating time to value. Users also get security features such as role-based access controls, encryption in transit and secrets management to keep data protected end to end.
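Snowflake didn't go deeper on OpenFlow's security model than that list, but the same principles are already visible in how teams use the standard Python connector today: keep secrets out of code, pin the session to a least-privilege role and rely on the connector's default TLS for encryption in transit. A brief sketch, with the role and table names assumed:

```python
# Same principles with the standard connector: secrets come from the
# environment (or a secrets manager), the session is pinned to a narrowly
# scoped role, and traffic is encrypted with TLS by default. The role and
# table names are assumptions.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_INGEST_USER"],
    password=os.environ["SNOWFLAKE_INGEST_PASSWORD"],  # injected at runtime, never hard-coded
    role="INGEST_WRITER",  # least-privilege role with write access to the landing schema only
    warehouse="INGEST_WH",
)

# Any statement outside the role's grants is rejected, which is the point.
conn.cursor().execute(
    "INSERT INTO raw.docs.pipeline_audit (event) VALUES ('ingest_run_started')"
)
conn.close()
```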
“Organizations that require real-time data integration, process high volumes of data from diverse sources, or rely on unstructured data such as images, audio and video stand to draw the most value from OpenFlow,” Child added. A retailer, for example, could unify siloed sales, e-commerce, CRM and social media data to deliver personalized experiences and optimize operations.
Snowflake customers Irwin, Securonix and WorkWave are among those preparing to use OpenFlow to move and mobilize data globally, although the company has not disclosed exact adoption numbers.
As a next step, Snowflake aims to make OpenFlow the backbone of intelligent, real-time data movement across distributed systems, powering the age of AI agents.
“We are focused on moving events at scale and enabling real-time, bidirectional and agent-to-agent communication, so that insights and actions flow seamlessly across distributed systems,” Child said.
The timeline for those upgrades, however, remains unclear for now.