Snowflake plugs PostgreSQL into its AI Data Cloud
Briefly

""Say you want to build an app on data that is in Snowflake, but if that app doesn't have a relational OLTP [online transaction processing] database to store the data, they have to go break out of the boundary," Snowflake EVP of product Christian Kleinerman told The Register. "With the PostgreSQL service, our goal is to provide this secure boundary where, if customers build apps or build agents within that boundary, their data has not left the compliance and regulatory perimeter for Snowflake.""
""The service relies on pg_lake, a set of open source PostgreSQL extensions that allow developers and data engineers to read and write directly to Apache Iceberg tables from PostgreSQL, thereby cutting out the need to extract and move data. Iceberg is an open table format that proponents say lets users bring their preferred analytics engines to their data without moving it. It is widely used and supported across the cloud and data platform ecosystem, including by Snowflake, Google, AWS, and others.""
PostgreSQL now runs natively in Snowflake's AI Data Cloud, enabling applications, AI agents, analytics, recommendations, and forecasting to operate on operational data without separate pipelines. The service provides full compatibility with open source PostgreSQL so existing apps can migrate without code changes. It uses pg_lake extensions to read and write directly to Apache Iceberg tables from PostgreSQL, removing the need to extract and move data between transactional and analytical systems. The integration aims to keep data inside a single compliance and regulatory perimeter and reduce vendor and pipeline complexity. Snowflake had previously introduced a transactional capability called Unistore.
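To illustrate the compatibility claim, here is a minimal sketch of an existing PostgreSQL application talking to the service with a standard driver (psycopg2). The host, database, credentials, and table names are placeholders, not real Snowflake endpoints or schemas; the point is that because the service is wire-compatible with open source PostgreSQL, ordinary client code like this is expected to work without changes.

```python
import psycopg2

# Placeholder connection details; in practice these would come from the
# Snowflake-hosted PostgreSQL service's connection settings.
conn = psycopg2.connect(
    host="example-account.postgres.example.com",  # hypothetical host
    dbname="app_db",                               # hypothetical database
    user="app_user",
    password="...",                                # real credentials required
    port=5432,
)

with conn, conn.cursor() as cur:
    # Plain parameterized SQL against an OLTP table; no vendor-specific syntax.
    cur.execute(
        "SELECT order_id, status FROM orders WHERE customer_id = %s",
        (42,),
    )
    for order_id, status in cur.fetchall():
        print(order_id, status)

conn.close()
```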
Read at The Register