#change-data-capture

DevOps
from InfoQ
1 day ago

Netflix Automates RDS PostgreSQL to Aurora PostgreSQL Migration Across 400 Production Clusters

Netflix automated RDS to Aurora PostgreSQL migrations across 400 production clusters through infrastructure-level orchestration, eliminating manual intervention while maintaining data integrity and CDC pipeline correctness.
Software development
from InfoQ
1 week ago

MySQL 9.6 Changes Foreign Key Constraints and Cascade Handling

MySQL 9.6 moves foreign key constraint and cascade management from the InnoDB storage engine to the SQL layer, improving CDC pipeline accuracy and data consistency across replication and analytics workloads.
Data science
from InfoQ
1 week ago

Pinterest's CDC-Powered Ingestion Slashes Database Latency from 24 Hours to 15 Minutes

Pinterest deployed a next-generation database ingestion framework using CDC, Kafka, Flink, Spark, and Iceberg to reduce data latency from 24+ hours to minutes while processing only changed records.
Data science
from Medium
2 months ago

Migrating from Historical Batch Processing to Incremental CDC Using Apache Iceberg (Glue 4...

Use Apache Iceberg Copy-on-Write tables in AWS Glue 4 to migrate from full historical batch reprocessing to incremental CDC, reducing redundant computation, I/O, and costs.
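The batch-to-incremental shift described here comes down to merging only changed rows into the target table instead of reprocessing full history each run. A minimal, engine-agnostic sketch in plain Python, standing in for what an Iceberg `MERGE INTO` on a Copy-on-Write table would express (the `id` primary key and the `deleted` flag are illustrative assumptions, not the article's schema):

```python
# Incremental CDC apply: merge only the records that changed since the last
# checkpoint, rather than recomputing the whole table. On Iceberg this is
# roughly:  MERGE INTO target t USING changes c ON t.id = c.id
#           WHEN MATCHED AND c.deleted THEN DELETE
#           WHEN MATCHED THEN UPDATE SET * WHEN NOT MATCHED THEN INSERT *

def merge_incremental(target: dict, changes: list) -> dict:
    """Upsert or delete each changed record by primary key."""
    for change in changes:
        key = change["id"]
        if change.get("deleted"):
            target.pop(key, None)           # tombstone: drop the row
        else:
            # store the row without the bookkeeping flag
            target[key] = {k: v for k, v in change.items() if k != "deleted"}
    return target

target = {1: {"id": 1, "amount": 10}}       # state after the previous run
changes = [
    {"id": 1, "amount": 25},                # update to an existing row
    {"id": 2, "amount": 5},                 # newly inserted row
]
merge_incremental(target, changes)
```

The cost saving is exactly this asymmetry: each run touches `len(changes)` records instead of the full table, which is where the reduced computation, I/O, and spend come from.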
from Techzine Global
4 months ago

Qlik Open Lakehouse is now generally available

Qlik is making Open Lakehouse generally available. The Apache Iceberg service promises real-time pipelines and automatic optimization without vendor lock-in. The solution combines change data capture (CDC) with automatic Iceberg optimization. Teams can continue to use their existing tools, including Amazon Athena, Snowflake, Spark, Trino, and Amazon SageMaker. During the preview phase, customers reported faster queries and significantly lower infrastructure costs. Qlik Open Lakehouse is now available to all Talend Cloud users.
Data science
Artificial intelligence
from InfoQ
5 months ago

From Black Box to Blueprint: Thoughtworks Uses Generative AI to Extract Legacy System Functionality

Generative AI can accelerate reverse engineering of legacy black-box systems, producing validated functional blueprints that enable lower-risk modernization.
from HackerNoon
3 years ago

Building a Real-Time Change Data Capture Pipeline with Debezium, Kafka, and PostgreSQL

The article provides a step-by-step guide to setting up a Change Data Capture (CDC) pipeline using PostgreSQL, Debezium, Apache Kafka, and Python.
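The pipeline consumes Debezium's change events from Kafka in Python; the core of such a consumer is handling Debezium's event envelope, which wraps each row change with an `op` code and `before`/`after` row images. A minimal sketch assuming Debezium's default JSON serialization (the `id` primary key and the in-memory table stand in for a real Kafka consumer and sink):

```python
import json

def apply_debezium_event(table: dict, raw: bytes) -> None:
    """Apply one Debezium change event to an in-memory table keyed by id.

    Debezium's envelope carries "op" ("c"reate, "u"pdate, "d"elete,
    "r"ead/snapshot) plus "before" and "after" row images in "payload".
    """
    payload = json.loads(raw)["payload"]
    op, before, after = payload["op"], payload["before"], payload["after"]
    if op in ("c", "u", "r"):       # insert, update, or initial snapshot read
        table[after["id"]] = after
    elif op == "d":                 # delete: only "before" is populated
        table.pop(before["id"], None)

# Replay a short change stream, as a Kafka consumer loop would
events = [
    {"payload": {"op": "c", "before": None,
                 "after": {"id": 1, "name": "a"}}},
    {"payload": {"op": "u", "before": {"id": 1, "name": "a"},
                 "after": {"id": 1, "name": "b"}}},
    {"payload": {"op": "d", "before": {"id": 1, "name": "b"},
                 "after": None}},
]
table = {}
for e in events:
    apply_debezium_event(table, json.dumps(e).encode())
print(table)  # → {} (the row was created, updated, then deleted)
```

In a real deployment the loop body would be fed by a Kafka consumer subscribed to the connector's per-table topic, and the `table` writes replaced by upserts into the downstream store.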
Data science