1. DB2-MQ-Kafka to S2 - Duplicate entry for key 'PRIMARY'
2. Bad SQL causing ORA-00913: too many values
3. DB2-MQ-Kafka to S2 - Update failing with duplicate key error after successful insert
4. Fix parsing of connection stage config
5. Support XML
6. Support multiple replay strategies for BigQuery as target
7. Support MySQL RDS password rotation (Chegg)
8. Support auto-reinit if DDL fails
9. Upgrade S3 SDK version to 2.*
10. MySQL index creation error
11. Option to add a prefix to topic names
12. Db2 Kafka - Kafka logging improvements
13. Option to allow dot (.) as a delimiter in Kafka topic names (see the first sketch after this list)
14. Fix system generated split-key and delta-snapshot-key handling for HANA as source
15. Initialize ASE source fields in target JSON format log (for Kafka and Redis Stream)
16. Transforms not applying when src-schemas is used
17. Verbose logging of delta tables
18. AWS Secrets Manager - error parsing secretKeyPath containing '/'
19. Set csv-publish-method: READ as the default when column transformations are used
20. DB2-MQ-Kafka - Update on timestamp column not working
21. Enable compression-type: GZIP for MemSQL/SingleStore bulk load (see the second sketch after this list)
22. Redis Stream: Improve logging under the snapshot config log-row-level-errors
23. Fix NPE in GLOBAL replay consistency validation code
24. Fix issue where we only allow connecting to a named instance by port
25. Fix for SQL Server change tracking and re-enable CI/CD tests for SQL Server
26. Fix SQL Server fetchObjectIds() function to use the schema name when filtering tables
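
Items 11 and 13 both concern how target Kafka topic names are generated. A minimal sketch of what such applier settings could look like is shown below; the key names `topic-prefix` and `topic-delimiter` are hypothetical placeholders used only for illustration, not confirmed configuration options, and the actual keys and file layout depend on the product's Kafka applier configuration.

```yaml
# Hypothetical Kafka applier settings -- key names are illustrative only.
# With these values a table repl.public.orders might be published to a
# topic named "cdc.repl.public.orders" instead of "repl_public_orders".
topic-prefix: "cdc."     # optional prefix prepended to every generated topic name (item 11)
topic-delimiter: "."     # allow dot as the delimiter in topic names (item 13)
```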
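Items 19 and 21 reference two applier settings, csv-publish-method and compression-type, whose names and values appear in the items themselves; the surrounding layout below is only an assumed sketch of a SingleStore/MemSQL applier configuration, not the documented file format.

```yaml
# Sketch of a SingleStore/MemSQL applier config -- only the two keys and
# values named in items 19 and 21 come from these release notes; the
# bulk-load block structure is an assumption for illustration.
bulk-load:
  enable: true
  compression-type: GZIP      # item 21: GZIP-compressed bulk load files
csv-publish-method: READ      # item 19: default when column transformations are used
```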