
[BUG] No Delta transaction is committed to target Delta table #4218

Open
2 of 8 tasks
dwangatt opened this issue Mar 4, 2025 · 0 comments
Labels
bug Something isn't working

Comments

dwangatt commented Mar 4, 2025

Bug

Which Delta project/connector is this regarding?

  • Spark
  • Standalone
  • Flink
  • Kernel
  • Other (fill in here)

Describe the problem

We have multiple Flink jobs that cannot commit data to a Delta table. The jobs run without any exception, and checkpoints complete successfully, but when querying the target Delta table, no data exists in it.
After investigating further, we found that Parquet files are created on S3 (the Delta table is stored on S3), but no new Delta transaction is added to the Delta log.
The Delta table itself is OK.
We use Flink 1.20 and io.delta:delta-flink:3.3.0.

Steps to reproduce

  1. Launch the job, which reads from Kinesis and appends transformed data into Delta tables. There are multiple Delta sinks. We use forRowData to create each Delta sink, with one partition column and schema merging enabled.
  2. Leave the job running. For the first several Flink checkpoints, we can see corresponding commit records in the Delta log.
  3. After running for a while, the job still runs fine, but we cannot query new data in the target Delta tables, and there are no corresponding commit records in the Delta log.
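
The sink setup from step 1 can be sketched roughly as follows. This is a minimal sketch, not our actual job: the table path, partition column name, and row type are hypothetical placeholders, but the builder calls (`forRowData`, `withPartitionColumns`, `withMergeSchema`) are the ones we use from io.delta:delta-flink:

```java
import org.apache.flink.core.fs.Path;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.types.logical.RowType;
import org.apache.hadoop.conf.Configuration;
import io.delta.flink.sink.DeltaSink;

// rowType describes the schema of the RowData records produced upstream.
// "s3://bucket/delta-table" and "event_date" are hypothetical placeholders.
DeltaSink<RowData> deltaSink = DeltaSink
        .forRowData(
                new Path("s3://bucket/delta-table"), // target Delta table root
                new Configuration(),                 // Hadoop conf (S3 credentials etc.)
                rowType)                             // RowType of the incoming records
        .withPartitionColumns("event_date")          // one partition column
        .withMergeSchema(true)                       // schema merge enabled
        .build();

// Attached to the transformed stream; the job has several sinks like this.
transformedStream.sinkTo(deltaSink);
```

With this setup, the connector's committer is expected to write a new Delta log entry on every successful checkpoint.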

Observed results

We checked the logs but did not find anything significant. The only thing we noticed is that the Flink checkpoint grows bigger and bigger, and we can see many Parquet file paths accumulating in the checkpoint data.

Expected results

For each Flink checkpoint, the data should be committed to the Delta table.

Further details

Environment information

  • Delta Flink connector version: 3.3.0
  • Spark version: N/A
  • Scala version: 2.12
  • Flink version: 1.20

Willingness to contribute

The Delta Lake Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the Delta Lake code base?

  • Yes. I can contribute a fix for this bug independently.
  • Yes. I would be willing to contribute a fix for this bug with guidance from the Delta Lake community.
  • No. I cannot contribute a bug fix at this time.
dwangatt added the bug label on Mar 4, 2025