I initially create an Iceberg table with two columns, a long id and a string. Later I try to overwrite the table with an extra column of type date added in the middle, and the operation errors out. I have attached the code and the error log below:

import org.apache.spark.sql.functions._
spark.range(10).withColumn("tmp", lit("hi")).writeTo("test.sample").using("iceberg").tableProperty("write.spark.accept-any-schema", "true").createOrReplace() // creating the table for the first time: id (long), tmp (string)
spark.range(10).withColumn("cur_date", current_date).withColumn("tmp", lit("hi")).writeTo("test.sample").using("iceberg").tableProperty("write.spark.accept-any-schema", "true").option("mergeSchema", "true").createOrReplace() // replacing the table with cur_date (date) added between id and tmp

Caused by: org.apache.hadoop.hive.metastore.api.InvalidOperationException: The following columns have types incompatible with the existing columns in their respective positions : cur_date
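
To make the positional complaint concrete, this is how I read the two schemas involved (the comments below are reconstructed from the two writes above, not captured output):

spark.table("test.sample").printSchema() // existing table: id (long), tmp (string)
spark.range(10).withColumn("cur_date", current_date).withColumn("tmp", lit("hi")).printSchema() // incoming dataframe: id (long), cur_date (date), tmp (string)
// At position 2 the column changes from tmp: string to cur_date: date, which appears to be
// the per-position type check the metastore error above is reporting.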

Version Details

Spark: 3.4.1.3.3.6.4-7
Scala: 2.12.17
Java: 17.0.14
iceberg-spark-runtime: iceberg-spark-3.4_2.12-1.4.3.3.3.6.4-7.jar

As per the createOrReplace documentation, the existing table's schema, configuration, and data should be replaced with those of the dataframe, so I did not expect a compatibility check against the old table's columns.
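
For clarity, this is the schema I expected the table to end up with after the second createOrReplace (expected outcome only, not observed output):

spark.table("test.sample").printSchema() // expected: id (long), cur_date (date), tmp (string)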
