[SPARK-45392][SQL][FOLLOWUP] Restore the original exception from data source

### What changes were proposed in this pull request?

This is a followup of #43193 to fix a behavior change: `Class.getDeclaredConstructor().newInstance()` wraps any exception thrown by the constructor in `InvocationTargetException`. This PR unwraps it so that the original exception from the data source is still thrown.
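For context, a minimal standalone sketch of the reflection behavior being worked around (the `FailingSource` class and object names are illustrative, not Spark code):

```scala
import java.lang.reflect.InvocationTargetException

// Hypothetical class whose constructor throws, standing in for a data source
// that fails during instantiation.
class FailingSource {
  throw new IllegalArgumentException("boom")
}

object UnwrapExample extends App {
  try {
    // Reflection reports the constructor failure as InvocationTargetException...
    classOf[FailingSource].getDeclaredConstructor().newInstance()
  } catch {
    case e: InvocationTargetException =>
      // ...so e.getCause, not e itself, is the exception the constructor actually threw.
      assert(e.getCause.isInstanceOf[IllegalArgumentException])
      println(s"original exception: ${e.getCause}")
  }
}
```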

### Why are the changes needed?

To restore the original behavior: callers should see the data source's own exception rather than a reflection `InvocationTargetException` wrapper.

### Does this PR introduce _any_ user-facing change?

no

### How was this patch tested?

A new test case (`invalid data source`) was added to `DataSourceV2Suite`.

### Was this patch authored or co-authored using generative AI tooling?

no

Closes #43446 from cloud-fan/follow.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
cloud-fan authored and LuciferYang committed Oct 19, 2023
1 parent 7057952 commit eda8507
Showing 2 changed files with 19 additions and 1 deletion.
@@ -699,7 +699,13 @@ object DataSource extends Logging {
     val useV1Sources = conf.getConf(SQLConf.USE_V1_SOURCE_LIST).toLowerCase(Locale.ROOT)
       .split(",").map(_.trim)
     val cls = lookupDataSource(provider, conf)
-    cls.getDeclaredConstructor().newInstance() match {
+    val instance = try {
+      cls.getDeclaredConstructor().newInstance()
+    } catch {
+      // Throw the original error from the data source implementation.
+      case e: java.lang.reflect.InvocationTargetException => throw e.getCause
+    }
+    instance match {
       case d: DataSourceRegister if useV1Sources.contains(d.shortName()) => None
       case t: TableProvider
           if !useV1Sources.contains(cls.getCanonicalName.toLowerCase(Locale.ROOT)) =>
@@ -77,6 +77,12 @@ class DataSourceV2Suite extends QueryTest with SharedSparkSession with AdaptiveSparkPlanHelper {
     }.head
   }
 
+  test("invalid data source") {
+    intercept[IllegalArgumentException] {
+      spark.read.format(classOf[InvalidDataSource].getName).load()
+    }
+  }
+
   test("simplest implementation") {
     Seq(classOf[SimpleDataSourceV2], classOf[JavaSimpleDataSourceV2]).foreach { cls =>
       withClue(cls.getName) {
@@ -1155,3 +1161,9 @@ class ReportStatisticsDataSource extends SimpleWritableDataSource {
     }
   }
 }
+
+class InvalidDataSource extends TestingV2Source {
+  throw new IllegalArgumentException("test error")
+
+  override def getTable(options: CaseInsensitiveStringMap): Table = null
+}
