
build: change default Maven profile to Spark 4.0 / Scala 2.13 #4140

Open

andygrove wants to merge 4 commits into apache:main from andygrove:default-spark-4.0

Conversation

@andygrove
Member

Summary

  • Changes the default Maven build profile from Spark 3.5 / Scala 2.12 to Spark 4.0 / Scala 2.13
  • Updates spark-3.4 and spark-3.5 profiles to explicitly set scala.binary.version=2.12, shims.majorVerSrc=spark-3.x, and semanticdb.version=4.8.8 (previously inherited from defaults; see the profile sketch after this list)
  • Populates the scala-2.12 profile with explicit properties and empties scala-2.13 (now matches defaults)
  • Removes FIXME comment from spark-4.0 profile
  • Updates Dockerfile and Docker publish workflow to Spark 4.0 / Scala 2.13 / JDK 17
  • Updates all user guide and contributor guide documentation to reflect the new defaults
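For illustration, here is a minimal sketch of what the pinned spark-3.5 profile could look like. The property names and values (scala.binary.version, shims.majorVerSrc, semanticdb.version) come from the summary above; the surrounding XML structure is assumed, not copied from the actual Comet pom.xml:

```xml
<!-- Illustrative sketch only, not the actual Comet pom.xml. -->
<profile>
  <id>spark-3.5</id>
  <properties>
    <!-- Previously inherited from the build defaults; pinned explicitly
         now that the defaults have moved to Spark 4.0 / Scala 2.13. -->
    <scala.binary.version>2.12</scala.binary.version>
    <shims.majorVerSrc>spark-3.x</shims.majorVerSrc>
    <semanticdb.version>4.8.8</semanticdb.version>
  </properties>
</profile>
```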

Which issue does this PR close?

N/A

Test plan

  • Verify ./mvnw compile -pl common -DskipTests builds with Spark 4.0 by default (a spot-check for the resolved properties is sketched after this list)
  • Verify ./mvnw compile -pl common -DskipTests -Pspark-3.5 still builds with Spark 3.5
  • Verify ./mvnw compile -pl common -DskipTests -Pspark-3.4 still builds with Spark 3.4
  • CI passes for all supported Spark versions
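One quick way to spot-check which values the active profiles actually resolve to, using the stock maven-help-plugin (this assumes a plugin version recent enough to support -DforceStdout):

```sh
# Print the Scala binary version the default build resolves to.
./mvnw help:evaluate -Dexpression=scala.binary.version -q -DforceStdout
# expected: 2.13 (new Spark 4.0 default)

# Same check with the legacy profile active.
./mvnw -Pspark-3.5 help:evaluate -Dexpression=scala.binary.version -q -DforceStdout
# expected: 2.12
```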

🤖 Generated with Claude Code

andygrove and others added 3 commits April 28, 2026 18:29
Update the default build configuration from Spark 3.5 / Scala 2.12 to
Spark 4.0 / Scala 2.13. The spark-3.4 and spark-3.5 profiles now
explicitly set scala.binary.version, shims.majorVerSrc, and
semanticdb.version since those defaults have changed. The scala-2.12
profile is populated and scala-2.13 is now empty (matching defaults).

Also updates Dockerfile, Docker publish workflow, and all documentation
to reflect the new defaults.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
…K to 17

The scala-2.13 profile must retain its properties so that
`-Pspark-3.x -Pscala-2.13` correctly overrides the Spark profile's
scala.binary.version=2.12. Without this, Iceberg CI builds produce
_2.12 artifacts when _2.13 is expected.

The TPC-DS/TPC-H verification jobs used JDK 11 with no explicit Spark
profile, so they now inherit the Spark 4.0 default which requires
JDK 17.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
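To make the override mechanics concrete: when multiple active profiles set the same property, Maven applies the properties of later-declared profiles over earlier-declared ones, so a non-empty scala-2.13 profile can win over the Spark profile's 2.12 pin. A sketch of the shape this could take (illustrative only; the real profile may carry more properties):

```xml
<!-- Illustrative sketch; relies on Maven applying the properties of
     later-declared active profiles over earlier-declared ones. -->
<profile>
  <id>scala-2.13</id>
  <properties>
    <!-- Kept non-empty so that `-Pspark-3.5 -Pscala-2.13` overrides
         the scala.binary.version=2.12 set by the Spark profile,
         keeping Iceberg CI on _2.13 artifacts. -->
    <scala.binary.version>2.13</scala.binary.version>
  </properties>
</profile>
```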
The spark/pom.xml Iceberg dependency profiles use activeByDefault to
provide the right Iceberg version when no -Pspark-* is passed. Since
the default is now Spark 4.0, the activeByDefault must be on the
spark-4.0 profile (Iceberg 1.10.0) rather than spark-3.5 (Iceberg
1.8.1), otherwise Maven resolves the non-existent artifact
iceberg-spark-runtime-4.0_2.13:1.8.1.

Co-Authored-By: Claude Opus 4.6 <[email protected]>
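For context, a sketch of where the activeByDefault flag ends up. The version number is from the commit message; the property name iceberg.version is a placeholder, not necessarily what spark/pom.xml uses. Maven deactivates all activeByDefault profiles as soon as any profile in the same POM is activated with -P, which is why the flag has to sit on whichever Spark profile is the default:

```xml
<!-- Illustrative sketch; `iceberg.version` is a placeholder name. -->
<profile>
  <id>spark-4.0</id>
  <activation>
    <!-- Applies only when no -Pspark-* profile is given; Maven turns
         activeByDefault off once any profile is activated explicitly. -->
    <activeByDefault>true</activeByDefault>
  </activation>
  <properties>
    <iceberg.version>1.10.0</iceberg.version>
  </properties>
</profile>
```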
andygrove marked this pull request as ready for review April 29, 2026 13:19