
Not enough arguments for constructor SortShuffleWriter #88

Open
Nikitosique opened this issue Nov 25, 2024 · 1 comment

Comments

@Nikitosique

Hi team,
We have a fork of this repository and recently tried to build it with Spark 3.4.4.
The sbt clean package command failed with the following error:

[error] /builds/olp/narya/spark-runtime-environment/spark-s3-shuffle/narya/src/main/scala/org/apache/spark/shuffle/sort/S3ShuffleManager.scala:148:9: not enough arguments for constructor SortShuffleWriter: (handle: org.apache.spark.shuffle.BaseShuffleHandle[K,V,C], mapId: Long, context: org.apache.spark.TaskContext, writeMetrics: org.apache.spark.shuffle.ShuffleWriteMetricsReporter, shuffleExecutorComponents: org.apache.spark.shuffle.api.ShuffleExecutorComponents)org.apache.spark.shuffle.sort.SortShuffleWriter[K,V,C].
[error] Unspecified value parameter shuffleExecutorComponents.
[error]         new SortShuffleWriter(other, mapId, context, shuffleExecutorComponents)
[error]         ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed

It looks like the failure was caused by this change in Spark: a new parameter was added to the constructor of the SortShuffleWriter class:
apache/spark@da0c7cc#diff-9b02853e043201f08a560466d397ff6df85546f35c5c293d09e3891c9481f096R32
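For reference, a sketch of what the call-site change would look like. This is only an illustration of the signature difference, not the actual code in S3ShuffleManager.scala; the variable name `metrics` is an assumption.

```scala
// Spark <= 3.4.3 / 3.5.1: four-argument constructor.
new SortShuffleWriter(other, mapId, context, shuffleExecutorComponents)

// Spark >= 3.4.4 / 3.5.2: a ShuffleWriteMetricsReporter must now be passed
// explicitly. `metrics` is a hypothetical value; one plausible source is the
// task context, e.g. context.taskMetrics().shuffleWriteMetrics.
new SortShuffleWriter(other, mapId, context, metrics, shuffleExecutorComponents)
```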

When we build the project with Spark 3.4.3 or Spark 3.5.0, sbt clean package works as expected.
Could someone please take a look at this problem?

@bollerman

We are noticing the same thing. There seems to be a breaking change in the Spark shuffle API between Spark 3.4.3 and 3.4.4, as well as between 3.5.1 and 3.5.2.
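One common pattern for supporting both sides of such a change is to select version-specific shim sources in build.sbt. The sketch below is an assumption about how this repo might adapt, not an existing setup; the directory names and the `spark.version` property are hypothetical.

```scala
// build.sbt sketch (assumption): pick a shim source directory per Spark
// patch line. Hypothetical directories src/main/scala-spark-pre-3.4.4 and
// src/main/scala-spark-3.4.4-plus would each provide a thin factory for
// SortShuffleWriter compiled against the matching constructor signature.
val sparkVersion = sys.props.getOrElse("spark.version", "3.4.4")

Compile / unmanagedSourceDirectories += {
  val base = (Compile / sourceDirectory).value
  // VersionNumber comes from sbt's librarymanagement API.
  val needsMetricsParam = VersionNumber(sparkVersion).numbers match {
    case Seq(3L, 4L, p, _*) => p >= 4 // metrics param added in 3.4.4
    case Seq(3L, 5L, p, _*) => p >= 2 // and in 3.5.2
    case _                  => true   // assume newer versions keep it
  }
  if (needsMetricsParam) base / "scala-spark-3.4.4-plus"
  else base / "scala-spark-pre-3.4.4"
}
```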

Are there any plans to maintain this?
