The Wayback Machine - https://web.archive.org/web/20200621065950/https://github.com/dotnet/spark/issues/15

Extract common classes from src/scala/microsoft-spark-<version>. #15

Open
imback82 opened this issue Mar 19, 2019 · 0 comments

imback82 (Contributor) commented Mar 19, 2019

We create multiple jars during our builds to support multiple versions of Apache Spark. In the current approach, the implementation is copied from one version directory to another and the necessary changes are then made by hand.

A better approach would be to create a common directory and extract the duplicated classes into it. Note that even if a class is textually identical across versions, it cannot be pulled out into a common class if it depends on Apache Spark, since it must be compiled against each supported Spark version separately.
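To illustrate the split described above: Spark-independent code would move into a hypothetical common directory (e.g. src/scala/microsoft-spark-common) built once, while each src/scala/microsoft-spark-&lt;version&gt; directory would keep only the classes that touch Spark APIs. A minimal Scala sketch of the distinction (class and method names here are invented for illustration, not actual repository code):

```scala
// Extractable: pure Scala/JVM code with no Apache Spark imports,
// so a single copy can be shared by every version-specific jar.
object PayloadUtils {
  // Reads a big-endian 4-byte frame-length header.
  def frameLength(header: Array[Byte]): Int =
    java.nio.ByteBuffer.wrap(header).getInt
}

// NOT extractable, even if the source text is identical in every
// version directory: it references org.apache.spark.* types and so
// must be compiled against (and shipped with) each Spark version.
//
// class BackendHandler(session: org.apache.spark.sql.SparkSession) { ... }
```

A practical test for whether a class can move to the common directory is simply whether its imports (transitively) include anything under org.apache.spark.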

Success Criteria:

  • PR that refactors all the classes appropriately
  • Documentation for all the classes changed/added
  • Documentation on upgrading versions (if it doesn't already exist)
@rapoth rapoth transferred this issue from another repository Apr 24, 2019