I am working on a project where I need to transfer thousands of files (each 50–60 MB) from an SFTP server to local storage or AWS S3 every hour. I am using Apache Spark 3.5 with Scala 2.12 for distributed processing.
I tried the spark-sftp library, but it appears to be discontinued and is incompatible with Spark 3.x. Currently I am transferring the files sequentially over SSH from Scala, and the single-threaded transfers are causing significant delays.
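For context, my current approach looks roughly like this (a simplified sketch using JSch; the host, credentials, and paths are placeholders, not my real setup):

```scala
import com.jcraft.jsch.{ChannelSftp, JSch}

val fileNames = Seq("part-0001.dat", "part-0002.dat") // placeholder names

val jsch = new JSch()
val session = jsch.getSession("user", "sftp.example.com", 22) // placeholder host/credentials
session.setPassword("password")
session.setConfig("StrictHostKeyChecking", "no")
session.connect()

val channel = session.openChannel("sftp").asInstanceOf[ChannelSftp]
channel.connect()

// Every file goes through this single connection, one at a time --
// this loop is the bottleneck.
fileNames.foreach { name =>
  channel.get(s"/remote/dir/$name", s"/local/dir/$name")
}

channel.disconnect()
session.disconnect()
```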
I want to parallelize the file transfers from SFTP to local storage or AWS S3. Are there alternative libraries or approaches compatible with Spark 3.x that support this? How can I optimize the transfers for better throughput?
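What I had in mind is something like the following (an untested sketch, again assuming JSch; `numSlices`, host, credentials, and paths are placeholders): distribute the file list as an RDD and open one SFTP connection per partition, so each executor downloads its own slice of files.

```scala
import com.jcraft.jsch.{ChannelSftp, JSch}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("sftp-parallel").getOrCreate()

// Placeholder list; in practice this would be listed from the server up front.
val fileNames = Seq("part-0001.dat", "part-0002.dat")

spark.sparkContext
  .parallelize(fileNames, numSlices = 32) // tuned to what the SFTP server allows
  .foreachPartition { files =>
    // One SFTP connection per partition, reused for all files in that slice.
    val jsch = new JSch()
    val session = jsch.getSession("user", "sftp.example.com", 22) // placeholder host/credentials
    session.setPassword("password")
    session.setConfig("StrictHostKeyChecking", "no")
    session.connect()
    val channel = session.openChannel("sftp").asInstanceOf[ChannelSftp]
    channel.connect()
    try {
      files.foreach(name => channel.get(s"/remote/dir/$name", s"/local/dir/$name"))
    } finally {
      channel.disconnect()
      session.disconnect()
    }
  }
```

Is this a sound pattern, or is there a better-supported alternative? One concern: with local storage, each executor would write to its own disk, so I assume this only makes sense when targeting S3 or a shared filesystem.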
