I have a dataframe df1:
+-------------------+-----+
|         start_date|value|
+-------------------+-----+
|2019-03-17 00:00:00|   35|
|2019-05-20 00:00:00|   40|
|2019-06-03 00:00:00|   10|
|2019-07-01 00:00:00|   12|
+-------------------+-----+
and another dataframe df_date:
+-------------------+
|               date|
+-------------------+
|2019-02-01 00:00:00|
|2019-04-10 00:00:00|
|2019-06-14 00:00:00|
+-------------------+
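For reference, both dataframes can be recreated with something like this (assuming an active SparkSession named spark; the timestamps are parsed from strings here, which may differ from my real schema):

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

# df1: one value per start_date
df1 = spark.createDataFrame(
    [("2019-03-17 00:00:00", 35),
     ("2019-05-20 00:00:00", 40),
     ("2019-06-03 00:00:00", 10),
     ("2019-07-01 00:00:00", 12)],
    ["start_date", "value"],
).withColumn("start_date", F.to_timestamp("start_date"))

# df_date: the reference dates
df_date = spark.createDataFrame(
    [("2019-02-01 00:00:00",),
     ("2019-04-10 00:00:00",),
     ("2019-06-14 00:00:00",)],
    ["date"],
).withColumn("date", F.to_timestamp("date"))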
I did the join, and now I have a dataframe df with date, start_date and value, but the value I want should look like this:
+-------------------+-------------------+-----+
|         start_date|               date|value|
+-------------------+-------------------+-----+
|2019-02-01 00:00:00|2019-03-17 00:00:00|    0|
|2019-04-10 00:00:00|2019-05-20 00:00:00|   35|
|2019-06-14 00:00:00|2019-06-03 00:00:00|   85|
+-------------------+-------------------+-----+
Every time, I have to compare start_date with date: if they are different, I should add the previous value to my value; otherwise, I should keep the previous value.
I already have the joined dataframe in PySpark and I am now trying to compute this new value.
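Written as a plain Python loop over the joined rows ordered by date, the sequential logic I am describing would be roughly this (illustration only; rows_ordered_by_date is a hypothetical list of the joined rows, not something I actually have):

# Illustration only: the row-by-row rule I want, applied sequentially
prev = 0
new_values = []
for row in rows_ordered_by_date:  # hypothetical: joined rows sorted by date
    if row["start_date"] != row["date"]:
        new_value = prev + row["value"]  # dates differ: previous value + my value
    else:
        new_value = prev                 # dates equal: keep the previous value
    new_values.append(new_value)
    prev = new_value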
I used this code to get the result:
from pyspark.sql import Window
import pyspark.sql.functions as F
win = Window.partitionBy().orderBy("date")
df = df.withColumn("prev_date", F.lag(F.col("start_date")).over(win))  # previous row's start_date
df = df.fillna({'prev_date': 0})
df = df.withColumn("value",
                   F.when(F.isnull(F.lag(F.col("value"), 1).over(win)), df.value)
                    .when(df.start_date != df.prev_date, df.value + F.lag(F.col("value"), 1).over(win))
                    .otherwise(F.lag(F.col("value"), 1).over(win)))
df.show(df.count(), False)
The problem is that all the rows are modified at the same time, whereas I need the previously computed value every time.
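From what I have read, a cumulative window sum gives a true running total over the earlier rows without needing the previously computed value, so maybe something along these lines is closer to what I need, but I am not sure how to adapt it to my compare-start_date-with-date rule (just a sketch on the joined dataframe df; running_value is a name I made up):

from pyspark.sql import Window
import pyspark.sql.functions as F

# Sketch: sum of the value column over all earlier rows, ordered by date;
# coalesce turns the empty sum on the first row into 0
run_win = Window.partitionBy().orderBy("date").rowsBetween(Window.unboundedPreceding, -1)
df = df.withColumn("running_value", F.coalesce(F.sum("value").over(run_win), F.lit(0)))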
Thank you