fix: Fixing Spark min / max entity df event timestamps range return order (#2735)

Fixes the return order of the min and max entity-DF event timestamps computed by the Spark offline store, so callers receive (min, max) rather than (max, min).

Signed-off-by: Lev Pickovsky <lev.pickovsky@ironsrc.com>
levpickis authored Jul 25, 2022
1 parent a15fcb4 commit ac55ce2
Showing 1 changed file with 1 addition and 1 deletion.
@@ -330,8 +330,8 @@ def _get_entity_df_event_timestamp_range(
         df = spark_session.sql(entity_df).select(entity_df_event_timestamp_col)
         # TODO(kzhang132): need utc conversion here.
         entity_df_event_timestamp_range = (
-            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
             df.agg({entity_df_event_timestamp_col: "min"}).collect()[0][0],
+            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
         )
     else:
         raise InvalidEntityType(type(entity_df))
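
For context (not part of the commit), below is a minimal self-contained PySpark sketch of the pattern being fixed; the session setup, sample data, and column name are hypothetical. Each df.agg({col: "min"}) / df.agg({col: "max"}) call returns a one-row DataFrame whose single value is extracted with .collect()[0][0]; callers of _get_entity_df_event_timestamp_range expect the resulting tuple ordered as (min, max), which is what the commit restores.

# Minimal sketch, assuming a local pyspark installation.
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("minmax-demo").getOrCreate()

# Hypothetical entity DataFrame with a single event-timestamp column.
df = spark.createDataFrame(
    [("2022-07-01 00:00:00",), ("2022-07-25 12:00:00",)],
    ["event_timestamp"],
).selectExpr("to_timestamp(event_timestamp) AS event_timestamp")

# Each .agg(...) call runs its own Spark job and yields a one-row result;
# [0][0] pulls out the scalar aggregate value.
ts_min = df.agg({"event_timestamp": "min"}).collect()[0][0]
ts_max = df.agg({"event_timestamp": "max"}).collect()[0][0]

# Callers expect (min, max); the pre-fix code built (max, min) instead,
# silently swapping the bounds of the retrieval window.
entity_df_event_timestamp_range = (ts_min, ts_max)
print(entity_df_event_timestamp_range)

spark.stop()

As a side note, computing both aggregates in one pass (e.g. df.agg(F.min(col), F.max(col)) with pyspark.sql.functions) would avoid the second Spark job, but the commit deliberately keeps the existing two-call structure and only corrects the tuple order.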
