[EXE-1528] Send jdbcOptions as MergedParameter to driver #10
Conversation
mergedParameters.parameters.foreach { case (key, value) =>
  log.info(s"key: ${key} value ${value}")
  properties.setProperty(key, value)
Here you will log every key/value pair, not just the user/password ones like in the original implementation 🤔 Do you need to be logging them for "Prod", or is this a testing remnant? I am not sure we should be logging them because they may contain creds 🤷‍♂️
fuuuuuuck will remove, this is for my testing 🤦, good catch!
Lgtm 👍
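For reference, a minimal sketch of what a credential-safe version of the loop above could look like; the helper name, logger type, and redaction list are illustrative assumptions, not the actual fix in this PR:

import java.util.Properties

// Illustrative sketch only: forward every merged option to the JDBC
// Properties, but log key names only (never values), and skip
// credential-like keys entirely.
def forwardOptions(
    parameters: Map[String, String],
    properties: Properties,
    log: org.slf4j.Logger): Unit = {
  // Keys that may carry credentials; this list is an assumption.
  val sensitiveKeys = Set("user", "password",
    "temporary_aws_access_key_id", "temporary_aws_secret_access_key")

  parameters.foreach { case (key, value) =>
    if (!sensitiveKeys.contains(key.toLowerCase)) {
      log.debug(s"Forwarding JDBC option: $key")  // key only, never the value
    }
    properties.setProperty(key, value)
  }
}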
@@ -97,7 +97,7 @@ class DefaultSource(
  }

  def tableExists: Boolean = {
    val conn = jdbcWrapper.getConnector(params.jdbcDriver, params.jdbcUrl, params.credentials)
I'm curious, did you look into the history of why they ignore all the other params?
Even on latest master, they only use these 3 params and ignore the others: /~https://github.com/spark-redshift-community/spark-redshift/blob/master/src/main/scala/io/github/spark_redshift_community/spark/redshift/RedshiftRelation.scala#L111
Looks like they were fixing the same invalid-character issue 8 years ago that we were fixing last week 😅. The way they fixed it was by passing user/password as part of the options in this PR.
Nice 👍, this looks like an easy one you could contribute back to the open-source fork?
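For context, a hedged sketch of what a getConnector variant that forwards the full option map (rather than only the driver class, URL, and credentials) might look like; the method and parameter names here are assumptions, not the fork's actual signature:

import java.sql.{Connection, DriverManager}
import java.util.Properties

// Sketch under assumptions: build the Properties from every merged
// parameter (user/password included, arriving through the same map as
// any other jdbc option) and hand them all to the driver.
def getConnector(
    jdbcDriver: String,
    jdbcUrl: String,
    mergedParameters: Map[String, String]): Connection = {
  Class.forName(jdbcDriver)  // register the driver class
  val properties = new Properties()
  mergedParameters.foreach { case (key, value) =>
    properties.setProperty(key, value)
  }
  DriverManager.getConnection(jdbcUrl, properties)
}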
https://actioniq.atlassian.net/browse/EXE-1528
Didn't send queryGroup to the driver; details here.
Tested by sending a query from spark-shell and checking the query_label: select * from SYS_QUERY_HISTORY where query_label = '123456678908'
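A hypothetical spark-shell snippet for that check; the data-source format string, host, and option names are assumptions based on the upstream spark-redshift project, not confirmed by this PR:

// Assumed option names and format string; adjust to the fork's actuals.
val df = spark.read
  .format("io.github.spark_redshift_community.spark.redshift")
  .option("url", "jdbc:redshift://example-host:5439/dev")
  .option("dbtable", "some_table")
  .option("tempdir", "s3a://example-bucket/tmp/")
  .option("queryGroup", "123456678908")  // should now reach the driver
  .load()
df.show(1)

// Then verify in Redshift that the label was attached:
//   select * from SYS_QUERY_HISTORY where query_label = '123456678908';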
Deploy Note