[EXE-1558] Setup build and deploy; add pushdown #1
Conversation
Hmm, how did GitHub let you request me for review if it's still in draft mode 😄
No, I was just adding the build and publish stuff but later figured I could just add pushdowns on top of it, so I converted it to a draft... I split the pushdowns into another PR if you want to review the build part first.
looks good 👍 One thing to follow up on: can you test the pushdowns we listed in our connector Confluence page and make sure those still work in Spark 3? I guess it can wait until you deploy Spark and run from AIQ if that's easier.
Also, can you please fix the formatting in RedshiftPushDownQuery.scala.
@@ -1,7 +1,4 @@
-Dfile.encoding=UTF8
-Xms1024M
-Xmx1024M
-Xss6M
-XX:MaxPermSize=512m
These don't work with Java 17?
They were removed in Java 17 (or earlier): https://docs.oracle.com/en/java/javase/17/docs/specs/man/java.html#removed-java-options
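For reference, a minimal sketch of what a Java-17-friendly options file could look like. This assumes the file being trimmed here is sbt's `.jvmopts` and that only the PermGen flag is dropped (PermGen was removed from the JVM back in Java 8, so `-XX:MaxPermSize` no longer does anything useful); the encoding, heap, and stack flags below are still accepted on Java 17:

```
-Dfile.encoding=UTF8
-Xms1024M
-Xmx1024M
-Xss6M
```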
build.sbt
  // DON't UPGRADE AWS-SDK-JAVA if not compatible with hadoop version
- val testAWSJavaSDKVersion = sys.props.get("aws.testVersion").getOrElse("1.11.1033")
+ val testAWSJavaSDKVersion = sys.props.get("aws.testVersion").getOrElse("1.11.1026")
We use AWS Java SDK 1.12.x, upgrade this one? Or is it just for tests and we don't care?
I was referencing the dependencies of hadoop-aws: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws/3.3.2, but yeah, I'll try upgrading here.
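If it helps, a rough build.sbt sketch of the intent here: the default SDK version mirrors what hadoop-aws 3.3.2 declares, and CI can override it with `-Daws.testVersion=...` when testing against 1.12.x. The `hadoop.testVersion` property name, the `Test` scoping, and the `aws-java-sdk-bundle` artifact are my assumptions for illustration, not necessarily how this repo wires it:

```scala
// Sketch only: pin the AWS SDK used in tests to the version hadoop-aws 3.3.2 was
// built against, while allowing an override from the command line, e.g.
//   sbt -Daws.testVersion=1.12.262 test
val testHadoopVersion = sys.props.get("hadoop.testVersion").getOrElse("3.3.2")
val testAWSJavaSDKVersion = sys.props.get("aws.testVersion").getOrElse("1.11.1026")

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-aws"          % testHadoopVersion     % Test,
  "com.amazonaws"     % "aws-java-sdk-bundle" % testAWSJavaSDKVersion % Test
)
```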
}

if (jdbcProps.get("aiq_testing").exists(_.toBoolean)) {
  RedshiftPushDownSqlStatement.capturedQueries.append(finalQuery)
Prob don't need this anymore 🤷, now that you've exposed the statement in the plan. But I guess it's fine to keep.
I don't see it used below in any tests?
Will try adding some unit tests later.
Noticed some issues with partially invalid queries, documented here: https://actioniq.atlassian.net/browse/EXE-1614; will continue digging as part of bringing generic JDBC pushdown to flame.
Tested reading from Redshift in /~https://github.com/ActionIQ/flame/pull/440/files
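As a follow-up to the unit-test question above, here is a rough sketch of how the `aiq_testing` capture hook could be asserted against. Heavily hedged: the data-source name, the option keys other than `aiq_testing`, the `events` table, and the assumption that `capturedQueries` is a mutable buffer are all illustrative rather than taken from this PR, and the test would need a reachable Redshift/S3 test environment to actually run:

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite
// import of RedshiftPushDownSqlStatement omitted; its package depends on this repo's layout

class RedshiftPushDownCaptureSuite extends AnyFunSuite {

  // Placeholder connection details for a test cluster; not real endpoints.
  private val jdbcUrl = "jdbc:redshift://localhost:5439/dev?user=test&password=test"
  private val tempDir = "s3a://test-bucket/temp/"

  test("filter pushdown emits a WHERE clause in the generated Redshift SQL") {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("pushdown-capture-test")
      .getOrCreate()
    try {
      // Start from a clean slate so .last below refers to this test's query.
      RedshiftPushDownSqlStatement.capturedQueries.clear()

      val df = spark.read
        .format("io.github.spark_redshift_community.spark.redshift") // assumed data-source name
        .option("url", jdbcUrl)
        .option("tempdir", tempDir)
        .option("dbtable", "events")
        .option("aiq_testing", "true") // asks the connector to record the generated SQL
        .load()
        .filter("event_type = 'click'")

      // Forces the read, which triggers pushdown query generation (needs a live cluster).
      df.collect()

      val pushedSql = RedshiftPushDownSqlStatement.capturedQueries.last
      assert(pushedSql.toUpperCase.contains("WHERE"))
    } finally {
      spark.stop()
    }
  }
}
```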