Description
I have added the following configuration to spark-defaults.conf:
spark.jars.packages com.amazonaws:aws-java-sdk:1.11.115,org.apache.hadoop:hadoop-aws:2.8.0
If I start spark-shell with this configuration, it loads these jars and starts without any issue.
But when I use Jupyter with Livy, I get the error shown in the attached error.png.
The same Jupyter + Livy + Spark combination works fine with the default Spark configuration, as shown in the screenshot success.png.