When your code requires external libraries for Spark, %spark.dep helps load them from a Maven repository. The %spark.dep interpreter leverages the Scala environment and lets you write Scala expressions that call the dependency-loading APIs.
```
%spark.dep
// load Apache Commons RNG libraries from a Maven repository
z.load("org.apache.commons:commons-rng-core:1.0")
z.load("org.apache.commons:commons-rng-simple:1.0")
```
Note that %spark.dep must be the first interpreter run in the notebook, before %spark, %spark.pyspark, or %spark.sql. Otherwise, %spark.dep will print several error messages and you'll need to shut down and restart the container for the notebook.
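Once the dependency has loaded, later Spark paragraphs can import and use the loaded classes directly. A minimal sketch of this two-paragraph pattern follows; `z.reset()` clears previously added artifacts before loading, and the `RandomSource` calls illustrate the commons-rng API rather than anything Zeppelin-specific:

```
%spark.dep
z.reset()  // clear previously loaded artifacts and repositories
z.load("org.apache.commons:commons-rng-core:1.0")
z.load("org.apache.commons:commons-rng-simple:1.0")
```

```
%spark
// the loaded library is now on the classpath of the Spark interpreter
import org.apache.commons.rng.simple.RandomSource

val rng = RandomSource.create(RandomSource.MT)  // Mersenne Twister generator
println(rng.nextDouble())                       // a uniform value in [0, 1)
```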