Zepl Commands
Zepl exposes several built-in functions from the Zeppelin context. These functions are accessible through a predefined variable, z. The z object can be used by directly invoking its methods, which are described below. With dynamic forms, you can enable notebook users to interact with your code without ever exposing them to it. To produce these form inputs, invoke the methods below:
Function Definition:
z.checkbox(string title, list[(value, display)] options, list default_selected): Return List
%python
options = [("apple","Apple"), ("banana","Banana"), ("orange","Orange")]
selected = z.checkbox("fruit", options, ["apple", "orange"])
print(selected)
Function Definition:
z.checkbox(String title, Seq((value, display))): Return Seq(Object)
%spark
val options = Seq(("apple","Apple"), ("banana","Banana"), ("orange","Orange"))
val list = z.checkbox("Fruit", options)
println(list)
Function Definition:
z.checkbox(string title, list[(value, display)] options, list default_selected): Return List
%pyspark
options = [("apple","Apple"), ("banana","Banana"), ("orange","Orange")]
selected = z.checkbox("fruit", options, ["apple", "orange"])
print(selected)

Function Definition:
z.textbox(string name, string value): Return string
%python
s = z.textbox("name", "sun")
print("Hello " + s)
Function Definition:
z.textbox(String name, String value): Return String
%spark
val s = z.textbox("name", "sun")
println("Hello " + s)
Function Definition:
z.textbox(string name, string value): Return string
%pyspark
s = z.textbox("name", "sun")
print("Hello " + s)

Function Definition:
z.select(string label, list[(value, name)]): Return value
%python
day = z.select("day", [("1","mon"),
("2","tue"),
("3","wed"),
("4","thur"),
("5","fri"),
("6","sat"),
("7","sun")])
print("Hello " + day)
Function Definition:
z.select(String label, Seq((value, name))): Return value
%spark
val day = z.select("day", Seq(("1","mon"),
("2","tue"),
("3","wed"),
("4","thur"),
("5","fri"),
("6","sat"),
("7","sun")))
println("Hello " + day)
Function Definition:
z.select(string label, list[(value, name)]): Return value
%pyspark
day = z.select("day", [("1","mon"),
("2","tue"),
("3","wed"),
("4","thur"),
("5","fri"),
("6","sat"),
("7","sun")])
print("Hello " + day)

In addition, variables can be added to SQL queries so that users can dynamically filter query results. This spares users such as data analysts from having to make simple code or query changes themselves.
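As a sketch of this, Zeppelin's template syntax lets a paragraph declare its own form inputs: ${name=default} renders a text box, and ${name=default,option1|option2} renders a select. The table and column names below are illustrative, not part of Zepl itself:
%sql
select * from logs
where status = '${status=ERROR,ERROR|WARN|INFO}'
limit ${limit=10}
Changing the form values re-runs the query with the substituted filter, so no one needs to edit the SQL directly.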
With the z object you can share data between the Spark, Python, and R environments using the z.put() and z.get() methods. These methods only accept simple data types such as String, Int, Boolean, Vector, and Seq. For example, you can put objects using Scala in a Spark interpreter and read them from Python, and vice versa. Function Definition:
z.put(String name, Object data)
Function Definition:
z.get(String name): Return Object
Example
Pass data from Spark to R and Python:
%spark
// String
val scala_str = "Hello from spark"
z.put("scala_str", scala_str)
// Integer
val scala_int = 42
z.put("scala_int", scala_int)
// Vector
val scala_vec = Vector(1,2,3,4)
z.put("scala_vec", scala_vec)
Print data in R and Python:
%r
# Printing Scala variables
z.get("scala_str")
z.get("scala_int")
z.get("scala_vec")
%python
# Printing Scala variables
print(z.get("scala_str"))
print(z.get("scala_int"))
print(z.get("scala_vec"))
z.put() and z.get() are not supported for passing DataFrame objects between languages. To share a DataFrame, register it as a Spark temporary table and read it from the other interpreter, for example between Scala and PySpark.
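As a minimal sketch (the DataFrame contents and the view name shared_fruit are illustrative), you can register a temporary view in the Spark interpreter:
%spark
// Create a small DataFrame and register it as a temporary view
// visible through the shared SparkSession
val df = Seq((1, "apple"), (2, "banana")).toDF("id", "fruit")
df.createOrReplaceTempView("shared_fruit")
Then read it back from PySpark:
%pyspark
# Read the view registered in the Scala paragraph above
df = spark.table("shared_fruit")
df.show()
Because both interpreters share the same SparkSession, the view registered in one paragraph is visible to the other without serializing the data through z.put().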