SQL statements on DataFrames
By now, you will have noticed that many operations on DataFrames are inspired by SQL operations. Additionally, Spark allows us to register DataFrames as tables and query them with SQL statements directly. We can therefore build a temporary database as part of the program flow.
Let's register readingsDF as a temporary table:
scala> readingsDF.registerTempTable("readings")
This registers a temporary table that can be used in SQL queries. Registering a temporary table relies on the presence of a SQL context. The temporary tables are destroyed when the SQL context is destroyed (when we close the shell, for instance).
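Once the table is registered, we can query it with the SQL context's sql method, which takes a SQL string and returns a DataFrame. As a sketch (the column names patientId and heightCm here are assumed for illustration and may differ from the actual schema of readingsDF):

scala> val tallDF = sqlContext.sql(
  "SELECT patientId, heightCm FROM readings WHERE heightCm > 180")
tallDF: org.apache.spark.sql.DataFrame = [patientId: int, heightCm: int]

Since the result is itself a DataFrame, SQL queries and DataFrame operations can be mixed freely in the same program.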
Let's explore what we can do with our temporary tables and the SQL context. We can first get a list of all the tables currently registered with the context:
scala> sqlContext.tables
org.apache.spark.sql.DataFrame = [tableName: string, isTemporary: boolean]
This returns a DataFrame. In general, all operations on a SQL context that return data return DataFrames.
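Because the list of tables is itself a DataFrame, we can manipulate it like any other. For instance, we can restrict it to temporary tables and print the result (filter with a string predicate and show are standard DataFrame methods):

scala> sqlContext.tables.filter("isTemporary = true").show()

This prints a small table with a tableName and an isTemporary column, containing one row for our readings table.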