Another way to explore data in Spark is through Spark SQL. This allows analysts who may not be well versed in the language-specific APIs, such as SparkR (for R), PySpark (for Python), or the Scala API, to explore Spark data.
I will describe two different ways of accessing Spark data via SQL:
- Issuing SQL commands through the R interface:
This has the advantage of returning the results as an R dataframe, which can then be manipulated further in R.
- Issuing SQL queries via the Databricks SQL magic directive (%sql):
This method allows analysts to issue SQL commands directly, without regard to any specific language environment (see the notebook cell sketch after this list).
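As an illustration of the second approach, a Databricks notebook cell that begins with the %sql magic is interpreted as SQL regardless of the notebook's default language. The sketch below assumes a temporary view named faithful_view has already been registered (registration is covered next); the view name and filter are purely illustrative.

```sql
%sql
-- The %sql magic on the first line switches this Databricks notebook cell to SQL,
-- independent of the notebook's default language (R, Python, or Scala).
SELECT eruptions, waiting
FROM faithful_view   -- hypothetical temporary view, registered as shown below
WHERE waiting > 70
LIMIT 10
```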
Before an object can be queried with SQL, it must be registered as a SQL table or view. Once it is registered, it can be accessed through the SQL interface of any language API.
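The following is a minimal SparkR sketch of the first approach: registering a view and then querying it through the R interface. It assumes a Databricks-style environment where SparkR is available, and uses R's built-in faithful dataset and an illustrative view name, faithful_view.

```r
library(SparkR)
sparkR.session()  # already active in a Databricks notebook; shown for completeness

# Create a SparkDataFrame from R's built-in faithful dataset (illustrative data)
df <- createDataFrame(faithful)

# Register the SparkDataFrame as a temporary view so it is visible to SQL
createOrReplaceTempView(df, "faithful_view")

# Issue SQL through the R interface; sql() returns a SparkDataFrame
result_sdf <- sql("SELECT eruptions, waiting FROM faithful_view WHERE waiting > 70")

# collect() returns the result as an ordinary R dataframe for further manipulation
result_df <- collect(result_sdf)
head(result_df)
```

Because collect() materializes the full result in the R driver process, it is best applied after the SQL query has already reduced the data to a manageable size.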
Once registered, you can use the SHOW TABLES command through SparkR to get a list of the tables and views that are currently registered.
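One way to do this from SparkR is to issue SHOW TABLES through the same sql() interface (a sketch, continuing the session above):

```r
# List the tables and temporary views visible in the current session
registered <- sql("SHOW TABLES")
showDF(registered)        # or collect(registered) to inspect the list as an R dataframe
```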