Python
The Python SparkSession object behaves in the same way as its Scala counterpart. We can run almost the same commands as in the previous section, within the constraints of Python's language semantics:
bin/pyspark
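By default the shell starts with a local[*] master, as the session below shows. The pyspark launcher accepts the same flags as spark-submit, so, for example, you can pin the shell to two local cores instead:

bin/pyspark --master local[2]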
Refer to the following output:
>>> spark.version
u'2.0.0'
>>> sc.version
u'2.0.0'
>>> sc.appName
u'PySparkShell'
>>> sc.master
u'local[*]'
>>> sc.getExecutorMemoryStatus
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'SparkContext' object has no attribute 'getExecutorMemoryStatus'
>>> from pyspark.conf import SparkConf
>>> conf = SparkConf()
>>> conf.toDebugString()
u'spark.app.name=PySparkShell\nspark.master=local[*]\nspark.submit.deployMode=client'
>>> exit()  # exits the pyspark shell
The PySpark SparkContext does not have the getExecutorMemoryStatus call yet, but we can get some of the configuration information with the SparkConf toDebugString call.
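If you do need the executor memory details from PySpark, one commonly cited workaround is to reach through to the underlying JVM SparkContext via the py4j gateway. This is only a sketch: _jsc is a private attribute of the Python SparkContext, so it may change or disappear in a future Spark release:

>>> # Sketch only: sc._jsc is a private attribute (the wrapped JavaSparkContext)
>>> jvm_sc = sc._jsc.sc()   # the underlying Scala SparkContext
>>> # getExecutorMemoryStatus() returns a Scala Map; py4j lets us stringify it
>>> print(jvm_sc.getExecutorMemoryStatus().toString())

The printed map keys are the executors' host:port addresses, and the values are the maximum and remaining memory available for caching on each.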