SparkContext doesn't have createDataFrame; SQLContext does:

from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)                        # wrap the existing SparkContext
spark_df = sqlContext.createDataFrame(pandas_df)   # convert the pandas DataFrame
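Note that SQLContext has been deprecated since Spark 2.0; a minimal sketch of the same conversion through SparkSession (assuming pandas_df is an existing pandas DataFrame):

from pyspark.sql import SparkSession

# SparkSession subsumes SQLContext in Spark 2.0+
spark = SparkSession.builder.getOrCreate()
spark_df = spark.createDataFrame(pandas_df)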
CopyProgramming
copyprogramming.com › howto › sparksession-object-has-no-attribute-createdataframe
AttributeError: 'SparkSession' object has no attribute 'createDataFrame' - Apache Spark
June 3, 2023 -
from pyspark import SparkContext, SQLContext
sc = SparkContext.getOrCreate()
spark = SQLContext(sc)
result_dict = {'a':3,'b':44}
data = list(map(list, result_dict.items()))
f_rdd = spark.createDataFrame(data, ["A", "B"]).repartition(1)
...
      5 result_dict = {'a':3,'b':44}
      6 data = list(map(list, result_dict.items()))
----> 7 f_rdd = spark.createDataFrame(data, ["A", "B"]).repartition(1)
AttributeError: 'SQLContext' object has no attribute 'createDataFrame'
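The usual fix for the snippet above is to build the frame from a SparkSession rather than a SQLContext; a sketch, assuming Spark 2.0+:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
result_dict = {'a': 3, 'b': 44}
data = list(map(list, result_dict.items()))  # [['a', 3], ['b', 44]]
f_rdd = spark.createDataFrame(data, ["A", "B"]).repartition(1)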
Cloudera Community
community.cloudera.com › t5 › Support-Questions › AttributeError-in-Spark-2-3 › td-p › 185505
Solved: AttributeError in Spark 2.3 - Cloudera Community - 185505
July 17, 2018 - Hi, the below code is not working in Spark 2.3, but it's working in 1.7. Can someone modify the code as per Spark 2.3?
import os
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext
conf = (SparkConf()
        .setAppName("data_import")
        .set("spark.dynamicAllocation.enabled","tru...
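For the Spark 2.3 migration asked about above, HiveContext is replaced by a Hive-enabled SparkSession; a sketch reusing the question's app name and setting (not the full original job):

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("data_import")
         .config("spark.dynamicAllocation.enabled", "true")
         .enableHiveSupport()   # takes over HiveContext's role
         .getOrCreate())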
GitHub
github.com › CODAIT › stocator › issues › 30
AttributeError: 'SparkContext' object has no attribute 'hadoopConfiguration' · Issue #30 · CODAIT/stocator
April 18, 2016 - AttributeError: 'SparkContext' object has no attribute 'hadoopConfiguration' #30 · snowch opened on Apr 18, 2016 · I'm not sure if this is a bug or user error. See the details of the ticket at: snowch/biginsight-examples#28
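PySpark never exposed hadoopConfiguration() on SparkContext the way the Scala API does; the widely used workaround goes through the internal JVM handle. A sketch (the s3a key and value are only illustrative):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# _jsc is an internal handle to the JVM SparkContext; PySpark has no
# public hadoopConfiguration() accessor, so this is the common workaround.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3a.endpoint", "http://localhost:9000")  # illustrative key/value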
Dataiku Community
community.dataiku.com › questions & discussions › using dataiku
PySpark Recipes persist DataFrame — Dataiku Community
April 15, 2020 - But I have this error message: 'Job failed: Pyspark code failed: At line 186: <type 'exceptions.AttributeError'>: 'SparkSession' object has no attribute '_getJavaStorageLevel' Any idea??? Thank you for your help! ... It seems that Spark does not like mixing old and new style APIs (SQLContext created from a SparkSession instead of a SparkContext).
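The diagnosis in that thread is mixing old-style contexts with a SparkSession; persisting through the session API alone avoids the _getJavaStorageLevel lookup. A minimal sketch:

from pyspark import StorageLevel
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
df.persist(StorageLevel.MEMORY_AND_DISK)  # persist without touching SQLContext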
GitHub
github.com › delta-io › delta › issues › 1967
[BUG][Spark] DeltaTable.forPath doesn't work with Spark Connect · Issue #1967 · delta-io/delta
August 9, 2023 - However, when I want to load the data as a deltatable DeltaTable.forPath() it fails because it assumes that the Spark Session has a SparkContext. Prepare a spark connect server, see https://spark.apache.org/docs/latest/spark-connect-overview.html e.g.:
./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.4.0
...
from pyspark.sql import SparkSession
spark = SparkSession.builder.remote("sc://localhost").getOrCreate()
columns = ["id","name"]
data = [(1,"Sarah"),(2,"Maria")]
df = spark.createDataFrame(data).toDF(*columns)
df.write.format('delta').mode('overwrite').save('s3a://<bucket>/<name>')
Author: stvno
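For context, the call the report says fails presumably looks like the following sketch, continuing from the Spark Connect session created in the snippet above (the table path is elided, as in the issue; delta-spark must be installed):

from delta.tables import DeltaTable

# Under a Spark Connect session this raises, because forPath assumes
# the session carries a local SparkContext:
dt = DeltaTable.forPath(spark, "s3a://<bucket>/<name>")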
Databricks Community
community.databricks.com › t5 › data-engineering › attributeerror-sparksession-object-has-no-attribute-wrapped-when › td-p › 33970
AttributeError: 'SparkSession' object has no attri... - Databricks Community - 33970
December 4, 2022 - I'm getting the error...
AttributeError: 'SparkSession' object has no attribute '_wrapped'
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
in ()
1 from sparknlp.training import CoNL...
CSDN
devpress.csdn.net › python › 63045844c67703293080b8b9.html
pyspark error: AttributeError: 'SparkSession' object has no attribute 'parallelize'
August 23, 2022 -
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-9-1db231ce21c9> in <module>()
----> 1 spark_df = sqlContext.createDataFrame(df_in)
/home/edamame/spark/spark-2.0.0-bin-spark-2.0.0-bin-hadoop2.6-hive/python/pyspark/sql/context.pyc in createDataFrame(self, data, schema, samplingRatio)
    297             Py4JJavaError: ...
    298         """
--> 299         return self.sparkSession.createDataFrame(data, schema, samplingRatio)
    300
    301     @since(1.3)
/home/edamame/spark/spark-2.0.0-bin-spark-2.0.0-bin-hadoop2.6-hive/python/pyspark/sql/session.py
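The error in this post's title has a standard workaround: SparkSession doesn't expose parallelize(), but the SparkContext it wraps does. A minimal sketch:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# RDD entry points live on the underlying SparkContext:
rdd = spark.sparkContext.parallelize([1, 2, 3])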
Cloudera Community
community.cloudera.com › t5 › Support-Questions › AttributeError-in-Spark › td-p › 185732
Solved: AttributeError in Spark - Cloudera Community - 185732
July 18, 2018 - Hi, the below code is not working in Spark 2.3, but it's working in 1.7. Can someone modify the code as per Spark 2.3?
import os
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext
conf = (SparkConf()
        .setAppName("data_import")
        .set("spark.dynamicAllocation.enabled","true")...
Great Expectations
discourse.greatexpectations.io › archive
AttributeError: 'NoneType' object has no attribute 'sc' - Archive - Great Expectations
March 19, 2021 - I am trying to execute pyspark script via emr. Script will process files from S3 bucket and put into another folder. Code was working fine till march first week. But suddenly getting error at the initial phase where datacontext was getting created. Below is the code that is responsible for data context. Code:
def build_dq_context(bucket):
    data_context_config = DataContextConfig(
        datasources={
            DQ_DATASOURCE_NAME: DatasourceConfig(
                class_name="SparkDFDatasource"
            )
        },
        store_backend_defaults...
Apache
spark.apache.org › docs › latest › api › python › reference › api › pyspark.SparkContext.html
pyspark.SparkContext — PySpark 4.1.1 documentation
An object setting Spark properties.
gateway : py4j.java_gateway.JavaGateway, optional
    Use an existing gateway and JVM, otherwise a new JVM will be instantiated. This is only used internally.
...
The JavaSparkContext instance. This is only used internally.
...
Only one SparkContext should be active per JVM. You must stop() the active SparkContext before creating a new one. SparkContext instance is not ...
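A sketch of the one-active-context rule the docs describe; getOrCreate() reuses a running context instead of failing (app names here are only illustrative):

from pyspark import SparkConf, SparkContext

# Reuses an already-running context if one exists (conf is then ignored):
sc = SparkContext.getOrCreate(SparkConf().setAppName("example"))
sc.stop()   # must stop the active context before constructing a new one
sc2 = SparkContext(conf=SparkConf().setAppName("example2"))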
GitHub
github.com › commoncrawl › cc-pyspark › issues › 24
Use SparkSession instead of SQLContext · Issue #24 · commoncrawl/cc-pyspark
"As of Spark 2.0, [SQLContext] is replaced by SparkSession." (see SQLContext). replacing SQLContext by SparkSession might simplify the code as the session also holds the SparkContext obje...