Here is one way to solve this:

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions._
import scala.collection.mutable.WrappedArray

val data = Seq((Seq(1, 2, 3), Seq(4, 5, 6), Seq(7, 8, 9)))
val df = sqlContext.createDataFrame(data)  // in Spark 2.x+: spark.createDataFrame(data)
val first = df.first

// use getAs to cast the first column to its runtime representation, a WrappedArray
val mapped = first.getAs[WrappedArray[Int]](0)

// WrappedArray is a Seq, so we can use it like a normal Scala collection
mapped.mkString("\n")

// pattern match to extract the three array columns from every row
val rows = df.collect.map {
    case Row(a: Seq[Any], b: Seq[Any], c: Seq[Any]) =>
        (a, b, c)
}
rows.mkString("\n")
Answer from Rockie Yang on Stack Overflow
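A note on portability: since WrappedArray is itself a Seq, you can ask for the column as Seq[Int] instead (getAs only casts, so it returns the same object through the Seq interface), which avoids depending on a concrete class that was renamed in Scala 2.13. A minimal sketch, no Spark required, where xs stands in for the value first.getAs[Seq[Int]](0) would return:

```scala
// `xs` is a hypothetical stand-in for df.first.getAs[Seq[Int]](0)
val xs: Seq[Int] = Seq(1, 2, 3)

// all standard collection operations are available
val doubled = xs.map(_ * 2)      // Seq(2, 4, 6)
val total   = xs.sum             // 6
val joined  = xs.mkString("\n")  // "1\n2\n3"
```
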