Load Data with Spark SQL through the Spark Connector
The Spark Connector lets you interact with an Actian Ingres database from Apache Spark. A working setup requires a Spark installation and the Spark Connector JAR file; contact Actian Support for a copy of the JAR file and for the Spark installation requirements.
If you are working in the spark-shell, ensure that the Spark Connector JAR file is on the classpath (for example, by launching the shell with spark-shell --jars <path to the Spark Connector JAR>). Create a new Spark session as follows:
import org.apache.spark.sql._
import com.actian.spark_vector.extensions.VectorDataSourceV2Strategy

val spark = SparkSession.builder()
  .withExtensions { extensions =>
    extensions.injectPlannerStrategy(sp => new VectorDataSourceV2Strategy(sp))
  }
  .getOrCreate()
Given the following table, created in the target database:
CREATE TABLE test(col1 int)
Reference it in Spark as a temporary view:
spark.sql("""CREATE TEMPORARY VIEW vector_table
USING com.actian.spark_vector.sql.VectorSourceV2
OPTIONS (
host "localhost",
instance "VW",
database "testdb",
table "test",
user "actian",
password "actian"
)""")
You can load data into the table as shown below:
// StructType/StructField describe the DataFrame schema; CollectionConverters
// (Scala 2.13+; on Scala 2.12 use scala.collection.JavaConverters._) provides asJava
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}
import org.apache.spark.sql.{DataFrame, Row}
import scala.jdk.CollectionConverters._

// Build a one-row DataFrame whose schema matches the Vector table (col1 int)
val values: Seq[Row] = Seq(Row(1))
val schema = StructType(Seq(StructField("col1", IntegerType, nullable = true)))
val valuesDF: DataFrame = spark.createDataFrame(values.asJava, schema)
valuesDF.createTempView("spark_table")
 
spark.sql("insert into vector_table select * from spark_table")
To view the inserted data:
val res = spark.sql("select * from vector_table")
res.show()
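Any standard Spark SQL or DataFrame operation can be run against the view. For example, the illustrative queries below count the rows and filter on col1 using the view defined above:
// Count all rows currently visible through the view
spark.sql("select count(*) from vector_table").show()

// Filter through the DataFrame API on the same view
import org.apache.spark.sql.functions.col
spark.table("vector_table").filter(col("col1") === 1).show()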
Last modified date: 01/27/2026