Databricks Compute

Prerequisites

Supported Types of Compute

The Xonai Accelerator can be activated in the following types of compute:

  1. All-purpose compute

  2. Job compute

Info

Support for other types of compute may be added in the future.

For more information about Databricks types of compute, please refer to the Databricks compute documentation.

Supported Databricks Runtime Versions

The Xonai Accelerator is compatible with the Databricks runtime versions listed in the following table:

| Spark version | Databricks release |
| ------------- | ------------------ |
| 3.5.0         | 15.4 LTS, 14.3 LTS |
| 3.4.1         | 13.3 LTS           |
| 3.3.2         | 12.2 LTS           |
| 3.3.0         | 11.3 LTS           |
| 3.2.1         | 10.4 LTS           |
| 3.1.2         | 9.1 LTS            |

For more information about Databricks runtime releases, please refer to the Databricks release notes.
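For scripts that need to branch on the runtime in use, the compatibility table above can be expressed as a small lookup helper. This is an illustrative sketch that simply restates the table; the function name is our own:

```shell
#!/bin/bash
# Look up the Spark version bundled with a supported Databricks LTS release,
# as listed in the compatibility table above.
spark_version_for() {
  case "$1" in
    15.4|14.3) echo "3.5.0" ;;
    13.3)      echo "3.4.1" ;;
    12.2)      echo "3.3.2" ;;
    11.3)      echo "3.3.0" ;;
    10.4)      echo "3.2.1" ;;
    9.1)       echo "3.1.2" ;;
    *)         echo "unsupported" >&2; return 1 ;;
  esac
}

spark_version_for 14.3   # prints 3.5.0
```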

Installation on Unity Catalog

This article explains how Xonai can be installed on a Unity Catalog volume and activated on a cluster via an init script.

Create a Volume

To create a volume to store Xonai files:

  1. Navigate to Catalog from the left navigation panel.

  2. Click Create > Create Volume within the schema in which you want to create the volume (e.g. main.default, which maps to /Volumes/main/default/).

  3. The Create a new volume dialog appears. Type xonai as the volume name and select Managed volume in Volume type.

Create an Init Script

After the volume is created:

  1. Create a file named xonai-init.sh with the following content, changing the placeholders as described below:

    #!/bin/bash
    cp <volume-path>/xonai-spark-plugin-dbx-<release>-<version>-stable-linux-<arch>.jar /databricks/jars/

  2. Replace the <volume-path>, <release>, <version> and <arch> placeholders with:

    1. The path of the managed volume you created (e.g. /Volumes/main/default/xonai).

    2. The Databricks runtime release you are using (e.g. 14.3).

    3. The Xonai JAR release version you want to use.

    4. The worker processor architecture: amd64 for Intel/AMD or arm64 for Graviton/ARM.

  3. Copy xonai-init.sh to the managed volume you created for Xonai (e.g. /Volumes/main/default/xonai).

Info

You may want to create multiple configuration scripts with distinct names if you are using multiple Databricks runtime versions and processor architectures in your workspace.
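As a concrete sketch of step 1, the init script can be generated with a heredoc. The volume path, runtime release (14.3), Xonai version (1.0.0) and architecture below are illustrative placeholders, not actual release numbers; substitute your own values:

```shell
#!/bin/bash
# Generate xonai-init.sh with the placeholders already substituted.
# All four values are illustrative -- adjust them to your volume path,
# Databricks runtime release, Xonai release version and worker CPU arch.
VOLUME_PATH="/Volumes/main/default/xonai"
RELEASE="14.3"     # Databricks runtime release
VERSION="1.0.0"    # hypothetical Xonai release version
ARCH="amd64"       # or arm64 for Graviton/ARM workers

cat > xonai-init.sh <<EOF
#!/bin/bash
cp ${VOLUME_PATH}/xonai-spark-plugin-dbx-${RELEASE}-${VERSION}-stable-linux-${ARCH}.jar /databricks/jars/
EOF
```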

Copy Xonai JARs

After the init script is created:

  1. In Catalog, open the xonai volume and click Upload to this volume in the top right corner.

  2. The Upload files to volume dialog appears. Drag and drop or browse to the JAR(s) you want to upload, and click Upload.
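If you prefer the command line over the upload dialog, recent versions of the Databricks CLI can copy local files to Unity Catalog volumes via dbfs:/Volumes paths (check that your CLI version supports volume paths). The file names below are illustrative placeholders:

```shell
# Assumes a configured Databricks CLI with access to the workspace.
# Copy the init script and the plugin JAR (illustrative name) to the volume.
databricks fs cp xonai-init.sh dbfs:/Volumes/main/default/xonai/
databricks fs cp xonai-spark-plugin-dbx-14.3-1.0.0-stable-linux-amd64.jar dbfs:/Volumes/main/default/xonai/
```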

Activation on All-Purpose Compute

The first step is to define the Xonai init script you created:

  1. Navigate to Compute > All-purpose compute from the left navigation panel.

  2. Edit an existing compute, or create a new one by clicking Create compute in the top right corner.

  3. Uncheck Use Photon Acceleration in the Performance panel.

  4. Click Advanced options to expand the section, then click the Init Scripts tab.

  5. Select Volume as the Source and select the Xonai init script you created.

On completion it should look like this:

../_images/dbx-xonai-init-script.png

The second step is to define the required configuration for your Spark jobs. In the same Advanced options section, click the Spark tab and copy the following line into the Spark config field.

    spark.plugins com.xonai.spark.SQLPlugin

On completion it should look like this:

../_images/dbx-xonai-spark-conf.png

Your cluster is now configured to run the Xonai Accelerator in your Spark jobs.
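Once the cluster is running, a quick sanity check is to confirm that the init script actually copied the plugin JAR onto the cluster nodes. This is an informal check of our own, not an official diagnostic:

```shell
# Run from a %sh notebook cell attached to the configured cluster.
# The init script copies the plugin JAR into Spark's classpath directory,
# so a matching file should appear here.
ls /databricks/jars/ | grep -i xonai
```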

Activation on Jobs Compute

The first step is to define the Xonai init script you created:

  1. Navigate to Job Runs > Jobs from the left navigation panel, in the Data Engineering section.

  2. Click the job for which you want to activate the Xonai Accelerator.

  3. Click Compute > Configure in the right panel.

  4. Click Advanced options to expand the section, then click the Init Scripts tab.

  5. Select Volume as the Source and select the Xonai init script you created.

On completion it should look like this:

../_images/dbx-xonai-init-script.png

The second step is to define the required configuration for your Spark jobs. In the same Advanced options section, click the Spark tab and copy the following line into the Spark config field.

    spark.plugins com.xonai.spark.SQLPlugin

On completion it should look like this:

../_images/dbx-xonai-spark-conf.png

Your cluster is now configured to run the Xonai Accelerator in your Spark jobs.


Last update: May 21, 2025