
class DatabricksDetails(
    host: StrictStr,
    http_path: StrictStr,
    featurebyte_catalog: StrictStr,
    featurebyte_schema: StrictStr,
    storage_type: StorageType,
    storage_url: str,
    storage_spark_url: StrictStr,
)


Model for details used to connect to a Databricks data source.


  • host: StrictStr
Databricks host. This is typically the URL you use to access your Databricks environment.

  • http_path: StrictStr
HTTP path of the Databricks compute resource (for example, a cluster or SQL warehouse).

  • featurebyte_catalog: StrictStr
Name of the catalog that holds metadata about the actual data. This is commonly set to hive_metastore.

  • featurebyte_schema: StrictStr
    The name of the schema containing the tables and columns.

  • storage_type: StorageType
Type of storage used to persist the feature store.

  • storage_url: str
URL of the storage location to which custom UDFs are uploaded.

  • storage_spark_url: StrictStr
URL of the location from which the data is read. Note that this points to the same physical location as storage_url; however, the warehouse accepts different URL formats for the read and write paths, so two separate fields are required.


>>> details = fb.DatabricksDetails(
...   host="<host_name>",
...   http_path="<http_path>",
...   featurebyte_catalog="hive_metastore",
...   featurebyte_schema="<schema_name>",
...   storage_type=fb.StorageType.S3,
...   storage_url="<url>",
...   storage_spark_url="dbfs:/FileStore/<schema_name>",
... )
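
Once constructed, the details object is typically passed along when registering a feature store. The snippet below is a sketch only: the FeatureStore.create call, its parameters, and the store name are assumptions based on common FeatureByte usage, not taken from this page.

>>> feature_store = fb.FeatureStore.create(  # doctest: +SKIP
...   name="my_databricks_store",  # hypothetical store name
...   source_type=fb.SourceType.DATABRICKS,
...   details=details,
... )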