Snowflake

Snowflake is a popular managed data warehouse on AWS, Azure, and GCP.

In the UI for the Snowflake connector, the following options are required to create the connector:

  • url: the hostname for your account, i.e. your account identifier followed by .snowflakecomputing.com.
  • user: login name for the Snowflake user.
  • password: password of the Snowflake user. (required if token is not set)
  • token: OAuth token that can be used to access snowflake. (required if password is not set)
  • database: the database to use for the session after connecting.
  • schema: the schema to use for the session after connecting.

The remaining options are optional:

  • warehouse: the default virtual warehouse to use for the session after connecting.
  • role: the default security role to use for the session after connecting.
  • table: the table to read data from or write data to.

Additional Snowflake options can be added as a list of key-value pairs in sfOptions.
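As an illustration of how these options are typically consumed, here is a minimal sketch using the Spark Snowflake connector. The key names (sfURL, sfUser, and so on) follow that connector's sfOptions convention; the account, user, database, and table values below are placeholders, not values from this document:

```python
# Hypothetical connection options for the Spark Snowflake connector.
# Key names (sfURL, sfUser, ...) follow the connector's sfOptions convention;
# all values are placeholders.
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",  # account identifier + domain
    "sfUser": "HOPSWORKS_USER",
    "sfPassword": "********",        # or an OAuth token, see below
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",     # optional: default virtual warehouse
    "sfRole": "SYSADMIN",            # optional: default security role
}

def read_snowflake_table(spark, options, table):
    """Read a Snowflake table into a Spark DataFrame."""
    return (spark.read
            .format("net.snowflake.spark.snowflake")  # connector source name
            .options(**options)
            .option("dbtable", table)   # the table option from the list above
            .load())
```

The function takes an existing SparkSession, so it can be used unchanged in a Hopsworks notebook once the driver jars are attached.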

There are two options available for authentication. The first option is to configure a username and a password. The second option is to use an OAuth token. See Configure Snowflake OAuth for instructions on how to configure OAuth support for Snowflake, and Using External OAuth for how to use External OAuth to authenticate to Snowflake.
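With token-based authentication, the password is replaced by an OAuth token. A minimal sketch of the sfOptions variant, assuming the Spark Snowflake connector's sfAuthenticator and sfToken options (the token and account values are placeholders):

```python
# Hypothetical OAuth variant: sfPassword is replaced by an access token.
sf_options_oauth = {
    "sfURL": "myaccount.snowflakecomputing.com",  # placeholder account
    "sfUser": "HOPSWORKS_USER",
    "sfAuthenticator": "oauth",          # switch the connector to OAuth
    "sfToken": "<oauth-access-token>",   # obtained from your OAuth provider
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
}
```

Note that exactly one of password or token should be set, matching the required options listed above.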

As for the database driver, the library to interact with Snowflake is not included in Hopsworks - you need to upload the driver yourself. First, download the JDBC driver and, to use Snowflake as a data source in Spark, the Snowflake Spark connector.

Upload the JDBC driver and Snowflake Spark connector to Hopsworks.
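For illustration, both artifacts can be fetched from Maven Central before uploading them to Hopsworks. The version numbers and Scala suffix below are placeholders; pick versions that match your Spark installation:

```shell
# Versions are placeholders; choose ones matching your Spark/Scala setup.
JDBC_VERSION=3.13.30
CONNECTOR_VERSION=2.11.0-spark_3.3
SCALA_VERSION=2.12

# Snowflake JDBC driver from Maven Central
wget "https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/${JDBC_VERSION}/snowflake-jdbc-${JDBC_VERSION}.jar"

# Snowflake Spark connector from Maven Central (artifact name encodes
# the Scala and Spark versions)
wget "https://repo1.maven.org/maven2/net/snowflake/spark-snowflake_${SCALA_VERSION}/${CONNECTOR_VERSION}/spark-snowflake_${SCALA_VERSION}-${CONNECTOR_VERSION}.jar"
```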

Then, add the files to your notebook or job before launching it, as shown in the screenshots below.

When you start a Jupyter notebook for Snowflake, you need to attach both the JDBC driver and the Snowflake Spark connector so they can be accessed in your programs.
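If you configure this programmatically rather than through the UI, one generic approach is to pass the uploaded jars to Spark via the spark.jars property. The jar file names and the HDFS paths below are hypothetical placeholders for wherever you uploaded the files:

```python
# Hypothetical sketch: build the spark.jars value from the two uploaded jars.
# Paths and file names are placeholders.
jar_paths = [
    "hdfs:///Projects/myproject/Resources/snowflake-jdbc-3.13.30.jar",
    "hdfs:///Projects/myproject/Resources/spark-snowflake_2.12-2.11.0-spark_3.3.jar",
]
spark_jars = ",".join(jar_paths)  # spark.jars expects a comma-separated list

# When building the session yourself (requires pyspark):
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .config("spark.jars", spark_jars)
#          .getOrCreate())
```

In the Hopsworks UI, attaching the files when launching the notebook achieves the same effect without touching Spark configuration directly.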