
Snowflake

Install

See the Install guide for the full setup, including Windows PowerShell.

```bash
curl -fsSL https://install.skippr.io/install.sh | sh
```

Loads data directly into Snowflake tables.

For a full end-to-end walkthrough, see the Snowflake Quick Start.

Configuration

```yaml
warehouse:
  kind: snowflake
  account: ${SNOWFLAKE_ACCOUNT}
  user: ${SNOWFLAKE_USER}
  private_key_path: ${SNOWFLAKE_PRIVATE_KEY_PATH}
  database: ANALYTICS
  schema: RAW
  warehouse: COMPUTE_WH
  role: ACCOUNTADMIN
  stage: "@skippr_stage"
  staging_uri: "s3://my-bucket/skippr-staging"
  staging_storage_integration: "SKIPPR_S3_INT"
```
| Field | Default | Description |
| --- | --- | --- |
| account | | Snowflake account identifier |
| user | | Snowflake login or service user |
| password | | Password authentication value |
| private_key_path | | Path to a private key for key-pair auth |
| database | (required) | Snowflake database |
| schema | (required) | Target schema |
| warehouse | | Compute warehouse |
| role | | Snowflake role |
| stage | | Optional Snowflake stage name to use for uploads |
| staging_uri | | Optional external staging URI (s3://, azure://, or gcs://) |
| staging_storage_integration | | Optional Snowflake storage integration for external staging |
| staging_azure_sas_token | | Azure SAS token for external Azure staging uploads |
| staging_azure_account_key | | Azure account key for external Azure staging uploads |
| staging_gcs_service_account_key_path | | Path to a GCS service account key for external GCS staging uploads |
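
For password authentication (only viable when MFA is not enforced on the account), a minimal variant might look like the fragment below; this is a sketch based on the field table above, with database and schema as the only required fields beyond the connection values:

```yaml
warehouse:
  kind: snowflake
  account: ${SNOWFLAKE_ACCOUNT}
  user: ${SNOWFLAKE_USER}
  password: ${SNOWFLAKE_PASSWORD}
  database: ANALYTICS
  schema: RAW
```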

CLI

```bash
skippr connect warehouse snowflake \
  --account ${SNOWFLAKE_ACCOUNT} \
  --user ${SNOWFLAKE_USER} \
  --private-key-path ${SNOWFLAKE_PRIVATE_KEY_PATH} \
  --database ANALYTICS \
  --schema RAW \
  --warehouse COMPUTE_WH \
  --role ACCOUNTADMIN \
  --stage @skippr_stage \
  --staging-uri s3://my-bucket/skippr-staging \
  --staging-storage-integration SKIPPR_S3_INT
```

Or run without flags to be prompted interactively.

| Flag | Description |
| --- | --- |
| --account | Snowflake account identifier |
| --user | Login or service user |
| --password | Password auth value |
| --private-key-path | Path to a private key for key-pair auth |
| --database | Snowflake database name |
| --schema | Bronze/raw schema where extracted data lands |
| --warehouse | Compute warehouse |
| --role | Role with appropriate grants |
| --stage | Optional Snowflake stage name to use for uploads |
| --staging-uri | Optional external staging URI (s3://, azure://, or gcs://) |
| --staging-storage-integration | Optional Snowflake storage integration for external staging |
| --staging-azure-sas-token | Azure SAS token for external Azure staging uploads |
| --staging-azure-account-key | Azure account key for external Azure staging uploads |
| --staging-gcs-service-account-key-path | Path to a GCS service account key for external GCS staging uploads |

Config output

Running connect warehouse snowflake writes the following to skippr.yaml:

```yaml
warehouse:
  kind: snowflake
  account: ${SNOWFLAKE_ACCOUNT}
  user: ${SNOWFLAKE_USER}
  private_key_path: ${SNOWFLAKE_PRIVATE_KEY_PATH}
  database: ANALYTICS
  schema: RAW
  warehouse: COMPUTE_WH
  role: ACCOUNTADMIN
  stage: "@skippr_stage"
  staging_uri: "s3://my-bucket/skippr-staging"
  staging_storage_integration: "SKIPPR_S3_INT"
```

Authentication

account, user, password, and private_key_path are part of the public warehouse.snowflake config surface, but environment variable interpolation is the recommended way to supply them so that secrets never live in skippr.yaml.

| Variable | Description |
| --- | --- |
| SNOWFLAKE_ACCOUNT | Account identifier (e.g. MYORG-MYACCOUNT or xy12345.us-east-1) |
| SNOWFLAKE_USER | Login username (or service account name) |
| SNOWFLAKE_PRIVATE_KEY_PATH | Path to .p8 private key file (key-pair auth, recommended) |
| SNOWFLAKE_PASSWORD | Password (only when MFA is not enforced) |

Key-pair auth is recommended and required when MFA is enabled on the Snowflake account. It works with both regular user accounts and service accounts.

Install OpenSSL (Windows)

OpenSSL is pre-installed on macOS and most Linux distributions. On Windows, install it first:

```powershell
winget install OpenSSL
```

Restart your terminal after installing so the openssl command is available.

Generate the key pair

```bash
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out snowflake_key.p8 -nocrypt
openssl rsa -in snowflake_key.p8 -pubout -out snowflake_key.pub
```
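
The value passed to ALTER USER below is the base64 body of the public key file, without the PEM header and footer lines. A minimal sketch of extracting it, using an inline stand-in key file so the snippet is self-contained (with a real key, point the grep at snowflake_key.pub instead):

```bash
# Create a stand-in public key file so this snippet runs on its own;
# with a real key, skip this step and use snowflake_key.pub directly.
cat > sample_key.pub <<'EOF'
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqh
AAAA
-----END PUBLIC KEY-----
EOF

# Strip the PEM header/footer and join the base64 lines into one string,
# ready to paste into ALTER USER ... RSA_PUBLIC_KEY='...'.
KEY_BODY=$(grep -v -- '-----' sample_key.pub | tr -d '\n')
echo "$KEY_BODY"
```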

Assign the public key

For a regular user account:

```sql
ALTER USER myuser SET RSA_PUBLIC_KEY='MIIBIjANBgkqh...';
```

For a service account (see below):

```sql
ALTER USER skippr_svc SET RSA_PUBLIC_KEY='MIIBIjANBgkqh...';
```

Set the environment variable

```bash
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/snowflake_key.p8"
```

```powershell
$env:SNOWFLAKE_PRIVATE_KEY_PATH = "C:\path\to\snowflake_key.p8"
```

Service account authentication

A Snowflake service account is a dedicated, non-interactive user designed for automated workloads like Skippr. Using a service account means:

  • No impact on personal accounts — the service user is separate from human users, so MFA policies, password rotations, and account lockouts don't interrupt pipelines.
  • Least-privilege by default — grant only the specific roles and warehouses Skippr needs.
  • Audit clarity — all Skippr activity appears under a distinct user in Snowflake's query history.

Create the service account

```sql
CREATE USER skippr_svc
  TYPE = SERVICE
  DEFAULT_ROLE = SKIPPR_ROLE
  DEFAULT_WAREHOUSE = COMPUTE_WH
  RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

CREATE ROLE SKIPPR_ROLE;
GRANT ROLE SKIPPR_ROLE TO USER skippr_svc;
```

Then grant the required privileges to SKIPPR_ROLE.

Environment variables

```bash
export SNOWFLAKE_ACCOUNT="MYORG-MYACCOUNT"
export SNOWFLAKE_USER="skippr_svc"
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/snowflake_key.p8"
```

Everything else works the same — skippr connect warehouse snowflake, skippr run, etc.

Permissions or Network Requirements

The role needs:

  • USAGE on the warehouse
  • USAGE on the database
  • USAGE and CREATE TABLE on the raw schema
  • CREATE SCHEMA on the database (for silver/gold schema creation)

Example grants for a dedicated role:

```sql
GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE SKIPPR_ROLE;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE SKIPPR_ROLE;
GRANT USAGE ON SCHEMA ANALYTICS.RAW TO ROLE SKIPPR_ROLE;
GRANT CREATE TABLE ON SCHEMA ANALYTICS.RAW TO ROLE SKIPPR_ROLE;
GRANT CREATE SCHEMA ON DATABASE ANALYTICS TO ROLE SKIPPR_ROLE;
```
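
To confirm the grants landed on the role, Snowflake's standard introspection command can be used:

```sql
SHOW GRANTS TO ROLE SKIPPR_ROLE;
```

The output should list each of the USAGE, CREATE TABLE, and CREATE SCHEMA privileges above.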

Staging

Skippr uses Snowflake's internal staging flow by default: when Snowflake returns temporary stage credentials from PUT, Skippr uploads to the backing object store automatically, with no extra configuration, for Snowflake accounts hosted on:

  • AWS S3
  • Azure Blob Storage
  • Google Cloud Storage

That means S3-, Azure-, and GCS-backed Snowflake accounts do not need separate Skippr storage credentials or extra connector flags for the normal internal-stage path.

If you want to override that behavior, Skippr exposes the Snowflake staging controls directly in warehouse.snowflake:

  • stage
  • staging_uri
  • staging_storage_integration
  • staging_azure_sas_token
  • staging_azure_account_key
  • staging_gcs_service_account_key_path

Use environment interpolation for any credential-bearing values:

yaml
warehouse:
  kind: snowflake
  stage: "@skippr_stage"
  staging_uri: "azure://myaccount.blob.core.windows.net/mycontainer/skippr-staging"
  staging_storage_integration: "SKIPPR_AZURE_INT"
  staging_azure_sas_token: ${AZURE_STORAGE_SAS_TOKEN}

staging_uri supports:

  • s3://my-bucket/skippr-staging
  • azure://myaccount.blob.core.windows.net/mycontainer/skippr-staging
  • gcs://my-bucket/skippr-staging

Provider notes:

  • S3 external staging uses the standard AWS credential chain for uploads. staging_storage_integration is optional.
  • Azure external staging uses staging_azure_sas_token or staging_azure_account_key for uploads. Without staging_storage_integration, Snowflake still needs a SAS token for COPY INTO.
  • GCS external staging uses staging_gcs_service_account_key_path for uploads, and staging_storage_integration is required because Snowflake does not support inline GCS credentials for COPY INTO.
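
Applying those provider notes to GCS, a configuration fragment might look like the following sketch; SKIPPR_GCS_INT is a placeholder integration name, and the key path is interpolated from GOOGLE_APPLICATION_CREDENTIALS as suggested below:

```yaml
warehouse:
  kind: snowflake
  staging_uri: "gcs://my-bucket/skippr-staging"
  staging_storage_integration: "SKIPPR_GCS_INT"  # required for GCS
  staging_gcs_service_account_key_path: ${GOOGLE_APPLICATION_CREDENTIALS}
```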

Equivalent environment variables are also supported for the shared staging controls:

  • SNOWFLAKE_STAGE
  • SNOWFLAKE_STAGING_URI
  • SNOWFLAKE_STAGING_STORAGE_INTEGRATION

For cloud upload credentials, the most portable env vars are:

  • AWS: standard AWS credential chain
  • Azure: AZURE_STORAGE_SAS_TOKEN or AZURE_STORAGE_ACCOUNT_KEY
  • GCS: GOOGLE_APPLICATION_CREDENTIALS

Troubleshooting

| Symptom | Fix |
| --- | --- |
| Failed to connect: 250001 | Check the SNOWFLAKE_ACCOUNT format: use the org-account form (e.g. MYORG-MYACCOUNT) or include the region (e.g. xy12345.us-east-1) |
| Incorrect username or password | Verify SNOWFLAKE_USER and auth env vars |
| Insufficient privileges | Ensure the role has the grants listed above |
| 390197: Multi-factor authentication is required | Switch to key-pair auth; password auth cannot work when MFA is enforced |
| openssl: command not found | Install OpenSSL (on Windows: winget install OpenSSL), then restart your terminal |
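
Since the 250001 error is almost always an account-identifier formatting problem, a quick local sanity check for the two identifier shapes mentioned above can help; this helper is purely illustrative and not part of the Skippr CLI:

```bash
# Distinguish the two common SNOWFLAKE_ACCOUNT shapes.
# Rough illustration only, not an official validator.
check_account() {
  case "$1" in
    *.*) echo "locator.region form" ;;  # e.g. xy12345.us-east-1
    *-*) echo "org-account form" ;;     # e.g. MYORG-MYACCOUNT
    *)   echo "unrecognized; double-check SNOWFLAKE_ACCOUNT" ;;
  esac
}

check_account "MYORG-MYACCOUNT"
check_account "xy12345.us-east-1"
```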

Further reading

CDC Support

Snowflake supports CDC with exactly-once final-state MERGE semantics. Skippr automatically creates _skippr_order_token columns and tombstone tables.

See CDC Destinations -- Snowflake for details.

Next steps

See the Install guide for the full setup, including Windows PowerShell.