Loading Snowflake tables

To mount a schema from a Snowflake database into the Splitgraph engine using SqlAlchemyFdw, run:

$ sgr mount snowflake local_schema -o@- <<EOF
{
    "username": "username",
    "secret": {
        "secret_type": "password",
        "password": "password"
    },
    "account": "acc-id.west-europe.azure",
    "database": "SNOWFLAKE_SAMPLE_DATA",
    "schema": "TPCH_SF100",
    "envvars": {"HTTPS_PROXY": "http://proxy.company.com"}
}
EOF

This mounts the remote Snowflake schema TPCH_SF100 from the SNOWFLAKE_SAMPLE_DATA database into the schema local_schema on the local Splitgraph engine.
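Once mounted, the tables in local_schema can be queried through the engine like ordinary PostgreSQL tables. As a sketch, assuming the mount above succeeded (the table and column names below come from the TPCH_SF100 sample dataset):

```shell
# Query the mounted Snowflake table through the local engine.
# Each query is proxied live to Snowflake by the foreign data wrapper.
$ sgr sql "SELECT c_custkey, c_name FROM local_schema.customer LIMIT 5"
```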

Alternatively, you can mount specific tables, optionally defining them with an explicit schema and a subquery:

$ sgr mount snowflake test_snowflake_subquery -o@- <<EOF
{
    "username": "username",
    "secret": {
        "secret_type": "private_key",
        "private_key": "MIIEvQIBAD..."
    },
    "account": "acc-id.west-europe.azure",
    "database": "SNOWFLAKE_SAMPLE_DATA",
    "tables": {
        "balances": {
            "schema": {
                "nation": "varchar",
                "segment": "varchar",
                "avg_balance": "numeric"
            },
            "options": {
                "subquery": "SELECT n_name AS nation, c_mktsegment AS segment, AVG(c_acctbal) AS avg_balance FROM TPCH_SF100.customer c JOIN TPCH_SF100.nation n ON c_nationkey = n_nationkey GROUP BY n_name, c_mktsegment"
            }
        }
    }
}
EOF

Mounted tables are queried live over the foreign data wrapper and are not stored locally. To snapshot a mounted table, making it part of an actual Splitgraph image, use the sgr import command.
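As a sketch, assuming the mount from the first example and a hypothetical target repository my/repo:

```shell
# Create a repository to hold the snapshot (my/repo is a hypothetical name).
$ sgr init my/repo

# Import the mounted customer table into an image in my/repo,
# copying a point-in-time snapshot of the Snowflake data.
$ sgr import local_schema customer my/repo
```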