Snowflake CREATE Warehouse
How to script a consistent and automated “create warehouse snowflake” environment
Prior to loading data, Snowflake should be configured with some baselines. In this post, we walk through some “lite” Snowflake data warehouse training using the WORKSHEET
feature. This is an easy-to-understand starting point in anyone’s journey to learn Snowflake data warehouse setup and operations.
If you are wondering why use Snowflake data warehouse, check out that link for the benefits of the platform. For now, we are assuming you are sold and are ready to give it a try.
Let's get started!
Step 1: Create a Snowflake account
If you do not already have an account, Snowflake offers a free trial. Sign up for an account.
If you have an account already, log into it.
Step 2: Locate the WORKSHEET Feature In Your Snowflake Account
If you log in via a web browser, click the WORKSHEET icon at the top of the page.
Important: Ensure that the role for the session is set to ACCOUNTADMIN to complete this process.
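If you are unsure which role the session is currently using, a quick check you can run in the worksheet (a minimal sketch; your user must already have been granted ACCOUNTADMIN for the first statement to succeed):

```sql
-- switch the session to the ACCOUNTADMIN role
use role accountadmin;

-- confirm the active role for the session
select current_role();
```

If the query returns ACCOUNTADMIN, you are ready to proceed.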
Step 3: Prepare Your Automated Commands
Your Snowflake environment will use the automated setup SQL command we detail later in this post. However, there are a few edits you will need to make.
First, make sure you change the user_name, user_password, and any other variables as needed in the commands where you see CHANGEME:
role_name = CHANGEME_ROLE;
user_name = 'CHANGEME';
user_password = 'CHANGEME';
warehouse_name = 'CHANGEME_WAREHOUSE';
database_name = 'CHANGEME_DATABASE';
schema_name = 'CHANGEME_SCHEMA';
For example, you might switch CHANGEME to something that reflects your team or a project name:
role_name = MARKETING_ROLE;
user_name = 'datasc!encet3amalpha';
user_password = '@saN3m!toD!3411tTyUU912';
warehouse_name = 'MARKETING_WAREHOUSE';
database_name = 'MARKETING_DATABASE';
schema_name = 'MARKETING_SCHEMA';
Ultimately, these values are up to you to supply in accordance with allowed Snowflake values.
Step 4: Configure The Automation Script
Here is the script to paste into your Snowflake WORKSHEET space. Just make sure you adjust all of the variables previously detailed.
/* You can set up access to Snowflake in a few simple steps */

-- set variables (these need to be uppercase)
set openbridge_role = 'OPENBRIDGE_ROLE';
set openbridge_username = 'OPENBRIDGE_USER';
set openbridge_warehouse = 'OPENBRIDGE_WAREHOUSE';
set openbridge_database = 'OPENBRIDGE_DATABASE';
set openbridge_schema = 'OPENBRIDGE_SCHEMA';

-- set user password
set openbridge_password = 'password';

begin;

-- create openbridge role
use role securityadmin;
create role if not exists identifier($openbridge_role);
grant role identifier($openbridge_role) to role SYSADMIN;

-- create openbridge user
create user if not exists identifier($openbridge_username)
  password = $openbridge_password
  default_role = $openbridge_role
  default_warehouse = $openbridge_warehouse;
grant role identifier($openbridge_role) to user identifier($openbridge_username);

-- change role to sysadmin for warehouse / database steps
use role sysadmin;

-- create openbridge warehouse
create warehouse if not exists identifier($openbridge_warehouse)
  warehouse_size = xsmall
  warehouse_type = standard
  auto_suspend = 60
  auto_resume = true
  initially_suspended = true;

-- create openbridge database
create database if not exists identifier($openbridge_database);

-- grant openbridge warehouse access
grant USAGE
  on warehouse identifier($openbridge_warehouse)
  to role identifier($openbridge_role);

-- grant openbridge database access
grant OWNERSHIP
  on database identifier($openbridge_database)
  to role identifier($openbridge_role);

commit;

begin;

USE DATABASE identifier($openbridge_database);

-- create schema for openbridge data
CREATE SCHEMA IF NOT EXISTS identifier($openbridge_schema);

commit;

begin;

-- grant openbridge schema access
grant OWNERSHIP
  on schema identifier($openbridge_schema)
  to role identifier($openbridge_role);

commit;
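The script creates an XSMALL warehouse that auto-suspends after 60 seconds of inactivity. If those defaults turn out to be wrong for your workload, you can adjust the warehouse later without recreating it. A sketch, assuming the OPENBRIDGE_WAREHOUSE name from the script:

```sql
-- resize the warehouse and relax the auto-suspend window
use role sysadmin;
alter warehouse OPENBRIDGE_WAREHOUSE set
  warehouse_size = 'SMALL'
  auto_suspend = 300;
```

Because billing is per-second while a warehouse runs, a short auto_suspend keeps idle costs down; raise it only if frequent cold starts become a problem.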
Step 5: Optional Network Policy Whitelisting
Network policies provide options for managing network access to your Snowflake service.
A network policy restricts access to your account based on user IP address: you can create an IP allowed list and, if desired, an IP blocked list.
Here is a snippet to create a whitelist (allowed):
/*
If you want to use a network policy or modify an existing one, uncomment the one (create or alter) that aligns with your needs.
This example shows whitelisting two IP addresses: 1.1.1.1/32, 2.2.2.2/32
Please adjust these to correspond to the IP addresses you want to whitelist.
*/
-- create network policy openbridge_ip_whitelist ALLOWED_IP_LIST = ('1.1.1.1/32', '2.2.2.2/32');
-- alter network policy <your_current_network_policy_name> SET ALLOWED_IP_LIST = ('1.1.1.1/32', '2.2.2.2/32');
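Note that a network policy has no effect until it is activated. As a sketch, assuming the openbridge_ip_whitelist policy name above, you can attach it at the account level or to a single user:

```sql
-- activate the policy for the whole account (requires ACCOUNTADMIN or SECURITYADMIN)
use role accountadmin;
alter account set network_policy = 'OPENBRIDGE_IP_WHITELIST';

-- or scope it to a single user instead
alter user OPENBRIDGE_USER set network_policy = 'OPENBRIDGE_IP_WHITELIST';
```

Be careful with account-level policies: make sure your own current IP address is in the allowed list before activating, or you can lock yourself out.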
Step 6: RUN Your Snowflake Worksheet
With your edits in place, you can paste the content into the Snowflake Worksheet. Make sure you have selected All Queries when you start your RUN. You should see a “success” output from Snowflake. If not, check your inputs in the script.
As a sanity check, review that the Warehouse, Database, Schema, and other elements are present and align with your configuration.
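One way to run that sanity check directly in the worksheet, assuming the OPENBRIDGE names from the script:

```sql
-- confirm the objects exist
show warehouses like 'OPENBRIDGE_WAREHOUSE';
show databases like 'OPENBRIDGE_DATABASE';
show schemas like 'OPENBRIDGE_SCHEMA' in database OPENBRIDGE_DATABASE;

-- confirm the role received the expected privileges
show grants to role OPENBRIDGE_ROLE;
```

Each SHOW statement should return exactly one row for the object you created; the grants output should include USAGE on the warehouse and OWNERSHIP on the database and schema.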
Snowflake is ready to receive data!
If all looks good, you have now configured a Snowflake environment.
The process demonstrated how you can start to automate a base Snowflake configuration. For example, you can reuse parts of this process in a WORKSHEET as needed by modifying the script to separately create users, roles, whitelists, and so forth.
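For example, a trimmed-down worksheet that only adds a new user to the existing role might look like this (the ANALYST_USER name and the password are placeholders to replace with your own values):

```sql
use role securityadmin;

-- create an additional user and attach it to the existing role
create user if not exists ANALYST_USER
  password = 'CHANGEME'
  default_role = OPENBRIDGE_ROLE
  default_warehouse = OPENBRIDGE_WAREHOUSE;

grant role OPENBRIDGE_ROLE to user ANALYST_USER;
```

Because the role already carries the warehouse, database, and schema privileges, granting it to a new user is all that is needed for that user to work with the environment.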
Good luck, and enjoy using your Snowflake cloud data warehouse.
About Openbridge: Make data work for you
Unify data in trusted, private industry-leading data lake or cloud warehouses like Amazon Redshift, Amazon Redshift Spectrum, Google BigQuery, Snowflake, Azure Data Lake, Ahana, and Amazon Athena. Data is always fully owned by you.
Take control, and put your data to work with your favorite analytics tools. Explore, analyze, and visualize data to deliver faster innovation while avoiding vendor lock-in using tools like Google Data Studio, Tableau, Microsoft Power BI, Looker, Amazon QuickSight, SAP, Alteryx, dbt, Azure Data Factory, Qlik Sense, and many others.