Colin Bradford | Mar 6, 2024 | 2 min read

Connecting to Snowflake from a Snowpark Container Services (SPCS) container


Many workloads running as containers in Snowpark Container Services (SPCS) will want to connect back to Snowflake to interact with data stored there. You can use any of the supported Snowflake clients from within the container. However, there are several prerequisites for setting up this access (a minimal sketch of the credentials-based route follows the list):

  • Credentials (username/password or private key) must be injected into the container. Snowflake provides secrets for this purpose.
  • An external access integration is required to allow outbound network connections from the container. See Creating and using an external access integration | Snowflake Documentation.
  • Snowflake must allow incoming connections from SPCS. The address ranges of SPCS are not yet well defined, so this allow-range in a network policy has to be large.
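
A minimal sketch of the credentials-based route, assuming the username and password have been injected into the container as environment variables (SNOWFLAKE_USER and SNOWFLAKE_PASSWORD are hypothetical names here, populated for example from Snowflake secrets in the service specification):

import os

import snowflake.connector

# Hypothetical example: SNOWFLAKE_USER and SNOWFLAKE_PASSWORD are assumed to
# be injected from Snowflake secrets; they are not predefined SPCS variables.
conn = snowflake.connector.connect(
    account=os.getenv('SNOWFLAKE_ACCOUNT'),
    user=os.getenv('SNOWFLAKE_USER'),
    password=os.getenv('SNOWFLAKE_PASSWORD'),
)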

To simplify access, Snowpark Container Services provides a token to the container in the file /snowflake/session/token that can be used for authentication. This token has many benefits:

  • The token is automatically provisioned. No user interaction is required, so users do not have to handle sensitive credential information.
  • The token can only be used within the container. If the token leaks, it cannot be used externally (a detection sketch that builds on this follows the list).
  • The token lifetime is 10 minutes, reducing the impact of a credential leak. The refresh is automatic and does not have to be configured or managed by a user.
  • The token supports internal, private connections to Snowflake using the connection parameter host. Using the host parameter forces the connection to stay internal to Snowflake. No data is sent to the internet to connect to Snowflake.
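
Because the token file exists only inside an SPCS container, checking for its presence is a convenient way for shared code to decide between token authentication and local credentials. A small sketch (a hypothetical convenience helper, not part of any Snowflake API):

import os

def running_in_spcs() -> bool:
    # The session token file is only provisioned inside SPCS containers, so
    # its presence is a reliable signal for choosing an authentication method.
    return os.path.exists('/snowflake/session/token')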

The host connection parameter is particularly important. Using host means that the connection resolves to a private IP address and therefore never reaches a public endpoint on the internet. Further, the private endpoint is not controlled by network policies, which avoids having to open up large IP address ranges in a network policy to allow access from SPCS containers to Snowflake.

Using a token instead of a username/password in a client library such as the Snowflake Connector for Python is straightforward. The sample code below shows making a connection:

import os

import snowflake.connector

def get_login_token():
    # Read the token fresh each time; SPCS rotates it roughly every 10 minutes.
    with open('/snowflake/session/token', 'r') as f:
        return f.read()

conn = snowflake.connector.connect(
    host=os.getenv('SNOWFLAKE_HOST'),
    account=os.getenv('SNOWFLAKE_ACCOUNT'),
    token=get_login_token(),
    authenticator='oauth',
    database=os.getenv('SNOWFLAKE_DATABASE'),
    schema=os.getenv('SNOWFLAKE_SCHEMA')
)
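
Once connected, the object behaves like any other Snowflake Connector for Python connection. A quick smoke test (a hypothetical usage example, not part of the original setup) might look like this:

cur = conn.cursor()
try:
    # CURRENT_USER/ROLE/DATABASE confirm which context the token resolved to.
    cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE(), CURRENT_DATABASE()")
    print(cur.fetchone())
finally:
    cur.close()
    conn.close()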


In addition to the token and host parameters discussed above, the connection uses the predefined environment variables SNOWFLAKE_ACCOUNT, SNOWFLAKE_DATABASE, and SNOWFLAKE_SCHEMA. Their values default to the account, database, and schema in which the container's Snowpark Container Services image registry resides.
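
os.getenv returns None for any variable that is unset, so a misconfigured service would otherwise fail with a cryptic connection error. A small defensive sketch (an illustrative addition, not part of the SPCS setup):

import os

# Fail fast if any expected SPCS-provided variable is missing, so a
# misconfigured service surfaces a clear error instead of a cryptic one.
for var in ('SNOWFLAKE_HOST', 'SNOWFLAKE_ACCOUNT',
            'SNOWFLAKE_DATABASE', 'SNOWFLAKE_SCHEMA'):
    if not os.getenv(var):
        raise RuntimeError(f'missing required environment variable: {var}')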

Currently, the connection will have the same role as the one that created the service, but this may change in the future.

In summary, using the Snowflake-provided token to authenticate a connection to Snowflake from an SPCS container makes configuration more straightforward and provides a more secure connection.

Colin Bradford

Lead Architect, DataOps.live
