This article was co-written by
Vinodh Kumar Govindarajan & Sandeep Manivannan.
Global organizations often face steep regulatory and architectural roadblocks when sharing data between standard Snowflake accounts (in AWS, Azure, or GCP regions) and China-based Snowflake accounts.
Unlike classic Snowflake-to-Snowflake sharing, direct sharing across this boundary simply isn’t possible due to the China region’s separation—operated independently by Digital China Cloud Technology Limited (DCC)—and stringent cross-border data restrictions.
This blog pinpoints what’s not possible out-of-the-box, then guides you through two practical, tested workarounds for exchanging data between global and China Snowflake deployments.
These are not native solutions but real-world approaches based on AWS S3 transfers and external tables.
Ultimately, we’ll help you choose the best fit for your scenario.
What’s Not Supported: Snowflake Limitations
Before exploring the workarounds, it’s vital to understand what can’t be done today:
Direct Data Sharing / Listings: Not possible between global and China Snowflake accounts.
Replication / Failover: No cross-region replication or failover between China and non-China accounts.
External Stages: China Snowflake cannot load/unload data into external stages located outside of China.
These realities mean that cross-border data exchange requires extra architecture and processes. With native tools blocked, the only option is to use cloud storage as an intermediary and build around manual steps and service restrictions.
How Snowflake Data Exchange Works in China: Operated Separately
China Snowflake is independent. It runs on a separate platform with a unique domain (snowflakecomputing.cn), creating hard boundaries.
Accounts require special setup: Customers must go through DCC for account creation; US/global accounts and China accounts can’t access each other’s resources or leverage standard region features.
Some global Snowflake features are not available in China.
How to Migrate Data Between a US-Based Snowflake Account and a China-Based Snowflake Account on AWS
How did we do it? We ran a couple of POCs in partnership with the Snowflake team to identify the best approach. In this blog, we will discuss the two approaches that proved best suited to this use case.
Practical Approaches for Bi-Directional Data Sharing
Overview
Given these constraints, phData and Snowflake recommend two proven workarounds:
Approach 1: AWS Data Transfer Hub Automation
Approach 2: Direct S3/External Table Sync with Manual Intervention
Below, each approach is explained, including when to use it, its setup steps, automation caveats, and tradeoffs.
| Approach | US → China | China → US | Automation | Caveats |
| --- | --- | --- | --- | --- |
| 1. AWS Data Transfer Hub + S3 | Supported | Supported | Auto-refresh | Requires Data Transfer Hub |
| 2. Direct S3/External Table | Auto-refresh US→CN | Manual refresh CN→US | Semi/Manual | US external table requires manual refresh; use China S3 |
Approach 1: Automated Sync via AWS Data Transfer Hub
When to Use: Ideally, if you want automated transfer/sync and are okay with the complexity of the AWS Data Transfer Hub.
Steps:
Source Table Prep: Create (or update) your table in the global (US) Snowflake account.
Export to S3: Copy the table data to an S3 bucket in your US-based cloud region.
S3 Sync via Data Transfer Hub: Use AWS Data Transfer Hub to replicate/synchronize S3 bucket contents from the US to China region buckets.
External Table in China Snowflake: On the China Snowflake account, create an External Table pointing to the synced S3 location (auto-refresh ON); see the SQL sketch after these steps.
Reverse as Needed: To move data from China to global, reverse the path: China Snowflake > S3 China > Data Transfer Hub > S3 US > External Table in US Snowflake.
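To make the Snowflake side of this flow concrete, here is a minimal Snowflake SQL sketch of the export and external-table steps. All database, stage, integration, and bucket names are placeholders, and the storage integrations (plus the S3 event notifications that auto-refresh relies on) are assumed to be configured separately; treat this as an outline under those assumptions, not a drop-in script.

```sql
-- US (global) account: unload the source table to a US-region S3 bucket
-- that Data Transfer Hub will replicate to China.
CREATE OR REPLACE STAGE us_export_stage
  URL = 's3://my-us-export-bucket/orders/'   -- placeholder bucket/path
  STORAGE_INTEGRATION = us_s3_int            -- assumed pre-configured
  FILE_FORMAT = (TYPE = PARQUET);

COPY INTO @us_export_stage
  FROM analytics_db.public.orders
  OVERWRITE = TRUE;

-- China account: once Data Transfer Hub has synced the files into a
-- China-region S3 bucket, expose them through an external table.
CREATE OR REPLACE STAGE cn_import_stage
  URL = 's3://my-cn-synced-bucket/orders/'   -- placeholder bucket/path
  STORAGE_INTEGRATION = cn_s3_int            -- assumed pre-configured in the DCC-operated account
  FILE_FORMAT = (TYPE = PARQUET);

-- With no column list, the external table exposes a single VALUE variant column;
-- AUTO_REFRESH additionally requires S3 event notifications to be wired up.
CREATE OR REPLACE EXTERNAL TABLE analytics_db.public.orders_ext
  LOCATION = @cn_import_stage
  FILE_FORMAT = (TYPE = PARQUET)
  AUTO_REFRESH = TRUE;
```

The reverse (China to global) path mirrors this pattern with the roles of the two buckets swapped.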
Tradeoffs/Considerations:
Requires setup and monitoring of Data Transfer Hub.
Sync automation removes many manual steps.
Extra AWS costs and infrastructure to manage.
Approach 2: Direct S3 Copy and External Table (Semi-Manual)
When to Use: Suitable if Data Transfer Hub is unavailable, or when you need a more controlled, hands-on process.
Steps:
US to China:
Point both global and China Snowflake accounts to an AWS China S3 bucket using keyed credentials.
Export the US Snowflake table to S3 in China.
In China Snowflake, create an External Table from the S3 location with auto-refresh ON (see the sketch below).
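A minimal sketch of this direction, assuming a shared China-region bucket accessed with keyed credentials. Bucket, key, and object names are hypothetical, and the exact stage URL scheme for a China-region bucket accessed from a global account may differ for your deployment, so confirm it against current Snowflake documentation.

```sql
-- US (global) account: stage on the China-region S3 bucket using keyed credentials.
CREATE OR REPLACE STAGE cn_shared_stage
  URL = 's3://my-cn-shared-bucket/us_exports/'   -- placeholder; confirm the China URL scheme
  CREDENTIALS = (AWS_KEY_ID = '<key_id>' AWS_SECRET_KEY = '<secret_key>')
  FILE_FORMAT = (TYPE = PARQUET);

COPY INTO @cn_shared_stage
  FROM analytics_db.public.orders
  OVERWRITE = TRUE;

-- China account: stage on the same bucket, then an external table with auto-refresh ON
-- (auto-refresh also needs S3 event notifications configured on the bucket).
CREATE OR REPLACE STAGE cn_shared_stage
  URL = 's3://my-cn-shared-bucket/us_exports/'
  CREDENTIALS = (AWS_KEY_ID = '<key_id>' AWS_SECRET_KEY = '<secret_key>')
  FILE_FORMAT = (TYPE = PARQUET);

CREATE OR REPLACE EXTERNAL TABLE analytics_db.public.orders_ext
  LOCATION = @cn_shared_stage
  FILE_FORMAT = (TYPE = PARQUET)
  AUTO_REFRESH = TRUE;
```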
China to US:
From China Snowflake, export data to S3 China.
In global (US) Snowflake, create an External Table referencing the China S3 location, but auto-refresh must be set to FALSE (not supported with this configuration; manual refresh only), as sketched below.
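A sketch of the reverse direction under the same assumptions (shared China bucket, hypothetical object names); note that the external table on the US side is created with AUTO_REFRESH = FALSE.

```sql
-- China account: unload the table to the shared China-region bucket.
COPY INTO @cn_shared_stage/cn_exports/
  FROM analytics_db.public.cn_orders
  OVERWRITE = TRUE;

-- US (global) account: external table over the China S3 location.
-- Auto-refresh is not supported in this configuration, so metadata
-- must be refreshed manually (see the refresh command further below).
CREATE OR REPLACE EXTERNAL TABLE analytics_db.public.cn_orders_ext
  LOCATION = @cn_shared_stage/cn_exports/
  FILE_FORMAT = (TYPE = PARQUET)
  AUTO_REFRESH = FALSE;
```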
Manual/Caveat:
Due to Snowflake restrictions, the External Table on US Snowflake cannot use auto-refresh when pointing to China-region S3; the update must be triggered manually (see the refresh command after these notes).
Some steps may require regular intervention until Snowflake adds further support.
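Whenever new files land in the China bucket, the metadata of the US-side external table has to be refreshed by hand. The table name below is the hypothetical one from the sketch above:

```sql
-- US (global) account: manually register newly arrived files with the external table.
ALTER EXTERNAL TABLE analytics_db.public.cn_orders_ext REFRESH;
```

This command can be run ad hoc or folded into whatever orchestration you already use, but from Snowflake's perspective it remains a manual or semi-manual step.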
Limitations and Key Caveats
Automation is NOT always available: External tables in certain cross-region scenarios require manual refresh and extra management.
No direct sharing, listing, or cross-account replication: Accept up front that established Snowflake mechanisms are unworkable between global and Chinese regions.
Security, Governance, and Compliance: Always verify your approach with legal and compliance teams; cross-border data flows can raise regulatory risk.
Conclusion
While Snowflake’s core “data sharing” features don’t cross the global/China divide, enterprises can exchange data by orchestrating external tables, cloud storage, and smart AWS automation.
The approaches outlined above offer stable, real-world paths to enabling global data collaboration, despite architectural and regulatory hurdles.
Until Snowflake offers deeper China integration, these workarounds let organizations confidently move cross-border data. Expect more options as platforms evolve, but for now, clarity, planning, and a practical process are key.
Need help building or automating your global Snowflake pipelines? Contact phData’s experts for up-to-date, compliant solutions tailored to your enterprise needs.