
## Metadata
- Author:: [[matt-star|Matt Star]]
- Full Title:: What's Missing From the Amplitude and Snowflake Partnership
- Category:: #🗞️Articles, [[Product analytics and sequence analytics|Product Analytics And Sequence Analytics]]
- URL:: https://www.narrator.ai/blog/what-the-amplitude-and-snowflake-partnership-is-missing/?ref=www.narratordata.com/blog
- Finished date:: [[2023-12-28]]
## Highlights
> Instead of having to send data into Amplitude using their SDK, you can do the following:
> • Write a simple SQL snippet mapping data from Snowflake into Amplitude events and properties
> • Once saved, Amplitude will poll the data every hour and insert it into Amplitude’s database ([View Highlight](https://read.readwise.io/read/01hjqfh8dfh1s1pgt42fty57pq))
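The mapping step in the highlight above can be sketched as follows. This is a minimal, hypothetical illustration (not from the article): the SQL snippet, table, and column names are invented, and the event shape follows Amplitude's usual `event_type` / `user_id` / `event_properties` layout.

```python
# Hypothetical sketch: a SQL snippet selects rows from Snowflake, and each
# row is mapped into an Amplitude-shaped event. All names are illustrative.

SQL_SNIPPET = """
SELECT
    user_id,
    'order_completed' AS event_type,
    completed_at      AS event_time,
    order_total
FROM orders
"""

def row_to_event(row: dict) -> dict:
    """Map one warehouse row to an Amplitude-style event payload."""
    return {
        "user_id": row["user_id"],
        "event_type": row["event_type"],
        "time": row["event_time"],  # epoch milliseconds
        "event_properties": {"order_total": row["order_total"]},
    }

# Example row as it might come back from the query above.
row = {
    "user_id": "u_123",
    "event_type": "order_completed",
    "event_time": 1703721600000,
    "order_total": 42.50,
}
event = row_to_event(row)
```

On the hourly schedule described above, each batch of query results would be transformed this way before landing in Amplitude's database.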
## New highlights added [[2023-12-28]]
> **Our solution:** The event stream data model that Narrator builds and maintains lives in your data warehouse. All processing and querying happens on your data warehouse without data ever leaving your system. ([View Highlight](https://read.readwise.io/read/01hjqfzj2pya4f6fek84ezbagz))
> By pushing data out of Snowflake and into Amplitude, it’s not clear where the events came from ([View Highlight](https://read.readwise.io/read/01hjqg24ztkhxesw8he67t9zfn))