Inspiration
I have worked as an ETL developer, and I can say that creating ETL jobs is a repetitive task that requires a lot of testing.
What it does
We start with a YAML file that defines the entire downstream process: it creates the data extraction/request from AWS Data Exchange, then pipes the data through a Kinesis stream into Amazon Personalize (assuming we will be running a campaign or an A/B test on a recommendation system). All of this is predefined in a single YAML file!
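As a sketch, the predefined YAML might look like the following. Every field name here is hypothetical, invented for illustration; the actual schema is whatever the pipeline code expects:

```yaml
# Hypothetical pipeline definition -- field names are illustrative only.
pipeline:
  data_exchange:
    data_set_id: example-data-set-id   # the Entitled data set to export
    revision: latest
  kinesis:
    stream_name: project-ingest-stream
    partition_key: user_id
  personalize:
    dataset_group: recommendations-ab-test
    dataset_type: Interactions
```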
How I built it
I prototyped the Boto3 API calls in a Python notebook, then moved the code into a Lambda function and a client script (the Kinesis part).
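A minimal sketch of the Kinesis client side, assuming a stream name and event shape that are purely illustrative (in the real project these come from the YAML definition):

```python
import json


def encode_record(event: dict) -> bytes:
    """Serialize one interaction event to the bytes Kinesis expects."""
    return json.dumps(event).encode("utf-8")


def put_event(stream_name: str, event: dict, partition_key: str) -> None:
    """Send one event to a Kinesis data stream via boto3."""
    import boto3  # imported lazily so the sketch loads without AWS deps

    kinesis = boto3.client("kinesis")
    kinesis.put_record(
        StreamName=stream_name,
        Data=encode_record(event),
        PartitionKey=partition_key,
    )


# Hypothetical event shape for a recommendation interaction:
sample = {"user_id": "u1", "item_id": "i42", "timestamp": 1700000000}
payload = encode_record(sample)
```

The same `encode_record` helper can be reused inside the Lambda handler, so the notebook prototype and the deployed code serialize records identically.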
Challenges I ran into
Getting an 'Entitled' data set out of AWS Data Exchange through the API is challenging.
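For reference, listing the entitled data sets goes through the `ListDataSets` operation with `Origin='ENTITLED'`. A paginated sketch (only standard boto3 `dataexchange` client calls, nothing project-specific):

```python
def list_entitled_data_sets() -> list:
    """Return all data sets shared with this account (Origin='ENTITLED')."""
    import boto3  # imported lazily so the sketch loads without AWS deps

    dx = boto3.client("dataexchange")
    data_sets, token = [], None
    while True:
        kwargs = {"Origin": "ENTITLED"}
        if token:
            kwargs["NextToken"] = token
        page = dx.list_data_sets(**kwargs)
        data_sets.extend(page.get("DataSets", []))
        token = page.get("NextToken")
        if not token:
            return data_sets
```

Actually exporting a revision's assets is a separate job-based flow (`create_job` / `start_job`), which is where most of the friction was.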
Accomplishments that I'm proud of
I learned that I can take essentially any public data set and turn it into an Amazon Personalize dataset.
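The Personalize side comes down to three boto3 calls: create a dataset group, register a schema, and create the dataset. A sketch, with placeholder names, using the minimal required USER_ID/ITEM_ID/TIMESTAMP interactions schema:

```python
import json

# Minimal required fields for a Personalize "Interactions" schema.
INTERACTIONS_SCHEMA = json.dumps({
    "type": "record",
    "name": "Interactions",
    "namespace": "com.amazonaws.personalize.schema",
    "fields": [
        {"name": "USER_ID", "type": "string"},
        {"name": "ITEM_ID", "type": "string"},
        {"name": "TIMESTAMP", "type": "long"},
    ],
    "version": "1.0",
})


def create_interactions_dataset(group_name: str, dataset_name: str) -> str:
    """Create a dataset group, schema, and interactions dataset; return the dataset ARN."""
    import boto3  # imported lazily so the sketch loads without AWS deps

    personalize = boto3.client("personalize")
    group_arn = personalize.create_dataset_group(name=group_name)["datasetGroupArn"]
    schema_arn = personalize.create_schema(
        name=f"{dataset_name}-schema", schema=INTERACTIONS_SCHEMA
    )["schemaArn"]
    return personalize.create_dataset(
        name=dataset_name,
        schemaArn=schema_arn,
        datasetGroupArn=group_arn,
        datasetType="Interactions",
    )["datasetArn"]
```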
What I learned
The Boto3 API (very important).
What's next for Data Exchange On The Go
Developing a Terraform/CloudFormation script.
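As a first step in that direction, a CloudFormation fragment for the Kinesis stream might look like this (resource and stream names are illustrative placeholders):

```yaml
# Illustrative CloudFormation snippet -- names are placeholders.
Resources:
  IngestStream:
    Type: AWS::Kinesis::Stream
    Properties:
      Name: project-ingest-stream
      ShardCount: 1
```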
Built With
- data-exchange
- kinesis
- lambda
- python