Build personalized applications with Slack’s APIs – Part 2

In my last post I introduced you to Slack’s API and we built our first Slack application, or app.

---

The objective of this project is to design and develop an Airflow DAG that will extract data from Circle.so, load it into an S3 bucket, and then load it into Snowflake. The major deliverables are a GitHub repository that holds the required code and an Airflow job that runs in production on MWAA. The DAG should be designed to run hourly to ensure that the data in Snowflake is up to date.

The scope of this project includes the following tasks:

- Design and develop an Airflow DAG within The Company's MWAA that extracts data from Circle.so, loads it into an S3 bucket, and then loads it into Snowflake
- Configure the Airflow DAG to run on a schedule during weekdays
- Ensure that the data is transformed and loaded correctly into the S3 bucket and Snowflake
- Configure a GitHub Action that deploys changes from The Company's repository to production on MWAA
- Build monitoring so that any errors or issues can be troubleshot as needed
- Document the design and functionality of the DAG, including any parameters or dependencies required to run it

The following deliverables are expected from this project:

- The completed Airflow DAG and its dependencies (packages, etc.)
- A GitHub Action that pushes updates from the repo resilia-data/airflow to resilia-dag-source when the repo is updated
- Test cases and results to ensure the DAG works as intended
- Error logs sent to The Company's Slack channel #triage-data-alerts
- Documentation outlining the design and functionality of the DAG, including any parameters or dependencies required to run it

The successful completion of this project depends on the following assumptions:

- Access to The Company's AWS data-orchestration-production account
- Access to the S3 bucket (resilia-dag-source) that contains the Airflow DAG and API call
- Access to the S3 bucket (resilia-mwaa-tmp-data) to confirm the temporary storing of data
- Access to the Snowflake account to confirm the loading of data
- Access to the GitHub repo (resilia-data/airflow) that contains the existing API call and Airflow DAG
- Access to the Mural ERD of the desired circle schema in Snowflake
- Access to the Slack instance, particularly the #triage-data-alerts channel

The project will be considered complete when the following criteria have been met:

- The Airflow DAG successfully ingests data from Circle.so
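The deployment deliverable — a GitHub Action that pushes updates from resilia-data/airflow to resilia-dag-source — could look roughly like the workflow below. The branch name, `dags/` path, AWS region, and secret names are assumptions, not taken from the charter.

```yaml
# Hypothetical workflow sketch -- paths, region, and secret names assumed.
name: deploy-dags
on:
  push:
    branches: [main]

jobs:
  sync-to-s3:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync DAGs to the MWAA source bucket
        run: aws s3 sync dags/ s3://resilia-dag-source/dags/ --delete
```

MWAA watches the configured source bucket, so syncing on every push to the main branch keeps production in step with the repository.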
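To make the extract → S3 → Snowflake flow concrete, here is a minimal sketch of the staging and load helpers such a DAG might call. This is not the project's actual code: the newline-delimited JSON layout, the key scheme, the `circle_stage` external stage, and the table names are assumptions; only the temp bucket name resilia-mwaa-tmp-data comes from the charter.

```python
import json
from datetime import datetime, timezone

# Only this bucket name comes from the project charter; everything else
# below (key layout, stage name, table names) is a hypothetical sketch.
TMP_BUCKET = "resilia-mwaa-tmp-data"

def s3_key_for_run(run_ts: datetime, dataset: str) -> str:
    """Build an hourly-partitioned key for the temp S3 staging area."""
    return f"circle/{dataset}/{run_ts:%Y/%m/%d/%H}/{dataset}.jsonl"

def to_jsonl(records: list[dict]) -> str:
    """Serialize Circle.so API records as newline-delimited JSON,
    a layout Snowflake's COPY INTO handles directly."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

def copy_into_sql(table: str, stage: str, key: str) -> str:
    """Generate the Snowflake COPY INTO statement for one staged file.
    `stage` is assumed to be an external stage over the temp bucket."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{key} "
        f"FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = FALSE)"
    )
```

For example, `s3_key_for_run(datetime(2023, 5, 1, 14, tzinfo=timezone.utc), "members")` yields `circle/members/2023/05/01/14/members.jsonl`. On MWAA, a DAG wrapping helpers like these could use a cron schedule such as `0 * * * 1-5` to satisfy the hourly, weekdays-only requirement.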
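The monitoring deliverable — error logs sent to #triage-data-alerts — could be met with a small failure-alert helper. This sketch assumes a Slack incoming webhook; the webhook URL is a placeholder, and the DAG/task names in the example are hypothetical.

```python
import json
import urllib.request

# Placeholder -- the charter names only the target channel, not a webhook.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."

def build_failure_alert(dag_id: str, task_id: str, error: str) -> dict:
    """Build a Slack message payload for the #triage-data-alerts channel."""
    return {
        "channel": "#triage-data-alerts",
        "text": f":red_circle: DAG `{dag_id}` failed at task `{task_id}`: {error}",
    }

def post_alert(payload: dict) -> None:
    """POST the payload to the Slack incoming webhook (network call)."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

In Airflow, `post_alert` would typically be invoked from a DAG-level `on_failure_callback`, so every failed task run produces one message in the channel.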