Using AWS Cloud Products for Data Transformation and Integration for Fire Risk Prediction and 911 Response-Time Analysis
Python
SQL
Databases
Cloud
Neural Networks
For our Capstone project, we worked with our sponsor, Levrum Data Technologies, to begin developing an AWS Cloud data pipeline that takes existing customer data and transforms it into a form that is useful to one of their existing products. AWS Cloud services perform the transformations, cleaning, and other modifications as needed, index the data, and expose it as a service to an existing neural network endpoint. The endpoint uses the data to predict fire risk and is part of one of Levrum's products.
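As a rough illustration of the kind of cleaning step a Glue ETL job in this pipeline might apply, the sketch below normalizes hypothetical 911 incident records: it drops incomplete rows and derives a response time from dispatch and arrival timestamps. The field names (`incident_id`, `dispatched_at`, `arrived_at`) are placeholders, not the actual schema of Levrum's customer data.

```python
from datetime import datetime

# Hypothetical required fields; the real customer schema will differ.
REQUIRED_FIELDS = ("incident_id", "dispatched_at", "arrived_at")

def clean_incident(record):
    """Drop incomplete records and derive response time in seconds."""
    # Filter out rows missing any required field (empty strings count as missing).
    if any(not record.get(field) for field in REQUIRED_FIELDS):
        return None
    dispatched = datetime.fromisoformat(record["dispatched_at"])
    arrived = datetime.fromisoformat(record["arrived_at"])
    seconds = (arrived - dispatched).total_seconds()
    if seconds < 0:
        return None  # arrival before dispatch: reject as bad data
    return {"incident_id": record["incident_id"], "response_seconds": seconds}

records = [
    {"incident_id": "A1",
     "dispatched_at": "2023-01-05T10:00:00",
     "arrived_at": "2023-01-05T10:06:30"},
    {"incident_id": "A2",
     "dispatched_at": "2023-01-05T11:00:00",
     "arrived_at": ""},  # incomplete record, should be dropped
]
cleaned = [row for row in map(clean_incident, records) if row is not None]
print(cleaned)  # one valid record with response_seconds = 390.0
```

In the actual pipeline this logic would live inside a Glue job operating on the catalogued S3 data rather than on in-memory dictionaries.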
Artifacts
Name | Description | Link
---|---|---
General Demonstration of AWS Cloud S3, Data Catalog and Glue ETL process | This video provides a brief demonstration of part of the data pipeline we were building. It touches on filesystem storage in AWS S3, using Glue Crawlers to build an AWS Data Catalog, and using some Glue ETL tools to manipulate and organize the data. This was made about halfway through the project, while we were still learning about AWS products. | Link |
High Level Overview of Project Plan Structure | This is an early example of one possible end product we could build. | Download |
Walkthrough of Project and Reflections on what we learned and what comes next | This video goes through how we ended up building the project, a bit about what we learned and the mistakes we made or issues we ran into along the way, and what could be done next to continue the project. | Link |
Project Structure after 8 weeks | This is a more detailed view of the structure of the AWS pipeline we built after about 8 weeks of working and learning. | Download |