DoorDash Database Design

For our final project in the class "Database Design and Development" (67-262), we were assigned to choose an existing business and work through the key parts of the database development life cycle for that business. Our team chose DoorDash. Completing the project required moving from user stories to a conceptual model, then to a logical model, then to a physical model, culminating in realizing the user stories as implemented code.

Project Overview

Team

Alexander Ma, Steven Zhang

Programs

PostgreSQL, psycopg2, Vertabelo, LucidChart

Timeline

4 weeks (Fall 2021)


Database Modeling

User Stories

After researching how DoorDash operates, especially the relationship between customers, restaurants, and DoorDash drivers, we created 10 user stories based on each user's desired functional requirements. For each user story, we determined whether it would lead to a simple, analytical, or complex query: a simple query runs on a single table, an analytical query includes an aggregation function, and a complex query joins multiple tables.
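To make the three categories concrete, here is a minimal sketch using Python's built-in sqlite3 module (the project itself used PostgreSQL); the tables and data below are invented for illustration, not our actual DoorDash schema:

```python
import sqlite3

# Invented illustrative schema -- not the project's actual tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE restaurant (restaurant_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE menu_item (
        item_id INTEGER PRIMARY KEY,
        restaurant_id INTEGER REFERENCES restaurant (restaurant_id),
        price REAL
    );
    INSERT INTO restaurant VALUES (1, 'Chick N Go'), (2, 'Taco Stop');
    INSERT INTO menu_item VALUES (1, 1, 4.5), (2, 1, 6.0), (3, 2, 3.25);
""")

# Simple query: touches a single table, no aggregation.
simple = cur.execute("SELECT name FROM restaurant ORDER BY name").fetchall()

# Analytical query: includes an aggregation function (AVG).
analytical = cur.execute(
    "SELECT restaurant_id, AVG(price) FROM menu_item "
    "GROUP BY restaurant_id ORDER BY restaurant_id"
).fetchall()

# Complex query: joins multiple tables.
complex_rows = cur.execute("""
    SELECT r.name, m.price
    FROM restaurant r
    JOIN menu_item m ON r.restaurant_id = m.restaurant_id
    WHERE m.price < 5
    ORDER BY m.price
""").fetchall()
```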


Three examples of the user stories we wrote:

[Image: example user stories]

[Image: updated conceptual model]
Conceptual Model

Based on our user stories, we created a conceptual model with UML in LucidChart. Our conceptual model identified the domain entities, the attributes of each entity, and the associations and multiplicities between each entity. 


Logical Model

Based on the conceptual model, we then wrote out the logical model, where we identified the relations along with their primary and foreign keys.

Physical Model

Once we reached the logical model, we had to make sure all our relations were in BCNF by confirming that every determinant of a relation was a candidate key. If a relation wasn't in BCNF, we decomposed it until it was.
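As a hypothetical illustration of that decomposition step (this table is invented, not part of our actual schema): suppose a relation driver_raw(driver_id, zip_code, city) where zip_code → city holds. Since zip_code is not a candidate key, the relation is not in BCNF, and it decomposes into driver(driver_id, zip_code) and zip_area(zip_code, city), where each determinant is now the key of its own relation:

```python
import sqlite3

# Hypothetical BCNF decomposition sketch (invented tables, not the project's).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    -- zip_area: the determinant zip_code is now the key of its own relation.
    CREATE TABLE zip_area (
        zip_code TEXT PRIMARY KEY,
        city     TEXT NOT NULL
    );
    CREATE TABLE driver (
        driver_id INTEGER PRIMARY KEY,
        zip_code  TEXT REFERENCES zip_area (zip_code)
    );
    INSERT INTO zip_area VALUES ('15213', 'Pittsburgh');
    INSERT INTO driver VALUES (1, '15213'), (2, '15213');
""")

# The original relation is recoverable as a lossless join of the two pieces.
rows = cur.execute("""
    SELECT d.driver_id, d.zip_code, z.city
    FROM driver d
    JOIN zip_area z ON d.zip_code = z.zip_code
    ORDER BY d.driver_id
""").fetchall()
```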


From there, we were able to create the physical model in Vertabelo from the BCNF normalized relations. With the physical model, we identified the data types of each attribute along with the primary and foreign keys of each entity.


Based on the physical model, we were then able to generate a SQL file that creates the database.
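A hypothetical excerpt of what such a generated DDL file might look like; the table names, columns, and types below are illustrative, not our actual physical model:

```sql
-- Illustrative excerpt only -- not the project's generated DDL.
CREATE TABLE customer (
    customer_id  SERIAL PRIMARY KEY,
    name         VARCHAR(100) NOT NULL,
    email        VARCHAR(255) UNIQUE
);

CREATE TABLE delivery_order (
    order_id     SERIAL PRIMARY KEY,
    customer_id  INTEGER NOT NULL REFERENCES customer (customer_id),
    placed_at    TIMESTAMP NOT NULL DEFAULT now()
);
```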

[Image: physical model]

Querying Data

Mock Data

For each entity in our physical model, we created a .csv file containing each attribute for that entity, along with mock data that we would query over later.

We then created a file (initialize.sql) that builds every table from the physical model and loads data into the database from the .csv files using a series of psql \copy commands.
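A hypothetical excerpt of what the load step in such a file might look like; the table and file names below are illustrative, not the project's actual ones:

```sql
-- Illustrative excerpt only -- not the project's actual initialize.sql.
\copy customer FROM 'customer.csv' WITH (FORMAT csv, HEADER true)
\copy restaurant FROM 'restaurant.csv' WITH (FORMAT csv, HEADER true)
```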


You can view all our mock data by running show_all.sql, which displays all the tables.

Queries

For each user story, we wrote a SQL query over the mock data we created. Using Python and psycopg2, we ran each query and printed the results to confirm the queries were correct.

[Image: output of complex_query_1.py]

Running complex_query_1.py, which contains the code for user story 3, displays restaurants that belong to a specified category and fall under a specified price point. In this example, the restaurants displayed are under a price point of 5 and belong to the "chicken" category.
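A sketch of the pattern such a script follows, shown here with Python's built-in sqlite3 so it runs standalone; psycopg2 exposes the same DB-API 2.0 interface, with %s placeholders and a server connection instead. The schema and data below are invented for illustration:

```python
import sqlite3

# Invented illustrative schema -- not the project's actual tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE restaurant (restaurant_id INTEGER PRIMARY KEY,
                             name TEXT, price_point INTEGER);
    CREATE TABLE category (restaurant_id INTEGER REFERENCES restaurant,
                           category TEXT);
    INSERT INTO restaurant VALUES (1, 'Chick N Go', 3), (2, 'Fancy Fowl', 8);
    INSERT INTO category VALUES (1, 'chicken'), (2, 'chicken');
""")

# Parameterized complex query: restaurants in a given category
# under a given price point (psycopg2 would use %s instead of ?).
cur.execute("""
    SELECT r.name
    FROM restaurant r
    JOIN category c ON r.restaurant_id = c.restaurant_id
    WHERE c.category = ? AND r.price_point < ?
""", ("chicken", 5))
results = cur.fetchall()
for (name,) in results:
    print(name)
```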

© 2023 by Alexander Ma
