Week three has been a busy one. From reengineering code to gathering and analyzing datasets, our progress is growing steadily. The week started off with a success: getting the chatbot to talk to multiple people at the same time (an issue carried over from prior work on the project). Through a simple yet elegant means of data storage, we were able to use cookies to keep track of each individual who messaged the chatbot. We've also added a feature that lets individuals send the chatbot images of the rat or the evidence they discovered. Now it's just a matter of cleaning up the code and preparing for deployment.
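To give a flavor of the cookie approach, here is a minimal sketch of how a chatbot endpoint can tell users apart, assuming a Flask-style web service; the route, cookie name, and in-memory store are hypothetical stand-ins for our actual setup, not our production code.

```python
# Minimal sketch: cookie-based user tracking for a chat endpoint.
# The cookie name "rat-chat-id" and in-memory store are illustrative only.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
conversations = {}  # user_id -> list of messages received so far

@app.route("/message", methods=["POST"])
def message():
    # Reuse the caller's existing cookie if present; otherwise mint a new ID
    user_id = request.cookies.get("rat-chat-id") or str(uuid.uuid4())
    conversations.setdefault(user_id, []).append(request.form.get("text", ""))
    resp = make_response(
        f"Got it! You've sent {len(conversations[user_id])} messages."
    )
    # Send the ID back so the same person stays in the same conversation
    resp.set_cookie("rat-chat-id", user_id)
    return resp
```

Because each browser echoes its own cookie back on every request, two people messaging at once land in separate conversation histories rather than overwriting each other.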
Our data analysis is making strides as well. We've been focused on collecting datasets that potentially indicate rat habitat suitability: tree cover, housing structures, land use, restaurants, and more. This was not straightforward; much of the data came from different sources in different formats. For example, the restaurant data was scraped from a website listing health inspections and then geocoded to obtain latitudes and longitudes from the addresses. Other datasets are either not publicly available online or are available in one county but not the other. Using these datasets, along with a geocoded list of all rat sightings reported through Atlanta's Citizen Gateway since 2009, we will analyze the spatial distribution of rats over time in relation to environmental and geographical features.
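For the curious, the geocoding step looks roughly like the sketch below, assuming geopy's Nominatim geocoder; the CSV filename and column names are hypothetical, and our actual pipeline may differ.

```python
# Minimal sketch: turning scraped inspection addresses into coordinates.
# "restaurant_inspections.csv" and its columns are illustrative only.
import csv
from geopy.geocoders import Nominatim
from geopy.extra.rate_limiter import RateLimiter

geolocator = Nominatim(user_agent="rat-habitat-study")
# Nominatim's usage policy asks for at most ~1 request per second
geocode = RateLimiter(geolocator.geocode, min_delay_seconds=1)

with open("restaurant_inspections.csv", newline="") as f:
    for row in csv.DictReader(f):
        location = geocode(row["address"])
        if location is not None:
            print(row["name"], location.latitude, location.longitude)
```

Once every dataset is reduced to latitude/longitude pairs like this, the rat sightings and the environmental features can all be overlaid on the same map for the spatial analysis.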