
According to the university, the purpose of the Virginia Tech COVID-19 Dashboard is to “inform (the) community as well as provide metrics which will be useful in our decision making process related to campus operations.” Keeping the public informed and guiding decision-making are indeed important goals. Unfortunately, the dashboard is not a reliable tool for achieving them. Rather, it misrepresents key data and misleads readers.

Note: All of the data used below were gathered on Feb. 27. 

First, consider the graph showing daily test results, which is the most visually appealing part of the dashboard. The left y-axis ranges from 0 to 1,400, which may be an appropriate range for the total number of daily tests completed, but nothing more. Because of this large range, the counts of positive tests on the graph look minute. The 61 positive tests that came back on Feb. 23 are represented by a tiny orange stub, which is hard to distinguish from the 22 on Feb. 22 or the 43 on Feb. 25. Moreover, the line representing the seven-day moving average of positive tests looks completely flat because of the inappropriately large range of the left y-axis. A reader would be hard-pressed to gather that this number more than doubled from Feb. 1 to Feb. 25. The line ought to reflect this.
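To see why the scale matters, here is a minimal Python sketch of how a trailing seven-day moving average is typically computed. The daily counts are invented placeholders, not VT's actual data; the point is that even when the average more than doubles, both values sit in the bottom few percent of a 0-to-1,400 axis, so the plotted line looks flat.

```python
# Illustrative daily positive-test counts (made up, NOT VT's data).
daily_positives = [10, 12, 8, 15, 20, 18, 25, 30, 28, 35, 40, 38, 45, 50]

def seven_day_average(counts, day):
    """Average of the trailing seven days ending at `day` (0-indexed)."""
    window = counts[max(0, day - 6): day + 1]
    return sum(window) / len(window)

early = seven_day_average(daily_positives, 6)   # end of the first week
late = seven_day_average(daily_positives, 13)   # end of the second week
print(f"early average: {early:.1f}, late average: {late:.1f}")
# `late` is more than double `early`, yet both are under 3% of a
# 0-1400 axis range -- the kind of doubling the VT graph hides.
```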

Compare VT’s graph to the graphs (plural!) on the University of Virginia’s COVID Tracker. UVA figured out how to construct y-axes in ways that appropriately match the type of data being conveyed. 

Next, consider VT’s reported number of total tests in the last seven days. This number could show readers whether VT is on track to increase testing and how testing trends are developing. The count of completed tests is best put into context when it can be compared with previous seven-day timeframes, but because the dashboard only includes the number of completed tests in the last seven days, it fails to provide this context.

Compare this to UVA’s COVID Tracker. Distinct graphs show the counts and rolling averages of the total number of tests completed for the entire community, students, and faculty/staff. UVA, despite having almost 10,000 fewer students, sometimes completes more tests in one day than VT completes in one week. 

Context is also an issue with the reported number of students in “designated campus isolation/quarantine space.” As of Feb. 26, the number was 193. This number does not mean much if the total number of available isolation/quarantine spaces is not reported: 193 out of 1,000 spaces tells a completely different story than 193 out of 200. The dashboard leaves the reader guessing. The best available resource may be the Spring 2021 Operational Plan, which is not linked anywhere on the dashboard. That document indicates there were 513 beds available for isolation/quarantine at the beginning of the spring semester. Are quarantine spaces/beds the same thing as isolation spaces/beds? Does the number of available beds equal the number of available spaces, or do some spaces consist of more than one bed? Has the number of designated isolation/quarantine spaces or beds changed since the beginning of the semester? Don’t rely on the dashboard to answer these questions.
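The denominator problem is simple arithmetic. A quick sketch makes the contrast concrete, treating the 513-bed figure from the Operational Plan as one hypothetical denominator alongside the two from the example above; whether 513 is the right denominator is exactly what the dashboard leaves unanswered.

```python
# How the same occupancy count reads against different denominators.
# 513 comes from the Spring 2021 Operational Plan; it is used here
# only as a hypothetical capacity, since the dashboard never says.
in_isolation = 193

def occupancy_pct(occupied, capacity):
    """Percentage of isolation/quarantine capacity in use."""
    return 100 * occupied / capacity

for capacity in (1000, 513, 200):
    pct = occupancy_pct(in_isolation, capacity)
    print(f"{in_isolation} of {capacity} spaces -> {pct:.0f}% full")
```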

Once again, UVA’s COVID Tracker serves as a model. It defines what quarantine and isolation beds are and shows their data separately, expressed as percentages of total availability.

Finally, consider the VT dashboard’s response to media coverage. On Feb. 9, the Roanoke Times reported that VT’s seven-day positivity rate had surpassed the undesirable threshold of 5%. Then, on Feb. 10, the dashboard suddenly showed that the positivity rate on Feb. 9 was 4.9% and that the rate on previous days had never exceeded 5%. The Roanoke Times published a story about this on Feb. 11. It is understandable that dashboards are occasionally revised retroactively to reflect cases reported by other collectors or to correct errors. However, it is suspicious and concerning that retroactive changes were made the day after a media story, without a relevant disclaimer posted anywhere on the dashboard.
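The seven-day positivity rate at issue is straightforward to compute, which is part of why a silent retroactive change stands out. Here is a hedged sketch, with invented counts rather than the actual February figures, showing how revising a single day's numbers can pull a rate from just above 5% to just below it:

```python
# Seven-day positivity rate: positives / total tests over the week.
# All counts below are invented for illustration, NOT VT's data.
positives = [40, 45, 50, 55, 60, 62, 61]            # daily positive tests
totals = [900, 950, 1000, 1050, 1100, 1150, 1200]   # daily tests completed

def positivity_rate(pos, tot):
    """Seven-day positivity rate as a percentage."""
    return 100 * sum(pos) / sum(tot)

before = positivity_rate(positives, totals)
# A retroactive revision to a single day's counts...
positives[6] = 50
totals[6] = 1300
after = positivity_rate(positives, totals)
print(f"before revision: {before:.1f}%, after revision: {after:.1f}%")
# ...is enough to move the published rate across the 5% threshold.
```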

These are not the practices of an organization that honestly represents and reports data. It is frustrating that this is how data are shown to the public. It is frightening to think that this dashboard is a tool helping to guide decision-making.

Posting a generic disclaimer at the bottom of the dashboard noting that the information may not be complete does not excuse misleading design. 

There are plenty of capable people at VT who can design web pages that honestly show data. It’s not that hard. We don’t have to let UVA do a better job than us. 
