In times of crisis, a great deal of information and data is gathered, and teams work tirelessly to analyze it. While effective analysis should be a priority, it is just as important that all the data and information we receive is accurate.
A large number of natural disasters and crises occur around the world. In these times of crisis, there are methods of collecting data that can create and visualize a story to help drive responses as effectively and efficiently as possible. With such a heavy reliance on data, how can we know if the data is accurate? How do we know if it is coming from a genuine source? The data in question can take many forms, ranging from demographic and biological data to social media posts. In disaster management, frameworks such as the Department of Homeland Security's Information Sharing and Safeguarding Strategy have been established among government agencies so that this data can be shared and collectively utilized to its fullest. It is vital to ensure that all this data is not only vetted and verified, but also organized in standardized formats and tables for enhanced accessibility across multiple platforms.
What are some of the ways that data is being collected today?
The amount of data in the world is only growing. The International Data Corporation (IDC) predicts that by the year 2025, the amount of digital data created worldwide will rise to 163 zettabytes (1 ZB = 1 trillion gigabytes), and an estimated 75 percent of the world's population will be connected with data in some way. Astoundingly, each individual will create roughly one data interaction every 18 seconds. Today, we are just beginning to see a plethora of everyday devices connected to the internet: modern refrigerators that can track what groceries you are low on, microwaves, humidifiers, smart outlets, TVs, ceiling fans, thermostats, lighting, home security cameras, smart locks, faucets, lawn mowers, vacuum cleaners, coffee machines, toothbrushes, mattresses, and countless other items. All these devices produce data insights that may be collected, and in this age of information, it's safe to assume that if a device is connected to the internet, it may very well be sharing data. Beyond appliances, data from your social media accounts can also be collected if your privacy settings allow it.
In addition to data you may share directly via your appliances or social media, data is constantly collected for activities in which you take a much more passive role. For example, satellites and drone imaging are constantly scanning the globe to keep mapping services up to date. In fact, in times of natural disasters and crises, this imagery can be invaluable for assessing damage and coordinating relief.
Why do we need accurate and standardized data?
There are many sources producing data that can be utilized for analysis and insights. Think about all the devices mentioned above that are connected to WiFi at this very moment. Given the broad range and massive quantity of devices gathering data, even more importance must be placed on data verification, especially with innovations related to the Internet of Things (IoT). The IoT, the network of billions of physical smart objects around the world that are connected to the internet, has already proven to be a valuable source of data. One issue with using these devices as a data source, however, is that many of them are not yet widely adopted. For example, households that have smart lights probably skew higher on the socioeconomic scale, affecting generalizability. Is it then accurate to produce insights that will influence marketing or, more importantly, emergency responses, based only on these types of potentially skewed data?
When we plan our responses to crises, we want to make sure we are getting a complete picture of what is actually occurring. Additionally, we want to be able to quantify every aspect of society that is being affected so that we can ensure our remedies prove effective. We can only do so if the data is not only genuine, but also interpretable. Thus, another step in utilizing data for critical response is standardization. As defined by Observational Health Data Sciences and Informatics (OHDSI), data standardization is the critical process of bringing data into a common format that allows for collaborative research, large-scale analytics, and sharing of sophisticated tools and methodologies. Data can be stored in a variety of formats, at different times, using different database systems and information models, all of which contributes to a lack of standardization.

To ensure that data can be accessed and ultimately analyzed, it is important to have tools that work across different platforms, a lesson best evidenced by NASA's now infamous $125 million Mars Climate Orbiter blunder. The issue arose because the navigation team worked in metric units, while the team responsible for building and designing the spacecraft used English (imperial) units. The craft's navigation system, receiving data in conflicting units, pushed the spacecraft too deep into the Martian atmosphere, where it burned up and broke apart. Now I'm no rocket scientist, but I would guess there's going to be a problem when a spacecraft's thrusters receive commands in pound-force seconds rather than the expected newton-seconds. There are ongoing efforts to standardize many kinds of data.
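The orbiter failure comes down to numbers crossing a team boundary without their units attached. Here is a minimal sketch of the fix (the constant and function names are illustrative, not anything from NASA's actual software): normalize every reading to one unit system at the boundary, so a mismatch fails loudly instead of silently corrupting the calculation.

```python
# Illustrative sketch: convert all impulse readings to newton-seconds
# at the point of ingestion, rejecting anything with an unknown unit.

LBF_S_TO_N_S = 4.448222  # 1 pound-force second expressed in newton-seconds

def to_newton_seconds(value: float, unit: str) -> float:
    """Normalize an impulse reading to newton-seconds, or fail loudly."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    raise ValueError(f"unrecognized impulse unit: {unit!r}")

# Two teams report the same engine burn in different unit systems;
# after normalization the values agree instead of differing by ~4.45x.
metric_reading = to_newton_seconds(444.8222, "N*s")
imperial_reading = to_newton_seconds(100.0, "lbf*s")
```

The point of raising on an unrecognized unit, rather than assuming a default, is exactly the "same language" discipline discussed here: an explicit failure at ingestion is far cheaper than a silent mismatch downstream.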
Recognizing the costly mistakes made by even the smartest of individuals, it is important that devices all share and communicate in the same language, especially given our increasing reliance on technology in general and the advent of IoT devices in particular.
What could happen if we do not have accurate data during emergencies?
A typical Atlantic hurricane season brings the US 12 named storms, 6 hurricanes, and 3 major hurricanes. What if one outdated data source suggested that supplies were abundant and unnecessary in an impacted area? If emergency relief teams acted on this incorrect data, they would send resources to a location that may not need them instead of to another location that desperately does, a mistake that could cost people their lives. Data changes constantly, and the availability of real-time information to make real-time decisions is of the utmost importance, especially in times of crisis.
Let’s explore another scenario highlighting the importance of standardization. A manufacturer develops smart sensors for concrete to provide data on a building’s structural integrity, among other variables. There are many benefits to having such sensors in buildings, especially in earthquake-prone zones; however, there are numerous smart sensor manufacturers out there, all with their own data standards. So, what happens when first responders receive an incompatible dataset from one building affected by an earthquake but a compatible dataset from another? Will they be able to respond appropriately in both cases? Not having standardized data could very well lead to life-or-death decisions based on that data. Unstandardized data is as good as inaccurate data: not only uninformative, but dangerous too.
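To make the sensor scenario concrete, here is a minimal sketch of the adapter layer such a response system would need. The vendor names, field names, and units below are entirely invented for illustration; the point is that each vendor's format is mapped into one standardized record before anyone compares buildings.

```python
# Hypothetical sketch: two sensor vendors report structural strain under
# different field names and units; an adapter maps both into a common
# record so readings from different buildings are directly comparable.

def normalize_reading(raw: dict, vendor: str) -> dict:
    if vendor == "acme":      # invented vendor A: strain reported in percent
        return {"building_id": raw["bldg"], "strain": raw["strain_pct"] / 100.0}
    if vendor == "sensorco":  # invented vendor B: strain reported as a ratio
        return {"building_id": raw["id"], "strain": raw["strain_ratio"]}
    raise ValueError(f"no adapter for vendor {vendor!r}")

# The same physical strain, reported two ways, normalizes to one value:
a = normalize_reading({"bldg": "B-12", "strain_pct": 0.8}, "acme")
b = normalize_reading({"id": "B-31", "strain_ratio": 0.008}, "sensorco")
```

With adapters like these in place ahead of time, first responders never receive an "incompatible" dataset; every vendor's feed arrives in the same shape.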
The value of data during COVID-19
Today, with the ongoing global pandemic, we are seeing firsthand how user data is being used to track population densities and to report on social distancing and anonymized travel patterns. How do we manage all this new data, especially considering that every byte can add valuable insight in a crisis? These data insights are proving to be very useful indeed; however, there is always the ethical question of user data privacy, and of how much digital privacy we are willing to sacrifice. While some companies have been accused of data collection violations, it is important to note that not all data collected is or will be used nefariously. In fact, in times of natural disasters and crises, this data can indeed be used to save lives. While that is a topic for another article, the landscape of data collection is changing dramatically in these unprecedented circumstances. Governments and organizations can use statistical analysis to determine areas of high population density and then identify nearby hospitals that may need more supplies. If a great number of hospitals are requesting supplies, this data can be used to prioritize which locations receive them first. These decisions are certainly difficult to make when lives are at stake, but with accurate data, leaders and institutions can make statistically informed decisions that could result in more lives saved. Governments have also expanded the scope of their data collection in response to the crisis, with some leveraging users' social media accounts to enforce quarantines; for example, posts containing specified words or phrases are flagged and used to estimate the likelihood of someone breaking quarantine. Though it sounds scary and Big Brother-esque, is such an invasion of our data worth the benefit of stopping the spread of the virus?
In addition to health and safety concerns, businesses and retailers are also deeply affected by this pandemic. Take, for example, a large US jewelry retailer that opens new stores every month, each with countless aspects to coordinate: worker schedules, vendors, supply deliveries, lease dates, and so on. What was already a stressful process has become downright disastrous. With COVID-19, vendors have shut down, freight companies have reduced shipping schedules, some suppliers have shipped supplies early while others are unable to send materials at all, workers' schedules are disrupted, and shopper demand is volatile. Early warnings and just-in-time information are vital for rerouting resources, stopping shipments, and updating schedules to stem this kind of derailment. While there are platforms and sophisticated predictive models to provide these insights, the data that feeds those models must be up to date and in sync. If the data is outdated or inaccurate, it could be the difference between a business enduring or falling victim to an unanticipated disaster, such as the one we are currently facing.
Data from the everyday devices we use around the house can also be useful in this time of crisis. Think about how a power company might adjust its output based on insights gathered from the increased number of devices being turned on. Not only could it see how many devices are on, it could also identify each device's geographical location and position on the grid, even allowing adherence to "stay at home" orders to be gauged more accurately. Data from smart refrigerators could be utilized by grocers to determine which products are actually being consumed and where. Combined with existing inventory data, this information can be fed into predictive statistical tools to determine which locations require specific products, rather than restocking every location with the same amount. Additionally, grocers could determine which users have "over-purchased" groceries and may not return to the store anytime soon. All these insights support efficient supply chain management and can minimize the stress placed on the supply chain in times of crisis.
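The restocking idea above reduces to a simple prioritization once consumption and inventory data are combined. Here is a toy sketch with all store names and numbers invented for illustration: rank locations by how many days of stock they have left, and restock the soonest-to-run-out first.

```python
# Illustrative sketch: prioritize restocking by days of stock remaining,
# computed from current inventory and observed daily consumption.

def restock_priority(inventory: dict, daily_consumption: dict) -> list:
    """Return location names sorted by days of stock remaining, lowest first."""
    days_left = {
        loc: inventory[loc] / daily_consumption[loc]
        for loc in inventory
        if daily_consumption.get(loc, 0) > 0
    }
    return sorted(days_left, key=days_left.get)

# Invented example data: units on hand vs. units consumed per day.
inv = {"store_a": 500, "store_b": 120, "store_c": 300}
use = {"store_a": 40, "store_b": 60, "store_c": 10}
print(restock_priority(inv, use))  # store_b (2 days left) comes first
```

A real system would of course weight this by shipping time, shelf life, and demand forecasts, but the core idea is the same: accurate, current consumption data is what makes the ranking trustworthy.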
With self-quarantining leading to increased use of services like Netflix, streaming providers can collect data from their subscribers' TVs to determine whether they need to adjust bandwidth to accommodate increased traffic. While it may sound intrusive, such data can help alleviate stress on systems that are overwhelmed in a crisis and help ensure that society comes through with minimal damage. In fact, the data may hold the answers we need to adapt to crises and come out the other side even better than before.
What do you think?
Even with the vast amount of data that has become available, many questions remain. Do you agree with the measures taken to collect data? Are the privacy costs worth the benefits? Data will always be here and will continue to grow. But quantity does not equal quality, and we must ensure that the data we utilize is as accurate as possible. Only then can this data be leveraged to keep us all safer, especially during times of crisis. But are governments and organizations doing a good job of collecting and verifying the accuracy of such data? How careful are they in protecting its validity and our privacy? Or does protecting privacy limit the scope of the data’s impact? Can we always trust the data that governments claim to have gathered? And what part do we as citizens play? How can we be sure that the algorithms that pervade our daily lives are being fed accurate data?
While the questions around accuracy continue to grow, there is no denying the presence and importance of data in our world. We are indeed all in this together, and data is very much in this as well. Stay safe!