In adopting a monitoring tool, you should understand the difference between qualitative information and quantitative information. With qualitative data, you track behavior and practices; with quantitative data, you track satisfaction, frequency and availability. The analysis of qualitative data is more thematic, whereas the analysis of quantitative data is more statistical.
Monitoring tools to gather qualitative information are made up of open-ended questions rather than just yes/no or ranking questions. Such tools can provide you with longer and more subjective answers. Focus groups, as well as interview guides that lay out planned questions with space for interviewers to record notes, are helpful for facilitating qualitative data collection. In particular, nondirective interviewing can be a helpful qualitative data-gathering technique. In nondirective interviewing, the interviewer does not frame questions in terms of right or wrong answers or limited sets of options and avoids leading the interviewee to answer in particular ways or within particular value systems. Instead, the interviewer uses an open-ended approach to explore the interviewee’s thoughts, attitudes and beliefs. Monitoring tools to gather quantitative data are made up of closed-ended questions, typically those with answers of yes, no or maybe, as well as rankings.
You should also understand the difference between primary data sources and secondary data sources. Primary data includes direct observations, whereas secondary data includes previously conducted research studies. You must gather primary data yourself with questionnaires and interviews, whereas secondary data can be gathered from sources such as reports produced by the government, international organizations or other non-governmental organizations; official statistics; or media articles.
Your approach to data collection will depend on the type of data you collect and how it is collected. Only once you’ve determined this will it make sense to look at what technologies can be used to facilitate data entry, management and analysis. Some tools are more useful for collecting structured data but are less helpful with the free-form data you might collect through qualitative interviews. Other tools, such as data mining tools, are better suited to extracting information from secondary sources. In some cases, digital tools might not be needed at all.
Data collection tools can be divided into three categories: data repositories, online forms and data mining tools. First, data repositories like Airtable24 or Google Sheets25 (essentially spreadsheets) are used to organize structured data. Information can be entered directly into the data repositories in a tabular format. However, direct entry is not very user-friendly and can lead to messy datasets, especially if more than one person is working on the database or there is no standardized format.
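To illustrate the kind of messiness that unstandardized direct entry produces, here is a minimal sketch; the field values below are hypothetical examples of how different people might type the same answer into a shared spreadsheet:

```python
# Minimal sketch: cleaning up inconsistent entries that accumulate when
# several people type directly into a shared spreadsheet without a
# standardized format. The raw values below are hypothetical.

def normalize_response(raw: str) -> str:
    """Map free-typed yes/no variants to a single canonical form."""
    cleaned = raw.strip().lower()
    if cleaned in {"y", "yes", "yes."}:
        return "yes"
    if cleaned in {"n", "no", "no."}:
        return "no"
    return "unknown"  # flag anything unexpected for manual review

raw_column = ["Yes", " y ", "NO", "maybe?"]
print([normalize_response(v) for v in raw_column])
# ['yes', 'yes', 'no', 'unknown']
```

A cleanup pass like this is cheap insurance, but preventing the inconsistency at entry time, as the online forms described next do, is usually the better approach.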
Second, online forms make it easier to input data into the repository in a structured format. More structured data facilitates analysis and visualization. Simple online forms, like Survey Monkey26 or Google Forms,27 can be used to limit human error by validating entries, for example by allowing only valid email addresses to be entered into an email field. Advanced survey tools that take advantage of modern technologies, like smartphones, can even collect data passively to validate data or simply collect additional data for analysis. For example, smartphones with geolocation services can collect data about the location of the person collecting the data and the time they collected it, in a way that would be hard to falsify.
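As a rough illustration of the kind of entry validation such forms perform, the sketch below applies a deliberately simple email check; real form builders use more elaborate rules, and this pattern only rejects obviously malformed input:

```python
import re

# A deliberately simple email check, similar in spirit to the validation
# an online form applies before accepting an entry. It requires one "@"
# separating non-empty parts and a dot in the domain; it is not a full
# standards-compliant validator.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_plausible_email(value: str) -> bool:
    return bool(EMAIL_PATTERN.match(value))

print(is_plausible_email("observer@example.org"))  # True
print(is_plausible_email("not-an-email"))          # False
```

Rejecting bad values at the moment of entry, rather than during later cleanup, is what keeps the downstream dataset structured and analysis-ready.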
Sometimes it is necessary for political process monitors to collect information directly from their research subjects, for example through opinion surveys, crowdsourcing or the work of trained field-based observers. Crowdsourcing or crowdmapping tools like Fix My Community28 can help improve government service delivery through citizen reporting. Monitoring efforts that involve field-based observers can take advantage of structured data collection tools like Apollo29 to collect observation reports in real time. In both cases, it is important to consider how the medium might impact the collected data. For example, an online survey of a community will be biased toward individuals with internet access (see Surveys and Selection Bias in the Types of Political Process Monitoring section for more information). If field-based observers will be in areas without internet access, phone calls or SMS may be preferable to online forms.
Third, data mining tools take publicly available data and use it for analysis. Some platforms and websites enable direct downloads of structured data via application programming interfaces (APIs) or Really Simple Syndication (RSS) feeds. Pulling data from APIs is in most cases legal and ethical, since the API data is deliberately regulated by platforms and structured so as not to violate users’ rights. In cases where data is only available in formats that cannot be readily downloaded and analyzed, such as information on websites or in PDFs, you might need to make use of data scraping tools. Web scraping tools like 0archive,30 for example, can be used to extract data from websites or social media platforms and convert them into structured data sets. Web scraping is not regulated by platforms and may violate their terms of service or even be illegal in some contexts. Tools like DocumentCloud31 or Adobe Acrobat32 can help extract data from PDFs or image files. Amazon Textract33 can also extract relationships or structure, which can help automate the process of organizing data extracted from PDFs into a structured data set. While these tools are helpful when dealing with massive amounts of data, for smaller monitoring projects, manual data entry might actually be more efficient.
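As a small, self-contained sketch of what web scraping does at its core, the example below uses only Python’s standard library to turn an HTML table into structured rows; the HTML fragment is a made-up stand-in for a page a scraper would actually fetch:

```python
from html.parser import HTMLParser

# Hypothetical fragment standing in for a page fetched by a scraper.
SAMPLE_HTML = """
<table>
  <tr><td>District A</td><td>42</td></tr>
  <tr><td>District B</td><td>17</td></tr>
</table>
"""

class TableExtractor(HTMLParser):
    """Collect the text of each <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows
        self._row = None      # cells of the row currently being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

parser = TableExtractor()
parser.feed(SAMPLE_HTML)
print(parser.rows)  # [['District A', '42'], ['District B', '17']]
```

Dedicated scraping tools automate the fetching, handle messier markup and scale to many pages, but the underlying task is the same: converting unstructured markup into rows and columns you can analyze.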