Data and user research design
Intro: there's a lot of data that is not quantified (how people value aspects of a community; the history of a town's power dynamics).
So there's data that is important but can't be modeled. There's a wave of fetishization of data and data reporting, but the sources of that data are limited.
Mapping can be a valuable community outreach tool and a way to start a discussion. E.g.: ask "where do we feel unsafe?" and map that. The map can drive discussions: why do you feel unsafe there? Maybe a store selling Confederate Civil War nostalgia makes part of the community feel unsafe, and others had not noticed that.
How do we quantify "missing data"?
Example: a county that was having problems getting an early voting place in a location accessible to most people. Working with various orgs, they made a map of where it would be difficult to get to. Once those inputs were digitized, it was clear that current poll locations created a major barrier to participation. The Board of Elections rejected the complaint, of course, but a community discussion had started, which itself has power and has continued. Taking lived experience and finding a way to quantify it. (Process of mapping: changing the scale of the map based on input, to show the difference between physical distance and the real lived experience of how hard it was to get there.)
Even if you know how a political process like a campaign works, it's hard to make some points if the data is not published. Example: the lack of a published ethnography of a campaign.
How can gathering oral history be useful in the process of creating data visualizations?
When collecting data we must ask ourselves what data should be recorded, what sharing that data would mean for people's safety or security, and what responsibilities those who gather data have to the subjects of these explorations.
Complication: when different parts of the community want different things or are concerned about the impact of storing some data. Sometimes this can stall a project.
When datasets are released by community research (often not by academics), they tend to be viewed as problematic: not on the same tier as "officially" gathered numbers, which can also be bad but get an assumed credibility.
Sometimes gathered datasets do not turn into something solid, but that's okay, because the process of gathering them in a community-centric way has its own benefits and outcomes.
Balancing anonymity vs credit for ideas.
It is okay to state that "some of the data is anonymous, but parts come from (credit people or orgs)."
- What are we going to ask?
- Who will be credited?
- How will it be gathered?
- How will it be stored?
- Who owns it afterwards?
Ongoing discussion about the content of their outreach list. It was gathered in ways that are not diverse, and using that list to ask questions can be difficult since the answers come from a non-diverse slice of the community.
Are there ways to mitigate problematic answers that can restrict an organization from doing work that expands the debate? (Such as how to bring racial justice into discussions of environmental issues.)
It's possible to ask a set of questions where some are disguised as ways of exposing bias, in order to figure out how to interpret the results (proxy questions, or a push poll of sorts).
A campaign to build a community of formerly incarcerated people to take action. They got people to come to a meeting and listen to one another, so attendees would realize their issues are shared by others. That outcome can be difficult to quantify for funders ("we built community trust"). They were able to show that even people not served by the org can be active in the issue anyway.
Ideas for how to visualize data when these issues exist were discussed: analyze the text of interviews to pull out tendencies that are not being directly discussed. This might require tech we don't have easy access to, but it could be useful if you can parse it out of your source data.
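A lightweight version of this idea needs nothing exotic. Below is a minimal sketch (the transcripts, stopword list, and function name are all invented for illustration, not from the session) that counts recurring terms across interview snippets to surface themes nobody raised directly:

```python
from collections import Counter
import re

# Hypothetical interview snippets; in practice these would be full transcripts.
transcripts = [
    "The bus route to the polling place was cut, so getting there is hard.",
    "Without a car, the polling place is hard to reach from our side of town.",
    "Parking near the store is fine, but the bus never comes on time.",
]

# A tiny ad-hoc stopword list; a real project would use a fuller one.
STOPWORDS = {"the", "to", "is", "so", "a", "but", "on", "of",
             "our", "from", "was", "near", "there"}

def term_frequencies(texts):
    """Count how often each non-stopword appears across all transcripts."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in STOPWORDS)
    return Counter(words)

freqs = term_frequencies(transcripts)
print(freqs.most_common(4))
```

Even this crude count surfaces "polling", "place", "bus", and "hard" as shared concerns, which is the kind of tendency the interviewees never stated as a single claim. Fancier tools (topic modeling, etc.) refine the same basic move.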
Color, hue, and saturation can be used effectively for mapping data.
Most important: you have a commitment to create as accurate a representation as possible, even if the conclusion is not what you want.
whatthefuckviz.com is a resource to show great examples of what you should not do.
Even seemingly objective data still reflects the bias of those who gathered it.