Ways to conduct user research in open technology
Note taker: David
Jane Park, Creative Commons
Director of Product and Research - 6 months of user research around the relevance and future of CC
Interviewed 85 people about motivations and uses in sharing content
Design phase: prototyping new ideas
Also involved in CC search
Most important - make sure building for real world needs
Marie: Doesn't have much experience with user research
Jack: Palante - recently started doing user stories and research
Laura: Always been user research adjacent
?: Lots of interviews and surveys and usability testing
Susan: User research in community
Andrew: Design practices at firm rooted in user research
Abi: Curious about solving usability issues from legal perspective
Peterson: Looking for best practices
Kyle: Work with groups who use UR
Ada: Lots of design and discovery, but not enough user stories
Jazmyn: UR and surveys
Kristyn: Done user personas, testing, interviews -- more robust processes of user involvement, feeling of participating
Nechari: Grassroots user research, similar to Kristyn
Steve: Trying to make best practices more discoverable
What did CC do? They call it "human research" so as to not be clinical.
CC has own search engine.
Research process: Not scientific. Adapted human-centered-design ...
IDEO, Jobs to Be Done, Erika Hall's Just Enough Research
human-centered design: discovery, design, development
at beginning of design phase; hired doctoral students... including someone with military experience, another person with large-nonprofit experience
focused on sharing of long form texts... anything from blog posts to academic articles; also on images, designs, photos on flickr, 3d designs; any other material they were sharing
categorizing into 3 categories:
- superusers (knew CC, had been using it actively)
- future users (too young to know CC; professionals who could use it but aren't)
- expert users (executives, researchers, etc. -- people who…
asked for referrals
85 in total in the end; international to a degree, but all in English
didn't ask about CC until last 10 minutes
dug deep on why someone would share something online and what they wanted out of what people shared with them
after interviews: listened, copied down most salient points on post-its -- looked for patterns in the data; themes and constellations; insights began to emerge
also looked at previous interviews, about 150 total
common themes and problems:
- How CC just works - what do they mean? how to use?
- Experience of using content itself
- Futures that CC might help build, not currently playing a role in (e.g. people feeling like they don't have control over their content any more)
- CC stands for free content sharing, but nuances hard to understand even for experts (internal assumption more education needed, despite many years of trying)
People introduced to CC after completing a work and don't want to think about it for sharing at that point? Could intervene earlier in creation process?
People don't know what to do with work beyond publishing it -- license may be perpetual but content is not always
Came up with 9 interventions
- polaroid watermark -- attribution added to the image: to a frame (easy to crop) or to the image itself; also added as image metadata
- CC Search -- people want to find things easily (ccsearch.creativecommons.org) ... only images right now (question about music, because it's so important for creators of podcasts + videos)
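The attribution intervention above follows Creative Commons' recommended "TASL" practice (Title, Author, Source, License). A minimal sketch of building such an attribution line -- the function and example values here are my own illustration, not anything CC ships:

```python
# Sketch: composing a TASL-style attribution string (Title, Author,
# Source, License), the format Creative Commons recommends when
# crediting a reused work. All names and URLs below are hypothetical
# illustrations, not a real CC API.

def tasl_attribution(title, author, source_url, license_name, license_url):
    """Return a plain-text attribution line for a CC-licensed work."""
    return (f'"{title}" by {author} ({source_url}) '
            f'is licensed under {license_name} ({license_url})')

line = tasl_attribution(
    "Golden Gate at Dusk",            # hypothetical work title
    "A. Photographer",                # hypothetical author
    "https://example.org/photo/123",  # hypothetical source page
    "CC BY 4.0",
    "https://creativecommons.org/licenses/by/4.0/",
)
print(line)
```

A string like this could be stamped into the "polaroid frame", drawn onto the image itself, or written into the image's metadata, as described above.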
superuser slack channel, 17 of original 85 wanted to keep in touch
friday feedback group
q: in interviews did people know what attribution meant?
a: mostly yes -- they described it as "giving credit"
q: how does friday user feedback sessions work?
a: ask devs at end of cycle to pick features where feedback is desired; ask the group and get them to provide direct feedback or even file GitHub issues
q: how is CC getting folks earlier?
a: back in the day CC was integrated into Adobe; hard to get platforms to do things because no incentive; could bake into search to create awareness?
q: did they write user personas / stories?
a: actually easier just to talk about it in terms of the real interviews, better to anchor in specific and real people
q: barrier is often not having access to source files, did it come up?
a: possibly mentioned a couple of times, but there were a lot of questions about provenance
q: how did interviews match up with stuff they'd heard? what about stuff like analytics?
a: some were no brainers, expectations didn't match up with reality, etc ... CC catalog (with IDs)
... license pages get a lot of traffic, but website doesn't
q: is registry being open sourced?
a: available to public some day! as well as an API
ledger and record not yet; looked at blockchain, but it was too much
q: 85 interviews is a lot - logistics / challenges sourcing interview subjects; how did know enough was enough? were there diminishing returns?
a: consultant said that after 100 they would start just hearing the same things; by 50 there were enough, but they had obligations
sourcing -- anonymization, and being clear about it upfront, is very important; don't point to specific users/people, use vague descriptors ("students")
could be interesting to automatically transcribe
q: think about pay people? or did they volunteer? how come to that conclusion? will it remain volunteer-based?
a: money changes tenor of transaction
people love to talk about their lives
haven't had trouble finding people to talk with them
all word of mouth/referrals; would be presented at conference
group interviews weren't as good as one-on-ones
q: now in prototyping -- how to decide what user feedback to incorporate?
a: trying to incorporate most of it, lean startup -- test and see .. open question
q: what about conflicting user feedback?
a: not really, some pretty granular about license itself (left wing of open source movement -- strong feelings about these issues), or just problems with search
q: who joined slack?
a: people who saw medium post or people from interview cohort ... 17 from there plus 10ish more from blog post
q: have put together testing plan?
a: not yet, but when prototypes exist, get feedback from slack group first
like moderated user testing -- having people talk through things as they're working
q: Kalamuna's bias is towards…
a: Kalamuna does it remotely, using GoToMeeting or Zoom so sessions can be recorded
q: why pay with gift cards?
a: overhead + logistics
q: is it ethical to ask for free labor from community?
someone says it limits who can do it -- some people can't participate without money; is it coercion?
how do you reach people you didn't otherwise?
could back up and ask similar questions of CC
huge question for cc going forward is who are they designed for?
will use user personas, likely to focus on educators
q: had they ever done this before?
a: not exactly, but had a lot of agency. if you have a team, you really need to train them well. colleagues struggled to replicate the process at times
solo meant more work, but also more control
having interviewer and note taker really useful, have note taker be interviewer in future