Category Archives: Crowdsourcing

Posts about crowdsourcing projects and methods.

Crowdsourcing and Engaging the Public

Update: Here are our notes from the discussion: docs.google.com/document/d/1qKnwi–Y867jrIZECgNnyyCE8RhW67g_ChR_bxvxQ1k/edit

I’d like to propose a session on crowdsourcing and engaging the public in digital humanities efforts. My background is in computer science, so I’m especially interested in discussing how folks from the humanities are (thinking about) using crowdsourcing or otherwise soliciting public participation in their work. Possible discussion topics include:

  • How can we involve novices and amateurs in contributing meaningfully to DH projects?
  • What are examples of compelling crowdsourced DH efforts we should know about?
  • What technical and social challenges are we experiencing in engaging the public in our DH projects?
  • When is crowdsourcing a good idea? When are automated approaches more effective?
  • How can we build a critical mass of participants? How can we sustain that participation?
  • What ethical issues should we consider when planning crowdsourced DH projects?
  • What are fruitful ways for novices and experts (e.g. scholars) to support each other in a DH context?
  • What lessons from crowdsourcing in other domains (citizen science, etc.) can we bring to DH, and vice versa?

Musiplectics: Digitally Ranking Musical Complexity for Educators

000000">Musiplectics: the study of the complexity of music

We are in the middle of a collaborative research project here at Virginia Tech, between the departments of music and computer science and ICAT, creating a web-based application that automatically ranks the complexity of written music (music scores). The idea is that, by scanning a music score or supplying an existing PDF or XML file, users could determine the skill level required to perform a musical work. We have developed a working prototype for music written for clarinet, but we plan to expand it to cover all other common musical instruments.
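To give a concrete sense of the kind of analysis involved, here is a minimal sketch (not our actual prototype) of how a naive complexity score might be computed from a MusicXML file, using only two crude proxies for difficulty: note density and pitch range. The file name and the weights are illustrative assumptions.

    # naive_complexity.py -- illustrative sketch only, not the Musiplectics prototype.
    # Scores one MusicXML file by two crude proxies for difficulty:
    # note density (notes per measure) and pitch range (in semitones).
    import xml.etree.ElementTree as ET

    STEP_TO_SEMITONE = {'C': 0, 'D': 2, 'E': 4, 'F': 5, 'G': 7, 'A': 9, 'B': 11}

    def pitch_to_midi(pitch_el):
        """Convert a MusicXML <pitch> element to a MIDI note number."""
        step = pitch_el.findtext('step')
        octave = int(pitch_el.findtext('octave'))
        alter = int(float(pitch_el.findtext('alter') or 0))
        return 12 * (octave + 1) + STEP_TO_SEMITONE[step] + alter

    def naive_complexity(path):
        """Return a rough complexity score for a single score file."""
        root = ET.parse(path).getroot()
        measures = root.findall('.//measure')
        midi_numbers = [pitch_to_midi(p) for p in root.findall('.//note/pitch')]
        if not measures or not midi_numbers:
            return 0.0
        density = len(midi_numbers) / len(measures)          # notes per measure
        pitch_range = max(midi_numbers) - min(midi_numbers)  # in semitones
        # Illustrative weighting; a real system would calibrate these weights
        # per instrument, e.g. from expert or crowd input.
        return 1.0 * density + 0.5 * pitch_range

    if __name__ == '__main__':
        print(naive_complexity('clarinet_etude.xml'))  # hypothetical file name

A real ranking would of course need many more features (rhythm, key, articulation, range relative to the instrument), but the same pipeline of parsing a score and combining weighted features applies.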


Pedagogical Value:

I. To increase the utility of existing digital archives. With the growing availability of digitized scores (both historical works in the public domain and XML files uploaded to the web daily), musicians and music teachers are overwhelmed by the number of works newly available to them at the click of a button. If these works could be ranked and categorized by complexity, these vast digital resources would become less daunting and more widely used by educators.

II. For competitions and festivals. Educators must often make highly subjective, time-consuming decisions about the difficulty level of the music they select. This ranking system could make those often-debated choices clearer and more objective.


Future Research Projects:

We would like to explore collaborating with existing databases, libraries, or digital collections to rank as many scores as possible, and to investigate other applications of our technology. Further collaborations, suggestions, and ideas from our colleagues would be wonderful!

Questions:

– How can we use crowdsourcing, surveys, or other methods to set the parameters for judging what is difficult on various instruments? Is there a way to gather input from a large number of participants so that our parameters represent a reasonable consensus of what people believe? (A rough sketch of one way to aggregate such input appears after these questions.)

– How can we find the most advanced optical recognition software (OCR, or more specifically optical music recognition) to help us make sense of scores that are sometimes hard to decipher?

– Are there other applications of our software that might be unexplored?

– Who would like to collaborate with us?
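
On the first question above, one simple approach would be to have many players rate individual techniques (say, a particular register or interval on clarinet) and use the averaged ratings as scoring weights. Below is a minimal sketch of that aggregation; the survey data, the 1-to-5 scale, and the technique names are purely hypothetical.

    # Aggregate hypothetical survey ratings (1 = easy, 5 = very hard) into
    # average difficulty weights per technique, which could then serve as
    # parameters for a complexity scorer. All data here is illustrative.
    from collections import defaultdict
    from statistics import mean

    # (instrument, technique, rating) tuples, e.g. collected from a web survey.
    responses = [
        ('clarinet', 'altissimo register', 5),
        ('clarinet', 'altissimo register', 4),
        ('clarinet', 'wide leap', 4),
        ('clarinet', 'wide leap', 3),
        ('flute', 'wide leap', 2),
    ]

    ratings = defaultdict(list)
    for instrument, technique, rating in responses:
        ratings[(instrument, technique)].append(rating)

    # Normalize the mean rating on the 1-5 scale to a 0-1 weight.
    weights = {key: (mean(vals) - 1) / 4 for key, vals in ratings.items()}

    for (instrument, technique), w in sorted(weights.items()):
        print(f'{instrument:10s} {technique:20s} weight = {w:.2f}')

With enough responses, the averages (or something more robust, such as pairwise comparisons) would give per-instrument parameters that reflect a broad consensus rather than any single expert's judgment.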

