Course:LIBR559A/Hara, K., Le, V., & Froehlich, J. (2013)

Hara, K., Le, V., & Froehlich, J. (2013). Combining Crowdsourcing and Google Street View to Identify Street-level Accessibility Problems (pp. 631-640). College Park: Human-Computer Interaction Lab (HCIL), Computer Science Department, University of Maryland.

The purpose of Hara, Le & Froehlich’s (2013) article is to assess the feasibility of crowdsourcing the task of identifying and documenting sidewalk accessibility problems, using untrained workers from Amazon Mechanical Turk to label images drawn from Google Street View. The authors begin by noting that many cities have accessibility problems, including poorly maintained walkways and missing curb ramps, and that there is a lack of effective systems for identifying which areas of a city are accessible. This unjustly affects the 30.6 million people in the US alone who, as of 2010, had disabilities that impair their ability to walk.

Hara, Le & Froehlich (2013) conduct two feasibility studies. In the first study, a group of labellers consisting of the researchers and wheelchair users acting as ‘sidewalk accessibility experts’ labels Street View images in order to show that the labelling approach is feasible and reliable, to establish a performance baseline, and to produce ground-truth labels against which crowd worker performance can later be assessed. In the second study, the researchers assess whether crowdsourced labour can perform the task of labelling accessibility issues on walkways.
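For readers unfamiliar with how crowd labels are scored against ground truth, the sketch below illustrates one plausible way to do it: take a majority vote over each image’s crowd labels and compare the result to the expert label, once for whether any problem exists and once for the specific problem type. This is a minimal illustration with made-up image IDs and label names, not the authors’ actual evaluation pipeline.

```python
from collections import Counter

# Hypothetical example data: one ground-truth label per image (study 1)
# and several crowd worker labels per image (study 2). Label names are
# illustrative only, not taken from the paper.
ground_truth = {
    "img_01": "curb_ramp_missing",
    "img_02": "no_problem",
    "img_03": "surface_problem",
}
crowd_labels = {
    "img_01": ["curb_ramp_missing", "curb_ramp_missing", "obstacle"],
    "img_02": ["no_problem", "no_problem", "no_problem"],
    "img_03": ["obstacle", "surface_problem", "obstacle"],
}

def majority_vote(labels):
    """Return the most common label among the crowd workers for one image."""
    return Counter(labels).most_common(1)[0][0]

existence_hits = 0   # crowd agrees on whether a problem exists
type_hits = 0        # crowd also picks the same problem type
for image, truth in ground_truth.items():
    vote = majority_vote(crowd_labels[image])
    if (vote == "no_problem") == (truth == "no_problem"):
        existence_hits += 1
    if vote == truth:
        type_hits += 1

n = len(ground_truth)
print(f"problem-existence accuracy: {existence_hits / n:.1%}")
print(f"problem-type accuracy:      {type_hits / n:.1%}")
```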

Hara, Le & Froehlich’s (2013) studies gave positive results: turkers were able to identify the existence of an accessibility problem with 80.6% accuracy and the correct problem type with 78.3% accuracy. The researchers also identified limitations of their method, including the difficulty of judging challenges such as sidewalk width from imagery and the blurriness of some Google Street View images.

Hara, Le & Froehlich’s (2013) work expands on research in human-information interaction and accessibility. They discuss the importance of access to information for marginalized groups: in this case, the marginalized group is people with disabilities, and the information that is lacking is information on accessible walkways in cities. Although the subject is important and the researchers’ methodology is comprehensive, the proposed method (untrained labour from Amazon Mechanical Turk) has been criticized by many as unethical (Fort, Adda & Cohen, 2011). The ethics of this method are not comprehensively addressed in the article.

Keywords: Crowdsourcing accessibility; accessible urban navigation; Google Street View; Mechanical Turk; image labeling; Information interfaces and presentation; Human-Information Interaction

References

Fort, K., Adda, G., & Cohen, K. (2011). Amazon Mechanical Turk: Gold Mine or Coal Mine? Computational Linguistics, 37(2), 413-420. http://dx.doi.org/10.1162/coli_a_00057

Hara, K., Le, V., & Froehlich, J. (2013). Combining Crowdsourcing and Google Street View to Identify Street-level Accessibility Problems (pp. 631-640). College Park: Human-Computer Interaction Lab (HCIL) Computer Science Department, University of Maryland.

Page Author: Salim Zubair