The educational technology and digital learning wiki
Latest revision as of 12:52, 29 November 2013
Crowd Sourcing Data Collection through Amazon Mechanical Turk (2013/11/06)
THIS PAGE DESCRIBES A CITIZEN SCIENCE PROJECT
Start date: 2011/09/01
- Beta start date: N/A
- End date: 2013/09/01 (project closed)
⇳ Description
Crowdsourcing is an increasingly popular technique used to complete complex tasks or collect large amounts of data. This report documents an effort to employ crowdsourcing using the Mechanical Turk service hosted by Amazon. The task was to collect labeling data on several thousand short video clips, as such labels would be perceived by a human. The approach proved viable, collecting large amounts of data in a relatively short time frame, but required specific consideration of the population of workers and the impersonal medium through which data were collected.
➠ Purpose
According to Pierce & Fung (2013), the goal of this project was to generate a large number of video vignettes meant to visually demonstrate specific verbs. The project called for 48 verbs to be demonstrated, each in 10 different exemplars. Each exemplar was filmed with 16 different setting variations, consisting of different backgrounds, camera angles, times of day, etc. This produces a total of 7680 vignettes.
The vignettes are to be data points provided to system design teams as part of a larger research and development project.
This project included 6 variants (sub-projects) described in the Pierce & Fung (2013) report.
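The vignette total is simply the product of the three design factors stated above; a quick check:

```python
# Design factors from the project description
verbs = 48        # verbs to be demonstrated
exemplars = 10    # exemplars per verb
variations = 16   # setting variations per exemplar (background, camera angle, time of day, ...)

total_vignettes = verbs * exemplars * variations
print(total_vignettes)  # 7680
```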
? Research question
unknown
MAIN TEAM LOCATION
Project team page
Leader:
Institution:
Partner institutions:
Contact:
CONTRIBUTION TYPE: data collection
PARTICIPATION TYPOLOGY: crowdsourcing
GAMING GENRE: NONE
GAMING ELEMENTS: NONE
◉ Tasks description
Recognition Task (Crowdsourced Study subproject): For each task, also known as a stimulus, a vignette was displayed along with a verb question (“Do you see [verb]?”) and the verb definition. Workers responded to a single verb question with a present/absent judgment.
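The page does not say how the individual present/absent judgments were combined into a final label for each vignette; a common choice for this kind of binary crowdsourced labeling is a simple majority vote over the workers who saw the same stimulus. A minimal sketch with hypothetical data (the tie-breaking rule is an assumption, not from the report):

```python
from collections import Counter

def majority_label(judgments):
    """Return the majority present/absent judgment for one vignette.

    judgments: list of 'present'/'absent' strings, one per worker.
    Ties resolve to 'absent' (a conservative assumption, not from the report).
    """
    counts = Counter(judgments)
    if counts["present"] > counts["absent"]:
        return "present"
    return "absent"

# Hypothetical responses from five workers to "Do you see [run]?"
responses = ["present", "present", "absent", "present", "absent"]
print(majority_label(responses))  # present
```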
⤯ Interaction with objects
▣ Interface
- Data type to manipulate: pictures, text
- Interface enjoyment: somewhat cool/attractive
- Interface usability: rather easy to use
GUIDANCE
- Tutorial: Somewhat
- Peer to peer guidance: Somewhat
- Training sequence: Somewhat
FEEDBACK ON
- Individual performance: Somewhat
- Collective performance: Somewhat
- Research progress: Somewhat
❂ Feedback and guidance description
COMMUNITY TOOLS
- Communication:
- Social Network: N/A
- Member profiles: N/A
- Member profile elements:
NEWS & EVENTS
- Main news site:
- Frequency of project news updates: N/A
- Type of events:
- Frequency of events :
⏣ Community description
- Community size (volunteers based):
- Role:
- Interaction form:
- Has official community manager(s): N/A
- Has team work: N/A
- Other:
- Community-led additions: 2013/09/01
Other information
PROJECT
Url: N/A
Start date: 2011/09/01
End date: 2013/09/01
Infrastructure: Amazon Mechanical Turk
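For context, data collection on Mechanical Turk works by posting HITs (Human Intelligence Tasks) whose question content is an XML payload. The page does not include the project's actual HIT template, so the sketch below builds a plausible HTMLQuestion payload for one verb question; the verb, definition, video URL, and frame height are all hypothetical. With AWS credentials, such a payload could be passed to the MTurk API's `CreateHIT` operation (e.g. via boto3's `mturk` client).

```python
import html

# Minimal HTMLQuestion wrapper; the schema URL is Amazon's published
# HTMLQuestion XSD. Everything inside the form is a hypothetical layout.
HTML_QUESTION_TEMPLATE = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
<!DOCTYPE html>
<html>
 <body>
  <form name="mturk_form" method="post" action="https://www.mturk.com/mturk/externalSubmit">
   <video src="{video_url}" controls></video>
   <p>Do you see [{verb}]? ({definition})</p>
   <input type="radio" name="judgment" value="present"/> Present
   <input type="radio" name="judgment" value="absent"/> Absent
   <input type="submit"/>
  </form>
 </body>
</html>
]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""

def build_question(verb, definition, video_url):
    # Escape the inserted text so it is safe inside the HTML form
    return HTML_QUESTION_TEMPLATE.format(
        verb=html.escape(verb),
        definition=html.escape(definition),
        video_url=html.escape(video_url),
    )

question_xml = build_question("run", "move at a speed faster than a walk",
                              "https://example.com/vignette_0001.mp4")
print("Do you see [run]?" in question_xml)  # True
```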
TEAM
Official team page:
Leader:
PROJECT DEFINITION
Subject
Engineering and technology > (other)
Grey typology
- Computing: No
- Thinking: Yes
- Sensing: Somewhat
- Gaming: No

Participation typology
- Crowdsourcing: ☑
- Distributed intelligence: ☐
- Participatory science: ☐
- Extreme citizen science: ☐
- Science outreach: ☐

Contribution type
- Data collection: ☑
- Data analysis: ☐
- Data interpretation: ☐

Gaming
- Genre:
- Gaming elements: other

Interface
- Data type to manipulate: pictures, text
- Interface enjoyment: somewhat cool/attractive
- Interface usability: rather easy to use
- Member profiles: N/A
- Member profile elements:
ABOUT GUIDANCE AND FEEDBACK
Guidance
- Tutorial and documentation: Somewhat
- Training sequence: Somewhat
- Peer to peer guidance: Somewhat

Feedback on
- Individual performance: Somewhat
- Collective performance: Somewhat
- Research progress: Somewhat
Tools
- Communication:
- Social Network: N/A

News & Events
- Main news site:
- Frequency of project news updates: N/A
- Type of events:
- Frequency of events:

Community description
- Community size (volunteers based):
- Role:
- Interaction form:
- Has official community manager(s): N/A
- Has team work: N/A

Other information about community
- Community-led additions: 2013/09/01
OTHER PROJECT INFORMATION
- Project open: No
- Completion level: Low
- Main team location: USA
Bibliography