Matches in Nanopublications for { ?s ?p ?o <http://purl.org/np/RAeIJfI3VXak4oGG9pGUrFJnP_PyO6UHtf1JH7u-kt_ZE#assertion>. }
- paragraph type Paragraph assertion.
- paragraph hasContent "Crowdsourcing [19] refers to the process of solving a problem formulated as a task by reaching out to a large network of (often previously unknown) people. One of the most popular forms of crowdsourcing is ‘microtasks’ (or ‘microwork’), which consists of dividing a task into several smaller subtasks that can be solved independently. Depending on the problem tackled, the level of task granularity can vary (microtasks whose results need to be aggregated vs. macrotasks, which require filtering to identify the most valuable contributions), as can the incentive structure (e.g., payments per unit of useful work vs. prizes for top participants in a contest). Another major design decision in the crowdsourcing workflow is the selection of the crowd. While many (micro)tasks can be performed by untrained workers, others might require more skilled human participants, especially in specialized fields of expertise such as LD. Of course, expert intervention usually comes at a higher price, either in monetary rewards or in the form of the effort needed to recruit participants in another setting, such as volunteer work. Microtask crowdsourcing platforms such as Amazon Mechanical Turk (MTurk)¹, on the other hand, offer a formidable and readily available workforce at relatively low fees." assertion.