Citizen science

The educational technology and digital learning wiki

Draft

The links and bibliography below were taken directly from http://en.wikipedia.org/wiki/Citizen_science

I will start working on this piece in January 2012 and try to do a quick literature review, in particular with respect to topics like "how do participants learn", "in what respect are citizens creative", "what is their motivation", and "how do communities work".

- Daniel K. Schneider 16:43, 23 December 2011 (CET)

Introduction

Citizen science has no single, universally accepted definition. It can refer to:

  • Participation of citizens in collecting data, for example observations of animals, pollution, or plant growth.
  • Participation of citizens in analyzing data, in various forms: some volunteers provide computing power (a typical example is the captcha mechanism in this wiki for curbing spam), while others help recognize patterns (e.g. the shapes of galaxies).
  • Dissemination of scientific thought and results in schools, in order to promote engagement with science or to help update the curriculum.
  • Amateur science, i.e. citizens create scientific thoughts and other products.
  • Citizen assessment of science and scientific projects.

Citizen science programs vary widely along many dimensions, e.g. aims, target populations, locations (schools, museums, media, Internet groups), forms, and subject areas.
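To make the data-collection flavour of citizen science concrete, the sketch below aggregates volunteer-submitted animal observations by species. All names and figures are invented for illustration; real projects (e.g. bird counts) add validation and metadata, but the core aggregation step looks like this:

```python
from collections import Counter

# Hypothetical volunteer-submitted records: (observer, species, count)
observations = [
    ("alice", "blackbird", 3),
    ("bob", "blackbird", 2),
    ("carol", "robin", 1),
    ("alice", "robin", 4),
]

def totals_by_species(records):
    """Sum reported sightings per species across all volunteers."""
    totals = Counter()
    for _observer, species, count in records:
        totals[species] += count
    return dict(totals)

print(totals_by_species(observations))  # {'blackbird': 5, 'robin': 5}
```

A real platform would additionally record time and place of each sighting and flag implausible reports for expert review.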

Citizens as participants

(to do)

Informal science education

Informal science education refers mostly to larger or smaller top-down initiatives that aim to raise interest in STEM (science, technology, engineering, and mathematics) subjects. As an example, “The Informal Science Education (ISE) program at the National Science Foundation (NSF) invests in projects designed to increase interest in, engagement with, and understanding of science, technology, engineering, and mathematics (STEM) by individuals of all ages and backgrounds through self-directed learning experiences” (Ucko, 2008: 9).

Evaluation of citizen science projects

The US "Informal education and outreach framework" (Ucko, 2008: 11) identifies six impact categories with respect to both public audiences and professional audiences:

Impact Category | Public Audiences | Professional Audiences
Awareness, knowledge or understanding (of) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Engagement or interest (in) | STEM concepts, processes, or careers | Advancing the informal STEM education/outreach field
Attitude (towards) | STEM-related topics or capabilities | Informal STEM education/outreach research or practice
Behavior (related to) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Skills (based on) | STEM concepts, processes, or careers | Informal STEM education/outreach research or practice
Other | Project specific | Project specific

Links

General

Organizations

Index pages

On-line environments

Evaluation

Bibliography

  • Ballard, H., Pilz, D., Jones, E.T., and Getz, C. (2005). Training Curriculum for Scientists and Managers: Broadening Participation in Biological Monitoring. Corvalis, OR: Institute for Culture and Ecology.
  • Baretto, C., Fastovsky, D. and Sheehan, P. (2003). A Model for Integrating the Public into Scientific Research. Journal of Geoscience Education. 50 (1). p. 71-75.
  • Bauer, M., Petkova, K., and Boyadjieva, P. (2000). Public Knowledge of and Attitudes to Science: Alternative Measures That May End the "Science War". Science Technology and Human Values. 25 (1). p. 30-51.
  • Bonney, R. and LaBranche, M. (2004). Citizen Science: Involving the Public in Research. ASTC Dimensions. May/June 2004, p. 13.
  • Bonney, R., Cooper, C.B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K.V. and Shirk, J. (2009). Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience. 59 (11). p. 977-984.
  • Brossard, D., Lewenstein, B., and Bonney, R. (2005). Scientific Knowledge and Attitude Change: The Impact of a Citizen Science Project. International Journal of Science Education. 27 (9). p. 1099-1121.
  • Cooper, C.B., Dickinson, J., Phillips, T., and Bonney, R. (2007). Citizen Science as a Tool for Conservation in Residential Ecosystems. Ecology and Society. 12 (2).
  • Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., Leaver-Fay, A., Baker, D., Popović, Z. and Foldit players (2010). Predicting protein structures with a multiplayer online game. Nature. 466. p. 756-760.
  • Firehock, K. and West, J. (2001). A brief history of volunteer biological water monitoring using macroinvertebrates. Journal of the North American Benthological Society. 14 (2) p. 197-202.
  • Khatib, F., Cooper, S., Tyka, M.D., Xu, K., Makedon, I., Popović, Z., Baker, D. and Foldit Players (2011). Algorithm discovery by protein folding game players. Proceedings of the National Academy of Sciences.
  • McCaffrey, R.E. (2005). Using Citizen Science in Urban Bird Studies. Urban Habitats. 3 (1). p. 70-86.
  • Osborn, D., Pearse, J. and Roe, A. Monitoring Rocky Intertidal Shorelines: A Role for the Public in Resource Management. In Magoon, O., Converse, H., Baird, B., Jines, B. and Miller-Henson, M. (Eds.), California and the World Ocean: Revisiting and Revising California's Ocean Agenda. p. 624-636. Reston, VA: ASCE.
  • Silvertown, J. (2009). A New Dawn for Citizen Science. Trends in Ecology & Evolution. 24 (9). p. 467-471
  • Spiro, M. (2004). What should the citizen know about science? Journal of the Royal Society of Medicine, 97 (1).
  • Hand, E. (2010). Citizen science: People power. Nature. 466. p. 685-687.
  • Ucko, D.A. (2008). Introduction to Evaluating Impacts of NSF Informal Science Education Projects. In Friedman, A. (Ed.), Framework for Evaluating Impacts of Informal Science Education Projects [On-line]. http://informalscience.org/evaluations/eval_framework.pdf.