Property:Wikipedia data extraction
Revision as of 18:14, January 23, 2014 by Fnielsen (clone -> Dump)
Wikipedia data extraction refers to the general means by which Wikipedia data were obtained for the purposes of a study. The options are:
- Dump: Data dumps were downloaded (and possibly installed locally) and analyzed.
- Live Wikipedia: Data was extracted by accessing the live Wikipedia website. This includes data extracted from history pages on the live Wikipedia, as long as a local copy of Wikipedia was not reproduced to obtain the data.
- Secondary dataset: A preprocessed dataset of Wikipedia was used to obtain the data for analysis. That is, the researchers depended on someone else's preprocessing of a Wikipedia dump.
Unique values: Dump, Live Wikipedia, N/A, Secondary dataset
Pages using the property "Wikipedia data extraction"
Showing 25 pages using this property.