Wikipedia revision toolkit: efficiently accessing Wikipedia's edit history
Authors: | Oliver Ferschke, Torsten Zesch, Iryna Gurevych |
Citation: | 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies: 97-102. June 21, 2011. Portland, Oregon, USA. |
Publication type: | Conference paper |
Peer-reviewed: | Yes |
Link(s): | http://dl.acm.org/citation.cfm?id=2002440.2002457 |
Added by Wikilit team: | Yes |
Abstract
We present an open-source toolkit which allows (i) to reconstruct past states of Wikipedia, and (ii) to efficiently access the edit history of Wikipedia articles. Reconstructing past states of Wikipedia is a prerequisite for reproducing previous experimental work based on Wikipedia. Beyond that, the edit history of Wikipedia articles has been shown to be a valuable knowledge source for NLP, but access is severely impeded by the lack of efficient tools for managing the huge amount of provided data. By using a dedicated storage format, our toolkit massively decreases the data volume to less than 2% of the original size, and at the same time provides an easy-to-use interface to access the revision data. The language-independent design allows to process any language represented in Wikipedia. We expect this work to consolidate NLP research using Wikipedia in general, and to foster research making use of the knowledge encoded in Wikipedia’s edit history.
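The abstract attributes the reduction to under 2% of the original size to a dedicated storage format for revisions. A standard technique for this is delta encoding: storing each revision as a diff against its predecessor rather than as full text, since consecutive revisions usually differ only slightly. The sketch below illustrates the general idea only; the function names are illustrative and do not reflect the toolkit's actual (Java-based) implementation.

```python
import difflib

def encode_delta(prev: str, curr: str) -> list:
    """Store only the operations needed to turn prev into curr."""
    sm = difflib.SequenceMatcher(a=prev, b=curr)
    ops = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))          # reuse a slice of prev
        else:
            ops.append(("insert", curr[j1:j2]))   # new or changed text
    return ops

def decode_delta(prev: str, ops: list) -> str:
    """Reconstruct a revision from its predecessor and the stored delta."""
    parts = []
    for op in ops:
        if op[0] == "copy":
            parts.append(prev[op[1]:op[2]])
        else:
            parts.append(op[1])
    return "".join(parts)
```

For a typical article edit that changes one sentence, the stored delta is a small fraction of the full revision text, which is how diff-based storage of an entire edit history can shrink the data so drastically.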
Research questions
"We present an open-source toolkit which allows (i) to reconstruct past states of Wikipedia, and (ii) to efficiently access the edit history of Wikipedia articles. Reconstructing past states of Wikipedia is a prerequisite for reproducing previous experimental work based on Wikipedia."
Research details
Topics: | Other natural language processing topics |
Domains: | Computer science |
Theory type: | Design and action |
Wikipedia coverage: | Main topic |
Theories: | Undetermined |
Research design: | Experiment |
Data source: | |
Collected data time dimension: | Longitudinal |
Unit of analysis: | Edit |
Wikipedia data extraction: | Clone |
Wikipedia page type: | Article, History |
Wikipedia language: | English |
Conclusion
"In this paper, we presented an open-source toolkit which extends JWPL, an API for accessing Wikipedia, with the ability to reconstruct past states of Wikipedia, and to efficiently access the edit history of Wikipedia articles. Reconstructing past states of Wikipedia is a prerequisite for reproducing previous experimental work based on Wikipedia, and is also a requirement for the creation of time-based series of Wikipedia snapshots and for assessing the influence of Wikipedia growth on NLP algorithms. Furthermore, Wikipedia’s edit history has been shown to be a valuable knowledge source for NLP, which is hard to access because of the lack of efficient tools for managing the huge amount of revision data. By utilizing a dedicated storage format for the revisions, our toolkit massively decreases the amount of data to be stored. At the same time, it provides an easy-to-use interface to access the revision data. We expect this work to consolidate NLP research using Wikipedia in general, and to foster research making use of the knowledge encoded in Wikipedia’s edit history. The toolkit will be made available as part of JWPL, and can be obtained from the project’s website at Google Code. (http://jwpl.googlecode.com)"
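Reconstructing a past state of Wikipedia, as described in the conclusion, amounts to selecting, for each article, the latest revision at or before a target timestamp. The sketch below shows that lookup in minimal form; the `Revision` class and `state_at` function are hypothetical illustrations, not the toolkit's actual JWPL (Java) interface.

```python
from bisect import bisect_right
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Revision:
    timestamp: datetime
    text: str

def state_at(revisions: list, when: datetime) -> str:
    """Return the article text as it stood at `when`.

    `revisions` must be sorted by timestamp, oldest first.
    """
    times = [r.timestamp for r in revisions]
    # Index of the first revision strictly after `when`;
    # the revision just before it is the state we want.
    i = bisect_right(times, when)
    if i == 0:
        raise ValueError("article did not exist yet at that time")
    return revisions[i - 1].text
```

Applying this lookup to every article with the same cutoff timestamp yields a consistent snapshot, which is what makes time-based series of Wikipedia snapshots and reproduction of earlier experiments possible.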
Comments