Archive:QA Wikimedia Commons images review, May 2010
This page has been archived.
Its content is no longer being maintained, is likely out of date, and may be inaccurate. This page can be relocated to Meta-Wiki.
Does the Foundation support Jimmy's desire to have materials removed from your projects?
Of course. Jimmy is acting in his traditional role as a thought leader in Wikimedia's volunteer community. The Wikimedia Foundation supports the continuing efforts of Jimmy and other Wikimedia volunteers to ensure that information in the Wikimedia projects is of high quality and good educational value.
Why did Jimmy decide to launch this review?
Jimmy is very active on the Wikimedia projects. Without speculating on what exactly motivated him to start thinking about this issue, it's safe to say that he dedicates a lot of his time to working on Wikimedia projects and thinking about their general health and quality.
Is the Wikimedia Foundation removing questionable material from Wikipedia or its other projects?
No. Project volunteers are reviewing the materials and discussing their educational value, which is something that takes place constantly on all of our projects. Right now there is a significantly increased amount of conversation taking place on our projects, particularly Wikimedia Commons, focussed on the removal of questionable content.
The Foundation itself does not actively remove or alter content on our projects unless we become aware of illegal or copyright-infringing materials.
How many images are being removed? What percentage of information is 'questionable'?
The categories in which questionable material can be found are comparatively very small. There are over 6.5 million files on the Wikimedia Commons, and the images and media files in question are a very small percentage of that overall total.
Does this mean that volunteers are removing all objectionable material?
No. Right now we know that contributors are reviewing a larger amount of material on the projects than usual, and examining those images against pre-existing policies. The discussion around the removal of these images is part of a larger, important discussion about the presence of sexually oriented images on our projects. We expect there will be a significantly increased amount of conversation about these images over the next few days. It's a serious issue that warrants serious discussion.
Are you only removing photographic media?
Volunteer contributors are reviewing all media materials. The Wikimedia Commons contains photographic images as well as other graphics, audio, and video files.
Why did the Board of Trustees put out a statement on this topic?
The Wikimedia Foundation Board of Trustees takes leadership positions on key issues. In this case, it was important to the Board to reinforce the mission of the projects, and the importance of applying existing policies. The Board wanted to support Jimmy and other volunteers who are working to uphold good editorial standards.
Do volunteers have to do what the board tells them?
No, the Board of Trustees has no authority over the Wikimedia volunteers. Having said that, Wikimedia volunteers generally acknowledge that the Board of Trustees is made up of people who share their vision, mission, and goals. So in general, Wikimedia volunteers tend to be interested in what the board says.
Are you carrying out this 'clean-up' because of recent media coverage? To avoid legal risk?
Wikimedia volunteers regularly review editorial policies and check to ensure they're being complied with. It's a normal part of Wikimedia's editorial process. We have not received any complaints or inquiries from any legal authorities regarding materials on any of our projects.
Commons Image Review - May 2010, Questions and Answers
What is the Wikimedia Foundation's position regarding access to questionable content?
The Wikimedia Foundation believes that its projects should strive to provide high quality, informative, educational content. Having said that, the Wikimedia projects do generally contain material that some audiences may find inappropriate or offensive. That's inherent to the nature of the projects: they are global in scope, contain millions of words and images, and are read by hundreds of millions of people around the world who represent very different demographics and attitudes. The Wikimedia community expends a lot of energy discussing how to improve the projects and make them useful to the largest possible number of people. The Wikimedia projects aim to exclude potentially objectionable material that offers little or no informational value.
Are you currently discussing filtering, or tagging, or other technological solutions that would make Wikimedia material more palatable to people who want to protect children?
There are a number of technological mechanisms in use on many sites, and such options have been under consideration by Wikimedia project volunteers almost since the projects began. We're hopeful that more energy can be focussed on finding solutions that give users opportunities to customize their experience while still ensuring our projects maintain their open nature.
Why would you keep any questionable material at all on Wikipedia?
There are more than 6 million images and 15 million articles on the Wikimedia sites, with new material continually being added. The Wikimedia Foundation relies on a large global community of active volunteers to exercise editorial judgement in determining what belongs in the projects. Volunteers take their responsibilities seriously, and every day they remove material that they believe is inappropriate. If they believe material is illegal, it will normally be deleted immediately. However, the Wikimedia projects do generally contain material that some audiences may find inappropriate or offensive. The projects are used by people around the world who represent an enormous diversity of ideologies, cultures, religions and viewpoints. It is pretty much a certainty that people will be able to find material they dislike, particularly if they look for it. We do not and cannot guarantee otherwise. The same is true for all other large internet properties that allow users to contribute content.