Spam


Draft

Lookup IP addresses and domain names

This may allow you to block whole domains (e.g. in the httpd.conf file or at the system level). Sometimes wikis are spammed manually, and this can help a bit.

If your MediaWiki is spammed, you will first have to either go through your web server logs (e.g. search for "submitlogin") or install an extension that shows the IP addresses of users.
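
If you prefer a script over manual searching, a minimal PHP sketch like the following can count login submissions per IP address. The log path and the Apache combined log format are assumptions; adjust them to your server:

<?php
# Hypothetical sketch: count "submitlogin" hits per client IP in an access log.
# Assumes the Apache combined log format, where the client IP is the first field.
$log = '/var/log/apache2/access.log';
$hits = array();
foreach ( file( $log ) as $line ) {
    if ( strpos( $line, 'submitlogin' ) === false ) {
        continue;
    }
    $ip = strtok( $line, ' ' );
    $hits[$ip] = isset( $hits[$ip] ) ? $hits[$ip] + 1 : 1;
}
arsort( $hits );
foreach ( $hits as $ip => $count ) {
    echo "$ip\t$count\n";
}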

The CheckUser extension allows you to figure out where users connect from and may help you decide whether to block a whole IP range or several ranges (e.g. a whole country). It is installed on EduTechWiki and also on some Wikipedia/Wikimedia sites.
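
A minimal setup sketch for LocalSettings.php, assuming the older require_once loading style used elsewhere on this page; CheckUser also needs its database tables (run the extension's install/update script), and the right names should be checked against your version:

# CheckUser - lets trusted users see the IP addresses behind accounts
require_once( "$IP/extensions/CheckUser/CheckUser.php" );
# Grant the checkuser rights to sysops (or a dedicated group)
$wgGroupPermissions['sysop']['checkuser']     = true;
$wgGroupPermissions['sysop']['checkuser-log'] = true;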

Alternatively, dig through your web server access logs and then look up the offending IP addresses and domain names (e.g. via whois or reverse DNS).

MediaWiki spamming

There are several strategies:

Registered users

To fight spamming, only registered users should be able to edit. Edit LocalSettings.php:

$wgGroupPermissions['*']['edit']            = false;
$wgGroupPermissions['*']['createaccount']   = true;
$wgGroupPermissions['*']['read']            = true;
Light-weight user creation that requires some math

This can defeat some spamming scripts.
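
A minimal sketch, assuming the ConfirmEdit extension loaded in the older require_once style; its default SimpleCaptcha module asks a small arithmetic question:

# ConfirmEdit with its default SimpleCaptcha (simple arithmetic question)
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
$wgCaptchaClass = 'SimpleCaptcha';
# Only challenge account creation, not ordinary edits
$wgCaptchaTriggers['edit']          = false;
$wgCaptchaTriggers['createaccount'] = true;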

Making user creation more difficult with captcha

This can defeat more spamming scripts.
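
A minimal sketch, assuming ConfirmEdit's FancyCaptcha module (distorted-image captchas); the images must be pre-generated with the captcha.py script that ships with the extension, and the directory below is an assumption:

# ConfirmEdit with FancyCaptcha (image-based captchas)
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
require_once( "$IP/extensions/ConfirmEdit/FancyCaptcha.php" );
$wgCaptchaClass = 'FancyCaptcha';
# Directory holding the pre-generated captcha images
$wgCaptchaDirectory = "$IP/images/captcha";
$wgCaptchaTriggers['createaccount'] = true;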

Making user creation more difficult with reCAPTCHA (which also contributes to a digitization project)

This extension is currently used on EduTechWiki with roughly the following setup:

# Anti Spam ConfirmEdit
# Recaptcha relies on ConfirmEdit, but only ONE needs to be loaded
# require_once("extensions/ConfirmEdit/ConfirmEdit.php");

# ReCaptcha
# See the docs in extensions/recaptcha/ConfirmEdit.php
# http://wiki.recaptcha.net/index.php/Main_Page
require_once( "$IP/extensions/recaptcha/ReCaptcha.php" );
$recaptcha_public_key = '................';
$recaptcha_private_key = '................';

# Users must be registered; once they are in, they still must fill in captchas (at least over the summer)
$wgCaptchaTriggers['edit']          = true;
$wgCaptchaTriggers['addurl']        = false;
$wgCaptchaTriggers['create']        = true;
$wgCaptchaTriggers['createaccount'] = true;

Filtering edits and page names

Prevent creation of pages with bad words in the title and/or the text.

The built-in $wgSpamRegex variable

MediaWiki includes a $wgSpamRegex variable. The goal is to prevent three things: (a) bad words, (b) links to bad web sites and (c) CSS tricks to hide contents.

Insert in LocalSettings.php something like:

$wgSpamRegex = "/badword1|barword2|abcdefghi-website\.com|display_remove_:none|overflow_remove_:\s*auto;\s*height:\s*[0-4]px;/i"

I will not show ours here since I can't include it in this page ;)

Read the manual page for details. It includes a longer regular expression that you may adopt.

Don't forget to edit MediaWiki:Spamprotectiontext, the interface message shown to users when the spam filter blocks their edit.

Spam blacklist extensions (an alternative)

The SpamBlacklist extension prevents edits that contain URL hosts that match regular expression patterns defined in specified files or wiki pages.
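
A minimal setup sketch, again assuming the older require_once loading style; the two blacklist sources below (the shared Wikimedia Meta list and a locally maintained wiki page) are assumptions, so check the extension's documentation for the exact variable names and source syntax of your version:

# SpamBlacklist - reject edits that add URLs whose hosts match a blacklist
require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" );
$wgSpamBlacklistFiles = array(
    # Shared blacklist maintained on Wikimedia Meta
    "https://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1",
    # A blacklist page maintained locally on this wiki (hypothetical page name)
    "DB: wikidb MediaWiki:Spam-blacklist",
);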

Links

General

General wiki spamming

Examples from content guidelines - what is spam?

Mediawiki

  • Spam Filter (This is a development page of MediaWiki. It includes extra information, e.g. cleanup scripts.)
  • Help:Spam (Wikia). Wikia is a commercial wiki hosting service with many user-managed subwikis that have their own aims and content policies.