Spam

{{Stub}}

== Lookup IP addresses and domain names ==


This may allow blocking whole domains (e.g. in the httpd.conf file or at the system level). Sometimes wikis are spammed manually, and this can help a bit.
* [http://www.whois.net/ Whois.Net]
** [http://tools.whois.net/whoisbyip/ whois by IP]
** [http://tools.whois.net/index.php?fuseaction=ping.results Ping] (check whether a web or other server is alive; takes both an IP and a name)
* [http://www.easywhois.com/ Easywhois.com] (alternative to whois.net)
* [http://www.ipgp.net/ ipgp.net] (find domain names for IP numbers)


== Mediawiki spamming ==
There exist several strategies:

=== Registered users ===

To fight spamming, only registered users should be able to edit. Edit LocalSettings.php:

 $wgGroupPermissions['*']['edit']            = false;
 $wgGroupPermissions['*']['createaccount']   = true;
 $wgGroupPermissions['*']['read']            = true;


; Light-weight user creation that requires some math
This can defeat some scripts.
* [http://www.mediawiki.org/wiki/Extension:ConfirmEdit Extension:ConfirmEdit] (Mediawiki)


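A minimal LocalSettings.php sketch of this approach, assuming the ConfirmEdit extension is unpacked into the extensions/ directory; the variable names follow the extension's documentation, but check the version that matches your MediaWiki:

 # SimpleCaptcha is ConfirmEdit's default module: it asks a small arithmetic question
 require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
 $wgCaptchaClass = 'SimpleCaptcha';
 # only challenge account creation and edits that add external links
 $wgCaptchaTriggers['createaccount'] = true;
 $wgCaptchaTriggers['addurl']        = true;
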
; Making user creation more difficult with captcha
This can defeat more scripts.
* [http://en.wikipedia.org/wiki/Captcha Captcha] (Wikipedia)
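
If an image captcha is preferred, the FancyCaptcha module bundled with ConfirmEdit is one option. A rough sketch, assuming an older FancyCaptcha release (the images must be generated beforehand with the script shipped with the extension, and the paths are only examples):

 require_once( "$IP/extensions/ConfirmEdit/FancyCaptcha.php" );
 $wgCaptchaClass     = 'FancyCaptcha';
 # directory holding the pre-generated captcha images (example path)
 $wgCaptchaDirectory = "$IP/images/captcha";
 # must match the secret used when the images were generated
 $wgCaptchaSecret    = 'change this secret';
 $wgCaptchaTriggers['createaccount'] = true;
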
=== Filtering edits and page names ===
Prevent creation of pages with bad words in the title and/or the text.
; The built-in $wgSpamRegex variable
Mediawiki includes a [http://www.mediawiki.org/wiki/Manual:$wgSpamRegex $wgSpamRegex] variable. The goal is to prevent three things: (a) bad words, (b) links to bad web sites, and (c) CSS tricks that hide contents.
Insert something like the following in LocalSettings.php:

 $wgSpamRegex = "/badword1|badword2|abcdefghi-website\.com|display_remove_:none|overflow_remove_:\s*auto;\s*height:\s*[0-4]px;/i";
I will not show ours here since I can't include it in this page ;)
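
Spelled out (the _remove_ markers above appear to be there only so that the example does not trip this wiki's own spam filter), the same rule would read roughly as follows; the words and the host name are placeholders:

 $wgSpamRegex = "/badword1|badword2|abcdefghi-website\.com|display:none|overflow:\s*auto;\s*height:\s*[0-4]px;/i";
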
Read the [http://www.mediawiki.org/wiki/Manual:$wgSpamRegex manual page] for details. It includes a longer [[regular expression]] that you may adopt.
Don't forget to edit MediaWiki:Spamprotectiontext, the message shown to users whose edit was blocked.
; Spam blacklist extensions (an alternative)
The SpamBlacklist extension prevents edits that contain URL hosts that match regular expression patterns defined in specified files or wiki pages.
* [http://www.mediawiki.org/wiki/Extension:SpamBlacklist SpamBlacklist extension]
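
A minimal LocalSettings.php sketch, assuming an older SpamBlacklist release (newer versions configure the lists differently, so check the extension page; the local file is a hypothetical example):

 require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" );
 $wgSpamBlacklistFiles = array(
     # the shared Wikimedia blacklist, plus a local list of your own
     "http://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1",
     "$IP/extensions/SpamBlacklist/my-local-blacklist"
 );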


== Links ==


* [http://meta.wikimedia.org/wiki/Wiki_Spam Wiki Spam] (Wikimedia)
 
=== Mediawiki ===
 
* [http://www.mediawiki.org/wiki/Manual:Combating_spam Combating spam] (Mediawiki Manual)
* [http://www.mediawiki.org/wiki/Anti-spam_features Anti spam features] (Mediawiki)




[[Category: Server administration]]
