MediaWiki Robots.txt Example

From Bonus Bits

Description

This article gives an example robots.txt file for a MediaWiki site. I'm no expert, but the rules appear to work when checked with Google's robots.txt testing tool.


Subject

User-agent: *
Disallow: /wiki/Help:*
Disallow: /wiki/MediaWiki:*
Disallow: /wiki/Special:*
Disallow: /wiki/Template:*
Disallow: /wiki/Manual:*
Disallow: /wiki/User:*
Disallow: /wiki/List:*
Disallow: /wiki/Category:*
Disallow: /index.php?*
Disallow: /images/*
Disallow: /skins/*
Disallow: /extensions/*
Disallow: /cache/*
Disallow: /uploads/*
Disallow: /mw-config/*
Disallow: /vendor/*
Disallow: /resources/*
Disallow: /includes/*
Disallow: /*.php
Disallow: /*.json
Disallow: /*.php5
Disallow: /wiki/Main_Page
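To sanity-check which paths these rules block, you can simulate Google-style matching, where `*` is a wildcard and each Disallow rule matches as a path prefix. This is a minimal sketch (not Google's actual matcher); the sample patterns below are a subset of the rules above.

```python
import re

# Subset of the Disallow rules from the example above.
DISALLOW_PATTERNS = [
    "/wiki/Help:*",
    "/wiki/Special:*",
    "/index.php?*",
    "/images/*",
    "/*.php",
    "/wiki/Main_Page",
]

def pattern_to_regex(pattern):
    # Escape regex metacharacters, then turn the robots.txt '*'
    # wildcard into '.*'. Rules match from the start of the path
    # as prefixes, so no end anchor is added.
    return re.compile(re.escape(pattern).replace(r"\*", ".*"))

def is_blocked(path, patterns=DISALLOW_PATTERNS):
    # A path is blocked if any Disallow pattern matches it.
    return any(pattern_to_regex(p).match(path) for p in patterns)
```

Note that Python's standard-library `urllib.robotparser` does not implement this wildcard syntax, which is why the sketch builds its own regexes. Also, since rules match as prefixes, the trailing `*` in entries like `Disallow: /images/*` is technically redundant for Google's crawler, though it does no harm.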