Imagine finding yourself in the middle of nowhere and wanting to browse the wiki :))
Wikipedia offers its entire content as a huge XML dump file, which you can download and use to host Wikipedia on your local machine. There are tools such as WikiFilter that can read this XML dump and render HTML dynamically. WikiFilter also provides indexing and search features, so you'll find virtually no difference when browsing the local version, except that you won't see the images on the wiki pages. (Actually, you can even get a dump of the images; you just need about 100 GB of hard disk space!)
The whole XML dump file is only about 1.4 GB compressed, and came to about 6.4 GB after decompressing. You can also get an SQL dump, which you can host using MySQL and which obviously takes less hard disk space, but I couldn't find anything that renders dynamic HTML out of it while also providing search capabilities.
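If you want to try this yourself, fetching and unpacking the dump is a two-liner. The URL below is the usual location of the English-language dump on dumps.wikimedia.org; the exact filename and sizes may differ by the time you read this, so check the dumps site first.

```shell
# Grab the English Wikipedia article dump (~1.4 GB compressed;
# exact size and filename may have changed since this was written).
wget https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

# Decompress it -- this expands to roughly 6.4 GB of XML.
bunzip2 enwiki-latest-pages-articles.xml.bz2
```

From there you point WikiFilter (or a similar tool) at the decompressed XML file and let it build its index.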
Aha... It's information about anything under the sun, literally at your fingertips. All for free. Screw the internet. Screw Encarta. Happy Wiki'ing!