
Backup


How to get a backup of OddWiki?

You can download a compressed file (tarball) containing all pages of all existing oddwikis. Get the tarball from

http://oddwiki.org/oddwiki.tar.gz

The file is created automatically once every 24 hours by the maintenance script. It is currently about 17 MB in size.

To avoid unnecessary traffic caused by search engines downloading the file every fortnight or so, the link above is deactivated. Please paste it into your browser’s address bar and press Enter / Return to download the tarball.
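
If you prefer to script the download, a minimal sketch using curl and tar could look like this (oddwiki-backup is just an arbitrary target directory):

    # Fetch the tarball and unpack it into ./oddwiki-backup
    curl -o oddwiki.tar.gz "http://oddwiki.org/oddwiki.tar.gz"
    mkdir -p oddwiki-backup
    tar -xzf oddwiki.tar.gz -C oddwiki-backup
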
See also community-wiki - community-wiki backup.

Making a Backup

This shows you how to copy all the pages of a wiki to text files. Note that your filesystem needs to be able to store UTF-8 filenames. If you use a language other than English, you may run into trouble on Windows, for example.

(Actually I don’t know what other languages use only ASCII characters – Indonesian, Hawaiian, Tagalog?)
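
One quick way to check whether your environment handles UTF-8 (this assumes a Unix-like shell; exact locale names vary from system to system):

    # If this prints nothing, your locale is probably not UTF-8 based
    locale | grep -i utf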

Here’s how to copy your pages to the current directory:

  1. Make sure you use the bash shell or a compatible one
  2. Make sure you have curl installed (or use wget; a wget variant of the loop is shown below)
  3. Run the loop below, assuming your wiki is called MigrationTest:
    # Fetch the page index, then download each page's raw text
    for p in $(curl "http://oddwiki.org/odd/MigrationTest?action=index;raw=1"); do
      sleep 5
      curl -o "$p" "http://oddwiki.org/odd/MigrationTest/raw/$p"
    done

The first curl fetches the index of your wiki as plain text: a list of all pages, one page name per line. The second curl is then called once for every page, downloading its raw text. The call to sleep makes sure you’re not overloading the wiki.
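
If you use wget instead of curl, as mentioned above, a roughly equivalent sketch would be (wget’s -O - writes to standard output, and -O "$p" to a file):

    for p in $(wget -q -O - "http://oddwiki.org/odd/MigrationTest?action=index;raw=1"); do
      sleep 5
      wget -q -O "$p" "http://oddwiki.org/odd/MigrationTest/raw/$p"
    done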

You can use this to merge the pages from several wikis into one, as long as their page names don’t overlap.
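
For example, a sketch of such a merge might look like this (WikiOne and WikiTwo are made-up wiki names):

    # Download pages from several wikis into the same directory;
    # note that later pages silently overwrite earlier ones if names do overlap
    for wiki in WikiOne WikiTwo; do
      for p in $(curl "http://oddwiki.org/odd/$wiki?action=index;raw=1"); do
        sleep 5
        curl -o "$p" "http://oddwiki.org/odd/$wiki/raw/$p"
      done
    done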

You can also copy only a selected few of these pages. Instead of using the action=index URL, prepare a list of pages manually: create a new page and list all the page names you want, one page name per line. Here is an example you can try yourself. I have put the list on the page MigrationTest:import. The loop below copies the few selected pages listed there to your current directory.

    # The list itself is just a raw page; each of its lines names a page to fetch
    for p in $(curl "http://oddwiki.org/odd/MigrationTest/raw/import"); do
      sleep 5
      curl -o "$p" "http://oddwiki.org/odd/raw/$p"
    done
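
For illustration, the raw text of such a list page is nothing more than one page name per line, e.g. (page names invented):

    FrontPage
    SiteMap
    HowTo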

Restoring a Backup

Here’s how to restore your wiki from pages in the current directory:

    for p in *; do
      echo "$p"
      # "<$p" tells curl to read the file and send its contents as the text field
      curl -F question=1 -F title="$p" -F text="<$p" "http://oddwiki.org/odd/MigrationTest"
      sleep 5
    done

The echo statement is just to give you some feedback, because curl doesn’t print any output when using the -F option.

The question parameter bypasses our Oddmuse:QuestionAsker_Extension.
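
Before restoring everything, you might want to test the upload with a single page first (HomePage here is just an example file assumed to be in your backup directory):

    # Upload one page and check the wiki in a browser before running the full loop
    curl -F question=1 -F title=HomePage -F text="<HomePage" "http://oddwiki.org/odd/MigrationTest"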

The same page on other sites:
DikiWiki:Backup, EArtWiki:Backup, KaboWiki:Backup, OddWiki:Backup