From POV-Wiki

The Setup

In the directory  /a/home/jholsenback/testbed , this is the expected directory structure:


You should run the scripts from the above-mentioned directory; the batch files wander around a bit, but always end up back home.

The Files

These are the files that make up the wikidocgen process:

  1. batchMacFiles
    • a bash script that processes the Mac OS documentation set
  2. batchWinFiles
    • a bash script that processes the Windows documentation set
  3. batchUnxFiles
    • a bash script that processes the Unix documentation set
  4. getWikiPages.php
    • gets the files from the POV-Wiki and processes them
  5. mkContentsPages.php
    • builds the various table of contents files and an index.html file for a given doc set
  6. mkWikiTOC.php
    • rebuilds this listing if any of the contents pages listed below changes.
      See the additional notes at the bottom for more about this!
  7. mkImagePackage.php
    • builds the images directory structure and copies files from the POV-Wiki
  8. getStyleSheet.php
    • gets the povray37.css style sheet and copies it to a given doc set directory
  9. common.php
    • frequently used functions associated with the wikidocgen process
  10. utilities.php
    • frequently used functions associated with the wikidocgen process; an attempt at better organization
  11. documentation/WikiImages
    • a symbolic link to the POV-Wiki images directory
  12. documentation/favicon.ico
    • copied to a given doc set directory

Besides the content files on the POV-Wiki these files are also part of the process:

  1. Page Header
  2. Content Header Footer
  3. Style Sheet
  4. Index Page

Running The Process

ssh access is required, as is the directory structure described above. All the shell and PHP scripts that make up the process are available under revision control at povray > tools > wiki-docgen. Once you're logged in and the setup is complete, do the following:

Note: This example generates the Unix/Linux doc set, but the other platforms (Mac OS and Windows) are similar; they are just processed into different directories.

  1. cd testbed
  2. ./batchUnxFiles
  3. cd documentation/unx/
  4. find . -type f -exec chmod 444 '{}' \;
  5. tar -cvvf 24Oct2011unx.tar Arrow*.* favicon.ico images/* povray37.css *.html
  6. gzip 24Oct2011unx.tar
  7. find . -type f -exec chmod 644 '{}' \;
  8. chmod 444 *.tar.gz
  9. cd ..
  10. vi docsets.html
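
The steps above can be collected into a small wrapper script. This is a sketch, not one of the wikidocgen tools; it only prints the commands rather than running them, and the 24Oct2011 stamp from the example is assumed to be the run date, computed here with date:

```shell
# Hypothetical wrapper around the steps above (not part of the wikidocgen
# tools).  It only prints the commands; the date stamp replaces the
# hard-coded 24Oct2011 from the example.
platform=unx
stamp="$(date +%d%b%Y)${platform}"      # e.g. 24Oct2011unx
cmds="cd testbed
./batchUnxFiles
cd documentation/${platform}/
find . -type f -exec chmod 444 '{}' \;
tar -cvvf ${stamp}.tar Arrow*.* favicon.ico images/* povray37.css *.html
gzip ${stamp}.tar
find . -type f -exec chmod 644 '{}' \;
chmod 444 *.tar.gz
cd .."
printf '%s\n' "$cmds"
```

Dropping the dry-run indirection (executing each line instead of collecting it into a string) would make this an actual driver, but run it only after the setup above is in place.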

Typically it's okay to run each platform's batch script one right after another; the individual platform directories then get bundled into tar files and compressed using gzip.

Note: If you've previously run the process, make sure that the file permissions are set correctly before running it again. See step 7 above.


The most important thing to understand about this process is that it is driven by these files:

The GatherTOCfiles and BuildDocMap functions read and process those files. They are located in utilities.php and common.php respectively, and both depend on the DocumentMap class defined in common.php. If an exception message appears during batch processing, most likely a table of contents entry does not agree with how it appears in the content body: the title portion of the TOC entry and the heading in the content body MUST match exactly. Batch processing will continue, and the offending entry will appear in the content body as raw wiki markup.

Hint: look for the wiki tag  Documentation:
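
One way to act on this hint is to scan the generated pages for leftover Documentation: markup after a batch run. The following is a self-contained demo of that check; the file names and contents are invented for illustration:

```shell
# Self-contained demo of the hint above (file names are invented): any page
# still containing raw "Documentation:" wiki markup points at a TOC entry
# whose title did not exactly match its heading in the content body.
dir=$(mktemp -d)
printf '<p>Documentation:Reference Section</p>\n' > "$dir/broken.html"
printf '<h2>Good Heading</h2>\n' > "$dir/ok.html"
leftovers=$(grep -l 'Documentation:' "$dir"/*.html)
printf '%s\n' "$leftovers"
rm -r "$dir"
```

In practice you would point grep at the platform output directory (for example documentation/unx) instead of a temporary one.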

The content on the Wiki is organized such that a given section can span more than one file; the script getWikiPages.php effectively concatenates the files together, so look near the bottom of that script for a place to tap into the stream, as it were!

The file that mkWikiTOC.php produces needs to be copied into the table of contents page. Since that page spans several sections, you need to edit the entire page (the top-most edit button). Look for the comments embedded in the file that delimit where the changes should go.
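
The splice between those delimiting comments can be done mechanically with sed. This is a sketch only: BEGIN-TOC and END-TOC are invented stand-ins for the real marker comments, and the page and TOC contents here are dummy data:

```shell
# Sketch only: replace everything between two marker comments with the
# output of mkWikiTOC.php.  BEGIN-TOC/END-TOC are invented stand-ins for
# the actual delimiting comments found in the page.
printf '%s\n' 'intro' '<!-- BEGIN-TOC -->' 'old toc' '<!-- END-TOC -->' 'outro' > page.html
printf '%s\n' 'new toc' > newtoc.html
# delete the old lines between the markers, then read the new file in
# immediately after the opening marker
result=$(sed -e '/BEGIN-TOC/,/END-TOC/{/BEGIN-TOC/!{/END-TOC/!d;};}' \
             -e '/BEGIN-TOC/r newtoc.html' page.html)
printf '%s\n' "$result"
rm -f page.html newtoc.html
```

Since the real table of contents lives on the Wiki rather than in a local file, this only helps if you paste the page source into a local file first and paste the result back through the edit form.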