This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.


PATCH: wwwdocs/bin/preprocess


Last night we had an out-of-disk condition in /tmp on gcc.gnu.org, and
the nightly janitorial service replaced all web pages with 1-byte
files.  Ian was awake at the time and restored everything by rerunning
the script once space in /tmp was available again, but clearly the
preprocess script was not sufficiently defensive.

This patch should address this.  I successfully tested it locally with
a read-only /tmp directory, and after committing the patch to CVS I
also updated /www/bin/preprocess on the gcc.gnu.org box.
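
For the curious, here is a minimal sketch of such a test setup.  The
exact commands are not part of this message, and the sketch assumes the
script honors an externally set TMPDIR rather than hardwiring its
scratch directory; adjust as needed:

    # Simulate the out-of-disk condition with an unwritable scratch dir.
    # Run as a non-root user, since root bypasses directory permissions.
    mkdir /tmp/ro-scratch
    chmod a-w /tmp/ro-scratch                 # creating files in it now fails
    TMPDIR=/tmp/ro-scratch sh bin/preprocess  # every MetaHTML run should fail
    # Expected: "Problem processing ...; not updated!" for each page,
    # with the destination tree left exactly as it was.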

Gerald


Make process_file() more robust in case of error conditions and avoid
updating the destination tree in that case.

Index: preprocess
===================================================================
RCS file: /cvs/gcc/wwwdocs/bin/preprocess,v
retrieving revision 1.37
diff -u -3 -p -r1.37 preprocess
--- preprocess	4 Apr 2003 16:32:22 -0000	1.37
+++ preprocess	28 Aug 2003 12:39:08 -0000
@@ -146,8 +146,11 @@ process_file()
             cat $f >> $TMPDIR/input
             ${MHC} $TMPDIR/input > $TMPDIR/output

-            # Copy the page only if it's new or there has been a change.
-            if [ ! -f $DESTTREE/$f ]; then
+            # Copy the page only if it's new or there has been a change, and,
+            # first of all, if there was no problem when running MetaHTML.
+            if [ $? -ne 0 ]; then
+                echo "  Problem processing $f; not updated!"
+            elif [ ! -f $DESTTREE/$f ]; then
                 echo "  New file $f"
                 cp $TMPDIR/output $DESTTREE/$f
             else

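For reference, the "$?" test inspects the exit status of the ${MHC}
invocation immediately above it, so no other command may slip in
between, or the status would be overwritten.  Assuming a
POSIX-compatible shell, an equivalent formulation (shown only as a
sketch, not as part of the committed patch) folds the command into the
test directly:

    # Equivalent to the committed check: "if ! command" branches on the
    # command's exit status itself, so nothing can clobber $? by accident.
    if ! ${MHC} $TMPDIR/input > $TMPDIR/output; then
        echo "  Problem processing $f; not updated!"
    elif [ ! -f $DESTTREE/$f ]; then
        echo "  New file $f"
        cp $TMPDIR/output $DESTTREE/$f
    else
        : # compare-and-update logic as in the existing script
    fi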
