# HG changeset patch
# User Christophe Lincoln
# Date 1390651248 -3600
# Node ID 9df47b19131ba0faca97415ec0b59e704a77ca90
# Parent 190b9ba3af06dde8677e648b31f2693a27a491e5
Improve the export plugin

diff -r 190b9ba3af06 -r 9df47b19131b plugins/export/export.cgi
--- a/plugins/export/export.cgi	Sat Jan 25 00:28:36 2014 +0000
+++ b/plugins/export/export.cgi	Sat Jan 25 13:00:48 2014 +0100
@@ -5,8 +5,8 @@
 . /usr/lib/slitaz/httphelper
 
 #
-# NOTE: Exporting wiki and making all urls work is a bit tricky and
-# actually doesn't work as expected. The goal is to have a SliTaz codex
+# NOTE: Exporting the wiki to HTML and making all urls work is a bit tricky.
+# Currently it doesn't work as expected. The goal is to have a SliTaz codex
 # online that can be included on the ISO, so we could have an export
 # including a small CGI script to simply display wiki pages via HTTPd
 # knowing that with HTML we must also deal with ../../
@@ -22,22 +22,23 @@
 	cat << EOT
 <h2>Export</h2>
 <p>
-	$(gettext "EXPERIMENTAL: Export to HTML and create a tarball of your text
-content or plugins files.")
+	$(gettext "Create a tarball of your wiki and plugins files. EXPERIMENTAL:
+Export wiki documents to HTML.")
 </p>
 EOT
-	# Functions
+	# HTML fixes EXPERIMENTAL Functions
 	css_path() {
 		# Sed CSS style path in all documents
 		sed -i s'/style.css/..\/style.css/' */*.html
@@ -58,15 +59,7 @@
 	}
 	# Export requested content
 	case " $(GET export) " in
-		*\ cloud\ *)
-			export="cloud"
-			tmpdir="content"
-			echo '<pre>'
-			gettext "Exporting:"; echo " $export"
-			gen_tarball
-			echo '</pre>'
-			dl_link ;;
-		*\ wiki\ *)
+		*\ wikitohtml\ *)
 			export="wiki"
 			echo '<pre>'
 			gettext "Exporting:"; echo " $export"
@@ -103,9 +96,13 @@
 		*\ export\ )
 			html_footer && exit 0 ;;
 		*)
+			export="$(GET export)"
+			tmpdir="content"
 			echo '<pre>'
-			gettext "Export not yet implemented for"; echo ": $(GET export)"
-			echo '</pre>' ;;
+			gettext "Exporting:"; echo " $export"
+			gen_tarball
+			echo '</pre>'
+			dl_link ;;
 	esac
 	html_footer && exit 0
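
The NOTE at the top of export.cgi describes the end goal: an export that can ship on the ISO together with "a small CGI script to simply display wiki pages via HTTPd". As a rough illustration of that idea, here is a minimal viewer sketch. It is not part of this changeset: the script name, the "page" parameter and the content/ layout are assumptions; only the GET helper (already used by the plugin) and the ../style.css rewrite done by css_path() come from the patch itself.

    #!/bin/sh
    #
    # display.cgi (hypothetical name): serve one exported HTML page via HTTPd.
    # Assumes the export tarball was unpacked next to this script, with pages
    # in content/<name>.html and their stylesheet path already rewritten to
    # ../style.css by css_path() during export.
    #
    . /usr/lib/slitaz/httphelper

    page="$(GET page)"                 # e.g. ?page=index
    [ -n "$page" ] || page="index"
    # No input sanitising here; a real script would need it.

    # Plain CGI response header, kept explicit rather than relying on any
    # particular httphelper function.
    echo "Content-Type: text/html"
    echo ""

    if [ -f "content/$page.html" ]; then
        cat "content/$page.html"
    else
        echo "<p>Page not found: $page</p>"
    fi

Called as, for example, /cgi-bin/display.cgi?page=index through Busybox HTTPd, such a script would serve the pre-generated pages without the full wiki engine; the ../../ links mentioned in the NOTE would still have to be handled by the export itself.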