Writing and publishing a website used to be a painful process. Some years ago, I developed a simple XSLT framework to produce my personal web pages from plain XML files. This was partly because I did not like any of the big CMSes of the time, which all required a database backend, and partly because I wanted to learn XSLT processing. CSS was a must, and compliance with W3C standards went without saying. As long as I did not want to change much on the site, this worked fine, and the site had been running robustly since 2005.


So I needed a simple framework, and what is better than homemade? Let's strip out the styling and start writing content, and only (mostly) content and metadata.

<!-- index.xml -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE page SYSTEM "my_pages.dtd">
<?xml-stylesheet type="text/xsl" href="my_pages.xsl"?>

<page icon="stock_home">
<section id="section-id" name="Global Section Name">
<p>Blah Blah</p>
<!-- and any basic html tags -->
</section>
</page>

Simple enough, Emacs-made. Now we need to transform that into a proper static HTML page using an XSL transformation; this was supposedly the future of XHTML.

<!-- my_pages.xsl -->
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

<xsl:output method="xml" indent="yes" encoding="UTF-8"
  doctype-public="-//W3C//DTD XHTML 1.1//EN"
  doctype-system="http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd" />

<xsl:include href="header.xsl"/>
<xsl:include href="footer.xsl"/>

<xsl:template match="page">
<xsl:element name="body">

<xsl:call-template name="make-header"/>

<div id="pageCell">
<div id="pageName">
  <h2><xsl:value-of select="title"/></h2>
  <xsl:element name="img">
    <xsl:attribute name="src">icons/<xsl:value-of select="@icon"/>.png</xsl:attribute>
    <xsl:attribute name="alt">page icon</xsl:attribute>
  </xsl:element>


This goes on and on to construct the web page, with a common header and menu:

<!-- header.xsl -->
<xsl:template name="html-header">

<title>Alexandre Beelen @ IAS - <xsl:value-of select="title"/></title>
<meta name="author"      content="Alexandre Beelen" />
<meta name="keywords"    content="astrophysics, astronomy, quasars, black hole, cosmology, IAS, Orsay, MIC, Institut d'Astrophysique Spatiale" />
<meta name="description" content="Homepage of Alexandre Beelen" />
<meta name="page-topic"  content="science" />
<meta name="expires"     content="0" />
<meta name="robots"      content="INDEX,FOLLOW" />

<link rel="stylesheet" href="page.css" type="text/css" />


<xsl:template name="make-header">
<div id="Header">
  <div id="HeaderTitle">
    <h1><a href="http://www.ias.u-psud.fr/pperso/abeelen">Alexandre Beelen @ IAS</a></h1>
    <div id="HeaderTitleLinks">
    <a href="http://www.ias.fr"                    class="hlink">IAS</a> |
    <a href="http://www.mpifr-bonn.mpg.de"         class="hlink">MPIfR</a> |
    <a href="http://www.astro.uni-bonn.de/~webrai" class="hlink">AIfA</a> |
    <a href="http://www.obspm.fr"                  class="hlink">Obspm</a>
    </div>
  </div>
  <div id="HeaderNav">
    <a href="index.html"                            class="glink">Home</a> :
    <a href="cv.html"                               class="glink">C.V.</a> :
    <a href="talk.html"                             class="glink">Talk &amp; Posters</a> :
    <a href="publication.html"                      class="glink">Publications</a> :
    <a href="database.xml"                          class="glink">High-z CO database</a> :
    <a href="software.html"                         class="glink">Tools</a> :
    <a href="http://www.astro.uni-bonn.de/boawiki/" class="glink">Open Boa</a> :
    <a href="http://www.ias.u-psud.fr/sanepic/"     class="glink">SANEPIC</a> :
    <a href="schedule.html"                         class="glink">Schedule</a>
  </div>
</div>
</xsl:template>

where all the links are hardcoded directly in the file. The same goes for the footer. Of course you need a Makefile to produce all the pages, and even publish them through rsync and ssh commands:

SHELL= /bin/sh
srcdir    = .
outdir    = $(srcdir)/html
addstuff  = $(srcdir)/download $(srcdir)/icons $(srcdir)/pictures page.css sitemap.xml

all: update
web: index.html software.html talk.html cv.html publication.html schedule.html gildas.html

update: web
  cp -fr $(addstuff) $(outdir)
  rsync -qavz -e ssh $(outdir)/* abeelen@www:web_perso
  ssh abeelen@www.ias.u-psud.fr chgrp -R www web_perso
  ssh abeelen@www.ias.u-psud.fr chmod -R a+r web_perso

.SUFFIXES: .html .xml

%.html: %.xml
  xsltproc -o $(outdir)/$@ my_pages.xsl $<

Everything was working fine and was fully W3C XHTML 1.1 compliant. And you can still see it at work there. But it was rigid, very rigid... and crazy to maintain... It's 2015 after all!


Fast forward to 2015: during a power outage, and with only access to the Internet, I started the project of migrating my website. After a quick survey of simple static site generators, I decided to use Pelican, mainly for its simplicity and its ability to use either Markdown or reStructuredText. Simple enough to use, with either Emacs or Atom.

The previous home page could be rewritten as a simple reStructuredText file:


Global Section Name
===================

blah blah

You can choose a theme and voilà! So how did it go? I just started a Pelican tutorial and finished with a full site in half a day. You start by creating a skeleton project:

> pelican-quickstart

and fill the content/pages directory. I used cleaned-up versions of my previously generated HTML pages, translated to reStructuredText with pandoc. The basic pages were there.
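The pandoc conversion step can be sketched as a small loop (file names are illustrative, and the loop skips silently if pandoc is not installed):

```shell
# Convert each old XHTML page to reStructuredText under content/pages/.
# File names are illustrative; guarded so it is a no-op without pandoc.
mkdir -p content/pages
for page in index.html cv.html publication.html; do
    if command -v pandoc >/dev/null 2>&1 && [ -f "$page" ]; then
        pandoc -f html -t rst -o "content/pages/${page%.html}.rst" "$page"
    fi
done
```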

I just had to find a nice theme, and I quickly chose the Bootstrap framework through the pelican-bootstrap3 theme. I then put everything under revision control using git, and I was done with this website. And, well, I could still push it with a Makefile!

Even more

The journey did not end with merely migrating the old pages to a new, simpler system: it is now easy to add new features with pelican-plugins. Here is the list of plugins I used:

  • better_tables, twitter_bootstrap_rst_directives, bootstrapify
  • sitemap
    • for better indexing by web crawlers; not having to produce it by hand is a real plus
  • tag_cloud
    • to produce a tag cloud in the sidebar; its utility is still to be proven...
  • render_math
  • tipue_search
    • allows simple search over the static website
  • pelican_dynamic
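
Enabling these boils down to a few lines in the Pelican configuration. A minimal sketch of a pelicanconf.py excerpt, assuming a local clone of the pelican-plugins repository (the path is illustrative):

```python
# Hypothetical pelicanconf.py excerpt enabling the plugins listed above.
# PLUGIN_PATHS points at a local clone of pelican-plugins (path illustrative).
PLUGIN_PATHS = ['pelican-plugins']
PLUGINS = [
    'better_tables',
    'twitter_bootstrap_rst_directives',
    'bootstrapify',
    'sitemap',
    'tag_cloud',
    'render_math',
    'tipue_search',
    'pelican_dynamic',
]

# The sitemap plugin reads its options from a SITEMAP dict.
SITEMAP = {'format': 'xml'}
```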


Pelican and pelican-bootstrap3 allow very fine tuning of the generated website. One can very simply define which pages are displayed in the menu, and in which order, using the sortorder metadata, for example.
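In a reStructuredText page, that metadata is just a field in the page header; a sketch with an illustrative title and value:

```rst
Publications
############

:sortorder: 4
```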

The website can very easily be tracked with Piwik or Google Analytics. One can still transfer standalone files, as I needed for the Molecular Database, or even a copy of my old website.
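Both of these are again configuration entries. A sketch of the relevant pelicanconf.py settings, assuming the theme exposes the Piwik/Analytics hooks; all IDs, URLs, and paths below are illustrative:

```python
# Hypothetical pelicanconf.py excerpt (IDs and paths are placeholders).
# Analytics hooks are rendered by the theme templates when set.
GOOGLE_ANALYTICS = 'UA-XXXXXXXX-1'
PIWIK_URL = 'stats.example.org'
PIWIK_SITE_ID = 1

# STATIC_PATHS ships extra standalone files verbatim into the output.
STATIC_PATHS = ['images', 'download', 'extra/database.xml']
```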