ansible/roles/web-data-analysis/files/countme-update.cron
Will Woods f8a5720535 add 'countme' stuff to web-data-analysis role
This should automate running the "countme" scripts every day to parse
new log data and publish updated totals.

Here's what I've added to the ansible role (roughly sketched in the task snippet after this list):

* install package deps for `mirrors-countme`
* make "countme" user with home /srv/countme
* clone 'prod' branch of https://pagure.io/mirrors-countme to /srv/countme
  * if changed: pip install /srv/countme/mirrors-countme
* make web subdir /var/www/html/csv-reports/countme
* make local data dir /var/lib/countme
* install `countme-update.sh` to /usr/local/bin
* install `countme-update.cron` to /etc/cron.d
  * runs /usr/local/bin/countme-update.sh daily, as user `countme`
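
Roughly, those tasks boil down to something like the following sketch. This is not the actual `tasks/main.yml`; module choices, package names, file modes, and the clone destination are illustrative only:

```yaml
# Sketch of the role tasks described above -- names and details are illustrative.
- name: Install package deps for mirrors-countme
  package:
    name: [git, python3-pip]    # placeholder dep list, not the real one
    state: present

- name: Create the countme user
  user:
    name: countme
    home: /srv/countme

- name: Clone the 'prod' branch of mirrors-countme
  git:
    repo: https://pagure.io/mirrors-countme
    dest: /srv/countme/mirrors-countme
    version: prod
  become: true
  become_user: countme
  register: countme_git

- name: Pip-install the checkout when it changed
  pip:
    name: /srv/countme/mirrors-countme
  when: countme_git is changed

- name: Create the web and local data dirs
  file:
    path: "{{ item }}"
    state: directory
    owner: countme
  loop:
    - /var/www/html/csv-reports/countme
    - /var/lib/countme

- name: Install the update script and its cron job
  copy:
    src: "{{ item.src }}"
    dest: "{{ item.dest }}"
    mode: "{{ item.mode }}"
  loop:
    - { src: countme-update.sh, dest: /usr/local/bin/countme-update.sh, mode: "0755" }
    - { src: countme-update.cron, dest: /etc/cron.d/countme-update.cron, mode: "0644" }
```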

That should make sure `countme-update.sh` runs every day.
The script itself works like this (see the sketch after the list):

1. Run `countme-update-rawdb.sh`
  * parse new mirrors.fp.o logs in /var/log/hosts/proxy*
  * write data to /var/lib/countme/raw.db
2. Run `countme-update-totals.sh`
  * parse raw data from /var/lib/countme/raw.db
  * write updated totals to /var/lib/countme/totals.{db,csv}
3. Track changes in updated totals
  * set up /var/lib/countme as git repo (if needed)
  * commit new `totals.csv` (if changed)
4. Make updated totals public
  * Copy totals.{db,csv} to /var/www/html/csv-reports/countme
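
Put together, the script amounts to something like this sketch. It is not the actual `countme-update.sh`; the real script may differ in arguments, locking, and error handling:

```sh
#!/bin/bash
# Simplified sketch of the update flow described above.
set -e

localdir=/var/lib/countme
webdir=/var/www/html/csv-reports/countme

# 1. Parse any new proxy logs into the raw database.
countme-update-rawdb.sh

# 2. Recalculate totals from the raw data.
countme-update-totals.sh

# 3. Track changes to the totals in a local git repo.
cd "$localdir"
[ -d .git ] || git init
git add totals.csv
git diff --cached --quiet || git commit -m "countme-update $(date -u +%F)"

# 4. Publish the updated totals.
cp "$localdir"/totals.db "$localdir"/totals.csv "$webdir"/
```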

For safety's sake, I've tried to set things up so everything runs as the
`countme` user rather than as `root`. This might be an unnecessary
complication, but it seemed like the right thing to do.

Similarly, keeping totals.csv in a git repo isn't _required_, but it
seemed like a good idea to keep historical records in case we ever
want/need to change the counting algorithm and revisit old totals.
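
As a side benefit, older snapshots stay easy to retrieve with plain git, for example:

```sh
# Browse the history of published totals:
git -C /var/lib/countme log --oneline -- totals.csv

# Recover the totals from any earlier commit (substitute a real hash for COMMIT):
git -C /var/lib/countme show COMMIT:totals.csv > totals-old.csv
```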

I checked the YAML with ansible-lint and tested that all the scripts
work as expected when run as `wwoods`, so unless I've missed something
this should do the trick.
2020-10-13 16:17:00 +00:00

0 09 * * * countme /usr/local/bin/countme-update.sh > /dev/null