# GTimeLog helper utilities
These scripts are meant to support evaluation of time logs generated by
GTimeLog. There is an example timelog in `timelogs/test.txt` that you can
play with.
If you want to analyse a single GTimeLog log, you can use the `analyse.py`
script. It only needs Python 3 installed on your computer (most likely
already the case). To analyse more than one log file at once, please use
the `report.py` script.
## Check your own timelog
### General overview of your time spent on a task - `analyse.py`
If you want to check your own timelog to see how much time you spent on
the different tasks, or to find errors in the log, you can use the
`analyse.py` script.
Use git to clone this repository and change into the directory. Then run
the analyse script:

```shell
python3 analyse.py -f ~/.local/share/gtimelog/timelog.txt -r categories
```

The results will be printed to the terminal. If you see warnings like
`negative duration` followed by a date, open the timelog in an editor and
search for that date to correct the entry in the log.
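A `negative duration` warning comes from an entry whose timestamp is earlier than the one before it. A minimal sketch of such a check, assuming gtimelog's usual `YYYY-MM-DD HH:MM: activity` line format — this illustrates the idea and is not the actual `analyse.py` code:

```python
import re
from datetime import datetime

# gtimelog lines look like "2023-01-02 09:00: activity";
# blank lines separate days and other lines are ignored here.
LINE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}): ")

def find_negative_durations(path):
    """Return the timestamps of entries that are earlier than their predecessor."""
    prev = None
    problems = []
    with open(path) as fh:
        for line in fh:
            m = LINE.match(line)
            if not m:
                continue
            ts = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M")
            if prev is not None and ts < prev:
                problems.append(ts)  # this entry would yield a negative duration
            prev = ts
    return problems
```

Each reported timestamp points you at the line to fix in your editor.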
### Time spent in a specific timeframe - `timelog_sum.awk`
Task: for example, you want to find out how much time you spent on a
specific task in the last 6 months. The `timelog_sum.awk` script can help
you with this.
This awk script requires `gawk`. On most systems this should already be
installed; however, if you run into a problem, please check that this is
the case. Open a terminal and run `timelog_sum.sh`.

For more specific dates (like `YYYY-MM-DD`), please take a look at
`timelog_sum.awk`. In this case you will find the commands on how to run
the script in `timelog_sum.awk` itself. You can open it with the editor of
your choice.
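For orientation, here is a rough Python equivalent of what such a timeframe sum computes: total the minutes per activity between two dates, where each line records the *end* of the activity it names. The function name is illustrative and not from this repo; see `timelog_sum.awk` for the real implementation:

```python
import re
from collections import defaultdict
from datetime import datetime

ENTRY = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}): (.*)$")

def sum_between(path, start, end):
    """Return minutes per activity for entries with an end time in [start, end)."""
    totals = defaultdict(int)
    prev = None
    with open(path) as fh:
        for line in fh:
            m = ENTRY.match(line)
            if not m:
                prev = None  # blank line: a new day starts, no carry-over
                continue
            ts = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M")
            activity = m.group(2).strip()
            if prev is not None and start <= ts < end:
                totals[activity] += int((ts - prev).total_seconds() // 60)
            prev = ts
    return dict(totals)
```

The first entry of each day (e.g. `arrived`) only sets the starting time, so no duration is counted for it.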
## Generate a weekly report from a gtimelog.txt file
Every Monday morning at 7am, the weekly reports from week 1 to the current
week are generated for all staffers and stored in their private Nextcloud
folder at `~/Nextcloud/timelog_username`.

For detailed information, check the page in our docs.
## Analyse
This script will calculate all time-logged activities of a gtimelog file
(typically `timelog.txt`, but it can also read from standard input). There
are a number of options available:
- You can reduce the level of detail. By default, all activities are
  listed in the `CATEGORY OptionalSub: description` format. With
  `-r categories` and `-r supercategories`, you can limit the output to
  `CATEGORY OptionalSub` and `CATEGORY` respectively.
- You can define the time format. By default, you will see all times in
  `HH:MM`, but with `-t m` you can switch it to minutes.
- You can set the output format. By default, you see a pretty table. For
  further analysis, you can export a CSV file by stating `-s csv`. The
  values are comma-separated, and quoted if necessary.
- By default, some categories that comprise personal internal activities
  are excluded. You can set these in `config.ini`. In the statistics,
  these are listed as "Ignored Categories". With `-a` you can include
  them in your normal output.
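Reading such a category list from an INI file is a one-liner with Python's standard `configparser`. The section and option names below (`DEFAULT` / `ignored`) are assumptions for illustration — check `config.ini` in this repo for the real keys:

```python
import configparser

def load_ignored_categories(path="config.ini"):
    """Return the comma-separated ignore list from an INI file as a Python list."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    raw = cfg.get("DEFAULT", "ignored", fallback="")
    return [cat.strip() for cat in raw.split(",") if cat.strip()]
```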
In the pretty format, you will see some extra statistics. Please note that
these are not available in the CSV style.

Run `python3 analyse.py -h` to get an overview of all commands. Here are
some examples:
### Detailed timelogs in pretty format
List all activities in the timelog (without tags), sorted alphabetically.
For each activity, show the total duration in `HH:MM` and the proportion
of this activity compared to the total working time. This is the default
behaviour.
```
$ python3 analyse.py -f timelogs/test.txt
Activity                                            | Duration | Percentage
--------------------------------------------------- | -------- | ----------
INTERNAL Sysadmin: DNS                              | 1:35     | 3.86
INTERNAL Sysadmin: auto-er                          | 0:56     | 2.27
INTERNAL Sysadmin: coordination                     | 0:47     | 1.91
[...]

Statistic          | Duration
------------------ | --------
Working time       | 41:04
Sick time          | 21:00
Vacation time      | 14:00
Public holidays    | 14:00
Ignored Categories | 6:49
```
### Only supercategories in pretty format
List all supercategories, sorted alphabetically.
```
$ python3 analyse.py -f timelogs/test.txt -r supercategories
Activity | Duration | Percentage
-------- | -------- | ----------
INTERNAL | 13:14    | 32.22
LEGAL    | 3:23     | 8.24
PA       | 16:52    | 41.07
POLICY   | 7:35     | 18.47

Statistic          | Duration
------------------ | --------
Working time       | 41:04
Sick time          | 21:00
Vacation time      | 14:00
Public holidays    | 14:00
Ignored Categories | 6:49
```
### Only categories in CSV format in minutes
List all categories, and show time as minutes.
```
$ python3 analyse.py -f timelogs/test.txt -r categories -s csv -t m
Activity,Duration,Percentage
INTERNAL Sysadmin,432,17.53
INTERNAL Technical,362,14.69
LEGAL Reuse,203,8.24
PA,142,5.76
PA Ilovefs,117,4.75
PA Podcast,61,2.48
PA PublicEvents,91,3.69
PA PublicEventsFOSDEM,43,1.75
PA Website,558,22.65
POLICY,147,5.97
POLICY PMPC,308,12.5
```
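Since the repo's report script already depends on pandas, the CSV output above can be loaded straight back for further analysis. A small sketch using the column names shown in the example output:

```python
import io

import pandas as pd

# Sample taken from the CSV output above (-s csv -t m: durations in minutes).
csv_text = """Activity,Duration,Percentage
INTERNAL Sysadmin,432,17.53
PA Website,558,22.65
POLICY PMPC,308,12.5
"""

df = pd.read_csv(io.StringIO(csv_text))
total = df["Duration"].sum()                      # total minutes in this sample
top = df.loc[df["Percentage"].idxmax(), "Activity"]  # activity with the largest share
```

In a real session you would pass the exported file name to `pd.read_csv` instead of a string buffer.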
## Report
This script is based on the Analyse script, and all options of that script
can be used with the Report script as well. In addition, this script can
operate on multiple gtimelog files at the same time and combine them into
one report.
### Requirements
The script uses the pandas framework and odfpy for writing ODS files.

On Debian 12 (Bookworm) you can install the requirements with

```shell
sudo apt install python3-odf python3-pandas
```

On Ubuntu 20, ODS files cannot be written, but XLS files can. For this
version of Ubuntu, please install the `python-xlwt-doc` package as well.
Alternatively, you can create a virtual environment and install the needed
modules in it:

```shell
python3 -m venv venv
source venv/bin/activate
pip3 install -r requirements_report.txt
```
### Export formats
The report script can export the findings to CSV files or as a LibreOffice
Calc document. By default, the output will be named `report.ods`, or, for
CSV, several files beginning with `report_`. You can set this name/prefix
using the `-o` parameter.

To export the report as an ODS file, use the style option `-s calc`.
Here is an example:

```shell
python3 report.py -f timelogs/2023/* -r categories -t m -s calc -o 2023
```
### Import file name(s)
You can pass the input file name using the `-f` parameter, as with the
analyse script. The report script can handle multiple input files. In
addition, it tries to get the name of the person from the filename for the
table headers in the ODS file or the CSV filenames. For this to work, the
gtimelog file names have to follow the pattern `timelog_NAME.txt`. If no
name can be found, the name is set to `person` and the persons are
counted.
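The filename convention described above can be sketched as follows; the regex and the numbered `person` fallback are illustrative, not copied from `report.py`:

```python
import re

def person_names(filenames):
    """Derive a person name per file from the timelog_NAME.txt pattern,
    falling back to numbered "person" labels for non-matching files."""
    names = []
    unknown = 0
    for fn in filenames:
        m = re.search(r"timelog_(.+)\.txt$", fn)
        if m:
            names.append(m.group(1))
        else:
            unknown += 1
            names.append(f"person{unknown}")
    return names
```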
### Date limited reports
To limit the generated report from the `report.py` script to cover only a
selected period instead of the entire timelog file(s), you can use the
command line parameters `-l`, `-b` and `-e` to set the limits.

`-l` activates the date limited mode. With `-b` you specify the start
date and with `-e` the end date. For the dates, use the `YYYY-MM-DD`
format; midnight will be used as the time for both.
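Because both limits are interpreted as midnight, the covered interval is effectively `[begin 00:00, end 00:00)`, meaning the end day itself falls outside the range in this reading. A small sketch of that boundary rule (check `report.py` for the exact behaviour):

```python
from datetime import datetime

def in_range(timestamp, begin, end):
    """True if timestamp lies in [begin 00:00, end 00:00),
    with begin/end given as YYYY-MM-DD strings."""
    b = datetime.strptime(begin, "%Y-%m-%d")
    e = datetime.strptime(end, "%Y-%m-%d")
    return b <= timestamp < e
```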
## check-timelog.sh
The `check-timelog.sh` script automates checking a staffer's
`gtimelog.txt` using the `check.py` script. It analyses the timelog file
for errors and inconsistencies. If any issues are found, the script
notifies the user directly through desktop notifications using the
`notify-send` command.

Additionally, the script copies the user's local `timelog.txt` file to the
staffer's `Nextcloud/timelog_StafferName` folder. The script is called
every hour via a cronjob, ensuring that the `timelog.txt` is checked
periodically for any updates or changes.
## License and credits
This repo contains some third-party code, mainly from the gtimelog
project. As this repository is REUSE compliant, you can easily see all
used licenses, which files they apply to, and their copyright holders.