fedora 40 and above missing on images on data-analysis.fedoraproject.org/csv-reports/images #12426
Reference: Infrastructure/fedora-infrastructure#12426
Could Fedora 40, 41, and 42 be included in the graphs available at https://data-analysis.fedoraproject.org/csv-reports/images/summary.html?
cc: @james
I know about the numbers behind this, but not what generates the graphs etc.
cc @mattdm ?
This is done via a set of scripts not attached to the work James has been doing. The older data graphs were to be end-of-lifed around Fedora 40 and replaced with ones generated weekly from the dnf countme data.
The scripts are a mix of Python and awk which do a rough count of one entry per arch, release, and IP per day and then write out a CSV file. That CSV file is then run through some rough gnuplot scripts whose graphs get updated daily.
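For anyone poking at this, the per-day dedup described above could be sketched roughly like this in Python (a guess at the approach, not the actual scripts under roles/web-data-analysis/files/; the tuple fields and function names here are made up for illustration):

```python
import csv
from collections import Counter

def rough_count(entries):
    """Rough count: at most one hit per (day, arch, release, ip).

    entries: iterable of (day, arch, release, ip) tuples.
    Returns a Counter keyed by (day, arch, release).
    """
    seen = set()
    counts = Counter()
    for day, arch, release, ip in entries:
        key = (day, arch, release, ip)
        if key in seen:
            continue  # same IP already counted for this day/arch/release
        seen.add(key)
        counts[(day, arch, release)] += 1
    return counts

def write_csv(counts, path):
    """Write the rough counts out as a CSV for gnuplot to consume."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["date", "arch", "release", "unique_ips"])
        for (day, arch, release), n in sorted(counts.items()):
            w.writerow([day, arch, release, n])
```

That CSV would then be the input the gnuplot side plots from, which is why adding a new release means touching both the counting scripts and the plot definitions.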
All of these were really a short-term hack which turned into a permanent solution, so updating to newer releases requires a lot of manual hacking of the scripts, the gnuplot graphs, and the existing CSV files to deal with the newer data. At the moment the only data I would consider it useful for is the EPEL-6/8/9 numbers and the historical data for other items.
A better solution would be to finish the 'make a graph every week from the countme data' project and keep an older set of pictures with an end-date of when they were last valid (Jun 2024?).
The hacky stuff appears to be under roles/web-data-analysis/files/
Perhaps it's not hard to just add 40+ for now?
Or perhaps someone can finish the new thing... but I don't even know the status of that? was someone working on it?
I thought that the new thing is already finished and running.
Last year, I was playing with a rewrite of https://pagure.io/brontosaurusifier which includes a frontend for the graphs. I didn't finish it then, but (obviously) I kept the code and it probably wouldn't be hard to wrap it up. I could make a PR for stg or communishift, if people want me to push on with it.
It would be nice to have for sure...