[Git][reproducible-builds/reproducible-misc][master] Migrate the report generation to the website repo.

Chris Lamb gitlab at salsa.debian.org
Sun Apr 22 13:15:06 CEST 2018


Chris Lamb pushed to branch master at Reproducible Builds / reproducible-misc


Commits:
7b30e845 by Chris Lamb at 2018-04-22T13:14:37+02:00
Migrate the report generation to the website repo.

- - - - -


12 changed files:

- − reports/.gitignore
- − reports/README
- − reports/bin/generate-draft
- − reports/bin/generate-draft.template
- − reports/bin/get-latest-data
- − reports/bin/history
- − reports/bin/newly-reproducible
- − reports/bin/review-bugs
- − reports/bin/review-issues
- − reports/bin/review-stats
- − reports/bin/review-uploads
- − reports/bin/sync-master-to-alioth


Changes:

=====================================
reports/.gitignore deleted
=====================================
--- a/reports/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-/data
-/latest


=====================================
reports/README deleted
=====================================
--- a/reports/README
+++ /dev/null
@@ -1,119 +0,0 @@
-Weekly reports
-==============
-
-The idea is to inform and update the project and everyone who is curious about
-the progress we are making. It also highlights that this is a team effort and
-credits the many people who help towards our goals.
-
-This repository contains the tools that help with writing the report, while
-git.debian.org/git/reproducible/blog.git is where the report is actually
-drafted and published.
-
-Reports are from Sunday 00:00 UTC to Sunday 23:59 UTC.
-
-
-Process
--------
-
-0. Get the previous week's data:
-
-       $ bin/get-latest-data
-
-   If you are not a Debian developer, ask one to run
-   bin/sync-master-to-alioth, which will sync the mailboxes to alioth.
-   (bin/get-latest-data will still be able to get most of the data for
-   you, so you should still run it too…)
-
-1. Look at packages that became reproducible in that week.
-
-       $ bin/newly-reproducible
-
-   This output will form the base for the stats in your report.
-   It needs to be manually verified: some packages may only appear
-   to be fixed in a given version because a toolchain change
-   coincided with a timely upload. The script can also give false
-   positives: a package may have been reproducible before, broken
-   by a toolchain upload, and then fixed again later.
-
-   Interactive mode, to help with manual verification:
-
-       $ bin/newly-reproducible --interactive
-
-   Once you get used to the workflow, this will save you a lot of time:
-
-       $ bin/newly-reproducible -ia
-
-   You can also look at specific packages:
-
-       $ bin/newly-reproducible -ia ghc rustc ocamlc
-
-   The underlying script that powers interactive mode is bin/history;
-   see that for details.
-
-2. Look at all relevant bug reports that have been modified this week.
-
-   There's a script that will query UDD for bug reports that have
-   been updated in the past 7 days, then use the `bts` tool to cache
-   them, and finally display them using `mutt`.
-
-       $ bin/review-bugs     # this will take a while, do (3) while it runs
-       $ bin/review-bugs -o  # on subsequent runs, to avoid re-downloading
-
-   Non-FTBFS bugs that have patches:
-
-       $ bin/review-bugs -o <(comm -23 latest/bugs-patch latest/bugs-ftbfs)
-
-3. Look at all uploads for the past week.
-
-       $ bin/review-uploads
-
-   This regexp might help with formatting:
-
-   ^.*\d+ (\S+.*\S)\s+.* Accepted (\S*) (\S*)
-    * [[!pkg \2]]/\3 by \1
-
-   Note that the mutt view truncates some names.
-
-4. Stats for package reviews:
-
-   Make sure you get a recent copy of `notes.git`. Then:
-
-       $ ../misc/reports/bin/review-stats       # changes to packages
-       $ ../misc/reports/bin/review-issues      # changes to issues
-
-5. QA work:
-
-       $ cat latest/new-bugs-ftbfs.txt          # new FTBFS bugs
-
-6. Manually reported:
-
-       $ less latest/weekly-log.txt
-
-   Make sure you only look at the entries for the previous week, and
-   don't accidentally include the current week. (The previous week is
-   probably the second section.)
-
-7. Git repositories and custom toolchain:
-
-   Look for recent uploads in the package repository:
-   https://reproducible.alioth.debian.org/debian/
-
-   Complement this with recent activity in Git:
-   https://lists.alioth.debian.org/pipermail/reproducible-commits/
-   https://anonscm.debian.org/cgit/reproducible/?s=idle
-
-   Also take a look at jenkins.debian.net:
-   https://lists.alioth.debian.org/pipermail/qa-jenkins-scm/
-   https://anonscm.debian.org/cgit/qa/jenkins.debian.net.git/
-
-8. Documentation updates:
-
-   Log in to `wiki.debian.org` and head to:
-   https://wiki.debian.org/RecentChanges?max_days=14
-
-   Look for changes in pages with “ReproducibleBuilds” in the name.
-
-9. Left-overs:
-
-   Look at the mailing list archive:
-   https://lists.alioth.debian.org/pipermail/reproducible-builds/
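
The Sunday-to-Sunday week boundaries used throughout this process are derived
from a fixed epoch (WEEK_1_END, the exclusive end of week 1) in both
generate-draft and get-latest-data. A minimal sketch of that arithmetic:

```python
import time

# End of report week 1: May 3 2015, 00:00 UTC, a Sunday (exclusive).
WEEK_1_END = 1430611200
WEEK = 7 * 24 * 3600


def prev_week(now=None):
    """Number of the most recently completed report week."""
    now = int(time.time()) if now is None else now
    return (now - WEEK_1_END) // WEEK + 1


def week_bounds(week):
    """Return (start, end) Unix timestamps; start inclusive, end exclusive."""
    end = WEEK_1_END + (week - 1) * WEEK
    return end - WEEK, end
```

This mirrors the computation in the scripts below; get-latest-data does the
same in shell arithmetic.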


=====================================
reports/bin/generate-draft deleted
=====================================
--- a/reports/bin/generate-draft
+++ /dev/null
@@ -1,321 +0,0 @@
-#!/usr/bin/env python3
-
-import collections
-import datetime
-import jinja2
-import os
-import pickle
-import re
-import subprocess
-import sys
-import time
-import yaml
-
-WEEK_1_END = 1430611200  # May 3 2015, 00:00 UTC, Sunday
-
-PROJECTS = (
-    'diffoscope',
-    'strip-nondeterminism',
-    'disorderfs',
-    'reprotest',
-    'buildinfo.debian.net',
-    'trydiffoscope',
-    'reproducible-website',
-    'jenkins.debian.net',
-)
-
-
-def main(*args):
-    for x in PROJECTS + ('notes',):
-        ensure_dir(sibling_repo_gitdir(x))
-
-    week = int(args[0]) if len(args) > 0 else prev_week()
-
-    data = get_data(week)
-
-    env = jinja2.Environment(
-        loader=jinja2.FileSystemLoader(os.path.dirname(__file__))
-    )
-    print(env.get_template('generate-draft.template').render(**data))
-
-    return 0
-
-
-
-def log(msg, *args, **kwargs):
-    print("I: " + msg.format(*args, **kwargs), file=sys.stderr)
-
-
-def prev_week():
-    now = int(time.time())
-    return ((now - WEEK_1_END) // (7*24*3600) + 1)
-
-
-def sibling_repo_gitdir(path):
-    toplevel = os.path.dirname(subprocess.check_output((
-        'git',
-        'rev-parse',
-        '--show-toplevel',
-    )).decode('utf-8'))
-
-    return os.path.join(toplevel, path, '.git')
-
-
-def ensure_dir(path):
-    if not os.path.isdir(path):
-        raise ValueError("not a directory: {}".format(path))
-
-
-def get_data(week, max_age=3600):
-    filename = '/tmp/generate-draft-{}.pickle'.format(week)
-
-    try:
-        mtime = os.path.getmtime(filename)
-        mtime_self = os.path.getmtime(__file__)
-
-        if mtime > mtime_self and mtime >= time.time() - max_age:
-            log("Using cache from {}", filename)
-
-            with open(filename, 'rb') as f:
-                return pickle.load(f)
-    except (EOFError, OSError):
-        pass
-
-    log("Getting new data")
-
-    week_end = WEEK_1_END + (week - 1) * 7 * 24 * 3600  # exclusive
-    week_start = week_end - 7 * 24 * 3600  # inclusive
-
-    data = {x: y(week_start, week_end) for x, y in (
-        ('author', get_author),
-        ('commits', get_commits),
-        ('uploads', get_uploads),
-        ('patches', get_patches),
-        ('ftbfs_bugs', get_ftbfs_bugs),
-        ('issues_yml', get_issues_yml),
-        ('packages_yml', get_packages_yml),
-        ('packages_stats', get_packages_stats),
-    )}
-
-    data.update({
-        'week': week,
-        'week_start': datetime.datetime.utcfromtimestamp(week_start),
-        'week_end': datetime.datetime.utcfromtimestamp(week_end - 1),
-        'projects': PROJECTS,
-    })
-
-    log("Saving cache to {}", filename)
-
-    with open(filename, 'wb') as f:
-        pickle.dump(data, f)
-
-    return data
-
-
-
-def get_author(week_start, week_end):
-    return os.environ.get('DEBFULLNAME', 'FIXME')
-
-
-def get_ftbfs_bugs(week_start, week_end):
-    return bugs(
-        week_start,
-        week_end,
-        "bugs_usertags.tag = '{}'".format('ftbfs'),
-    )
-
-
-def get_patches(week_start, week_end):
-    return bugs(
-        week_start,
-        week_end,
-        "id IN (SELECT id FROM bugs_tags WHERE tag = 'patch')",
-    )
-
-
-def bugs(week_start, week_end, extra="true"):
-    log("Querying UDD for usertagged bugs with filter: {}", extra)
-
-    fields = (
-        'id',
-        'source',
-        'submitter',
-        'submitter_name',
-        'title',
-        'arrival',
-    )
-
-    sql = """
-        SELECT
-            {fields}
-        FROM
-            bugs
-        INNER JOIN
-            bugs_usertags USING (id)
-        WHERE
-            bugs_usertags.email = 'reproducible-builds at lists.alioth.debian.org'
-        AND
-            {extra}
-        AND
-            CAST(arrival AS DATE) BETWEEN to_timestamp(@{week_start}) AND to_timestamp(@{week_end})
-    """.format(**{
-        'fields': ', '.join(fields),
-        'extra': extra,
-        'week_start': week_start,
-        'week_end': week_end,
-    })
-
-    seen = set()
-    result = {}
-    for x in udd(sql, fields):
-        if x['id'] in seen:
-            continue
-        seen.add(x['id'])
-
-        result.setdefault(x['submitter_name'], []).append(x)
-
-    return {
-        x: list(sorted(y, key=lambda x: x['id'])) for x, y in result.items()
-    }
-
-
-def get_uploads(week_start, week_end):
-    log("Querying UDD for uploads")
-
-    fields = (
-        'source',
-        'version',
-        'distribution',
-        'signed_by_name',
-    )
-
-    data = udd("""
-        SELECT
-            {fields}
-        FROM
-            upload_history
-        WHERE
-            source IN ({sources})
-        AND
-            CAST(date AS date) BETWEEN to_timestamp(@{week_start}) AND to_timestamp(@{week_end})
-        ORDER BY
-            date
-    """.format(**{
-        'fields': ', '.join(fields),
-        'sources': ', '.join("'{}'".format(x) for x in PROJECTS),
-        'week_start': week_start,
-        'week_end': week_end,
-    }), fields)
-
-    result = {}
-    for x in data:
-        result.setdefault(x['source'], []).append(x)
-
-    return result
-
-
-def udd(query, fields):
-    lines = subprocess.check_output("""
-        echo "{}" | ssh alioth.debian.org psql --no-psqlrc service=udd
-    """.format(query), shell=True)
-
-    data = []
-
-    for line in lines.splitlines()[2:]:
-        split = line.decode('utf-8').split('|')
-
-        if len(split) != len(fields):
-            continue
-
-        row = dict(zip(fields, [x.strip() for x in split]))
-
-        data.append(row)
-
-    return data
-
-
-def get_commits(week_start, week_end):
-    return {x: commits(week_start, week_end, x) for x in PROJECTS}
-
-
-def get_issues_yml(week_start, week_end):
-    return commits(week_start, week_end, 'notes', 'issues.yml')
-
-
-def get_packages_yml(week_start, week_end):
-    return commits(week_start, week_end, 'notes', 'packages.yml')
-
-
-def open_packages_yml(date):
-    return subprocess.Popen(
-        "git show $(git rev-list -n1 --until @{0} origin/master):packages.yml".format(date),
-        shell=True,
-        cwd=sibling_repo_gitdir("notes"),
-        stdout=subprocess.PIPE).stdout
-
-
-def get_packages_stats(week_start, week_end):
-    old = yaml.safe_load(open_packages_yml(week_start))
-    new = yaml.safe_load(open_packages_yml(week_end))
-
-    removed = set(old.keys()) - set(new.keys())
-    added = set(new.keys()) - set(old.keys())
-    updated = 0
-    for name in set(old.keys()).intersection(new.keys()):
-        if old[name] != new[name]:
-            updated += 1
-    return {
-        "removed": len(removed),
-        "added": len(added),
-        "updated": updated,
-    }
-
-
-def commits(week_start, week_end, project, path='.'):
-    # Assume it's in the parent directory
-    git_dir = sibling_repo_gitdir(project)
-
-    subprocess.check_call(('git', 'fetch', 'origin'), cwd=git_dir)
-
-    output = subprocess.check_output((
-        'git',
-        'log',
-        'origin/master',
-        '--since', '@{}'.format(week_start),
-        '--until', '@{}'.format(week_end),
-        '--pretty=format:%an\t%h\t%s',
-        '--no-merges',
-        '--all',
-        '--',
-        path,
-    ), cwd=git_dir).decode('utf-8')
-
-    result = collections.defaultdict(list)
-    for x in output.splitlines():
-        author, sha, title = x.split('\t', 2)
-
-        # Use any() so that "continue" skips this commit; a nested
-        # loop's continue would only advance the inner pattern loop.
-        if any(re.search(p, author) for p in (
-            r'^dependabot$',
-        )):
-            continue
-
-        if any(re.search(p, title) for p in (
-            r'^--fix-deterministic$',
-            r'^Add missing usertagged bugs$',
-            r'^Remove archived bugs$',
-            r'^Release .* to Debian .*$',
-        )):
-            continue
-
-        result[author].append({
-            'sha': sha,
-            'title': title.replace('_', r'\_'),
-        })
-
-    return result
-
-
-if __name__ == '__main__':
-    sys.exit(main(*sys.argv[1:]))
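
get_data() above avoids repeated UDD and git queries with a pickle cache that
is invalidated when it is older than max_age (or older than the script
itself). The pattern in isolation, with a stand-in compute callable instead of
the real data gathering:

```python
import os
import pickle
import tempfile
import time


def cached(filename, compute, max_age=3600):
    """Return pickled data from filename if fresh enough, else recompute."""
    try:
        if os.path.getmtime(filename) >= time.time() - max_age:
            with open(filename, 'rb') as f:
                return pickle.load(f)
    except (EOFError, OSError):
        pass  # missing, stale or truncated cache: fall through and recompute

    data = compute()
    with open(filename, 'wb') as f:
        pickle.dump(data, f)
    return data


calls = []
path = os.path.join(tempfile.mkdtemp(), 'cache.pickle')
first = cached(path, lambda: calls.append(1) or {'week': 42})
second = cached(path, lambda: calls.append(1) or {'week': 42})
```

On the second call the pickle is fresh, so compute() is not invoked again.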


=====================================
reports/bin/generate-draft.template deleted
=====================================
--- a/reports/bin/generate-draft.template
+++ /dev/null
@@ -1,48 +0,0 @@
----
-layout: blog
-week: {{ week }}
----
-
-Here's what happened in the [Reproducible Builds](https://reproducible-builds.org) effort between {{ week_start.strftime('%A %B') }} {{ week_start.day }} and {{ week_end.strftime('%A %B') }} {{ week_end.day }} {{ week_end.year }}:
-
-* FIXME
-
-Packages reviewed and fixed, and bugs filed
--------------------------------------------
-
-FIXME: prune the below list so it doesn't duplicate the information above, and perhaps also run bin/review-bugs to see if anything was missed.
-
-{% for x, ys in patches.items()|sort %}* {{ x }}:
-{% for y in ys %}    * [#{{ y['id'] }}](https://bugs.debian.org/{{ y['id'] }}) filed against [{{ y['source'] }}](https://tracker.debian.org/pkg/{{ y['source'] }}).
-{% endfor %}{% endfor %}
-
-In addition, build failure bugs were reported by:
-{% for k, v in ftbfs_bugs.items()|sort %}
-* {{ k }} ({{ v|length }}){% endfor %}
-
-{% for project in projects %}
-{{ project }} development
-------------{{ "-" * project|length }}
-{% for x in uploads[project] %}
-Version [{{ x['version'] }}](https://tracker.debian.org/news/FIXME) was uploaded to {{ x['distribution'] }} by {{ x['signed_by_name'] }}. It [includes contributions already covered by posts in previous weeks](https://anonscm.debian.org/git/reproducible/{{ project }}.git/log/?h={% if project != 'diffoscope' %}debian/{% endif %}{{ x['version'] }}) as well as new ones from:
-
-{% endfor %}
-{% for x, ys in commits[project].items() %}* {{ x }}:{% for y in ys %}
-    * [{{ y['title'] }}]({% if project == "jenkins.debian.net" %}https://salsa.debian.org/qa/jenkins.debian.net/commit/{{ y['sha'] }}{% else %}https://anonscm.debian.org/git/reproducible/{{ project }}.git/commit/?id={{ y['sha'] }}{% endif %}){% endfor %}
-{% endfor %}
-{% endfor %}
-
-Reviews of unreproducible packages
-----------------------------------
-
-{{ packages_stats['added'] }} package reviews have been added, {{ packages_stats['updated'] }} have been updated and {{ packages_stats['removed'] }} have been removed this week, adding to our [knowledge about identified issues](https://tests.reproducible-builds.org/debian/index_issues.html).
-
-FIXME issue types have been updated:
-{% for _, xs in issues_yml.items()|sort %}{% for x in xs %}
-* [{{ x['title'] }}](https://anonscm.debian.org/git/reproducible/notes.git/commit/?id={{ x['sha'] }}){% endfor %}{% endfor %}
-
-
-Misc.
------
-
-This week's edition was written by {{ author }} & reviewed by a bunch of Reproducible Builds folks on IRC & the mailing lists.
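
The added/updated/removed counts rendered above come from get_packages_stats
in generate-draft, which compares two snapshots of packages.yml. The
comparison reduces to a plain dict diff (sketched here with inline dicts
instead of YAML snapshots):

```python
def diff_stats(old, new):
    """Count keys added, removed and changed between two snapshots."""
    removed = set(old) - set(new)
    added = set(new) - set(old)
    updated = sum(1 for k in set(old) & set(new) if old[k] != new[k])
    return {'added': len(added), 'updated': updated, 'removed': len(removed)}


# Hypothetical review notes keyed by package name.
old = {'foo': 'note-a', 'bar': 'note-b', 'baz': 'note-c'}
new = {'foo': 'note-a', 'bar': 'note-B', 'qux': 'note-d'}
stats = diff_stats(old, new)
```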


=====================================
reports/bin/get-latest-data deleted
=====================================
--- a/reports/bin/get-latest-data
+++ /dev/null
@@ -1,136 +0,0 @@
-#!/bin/bash
-# Get latest data relevant to the previous week, or a week given by -w.
-#
-# If you don't have access to master.debian.org, ask someone who does to run
-# bin/sync-master-to-alioth. After they have done that, you can re-run this
-# script with "-m alioth.debian.org:/home/groups/reproducible/mirror"
-#
-# If you want to use torsocks, run this script with TORSOCKS=torsocks.
-
-set -e
-
-scriptdir="$(readlink -f "$(dirname "$0")")"
-week_1_end=1430611200 # May 3 2015, 00:00 UTC, Sunday
-
-mkdir -p data
-now="$(date +%s)"
-
-prev_week=$(((now - week_1_end) / (7*24*3600) + 1))
-
-while getopts 'w:m:' opt; do
-	case $opt in
-		w)
-			week="$OPTARG"
-			;;
-		m)
-			mailbase="$OPTARG"
-			;;
-	esac
-done
-shift `expr $OPTIND - 1`
-
-week="${week:-$prev_week}"
-defaultmailbase="master.debian.org:/srv/mail-archives/lists"
-othermailbase="alioth.debian.org:/home/groups/reproducible/mirror"
-mailbase="${mailbase:-$defaultmailbase}"
-
-week_end=$((week_1_end + (week - 1)*7*24*3600))
-week_start=$((week_end - 7*24*3600))
-week_path="week_${week}_ending_$(date -u -d@${week_end} +%Y-%m-%d)"
-
-echo >&2 "Getting data for the period $(date -u -d@${week_start} +%Y-%m-%d) to $(date -u -d@${week_end} +%Y-%m-%d)"
-
-mkdir -p "data/$week_path"
-(cd "data/$week_path"
-
-cat >variables <<eof
-export RB_REPORT_WEEK_NUMBER=$week
-export RB_REPORT_WEEK_START=$week_start
-export RB_REPORT_WEEK_END=$week_end
-eof
-
-echo >&2 "- reproducible.db (>350MB)"
-url=https://reproducible.debian.net/reproducible.db
-$TORSOCKS wget -q --show-progress -c $url
-
-echo >&2 "- changelogs of newly-reproducible packages"
-mkdir -p changelogs
-rm -f changelogs-failed && touch changelogs-failed
-
-(cd changelogs
-for url in $(REPRODUCIBLE_DB=../reproducible.db "$scriptdir/newly-reproducible" | sed -n -e 's,.*<,,;s,>.*,,p' | sort -u); do
-	p=${url%/*}; p=${p##*/};
-	ln -sf "$(basename "$url")" "$p"
-	echo -n "  * $p"
-	$TORSOCKS wget -q -N "$url" || ( echo $url >> ../changelogs-failed ; echo -n " failed." )
-	echo
-done
-)
-
-echo >&2 "- mails describing uploads to debian (will ssh to $mailbase)"
-current_month="$(date -u +%Y%m)"
-month_at_end="$(date -u +%Y%m -d@$week_end)"
-month_at_start="$(date -u +%Y%m -d@$week_start)"
-rm -f uploads.mbox
-
-warn_emails() {
-	test "$mailbase" "$1" "$defaultmailbase" || return 0
-	echo -e >&2 "  \e[1;31mAsk a DD to run bin/sync-to-alioth.\e[0m After they do this, re-run this script but as: "
-	echo >&2 "    $0 -m $othermailbase"
-}
-
-get_emails() {
-	if [ "$1" = "$current_month" ]; then
-		rsync -v "$mailbase/debian-devel-changes/debian-devel-changes.$1" "debian-devel-changes.$1.snapshot"
-		cat "debian-devel-changes.$1.snapshot" >> uploads.mbox
-	else
-		rsync -v "$mailbase/debian-devel-changes/debian-devel-changes.$1.xz" "debian-devel-changes.$1.xz"
-		xzcat "debian-devel-changes.$1.xz" >> uploads.mbox
-	fi
-}
-if ssh "${mailbase%:*}" true; then
-	if [ "$month_at_start" != "$month_at_end" ]; then
-		get_emails "$month_at_start" || warn_emails !=
-	fi
-	get_emails "$month_at_end" || warn_emails !=
-else
-	warn_emails =
-fi
-
-echo >&2 "- bug reports that were modified (will ssh to alioth.debian.org)"
-query_select="SELECT DISTINCT bugs.id FROM bugs_usertags, bugs"
-query_filter="bugs_usertags.email = 'reproducible-builds at lists.alioth.debian.org' \
-AND bugs.id = bugs_usertags.id \
-AND bugs.last_modified >= to_timestamp($week_start) \
-AND bugs.last_modified < to_timestamp($week_end)"
-query_order="ORDER BY bugs.id"
-udd_query="ssh alioth.debian.org psql service=udd -t"
-
-echo "${query_select} WHERE ${query_filter} ${query_order};" \
-  | $udd_query | awk '/[0-9]/ { print $1 }' > bugs
-echo "${query_select}, bugs_tags WHERE \
-$query_filter AND bugs.id = bugs_tags.id AND bugs_tags.tag = 'patch' ${query_order};" \
-  | $udd_query | awk '/[0-9]/ { print $1 }' > bugs-patch
-echo "${query_select} WHERE ${query_filter} AND bugs_usertags.tag = 'ftbfs' ${query_order};" \
-  | $udd_query | awk '/[0-9]/ { print $1 }' > bugs-ftbfs
-echo "SELECT bugs.submitter, count(distinct bugs.id) FROM bugs_usertags, bugs WHERE \
-${query_filter//last_modified/arrival} AND bugs_usertags.tag = 'ftbfs' GROUP BY bugs.submitter;" \
-  | $udd_query > new-bugs-ftbfs.txt
-
-echo >&2 "- weekly-log.txt (will ssh to alioth.debian.org)"
-scp -q alioth.debian.org:/home/groups/reproducible/weekly-log.txt .
-
-### report failures
-
-if [ -s changelogs-failed ] ; then
-	echo
-	echo "These URLs could not be downloaded; please investigate and download manually:"
-	echo
-	cat changelogs-failed
-fi
-
-)
-
-if [ "$prev_week" = "$week" ]; then
-	rm -f latest && ln -sf "data/$week_path" latest
-fi
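
Both get-latest-data and generate-draft shell out to psql over ssh and then
parse its textual output: the udd() helper in generate-draft splits the
default aligned format on '|', discarding the header and any row with the
wrong column count. A self-contained sketch of that parsing (sample_output is
fabricated for illustration):

```python
def parse_psql(output, fields):
    """Parse psql's aligned output into a list of dicts keyed by fields."""
    rows = []
    for line in output.splitlines()[2:]:  # skip header and separator lines
        parts = [p.strip() for p in line.split('|')]
        if len(parts) != len(fields):
            continue  # footers such as "(2 rows)", empty lines
        rows.append(dict(zip(fields, parts)))
    return rows


sample_output = """\
  id   | source
-------+--------
 12345 | foo
 67890 | bar
(2 rows)
"""
bugs = parse_psql(sample_output, ('id', 'source'))
```

Passing `-t`/`-A` to psql (as get-latest-data partly does) would avoid the
header skipping entirely.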


=====================================
reports/bin/history deleted
=====================================
--- a/reports/bin/history
+++ /dev/null
@@ -1,108 +0,0 @@
-#!/bin/sh
-# history: look at the history of reproducibility tests for a package
-#
-# Copyright © 2015 Lunar <lunar at debian.org>
-#           © 2015 Mattia Rizzolo <mattia at mapreri.org>
-# Licensed under WTFPL — http://www.wtfpl.net/txt/copying/
-
-changelog=false
-less=false
-full=false
-suite=
-arch=
-while getopts 'xlcs:a:' opt; do
-    case $opt in
-        x)
-            full=true
-            ;;
-        c)
-            changelog=true
-            less=true
-            ;;
-        l)
-            less=true
-            ;;
-        s)
-            suite=$OPTARG
-            ;;
-        a)
-            arch=$OPTARG
-            ;;
-    esac
-done
-shift `expr $OPTIND - 1`
-
-if [ -n "$1" ]; then
-    PACKAGE=$1
-else
-    echo "Please provide a package name as a first parameter to this script"
-    exit 1
-fi
-
-FILTER=
-if [ -n "$suite" ]; then
-    FILTER="${FILTER} AND suite='$suite'"
-fi
-
-if [ -n "$arch" ]; then
-    FILTER="${FILTER} AND architecture='$arch'"
-fi
-
-DB="${DB:-latest/reproducible.db}"
-LOGS="$(dirname "$DB")/changelogs"
-SQLITE_OPTS="${SQLITE_OPTS:--column -header}"
-
-main() {
-
-if $full; then
-    QUERY="SELECT * FROM stats_build WHERE name='$PACKAGE' $FILTER ORDER BY build_date"
-    WIDTH="5 0 0 0 0 15 0 0 13"
-else
-    if tty -s; then
-        status="replace(\
-                replace(\
-                replace(\
-                replace(\
-                status,\
-                'FTBFS', X'1B'||'[31mFTBFS'||X'1B'||'[0m'),\
-                'unrepr', X'1B'||'[91munrepr'||X'1B'||'[0m'),\
-                'depwai', X'1B'||'[93mdepwai'||X'1B'||'[0m'),\
-                'reprod', X'1B'||'[92mreprod'||X'1B'||'[0m') AS status"
-        swidth="15" # 6 + number of escape chars
-    else
-        status="status"
-        swidth="6"
-    fi
-    QUERY="SELECT name, version, suite, architecture AS arch, $status, build_date FROM stats_build WHERE name='$PACKAGE' $FILTER ORDER BY build_date"
-    WIDTH="0 25 0 7 $swidth 13"
-fi
-sqlite3 $SQLITE_OPTS -cmd ".width $WIDTH" "$DB" "$QUERY"
-
-$full || exit 0
-
-printf "\n\n@@@@@ RESULTS @@@@@@\n"
-QUERY="SELECT s.id as 'pkg id', s.name, s.version, s.suite, s.architecture as arch, s.notify_maintainer as notify, r.version as 'tested version', r.status, r.build_date, r.build_duration as duration, r.builder
-FROM sources AS s JOIN results AS r ON r.package_id=s.id WHERE s.name='$PACKAGE'"
-WIDTH="6 0 0 0 5 6 0 0 16 0 13"
-RESULT="$(sqlite3 $SQLITE_OPTS -cmd ".width $WIDTH" "$DB" "$QUERY" 2> /dev/null)"
-
-if [ ! -z "$RESULT" ] ; then echo "$RESULT" ; else echo "$PACKAGE has not been built yet" ; fi
-
-}
-
-if $less; then
-    main | less -R +G
-else
-    main
-fi
-
-if $changelog; then
-    if [ -f "$LOGS/$PACKAGE" ]; then
-        less $LOGS/$PACKAGE
-    elif grep -qF "/$PACKAGE/" "${LOGS}-failed"; then
-        { echo "failed to download $(grep -F "/$PACKAGE/" "${LOGS}-failed")";
-        echo "probably superseded by a newer version; you should check this yourself"; } | less
-    else
-        echo "no changelog for $PACKAGE found for this period" | less
-    fi
-fi
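
The nested replace() calls above make sqlite3 emit ANSI colour codes for each
build status when stdout is a terminal. The same mapping is easier to read in
Python (colour codes taken from the SQL: FTBFS red, unrepr bright red, depwai
yellow, reprod green):

```python
# ANSI SGR colour codes matching the SQL replace() chain in bin/history.
COLOURS = {
    'FTBFS': '31',   # red
    'unrepr': '91',  # bright red
    'depwai': '93',  # yellow
    'reprod': '92',  # green
}


def colour_status(status):
    """Wrap a build status in the ANSI colour used by bin/history."""
    code = COLOURS.get(status)
    if code is None:
        return status  # unknown statuses pass through uncoloured
    return '\x1b[%sm%s\x1b[0m' % (code, status)
```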


=====================================
reports/bin/newly-reproducible deleted
=====================================
--- a/reports/bin/newly-reproducible
+++ /dev/null
@@ -1,225 +0,0 @@
-#!/usr/bin/python3 -u
-# newly-reproducible: find packages that became reproducible in the past week
-#
-# Copyright © 2015 Lunar <lunar at debian.org>
-# Copyright © 2016 Ximin Luo <infinity0 at debian.org>
-# Licensed under WTFPL — http://www.wtfpl.net/txt/copying/
-
-import argparse
-import os
-import re
-import subprocess
-import sys
-import sqlite3
-import termios
-import time
-import traceback
-import tty
-
-def read1charmode(fd):
-    mode = termios.tcgetattr(fd)
-    mode[tty.LFLAG] = mode[tty.LFLAG] & ~(tty.ICANON)
-    mode[tty.CC][tty.VMIN] = 1
-    mode[tty.CC][tty.VTIME] = 0
-    return mode
-
-def output(name, details):
-    detail_string = "; ".join("on %s %s" % (", ".join(v), k) for k, v in details.items())
-    print(" * [[!pkg %s]] is reproducible %s." % (name, detail_string))
-
-def trace_call(*args, **kwargs):
-    try:
-        subprocess.check_call(*args, **kwargs)
-    except Exception:
-        traceback.print_exc()
-
-def interact(name, details, autoview, fd=sys.stdin, nextname=None, prevname=None):
-    oldattr = termios.tcgetattr(fd)
-    newattr = read1charmode(fd)
-    try:
-        termios.tcsetattr(fd, termios.TCSANOW, newattr)
-        full = []
-        suite = []
-        arch = []
-        autoview = bool(autoview)
-        helptext = """
-h   Show this help
-v   View build logs
-c   View Debian changelog for the latest version
-a   Do/don't automatically view build logs after changing settings
-
-r   Reset all settings
-x   Show/hide full details for "view build logs"
-
-t   Filter/unfilter "view build logs" to suite testing
-u   Filter/unfilter "view build logs" to suite unstable
-e   Filter/unfilter "view build logs" to suite experimental
-
-1   Filter/unfilter "view build logs" to arch amd64
-2   Filter/unfilter "view build logs" to arch i386
-3   Filter/unfilter "view build logs" to arch armhf
-
-w   Open status page (from tests.reproducible-builds.org) in a web browser.
-l   Open changelogs page (from changelogs.debian.net) in a web browser.
-    Useful if 'c' doesn't work; HTTP-only however.
-
-    For the above we use the BROWSER env var if set, or else x-www-browser.
-    If no suite is selected as a filter, uses "unstable".
-    If no arch is selected as a filter, uses "amd64".
-
-.   Go to next item (%s)
-,   Go to prev item (%s)
-Ctrl-C      Quit
-Enter       Go to next item or quit if last item
-""" % (nextname, prevname)
-        commands = "".join(filter(lambda x: x != " ", map(lambda x: x[0], filter(len, helptext.split("\n")))))[1:-2]
-        promptstr = "What do you want to do? [h]elp or [%s] (status: %%s) " % commands
-        view = lambda: trace_call(["bin/history"] + full + suite + arch + [name])
-        if autoview: view()
-        while True:
-            output(name, details)
-            status = filter(None, [
-              "autoview" if autoview else None,
-              "details" if full else None,
-              "suite=%s" % suite[0][2:] if suite else None,
-              "arch=%s" % arch[0][2:] if arch else None])
-            print(promptstr % ", ".join(status), end='', flush=True)
-            c = fd.read(1)
-            print()
-            if c == "\n":
-                return None
-            elif c == ".":
-                return 1
-            elif c == ",":
-                return -1
-            elif c == "h":
-                print(helptext)
-            elif c == "v":
-                view()
-            elif c == "c":
-                trace_call(["less", os.path.join(os.path.dirname(db_path), "changelogs", name)])
-            elif c in "a":
-                autoview = not autoview
-            elif c == "r":
-                return 0
-            elif c == "x":
-                full = ["-x"] if not full else []
-                if autoview: view()
-            elif c in "tue":
-                selected = dict((k[0], "-s"+k) for k in ["testing", "unstable", "experimental"])[c]
-                suite = [selected] if suite != [selected] else []
-                if autoview: view()
-            elif c in "123":
-                selected = dict((k[0], "-a"+k[1:]) for k in ["1amd64", "2i386", "3armhf"])[c]
-                arch = [selected] if arch != [selected] else []
-                if autoview: view()
-            elif c == "w":
-                browser = os.getenv("BROWSER", "x-www-browser")
-                w_suite = suite[0][2:] if suite else "unstable"
-                w_arch = arch[0][2:] if arch else "amd64"
-                url_fmt = "https://tests.reproducible-builds.org/debian/rb-pkg/%s/%s/%s.html"
-                trace_call([browser, url_fmt % (w_suite, w_arch, name)])
-            elif c == "l":
-                browser = os.getenv("BROWSER", "x-www-browser")
-                url_fmt = "http://changelogs.debian.net/%s"
-                trace_call([browser, url_fmt % (name)])
-            else:
-                print(helptext)
-        return 1
-    finally:
-        termios.tcsetattr(fd, termios.TCSANOW, oldattr)
-
-parser = argparse.ArgumentParser(
-    description='find packages that became reproducible in the past week')
-parser.add_argument(
-    '-i', '--interactive', action="store_true", default=False,
-    help='enter an interactive REPL to examine each package in more detail')
-parser.add_argument(
-    '-a', '--autoview', action="store_true", default=False,
-    help='when in interactive mode, automatically view build logs')
-parser.add_argument(
-    'package', nargs="*",
-    help='only select these packages (if they became reproducible)')
-args = parser.parse_args()
-
-query_add = "AND name IN ({})".format(', '.join(map(repr, args.package))) if args.package else ""
-
-db_path = os.environ.get('REPRODUCIBLE_DB', 'latest/reproducible.db')
-variables_path = os.path.join(os.path.dirname(db_path), 'variables')
-
-if os.path.isfile(variables_path):
-    with open(variables_path) as fp:
-        for line in fp.readlines():
-            matches = re.match("export (.*?)=(.*)\n", line)
-            if matches:
-                os.environ[matches.group(1)] = matches.group(2)
-
-if "RB_REPORT_WEEK_START" not in os.environ or "RB_REPORT_WEEK_END" not in os.environ:
-    raise ValueError("RB_REPORT_WEEK_{START,END} not set")
-
-query_date_begin = int(os.environ["RB_REPORT_WEEK_START"])
-query_date_end = int(os.environ["RB_REPORT_WEEK_END"])
-
-conn = sqlite3.connect(db_path)
-
-c = conn.cursor()
-
-now_reproducible = {}
-unreproducible_version = {}
-for name, reproducible_version, architecture, suite, reproducible_build_time in c.execute('SELECT name, version, architecture, suite, strftime("%s", build_date) FROM stats_build WHERE status = "reproducible" AND build_date > DATETIME(?, "unixepoch") AND build_date < DATETIME(?, "unixepoch") AND suite = "unstable" {} ORDER BY build_date DESC'.format(query_add), (query_date_begin, query_date_end)):
-    package_id = '%s/%s' % (name, architecture)
-    if package_id in now_reproducible or package_id in unreproducible_version:
-        continue
-    c2 = conn.cursor()
-    for version, status, build_time in c2.execute('SELECT version, status, strftime("%s", build_date) FROM stats_build WHERE name = ? AND architecture = ? AND suite = ? AND build_date < DATETIME(?, "unixepoch") ORDER BY build_date DESC', (name, architecture, suite, int(reproducible_build_time) - 1)):
-        #print("status %s" % status)
-        if status in ('FTBFS', 'depwait'):
-            continue
-        elif status == 'reproducible':
-            if package_id in now_reproducible:
-                del now_reproducible[package_id]
-            if version != reproducible_version:
-                break
-            if query_date_begin > int(build_time):
-                break
-        elif status == 'unreproducible':
-            if version == reproducible_version:
-                now_reproducible[package_id] = 'likely due to toolchain fixes'
-            elif package_id in unreproducible_version:
-                if unreproducible_version[package_id] != version:
-                    break
-            else:
-                if name.startswith('lib'):
-                    prefix = name[0:4]
-                else:
-                    prefix = name[0]
-                changelog_url = 'http://metadata.ftp-master.debian.org/changelogs/main/%(prefix)s/%(name)s/%(name)s_%(version)s_changelog' % { 'prefix':prefix, 'name': name, 'version': re.sub(r'^[0-9]+:', '', reproducible_version) }
-                now_reproducible[package_id] = 'since %s over %s <%s>' % (reproducible_version, version, changelog_url)
-                break
-            unreproducible_version[package_id] = version
-        else:
-            print('UNKNOWN STATUS %s' % status)
-
-now_reproducible_by_arch = {}
-for package_id in sorted(now_reproducible.keys()):
-    name, architecture = package_id.split('/')
-    now_reproducible_by_arch.setdefault(name, {}).setdefault(now_reproducible[package_id], []).append(architecture)
-
-all_details = sorted(now_reproducible_by_arch.items())
-if args.interactive:
-    i = 0
-    n = len(all_details)
-    while i < n:
-        name, details = all_details[i]
-        nextname = all_details[(i+1)%n][0]
-        prevname = all_details[(i-1)%n][0]
-        chg = interact(name, details, args.autoview, sys.stdin, nextname, prevname)
-        if chg is None:
-            if i == len(all_details) - 1:
-                break
-            else:
-                chg = 1
-        i = (i+chg)%n
-else:
-    for name, details in all_details:
-        output(name, details)
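The changelog URL built in the script above follows Debian's pool-prefix convention (a four-character prefix for `lib*` source packages, the first letter otherwise) and strips any epoch from the version, since epochs never appear in pool paths. A standalone sketch of that rule (the `changelog_url` helper name is mine):

```python
import re

def changelog_url(name, version):
    """Build a metadata.ftp-master changelog URL for a Debian source
    package: 'lib*' packages use a four-character pool prefix, all
    others the first letter; a leading epoch ('1:') is stripped."""
    prefix = name[:4] if name.startswith('lib') else name[:1]
    version = re.sub(r'^[0-9]+:', '', version)
    return ('http://metadata.ftp-master.debian.org/changelogs/main/'
            '%s/%s/%s_%s_changelog' % (prefix, name, name, version))
```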


=====================================
reports/bin/review-bugs deleted
=====================================
--- a/reports/bin/review-bugs
+++ /dev/null
@@ -1,28 +0,0 @@
-#!/bin/bash
-# review-bugs: look at bugs modified in the past week
-#
-# Copyright © 2015 Lunar <lunar@debian.org>
-# Licensed under WTFPL — http://www.wtfpl.net/txt/copying/
-
-offline=false
-while getopts 'o' opt; do
-	case $opt in
-		o)
-			offline=true
-			;;
-	esac
-done
-shift $((OPTIND - 1))
-
-bugs="${1:-latest/bugs}"
-
-if ! $offline; then
-	TOTAL="$(wc -l "$bugs" | awk '{ print $1 }')"
-	for bug in $(cat "$bugs"); do
-		bts cache --cache-mode=mbox "$bug" >/dev/null
-		echo "$bug"
-	done | pv -t -e -p -l -s "$TOTAL"
-fi
-for bug in $(cat "$bugs"); do
-	bts --mailreader="mutt -e \"set folder='$(dirname $bugs)';\" -f %s" -o show --mbox "$bug"
-done


=====================================
reports/bin/review-issues deleted
=====================================
--- a/reports/bin/review-issues
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/bin/sh
-
-test -d ".git" || { echo >&2 "run this from notes.git"; exit 1; }
-scriptdir="$(readlink -f "$(dirname "$0")")"
-test -n "$RB_REPORT_WEEK_END" || { . "$scriptdir/../latest/variables"; }
-
-git log -U8 --since "@$RB_REPORT_WEEK_START" --until "@$RB_REPORT_WEEK_END" --graph -p master -- issues.yml


=====================================
reports/bin/review-stats deleted
=====================================
--- a/reports/bin/review-stats
+++ /dev/null
@@ -1,40 +0,0 @@
-#!/usr/bin/env python3
-# review-stats: compute stats about reviews between two packages.yml files
-# must be run from the notes.git repository, and not this one
-#
-# Copyright © 2015 Lunar <lunar@debian.org>
-# Licensed under WTFPL — http://www.wtfpl.net/txt/copying/
-
-import os
-import subprocess
-import sys
-import yaml
-
-if len(sys.argv) == 1:
-    # get packages.yml from git if $1 $2 not set
-    self_path = sys.argv[0]
-    if not os.path.isdir(".git"):
-        raise ValueError("either run this in notes.git, or give $1 $2")
-    load_variables = ""
-    if "RB_REPORT_WEEK_START" not in os.environ or "RB_REPORT_WEEK_END" not in os.environ:
-        variables_path = os.path.join(os.path.dirname(self_path), "../latest/variables")
-        load_variables = ". %s; " % variables_path
-        print("RB_REPORT_WEEK_{START,END} not set, loading from %s" % variables_path, file=sys.stderr)
-    sys.exit(subprocess.check_call(["bash", "-c", """%s%s \
-        <(git show $(git rev-list -n1 --until @$RB_REPORT_WEEK_START origin/master):packages.yml) \
-        <(git show $(git rev-list -n1 --until @$RB_REPORT_WEEK_END origin/master):packages.yml)
-    """ % (load_variables, self_path)]))
-
-old = yaml.safe_load(open(sys.argv[1]))
-new = yaml.safe_load(open(sys.argv[2]))
-
-removed = set(old.keys()) - set(new.keys())
-print("Removed: %s" % len(removed))
-added = set(new.keys()) - set(old.keys())
-print("Added: %s" % len(added))
-
-updated = 0
-for name in set(old.keys()).intersection(new.keys()):
-    if old[name] != new[name]:
-        updated += 1
-print("Updated: %d" % updated)
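The set arithmetic at the bottom of review-stats can be sketched as a standalone function (the `review_delta` name is mine); it takes the two parsed packages.yml mappings and returns the counts the script prints:

```python
def review_delta(old, new):
    """Return (removed, added, updated) counts between two
    {package: notes} mappings parsed from packages.yml snapshots."""
    removed = set(old) - set(new)       # present before, gone now
    added = set(new) - set(old)         # new entries this week
    updated = sum(1 for name in set(old) & set(new)
                  if old[name] != new[name])
    return len(removed), len(added), updated
```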


=====================================
reports/bin/review-uploads deleted
=====================================
--- a/reports/bin/review-uploads
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-# Show emails matching $pattern in debian-devel-changes@.
-set -e
-
-mbox="${1:-latest/uploads.mbox}"
-pattern="${2:-reproduc\|determinis\|SOURCE_DATE_EPOCH}"
-
-if [ -f "$(dirname "$mbox")/variables" ]; then . "$(dirname "$mbox")/variables"; fi
-if [ -z "$RB_REPORT_WEEK_END" ]; then echo >&2 "abort: RB_REPORT_WEEK_END not set"; exit 1; fi
-
-date_pattern="$(date -u -d"@$RB_REPORT_WEEK_START" +"%d/%m/%y")-$(date -u -d"@$((RB_REPORT_WEEK_END - 1))" +"%d/%m/%y")"
-env TZ=UTC mutt -e "set folder='$(dirname $mbox)'; push 'l ~d $date_pattern ~b $pattern<Enter>os';" -R -f "$mbox"


=====================================
reports/bin/sync-master-to-alioth deleted
=====================================
--- a/reports/bin/sync-master-to-alioth
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/bin/sh
-
-mkdir -p data/debian-devel-changes
-rsync -av --delete --progress -f '- *19????.?z' -f '- *200???.?z' -f '- *201[0-4]??.?z' \
-  master.debian.org:/srv/mail-archives/lists/debian-devel-changes/ data/debian-devel-changes/
-rsync -rv --no-g -p --chmod=Dg+s,g+rw --delete --progress -f '- *19????.?z' -f '- *200???.?z' -f '- *201[0-4]??.?z' "$@" \
-  data/debian-devel-changes/ alioth.debian.org:/home/groups/reproducible/mirror/debian-devel-changes/
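The `- PATTERN` filters above exclude mailbox archives from before 2015 (names ending in a year-month plus a `.gz`/`.xz` suffix). The same globs can be checked with Python's fnmatch; the example file names below are illustrative, not taken from the archive:

```python
from fnmatch import fnmatch

# Glob patterns copied from the rsync filters: anything from the
# 1900s, the 2000s, or 2010-2014 is excluded from the sync.
EXCLUDES = ('*19????.?z', '*200???.?z', '*201[0-4]??.?z')

def excluded(filename):
    """Return True if the rsync '- PATTERN' filters would skip the file."""
    return any(fnmatch(filename, pattern) for pattern in EXCLUDES)
```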



View it on GitLab: https://salsa.debian.org/reproducible-builds/reproducible-misc/commit/7b30e845719c8a21b1a75068d69ee817340c03ce
