[Git][reproducible-builds/debian-rebuilder-setup][master] 19 commits: Update srebuild and set up builder
kpcyrd
gitlab@salsa.debian.org
Mon Nov 5 22:30:14 CET 2018
kpcyrd pushed to branch master at Reproducible Builds / debian-rebuilder-setup
Commits:
a4cb2eb8 by Yash Srivastav at 2018-06-18T14:52:53Z
Update srebuild and set up builder
I have modified the existing srebuild script to work with the
current buildinfo format. The ansible playbook installs the
dependencies of this builder.
- - - - -
8b8b76f9 by Yash Srivastav at 2018-06-18T15:08:22Z
Fix very new package build issue
- - - - -
ab965ee6 by Yash Srivastav at 2018-06-19T16:07:58Z
Perform build in tempdir and cleanup
- - - - -
0f4f9bda by Yash Srivastav at 2018-06-25T02:45:05Z
Add documentation for builder
- - - - -
ddde978b by Yash Srivastav at 2018-06-25T02:45:23Z
Remove extra buildinfo file and extra steps in ansible setup
- - - - -
2e43c5ae by Yash Srivastav at 2018-06-26T03:49:42Z
Integrate in-toto with builder
- - - - -
2f2fd475 by Yash Srivastav at 2018-07-06T17:37:24Z
Reduce number of downloads
- - - - -
1397743d by Yash Srivastav at 2018-07-06T18:00:12Z
Add visualizer and accumulator to expose build results
- - - - -
4e7b8ce3 by Yash Srivastav at 2018-07-10T16:19:07Z
Add readme
- - - - -
f679d735 by Yash Srivastav at 2018-07-16T20:43:16Z
Set up ansible playbook
- - - - -
78fcca53 by Yash Srivastav at 2018-07-17T14:37:27Z
Improve project organisation
- - - - -
dc7d5659 by Yash Srivastav at 2018-07-17T15:57:46Z
Use requirements for installing dependencies
- - - - -
415f4ec2 by Yash Srivastav at 2018-07-17T15:58:00Z
Update TODO to reflect progress
- - - - -
9714bf3b by kpcyrd at 2018-11-01T16:39:14Z
Ansible bugfixes
- - - - -
aee182be by kpcyrd at 2018-11-01T22:13:03Z
Add basic scheduler deployment, waiting for bidb apis
- - - - -
db755a7d by kpcyrd at 2018-11-02T21:44:43Z
Add buildinfo server monitor
Depends on:
https://github.com/lamby/buildinfo.debian.net/pull/54
https://github.com/Foxboron/buildinfo.debian.net/pull/1
- - - - -
a70d2eea by kpcyrd at 2018-11-05T17:40:33Z
Automatically generate gpg key
- - - - -
0a0b0144 by kpcyrd at 2018-11-05T18:09:24Z
Start gunicorn-visualizer on boot
- - - - -
beae8c31 by kpcyrd at 2018-11-05T20:05:18Z
Refactor accumulator deployment
- - - - -
30 changed files:
- + README.md
- TODO
- + ansible.cfg
- + builder/README.md
- + builder/srebuild
- + builder/srebuild-hook
- + external_vars.yml
- + hosts
- playbook.yml
- + requirements.yml
- + roles/builders/tasks/main.yml
- + roles/builders/templates/srebuild-endpoints.j2
- + roles/gpg/defaults/main.yml
- + roles/gpg/tasks/main.yml
- + roles/gpg/templates/gpg-keygen.j2
- + roles/schedulers/tasks/main.yml
- + roles/visualizers/files/default.conf
- + roles/visualizers/files/gunicorn-accumulator.service
- + roles/visualizers/files/gunicorn-visualizer.service
- + roles/visualizers/handlers/main.yml
- + roles/visualizers/tasks/main.yml
- + scheduler/srebuild-monitor
- + scheduler/srebuild-worker
- + visualizer/README.md
- + visualizer/accumulator.py
- + visualizer/requirements.txt
- + visualizer/schema.sql
- + visualizer/templates/all_sources.html
- + visualizer/templates/all_versions_of_source.html
- + visualizer/visualizer.py
Changes:
=====================================
README.md
=====================================
@@ -0,0 +1,4 @@
+```shell
+$ ansible-galaxy install -r requirements.yml
+$ ansible-playbook playbook.yml
+```
=====================================
TODO
=====================================
@@ -1,9 +1,9 @@
todo
----
-a readme file describing all of this here
-a scheduler, to detect+schedule packages in sid
-a builder to build those packages
- using .buildinfo files fetched from buildinfo.debian.net (needs work server side too)
-a db to store the result
-a templating engine to turn these results into shiny HTML and json
-a deployment for all this on the server (using ansible)
+[ ] - a readme file describing all of this here
+[ ] - a scheduler, to detect+schedule packages in sid
+[x] - a builder to build those packages
+ using .buildinfo files fetched from buildinfo.debian.net (needs work server side too)
+[x] - a db to store the result
+[x] - a templating engine to turn these results into shiny HTML and json
+[-] - a deployment for all this on the server (using ansible)
=====================================
ansible.cfg
=====================================
@@ -0,0 +1,4 @@
+[defaults]
+inventory = ./inventory
+roles_path = ./roles
+retry_files_enabled = False
=====================================
builder/README.md
=====================================
@@ -0,0 +1,74 @@
+Builder
+===
+
+Builds a package given just the buildinfo file.
+
+
+# Steps
+
+
+## Parsing Buildinfo
+
+Parses the buildinfo file as a Dpkg control file, since the buildinfo
+format follows the same syntax, and extracts the essential information
+from it.
+
+
+## Calculate sources
+
+For each dependency specified in `Build-Depends` of the buildinfo
+file, look up the [snapshot.debian.org](https://snapshot.debian.org/)
+timestamp at which it first appeared in the archive. Together these
+form a complete list of timestamps, each of which becomes an individual
+source in the chroot `sources.list`.
+
+
+## Setup chroot
+
+Find out when the particular version of the package being built
+appeared in [snapshot.debian.org](https://snapshot.debian.org/).
+If it never appeared, assume it's in latest sid. Using this information,
+set up a chroot from the calculated base repository (either a snapshot or sid).
+
+Also add the sources calculated in the previous step to the `sources.list` of
+the chroot.
+
+
+## Prebuild hooks
+
+
+### Chroot Setup
+
+During the chroot setup phase, install all build dependencies with apt.
+
+
+### Starting Build
+
+Just before starting the build, verify using `dpkg` that all required
+dependencies are installed at the exact versions recorded in the buildinfo file.
+
+
+## Build
+
+The build uses `sbuild` to build the package, after which the checksums are verified.
+
+
+# Work Remaining
+
+## Do something with results
+
+We need to forward the generated checksums / buildinfo to a central location.
+One suggestion was to post signed buildinfo to buildinfo.debian.net.
+
+We intend to expose some metadata about the builds (probably in-toto link metadata
+with a sub-layout) for verification purposes by the client (i.e., apt).
+
+
+# Previous Work and inspiration
+
+There has already been a lot of prior work in this area. The two major scripts /
+code snippets I modified were:
+
+- <https://bugs.debian.org/774415> - <https://salsa.debian.org/reproducible-builds/packages/sbuil>
+- <https://github.com/StevenC99/reprobuild>
+
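[Editor's note] The timestamp lookup described above can be sketched in Python. This is an illustration, not code from this repository; it assumes the JSON shape returned by snapshot.debian.org's `/mr/binary/<pkg>/<ver>/binfiles?fileinfo=1` endpoint, which the Perl `get_first_seen` subroutine below also consumes (the Perl version additionally insists on exactly one matching file entry).

```python
def first_seen(binfiles, arch, archive="debian"):
    """Find the first_seen timestamp for the binary matching `arch`.

    `binfiles` is the parsed JSON from the snapshot.debian.org
    /mr/binary/<pkg>/<ver>/binfiles?fileinfo=1 endpoint.
    """
    results = binfiles.get("result", [])
    if len(results) == 1 and results[0]["architecture"] == "all":
        h = results[0]["hash"]
    else:
        h = next((r["hash"] for r in results if r["architecture"] == arch), None)
    if h is None:
        return None
    # a hash may be known to several archives; keep the one we mirror from
    seen = [f["first_seen"] for f in binfiles["fileinfo"][h]
            if f["archive_name"] == archive]
    return seen[0] if seen else None

def sources_lines(timestamps, suite="sid", area="main"):
    """Turn deduplicated timestamps into extra sources.list entries."""
    return ["deb http://snapshot.debian.org/archive/debian/%s/ %s %s"
            % (t, suite, area) for t in sorted(set(timestamps))]
```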
=====================================
builder/srebuild
=====================================
@@ -0,0 +1,461 @@
+#!/usr/bin/perl
+#
+# Copyright 2014 Johannes Schauer
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies or substantial portions of the Software.
+
+# srebuild uses sbuild's hook functionality (needs #774359 to be fixed) to
+# install the right dependencies from a .buildinfo into the sbuild schroot.
+#
+# Current limitations:
+#
+# - it will only search for results in Debian Sid, main
+
+use strict;
+use warnings;
+use feature qw(say);
+
+use Archive::Tar;
+use Cwd qw(abs_path);
+use Dpkg::Control;
+use Dpkg::Compression::FileHandle;
+use Dpkg::Deps;
+use Dpkg::Index;
+use Dpkg::Checksums;
+use DateTime::Format::Strptime;
+use Compress::Zlib;
+use File::Basename;
+use File::Copy "cp";
+use File::Temp qw(tempdir);
+use Digest::SHA qw(sha256_hex);
+use List::Util qw(first);
+
+eval {
+ require LWP::Simple;
+ require LWP::UserAgent;
+ no warnings;
+ $LWP::Simple::ua =
+ LWP::UserAgent->new( agent => 'LWP::UserAgent/srebuild' );
+};
+if ($@) {
+ die
+"Unable to run: Couldn't load LWP::Simple: $@ (is the libwww-perl package installed?)";
+}
+
+eval { require JSON; };
+if ($@) {
+ die
+"Unable to run: Couldn't load JSON: $@ (is the libjson-perl package installed?)";
+}
+
+sub uniq {
+ my %seen;
+ return grep { !$seen{$_}++ } @_;
+}
+
+sub system_fatal {
+ my @args = @_;
+ print "srebuild: executing: @args\n";
+ my $retval = system @args;
+ $retval >>= 8;
+ if ( $retval != 0 ) {
+ die "failed: @args";
+ }
+}
+
+# this subroutine is from debsnap(1)
+sub fetch_json_page {
+ my ($json_url) = @_;
+ my $content = LWP::Simple::get($json_url);
+ return unless defined $content;
+ my $json = JSON->new();
+ my $json_text = $json->allow_nonref->utf8->relaxed->decode($content);
+ return $json_text;
+}
+
+sub check_checksums {
+ my $checksums = shift;
+
+ foreach my $fname ( $checksums->get_files() ) {
+ my $chksum = $checksums->get_checksum( $fname, 'sha256' );
+ my $size = $checksums->get_size($fname);
+ my $size2 = ( stat($fname) )[7];
+ if ( $size != $size2 ) {
+ print "buildinfo: $size\n";
+ print "actual: $size2\n";
+ die "size mismatch for $fname\n";
+ }
+ open my $fh, '<', $fname;
+ my $chksum2 = sha256_hex <$fh>;
+ close $fh;
+ if ( $chksum ne $chksum2 ) {
+ print "buildinfo: $chksum\n";
+ print "actual: $chksum2\n";
+ die "checksum mismatch for $fname\n";
+ }
+ }
+}
+
+sub generate_in_toto_metadata {
+ my $checksums = shift;
+
+ my @files = $checksums->get_files();
+
+ system_fatal "/usr/local/bin/in-toto-run", "--step-name=rebuild", "--gpg", "--products", @files,
+ "--no-command";
+
+ system_fatal "ls";
+
+ my $lnk_file = (glob( "./rebuild.*.link" ))[0];
+ die "could not find in-toto link metadata" unless defined ( $lnk_file );
+ return abs_path( $lnk_file );
+}
+
+sub parse_buildinfo {
+ my $buildinfo = shift;
+
+ # the CTRL_FILE_CHANGES type should be closest to the .buildinfo format
+ my $cdata = Dpkg::Control->new( type => CTRL_FILE_CHANGES );
+
+ if ( not $cdata->load($buildinfo) ) {
+ die "cannot parse";
+ }
+
+ my $depends = $cdata->{"Installed-Build-Depends"};
+ die "need Installed-Build-Depends field" unless defined $depends;
+
+ my $arch = $cdata->{"Build-Architecture"};
+ die "need Build-Architecture field" unless defined $arch;
+
+ my $checksums = Dpkg::Checksums->new();
+ $checksums->add_from_control($cdata);
+ if ( scalar $checksums->get_files() == 0 ) {
+ die "need Checksums-* field";
+ }
+
+ my @depends = ();
+ $depends =~ s/^\s+|\s+$//g;
+ foreach my $dep ( split( /\s*,\s*/m, $depends ) ) {
+ my $pkg = Dpkg::Deps::Simple->new($dep);
+ die "name undefined" unless defined $pkg->{package};
+ if ( defined( $pkg->{relation} ) ) {
+ if ( $pkg->{relation} ne "=" ) {
+ die "wrong relation";
+ }
+ die "version undefined" unless defined $pkg->{version};
+ }
+ else {
+ die "no version";
+ }
+ push @depends,
+ {
+ name => $pkg->{package},
+ architecture => ( $pkg->{archqual} || $arch ),
+ version => $pkg->{version}
+ };
+ }
+
+ return $cdata, $arch, $checksums, @depends;
+}
+
+sub get_first_seen {
+ my $archive = shift;
+ my $suite = shift;
+ my $area = shift;
+ my $arch = shift;
+ my $pkg = shift;
+ my $ver = shift;
+ my $url =
+ "http://snapshot.debian.org/mr/binary/$pkg/$ver/binfiles?fileinfo=1";
+ my $json_text = fetch_json_page($url);
+
+ unless ( $json_text && @{ $json_text->{result} } ) {
+ print STDERR "Unable to retrieve information for $pkg=$ver from $url.\n";
+ return;
+ }
+ my $hash = undef;
+ if ( scalar @{ $json_text->{result} } == 1 ) {
+ if ( @{ $json_text->{result} }[0]->{architecture} ne "all" ) {
+ print STDERR "expected arch:all\n";
+ return;
+ }
+ $hash = ${ $json_text->{result} }[0]->{hash};
+ }
+ else {
+ foreach my $result ( @{ $json_text->{result} } ) {
+ if ( $result->{architecture} eq $arch ) {
+ $hash = $result->{hash};
+ last;
+ }
+ }
+ }
+ if ( not defined($hash) ) {
+ print STDERR "cannot find architecture for $pkg=$ver\n";
+ return;
+ }
+ my @first_seen = grep { $_->{archive_name} eq $archive }
+ @{ $json_text->{fileinfo}->{$hash} };
+ if ( scalar @first_seen != 1 ) {
+        print STDERR "expected exactly one matching file entry in $archive\n";
+ return;
+ }
+ @first_seen = map { $_->{first_seen} } @first_seen;
+ return $first_seen[0];
+}
+
+sub setup_chroot_sbuild {
+ my $base_repo = shift;
+ my $suite = shift;
+ my $area = shift;
+ my $arch = shift;
+ my $src_pkg = shift;
+ my $src_ver = shift;
+ my @repos = @_;
+
+ # my $build_root = $ENV{'SBUILD_CHROOT_DIR'};
+ my $build_root = abs_path("./$src_pkg-$src_ver");
+ die "need chroot path" unless defined $build_root;
+ my $bn_build_root = basename $build_root;
+
+ @repos = map { "--extra-repository=$_" } @repos;
+
+ unlink glob "/etc/schroot/chroot.d/$suite-$bn_build_root-$arch-sbuild-*";
+ # Setup chroot
+ say STDOUT "Extracting chroot";
+ system_fatal "mkdir", "--parents", "$build_root";
+ system_fatal "sbuild-createchroot",
+ "--alias=$bn_build_root", "--chroot-prefix=$suite-$bn_build_root", @repos,
+ "$suite", "$build_root",
+ "$base_repo";
+ say STDOUT "Done extracting chroot";
+ return $build_root;
+}
+
+sub filter_depends {
+ my $base_repo = shift(@_);
+ my $suite = shift(@_);
+ my $area = shift(@_);
+ my $arch = shift(@_);
+ my %reqpkgs = ();
+ foreach my $pkg (@_) {
+ $reqpkgs{"$pkg->{name}:$pkg->{architecture}=$pkg->{version}"} = $pkg;
+ }
+ my $snapshot_url = "$base_repo/dists/$suite/$area/binary-$arch/Packages.gz";
+ my $response = LWP::Simple::get($snapshot_url);
+ my $dest = Compress::Zlib::memGunzip($response)
+ or die "Cannot uncompress\n";
+
+ print STDERR "process Packages.gz\n";
+
+ open my $fh, '<', \$dest;
+
+ while (1) {
+ my $cdata = Dpkg::Control->new(type => CTRL_INDEX_PKG);
+ last if not $cdata->parse($fh, "Packages.gz");
+ my $pkgname = $cdata->{"Package"};
+ next if not defined($pkgname);
+ my $pkgver = $cdata->{"Version"};
+ my $pkgarch;
+ if ($cdata->{"Architecture"} eq "all") {
+ $pkgarch = $arch;
+ } else {
+ $pkgarch = $cdata->{"Architecture"};
+ }
+ my $key = "$pkgname:$pkgarch=$pkgver";
+ if (exists $reqpkgs{$key}) {
+ delete $reqpkgs{$key};
+ }
+ }
+ close $fh;
+
+ my @depends = values %reqpkgs;
+ return @depends;
+}
+
+sub generate_sources {
+ my $archive = shift(@_);
+ my $suite = shift(@_);
+ my $area = shift(@_);
+ my $arch = shift(@_);
+ my $builddate = shift(@_);
+
+    say STDERR "retrieve first-seen snapshot timestamps for each dependency";
+ my @timestamps = ();
+
+ my $dtparser = DateTime::Format::Strptime->new(
+ pattern => '%Y%m%dT%H%M%SZ',
+ on_error => 'croak',
+ );
+
+ foreach my $pkg (@_) {
+ my $first_seen =
+ get_first_seen( $archive, $suite, $area, $arch, $pkg->{name},
+ $pkg->{version} );
+ die "" unless defined $first_seen;
+ push @timestamps, $dtparser->parse_datetime($first_seen);
+ print "Done $pkg->{name}=$pkg->{version}\n";
+ }
+
+ @timestamps = sort @timestamps;
+ @timestamps = uniq(@timestamps);
+
+ @timestamps = grep { DateTime->compare( $_, $builddate ) != 1 } @timestamps;
+ @timestamps = map { $_->strftime("%Y%m%dT%H%M%SZ") } @timestamps;
+ @timestamps = map {
+ "deb http://snapshot.debian.org/archive/$archive/$_/ $suite $area"
+ } @timestamps;
+
+ return @timestamps;
+}
+
+my $archive = "debian";
+my $suite = "sid";
+my $area = "main";
+
+my $buildinfo = shift @ARGV;
+if ( not defined($buildinfo) ) {
+ die "need buildinfo filename";
+}
+$buildinfo = abs_path($buildinfo);
+
+my $temp_dir = tempdir(CLEANUP => 1);
+cp( "/usr/lib/srebuild-hook", "$temp_dir/srebuild-hook" );
+chdir $temp_dir;
+
+my ( $cdata, $arch, $checksums, @depends ) = parse_buildinfo $buildinfo;
+
+my $environ = $cdata->{"Environment"};
+my @environ = ();
+if ( defined($environ) ) {
+ $environ =~ s/^\s+|\s+$//g;
+ @environ = split /^/, $environ;
+}
+
+@environ = map {
+ ( my $trimmed = $_ ) =~ s/^\s+|\s+$//g;
+ $trimmed;
+} @environ;
+
+my $src_pkg = $cdata->{"Source"};
+if ( not defined($src_pkg) ) {
+ die "need Source field";
+}
+
+my $src_ver = $cdata->{"Version"};
+if ( not defined($src_ver) ) {
+ die "need Version field";
+}
+
+my $dtparser = DateTime::Format::Strptime->new(
+ pattern => '%a, %d %b %Y %H:%M:%S %z',
+ on_error => 'croak',
+);
+
+my $builddate = $dtparser->parse_datetime( $cdata->{"Build-Date"} );
+if ( not defined($builddate) ) {
+ die "need Build-Date field";
+}
+
+my $build_path = $cdata->{"Build-Path"};
+if ( not defined($build_path) ) {
+ die "need Build-Path field";
+}
+
+my $first_seen =
+ get_first_seen( $archive, $suite, $area, $arch, $src_pkg, $src_ver );
+
+my $base_repo;
+
+# If first_seen is defined, then the archive exists in some snapshot
+# If not, we assume that it is a very new package that is only in sid.
+# NOTE: This might fail in extremely rare edge cases (package neither in a
+# snapshot nor in sid)
+if ( defined( $first_seen ) ) {
+ $base_repo = "http://snapshot.debian.org/archive/$archive/$first_seen/";
+} else {
+ $base_repo = "http://deb.debian.org/$archive/";
+}
+
+@depends = filter_depends( $base_repo, $suite, $area, $arch, @depends );
+
+my @timestamps =
+ generate_sources( $archive, $suite, $area, $arch, $builddate, @depends );
+
+my $build_root =
+ setup_chroot_sbuild( $base_repo, $suite, $area, $arch, $src_pkg, $src_ver, @timestamps );
+
+print "architecture = $arch\n";
+
+my $bn_buildinfo = basename $buildinfo;
+my $bn_build_root = basename $build_root;
+
+# calculate absolute path because sbuild changes directories and the user
+# should not be required to specify the absolute path on the command line
+
+say STDOUT "starting prebuild";
+
+system_fatal "mkdir", "--parent", "$build_root/tmp/";
+cp( $buildinfo, "$build_root/tmp/$bn_buildinfo" );
+cp( $buildinfo, "/tmp/$bn_buildinfo" );
+cp( "/usr/lib/srebuild-hook", "$build_root/tmp/srebuild-hook" );
+cp( "/usr/lib/srebuild-hook", "/tmp/srebuild-hook" );
+
+say STDOUT "starting sbuild";
+
+say STDOUT "resetting environment";
+
+my %pres_env = %ENV;
+undef %ENV;
+foreach my $env (@environ) {
+ $env =~ /^(.*)="(.*)"$/;
+ $ENV{$1} = $2;
+}
+
+$build_path =~ m#^(.*)/([^/]+)$#;
+
+system_fatal "sbuild", "--arch=$arch", "--dist=$suite",
+ "--build-path=$1",
+ "--no-apt-update", "--no-apt-upgrade", "--no-apt-distupgrade",
+ "--chroot-setup-commands=/tmp/srebuild-hook chroot-setup /tmp/$bn_buildinfo",
+"--starting-build-commands=/tmp/srebuild-hook starting-build /tmp/$bn_buildinfo",
+ "--chroot=$bn_build_root", "${src_pkg}_${src_ver}";
+
+say STDOUT "restoring environment";
+%ENV = %pres_env;
+
+say STDOUT "check new checksums";
+
+check_checksums $checksums;
+
+say STDOUT "package successfully rebuilt!";
+
+my $lnk = generate_in_toto_metadata $checksums;
+my $buildinfo_file = abs_path( (glob( "./*.buildinfo" ))[0] );
+
+system_fatal "gpg", "--clearsign", $buildinfo_file;
+
+system_fatal "ls";
+$buildinfo_file = abs_path( (glob( "./*.asc" ))[0] );
+
+say STDOUT $lnk;
+say STDOUT $buildinfo_file;
+
+system_fatal "cat", $lnk;
+system_fatal "cat", $buildinfo_file;
+
+foreach my $visualizer (@ARGV) {
+ system_fatal "curl", "-fF", "metadata=\@$lnk", "-F", "buildinfo=\@$buildinfo_file", "$visualizer";
+}
+
+say STDOUT "everything is okay!";
+
+say STDOUT "removing schroot entry";
+
+unlink glob "/etc/schroot/chroot.d/$suite-$bn_build_root-$arch-sbuild-*";
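[Editor's note] The verification performed by `check_checksums` above can be sketched in Python (illustration only; the sizes and SHA-256 digests would come from the buildinfo's `Checksums-Sha256` field):

```python
import hashlib
import os

def check_artifact(path, expected_sha256, expected_size):
    """Verify a rebuilt artifact against the buildinfo checksums.

    Mirrors check_checksums in srebuild: compare the file size first,
    then the SHA-256 digest, reading in chunks to bound memory use.
    """
    if os.path.getsize(path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```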
=====================================
builder/srebuild-hook
=====================================
@@ -0,0 +1,161 @@
+#!/usr/bin/perl
+#
+# Copyright 2014 Johannes Schauer
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies or substantial portions of the Software.
+
+use strict;
+use warnings;
+
+use Dpkg::Control;
+use Dpkg::Compression::FileHandle;
+use Dpkg::Deps;
+use File::Copy;
+
+sub none(&@) {
+ my $code = shift;
+ foreach (@_) {
+ return 0 if $code->();
+ }
+ return 1;
+}
+
+sub system_fatal {
+ my @args = @_;
+ print "srebuild: executing: @args\n";
+ my $retval = system @args;
+ $retval >>= 8;
+ if ($retval != 0) {
+ die "failed: @args";
+ }
+}
+
+sub parse_buildinfo {
+ my $buildinfo = shift;
+
+ my $fh = Dpkg::Compression::FileHandle->new(filename => $buildinfo);
+
+ my $cdata = Dpkg::Control->new(type => CTRL_INDEX_SRC);
+ if (not $cdata->parse($fh, $buildinfo)) {
+ die "cannot parse"
+ }
+ my $arch = $cdata->{"Build-Architecture"};
+ if (not defined($arch)) {
+ die "need Build-Architecture field"
+ }
+ my $environ = $cdata->{"Installed-Build-Depends"};
+ if (not defined($environ)) {
+        die "need Installed-Build-Depends field"
+ }
+ close $fh;
+
+ $environ =~ s/^\s+|\s+$//g;
+ my @environ = ();
+ foreach my $dep (split(/\s*,\s*/m, $environ)) {
+ my $pkg = Dpkg::Deps::Simple->new($dep);
+ if (not defined($pkg->{package})) {
+ die "name undefined";
+ }
+ if (defined($pkg->{relation})) {
+ if ($pkg->{relation} ne "=") {
+ die "wrong relation";
+ }
+ if (not defined($pkg->{version})) {
+ die "version undefined"
+ }
+ } else {
+ die "no version";
+ }
+ push @environ, { name => $pkg->{package},
+ architecture => ( $pkg->{archqual} || $arch ),
+ version => $pkg->{version}
+ };
+ }
+
+ return $arch, @environ
+}
+
+sub chroot_setup {
+ my $buildinfo = shift;
+ my @timestamps = @_;
+
+ my ($arch, @environ) = parse_buildinfo $buildinfo;
+
+ @environ = map { "$_->{name}:$_->{architecture}=$_->{version}" } @environ;
+
+ my $fh;
+ open $fh, '>', '/etc/apt/apt.conf.d/80no-check-valid-until';
+ print $fh 'Acquire::Check-Valid-Until "false";';
+ close $fh;
+
+ open $fh, '>', '/etc/apt/apt.conf.d/99no-install-recommends';
+ print $fh 'APT::Install-Recommends "0";';
+ close $fh;
+
+ my $debug_content = <<'END_MSG';
+Debug {
+ pkgAutoRemove "true";
+ pkgDepCache {
+ AutoInstall "true";
+ Marker "true";
+ };
+ pkgProblemResolver "true";
+};
+END_MSG
+ open $fh, '>', '/etc/apt/apt.conf.d/11debug-apt';
+ print $fh $debug_content;
+ close $fh;
+
+ system_fatal "apt-get", "update";
+ system_fatal "apt-get", "--yes", "--allow-downgrades", "--allow-remove-essential", "install", @environ;
+}
+
+sub starting_build {
+ my $buildinfo = shift;
+
+ my ($arch, @environ) = parse_buildinfo $buildinfo;
+
+ @environ = map { "$_->{name}:$_->{architecture}=$_->{version}" } @environ;
+
+ open my $fh, '-|', 'dpkg-query --show --showformat \'${Package}:${Architecture}=${Version}\n\'';
+ my @installed = ();
+ while (my $line = <$fh>) {
+ chomp $line;
+ # make arch:all packages build-arch packages
+ $line =~ s/:all=/:$arch=/;
+ push @installed, $line;
+ }
+
+ foreach my $dep (@environ) {
+ if (none {$_ eq $dep} @installed) {
+ die "require $dep to be installed but it is not";
+ }
+ }
+ print "srebuild: all packages are in the correct version\n"
+}
+
+my $mode = shift @ARGV;
+if (not defined($mode)) {
+ die "need mode argument";
+}
+
+my $buildinfo = shift @ARGV;
+if (not defined($buildinfo)) {
+ die "need buildinfo filename";
+}
+
+if ($mode eq "chroot-setup") {
+ chroot_setup $buildinfo;
+} elsif ($mode eq "starting-build") {
+ starting_build $buildinfo;
+} else {
+ die "invalid mode: $mode";
+}
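[Editor's note] The starting-build check above can be sketched in Python (illustration only): normalize installed `arch:all` entries to the build architecture, as the hook's `s/:all=/:$arch=/` substitution does, then require every pinned dependency to be present.

```python
def verify_pins(expected, installed, arch):
    """Return the pinned name:arch=version entries that are not installed.

    `installed` entries follow the dpkg-query --showformat output
    '${Package}:${Architecture}=${Version}'; arch:all packages are
    rewritten to the build architecture before comparing.
    """
    normalized = {line.replace(":all=", ":%s=" % arch) for line in installed}
    return [dep for dep in expected if dep not in normalized]
```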
=====================================
external_vars.yml
=====================================
@@ -0,0 +1,9 @@
+---
+build_gpg_user: root
+build_gpg_realname: "foo bar"
+build_gpg_email: "foo@localhost"
+main_template_enable: true
+http_template_enable: true
+
+rebuilder_publish:
+- http://127.0.0.1/new_build
=====================================
hosts
=====================================
@@ -0,0 +1,5 @@
+[builders]
+builder1
+
+[visualizers]
+visualizer1
\ No newline at end of file
=====================================
playbook.yml
=====================================
@@ -1,18 +1,48 @@
-- name: setup-reproducer
- hosts: reproducer
- remote_user: "{{ login_username }}"
+- name: Debugging Stuff (this is needed to obtain the ipv4 addresses once)
+ hosts: all
+ tasks:
+ - debug: var=hostvars[inventory_hostname]['ansible_default_ipv4']['address']
+
+- name: Setup Builders
+ hosts: builders
become: yes
become_user: root
become_method: sudo
+ vars_files:
+ - external_vars.yml
+ tags:
+ - builders
+ roles:
+ - builders
+ # The gpgkey generation role had to be disabled because it was taking a lot of time. I'm not sure we are supposed to do this.
+ - { role: gpg, gpg_user: "{{ build_gpg_user }}", gpg_realname: "{{ build_gpg_realname }}", gpg_useremail: "{{ build_gpg_email }}" , gpg_generator_user: "root", gpg_home: "/root" }
- tasks:
+- name: Setup Scheduler
+ hosts: schedulers
+ become: yes
+ become_user: root
+ become_method: sudo
+ vars_files:
+ - external_vars.yml
+ tags:
+ - schedulers
+ roles:
+ - schedulers
- - user:
- name: rebuilder
- comment: "Unprivileged account to rebuild packages"
- uid: 1001
+- name: Setup Visualizers
+ hosts: visualizers
+ become: yes
+ become_user: root
+ become_method: sudo
+ vars_files:
+ - external_vars.yml
+ tags:
+ - visualizers
+ pre_tasks:
+ - name: Install dirmngr for nginx
+ apt:
+ name: "dirmngr"
+ roles:
+ - role: nginxinc.nginx
+ - visualizers
- - name: Install pbuilder
- apt:
- name: pbuilder
- state: latest
=====================================
requirements.yml
=====================================
@@ -0,0 +1 @@
+- src: nginxinc.nginx
=====================================
roles/builders/tasks/main.yml
=====================================
@@ -0,0 +1,40 @@
+- name: Install all dependencies
+ apt:
+ name:
+ - sbuild
+ - libdatetime-format-strptime-perl
+ - libwww-perl
+ - libjson-perl
+ - gnupg2
+ - curl
+ - python-pip
+ - haveged
+
+- name: Install in-toto
+ pip:
+ name:
+ - in-toto
+ - colorama
+
+- name: Write endpoints that collect build data
+ template:
+ src: srebuild-endpoints.j2
+ dest: /etc/srebuild-endpoints
+
+- name: Copy srebuild
+ copy:
+ src: ../../../builder/srebuild
+ dest: /usr/bin/srebuild
+
+- name: Copy srebuild-hook
+ copy:
+ src: ../../../builder/srebuild-hook
+ dest: /usr/lib/srebuild-hook
+
+- name: Set permissions
+ file:
+ path: "{{ item }}"
+ mode: 0755
+ with_items:
+ - /usr/bin/srebuild
+ - /usr/lib/srebuild-hook
=====================================
roles/builders/templates/srebuild-endpoints.j2
=====================================
@@ -0,0 +1,3 @@
+{% for endpoint in rebuilder_publish %}
+{{ endpoint }}
+{% endfor %}
=====================================
roles/gpg/defaults/main.yml
=====================================
@@ -0,0 +1,11 @@
+---
+#gpg_generator_user: "{{ ansible_ssh_user }}"
+gpg_generator_user: "myuser"
+## Note: gpg_home is the path of user generating keys, it could be gpg_user or different.
+## it's both keys destination and home path for .gnupg dir
+gpg_home: "/home/{{ gpg_generator_user }}"
+
+gpg_user: "{{ ansible_ssh_user }}"
+gpg_realname: "GPG Ansible user"
+#gpg_userhome:
+gpg_useremail: "{{ gpg_user }}@localhost"
=====================================
roles/gpg/tasks/main.yml
=====================================
@@ -0,0 +1,11 @@
+- name: Copy gpg keygen config
+ template:
+ src: gpg-keygen.j2
+ dest: "{{ gpg_home }}/gpg-keygen"
+
+- name: Generate gpg key
+ shell: "gpg --no-tty --batch --gen-key < ~/gpg-keygen"
+ args:
+ creates: "{{ gpg_home }}/.gnupg/pubring.kbx"
+ become: yes
+ become_user: "{{ gpg_generator_user }}"
=====================================
roles/gpg/templates/gpg-keygen.j2
=====================================
@@ -0,0 +1,9 @@
+Key-Type: RSA
+Key-Length: 4096
+Key-Usage: sign
+Name-Real: {{ ansible_hostname }}
+Name-Comment: Automatically generated key for signing .buildinfo files
+Expire-Date: 0
+%no-ask-passphrase
+%no-protection
+%commit
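[Editor's note] The template above renders a batch file for gpg's unattended key generation, consumed by the gpg role's `gpg --no-tty --batch --gen-key` task. A small Python sketch of rendering the same parameters (illustrative only; `render_gpg_batch` is not part of this repo):

```python
def render_gpg_batch(realname, comment, key_length=4096):
    """Render an unattended-keygen batch file like gpg-keygen.j2.

    Mirrors the template: an RSA signing key with no expiry and no
    passphrase, suitable for `gpg --batch --gen-key`.
    """
    return "\n".join([
        "Key-Type: RSA",
        "Key-Length: %d" % key_length,
        "Key-Usage: sign",
        "Name-Real: %s" % realname,
        "Name-Comment: %s" % comment,
        "Expire-Date: 0",
        "%no-ask-passphrase",
        "%no-protection",
        "%commit",
    ]) + "\n"
```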
=====================================
roles/schedulers/tasks/main.yml
=====================================
@@ -0,0 +1,24 @@
+- name: Install all dependencies
+ apt:
+ name:
+ - redis-server
+ - python3-redis
+ - python3-requests
+
+- name: Copy scheduler-worker
+ copy:
+ src: ../../../scheduler/srebuild-worker
+ dest: /usr/bin/srebuild-worker
+
+- name: Copy scheduler-monitor
+ copy:
+ src: ../../../scheduler/srebuild-monitor
+ dest: /usr/bin/srebuild-monitor
+
+- name: Set permissions
+ file:
+ path: "{{ item }}"
+ mode: 0755
+ with_items:
+ - /usr/bin/srebuild-worker
+ - /usr/bin/srebuild-monitor
=====================================
roles/visualizers/files/default.conf
=====================================
@@ -0,0 +1,31 @@
+server {
+ listen 80;
+
+ server_name _;
+
+ access_log /var/log/nginx/access.log;
+ error_log /var/log/nginx/error.log;
+
+ location / {
+ proxy_pass http://127.0.0.1:8000;
+ proxy_redirect off;
+
+ proxy_set_header Host $host;
+ proxy_set_header X-Real-IP $remote_addr;
+ proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+ proxy_set_header X-Forwarded-Proto $scheme;
+ }
+
+ location /new_build {
+ proxy_pass http://127.0.0.1:4000;
+ proxy_redirect off;
+
+ allow 127.0.0.1;
+ deny all;
+
+ proxy_set_header Host $host;
+ proxy_set_header X-Real-IP $remote_addr;
+ proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+ proxy_set_header X-Forwarded-Proto $scheme;
+ }
+}
=====================================
roles/visualizers/files/gunicorn-accumulator.service
=====================================
@@ -0,0 +1,10 @@
+[Unit]
+Description=Gunicorn server for Accumulator
+
+[Service]
+WorkingDirectory=/var/accumulator
+Restart=on-failure
+ExecStart=/usr/local/bin/gunicorn --bind 127.0.0.1:4000 accumulator:app
+
+[Install]
+WantedBy=multi-user.target
=====================================
roles/visualizers/files/gunicorn-visualizer.service
=====================================
@@ -0,0 +1,10 @@
+[Unit]
+Description=Gunicorn server for Visualizer
+
+[Service]
+WorkingDirectory=/var/visualizer
+Restart=on-failure
+ExecStart=/usr/local/bin/gunicorn --bind 127.0.0.1:8000 visualizer:app
+
+[Install]
+WantedBy=multi-user.target
=====================================
roles/visualizers/handlers/main.yml
=====================================
@@ -0,0 +1,13 @@
+- name: restart gunicorn-accumulator
+  systemd:
+    name: gunicorn-accumulator
+    daemon_reload: yes
+    enabled: yes
+    state: restarted
+
+- name: restart gunicorn-visualizer
+  systemd:
+    name: gunicorn-visualizer
+    daemon_reload: yes
+    enabled: yes
+    state: restarted
=====================================
roles/visualizers/tasks/main.yml
=====================================
@@ -0,0 +1,51 @@
+- name: Install pip and sqlite3
+  apt:
+    name:
+      - python-pip
+      - sqlite3
+
+- name: Create necessary folders
+  file: path="{{ item }}" state=directory
+  with_items:
+    - /var/builds
+    - /var/accumulator
+    - /var/visualizer
+
+- name: Copy all files
+  copy: src={{ item.src }} dest={{ item.dest }}
+  with_items:
+    - { src: ../../../visualizer/accumulator.py, dest: /var/accumulator/accumulator.py }
+    - { src: ../../../visualizer/visualizer.py, dest: /var/visualizer/visualizer.py }
+    - { src: ../../../visualizer/requirements.txt, dest: /var/requirements.txt }
+    - { src: ../../../visualizer/templates, dest: /var/visualizer/ }
+    - { src: ../../../visualizer/schema.sql, dest: /var/schema.sql }
+
+- name: Generate DB
+  shell: sqlite3 /var/rebuilder.db < /var/schema.sql
+
+- name: Install Python dependencies
+  pip:
+    requirements: /var/requirements.txt
+
+- name: Run accumulator
+  copy: src=gunicorn-accumulator.service dest=/etc/systemd/system/gunicorn-accumulator.service
+  notify:
+    - restart gunicorn-accumulator
+
+- name: Run visualizer
+  copy: src=gunicorn-visualizer.service dest=/etc/systemd/system/gunicorn-visualizer.service
+  notify:
+    - restart gunicorn-visualizer
+
+- name: Copy nginx config
+  copy: src=default.conf dest=/etc/nginx/conf.d/default.conf
+  notify: "(Handler: All OSs) Reload NGINX"
+
+- name: Enable systemd services
+  systemd:
+    name: "{{ item }}"
+    state: started
+    enabled: yes
+  with_items:
+    - gunicorn-visualizer
+    - gunicorn-accumulator
=====================================
scheduler/srebuild-monitor
=====================================
@@ -0,0 +1,51 @@
+#!/usr/bin/env python3
+import requests
+import urllib.parse
+import redis
+import time
+import sys
+
+
+def main(server):
+    db = redis.StrictRedis(host='localhost', port=6379, db=0)
+
+    # try to load most_recent, default to 0
+    most_recent = db.get('rebuild-most-recent') or '0'
+    most_recent = int(most_recent)
+
+    while True:
+        print('[*] Requesting new buildinfo files')
+
+        url = urllib.parse.urljoin(server, '/api/buildinfo/since/%d' % most_recent)
+        r = requests.get(url)
+        r.raise_for_status()
+        buildinfos = r.json()['buildinfos']
+
+        for buildinfo in buildinfos:
+            raw_uri = buildinfo['raw-uri']
+            print('[+] Adding %r' % raw_uri)
+
+            # request buildinfo file
+            r = requests.get(raw_uri)
+            r.raise_for_status()
+
+            # add buildinfo to queue
+            db.rpush('rebuild-q', r.text)
+
+            # update most_recent
+            most_recent = buildinfo['created']
+
+            # store most_recent
+            db.set('rebuild-most-recent', most_recent)
+
+        if not buildinfos:
+            print('[*] Sleeping zZz')
+            time.sleep(10)
+
+
+if __name__ == '__main__':
+    if len(sys.argv) < 2:
+        print('Usage: %s http://buildinfo.nyu.wtf/' % sys.argv[0])
+    else:
+        main(sys.argv[1])
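The monitor's cursor handling can be exercised without a live redis or buildinfo server. This is a minimal sketch (names and sample records hypothetical) that models the queue as a plain list: each record's URI is queued and the cursor advances to the newest `created` timestamp seen, so the next `/api/buildinfo/since/<cursor>` request only returns newer entries.

```python
# Hypothetical stand-in for the monitor's cursor logic: queue each record
# and advance the cursor past it.
def process_batch(buildinfos, most_recent, queue):
    for buildinfo in buildinfos:
        queue.append(buildinfo['raw-uri'])
        most_recent = buildinfo['created']
    return most_recent

queue = []
batch = [
    {'raw-uri': 'https://example.org/a.buildinfo', 'created': 11},
    {'raw-uri': 'https://example.org/b.buildinfo', 'created': 12},
]
cursor = process_batch(batch, 0, queue)
print(cursor)       # 12
print(len(queue))   # 2
```

An empty batch leaves the cursor untouched, which is what lets the monitor sleep and retry with the same `since` value.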
=====================================
scheduler/srebuild-worker
=====================================
@@ -0,0 +1,31 @@
+#!/usr/bin/env python3
+import redis
+import tempfile
+import subprocess
+
+
+def rebuild(buildinfo):
+    # read the list eagerly: a lazy filter would outlive the closed file
+    with open('/etc/srebuild-endpoints') as f:
+        endpoints = [x.strip() for x in f if x.strip()]
+
+    with tempfile.NamedTemporaryFile() as f:
+        f.write(buildinfo)
+        f.flush()  # ensure srebuild sees the complete file on disk
+        cmd = ['srebuild', f.name] + endpoints
+        print('[+] invoking %r' % cmd)
+        rc = subprocess.call(cmd)
+        if rc != 0:
+            # TODO: we should handle this properly
+            print('[!] srebuild returned an error')
+
+
+def main():
+    r = redis.StrictRedis(host='localhost', port=6379, db=0)
+    print('[*] monitoring queue')
+    while True:
+        q, item = r.blpop('rebuild-q')
+        print('[*] rebuilding %r' % item)
+        rebuild(item)
+
+
+if __name__ == '__main__':
+    main()
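The worker's temp-file handoff is worth spelling out: the buildinfo bytes must be flushed before an external process opens `f.name`, or the child may see a truncated file. A self-contained sketch (the buildinfo content is hypothetical, and a Python child process stands in for srebuild):

```python
import subprocess
import sys
import tempfile

buildinfo = b'Source: hello\nVersion: 2.10-1\n'

with tempfile.NamedTemporaryFile() as f:
    f.write(buildinfo)
    f.flush()  # push buffered bytes to disk before the child opens f.name
    # child process standing in for srebuild, reading the file by name
    out = subprocess.run(
        [sys.executable, '-c',
         'import sys; sys.stdout.buffer.write(open(sys.argv[1], "rb").read())',
         f.name],
        capture_output=True, check=True)

print(out.stdout == buildinfo)  # True
```

Without the `flush()` the bytes may still sit in Python's userspace buffer when the child opens the path, which is exactly the failure mode the worker would otherwise hit with large buildinfo files.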
=====================================
visualizer/README.md
=====================================
@@ -0,0 +1,36 @@
+Visualizer
+======
+
+This module consists of two web servers:
+* Accumulator: receives information about builds as they happen and stores it. It is meant to be exposed only inside the network, for the other modules to communicate with.
+* Visualizer: the only external interface to the whole rebuilder setup; a webserver which exposes the builds along with their metadata and buildinfo files.
+
+## Accumulator
+
+API Endpoint: `/new_build`
+
+Form parameters: metadata (File) and buildinfo (File)
+
+## Visualizer
+
+API Endpoint: `/sources`
+
+Get the names of all source packages ever rebuilt on this infrastructure.
+
+---
+
+API Endpoint: `/sources/<source>`
+
+List all versions of a particular source package built on this infrastructure.
+
+---
+
+API Endpoint: `/sources/<source>/<version>/metadata`
+
+Get metadata for a particular source version.
+
+---
+
+API Endpoint: `/sources/<source>/<version>/buildinfo`
+
+Get buildinfo for a particular source version.
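Clients of the visualizer need to URL-encode the source and version path segments, since Debian versions with an epoch (e.g. `1:2.10-1`) contain a colon. A small hypothetical helper (names and hostname are illustrative, not part of this repo) built on the stdlib:

```python
from urllib.parse import quote, urljoin

# Hypothetical helper: build the buildinfo URL for a given source/version,
# percent-encoding both path segments (epochs contain ':').
def buildinfo_url(base, source, version):
    path = '/sources/%s/%s/buildinfo' % (quote(source, safe=''),
                                         quote(version, safe=''))
    return urljoin(base, path)

print(buildinfo_url('https://rebuilder.example.org/', 'hello', '1:2.10-1'))
# https://rebuilder.example.org/sources/hello/1%3A2.10-1/buildinfo
```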
=====================================
visualizer/accumulator.py
=====================================
@@ -0,0 +1,56 @@
+from debian.deb822 import Deb822
+from flask import g, request, Flask
+from os import makedirs
+from os.path import join
+from sqlite3 import connect
+from time import time
+
+app = Flask(__name__)
+
+DIR = '/var/builds/'
+DATABASE = '/var/rebuilder.db'
+
+
+def get_db():
+    db = getattr(g, '_database', None)
+    if db is None:
+        db = g._database = connect(DATABASE)
+    return db
+
+
+@app.teardown_appcontext
+def close_connection(exception):
+    db = getattr(g, '_database', None)
+    if db is not None:
+        db.close()
+
+
+@app.route('/new_build', methods=['POST'])
+def new_build():
+    metadata = request.files['metadata']
+    buildinfo = request.files['buildinfo']
+    source = None
+    version = None
+    for paragraph in Deb822.iter_paragraphs(buildinfo):
+        for key, value in paragraph.items():
+            if key == 'Source':
+                source = value
+            if key == 'Version':
+                version = value
+    buildinfo.seek(0)
+    folder_name = '%s-%s' % (source, version)
+    directory = join(DIR, folder_name)
+    # exist_ok: a re-submitted source/version must not crash the accumulator
+    makedirs(directory, exist_ok=True)
+    timestamp = time()
+    metadata.save(join(directory, metadata.filename))
+    buildinfo.save(join(directory, buildinfo.filename))
+    db = get_db()
+    c = db.cursor()
+
+    c.execute('INSERT INTO BUILDS VALUES (?, ?, ?, ?, ?)',
+              (source, version, timestamp,
+               metadata.filename, buildinfo.filename))
+
+    db.commit()
+    return 'OK'
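The Source/Version extraction in `new_build` can be sketched with only the standard library: a `.buildinfo` paragraph is an RFC 822-style block of `Key: value` lines, so the stdlib email parser can pull out single-line fields. The sample buildinfo text below is hypothetical.

```python
from email.parser import Parser

# A .buildinfo paragraph is a block of 'Key: value' lines; the stdlib
# RFC 822 parser handles single-line fields like Source and Version.
sample = '''Format: 1.0
Source: hello
Version: 2.10-1
Architecture: amd64
'''

fields = Parser().parsestr(sample)
source = fields['Source']
version = fields['Version']
print(source, version)  # hello 2.10-1
```

In the real accumulator `Deb822` is the right tool, since it also understands multi-line fields (checksum lists, `Installed-Build-Depends`) and GPG-signed stanzas, which this sketch does not.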
=====================================
visualizer/requirements.txt
=====================================
@@ -0,0 +1,3 @@
+flask
+gunicorn
+python-debian
=====================================
visualizer/schema.sql
=====================================
@@ -0,0 +1,7 @@
+CREATE TABLE IF NOT EXISTS BUILDS(
+ source TEXT,
+ version TEXT,
+ timestamp INTEGER,
+ metadata TEXT,
+ buildinfo TEXT
+);
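The schema and the accumulator's positional `INSERT` can be exercised against an in-memory database to check they agree on column count and order (the inserted values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
CREATE TABLE IF NOT EXISTS BUILDS(
  source TEXT,
  version TEXT,
  timestamp INTEGER,
  metadata TEXT,
  buildinfo TEXT
);
''')

# the same positional INSERT the accumulator issues
conn.execute('INSERT INTO BUILDS VALUES (?, ?, ?, ?, ?)',
             ('hello', '2.10-1', 1541455814,
              'hello.metadata', 'hello.buildinfo'))
conn.commit()

# the lookup the visualizer performs per source/version
row = conn.execute('SELECT buildinfo FROM BUILDS WHERE source=? AND version=?',
                   ('hello', '2.10-1')).fetchone()
print(row[0])  # hello.buildinfo
```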
=====================================
visualizer/templates/all_sources.html
=====================================
@@ -0,0 +1,15 @@
+<!DOCTYPE html>
+<html>
+  <head>
+    <title> All sources </title>
+  </head>
+  <body>
+    <ul>
+      {% for result in results %}
+      <li>
+        <a href="/sources/{{ result[0] }}"> {{ result[0] }} </a>
+      </li>
+      {% endfor %}
+    </ul>
+  </body>
+</html>
=====================================
visualizer/templates/all_versions_of_source.html
=====================================
@@ -0,0 +1,25 @@
+<!DOCTYPE html>
+<html>
+  <head>
+    <title> {{ results[0][0] }} </title>
+  </head>
+  <body>
+    <table>
+      <tr>
+        <th> Version </th>
+        <th> Timestamp </th>
+        <th> Buildinfo </th>
+        <th> in-toto metadata </th>
+      </tr>
+      {% for result in results %}
+      <tr>
+        <td> {{ result[1] }} </td>
+        <td> {{ result[2] }} </td>
+        <td> <a href="/sources/{{ result[0] }}/{{ result[1] }}/buildinfo"> Link </a> </td>
+        <td> <a href="/sources/{{ result[0] }}/{{ result[1] }}/metadata"> Link </a> </td>
+      </tr>
+      {% endfor %}
+    </table>
+  </body>
+</html>
+
=====================================
visualizer/visualizer.py
=====================================
@@ -0,0 +1,80 @@
+from flask import g, render_template, Flask, Response
+from os.path import join
+from sqlite3 import connect
+
+app = Flask(__name__)
+
+DIR = '/var/builds/'
+DATABASE = '/var/rebuilder.db'
+
+
+def get_db():
+    db = getattr(g, '_database', None)
+    if db is None:
+        db = g._database = connect(DATABASE)
+    return db
+
+
+@app.teardown_appcontext
+def close_connection(exception):
+    db = getattr(g, '_database', None)
+    if db is not None:
+        db.close()
+
+
+@app.route('/sources')
+def all_sources():
+    db = get_db()
+    c = db.cursor()
+    # DISTINCT: a source rebuilt several times should be listed once
+    c.execute('SELECT DISTINCT source FROM BUILDS')
+    results = c.fetchall()
+    return render_template('all_sources.html', results=results)
+
+
+@app.route('/sources/<source>')
+def all_versions_of_source(source):
+    db = get_db()
+    c = db.cursor()
+    c.execute('SELECT * FROM BUILDS WHERE source=?', (source,))
+    results = c.fetchall()
+    if len(results) == 0:
+        return ('Not Found', 404, {})
+    return render_template('all_versions_of_source.html', results=results)
+
+
+@app.route('/sources/<source>/<version>/metadata')
+def get_metadata(source, version):
+    db = get_db()
+    c = db.cursor()
+    c.execute('SELECT metadata FROM BUILDS WHERE source=? AND version=?',
+              (source, version))
+    metadata = c.fetchone()
+    if metadata is None:
+        return ('Not Found', 404, {})
+    folder_name = '%s-%s' % (source, version)
+    directory = join(DIR, folder_name)
+    with open(join(directory, metadata[0])) as f:
+        content = f.read()
+    return Response(content, mimetype='text/plain',
+                    headers={
+                        'Content-Disposition': 'attachment; filename="' +
+                        metadata[0] + '"'
+                    })
+
+
+@app.route('/sources/<source>/<version>/buildinfo')
+def get_buildinfo(source, version):
+    db = get_db()
+    c = db.cursor()
+    c.execute('SELECT buildinfo FROM BUILDS WHERE source=? AND version=?',
+              (source, version))
+    buildinfo = c.fetchone()
+    if buildinfo is None:
+        return ('Not Found', 404, {})
+    folder_name = '%s-%s' % (source, version)
+    directory = join(DIR, folder_name)
+    with open(join(directory, buildinfo[0])) as f:
+        content = f.read()
+    return Response(content, mimetype='text/plain',
+                    headers={
+                        'Content-Disposition': 'attachment; filename="' +
+                        buildinfo[0] + '"'
+                    })
View it on GitLab: https://salsa.debian.org/reproducible-builds/debian-rebuilder-setup/compare/7b322e68a2a313d0379dc832b706cbe57440f1d0...beae8c31d8441a96a48c421fed15309a2d532419