<HTML><BODY><div><div><div>The patch introduces preparations for changing the package deployment process:<br>1. An S3-compatible storage will be used instead of the PackageCloud storage. </div><div>The main reason for this change is that the PackageCloud repository is</div><div>limited in space, which is not enough to keep all packages.</div><div>2. Two new root repositories are introduced:</div><div>* "live" for packages from master and release branches.</div><div>* "release" for packages from tagged commits.</div><div>3. Developer branches with the "-full-ci" suffix build packages but do not deploy them.</div><div> </div><div>For these purposes a standalone script that pushes DEB and RPM <br>packages to the new S3 storage is used. It uses the 'createrepo' utility <br>for RPM packages and the 'reprepro' utility for DEB packages.</div><div> </div><div>The script implements the following flow:<br>* creates new metafiles for the new packages<br>* fetches the relevant metafiles from the storage</div><div>* copies the new packages to the S3 storage<br>* merges and pushes the updated metadata to the S3 storage</div></div><div> </div><blockquote style="border-left:1px solid #0857A6; margin:10px; padding:0 0 0 10px;">Friday, January 31, 2020, 9:00 +03:00 from Alexander V. Tikhonov <avtikhon@tarantool.org>:<br> <div id=""><div class="js-helper js-readmsg-msg"><style type="text/css"></style><div><div id="style_15804504121752432211_BODY">The changes introduce new Gitlab-CI rules for creating packages on<br>branches with the "-full-ci" suffix and their subsequent deployment to the<br>'live' repository for master and release branches. 
Packages for tagged<br>commits are also delivered to the corresponding 'release' repository.</div></div></div></div></blockquote></div><div><blockquote style="border-left:1px solid #0857A6; margin:10px; padding:0 0 0 10px;"><div><div class="js-helper js-readmsg-msg"><div><div>Package creation activities are relocated from the PackageCloud storage<br>to the new self-hosted MCS S3, where all old packages have been synced.<br>The benefits of the introduced approach are the following:<br>* Since all contents of the self-hosted repos are fully controlled,<br>their layout is the same as the one provided by the corresponding<br>distro<br>* A repo metadata rebuild is unnecessary given the known repo layout<br>* Old packages are not pruned since they do not affect the repo<br>metadata rebuilding time<br><br>For these purposes a standalone script for pushing DEB and RPM</div><div id="style_15804504121752432211_BODY">packages to the self-hosted repositories is introduced. The script</div><div id="style_15804504121752432211_BODY">implements the following flow:</div><div id="style_15804504121752432211_BODY">* creates new metafiles for the new packages</div><div id="style_15804504121752432211_BODY">* copies the new packages to the S3 storage</div><div id="style_15804504121752432211_BODY">* fetches the relevant metafiles from the repo</div><div id="style_15804504121752432211_BODY">* merges the new metadata with the fetched one</div><div id="style_15804504121752432211_BODY">* pushes the updated metadata to the S3 storage</div><div id="style_15804504121752432211_BODY"><br>There are OS distribution dependent parts in the script:<br>* for RPM packages it updates the metadata separately for each OS<br>distribution, following the 'createrepo' util behavior<br>* for DEB packages it updates the metadata simultaneously for all<br>distributions within a single OS, following the 'reprepro' util behavior<br><br>Closes #3380<br>---<br><br>Github: <a 
href="https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci" target="_blank">https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci</a><br>Issue: <a href="https://github.com/tarantool/tarantool/issues/3380" target="_blank">https://github.com/tarantool/tarantool/issues/3380</a><br><br>v7: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013872.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013872.html</a><br>v6: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013763.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013763.html</a><br>v5: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013636.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013636.html</a><br>v4: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html</a><br>v3: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html</a><br>v2: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html</a><br>v1: <a href="https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html" target="_blank">https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html</a><br><br>Changes v8:<br>- corrected commit message<br>- removed extra check for bucket naming<br>- changed bucket naming from underscore to dot, like: 2_2 to 2.x<br>- changed git tag checking routine as suggested<br><br>Changes v7:<br>- removed additional 
functionality for working with DEB repositories<br> using the complete pool path w/o specifying packages<br>- implemented new flag '-f|--force' that helps to overwrite the packages<br> at MCS S3 if their checksum changed - implemented a check on the new<br> packages for it<br>- implemented a check with a warning on the new RPM packages with the same checksum<br><br>Changes v6:<br>- implemented 2 MCS S3 repositories 'live' and 'release'<br>- added AWS and GPG keys into Gitlab-CI<br>- corrected commit message<br>- corrected return functionality code in script<br>- moved all changes for source tarballs to a standalone patch set<br><br>Changes v5:<br>- code style<br>- commits squashed<br>- rebased to master<br><br>Changes v4:<br>- minor corrections<br><br>Changes v3:<br>- common code parts merged to standalone routines<br>- corrected code style, minor updates<br>- script is ready for release<br><br>Changes v2:<br>- made changes in script from draft to pre-release stages<br><br> .gitlab-ci.yml | 152 ++++++++++--<br> .gitlab.mk | 20 +-<br> tools/update_repo.sh | 571 +++++++++++++++++++++++++++++++++++++++++++<br> 3 files changed, 723 insertions(+), 20 deletions(-)<br> create mode 100755 tools/update_repo.sh<br><br>diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml<br>index 3af5a3c8a..c68594c1a 100644<br>--- a/.gitlab-ci.yml<br>+++ b/.gitlab-ci.yml<br>@@ -10,6 +10,10 @@ variables:<br> only:<br> refs:<br> - master<br>+<br>+.fullci_only_template: &fullci_only_definition<br>+ only:<br>+ refs:<br> - /^.*-full-ci$/<br> <br> .docker_test_template: &docker_test_definition<br>@@ -24,13 +28,29 @@ variables:<br> tags:<br> - docker_test<br> <br>+.pack_template: &pack_definition<br>+ <<: *fullci_only_definition<br>+ stage: test<br>+ tags:<br>+ - deploy<br>+ script:<br>+ - ${GITLAB_MAKE} package<br>+<br>+.pack_test_template: &pack_test_definition<br>+ <<: *fullci_only_definition<br>+ stage: test<br>+ tags:<br>+ - deploy_test<br>+ script:<br>+ - ${GITLAB_MAKE} package<br>+<br> .deploy_template: 
&deploy_definition<br> <<: *release_only_definition<br> stage: test<br> tags:<br> - deploy<br> script:<br>- - ${GITLAB_MAKE} package<br>+ - ${GITLAB_MAKE} deploy<br> <br> .deploy_test_template: &deploy_test_definition<br> <<: *release_only_definition<br>@@ -38,7 +58,7 @@ variables:<br> tags:<br> - deploy_test<br> script:<br>- - ${GITLAB_MAKE} package<br>+ - ${GITLAB_MAKE} deploy<br> <br> .vbox_template: &vbox_definition<br> stage: test<br>@@ -141,96 +161,194 @@ freebsd_12_release:<br> # Packs<br> <br> centos_6:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'el'<br> DIST: '6'<br> <br> centos_7:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'el'<br> DIST: '7'<br> <br> centos_8:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'el'<br> DIST: '8'<br> <br> fedora_28:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'fedora'<br> DIST: '28'<br> <br> fedora_29:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'fedora'<br> DIST: '29'<br> <br> fedora_30:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'fedora'<br> DIST: '30'<br> <br> fedora_31:<br>- <<: *deploy_test_definition<br>+ <<: *pack_test_definition<br> variables:<br> OS: 'fedora'<br> DIST: '31'<br> <br> ubuntu_14_04:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'ubuntu'<br> DIST: 'trusty'<br> <br> ubuntu_16_04:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'ubuntu'<br> DIST: 'xenial'<br> <br> ubuntu_18_04:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'ubuntu'<br> DIST: 'bionic'<br> <br> ubuntu_18_10:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'ubuntu'<br> DIST: 'cosmic'<br> <br> ubuntu_19_04:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 
'ubuntu'<br> DIST: 'disco'<br> <br> ubuntu_19_10:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'ubuntu'<br> DIST: 'eoan'<br> <br> debian_8:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'debian'<br> DIST: 'jessie'<br> <br> debian_9:<br>- <<: *deploy_definition<br>+ <<: *pack_definition<br> variables:<br> OS: 'debian'<br> DIST: 'stretch'<br> <br> debian_10:<br>+ <<: *pack_definition<br>+ variables:<br>+ OS: 'debian'<br>+ DIST: 'buster'<br>+<br>+# Deploy<br>+<br>+centos_6_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'el'<br>+ DIST: '6'<br>+<br>+centos_7_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'el'<br>+ DIST: '7'<br>+<br>+centos_8_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'el'<br>+ DIST: '8'<br>+<br>+fedora_28_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'fedora'<br>+ DIST: '28'<br>+<br>+fedora_29_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'fedora'<br>+ DIST: '29'<br>+<br>+fedora_30_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'fedora'<br>+ DIST: '30'<br>+<br>+fedora_31_deploy:<br>+ <<: *deploy_test_definition<br>+ variables:<br>+ OS: 'fedora'<br>+ DIST: '31'<br>+<br>+ubuntu_14_04_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'trusty'<br>+<br>+ubuntu_16_04_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'xenial'<br>+<br>+ubuntu_18_04_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'bionic'<br>+<br>+ubuntu_18_10_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'cosmic'<br>+<br>+ubuntu_19_04_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'disco'<br>+<br>+ubuntu_19_10_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'ubuntu'<br>+ DIST: 'eoan'<br>+<br>+debian_8_deploy:<br>+ <<: *deploy_definition<br>+ 
variables:<br>+ OS: 'debian'<br>+ DIST: 'jessie'<br>+<br>+debian_9_deploy:<br>+ <<: *deploy_definition<br>+ variables:<br>+ OS: 'debian'<br>+ DIST: 'stretch'<br>+<br>+debian_10_deploy:<br> <<: *deploy_definition<br> variables:<br> OS: 'debian'<br>diff --git a/.gitlab.mk b/.gitlab.mk<br>index 48a92e518..1f921fd6e 100644<br>--- a/.gitlab.mk<br>+++ b/.gitlab.mk<br>@@ -98,14 +98,28 @@ vms_test_%:<br> vms_shutdown:<br> VBoxManage controlvm ${VMS_NAME} poweroff<br> <br>-# ########################<br>-# Build RPM / Deb packages<br>-# ########################<br>+# ########<br>+# Packages<br>+# ########<br>+<br>+GIT_DESCRIBE=$(shell git describe HEAD)<br>+MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))<br>+MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))<br>+BUCKET="$(MAJOR_VERSION).$(MINOR_VERSION)"<br> <br> package: git_submodule_update<br> git clone <a href="https://github.com/packpack/packpack.git" target="_blank">https://github.com/packpack/packpack.git</a> packpack<br> PACKPACK_EXTRA_DOCKER_RUN_PARAMS='--network=host' ./packpack/packpack<br> <br>+deploy: package<br>+ for key in ${GPG_SECRET_KEY} ${GPG_PUBLIC_KEY} ; do \<br>+ echo ${key} | base64 -d | gpg --batch --import || true ; done<br>+ ./tools/update_repo.sh -o=${OS} -d=${DIST} \<br>+ -b="s3://tarantool_repo/live/${BUCKET}" build<br>+ git name-rev --name-only --tags --no-undefined HEAD 2>/dev/null && \<br>+ ./tools/update_repo.sh -o=${OS} -d=${DIST} \<br>+ -b="s3://tarantool_repo/release/${BUCKET}" build<br>+<br> # ############<br> # Static build<br> # ############<br>diff --git a/tools/update_repo.sh b/tools/update_repo.sh<br>new file mode 100755<br>index 000000000..00e7aa0c6<br>--- /dev/null<br>+++ b/tools/update_repo.sh<br>@@ -0,0 +1,571 @@<br>+#!/bin/bash<br>+set -e<br>+<br>+rm_file='rm -f'<br>+rm_dir='rm -rf'<br>+mk_dir='mkdir -p'<br>+ws_prefix=/tmp/tarantool_repo_s3<br>+<br>+alloss='ubuntu debian el fedora'<br>+product=tarantool<br>+force=<br>+# the path with binaries either 
repository<br>+repo=.<br>+<br>+# AWS defines<br>+aws="aws --endpoint-url ${AWS_S3_ENDPOINT_URL:-https://hb.bizmrg.com} s3"<br>+aws_cp_public="$aws cp --acl public-read"<br>+aws_sync_public="$aws sync --acl public-read"<br>+<br>+function get_os_dists {<br>+ os=$1<br>+ alldists=<br>+<br>+ if [ "$os" == "ubuntu" ]; then<br>+ alldists='trusty xenial bionic cosmic disco eoan'<br>+ elif [ "$os" == "debian" ]; then<br>+ alldists='jessie stretch buster'<br>+ elif [ "$os" == "el" ]; then<br>+ alldists='6 7 8'<br>+ elif [ "$os" == "fedora" ]; then<br>+ alldists='27 28 29 30 31'<br>+ fi<br>+<br>+ echo "$alldists"<br>+}<br>+<br>+function prepare_ws {<br>+ # temporarily lock the publication to the repository<br>+ ws_suffix=$1<br>+ ws=${ws_prefix}_${ws_suffix}<br>+ ws_lockfile=${ws}.lock<br>+ if [ -f $ws_lockfile ]; then<br>+ old_proc=$(cat $ws_lockfile)<br>+ fi<br>+ lockfile -l 60 $ws_lockfile<br>+ chmod u+w $ws_lockfile && echo $$ >$ws_lockfile && chmod u-w $ws_lockfile<br>+ if [ "$old_proc" != "" -a "$old_proc" != "0" ]; then<br>+ kill -9 $old_proc >/dev/null || true<br>+ fi<br>+<br>+ # create a temporary workspace for the new files<br>+ $rm_dir $ws<br>+ $mk_dir $ws<br>+}<br>+<br>+function usage {<br>+ cat <<EOF<br>+Usage for storing package binaries from the given path:<br>+ $0 -o=<OS name> -d=<OS distribution> -b=<S3 bucket> [-p=<product>] <path to package binaries><br>+<br>+Usage for mirroring Debian|Ubuntu OS repositories:<br>+ $0 -o=<OS name> -d=<OS distribution> -b=<S3 bucket> [-p=<product>] <path to package binaries><br>+<br>+Arguments:<br>+ <path><br>+ The path points to the directory with the DEB/RPM packages to be used.<br>+<br>+Options:<br>+ -b|--bucket<br>+ An already existing MCS S3 bucket which will be used for storing the packages<br>+ -o|--os<br>+ OS to be checked, one of the list:<br>+ $alloss<br>+ -d|--distribution<br>+ Distribution appropriate to the given OS:<br>+EOF<br>+ for os in $alloss ; do<br>+ echo " $os: <"$(get_os_dists $os)">"<br>+ done<br>+ cat <<EOF<br>+ 
-p|--product<br>+ Product name to be packed with, default name is 'tarantool'<br>+ -f|--force<br>+ Force updating the remote package with the local one despite the checksum difference<br>+ -h|--help<br>+ Usage help message<br>+EOF<br>+}<br>+<br>+for i in "$@"<br>+do<br>+case $i in<br>+ -b=*|--bucket=*)<br>+ bucket="${i#*=}"<br>+ shift # past argument=value<br>+ ;;<br>+ -o=*|--os=*)<br>+ os="${i#*=}"<br>+ if ! echo $alloss | grep -F -q -w $os ; then<br>+ echo "ERROR: OS '$os' is not supported"<br>+ usage<br>+ exit 1<br>+ fi<br>+ shift # past argument=value<br>+ ;;<br>+ -d=*|--distribution=*)<br>+ option_dist="${i#*=}"<br>+ shift # past argument=value<br>+ ;;<br>+ -p=*|--product=*)<br>+ product="${i#*=}"<br>+ shift # past argument=value<br>+ ;;<br>+ -f|--force)<br>+ force=1<br>+ ;;<br>+ -h|--help)<br>+ usage<br>+ exit 0<br>+ ;;<br>+ *)<br>+ repo="${i#*=}"<br>+ pushd $repo >/dev/null ; repo=$PWD ; popd >/dev/null<br>+ shift # past argument=value<br>+ ;;<br>+esac<br>+done<br>+<br>+# check that all needed options were set and correct<br>+if [ "$bucket" == "" ]; then<br>+ echo "ERROR: need to set -b|--bucket bucket option, check usage"<br>+ usage<br>+ exit 1<br>+fi<br>+if [ "$option_dist" == "" ]; then<br>+ echo "ERROR: need to set -d|--option_dist OS distribuition name option, check usage"<br>+ usage<br>+ exit 1<br>+fi<br>+if [ "$os" == "" ]; then<br>+ echo "ERROR: need to set -o|--os OS name option, check usage"<br>+ usage<br>+ exit 1<br>+fi<br>+alldists=$(get_os_dists $os)<br>+if [ -n "$option_dist" ] && ! 
echo $alldists | grep -F -q -w $option_dist ; then<br>+ echo "ERROR: the distribution '$option_dist' given in options is not in the supported list '$alldists'"<br>+ usage<br>+ exit 1<br>+fi<br>+<br>+# set the subpath with binaries based on the first character of the product name<br>+proddir=$(echo $product | head -c 1)<br>+<br>+# set the bucket path for the OS given in options<br>+bucket_path="$bucket/$os"<br>+<br>+function update_deb_packfile {<br>+ packfile=$1<br>+ packtype=$2<br>+ update_dist=$3<br>+<br>+ locpackfile=$(echo $packfile | sed "s#^$ws/##g")<br>+ # register the DEB/DSC pack file to the Packages/Sources file<br>+ reprepro -Vb . include$packtype $update_dist $packfile<br>+ # reprepro copied the DEB/DSC file to the component, which is not needed<br>+ $rm_dir $debdir/$component<br>+ # to have all sources, avoid reprepro adding the DEB/DSC file to its own registry<br>+ $rm_dir db<br>+}<br>+<br>+function update_deb_metadata {<br>+ packpath=$1<br>+ packtype=$2<br>+<br>+ if [ ! -f $packpath.saved ] ; then<br>+ # get the latest Sources file from S3 or create an empty file<br>+ $aws ls "$bucket_path/$packpath" >/dev/null 2>&1 && \<br>+ $aws cp "$bucket_path/$packpath" $packpath.saved || \<br>+ touch $packpath.saved<br>+ fi<br>+<br>+ if [ "$packtype" == "dsc" ]; then<br>+ # WORKAROUND: unknown why, but reprepro doesn't save the Sources<br>+ # file, let's recreate it manually from its zipped version<br>+ gunzip -c $packpath.gz >$packpath<br>+ # check if the DSC hash already exists in the old Sources file from S3<br>+ # find the hash from the new Sources file<br>+ hash=$(grep '^Checksums-Sha256:' -A3 $packpath | \<br>+ tail -n 1 | awk '{print $1}')<br>+ # search the new hash in the old Sources file from S3<br>+ if grep " $hash .* .*$" $packpath.saved ; then<br>+ echo "WARNING: DSC file already registered in S3!"<br>+ return<br>+ fi<br>+ # check if the DSC file already exists in the old Sources file from S3<br>+ file=$(grep '^Files:' -A3 $packpath | tail -n 1 | awk '{print 
$3}')<br>+ if [ "$force" == "" ] && grep " .* .* $file$" $packpath.saved ; then<br>+ echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"<br>+ echo "New hash: $hash"<br>+ # unlock the publishing<br>+ $rm_file $ws_lockfile<br>+ exit 1<br>+ fi<br>+ updated_dsc=1<br>+ elif [ "$packtype" == "deb" ]; then<br>+ # check if the DEB file already exists in the old Packages file from S3<br>+ # find the hash from the new Packages file<br>+ hash=$(grep '^SHA256: ' $packpath | awk '{print $2}')<br>+ # search the new hash in the old Packages file from S3<br>+ if grep "^SHA256: $hash" $packpath.saved ; then<br>+ echo "WARNING: DEB file already registered in S3!"<br>+ return<br>+ fi<br>+ # check if the DEB file already exists in the old Packages file from S3<br>+ file=$(grep '^Filename:' $packpath | awk '{print $2}')<br>+ if [ "$force" == "" ] && grep "Filename: $file$" $packpath.saved ; then<br>+ echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"<br>+ echo "New hash: $hash"<br>+ # unlock the publishing<br>+ $rm_file $ws_lockfile<br>+ exit 1<br>+ fi<br>+ updated_deb=1<br>+ fi<br>+ # store the new DEB entry<br>+ cat $packpath >>$packpath.saved<br>+}<br>+<br>+# The 'pack_deb' function is specially created for DEB packages. It works<br>+# with DEB-packing OSes like Ubuntu and Debian. 
It is based on the globally known<br>+# tool 'reprepro' from:<br>+# <a href="https://wiki.debian.org/DebianRepository/SetupWithReprepro" target="_blank">https://wiki.debian.org/DebianRepository/SetupWithReprepro</a><br>+# This tool works with the complete set of distributions of the given OS.<br>+# The result of the routine is a Debian package in an APT repository with<br>+# a file structure equal to the Debian/Ubuntu one:<br>+# <a href="http://ftp.am.debian.org/debian/pool/main/t/tarantool/" target="_blank">http://ftp.am.debian.org/debian/pool/main/t/tarantool/</a><br>+# <a href="http://ftp.am.debian.org/ubuntu/pool/main/t/" target="_blank">http://ftp.am.debian.org/ubuntu/pool/main/t/</a><br>+function pack_deb {<br>+ # we need to push packages into the 'main' repository only<br>+ component=main<br>+<br>+ # Debian has a special directory 'pool' for packages<br>+ debdir=pool<br>+<br>+ # get packages from the given location<br>+ if ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null ; then<br>+ echo "ERROR: files $repo/*.deb $repo/*.dsc $repo/*.tar.*z not found"<br>+ usage<br>+ exit 1<br>+ fi<br>+<br>+ # prepare the workspace<br>+ prepare_ws ${os}<br>+<br>+ # copy the single distribution with binary packages<br>+ repopath=$ws/pool/${option_dist}/$component/$proddir/$product<br>+ $mk_dir ${repopath}<br>+ cp $repo/*.deb $repo/*.dsc $repo/*.tar.*z $repopath/.<br>+ pushd $ws<br>+<br>+ # create the configuration file for the 'reprepro' tool<br>+ confpath=$ws/conf<br>+ $rm_dir $confpath<br>+ $mk_dir $confpath<br>+<br>+ for loop_dist in $alldists ; do<br>+ cat <<EOF >>$confpath/distributions<br>+Origin: Tarantool<br>+Label: tarantool.org<br>+Suite: stable<br>+Codename: $loop_dist<br>+Architectures: amd64 source<br>+Components: $component<br>+Description: Tarantool DBMS and Tarantool modules<br>+SignWith: 91B625E5<br>+DebIndices: Packages Release . .gz .bz2<br>+UDebIndices: Packages . 
.gz .bz2<br>+DscIndices: Sources Release .gz .bz2<br>+<br>+EOF<br>+ done<br>+<br>+ # create a standalone repository with separate components<br>+ for loop_dist in $alldists ; do<br>+ echo ================ DISTRIBUTION: $loop_dist ====================<br>+ updated_files=0<br>+<br>+ # 1(binaries). use the reprepro tool to generate the Packages file<br>+ for deb in $ws/$debdir/$loop_dist/$component/*/*/*.deb ; do<br>+ [ -f $deb ] || continue<br>+ updated_deb=0<br>+ # regenerate the DEB pack<br>+ update_deb_packfile $deb deb $loop_dist<br>+ echo "Regenerated DEB file: $locpackfile"<br>+ for packages in dists/$loop_dist/$component/binary-*/Packages ; do<br>+ # copy the Packages file to avoid its removal by the new DEB version<br>+ # update the metadata 'Packages' files<br>+ update_deb_metadata $packages deb<br>+ [ "$updated_deb" == "1" ] || continue<br>+ updated_files=1<br>+ done<br>+ # save the registered DEB file to S3<br>+ if [ "$updated_deb" == 1 ]; then<br>+ $aws_cp_public $deb $bucket_path/$locpackfile<br>+ fi<br>+ done<br>+<br>+ # 1(sources). 
use the reprepro tool to generate the Sources file<br>+ for dsc in $ws/$debdir/$loop_dist/$component/*/*/*.dsc ; do<br>+ [ -f $dsc ] || continue<br>+ updated_dsc=0<br>+ # regenerate the DSC pack<br>+ update_deb_packfile $dsc dsc $loop_dist<br>+ echo "Regenerated DSC file: $locpackfile"<br>+ # copy the Sources file to avoid its removal by the new DSC version<br>+ # update the metadata 'Sources' file<br>+ update_deb_metadata dists/$loop_dist/$component/source/Sources dsc<br>+ [ "$updated_dsc" == "1" ] || continue<br>+ updated_files=1<br>+ # save the registered DSC file to S3<br>+ $aws_cp_public $dsc $bucket_path/$locpackfile<br>+ tarxz=$(echo $locpackfile | sed 's#\.dsc$#.debian.tar.xz#g')<br>+ $aws_cp_public $ws/$tarxz "$bucket_path/$tarxz"<br>+ orig=$(echo $locpackfile | sed 's#-1\.dsc$#.orig.tar.xz#g')<br>+ $aws_cp_public $ws/$orig "$bucket_path/$orig"<br>+ done<br>+<br>+ # check if any DEB/DSC files were newly registered<br>+ [ "$updated_files" == "0" ] && \<br>+ continue || echo "Updating dists"<br>+<br>+ # finalize the Packages file<br>+ for packages in dists/$loop_dist/$component/binary-*/Packages ; do<br>+ mv $packages.saved $packages<br>+ done<br>+<br>+ # finalize the Sources file<br>+ sources=dists/$loop_dist/$component/source/Sources<br>+ mv $sources.saved $sources<br>+<br>+ # 2(binaries). update the Packages file archives<br>+ for packpath in dists/$loop_dist/$component/binary-* ; do<br>+ pushd $packpath<br>+ sed "s#Filename: $debdir/$component/#Filename: $debdir/$loop_dist/$component/#g" -i Packages<br>+ bzip2 -c Packages >Packages.bz2<br>+ gzip -c Packages >Packages.gz<br>+ popd<br>+ done<br>+<br>+ # 2(sources). update the Sources file archives<br>+ pushd dists/$loop_dist/$component/source<br>+ sed "s#Directory: $debdir/$component/#Directory: $debdir/$loop_dist/$component/#g" -i Sources<br>+ bzip2 -c Sources >Sources.bz2<br>+ gzip -c Sources >Sources.gz<br>+ popd<br>+<br>+ # 3. 
update the checksum entries of the Packages* files in the *Release files<br>+ # NOTE: the structure of the *Release files is stable, with the checksum<br>+ # entries laid out in the following way:<br>+ # MD5Sum:<br>+ # <checksum> <size> <file orig><br>+ # <checksum> <size> <file debian><br>+ # SHA1:<br>+ # <checksum> <size> <file orig><br>+ # <checksum> <size> <file debian><br>+ # SHA256:<br>+ # <checksum> <size> <file orig><br>+ # <checksum> <size> <file debian><br>+ # The script below puts the 'md5' value at the 1st found file entry,<br>+ # 'sha1' at the 2nd and 'sha256' at the 3rd<br>+ pushd dists/$loop_dist<br>+ for file in $(grep " $component/" Release | awk '{print $3}' | sort -u) ; do<br>+ sz=$(stat -c "%s" $file)<br>+ md5=$(md5sum $file | awk '{print $1}')<br>+ sha1=$(sha1sum $file | awk '{print $1}')<br>+ sha256=$(sha256sum $file | awk '{print $1}')<br>+ awk 'BEGIN{c = 0} ; {<br>+ if ($3 == p) {<br>+ c = c + 1<br>+ if (c == 1) {print " " md " " s " " p}<br>+ if (c == 2) {print " " sh1 " " s " " p}<br>+ if (c == 3) {print " " sh2 " " s " " p}<br>+ } else {print $0}<br>+ }' p="$file" s="$sz" md="$md5" sh1="$sha1" sh2="$sha256" \<br>+ Release >Release.new<br>+ mv Release.new Release<br>+ done<br>+ # resign the self-signed InRelease file<br>+ $rm_file InRelease<br>+ gpg --clearsign -o InRelease Release<br>+ # resign the Release file<br>+ $rm_file Release.gpg<br>+ gpg -abs -o Release.gpg Release<br>+ popd<br>+<br>+ # 4. sync the latest distribution path changes to S3<br>+ $aws_sync_public dists/$loop_dist "$bucket_path/dists/$loop_dist"<br>+ done<br>+<br>+ # unlock the publishing<br>+ $rm_file $ws_lockfile<br>+<br>+ popd<br>+}<br>+<br>+# The 'pack_rpm' function is specially created for RPM packages. It works<br>+# with RPM-packing OSes like CentOS and Fedora. 
It is based on the globally known<br>+# tool 'createrepo' from:<br>+# <a href="https://linux.die.net/man/8/createrepo" target="_blank">https://linux.die.net/man/8/createrepo</a><br>+# This tool works with a single distribution of the given OS.<br>+# The result of the routine is an RPM package in a YUM repository with<br>+# a file structure equal to the CentOS/Fedora one:<br>+# <a href="http://mirror.centos.org/centos/7/os/x86_64/Packages/" target="_blank">http://mirror.centos.org/centos/7/os/x86_64/Packages/</a><br>+# <a href="http://mirrors.kernel.org/fedora/releases/30/Everything/x86_64/os/Packages/t/" target="_blank">http://mirrors.kernel.org/fedora/releases/30/Everything/x86_64/os/Packages/t/</a><br>+function pack_rpm {<br>+ if ! ls $repo/*.rpm >/dev/null ; then<br>+ echo "ERROR: the current '$repo' path doesn't contain RPM packages"<br>+ usage<br>+ exit 1<br>+ fi<br>+<br>+ # prepare the workspace<br>+ prepare_ws ${os}_${option_dist}<br>+<br>+ # copy the needed package binaries to the workspace<br>+ cp $repo/*.rpm $ws/.<br>+<br>+ pushd $ws<br>+<br>+ # set the paths<br>+ if [ "$os" == "el" ]; then<br>+ repopath=$option_dist/os/x86_64<br>+ rpmpath=Packages<br>+ elif [ "$os" == "fedora" ]; then<br>+ repopath=releases/$option_dist/Everything/x86_64/os<br>+ rpmpath=Packages/$proddir<br>+ fi<br>+ packpath=$repopath/$rpmpath<br>+<br>+ # prepare the local repository with packages<br>+ $mk_dir $packpath<br>+ mv *.rpm $packpath/.<br>+ cd $repopath<br>+<br>+ # copy the current metadata files from S3<br>+ mkdir repodata.base<br>+ for file in $($aws ls $bucket_path/$repopath/repodata/ | awk '{print $NF}') ; do<br>+ $aws ls $bucket_path/$repopath/repodata/$file || continue<br>+ $aws cp $bucket_path/$repopath/repodata/$file repodata.base/$file<br>+ done<br>+<br>+ # create the new repository metadata files<br>+ createrepo --no-database --update --workers=2 \<br>+ --compress-type=gz --simple-md-filenames .<br>+<br>+ updated_rpms=0<br>+ # loop over the new hashes from the new meta file<br>+ for hash 
in $(zcat repodata/other.xml.gz | grep "<package pkgid=" | \<br>+ awk -F'"' '{print $2}') ; do<br>+ updated_rpm=0<br>+ name=$(zcat repodata/other.xml.gz | grep "<package pkgid=\"$hash\"" | \<br>+ awk -F'"' '{print $4}')<br>+ # search the new hash in the old meta file from S3<br>+ if zcat repodata.base/filelists.xml.gz | grep "pkgid=\"$hash\"" | \<br>+ grep "name=\"$name\"" ; then<br>+ echo "WARNING: $name file already registered in S3!"<br>+ echo "File hash: $hash"<br>+ continue<br>+ fi<br>+ updated_rpms=1<br>+ # check if the hashed file already exists in old meta file from S3<br>+ file=$(zcat repodata/primary.xml.gz | \<br>+ grep -e "<checksum type=" -e "<location href=" | \<br>+ grep "$hash" -A1 | grep "<location href=" | \<br>+ awk -F'"' '{print $2}')<br>+ # check if the file already exists in S3<br>+ if [ "$force" == "" ] && zcat repodata.base/primary.xml.gz | \<br>+ grep "<location href=\"$file\"" ; then<br>+ echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"<br>+ echo "New hash: $hash"<br>+ # unlock the publishing<br>+ $rm_file $ws_lockfile<br>+ exit 1<br>+ fi<br>+ done<br>+<br>+ # check if any RPM files were newly registered<br>+ [ "$updated_rpms" == "0" ] && \<br>+ return || echo "Updating dists"<br>+<br>+ # move the repodata files to the standalone location<br>+ mv repodata repodata.adding<br>+<br>+ # merge metadata files<br>+ mkdir repodata<br>+ head -n 2 repodata.adding/repomd.xml >repodata/repomd.xml<br>+ for file in filelists.xml other.xml primary.xml ; do<br>+ # 1. take the 1st line only - to skip the line with<br>+ # number of packages which is not needed<br>+ zcat repodata.adding/$file.gz | head -n 1 >repodata/$file<br>+ # 2. 
take the 2nd line with the metadata tag and update<br>+ # the number of packages in it<br>+ packsold=0<br>+ if [ -f repodata.base/$file.gz ] ; then<br>+ packsold=$(zcat repodata.base/$file.gz | head -n 2 | \<br>+ tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')<br>+ fi<br>+ packsnew=$(zcat repodata.adding/$file.gz | head -n 2 | \<br>+ tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')<br>+ packs=$(($packsold+$packsnew))<br>+ zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | \<br>+ sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file<br>+ # 3. take only the 'package' tags from the new file<br>+ zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 \<br>+ >>repodata/$file<br>+ # 4. take only the 'package' tags from the old file if it exists<br>+ if [ -f repodata.base/$file.gz ] ; then<br>+ zcat repodata.base/$file.gz | tail -n +3 | head -n -1 \<br>+ >>repodata/$file<br>+ fi<br>+ # 5. take the last closing line with the metadata tag<br>+ zcat repodata.adding/$file.gz | tail -n 1 >>repodata/$file<br>+<br>+ # get the new data<br>+ chsnew=$(sha256sum repodata/$file | awk '{print $1}')<br>+ sz=$(stat --printf="%s" repodata/$file)<br>+ gzip repodata/$file<br>+ chsgznew=$(sha256sum repodata/$file.gz | awk '{print $1}')<br>+ szgz=$(stat --printf="%s" repodata/$file.gz)<br>+ timestamp=$(date +%s -r repodata/$file.gz)<br>+<br>+ # add info to the repomd.xml file<br>+ name=$(echo $file | sed 's#\.xml$##g')<br>+ cat <<EOF >>repodata/repomd.xml<br>+<data type="$name"><br>+ <checksum type="sha256">$chsgznew</checksum><br>+ <open-checksum type="sha256">$chsnew</open-checksum><br>+ <location href="repodata/$file.gz"/><br>+ <timestamp>$timestamp</timestamp><br>+ <size>$szgz</size><br>+ <open-size>$sz</open-size><br>+</data><br>+EOF<br>+ done<br>+ tail -n 1 repodata.adding/repomd.xml >>repodata/repomd.xml<br>+ gpg --detach-sign --armor repodata/repomd.xml<br>+<br>+ # copy the packages to S3<br>+ for file in $rpmpath/*.rpm ; do<br>+ $aws_cp_public $file "$bucket_path/$repopath/$file"<br>+ done<br>+<br>+ # 
update the metadata on S3<br>+ $aws_sync_public repodata "$bucket_path/$repopath/repodata"<br>+<br>+ # unlock the publishing<br>+ $rm_file $ws_lockfile<br>+<br>+ popd<br>+}<br>+<br>+if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then<br>+ pack_deb<br>+elif [ "$os" == "el" -o "$os" == "fedora" ]; then<br>+ pack_rpm<br>+else<br>+ echo "USAGE: the given OS '$os' is not supported, use one from the list: $alloss"<br>+ usage<br>+ exit 1<br>+fi<br>--<br>2.17.1<br> </div></div></div></div></blockquote> <div> </div><div data-signature-widget="container"><div data-signature-widget="content"><div>--<br>Oleg Piskunov</div></div></div><div> </div></div></BODY></HTML>