[Tarantool-patches] [PATCH v6 1/2] gitlab-ci: implement packing into MCS S3

Alexander V. Tikhonov avtikhon at tarantool.org
Fri Jan 17 11:38:15 MSK 2020


Gitlab-CI rules are changed to build packages on branches matching the
".*-full-ci$" pattern and to additionally deploy them to the 'live'
repository on MCS S3 for the master branches. Packages built from
tagged git commits are additionally published to the 'release'
repository.

Added the ability to store packages on MCS S3 in addition to storing
them on PackageCloud. The main difference is that the repository on
MCS S3 uses the native file structure of the corresponding OS
distributions. Also, the new repository does not prune old packages:
it keeps all packages built for Tarantool and its modules for every
commit on all release branches. Tarantool users may therefore enable
the repository and pin an exact Tarantool version in the dependencies
of their application packages.
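
For example, once the repository is published over HTTP(S), enabling it
and pinning an exact Tarantool version could look roughly as follows.
This is only an illustration: the host name and the package version are
hypothetical, the real URL will come with the updated download
instructions requested below.

    # hypothetical repository URL and package version, for illustration only
    echo "deb https://example-s3-host/tarantool_repo/live/ubuntu/ bionic main" \
        >/etc/apt/sources.list.d/tarantool_live.list
    apt-get update
    apt-get install -y tarantool=2.3.1-1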

A standalone script was created to store the DEB and RPM packages on
MCS S3; it creates/updates the package repositories with the native
file structure of the OS distributions.

The common parts of the script are (a sketch of this flow follows the
list):

 - create new metadata files for the new binaries
 - copy the new binaries to MCS S3
 - fetch the previous metadata files from MCS S3 and merge in the
   metadata for the new binaries
 - upload the updated metadata files to MCS S3
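
A rough sketch of this common flow for a single Ubuntu distribution,
with illustrative package and index names (the actual script below also
regenerates the indices with reprepro/createrepo and re-signs the
repository metadata):

    # upload the new binary to the 'live' bucket
    aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 cp --acl public-read \
        tarantool_2.3.1-1_amd64.deb \
        s3://tarantool_repo/live/ubuntu/pool/bionic/main/t/tarantool/
    # fetch the previous metadata, merge in the new entries and upload back
    aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 cp \
        s3://tarantool_repo/live/ubuntu/dists/bionic/main/binary-amd64/Packages \
        Packages.saved
    cat Packages >>Packages.saved
    aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 cp --acl public-read \
        Packages.saved \
        s3://tarantool_repo/live/ubuntu/dists/bionic/main/binary-amd64/Packages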

The OS-specific parts differ (example invocations follow the list):

 - the DEB part is based on the external 'reprepro' tool and works at
   the whole-OS level: it updates the metadata for all distributions
   of the given OS together.
 - the RPM part is based on the external 'createrepo' tool and works
   at the OS/release level: it updates the metadata for each release
   separately.
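
For example, the 'deploy' target in .gitlab.mk invokes the script as
shown below; 'build' is the packpack output directory with the freshly
built packages:

    # push Ubuntu Bionic packages from ./build to the default 'live' bucket
    ./tools/update_repo.sh -o=ubuntu -d=bionic build
    # push CentOS 7 packages to the 'release' bucket instead
    ./tools/update_repo.sh -o=el -d=7 -b='s3://tarantool_repo/release' build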

Closes #3380

@TarantoolBot
Title: Update download instructions on the website

The download instructions on the website need to be updated because of
the new repository based on MCS S3.
---

Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
Issue: https://github.com/tarantool/tarantool/issues/3380

v5: https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013636.html
v4: https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html
v3: https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html
v2: https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html
v1: https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html

Changes v6:
- implemented two MCS S3 repositories: 'live' and 'release'
- added AWS and GPG keys to Gitlab-CI
- corrected the commit message
- corrected the return code handling in the script
- moved all changes for source tarballs to a standalone patch set

Changes v5:
- code style
- commits squashed
- rebased to master

Changes v4:
- minor corrections

Changes v3:
- merged common code parts into standalone routines
- corrected code style, minor updates
- the script is ready for release

Changes v2:
- moved the script from the draft to the pre-release stage

 .gitlab-ci.yml       | 152 +++++++++++--
 .gitlab.mk           |  29 ++-
 tools/update_repo.sh | 520 +++++++++++++++++++++++++++++++++++++++++++
 3 files changed, 681 insertions(+), 20 deletions(-)
 create mode 100755 tools/update_repo.sh

diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
index 3af5a3c8a..c68594c1a 100644
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -10,6 +10,10 @@ variables:
   only:
     refs:
       - master
+
+.fullci_only_template: &fullci_only_definition
+  only:
+    refs:
       - /^.*-full-ci$/
 
 .docker_test_template: &docker_test_definition
@@ -24,13 +28,29 @@ variables:
   tags:
     - docker_test
 
+.pack_template: &pack_definition
+  <<: *fullci_only_definition
+  stage: test
+  tags:
+    - deploy
+  script:
+    - ${GITLAB_MAKE} package
+
+.pack_test_template: &pack_test_definition
+  <<: *fullci_only_definition
+  stage: test
+  tags:
+    - deploy_test
+  script:
+    - ${GITLAB_MAKE} package
+
 .deploy_template: &deploy_definition
   <<: *release_only_definition
   stage: test
   tags:
     - deploy
   script:
-    - ${GITLAB_MAKE} package
+    - ${GITLAB_MAKE} deploy
 
 .deploy_test_template: &deploy_test_definition
   <<: *release_only_definition
@@ -38,7 +58,7 @@ variables:
   tags:
     - deploy_test
   script:
-    - ${GITLAB_MAKE} package
+    - ${GITLAB_MAKE} deploy
 
 .vbox_template: &vbox_definition
   stage: test
@@ -141,96 +161,194 @@ freebsd_12_release:
 # Packs
 
 centos_6:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'el'
     DIST: '6'
 
 centos_7:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'el'
     DIST: '7'
 
 centos_8:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'el'
     DIST: '8'
 
 fedora_28:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'fedora'
     DIST: '28'
 
 fedora_29:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'fedora'
     DIST: '29'
 
 fedora_30:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'fedora'
     DIST: '30'
 
 fedora_31:
-  <<: *deploy_test_definition
+  <<: *pack_test_definition
   variables:
     OS: 'fedora'
     DIST: '31'
 
 ubuntu_14_04:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'trusty'
 
 ubuntu_16_04:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'xenial'
 
 ubuntu_18_04:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'bionic'
 
 ubuntu_18_10:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'cosmic'
 
 ubuntu_19_04:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'disco'
 
 ubuntu_19_10:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'ubuntu'
     DIST: 'eoan'
 
 debian_8:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'debian'
     DIST: 'jessie'
 
 debian_9:
-  <<: *deploy_definition
+  <<: *pack_definition
   variables:
     OS: 'debian'
     DIST: 'stretch'
 
 debian_10:
+  <<: *pack_definition
+  variables:
+    OS: 'debian'
+    DIST: 'buster'
+
+# Deploy
+
+centos_6_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'el'
+    DIST: '6'
+
+centos_7_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'el'
+    DIST: '7'
+
+centos_8_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'el'
+    DIST: '8'
+
+fedora_28_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'fedora'
+    DIST: '28'
+
+fedora_29_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'fedora'
+    DIST: '29'
+
+fedora_30_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'fedora'
+    DIST: '30'
+
+fedora_31_deploy:
+  <<: *deploy_test_definition
+  variables:
+    OS: 'fedora'
+    DIST: '31'
+
+ubuntu_14_04_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'trusty'
+
+ubuntu_16_04_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'xenial'
+
+ubuntu_18_04_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'bionic'
+
+ubuntu_18_10_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'cosmic'
+
+ubuntu_19_04_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'disco'
+
+ubuntu_19_10_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'ubuntu'
+    DIST: 'eoan'
+
+debian_8_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'debian'
+    DIST: 'jessie'
+
+debian_9_deploy:
+  <<: *deploy_definition
+  variables:
+    OS: 'debian'
+    DIST: 'stretch'
+
+debian_10_deploy:
   <<: *deploy_definition
   variables:
     OS: 'debian'
diff --git a/.gitlab.mk b/.gitlab.mk
index 48a92e518..79d558708 100644
--- a/.gitlab.mk
+++ b/.gitlab.mk
@@ -98,14 +98,37 @@ vms_test_%:
 vms_shutdown:
 	VBoxManage controlvm ${VMS_NAME} poweroff
 
-# ########################
-# Build RPM / Deb packages
-# ########################
+# ###########################
+# Sources tarballs & packages
+# ###########################
+
+# Push alpha and beta versions to <major>x bucket (say, 2x),
+# stable to <major>.<minor> bucket (say, 2.2).
+GIT_DESCRIBE=$(shell git describe HEAD)
+MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
+MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
+BUCKET=$(MAJOR_VERSION)_$(MINOR_VERSION)
+ifeq ($(MINOR_VERSION),0)
+BUCKET=$(MAJOR_VERSION)x
+endif
+ifeq ($(MINOR_VERSION),1)
+BUCKET=$(MAJOR_VERSION)x
+endif
 
 package: git_submodule_update
 	git clone https://github.com/packpack/packpack.git packpack
 	PACKPACK_EXTRA_DOCKER_RUN_PARAMS='--network=host' ./packpack/packpack
 
+deploy: package
+	for key in ${GPG_SECRET_KEY} ${GPG_PUBLIC_KEY} ; do \
+		echo $${key} | base64 -d | gpg --batch --import || true ; done
+	./tools/update_repo.sh -o=${OS} -d=${DIST} build
+	for tag in $$(git tag) ; do \
+			git describe --long $${tag} ; \
+		done | grep "^$$(git describe --long)$$" >/dev/null && \
+		./tools/update_repo.sh -o=${OS} -d=${DIST} \
+			-b='s3://tarantool_repo/release' build
+
 # ############
 # Static build
 # ############
diff --git a/tools/update_repo.sh b/tools/update_repo.sh
new file mode 100755
index 000000000..28b490fb7
--- /dev/null
+++ b/tools/update_repo.sh
@@ -0,0 +1,520 @@
+#!/bin/bash
+set -e
+
+rm_file='rm -f'
+rm_dir='rm -rf'
+mk_dir='mkdir -p'
+ws_prefix=/tmp/tarantool_repo_s3
+
+alloss='ubuntu debian el fedora'
+product=tarantool
+# the path to either the package binaries or the repository
+repo=.
+
+# AWS defines
+aws="aws --endpoint-url ${AWS_S3_ENDPOINT_URL} s3"
+bucket='s3://tarantool_repo/live'
+aws_cp_public="$aws cp --acl public-read"
+aws_sync_public="$aws sync --acl public-read"
+
+function get_os_dists {
+    os=$1
+    alldists=
+
+    if [ "$os" == "ubuntu" ]; then
+        alldists='trusty xenial bionic cosmic disco eoan'
+    elif [ "$os" == "debian" ]; then
+        alldists='jessie stretch buster'
+    elif [ "$os" == "el" ]; then
+        alldists='6 7 8'
+    elif [ "$os" == "fedora" ]; then
+        alldists='27 28 29 30 31'
+    fi
+
+    echo "$alldists"
+}
+
+function prepare_ws {
+    # temporarily lock publication to the repository
+    ws_suffix=$1
+    ws=${ws_prefix}_${ws_suffix}
+    ws_lockfile=${ws}.lock
+    if [ -f $ws_lockfile ]; then
+        old_proc=$(cat $ws_lockfile)
+    fi
+    lockfile -l 60 $ws_lockfile
+    chmod u+w $ws_lockfile && echo $$ >$ws_lockfile && chmod u-w $ws_lockfile
+    if [ "$old_proc" != ""  -a "$old_proc" != "0" ]; then
+        kill -9 $old_proc >/dev/null || true
+    fi
+
+    # create temporary workspace with repository copy
+    $rm_dir $ws
+    $mk_dir $ws
+}
+
+function usage {
+    cat <<EOF
+Usage for storing package binaries from the given path:
+    $0 -o=<OS name> -d=<OS distribution> [-b=<S3 bucket>] [-p=<product>] <path to package binaries>
+
+Usage for mirroring Debian|Ubuntu OS repositories:
+    $0 -o=<OS name> [-b=<S3 bucket>] [-p=<product>] <path to 'pool' subdirectory with packages repository>
+
+Arguments:
+    <path>
+         Path to the directory with the deb/rpm packages to be used.
+         The script can be used in one of 2 modes:
+          - path with binary packages for a single distribution
+          - path with a 'pool' subdirectory of an APT repository (debian|ubuntu only), like:
+                /var/spool/apt-mirror/mirror/packagecloud.io/$product/$os
+
+Options:
+    -b|--bucket
+        Existing MCS S3 bucket used for storing the packages; the default bucket is:
+            $bucket
+    -o|--os
+        OS name, one of the list:
+            $alloss
+    -d|--distribution
+        Distribution appropriate to the given OS:
+EOF
+    for os in $alloss ; do
+        echo "            $os: <"$(get_os_dists $os)">"
+    done
+    cat <<EOF
+    -p|--product
+         Product name to be packed; the default name is 'tarantool'
+    -h|--help
+         Usage help message
+EOF
+}
+
+for i in "$@"
+do
+case $i in
+    -b=*|--bucket=*)
+    bucket="${i#*=}"
+    shift # past argument=value
+    ;;
+    -o=*|--os=*)
+    os="${i#*=}"
+    if ! echo $alloss | grep -F -q -w $os ; then
+        echo "ERROR: OS '$os' is not supported"
+        usage
+        exit 1
+    fi
+    shift # past argument=value
+    ;;
+    -d=*|--distribution=*)
+    option_dist="${i#*=}"
+    shift # past argument=value
+    ;;
+    -p=*|--product=*)
+    product="${i#*=}"
+    shift # past argument=value
+    ;;
+    -h|--help)
+    usage
+    exit 0
+    ;;
+    *)
+    repo="${i#*=}"
+    pushd $repo >/dev/null ; repo=$PWD ; popd >/dev/null
+    shift # past argument=value
+    ;;
+esac
+done
+
+# check that all needed options were set and correct
+if ! $aws ls $bucket >/dev/null ; then
+    echo "ERROR: bucket '$bucket' is not found"
+    usage
+    exit 1
+fi
+if [ "$os" == "" ]; then
+    echo "ERROR: need to set -o|--os OS name option, check usage"
+    usage
+    exit 1
+fi
+alldists=$(get_os_dists $os)
+if [ -n "$option_dist" ] && ! echo $alldists | grep -F -q -w $option_dist ; then
+    echo "ERROR: distribution '$option_dist' given in options is not in the supported list '$alldists'"
+    usage
+    exit 1
+fi
+
+# set the subpath with binaries based on the first letter of the product name
+proddir=$(echo $product | head -c 1)
+
+# set bucket path of the given OS in options
+bucket_path="$bucket/$os"
+
+function update_deb_packfile {
+    packfile=$1
+    packtype=$2
+    update_dist=$3
+
+    locpackfile=$(echo $packfile | sed "s#^$ws/##g")
+    # register DEB/DSC pack file to Packages/Sources file
+    reprepro -Vb . include$packtype $update_dist $packfile
+    # reprepro copies the DEB/DSC file into the component directory, which is not needed
+    $rm_dir $debdir/$component
+    # prevent reprepro from keeping the DEB/DSC file in its own registry
+    $rm_dir db
+}
+
+function update_deb_metadata {
+    packpath=$1
+    packtype=$2
+
+    if [ ! -f $packpath.saved ] ; then
+        # get the latest Sources file from S3
+        $aws ls "$bucket_path/$packpath" >/dev/null 2>&1 && \
+            $aws cp "$bucket_path/$packpath" $packpath.saved || \
+            touch $packpath.saved
+    fi
+
+    if [ "$packtype" == "dsc" ]; then
+        # WORKAROUND: for an unknown reason reprepro doesn't save the Sources file
+        gunzip -c $packpath.gz >$packpath
+        # check if the DSC file already exists in Sources from S3
+        hash=$(grep '^Checksums-Sha256:' -A3 $packpath | \
+            tail -n 1 | awk '{print $1}')
+        if grep " $hash .*$" $packpath.saved ; then
+            echo "WARNING: DSC file already registered in S3!"
+            return
+        fi
+        updated_dsc=1
+    elif [ "$packtype" == "deb" ]; then
+        # check if the DEB file already exists in Packages from S3
+        if grep "^$(grep '^SHA256: ' $packages)$" $packages.saved ; then
+            echo "WARNING: DEB file already registered in S3!"
+            return
+        fi
+        updated_deb=1
+    fi
+    # append the new metadata entries to the saved file
+    cat $packpath >>$packpath.saved
+}
+
+# The 'pack_deb' function is intended for DEB packages. It works with
+# DEB-based OSes like Ubuntu and Debian. It is based on the well-known
+# 'reprepro' tool:
+#     https://wiki.debian.org/DebianRepository/SetupWithReprepro
+# This tool works with all distributions of the given OS at once.
+# The result of the routine is an APT repository with a file structure
+# equal to the Debian/Ubuntu one:
+#     http://ftp.am.debian.org/debian/pool/main/t/tarantool/
+#     http://ftp.am.debian.org/ubuntu/pool/main/t/
+function pack_deb {
+    # we need to push packages into the 'main' component only
+    component=main
+
+    # Debian has a special 'pool' directory for packages
+    debdir=pool
+
+    # get packages from pointed location
+    if [ ! -d $repo/$debdir ] && \
+        ( [ "$option_dist" == "" ] || \
+            ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null ) ; then
+        echo "ERROR: the '$repo' path provides neither of the following:"
+        echo " - DEB packages in the path together with the '-d' option of $0"
+        echo " - a 'pool' subdirectory with APT repositories"
+        usage
+        exit 1
+    fi
+
+    # prepare the workspace
+    prepare_ws ${os}
+
+    # the script works in one of 2 modes:
+    # - path with binary packages for a single distribution
+    # - path with a 'pool' directory with an APT repository
+    if [ "$option_dist" != "" ] && \
+            ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ; then
+        # copy a single distribution with binary packages
+        repopath=$ws/pool/${option_dist}/$component/$proddir/$product
+        $mk_dir ${repopath}
+        cp $repo/*.deb $repo/*.dsc $repo/*.tar.*z $repopath/.
+    elif [ -d $repo/$debdir ]; then
+        # copy 'pool' directory with APT repository
+        cp -rf $repo/$debdir $ws/.
+    else
+        echo "ERROR: neither the distribution option '-d' with files $repo/*.deb $repo/*.dsc $repo/*.tar.*z nor the '$repo/$debdir' path was found"
+        usage
+        $rm_file $ws_lockfile
+        exit 1
+    fi
+
+    pushd $ws
+
+    # create the configuration file for 'reprepro' tool
+    confpath=$ws/conf
+    $rm_dir $confpath
+    $mk_dir $confpath
+
+    for loop_dist in $alldists ; do
+        cat <<EOF >>$confpath/distributions
+Origin: Tarantool
+Label: tarantool.org
+Suite: stable
+Codename: $loop_dist
+Architectures: amd64 source
+Components: $component
+Description: Tarantool DBMS and Tarantool modules
+SignWith: 91B625E5
+DebIndices: Packages Release . .gz .bz2
+UDebIndices: Packages . .gz .bz2
+DscIndices: Sources Release .gz .bz2
+
+EOF
+    done
+
+    # create standalone repository with separate components
+    for loop_dist in $alldists ; do
+        echo ================ DISTRIBUTION: $loop_dist ====================
+        updated_files=0
+
+        # 1(binaries). use reprepro tool to generate Packages file
+        for deb in $ws/$debdir/$loop_dist/$component/*/*/*.deb ; do
+            [ -f $deb ] || continue
+            updated_deb=0
+            # regenerate DEB pack
+            update_deb_packfile $deb deb $loop_dist
+            echo "Regenerated DEB file: $locpackfile"
+            for packages in dists/$loop_dist/$component/binary-*/Packages ; do
+                # copy the Packages file to avoid its removal by the new DEB version
+                # update metadata 'Packages' files
+                update_deb_metadata $packages deb
+                [ "$updated_deb" == "1" ] || continue
+                updated_files=1
+            done
+            # save the registered DEB file to S3
+            if [ "$updated_deb" == 1 ]; then
+                $aws_cp_public $deb $bucket_path/$locpackfile
+            fi
+        done
+
+        # 1(sources). use reprepro tool to generate Sources file
+        for dsc in $ws/$debdir/$loop_dist/$component/*/*/*.dsc ; do
+            [ -f $dsc ] || continue
+            updated_dsc=0
+            # regenerate DSC pack
+            update_deb_packfile $dsc dsc $loop_dist
+            echo "Regenerated DSC file: $locpackfile"
+            # copy the Sources file to avoid its removal by the new DSC version
+            # update metadata 'Sources' file
+            update_deb_metadata dists/$loop_dist/$component/source/Sources dsc
+            [ "$updated_dsc" == "1" ] || continue
+            updated_files=1
+            # save the registered DSC file to S3
+            $aws_cp_public $dsc $bucket_path/$locpackfile
+            tarxz=$(echo $locpackfile | sed 's#\.dsc$#.debian.tar.xz#g')
+            $aws_cp_public $ws/$tarxz "$bucket_path/$tarxz"
+            orig=$(echo $locpackfile | sed 's#-1\.dsc$#.orig.tar.xz#g')
+            $aws_cp_public $ws/$orig "$bucket_path/$orig"
+        done
+
+        # check if any DEB/DSC files were newly registered
+        [ "$updated_files" == "0" ] && \
+            continue || echo "Updating dists"
+
+        # finalize the Packages file
+        for packages in dists/$loop_dist/$component/binary-*/Packages ; do
+            mv $packages.saved $packages
+        done
+
+        # 2(binaries). update Packages file archives
+        for packpath in dists/$loop_dist/$component/binary-* ; do
+            pushd $packpath
+            sed "s#Filename: $debdir/$component/#Filename: $debdir/$loop_dist/$component/#g" -i Packages
+            bzip2 -c Packages >Packages.bz2
+            gzip -c Packages >Packages.gz
+            popd
+        done
+
+        # 2(sources). update Sources file archives
+        pushd dists/$loop_dist/$component/source
+        sed "s#Directory: $debdir/$component/#Directory: $debdir/$loop_dist/$component/#g" -i Sources
+        bzip2 -c Sources >Sources.bz2
+        gzip -c Sources >Sources.gz
+        popd
+
+        # 3. update the checksum entries of the Packages* files in the *Release files
+        # NOTE: the *Release files have a stable structure in which the
+        #       checksum entries appear in the following way:
+        # MD5Sum:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        # SHA1:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        # SHA256:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        #       The awk script below puts the 'md5' value into the 1st matching
+        #       file entry, 'sha1' into the 2nd and 'sha256' into the 3rd
+        pushd dists/$loop_dist
+        for file in $(grep " $component/" Release | awk '{print $3}' | sort -u) ; do
+            sz=$(stat -c "%s" $file)
+            md5=$(md5sum $file | awk '{print $1}')
+            sha1=$(sha1sum $file | awk '{print $1}')
+            sha256=$(sha256sum $file | awk '{print $1}')
+            awk 'BEGIN{c = 0} ; {
+                if ($3 == p) {
+                    c = c + 1
+                    if (c == 1) {print " " md  " " s " " p}
+                    if (c == 2) {print " " sh1 " " s " " p}
+                    if (c == 3) {print " " sh2 " " s " " p}
+                } else {print $0}
+            }' p="$file" s="$sz" md="$md5" sh1="$sha1" sh2="$sha256" \
+                    Release >Release.new
+            mv Release.new Release
+        done
+        # re-sign the self-signed InRelease file
+        $rm_file InRelease
+        gpg --clearsign -o InRelease Release
+        # re-sign the Release file
+        $rm_file Release.gpg
+        gpg -abs -o Release.gpg Release
+        popd
+
+        # 4. sync the latest distribution path changes to S3
+        $aws_sync_public dists/$loop_dist "$bucket_path/dists/$loop_dist"
+    done
+
+    # unlock the publishing
+    $rm_file $ws_lockfile
+
+    popd
+}
+
+# The 'pack_rpm' function is intended for RPM packages. It works with
+# RPM-based OSes like CentOS and Fedora. It is based on the well-known
+# 'createrepo' tool:
+#     https://linux.die.net/man/8/createrepo
+# This tool works with a single distribution of the given OS.
+# The result of the routine is a YUM repository with a file structure
+# equal to the CentOS/Fedora one:
+#     http://mirror.centos.org/centos/7/os/x86_64/Packages/
+#     http://mirrors.kernel.org/fedora/releases/30/Everything/x86_64/os/Packages/t/
+function pack_rpm {
+    if ! ls $repo/*.rpm >/dev/null ; then
+        echo "ERROR: the '$repo' path does not contain any RPM packages"
+        usage
+        exit 1
+    fi
+
+    # prepare the workspace
+    prepare_ws ${os}_${option_dist}
+
+    # copy the needed package binaries to the workspace
+    cp $repo/*.rpm $ws/.
+
+    pushd $ws
+
+    # set the paths
+    if [ "$os" == "el" ]; then
+        repopath=$option_dist/os/x86_64
+        rpmpath=Packages
+    elif [ "$os" == "fedora" ]; then
+        repopath=releases/$option_dist/Everything/x86_64/os
+        rpmpath=Packages/$proddir
+    fi
+    packpath=$repopath/$rpmpath
+
+    # prepare local repository with packages
+    $mk_dir $packpath
+    mv *.rpm $packpath/.
+    cd $repopath
+
+    # copy the current metadata files from S3
+    mkdir repodata.base
+    for file in $($aws ls $bucket_path/$repopath/repodata/ | awk '{print $NF}') ; do
+        $aws ls $bucket_path/$repopath/repodata/$file || continue
+        $aws cp $bucket_path/$repopath/repodata/$file repodata.base/$file
+    done
+
+    # create the new repository metadata files
+    createrepo --no-database --update --workers=2 \
+        --compress-type=gz --simple-md-filenames .
+    mv repodata repodata.adding
+
+    # merge metadata files
+    mkdir repodata
+    head -n 2 repodata.adding/repomd.xml >repodata/repomd.xml
+    for file in filelists.xml other.xml primary.xml ; do
+        # 1. take the 1st line only - to skip the line with
+        #    number of packages which is not needed
+        zcat repodata.adding/$file.gz | head -n 1 >repodata/$file
+        # 2. take 2nd line with metadata tag and update
+        #    the packages number in it
+        packsold=0
+        if [ -f repodata.base/$file.gz ] ; then
+            packsold=$(zcat repodata.base/$file.gz | head -n 2 | \
+                tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')
+        fi
+        packsnew=$(zcat repodata.adding/$file.gz | head -n 2 | \
+            tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')
+        packs=$(($packsold+$packsnew))
+        zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | \
+            sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file
+        # 3. take only 'package' tags from new file
+        zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 \
+            >>repodata/$file
+        # 4. take only 'package' tags from old file if exists
+        if [ -f repodata.base/$file.gz ] ; then
+            zcat repodata.base/$file.gz | tail -n +3 | head -n -1 \
+                >>repodata/$file
+        fi
+        # 5. take the last closing line with metadata tag
+        zcat repodata.adding/$file.gz | tail -n 1 >>repodata/$file
+
+        # get the new data
+        chsnew=$(sha256sum repodata/$file | awk '{print $1}')
+        sz=$(stat --printf="%s" repodata/$file)
+        gzip repodata/$file
+        chsgznew=$(sha256sum repodata/$file.gz | awk '{print $1}')
+        szgz=$(stat --printf="%s" repodata/$file.gz)
+        timestamp=$(date +%s -r repodata/$file.gz)
+
+        # add info to repomd.xml file
+        name=$(echo $file | sed 's#\.xml$##g')
+        cat <<EOF >>repodata/repomd.xml
+<data type="$name">
+  <checksum type="sha256">$chsgznew</checksum>
+  <open-checksum type="sha256">$chsnew</open-checksum>
+  <location href="repodata/$file.gz"/>
+  <timestamp>$timestamp</timestamp>
+  <size>$szgz</size>
+  <open-size>$sz</open-size>
+</data>
+EOF
+    done
+    tail -n 1 repodata.adding/repomd.xml >>repodata/repomd.xml
+    gpg --detach-sign --armor repodata/repomd.xml
+
+    # copy the packages to S3
+    for file in $rpmpath/*.rpm ; do
+        $aws_cp_public $file "$bucket_path/$repopath/$file"
+    done
+
+    # update the metadata at the S3
+    $aws_sync_public repodata "$bucket_path/$repopath/repodata"
+
+    # unlock the publishing
+    $rm_file $ws_lockfile
+
+    popd
+}
+
+if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
+    pack_deb
+elif [ "$os" == "el" -o "$os" == "fedora" ]; then
+    pack_rpm
+else
+    echo "ERROR: the given OS '$os' is not supported, use one of the list: $alloss"
+    usage
+    exit 1
+fi
-- 
2.17.1


