Tarantool development patches archive
* [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4
@ 2020-01-13 12:04 Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 1/3] gitlab-ci: implement packing into MCS S3 Alexander V. Tikhonov
                   ` (3 more replies)
  0 siblings, 4 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-01-13 12:04 UTC (permalink / raw)
  To: Igor Munkin; +Cc: tarantool-patches

Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
Issue: https://github.com/tarantool/tarantool/issues/3380

v4: https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html
v3: https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html
v2: https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html
v1: https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html

Changes v4:
- minor corrections

Changes v3:
- common code parts merged to standalone routines
- corrected code style, minor updates
- script is ready for release

Changes v2:
- moved the script from draft to the pre-release stage

Alexander V. Tikhonov (3):
  gitlab-ci: implement packing into MCS S3
  Testing changes after 2nd review on S3
  Testing changes after 3rd review on S3

 .gitlab-ci.yml            |   5 +-
 .gitlab.mk                |  20 +-
 .travis.mk                |  41 ++-
 tools/add_pack_s3_repo.sh | 527 ++++++++++++++++++++++++++++++++++++++
 4 files changed, 567 insertions(+), 26 deletions(-)
 create mode 100755 tools/add_pack_s3_repo.sh

-- 
2.17.1

^ permalink raw reply	[flat|nested] 7+ messages in thread

* [Tarantool-patches] [PATCH v4 1/3] gitlab-ci: implement packing into MCS S3
  2020-01-13 12:04 [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Alexander V. Tikhonov
@ 2020-01-13 12:04 ` Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 2/3] Testing changes after 2nd review on S3 Alexander V. Tikhonov
                   ` (2 subsequent siblings)
  3 siblings, 0 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-01-13 12:04 UTC (permalink / raw)
  To: Igor Munkin; +Cc: tarantool-patches

Added the ability to additionally store packages in MCS S3.
The idea is to introduce a new way of creating packages in MCS S3,
which temporarily duplicates the packaging done at PackageCloud by the
PackPack tool. The binaries also needed to be packed in the native
style of each packing OS. A standalone script was created for adding
package binaries/sources to DEB or RPM repositories in MCS S3:
'tools/add_pack_s3_repo.sh'
Common parts of the script:
 - create new meta files for the new binaries
 - copy the new binaries to MCS S3
 - fetch the previous meta files from MCS S3 and merge in the meta data
for the new binaries
 - update the meta files in MCS S3
Different parts:
 - The DEB part is based on the external tool 'reprepro'; it works only
at the OS-version level, meaning it updates the meta data for all
Distributions together.
 - The RPM part is based on the external tool 'createrepo'; it works
separately at the OS/Release level, meaning it updates the meta data
for each Release separately.

Closes #3380
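The metadata merge step described above can be sketched roughly as follows (a minimal illustration, not the script's actual code: the 'merge_meta' helper is made up for the example, and local files stand in for the MCS S3 bucket):

```shell
# Minimal sketch of the common metadata-merge step: start from the
# previously published metadata (if any) and append the entries
# generated for the new binaries.
merge_meta() {
    remote_meta=$1    # previous metadata, as fetched from S3
    new_meta=$2       # metadata generated for the new binaries
    out=$3            # merged result to be uploaded back
    # start from the previous copy if it exists, else from an empty file
    if [ -f "$remote_meta" ]; then
        cp "$remote_meta" "$out"
    else
        : >"$out"
    fi
    # append the entries generated for the new binaries
    cat "$new_meta" >>"$out"
}
```

In the real script the fetch and upload around this step are done with 'aws s3 cp' against the bucket.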
---
 .gitlab-ci.yml            |   5 +-
 .gitlab.mk                |  20 +-
 .travis.mk                |  41 ++--
 tools/add_pack_s3_repo.sh | 493 ++++++++++++++++++++++++++++++++++++++
 4 files changed, 533 insertions(+), 26 deletions(-)
 create mode 100755 tools/add_pack_s3_repo.sh

diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
index 3af5a3c8a..9dc7cfe1c 100644
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -237,7 +237,10 @@ debian_10:
     DIST: 'buster'
 
 static_build:
-  <<: *deploy_test_definition
+  <<: *release_only_definition
+  stage: test
+  tags:
+    - deploy_test
   variables:
     RUN_TESTS: 'ON'
   script:
diff --git a/.gitlab.mk b/.gitlab.mk
index 48a92e518..64664c64f 100644
--- a/.gitlab.mk
+++ b/.gitlab.mk
@@ -98,13 +98,27 @@ vms_test_%:
 vms_shutdown:
 	VBoxManage controlvm ${VMS_NAME} poweroff
 
-# ########################
-# Build RPM / Deb packages
-# ########################
+# ###########################
+# Sources tarballs & packages
+# ###########################
+
+# Push alpha and beta versions to <major>x bucket (say, 2x),
+# stable to <major>_<minor> bucket (say, 2_2).
+GIT_DESCRIBE=$(shell git describe HEAD)
+MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
+MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
+BUCKET=$(MAJOR_VERSION)_$(MINOR_VERSION)
+ifeq ($(MINOR_VERSION),0)
+BUCKET=$(MAJOR_VERSION)x
+endif
+ifeq ($(MINOR_VERSION),1)
+BUCKET=$(MAJOR_VERSION)x
+endif
 
 package: git_submodule_update
 	git clone https://github.com/packpack/packpack.git packpack
 	PACKPACK_EXTRA_DOCKER_RUN_PARAMS='--network=host' ./packpack/packpack
+	./tools/add_pack_s3_repo.sh -b=${BUCKET} -o=${OS} -d=${DIST} build
 
 # ############
 # Static build
diff --git a/.travis.mk b/.travis.mk
index 42969ff56..a85f71ced 100644
--- a/.travis.mk
+++ b/.travis.mk
@@ -8,10 +8,6 @@ MAX_FILES?=65534
 
 all: package
 
-package:
-	git clone https://github.com/packpack/packpack.git packpack
-	./packpack/packpack
-
 test: test_$(TRAVIS_OS_NAME)
 
 # Redirect some targets via docker
@@ -176,34 +172,35 @@ test_freebsd_no_deps: build_freebsd
 
 test_freebsd: deps_freebsd test_freebsd_no_deps
 
-####################
-# Sources tarballs #
-####################
-
-source:
-	git clone https://github.com/packpack/packpack.git packpack
-	TARBALL_COMPRESSOR=gz packpack/packpack tarball
+###############################
+# Sources tarballs & packages #
+###############################
 
 # Push alpha and beta versions to <major>x bucket (say, 2x),
 # stable to <major>.<minor> bucket (say, 2.2).
-ifeq ($(TRAVIS_BRANCH),master)
 GIT_DESCRIBE=$(shell git describe HEAD)
 MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
 MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
-else
-MAJOR_VERSION=$(word 1,$(subst ., ,$(TRAVIS_BRANCH)))
-MINOR_VERSION=$(word 2,$(subst ., ,$(TRAVIS_BRANCH)))
-endif
-BUCKET=tarantool.$(MAJOR_VERSION).$(MINOR_VERSION).src
+BUCKET=$(MAJOR_VERSION)_$(MINOR_VERSION)
 ifeq ($(MINOR_VERSION),0)
-BUCKET=tarantool.$(MAJOR_VERSION)x.src
+BUCKET=$(MAJOR_VERSION)x
 endif
 ifeq ($(MINOR_VERSION),1)
-BUCKET=tarantool.$(MAJOR_VERSION)x.src
+BUCKET=$(MAJOR_VERSION)x
 endif
 
+packpack_prepare:
+	git clone https://github.com/packpack/packpack.git packpack
+
+package: packpack_prepare
+	./packpack/packpack
+
+source: packpack_prepare
+	TARBALL_COMPRESSOR=gz packpack/packpack tarball
+
 source_deploy:
 	pip install awscli --user
-	aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 \
-		cp build/*.tar.gz "s3://${BUCKET}/" \
-		--acl public-read
+	for tarball in `ls build/*.tar.gz 2>/dev/null` ; do \
+		aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 \
+			cp $$tarball "s3://tarantool_repo/${BUCKET}/sources/" \
+			--acl public-read ; done
diff --git a/tools/add_pack_s3_repo.sh b/tools/add_pack_s3_repo.sh
new file mode 100755
index 000000000..2316b9015
--- /dev/null
+++ b/tools/add_pack_s3_repo.sh
@@ -0,0 +1,493 @@
+#!/bin/bash
+set -e
+
+rm_file='rm -f'
+rm_dir='rm -rf'
+mk_dir='mkdir -p'
+
+alloss='ubuntu debian centos el fedora'
+
+get_os_dists()
+{
+    os=$1
+    alldists=
+
+    if [ "$os" == "ubuntu" ]; then
+        alldists='trusty xenial cosmic disco bionic eoan'
+    elif [ "$os" == "debian" ]; then
+        alldists='jessie stretch buster'
+    elif [ "$os" == "centos" -o "$os" == "el" ]; then
+        alldists='6 7 8'
+    elif [ "$os" == "fedora" ]; then
+        alldists='27 28 29 30 31'
+    fi
+
+    echo "$alldists"
+}
+
+ws_prefix=/tmp/tarantool_repo_s3
+create_lockfile()
+{
+    lockfile -l 1000 $1
+}
+
+usage()
+{
+    cat <<EOF
+Usage: $0 -b <S3 bucket> -o <OS name> -d <OS distribution> [-p <product>] <path>
+Options:
+    -b|--bucket
+        MCS S3 bucket which will be used for storing the packages.
+        The bucket name corresponds to the appropriate Tarantool branch:
+            master: 2x
+            2.3: 2_3
+            2.2: 2_2
+            1.10: 1_10
+    -o|--os
+        OS to be checked, one of the list (NOTE: centos == el):
+            $alloss
+    -d|--distribution
+        Distribution appropriate to the given OS:
+EOF
+    for os in $alloss ; do
+        echo "            $os: <"`get_os_dists $os`">"
+    done
+    cat <<EOF
+    -p|--product
+         Product name to be packed with, default name is 'tarantool'
+    -h|--help
+         Usage help message
+    <path>
+         Path points to the directory with deb/rpm packages to be used.
+         Script can be used in one of 2 modes:
+          - path with binaries packages for a single distribution
+	  - path with 'pool' directory with APT repository (only: debian|ubuntu)
+EOF
+}
+
+for i in "$@"
+do
+case $i in
+    -b=*|--bucket=*)
+    branch="${i#*=}"
+    if [ "$branch" != "2x" -a "$branch" != "2_3" -a "$branch" != "2_2" -a "$branch" != "1_10" ]; then
+        echo "ERROR: bucket '$branch' is not supported"
+        usage
+        exit 1
+    fi
+    shift # past argument=value
+    ;;
+    -o=*|--os=*)
+    os="${i#*=}"
+    if [ "$os" == "el" ]; then
+        os=centos
+    fi
+    if ! echo $alloss | grep -F -q -w $os ; then
+        echo "ERROR: OS '$os' is not supported"
+        usage
+        exit 1
+    fi
+    shift # past argument=value
+    ;;
+    -d=*|--distribution=*)
+    DIST="${i#*=}"
+    shift # past argument=value
+    ;;
+    -p=*|--product=*)
+    product="${i#*=}"
+    shift # past argument=value
+    ;;
+    -h|--help)
+    usage
+    exit 0
+    ;;
+    *)
+    repo="${i#*=}"
+    pushd $repo >/dev/null ; repo=$PWD ; popd >/dev/null
+    shift # past argument=value
+    ;;
+esac
+done
+
+# check that all needed options were set
+if [ "$branch" == "" ]; then
+    echo "ERROR: need to set -b|--bucket bucket option, check usage"
+    usage
+    exit 1
+fi
+if [ "$os" == "" ]; then
+    echo "ERROR: need to set -o|--os OS name option, check usage"
+    usage
+    exit 1
+fi
+alldists=`get_os_dists $os`
+if [ -n "$DIST" ] && ! echo $alldists | grep -F -q -w $DIST ; then
+    echo "ERROR: distribution '$DIST' set in options not found in supported list '$alldists'"
+    usage
+    exit 1
+fi
+
+# set the product name
+product=$product
+if [ "$product" == "" ]; then
+    product=tarantool
+fi
+proddir=`echo $product | head -c 1`
+
+# set the path with binaries
+if [ "$repo" == "" ]; then
+    repo=.
+fi
+
+aws='aws --endpoint-url https://hb.bizmrg.com'
+s3="s3://tarantool_repo/$branch/$os"
+
+# The 'pack_deb' function is especially created for DEB packages. It works
+# with DEB-packing OSes like Ubuntu and Debian. It is based on the widely known
+# tool 'reprepro' from:
+#     https://wiki.debian.org/DebianRepository/SetupWithReprepro
+# This tool works with complete number of distributions of the given OS.
+# Result of the routine is the debian package for APT repository with
+# file structure equal to the Debian/Ubuntu:
+#     http://ftp.am.debian.org/debian/pool/main/t/tarantool/
+#     http://ftp.am.debian.org/ubuntu/pool/main/t/
+function pack_deb {
+    # we need to push packages into 'main' repository only
+    component=main
+
+    # debian has special directory 'pool' for packages
+    debdir=pool
+
+    # get packages from the pointed location or the mirror path
+    if [ "$repo" == "" ] ; then
+        repo=/var/spool/apt-mirror/mirror/packagecloud.io/tarantool/$branch/$os
+    fi
+    if [ ! -d $repo/$debdir ] && ( [ "$DIST" == "" ] || ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ) ; then
+        echo "ERROR: Current '$repo' path doesn't have any of the following:"
+        echo "Usage with distribution set via option '-d' and packages: $0 [path with *.deb *.dsc *.tar.*z files]"
+        echo "Usage with repositories: $0 [path to repository with '$debdir' subdirectory]"
+        exit 1
+    fi
+
+    # temporary lock the publication to the repository
+    ws=${ws_prefix}_${branch}_${os}
+    ws_lockfile=${ws}.lock
+    create_lockfile $ws_lockfile
+
+    # create temporary workspace with repository copy
+    $rm_dir $ws
+    $mk_dir $ws
+
+    # script works in one of 2 modes:
+    # - path with binaries packages for a single distribution
+    # - path with 'pool' directory with APT repository
+    if [ "$DIST" != "" ] && ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ; then
+        # copy single distribution with binaries packages
+        repopath=$ws/pool/${DIST}/main/$proddir/$product
+        $mk_dir ${repopath}
+        cp $repo/*.deb $repo/*.dsc $repo/*.tar.*z $repopath/.
+    elif [ -d $repo/$debdir ]; then
+        # copy 'pool' directory with APT repository
+        cp -rf $repo/$debdir $ws/.
+    else
+        echo "ERROR: neither distribution option '-d' with files $repo/*.deb $repo/*.dsc $repo/*.tar.*z set nor '$repo/$debdir' path found"
+        usage
+        $rm_file $ws_lockfile
+        exit 1
+    fi
+    cd $ws
+
+    # create the configuration file for 'reprepro' tool
+    confpath=$ws/conf
+    $rm_dir $confpath
+    $mk_dir $confpath
+
+    for dist in $alldists ; do
+        cat <<EOF >>$confpath/distributions
+Origin: Tarantool
+Label: tarantool.org
+Suite: stable
+Codename: $dist
+Architectures: amd64 source
+Components: main
+Description: Unofficial Ubuntu Packages maintained by Tarantool
+SignWith: 91B625E5
+DebIndices: Packages Release . .gz .bz2
+UDebIndices: Packages . .gz .bz2
+DscIndices: Sources Release .gz .bz2
+
+EOF
+done
+
+    # create standalone repository with separate components
+    for dist in $alldists ; do
+        echo =================== DISTRIBUTION: $dist =========================
+        updated_deb=0
+        updated_dsc=0
+
+        # 1(binaries). use reprepro tool to generate Packages file
+        for deb in $ws/$debdir/$dist/$component/*/*/*.deb ; do
+            [ -f $deb ] || continue
+            locdeb=`echo $deb | sed "s#^$ws\/##g"`
+            echo "DEB: $deb"
+            # register DEB file to Packages file
+            reprepro -Vb . includedeb $dist $deb
+            # reprepro copied the DEB file to the local component, which is not needed
+            $rm_dir $debdir/$component
+            # drop reprepro's own registry so it doesn't track the DEB file itself
+            $rm_dir db
+            # save the Packages file so it is not removed by the new DEB version
+            for packages in dists/$dist/$component/binary-*/Packages ; do
+                if [ ! -f $packages.saved ] ; then
+                    # get the latest Packages file from S3
+                    $aws s3 ls "$s3/$packages" 2>/dev/null && \
+                        $aws s3 cp --acl public-read \
+                        "$s3/$packages" $packages.saved || \
+                        touch $packages.saved
+                fi
+                # check if the DEB file already exists in Packages from S3
+                if grep "^`grep "^SHA256: " $packages`$" $packages.saved ; then
+                    echo "WARNING: DEB file already registered in S3!"
+                    continue
+                fi
+                # store the new DEB entry
+                cat $packages >>$packages.saved
+                # save the registered DEB file to S3
+                $aws s3 cp --acl public-read $deb $s3/$locdeb
+                updated_deb=1
+            done
+        done
+
+        # 1(sources). use reprepro tool to generate Sources file
+        for dsc in $ws/$debdir/$dist/$component/*/*/*.dsc ; do
+            [ -f $dsc ] || continue
+            locdsc=`echo $dsc | sed "s#^$ws\/##g"`
+            echo "DSC: $dsc"
+            # register DSC file to Sources file
+            reprepro -Vb . includedsc $dist $dsc
+            # reprepro copied the DSC file to the component, which is not needed
+            $rm_dir $debdir/$component
+            # drop reprepro's own registry so it doesn't track the DSC file itself
+            $rm_dir db
+            # save the Sources file so it is not removed by the new DSC version
+            sources=dists/$dist/$component/source/Sources
+            if [ ! -f $sources.saved ] ; then
+                # get the latest Sources file from S3
+                $aws s3 ls "$s3/$sources" && \
+                    $aws s3 cp --acl public-read "$s3/$sources" $sources.saved || \
+                    touch $sources.saved
+            fi
+            # WORKAROUND: unknown why, but reprepro doesn't save the Sources file
+            gunzip -c $sources.gz >$sources
+            # check if the DSC file already exists in Sources from S3
+            hash=`grep '^Checksums-Sha256:' -A3 $sources | \
+                tail -n 1 | awk '{print $1}'`
+            if grep " $hash .*\.dsc$" $sources.saved ; then
+                echo "WARNING: DSC file already registered in S3!"
+                continue
+            fi
+            # store the new DSC entry
+            cat $sources >>$sources.saved
+            # save the registered DSC file to S3
+            $aws s3 cp --acl public-read $dsc $s3/$locdsc
+            tarxz=`echo $locdsc | sed 's#\.dsc$#.debian.tar.xz#g'`
+            $aws s3 cp --acl public-read $ws/$tarxz "$s3/$tarxz"
+            orig=`echo $locdsc | sed 's#-1\.dsc$#.orig.tar.xz#g'`
+            $aws s3 cp --acl public-read $ws/$orig "$s3/$orig"
+            updated_dsc=1
+        done
+
+        # check if any DEB/DSC files were newly registered
+        [ "$updated_deb" == "0" -a "$updated_dsc" == "0" ] && \
+            continue || echo "Updating dists"
+
+        # finalize the Packages file
+        for packages in dists/$dist/$component/binary-*/Packages ; do
+            mv $packages.saved $packages
+        done
+
+        # 2(binaries). update Packages file archives
+        for packpath in dists/$dist/$component/binary-* ; do
+            pushd $packpath
+            sed "s#Filename: $debdir/$component/#Filename: $debdir/$dist/$component/#g" -i Packages
+            bzip2 -c Packages >Packages.bz2
+            gzip -c Packages >Packages.gz
+            popd
+        done
+
+        # 2(sources). update Sources file archives
+        pushd dists/$dist/$component/source
+        sed "s#Directory: $debdir/$component/#Directory: $debdir/$dist/$component/#g" -i Sources
+        bzip2 -c Sources >Sources.bz2
+        gzip -c Sources >Sources.gz
+        popd
+
+        # 3. update the checksum entries of the Packages* files in *Release files
+        # NOTE: the *Release files have a stable structure, with the checksum
+        #       entries laid out in the following way:
+        # MD5Sum:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        # SHA1:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        # SHA256:
+        #  <checksum> <size> <file orig>
+        #  <checksum> <size> <file debian>
+        #       The script below puts the 'md5' value at the 1st found file
+        #       entry, 'sha1' at the 2nd, and 'sha256' at the 3rd
+        pushd dists/$dist
+        for file in `grep " $component/" Release | awk '{print $3}' | sort -u` ; do
+            sz=`stat -c "%s" $file`
+            md5=`md5sum $file | awk '{print $1}'`
+            sha1=`sha1sum $file | awk '{print $1}'`
+            sha256=`sha256sum $file | awk '{print $1}'`
+            awk 'BEGIN{c = 0} ; {
+                if ($3 == p) {
+                    c = c + 1
+                    if (c == 1) {print " " md  " " s " " p}
+                    if (c == 2) {print " " sh1 " " s " " p}
+                    if (c == 3) {print " " sh2 " " s " " p}
+                } else {print $0}
+            }' p="$file" s="$sz" md="$md5" sh1="$sha1" sh2="$sha256" \
+            Release >Release.new
+            mv Release.new Release
+        done
+        # resign the selfsigned InRelease file
+        $rm_file InRelease
+        gpg --clearsign -o InRelease Release
+        # resign the Release file
+        $rm_file Release.gpg
+        gpg -abs -o Release.gpg Release
+        popd
+
+        # 4. sync the latest distribution path changes to S3
+        $aws s3 sync --acl public-read dists/$dist "$s3/dists/$dist"
+    done
+
+    # unlock the publishing
+    $rm_file $ws_lockfile
+}
+
+# The 'pack_rpm' function is especially created for RPM packages. It works
+# with RPM-packing OSes like CentOS and Fedora. It is based on the widely known
+# tool 'createrepo' from:
+#     https://linux.die.net/man/8/createrepo
+# This tool works with single distribution of the given OS.
+# Result of the routine is the rpm package for YUM repository with
+# file structure equal to the Centos/Fedora:
+#     http://mirror.centos.org/centos/7/os/x86_64/Packages/
+#     http://mirrors.kernel.org/fedora/releases/30/Everything/x86_64/os/Packages/t/
+function pack_rpm {
+    if ! ls $repo/*.rpm >/dev/null 2>&1 ; then
+        echo "ERROR: Current '$repo' has no *.rpm files, it contains:"
+        ls -al $repo
+        echo "Usage: $0 [path with *.rpm files]"
+        exit 1
+    fi
+
+    # temporary lock the publication to the repository
+    ws=${ws_prefix}_${branch}_${os}_${DIST}
+    ws_lockfile=${ws}.lock
+    create_lockfile $ws_lockfile
+
+    # create temporary workspace with packages copies
+    $rm_dir $ws
+    $mk_dir $ws
+    cp $repo/*.rpm $ws/.
+    cd $ws
+
+    # set the paths
+    if [ "$os" == "centos" ]; then
+        repopath=$DIST/os/x86_64
+        rpmpath=Packages
+    elif [ "$os" == "fedora" ]; then
+        repopath=releases/$DIST/Everything/x86_64/os
+        rpmpath=Packages/$proddir
+    fi
+    packpath=$repopath/$rpmpath
+
+    # prepare local repository with packages
+    $mk_dir $packpath
+    mv *.rpm $packpath/.
+    cd $repopath
+
+    # copy the current metadata files from S3
+    mkdir repodata.base
+    for file in `$aws s3 ls $s3/$repopath/repodata/ | awk '{print $NF}'` ; do
+        $aws s3 ls $s3/$repopath/repodata/$file || continue
+        $aws s3 cp $s3/$repopath/repodata/$file repodata.base/$file
+    done
+
+    # create the new repository metadata files
+    createrepo --no-database --update --workers=2 --compress-type=gz --simple-md-filenames .
+    mv repodata repodata.adding
+
+    # merge metadata files
+    mkdir repodata
+    head -n 2 repodata.adding/repomd.xml >repodata/repomd.xml
+    for file in filelists.xml other.xml primary.xml ; do
+        # 1. take only the 1st line, skipping the line with the packages number
+        zcat repodata.adding/$file.gz | head -n 1 >repodata/$file
+        # 2. take the 2nd line with the metadata tag and update the packages number in it
+        packsold=0
+        if [ -f repodata.base/$file.gz ] ; then
+            packsold=`zcat repodata.base/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
+        fi
+        packsnew=`zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
+        packs=$(($packsold+$packsnew))
+        zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file
+        # 3. take only 'package' tags from new file
+        zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
+        # 4. take only 'package' tags from old file if exists
+        if [ -f repodata.base/$file.gz ] ; then
+            zcat repodata.base/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
+        fi
+        # 5. take the last closing line with metadata tag
+        zcat repodata.adding/$file.gz | tail -n 1 >>repodata/$file
+
+        # get the new data
+        chsnew=`sha256sum repodata/$file | awk '{print $1}'`
+        sz=`stat --printf="%s" repodata/$file`
+        gzip repodata/$file
+        chsgznew=`sha256sum repodata/$file.gz | awk '{print $1}'`
+        szgz=`stat --printf="%s" repodata/$file.gz`
+        timestamp=`date +%s -r repodata/$file.gz`
+
+        # add info to repomd.xml file
+        name=`echo $file | sed 's#\.xml$##g'`
+        cat <<EOF >>repodata/repomd.xml
+<data type="$name">
+  <checksum type="sha256">$chsgznew</checksum>
+  <open-checksum type="sha256">$chsnew</open-checksum>
+  <location href="repodata/$file.gz"/>
+  <timestamp>$timestamp</timestamp>
+  <size>$szgz</size>
+  <open-size>$sz</open-size>
+</data>
+EOF
+    done
+    tail -n 1 repodata.adding/repomd.xml >>repodata/repomd.xml
+    gpg --detach-sign --armor repodata/repomd.xml
+
+    # copy the packages to S3
+    for file in $rpmpath/*.rpm ; do
+        $aws s3 cp --acl public-read $file "$s3/$repopath/$file"
+    done
+
+    # update the metadata at the S3
+    $aws s3 sync --acl public-read repodata "$s3/$repopath/repodata"
+
+    # unlock the publishing
+    $rm_file $ws_lockfile
+}
+
+if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
+    pack_deb
+elif [ "$os" == "centos" -o "$os" == "fedora" ]; then
+    pack_rpm
+else
+    echo "ERROR: given OS '$os' is not supported, use one of the list: $alloss"
+    usage
+    exit 1
+fi
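The packages="N" counter arithmetic done in pack_rpm above when merging the old and the freshly created repodata files can be sketched as follows (a hypothetical helper: 'merged_pack_count' is not part of the script, and it assumes well-formed metadata XML with the counter on the 2nd line):

```shell
# Sketch of the package-count merge: each repodata *.xml file carries
# packages="N" on its 2nd line; the merged file must carry the sum of
# the old and the new counts.
merged_pack_count() {
    old=$1
    new=$2
    n_old=0
    # the old metadata may not exist yet on the first publication
    if [ -f "$old" ]; then
        n_old=$(sed -n '2s#.*packages="\([0-9]*\)".*#\1#p' "$old")
    fi
    n_new=$(sed -n '2s#.*packages="\([0-9]*\)".*#\1#p' "$new")
    echo $(( ${n_old:-0} + ${n_new:-0} ))
}
```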
-- 
2.17.1


* [Tarantool-patches] [PATCH v4 2/3] Testing changes after 2nd review on S3
  2020-01-13 12:04 [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 1/3] gitlab-ci: implement packing into MCS S3 Alexander V. Tikhonov
@ 2020-01-13 12:04 ` Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 3/3] Testing changes after 3rd " Alexander V. Tikhonov
  2020-01-13 17:02 ` [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Igor Munkin
  3 siblings, 0 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-01-13 12:04 UTC (permalink / raw)
  To: Igor Munkin; +Cc: tarantool-patches

---
 tools/add_pack_s3_repo.sh | 381 +++++++++++++++++++++-----------------
 1 file changed, 209 insertions(+), 172 deletions(-)

diff --git a/tools/add_pack_s3_repo.sh b/tools/add_pack_s3_repo.sh
index 2316b9015..280a07ab0 100755
--- a/tools/add_pack_s3_repo.sh
+++ b/tools/add_pack_s3_repo.sh
@@ -4,8 +4,12 @@ set -e
 rm_file='rm -f'
 rm_dir='rm -rf'
 mk_dir='mkdir -p'
+ws_prefix=/tmp/tarantool_repo_s3
 
-alloss='ubuntu debian centos el fedora'
+alloss='ubuntu debian el fedora'
+product=tarantool
+# the path with binaries either repository
+repo=.
 
 get_os_dists()
 {
@@ -16,7 +20,7 @@ get_os_dists()
         alldists='trusty xenial cosmic disco bionic eoan'
     elif [ "$os" == "debian" ]; then
         alldists='jessie stretch buster'
-    elif [ "$os" == "centos" -o "$os" == "el" ]; then
+    elif [ "$os" == "el" ]; then
         alldists='6 7 8'
     elif [ "$os" == "fedora" ]; then
         alldists='27 28 29 30 31'
@@ -25,16 +29,45 @@ get_os_dists()
     echo "$alldists"
 }
 
-ws_prefix=/tmp/tarantool_repo_s3
-create_lockfile()
+prepare_ws()
 {
-    lockfile -l 1000 $1
+    # temporary lock the publication to the repository
+    ws=${ws_prefix}_${branch}_${os}
+    if [ "$os" != "ubuntu" -a "$os" != "debian" ]; then
+        ws=${ws}_${option_dist}
+    fi
+    ws_lockfile=${ws}.lock
+    if [ -f $ws_lockfile ]; then
+        old_proc=$(cat $ws_lockfile)
+    fi
+    lockfile -l 60 $ws_lockfile
+    chmod u+w $ws_lockfile && echo $$ >$ws_lockfile && chmod u-w $ws_lockfile
+    if [ "$old_proc" != ""  -a "$old_proc" != "0" ]; then
+        kill -9 $old_proc >/dev/null 2>&1 || true
+    fi
+
+    # create temporary workspace with repository copy
+    $rm_dir $ws
+    $mk_dir $ws
 }
 
 usage()
 {
     cat <<EOF
-Usage: $0 -b <S3 bucket> -o <OS name> -d <OS distribution> [-p <product>] <path>
+Usage for store package binaries from the given path:
+    $0 -b=<S3 bucket> -o=<OS name> -d=<OS distribution> [-p=<product>] <path to package binaries>
+
+Usage for mirroring Debian|Ubuntu OS repositories:
+    $0 -b=<S3 bucket> -o=<OS name> [-p=<product>] <path to 'pool' subdirectory with packages repository>
+
+Arguments:
+    <path>
+         Path points to the directory with deb/rpm packages to be used.
+         Script can be used in one of 2 modes:
+          - path with binaries packages for a single distribution
+          - path with 'pool' subdirectory with APT repository (only: debian|ubuntu), like:
+                /var/spool/apt-mirror/mirror/packagecloud.io/tarantool/$branch/$os
+
 Options:
     -b|--bucket
         MCS S3 bucket which will be used for storing the packages.
@@ -44,24 +77,19 @@ Options:
             2.2: 2_2
             1.10: 1_10
     -o|--os
-        OS to be checked, one of the list (NOTE: centos == el):
+        OS to be checked, one of the list:
             $alloss
     -d|--distribution
         Distribution appropriate to the given OS:
 EOF
     for os in $alloss ; do
-        echo "            $os: <"`get_os_dists $os`">"
+        echo "            $os: <"$(get_os_dists $os)">"
     done
     cat <<EOF
     -p|--product
          Product name to be packed with, default name is 'tarantool'
     -h|--help
          Usage help message
-    <path>
-         Path points to the directory with deb/rpm packages to be used.
-         Script can be used in one of 2 modes:
-          - path with binaries packages for a single distribution
-	  - path with 'pool' directory with APT repository (only: debian|ubuntu)
 EOF
 }
 
@@ -70,7 +98,7 @@ do
 case $i in
     -b=*|--bucket=*)
     branch="${i#*=}"
-    if [ "$branch" != "2x" -a "$branch" != "2_3" -a "$branch" != "2_2" -a "$branch" != "1_10" ]; then
+    if echo "$branch" | grep -qvP '^(1_10|2(x|_[2-4]))$' ; then
         echo "ERROR: bucket '$branch' is not supported"
         usage
         exit 1
@@ -79,9 +107,6 @@ case $i in
     ;;
     -o=*|--os=*)
     os="${i#*=}"
-    if [ "$os" == "el" ]; then
-        os=centos
-    fi
     if ! echo $alloss | grep -F -q -w $os ; then
         echo "ERROR: OS '$os' is not supported"
         usage
@@ -90,7 +115,7 @@ case $i in
     shift # past argument=value
     ;;
     -d=*|--distribution=*)
-    DIST="${i#*=}"
+    option_dist="${i#*=}"
     shift # past argument=value
     ;;
     -p=*|--product=*)
@@ -120,27 +145,68 @@ if [ "$os" == "" ]; then
     usage
     exit 1
 fi
-alldists=`get_os_dists $os`
-if [ -n "$DIST" ] && ! echo $alldists | grep -F -q -w $DIST ; then
-    echo "ERROR: distribution '$DIST' set in options not found in supported list '$alldists'"
+alldists=$(get_os_dists $os)
+if [ -n "$option_dist" ] && ! echo $alldists | grep -F -q -w $option_dist ; then
+    echo "ERROR: distribution '$option_dist' set in options not found in supported list '$alldists'"
     usage
     exit 1
 fi
 
-# set the product name
-product=$product
-if [ "$product" == "" ]; then
-    product=tarantool
-fi
-proddir=`echo $product | head -c 1`
+# set the subpath with binaries based on the first character of the product name
+proddir=$(echo $product | head -c 1)
 
-# set the path with binaries
-if [ "$repo" == "" ]; then
-    repo=.
-fi
-
-aws='aws --endpoint-url https://hb.bizmrg.com'
+# AWS defines
+aws='aws --endpoint-url https://hb.bizmrg.com s3'
 s3="s3://tarantool_repo/$branch/$os"
+aws_cp_to_s3="$aws cp --acl public-read"
+aws_sync_to_s3="$aws sync --acl public-read"
+
+update_packfile()
+{
+    packfile=$1
+    packtype=$2
+
+    locpackfile=$(echo $packfile | sed "s#^$ws/##g")
+    # register DEB/DSC pack file to Packages/Sources file
+    reprepro -Vb . include$packtype $loop_dist $packfile
+    # reprepro copied the DEB/DSC file to the component, which is not needed
+    $rm_dir $debdir/$component
+    # drop reprepro's own registry so it doesn't track the DEB/DSC file itself
+    $rm_dir db
+}
+
+update_metadata()
+{
+    packpath=$1
+    packtype=$2
+
+    if [ ! -f $packpath.saved ] ; then
+        # get the latest metadata file from S3
+        $aws ls "$s3/$packpath" 2>/dev/null && \
+            $aws cp "$s3/$packpath" $packpath.saved || \
+            touch $packpath.saved
+    fi
+
+    if [ "$packtype" == "dsc" ]; then
+        # WORKAROUND: unknown why, but reprepro doesn't save the Sources file
+        gunzip -c $packpath.gz >$packpath
+        # check if the DSC file already exists in Sources from S3
+        hash=$(grep '^Checksums-Sha256:' -A3 $packpath | \
+            tail -n 1 | awk '{print $1}')
+        if grep " $hash .*\.dsc$" $packpath.saved ; then
+            echo "WARNING: DSC file already registered in S3!"
+            return 1
+        fi
+    elif [ "$packtype" == "deb" ]; then
+        # check if the DEB file already exists in Packages from S3
+        if grep "^$(grep '^SHA256: ' $packages)$" $packages.saved ; then
+            echo "WARNING: DEB file already registered in S3!"
+            return 1
+        fi
+    fi
+    # store the new DEB entry
+    cat $packpath >>$packpath.saved
+}
 
 # The 'pack_deb' function especially created for DEB packages. It works
 # with DEB packing OS like Ubuntu, Debian. It is based on globally known
@@ -158,32 +224,27 @@ function pack_deb {
     # debian has special directory 'pool' for packages
     debdir=pool
 
-    # get packages from pointed location either mirror path
-    if [ "$repo" == "" ] ; then
-        repo=/var/spool/apt-mirror/mirror/packagecloud.io/tarantool/$branch/$os
-    fi
-    if [ ! -d $repo/$debdir ] && ( [ "$DIST" == "" ] || ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ) ; then
+    # get packages from pointed location
+    if [ ! -d $repo/$debdir ] && \
+        ( [ "$option_dist" == "" ] || \
+            ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ) ; then
         echo "ERROR: Current '$repo' path doesn't have any of the following:"
-        echo "Usage with set distribuition with option '-d' and packages: $0 [path with *.deb *.dsc *.tar.*z files]"
-        echo "Usage with repositories: $0 [path to repository with '$debdir' subdirectory]"
+        echo " - $0 run option '-d' and DEB packages in path"
+        echo " - 'pool' subdirectory with APT repositories"
+        usage
         exit 1
     fi
 
-    # temporary lock the publication to the repository
-    ws=${ws_prefix}_${branch}_${os}
-    ws_lockfile=${ws}.lock
-    create_lockfile $ws_lockfile
-
-    # create temporary workspace with repository copy
-    $rm_dir $ws
-    $mk_dir $ws
+    # prepare the workspace
+    prepare_ws
 
     # script works in one of 2 modes:
     # - path with binaries packages for a single distribution
     # - path with 'pool' directory with APT repository
-    if [ "$DIST" != "" ] && ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ; then
+    if [ "$option_dist" != "" ] && \
+            ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null 2>&1 ; then
         # copy single distribution with binaries packages
-        repopath=$ws/pool/${DIST}/main/$proddir/$product
+        repopath=$ws/pool/${option_dist}/$component/$proddir/$product
         $mk_dir ${repopath}
         cp $repo/*.deb $repo/*.dsc $repo/*.tar.*z $repopath/.
     elif [ -d $repo/$debdir ]; then
@@ -191,10 +252,12 @@ function pack_deb {
         cp -rf $repo/$debdir $ws/.
     else
         echo "ERROR: neither distribution option '-d' with files $repo/*.deb $repo/*.dsc $repo/*.tar.*z set nor '$repo/$debdir' path found"
-	usage
-	$rm_file $wslock
-	exit 1
+        usage
+        $rm_file $wslock
+        exit 1
     fi
+
+    swd=$PWD
     cd $ws
 
     # create the configuration file for 'reprepro' tool
@@ -202,14 +265,14 @@ function pack_deb {
     $rm_dir $confpath
     $mk_dir $confpath
 
-    for dist in $alldists ; do
+    for loop_dist in $alldists ; do
         cat <<EOF >>$confpath/distributions
 Origin: Tarantool
 Label: tarantool.org
 Suite: stable
-Codename: $dist
+Codename: $loop_dist
 Architectures: amd64 source
-Components: main
+Components: $component
 Description: Unofficial Ubuntu Packages maintained by Tarantool
 SignWith: 91B625E5
 DebIndices: Packages Release . .gz .bz2
@@ -217,84 +280,49 @@ UDebIndices: Packages . .gz .bz2
 DscIndices: Sources Release .gz .bz2
 
 EOF
-done
+    done
 
     # create standalone repository with separate components
-    for dist in $alldists ; do
-        echo =================== DISTRIBUTION: $dist =========================
+    for loop_dist in $alldists ; do
+        echo ================ DISTRIBUTION: $loop_dist ====================
         updated_deb=0
         updated_dsc=0
 
         # 1(binaries). use reprepro tool to generate Packages file
-        for deb in $ws/$debdir/$dist/$component/*/*/*.deb ; do
+        for deb in $ws/$debdir/$loop_dist/$component/*/*/*.deb ; do
             [ -f $deb ] || continue
-            locdeb=`echo $deb | sed "s#^$ws\/##g"`
-            echo "DEB: $deb"
-            # register DEB file to Packages file
-            reprepro -Vb . includedeb $dist $deb
-            # reprepro copied DEB file to local component which is not needed
-            $rm_dir $debdir/$component
-            # to have all packages avoid reprepro set DEB file to its own registry
-            $rm_dir db
-            # copy Packages file to avoid of removing by the new DEB version
-            for packages in dists/$dist/$component/binary-*/Packages ; do
-                if [ ! -f $packages.saved ] ; then
-                    # get the latest Packages file from S3
-                    $aws s3 ls "$s3/$packages" 2>/dev/null && \
-                        $aws s3 cp --acl public-read \
-                        "$s3/$packages" $packages.saved || \
-                        touch $packages.saved
-                fi
-                # check if the DEB file already exists in Packages from S3
-                if grep "^`grep "^SHA256: " $packages`$" $packages.saved ; then
-                    echo "WARNING: DEB file already registered in S3!"
-                    continue
-                fi
-                # store the new DEB entry
-                cat $packages >>$packages.saved
-                # save the registered DEB file to S3
-                $aws s3 cp --acl public-read $deb $s3/$locdeb
+            # regenerate DEB pack
+            update_packfile $deb deb
+            echo "Regenerated DEB file: $locpackfile"
+            for packages in dists/$loop_dist/$component/binary-*/Packages ; do
+                # copy Packages file to avoid of removing by the new DEB version
+                # update metadata 'Packages' files
+                if ! update_metadata $packages deb ; then continue ; fi
                 updated_deb=1
             done
+            # save the registered DEB file to S3
+            if [ "$updated_deb" == 1 ]; then
+                $aws_cp_to_s3 $deb $s3/$locpackfile
+            fi
         done
 
         # 1(sources). use reprepro tool to generate Sources file
-        for dsc in $ws/$debdir/$dist/$component/*/*/*.dsc ; do
+        for dsc in $ws/$debdir/$loop_dist/$component/*/*/*.dsc ; do
             [ -f $dsc ] || continue
-            locdsc=`echo $dsc | sed "s#^$ws\/##g"`
-            echo "DSC: $dsc"
-            # register DSC file to Sources file
-            reprepro -Vb . includedsc $dist $dsc
-            # reprepro copied DSC file to component which is not needed
-            $rm_dir $debdir/$component
-            # to have all sources avoid reprepro set DSC file to its own registry
-            $rm_dir db
+            # regenerate DSC pack
+            update_packfile $dsc dsc
+            echo "Regenerated DSC file: $locpackfile"
             # copy Sources file to avoid of removing by the new DSC version
-            sources=dists/$dist/$component/source/Sources
-            if [ ! -f $sources.saved ] ; then
-                # get the latest Sources file from S3
-                $aws s3 ls "$s3/$sources" && \
-                    $aws s3 cp --acl public-read "$s3/$sources" $sources.saved || \
-                    touch $sources.saved
-            fi
-            # WORKAROUND: unknown why, but reprepro doesn`t save the Sources file
-            gunzip -c $sources.gz >$sources
-            # check if the DSC file already exists in Sources from S3
-            hash=`grep '^Checksums-Sha256:' -A3 $sources | \
-                tail -n 1 | awk '{print $1}'`
-            if grep " $hash .*\.dsc$" $sources.saved ; then
-                echo "WARNING: DSC file already registered in S3!"
-                continue
-            fi
-            # store the new DSC entry
-            cat $sources >>$sources.saved
-            # save the registered DSC file to S3
-            $aws s3 cp --acl public-read $dsc $s3/$locdsc
-            tarxz=`echo $locdsc | sed 's#\.dsc$#.debian.tar.xz#g'`
-            $aws s3 cp --acl public-read $ws/$tarxz "$s3/$tarxz"
-            orig=`echo $locdsc | sed 's#-1\.dsc$#.orig.tar.xz#g'`
-            $aws s3 cp --acl public-read $ws/$orig "$s3/$orig"
+            # update metadata 'Sources' file
+            if ! update_metadata dists/$loop_dist/$component/source/Sources dsc \
+                ; then continue ; fi
             updated_dsc=1
+            # save the registered DSC file to S3
+            $aws_cp_to_s3 $dsc $s3/$locpackfile
+            tarxz=$(echo $locpackfile | sed 's#\.dsc$#.debian.tar.xz#g')
+            $aws_cp_to_s3 $ws/$tarxz "$s3/$tarxz"
+            orig=$(echo $locpackfile | sed 's#-1\.dsc$#.orig.tar.xz#g')
+            $aws_cp_to_s3 $ws/$orig "$s3/$orig"
         done
 
         # check if any DEB/DSC files were newly registered
@@ -302,30 +330,30 @@ done
             continue || echo "Updating dists"
 
         # finalize the Packages file
-        for packages in dists/$dist/$component/binary-*/Packages ; do
+        for packages in dists/$loop_dist/$component/binary-*/Packages ; do
             mv $packages.saved $packages
         done
 
         # 2(binaries). update Packages file archives
-        for packpath in dists/$dist/$component/binary-* ; do
+        for packpath in dists/$loop_dist/$component/binary-* ; do
             pushd $packpath
-            sed "s#Filename: $debdir/$component/#Filename: $debdir/$dist/$component/#g" -i Packages
+            sed "s#Filename: $debdir/$component/#Filename: $debdir/$loop_dist/$component/#g" -i Packages
             bzip2 -c Packages >Packages.bz2
             gzip -c Packages >Packages.gz
             popd
         done
 
         # 2(sources). update Sources file archives
-        pushd dists/$dist/$component/source
-        sed "s#Directory: $debdir/$component/#Directory: $debdir/$dist/$component/#g" -i Sources
+        pushd dists/$loop_dist/$component/source
+        sed "s#Directory: $debdir/$component/#Directory: $debdir/$loop_dist/$component/#g" -i Sources
         bzip2 -c Sources >Sources.bz2
         gzip -c Sources >Sources.gz
         popd
 
         # 3. update checksums entries of the Packages* files in *Release files
-	# NOTE: it is stable structure of the *Release files when the checksum
-	#       entries in it in the following way:
-	# MD5Sum:
+        # NOTE: it is stable structure of the *Release files when the checksum
+        #       entries in it in the following way:
+        # MD5Sum:
         #  <checksum> <size> <file orig>
         #  <checksum> <size> <file debian>
         # SHA1:
@@ -334,14 +362,14 @@ done
         # SHA256:
         #  <checksum> <size> <file orig>
         #  <checksum> <size> <file debian>
-	#       The script bellow puts 'md5' value at the 1st found file entry,
-	#       'sha1' - at the 2nd and 'sha256' at the 3rd
-        pushd dists/$dist
-        for file in `grep " $component/" Release | awk '{print $3}' | sort -u` ; do
-            sz=`stat -c "%s" $file`
-            md5=`md5sum $file | awk '{print $1}'`
-            sha1=`sha1sum $file | awk '{print $1}'`
-            sha256=`sha256sum $file | awk '{print $1}'`
+        #       The script below puts 'md5' value at the 1st found file entry,
+        #       'sha1' - at the 2nd and 'sha256' at the 3rd
+        pushd dists/$loop_dist
+        for file in $(grep " $component/" Release | awk '{print $3}' | sort -u) ; do
+            sz=$(stat -c "%s" $file)
+            md5=$(md5sum $file | awk '{print $1}')
+            sha1=$(sha1sum $file | awk '{print $1}')
+            sha256=$(sha256sum $file | awk '{print $1}')
             awk 'BEGIN{c = 0} ; {
                 if ($3 == p) {
                     c = c + 1
@@ -350,7 +378,7 @@ done
                     if (c == 3) {print " " sh2 " " s " " p}
                 } else {print $0}
             }' p="$file" s="$sz" md="$md5" sh1="$sha1" sh2="$sha256" \
-            Release >Release.new
+                    Release >Release.new
             mv Release.new Release
         done
         # resign the selfsigned InRelease file
@@ -362,11 +390,13 @@ done
         popd
 
         # 4. sync the latest distribution path changes to S3
-        $aws s3 sync --acl public-read dists/$dist "$s3/dists/$dist"
+        $aws_sync_to_s3 dists/$loop_dist "$s3/dists/$loop_dist"
     done
 
     # unlock the publishing
     $rm_file $ws_lockfile
+
+    cd $swd
 }
 
 # The 'pack_rpm' function especially created for RPM packages. It works
@@ -380,29 +410,26 @@ done
 #     http://mirrors.kernel.org/fedora/releases/30/Everything/x86_64/os/Packages/t/
 function pack_rpm {
     if ! ls $repo/*.rpm >/dev/null 2>&1 ; then
-        echo "ERROR: Current '$repo' has:"
-        ls -al $repo
-        echo "Usage: $0 [path with *.rpm files]"
+        echo "ERROR: Current '$repo' path doesn't have RPM packages in path"
+        usage
         exit 1
     fi
 
-    # temporary lock the publication to the repository
-    ws=${prefix_lockfile}_${branch}_${os}_${DIST}
-    ws_lockfile=${ws}.lock
-    create_lockfile $ws_lockfile
+    # prepare the workspace
+    prepare_ws
 
-    # create temporary workspace with packages copies
-    $rm_dir $ws
-    $mk_dir $ws
+    # copy the needed package binaries to the workspace
     cp $repo/*.rpm $ws/.
+
+    swd=$PWD
     cd $ws
 
     # set the paths
-    if [ "$os" == "centos" ]; then
-        repopath=$DIST/os/x86_64
+    if [ "$os" == "el" ]; then
+        repopath=$option_dist/os/x86_64
         rpmpath=Packages
     elif [ "$os" == "fedora" ]; then
-        repopath=releases/$DIST/Everything/x86_64/os
+        repopath=releases/$option_dist/Everything/x86_64/os
         rpmpath=Packages/$proddir
     fi
     packpath=$repopath/$rpmpath
@@ -410,53 +437,61 @@ function pack_rpm {
     # prepare local repository with packages
     $mk_dir $packpath
     mv *.rpm $packpath/.
-    cd $repopath
+    cd $ws/$repopath
 
     # copy the current metadata files from S3
     mkdir repodata.base
-    for file in `$aws s3 ls $s3/$repopath/repodata/ | awk '{print $NF}'` ; do
-        $aws s3 ls $s3/$repopath/repodata/$file || continue
-        $aws s3 cp $s3/$repopath/repodata/$file repodata.base/$file
+    for file in $($aws ls $s3/$repopath/repodata/ | awk '{print $NF}') ; do
+        $aws ls $s3/$repopath/repodata/$file || continue
+        $aws cp $s3/$repopath/repodata/$file repodata.base/$file
     done
 
     # create the new repository metadata files
-    createrepo --no-database --update --workers=2 --compress-type=gz --simple-md-filenames .
+    createrepo --no-database --update --workers=2 \
+        --compress-type=gz --simple-md-filenames .
     mv repodata repodata.adding
 
     # merge metadata files
     mkdir repodata
     head -n 2 repodata.adding/repomd.xml >repodata/repomd.xml
     for file in filelists.xml other.xml primary.xml ; do
-        # 1. take the 1st line only - to skip the line with number of packages which is not needed
+        # 1. take the 1st line only - to skip the line with
+        #    number of packages which is not needed
         zcat repodata.adding/$file.gz | head -n 1 >repodata/$file
-        # 2. take 2nd line with metadata tag and update the packages number in it
+        # 2. take 2nd line with metadata tag and update
+        #    the packages number in it
         packsold=0
         if [ -f repodata.base/$file.gz ] ; then
-        packsold=`zcat repodata.base/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
+            packsold=$(zcat repodata.base/$file.gz | head -n 2 | \
+                tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')
         fi
-        packsnew=`zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
+        packsnew=$(zcat repodata.adding/$file.gz | head -n 2 | \
+            tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g')
         packs=$(($packsold+$packsnew))
-        zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file
+        zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | \
+            sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file
         # 3. take only 'package' tags from new file
-        zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
+        zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 \
+            >>repodata/$file
         # 4. take only 'package' tags from old file if exists
         if [ -f repodata.base/$file.gz ] ; then
-            zcat repodata.base/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
+            zcat repodata.base/$file.gz | tail -n +3 | head -n -1 \
+                >>repodata/$file
         fi
         # 5. take the last closing line with metadata tag
         zcat repodata.adding/$file.gz | tail -n 1 >>repodata/$file
 
         # get the new data
-        chsnew=`sha256sum repodata/$file | awk '{print $1}'`
-        sz=`stat --printf="%s" repodata/$file`
+        chsnew=$(sha256sum repodata/$file | awk '{print $1}')
+        sz=$(stat --printf="%s" repodata/$file)
         gzip repodata/$file
-        chsgznew=`sha256sum repodata/$file.gz | awk '{print $1}'`
-        szgz=`stat --printf="%s" repodata/$file.gz`
-        timestamp=`date +%s -r repodata/$file.gz`
+        chsgznew=$(sha256sum repodata/$file.gz | awk '{print $1}')
+        szgz=$(stat --printf="%s" repodata/$file.gz)
+        timestamp=$(date +%s -r repodata/$file.gz)
 
         # add info to repomd.xml file
-        name=`echo $file | sed 's#\.xml$##g'`
-	cat <<EOF >>repodata/repomd.xml
+        name=$(echo $file | sed 's#\.xml$##g')
+        cat <<EOF >>repodata/repomd.xml
 <data type="$name">
   <checksum type="sha256">$chsgznew</checksum>
   <open-checksum type="sha256">$chsnew</open-checksum>
@@ -472,19 +507,21 @@ EOF
 
     # copy the packages to S3
     for file in $rpmpath/*.rpm ; do
-        $aws s3 cp --acl public-read $file "$s3/$repopath/$file"
+        $aws_cp_to_s3 $file "$s3/$repopath/$file"
     done
 
     # update the metadata at the S3
-    $aws s3 sync --acl public-read repodata "$s3/$repopath/repodata"
+    $aws_sync_to_s3 repodata "$s3/$repopath/repodata"
 
     # unlock the publishing
     $rm_file $ws_lockfile
+
+    cd $swd
 }
 
 if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
     pack_deb
-elif [ "$os" == "centos" -o "$os" == "fedora" ]; then
+elif [ "$os" == "el" -o "$os" == "fedora" ]; then
     pack_rpm
 else
     echo "USAGE: given OS '$os' is not supported, use any single from the list: $alloss"
-- 
2.17.1

^ permalink raw reply	[flat|nested] 7+ messages in thread
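
[Editor's note: the package-count arithmetic in the pack_rpm hunk above
(merging the old repodata pulled from S3 with the freshly generated one)
can be sketched in isolation. The counts and the metadata header below are
hypothetical stand-ins for what zcat extracts from the real *.xml.gz files:]

```shell
# Hypothetical counts, as parsed from the 'packages="N"' attribute of the
# old (repodata.base) and new (repodata.adding) metadata headers.
packsold=3
packsnew=2
packs=$(($packsold + $packsnew))

# Rewrite the packages attribute in the metadata tag, just as the sed call
# in pack_rpm does on the second line of each merged *.xml file.
header='<metadata packages="2">'
merged=$(echo "$header" | sed "s#packages=\".*\"#packages=\"$packs\"#g")
echo "$merged"
```

[The counts are summed because the merged file appends the old 'package'
tags after the new ones, so the header must cover both sets.]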

* [Tarantool-patches] [PATCH v4 3/3] Testing changes after 3rd review on S3
  2020-01-13 12:04 [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 1/3] gitlab-ci: implement packing into MCS S3 Alexander V. Tikhonov
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 2/3] Testing changes after 2nd review on S3 Alexander V. Tikhonov
@ 2020-01-13 12:04 ` Alexander V. Tikhonov
  2020-01-13 16:54   ` Igor Munkin
  2020-01-13 17:02 ` [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Igor Munkin
  3 siblings, 1 reply; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-01-13 12:04 UTC (permalink / raw)
  To: Igor Munkin; +Cc: tarantool-patches

---
 tools/add_pack_s3_repo.sh | 77 +++++++++++++++++++--------------------
 1 file changed, 37 insertions(+), 40 deletions(-)

diff --git a/tools/add_pack_s3_repo.sh b/tools/add_pack_s3_repo.sh
index 280a07ab0..148ffd0d9 100755
--- a/tools/add_pack_s3_repo.sh
+++ b/tools/add_pack_s3_repo.sh
@@ -11,13 +11,13 @@ product=tarantool
 # the path with binaries either repository
 repo=.
 
-get_os_dists()
+function get_os_dists()
 {
     os=$1
     alldists=
 
     if [ "$os" == "ubuntu" ]; then
-        alldists='trusty xenial cosmic disco bionic eoan'
+        alldists='trusty xenial bionic cosmic disco eoan'
     elif [ "$os" == "debian" ]; then
         alldists='jessie stretch buster'
     elif [ "$os" == "el" ]; then
@@ -29,13 +29,11 @@ get_os_dists()
     echo "$alldists"
 }
 
-prepare_ws()
+function prepare_ws()
 {
     # temporary lock the publication to the repository
-    ws=${ws_prefix}_${branch}_${os}
-    if [ "$os" != "ubuntu" -a "$os" != "debian" ]; then
-        ws=${ws}_${option_dist}
-    fi
+    ws_suffix=$1
+    ws=${ws_prefix}_${branch}_${ws_suffix}
     ws_lockfile=${ws}.lock
     if [ -f $ws_lockfile ]; then
         old_proc=$(cat $ws_lockfile)
@@ -51,7 +49,7 @@ prepare_ws()
     $mk_dir $ws
 }
 
-usage()
+function usage()
 {
     cat <<EOF
 Usage for store package binaries from the given path:
@@ -157,33 +155,34 @@ proddir=$(echo $product | head -c 1)
 
 # AWS defines
 aws='aws --endpoint-url https://hb.bizmrg.com s3'
-s3="s3://tarantool_repo/$branch/$os"
-aws_cp_to_s3="$aws cp --acl public-read"
-aws_sync_to_s3="$aws sync --acl public-read"
+bucket_path="s3://tarantool_repo/$branch/$os"
+aws_cp_to_s3_public="$aws cp --acl public-read"
+aws_sync_to_s3_public="$aws sync --acl public-read"
 
-update_packfile()
+function update_deb_packfile()
 {
     packfile=$1
     packtype=$2
+    update_dist=$3
 
     locpackfile=$(echo $packfile | sed "s#^$ws/##g")
     # register DEB/DSC pack file to Packages/Sources file
-    reprepro -Vb . include$packtype $loop_dist $packfile
+    reprepro -Vb . include$packtype $update_dist $packfile
     # reprepro copied DEB/DSC file to component which is not needed
     $rm_dir $debdir/$component
     # to have all sources avoid reprepro set DEB/DSC file to its own registry
     $rm_dir db
 }
 
-update_metadata()
+function update_deb_metadata()
 {
     packpath=$1
     packtype=$2
 
     if [ ! -f $packpath.saved ] ; then
         # get the latest Sources file from S3
-        $aws ls "$s3/$packpath" 2>/dev/null && \
-            $aws cp "$s3/$packpath" $packpath.saved || \
+        $aws ls "$bucket_path/$packpath" 2>/dev/null && \
+            $aws cp "$bucket_path/$packpath" $packpath.saved || \
             touch $packpath.saved
     fi
 
@@ -236,7 +235,7 @@ function pack_deb {
     fi
 
     # prepare the workspace
-    prepare_ws
+    prepare_ws ${os}
 
     # script works in one of 2 modes:
     # - path with binaries packages for a single distribution
@@ -257,8 +256,7 @@ function pack_deb {
         exit 1
     fi
 
-    swd=$PWD
-    cd $ws
+    pushd $ws
 
     # create the configuration file for 'reprepro' tool
     confpath=$ws/conf
@@ -292,17 +290,17 @@ EOF
         for deb in $ws/$debdir/$loop_dist/$component/*/*/*.deb ; do
             [ -f $deb ] || continue
             # regenerate DEB pack
-            update_packfile $deb deb
+            update_deb_packfile $deb deb $loop_dist
             echo "Regenerated DEB file: $locpackfile"
             for packages in dists/$loop_dist/$component/binary-*/Packages ; do
                 # copy Packages file to avoid of removing by the new DEB version
                 # update metadata 'Packages' files
-                if ! update_metadata $packages deb ; then continue ; fi
+                ! update_deb_metadata $packages deb || continue
                 updated_deb=1
             done
             # save the registered DEB file to S3
             if [ "$updated_deb" == 1 ]; then
-                $aws_cp_to_s3 $deb $s3/$locpackfile
+                $aws_cp_to_s3_public $deb $bucket_path/$locpackfile
             fi
         done
 
@@ -310,19 +308,19 @@ EOF
         for dsc in $ws/$debdir/$loop_dist/$component/*/*/*.dsc ; do
             [ -f $dsc ] || continue
             # regenerate DSC pack
-            update_packfile $dsc dsc
+            update_deb_packfile $dsc dsc $loop_dist
             echo "Regenerated DSC file: $locpackfile"
             # copy Sources file to avoid of removing by the new DSC version
             # update metadata 'Sources' file
-            if ! update_metadata dists/$loop_dist/$component/source/Sources dsc \
-                ; then continue ; fi
+            ! update_deb_metadata dists/$loop_dist/$component/source/Sources dsc \
+                || continue
             updated_dsc=1
             # save the registered DSC file to S3
-            $aws_cp_to_s3 $dsc $s3/$locpackfile
+            $aws_cp_to_s3_public $dsc $bucket_path/$locpackfile
             tarxz=$(echo $locpackfile | sed 's#\.dsc$#.debian.tar.xz#g')
-            $aws_cp_to_s3 $ws/$tarxz "$s3/$tarxz"
+            $aws_cp_to_s3_public $ws/$tarxz "$bucket_path/$tarxz"
             orig=$(echo $locpackfile | sed 's#-1\.dsc$#.orig.tar.xz#g')
-            $aws_cp_to_s3 $ws/$orig "$s3/$orig"
+            $aws_cp_to_s3_public $ws/$orig "$bucket_path/$orig"
         done
 
         # check if any DEB/DSC files were newly registered
@@ -390,13 +388,13 @@ EOF
         popd
 
         # 4. sync the latest distribution path changes to S3
-        $aws_sync_to_s3 dists/$loop_dist "$s3/dists/$loop_dist"
+        $aws_sync_to_s3_public dists/$loop_dist "$bucket_path/dists/$loop_dist"
     done
 
     # unlock the publishing
     $rm_file $ws_lockfile
 
-    cd $swd
+    popd
 }
 
 # The 'pack_rpm' function especially created for RPM packages. It works
@@ -416,13 +414,12 @@ function pack_rpm {
     fi
 
     # prepare the workspace
-    prepare_ws
+    prepare_ws ${os}_${option_dist}
 
     # copy the needed package binaries to the workspace
     cp $repo/*.rpm $ws/.
 
-    swd=$PWD
-    cd $ws
+    pushd $ws
 
     # set the paths
     if [ "$os" == "el" ]; then
@@ -437,13 +434,13 @@ function pack_rpm {
     # prepare local repository with packages
     $mk_dir $packpath
     mv *.rpm $packpath/.
-    cd $ws/$repopath
+    cd $repopath
 
     # copy the current metadata files from S3
     mkdir repodata.base
-    for file in $($aws ls $s3/$repopath/repodata/ | awk '{print $NF}') ; do
-        $aws ls $s3/$repopath/repodata/$file || continue
-        $aws cp $s3/$repopath/repodata/$file repodata.base/$file
+    for file in $($aws ls $bucket_path/$repopath/repodata/ | awk '{print $NF}') ; do
+        $aws ls $bucket_path/$repopath/repodata/$file || continue
+        $aws cp $bucket_path/$repopath/repodata/$file repodata.base/$file
     done
 
     # create the new repository metadata files
@@ -507,16 +504,16 @@ EOF
 
     # copy the packages to S3
     for file in $rpmpath/*.rpm ; do
-        $aws_cp_to_s3 $file "$s3/$repopath/$file"
+        $aws_cp_to_s3_public $file "$bucket_path/$repopath/$file"
     done
 
     # update the metadata at the S3
-    $aws_sync_to_s3 repodata "$s3/$repopath/repodata"
+    $aws_sync_to_s3_public repodata "$bucket_path/$repopath/repodata"
 
     # unlock the publishing
     $rm_file $ws_lockfile
 
-    cd $swd
+    popd
 }
 
 if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
-- 
2.17.1

^ permalink raw reply	[flat|nested] 7+ messages in thread
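
[Editor's note: the dedup check in update_deb_metadata, discussed in the
review below, boils down to a single grep: a package stanza is appended to
the saved Packages/Sources copy only when its SHA256 line is not already
there. A minimal self-contained sketch; the file names and hashes are made
up for illustration:]

```shell
# Fresh stanza produced by reprepro for the new package (hypothetical hash).
printf 'Package: tarantool\nSHA256: aaa111\n' > Packages.new
# Saved copy previously fetched from S3, holding an older package entry.
printf 'Package: tarantool\nSHA256: bbb222\n' > Packages.saved

# Same check as in the script: look for the new SHA256 line verbatim in
# the saved file and append the stanza only when it is absent.
if grep -q "^$(grep '^SHA256: ' Packages.new)$" Packages.saved ; then
    echo "WARNING: file already registered in S3!"
else
    cat Packages.new >> Packages.saved
    echo "registered"
fi
```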

* Re: [Tarantool-patches] [PATCH v4 3/3] Testing changes after 3rd review on S3
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 3/3] Testing changes after 3rd " Alexander V. Tikhonov
@ 2020-01-13 16:54   ` Igor Munkin
  0 siblings, 0 replies; 7+ messages in thread
From: Igor Munkin @ 2020-01-13 16:54 UTC (permalink / raw)
  To: Alexander V. Tikhonov; +Cc: tarantool-patches

Sasha,

Sorry, but I have to mention a couple of remarks from my last review one
more time. Please consider them below.

On 13.01.20, Alexander V. Tikhonov wrote:
> ---
>  tools/add_pack_s3_repo.sh | 77 +++++++++++++++++++--------------------
>  1 file changed, 37 insertions(+), 40 deletions(-)
> 

I wrote to the following in my previous review:
> General comments:
> * Please choose a single code style for function definition. Currently
>   you mix two following approaches:
> |  usage()
> |  {
>   and
> |  function pack_rpm {
As mentioned right above, this note also relates to the pack_{deb,rpm}
functions, but both of them still violate the brace code style,
considering the one you've chosen.

> diff --git a/tools/add_pack_s3_repo.sh b/tools/add_pack_s3_repo.sh
> index 280a07ab0..148ffd0d9 100755
> --- a/tools/add_pack_s3_repo.sh
> +++ b/tools/add_pack_s3_repo.sh
> @@ -11,13 +11,13 @@ product=tarantool
>  # the path with binaries either repository
>  repo=.
>  
> -get_os_dists()
> +function get_os_dists()
>  {
>      os=$1
>      alldists=
>  
>      if [ "$os" == "ubuntu" ]; then
> -        alldists='trusty xenial cosmic disco bionic eoan'
> +        alldists='trusty xenial bionic cosmic disco eoan'
>      elif [ "$os" == "debian" ]; then
>          alldists='jessie stretch buster'
>      elif [ "$os" == "el" ]; then
> @@ -29,13 +29,11 @@ get_os_dists()
>      echo "$alldists"
>  }
>  
> -prepare_ws()
> +function prepare_ws()
>  {
>      # temporary lock the publication to the repository
> -    ws=${ws_prefix}_${branch}_${os}
> -    if [ "$os" != "ubuntu" -a "$os" != "debian" ]; then
> -        ws=${ws}_${option_dist}
> -    fi
> +    ws_suffix=$1
> +    ws=${ws_prefix}_${branch}_${ws_suffix}
>      ws_lockfile=${ws}.lock
>      if [ -f $ws_lockfile ]; then
>          old_proc=$(cat $ws_lockfile)
> @@ -51,7 +49,7 @@ prepare_ws()
>      $mk_dir $ws
>  }
>  
> -usage()
> +function usage()
>  {
>      cat <<EOF
>  Usage for store package binaries from the given path:
> @@ -157,33 +155,34 @@ proddir=$(echo $product | head -c 1)
>  
>  # AWS defines
>  aws='aws --endpoint-url https://hb.bizmrg.com s3'
> -s3="s3://tarantool_repo/$branch/$os"
> -aws_cp_to_s3="$aws cp --acl public-read"
> -aws_sync_to_s3="$aws sync --acl public-read"
> +bucket_path="s3://tarantool_repo/$branch/$os"
> +aws_cp_to_s3_public="$aws cp --acl public-read"
> +aws_sync_to_s3_public="$aws sync --acl public-read"

I wrote to the following in my previous review:
>> Nice change, but other commands also interact with S3. I propose to
>> change the naming to the following one:
>> | <aws_<action>_public
To make it a bit clearer now, I intended to drop the _to_s3 suffix and
leave something similar to the following:
| aws_cp_public="$aws cp --acl public-read"
| aws_sync_public="$aws sync --acl public-read"

>  
> -update_packfile()
> +function update_deb_packfile()
>  {
>      packfile=$1
>      packtype=$2
> +    update_dist=$3
>  
>      locpackfile=$(echo $packfile | sed "s#^$ws/##g")
>      # register DEB/DSC pack file to Packages/Sources file
> -    reprepro -Vb . include$packtype $loop_dist $packfile
> +    reprepro -Vb . include$packtype $update_dist $packfile
>      # reprepro copied DEB/DSC file to component which is not needed
>      $rm_dir $debdir/$component
>      # to have all sources avoid reprepro set DEB/DSC file to its own registry
>      $rm_dir db
>  }
>  
> -update_metadata()
> +function update_deb_metadata()
>  {
>      packpath=$1
>      packtype=$2
>  
>      if [ ! -f $packpath.saved ] ; then
>          # get the latest Sources file from S3
> -        $aws ls "$s3/$packpath" 2>/dev/null && \
> -            $aws cp "$s3/$packpath" $packpath.saved || \
> +        $aws ls "$bucket_path/$packpath" 2>/dev/null && \
> +            $aws cp "$bucket_path/$packpath" $packpath.saved || \
>              touch $packpath.saved
>      fi
>  
> @@ -236,7 +235,7 @@ function pack_deb {
>      fi
>  
>      # prepare the workspace
> -    prepare_ws
> +    prepare_ws ${os}
>  
>      # script works in one of 2 modes:
>      # - path with binaries packages for a single distribution
> @@ -257,8 +256,7 @@ function pack_deb {
>          exit 1
>      fi
>  
> -    swd=$PWD
> -    cd $ws
> +    pushd $ws
>  
>      # create the configuration file for 'reprepro' tool
>      confpath=$ws/conf
> @@ -292,17 +290,17 @@ EOF
>          for deb in $ws/$debdir/$loop_dist/$component/*/*/*.deb ; do
>              [ -f $deb ] || continue
>              # regenerate DEB pack
> -            update_packfile $deb deb
> +            update_deb_packfile $deb deb $loop_dist
>              echo "Regenerated DEB file: $locpackfile"
>              for packages in dists/$loop_dist/$component/binary-*/Packages ; do
>                  # copy the Packages file to keep it from being removed by the new DEB version
>                  # update metadata 'Packages' files
> -                if ! update_metadata $packages deb ; then continue ; fi
> +                update_deb_metadata $packages deb || continue
>                  updated_deb=1
>              done
>              # save the registered DEB file to S3
>              if [ "$updated_deb" == 1 ]; then
> -                $aws_cp_to_s3 $deb $s3/$locpackfile
> +                $aws_cp_to_s3_public $deb $bucket_path/$locpackfile
>              fi
>          done
>  
> @@ -310,19 +308,19 @@ EOF
>          for dsc in $ws/$debdir/$loop_dist/$component/*/*/*.dsc ; do
>              [ -f $dsc ] || continue
>              # regenerate DSC pack
> -            update_packfile $dsc dsc
> +            update_deb_packfile $dsc dsc $loop_dist
>              echo "Regenerated DSC file: $locpackfile"
>              # copy the Sources file to keep it from being removed by the new DSC version
>              # update metadata 'Sources' file
> -            if ! update_metadata dists/$loop_dist/$component/source/Sources dsc \
> -                ; then continue ; fi
> +            update_deb_metadata dists/$loop_dist/$component/source/Sources dsc \
> +                || continue
>              updated_dsc=1
>              # save the registered DSC file to S3
> -            $aws_cp_to_s3 $dsc $s3/$locpackfile
> +            $aws_cp_to_s3_public $dsc $bucket_path/$locpackfile
>              tarxz=$(echo $locpackfile | sed 's#\.dsc$#.debian.tar.xz#g')
> -            $aws_cp_to_s3 $ws/$tarxz "$s3/$tarxz"
> +            $aws_cp_to_s3_public $ws/$tarxz "$bucket_path/$tarxz"
>              orig=$(echo $locpackfile | sed 's#-1\.dsc$#.orig.tar.xz#g')
> -            $aws_cp_to_s3 $ws/$orig "$s3/$orig"
> +            $aws_cp_to_s3_public $ws/$orig "$bucket_path/$orig"
>          done
>  
>          # check if any DEB/DSC files were newly registered
> @@ -390,13 +388,13 @@ EOF
>          popd
>  
>          # 4. sync the latest distribution path changes to S3
> -        $aws_sync_to_s3 dists/$loop_dist "$s3/dists/$loop_dist"
> +        $aws_sync_to_s3_public dists/$loop_dist "$bucket_path/dists/$loop_dist"
>      done
>  
>      # unlock the publishing
>      $rm_file $ws_lockfile
>  
> -    cd $swd
> +    popd
>  }
>  
> # The 'pack_rpm' function is especially created for RPM packages. It works
> @@ -416,13 +414,12 @@ function pack_rpm {
>      fi
>  
>      # prepare the workspace
> -    prepare_ws
> +    prepare_ws ${os}_${option_dist}
>  
>      # copy the needed package binaries to the workspace
>      cp $repo/*.rpm $ws/.
>  
> -    swd=$PWD
> -    cd $ws
> +    pushd $ws
>  
>      # set the paths
>      if [ "$os" == "el" ]; then
> @@ -437,13 +434,13 @@ function pack_rpm {
>      # prepare local repository with packages
>      $mk_dir $packpath
>      mv *.rpm $packpath/.
> -    cd $ws/$repopath
> +    cd $repopath
>  
>      # copy the current metadata files from S3
>      mkdir repodata.base
> -    for file in $($aws ls $s3/$repopath/repodata/ | awk '{print $NF}') ; do
> -        $aws ls $s3/$repopath/repodata/$file || continue
> -        $aws cp $s3/$repopath/repodata/$file repodata.base/$file
> +    for file in $($aws ls $bucket_path/$repopath/repodata/ | awk '{print $NF}') ; do
> +        $aws ls $bucket_path/$repopath/repodata/$file || continue
> +        $aws cp $bucket_path/$repopath/repodata/$file repodata.base/$file
>      done
>  
>      # create the new repository metadata files
> @@ -507,16 +504,16 @@ EOF
>  
>      # copy the packages to S3
>      for file in $rpmpath/*.rpm ; do
> -        $aws_cp_to_s3 $file "$s3/$repopath/$file"
> +        $aws_cp_to_s3_public $file "$bucket_path/$repopath/$file"
>      done
>  
>      # update the metadata at the S3
> -    $aws_sync_to_s3 repodata "$s3/$repopath/repodata"
> +    $aws_sync_to_s3_public repodata "$bucket_path/$repopath/repodata"
>  
>      # unlock the publishing
>      $rm_file $ws_lockfile
>  
> -    cd $swd
> +    popd
>  }
>  
>  if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
> -- 
> 2.17.1
> 

-- 
Best regards,
IM


* Re: [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4
  2020-01-13 12:04 [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Alexander V. Tikhonov
                   ` (2 preceding siblings ...)
  2020-01-13 12:04 ` [Tarantool-patches] [PATCH v4 3/3] Testing changes after 3rd " Alexander V. Tikhonov
@ 2020-01-13 17:02 ` Igor Munkin
  2020-01-14  5:00   ` Alexander Tikhonov
  3 siblings, 1 reply; 7+ messages in thread
From: Igor Munkin @ 2020-01-13 17:02 UTC (permalink / raw)
  To: Alexander V. Tikhonov; +Cc: tarantool-patches

Sasha,

Thanks, you've fixed almost everything. Sorry, two of my remarks were
slightly misleading, so I extended them in the corresponding patch in
this patchset. Please consider them, since everything else LGTM.

I guess after the final fixes you can squash everything into a single
commit and ask Sasha Tu to proceed with the patch.

On 13.01.20, Alexander V. Tikhonov wrote:
> Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
> Issue: https://github.com/tarantool/tarantool/issues/3380
> 
> v4: https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html
> v3: https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html
> v2: https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html
> v1: https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html
> 
> Changes v4:
> - minor corrections
> 
> Changes v3:
> - common code parts merged to standalone routines
> - corrected code style, minor updates
> - script is ready for release
> 
> Changes v2:
> - made changes in script from draft to pre-release stages
> 
> Alexander V. Tikhonov (3):
>   gitlab-ci: implement packing into MCS S3
>   Testing changes after 2nd review on S3
>   Testing changes after 3rd review on S3
> 
>  .gitlab-ci.yml            |   5 +-
>  .gitlab.mk                |  20 +-
>  .travis.mk                |  41 ++-
>  tools/add_pack_s3_repo.sh | 527 ++++++++++++++++++++++++++++++++++++++
>  4 files changed, 567 insertions(+), 26 deletions(-)
>  create mode 100755 tools/add_pack_s3_repo.sh
> 
> -- 
> 2.17.1
> 

-- 
Best regards,
IM


* Re: [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4
  2020-01-13 17:02 ` [Tarantool-patches] [PATCH v4 0/3] gitlab-ci: implement packing into MCS S3 v.4 Igor Munkin
@ 2020-01-14  5:00   ` Alexander Tikhonov
  0 siblings, 0 replies; 7+ messages in thread
From: Alexander Tikhonov @ 2020-01-14  5:00 UTC (permalink / raw)
  To: Igor Munkin; +Cc: tarantool-patches


Igor,

Thank you for all the reviews. I've squashed all the changes into a single
commit and sent it to A. Turenko for review.

Alexander, please proceed with the review.


>Monday, January 13, 2020, 20:04 +03:00, from Igor Munkin <imun@tarantool.org>:
>
>Sasha,
>
>Thanks, you've fixed almost everything. Sorry, two of my remarks were
>slightly misleading, so I extended them in the corresponding patch
>in this patchset. Please consider them, since everything else LGTM.
>
>I guess after the final fixes you can squash everything into a single
>commit and ask Sasha Tu to proceed with the patch.
>
>On 13.01.20, Alexander V. Tikhonov wrote:
>> Github:  https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
>> Issue:  https://github.com/tarantool/tarantool/issues/3380
>> 
>> v4:  https://lists.tarantool.org/pipermail/tarantool-patches/2020-January/013568.html
>> v3:  https://lists.tarantool.org/pipermail/tarantool-patches/2019-December/013060.html
>> v2:  https://lists.tarantool.org/pipermail/tarantool-patches/2019-November/012352.html
>> v1:  https://lists.tarantool.org/pipermail/tarantool-patches/2019-October/012021.html
>> 
>> Changes v4:
>> - minor corrections
>> 
>> Changes v3:
>> - common code parts merged to standalone routines
>> - corrected code style, minor updates
>> - script is ready for release
>> 
>> Changes v2:
>> - made changes in script from draft to pre-release stages
>> 
>> Alexander V. Tikhonov (3):
>>   gitlab-ci: implement packing into MCS S3
>>   Testing changes after 2nd review on S3
>>   Testing changes after 3rd review on S3
>> 
>>  .gitlab-ci.yml            |   5 +-
>>  .gitlab.mk                |  20 +-
>>  .travis.mk                |  41 ++-
>>  tools/add_pack_s3_repo.sh | 527 ++++++++++++++++++++++++++++++++++++++
>>  4 files changed, 567 insertions(+), 26 deletions(-)
>>  create mode 100755 tools/add_pack_s3_repo.sh
>> 
>> -- 
>> 2.17.1
>> 
>
>-- 
>Best regards,
>IM


-- 
Alexander Tikhonov



