Tarantool development patches archive
From: Igor Munkin <imun@tarantool.org>
To: "Alexander V. Tikhonov" <avtikhon@tarantool.org>
Cc: tarantool-patches@freelists.org, tarantool-patches@dev.tarantool.org
Subject: Re: [Tarantool-patches] [PATCH v1] gitlab-ci: implement packing into MCS S3
Date: Mon, 21 Oct 2019 14:54:36 +0300
Message-ID: <20191021115432.GA14638@tarantool.org>
In-Reply-To: <d83a336d536eca6c5e1c584b596f338a78254f54.1571563733.git.avtikhon@tarantool.org>

Sasha,

Thanks for the patch. Leaving aside the minor notes we discussed
offline, I have one general comment on the current implementation that
should be addressed in v2.

I see no reason to avoid a single "build" script that holds the common
routines you listed earlier and "sources" per-distro files with the
packaging and publishing specifics chosen via the given OS variable
(see the sketch below the list). This approach seems easier both to
understand at first and to maintain afterwards, and I see the
following benefits:
* Introducing such a backbone structure reduces the size of future
  changesets related to building and publishing Tarantool packages
* The proposed layout splits the whole process into separate parts,
  each responsible for its own build stage, which makes further
  maintenance much easier
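
To make it concrete, below is a rough sketch of the layout I mean. It
is only an illustration: the file names (tools/build.sh,
tools/pack_deb.sh, tools/pack_rpm.sh) and the make_packages() /
publish_packages() helpers are mine and entirely up to you:

    #!/bin/bash
    # tools/build.sh -- single entry point with the common routines
    set -xe

    # common routine shared by all distros (the same packpack call the
    # patch already uses)
    make_packages() {
        git clone https://github.com/packpack/packpack.git packpack
        PACKPACK_EXTRA_DOCKER_RUN_PARAMS='--network=host' ./packpack/packpack
    }

    # source the distro specifics considering the given OS variable;
    # each sourced file defines its own publish_packages()
    case "$OS" in
        ubuntu|debian) . ./tools/pack_deb.sh ;;
        el|fedora)     . ./tools/pack_rpm.sh ;;
        *) echo "ERROR: unknown OS '$OS'" ; exit 1 ;;
    esac

    make_packages
    publish_packages build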

CCed to tarantool-patches@dev.tarantool.org.

On 20.10.19, Alexander V. Tikhonov wrote:
> Added the ability to additionally store packages at MCS S3.
> The idea is to introduce a new way of publishing packages at MCS S3
> which, for now, duplicates the packaging done at PackageCloud by the
> Packpack tool. The binaries also need to be packed in the native style
> of each target OS, so two separate scripts were created, one per
> packaging style:
>  - DEB for Ubuntu and Debian, script tools/pub_packs_s3_deb.sh
>  - RPM for CentOS and Fedora, script tools/pub_packs_s3_rpm.sh
> Common parts of the scripts:
>  - create new meta files for the new binaries
>  - copy the new binaries to MCS S3
>  - fetch the previous meta files from MCS S3 and merge in the meta
> data for the new binaries
>  - update the meta files at MCS S3
> Different parts:
>  - The DEB script is based on the external reprepro tool, which needs
> a prepared path with a specific file structure (the .gitlab.mk file is
> updated accordingly). It also works only at the OS level, i.e. it
> updates the meta data for all distributions of an OS together.
>  - The RPM script is based on the external createrepo tool, which does
> not need a prepared file structure and works with the given path
> containing the binaries. It works at the OS/distribution level, i.e.
> it updates the meta data for each distribution separately.
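
This split into common and different parts is also why I believe the
common routines deserve a single place both scripts can source. A
purely illustrative sketch (the file name pub_common.sh and the
function names are mine, not a requirement), assuming $aws and $s3 are
set the same way as in the proposed scripts:

    # tools/pub_common.sh -- shared S3 meta data routines (illustrative)
    fetch_meta() {
        # get the previous meta file from MCS S3, or start from an empty one
        $aws s3 cp "$s3/$1" "$1.saved" || touch "$1.saved"
    }
    merge_meta() {
        # merge the entries for the new binaries into the saved meta file
        cat "$1" >>"$1.saved" && mv "$1.saved" "$1"
    }
    push_meta() {
        # update the merged meta file at MCS S3
        $aws s3 cp --acl public-read "$1" "$s3/$1"
    }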
> 
> Closes #3380
> ---
>  .gitlab-ci.yml            |  47 +++++----
>  .gitlab.mk                |  32 +++++-
>  .travis.mk                |  40 ++++----
>  tools/pub_packs_s3_deb.sh | 207 ++++++++++++++++++++++++++++++++++++++
>  tools/pub_packs_s3_rpm.sh | 129 ++++++++++++++++++++++++
>  5 files changed, 411 insertions(+), 44 deletions(-)
>  create mode 100755 tools/pub_packs_s3_deb.sh
>  create mode 100755 tools/pub_packs_s3_rpm.sh
> 
> diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
> index 431730b67..b68136d96 100644
> --- a/.gitlab-ci.yml
> +++ b/.gitlab-ci.yml
> @@ -24,21 +24,29 @@ variables:
>    tags:
>      - docker_test
>  
> -.deploy_template: &deploy_definition
> +.deploy_deb_template: &deploy_deb_definition
>    <<: *release_only_definition
>    stage: test
>    tags:
>      - deploy
>    script:
> -    - ${GITLAB_MAKE} package
> +    - ${GITLAB_MAKE} package_deploy_deb
>  
> -.deploy_test_template: &deploy_test_definition
> +.deploy_rpm_template: &deploy_rpm_definition
> +  <<: *release_only_definition
> +  stage: test
> +  tags:
> +    - deploy
> +  script:
> +    - ${GITLAB_MAKE} package_deploy_rpm
> +
> +.deploy_test_rpm_template: &deploy_test_rpm_definition
>    <<: *release_only_definition
>    stage: test
>    tags:
>      - deploy_test
>    script:
> -    - ${GITLAB_MAKE} package
> +    - ${GITLAB_MAKE} package_deploy_rpm
>  
>  .vbox_template: &vbox_definition
>    stage: test
> @@ -141,85 +149,88 @@ freebsd_12_release:
>  # Packs
>  
>  centos_6:
> -  <<: *deploy_definition
> +  <<: *deploy_rpm_definition
>    variables:
>      OS: 'el'
>      DIST: '6'
>  
>  centos_7:
> -  <<: *deploy_test_definition
> +  <<: *deploy_test_rpm_definition
>    variables:
>      OS: 'el'
>      DIST: '7'
>  
>  fedora_28:
> -  <<: *deploy_test_definition
> +  <<: *deploy_test_rpm_definition
>    variables:
>      OS: 'fedora'
>      DIST: '28'
>  
>  fedora_29:
> -  <<: *deploy_test_definition
> +  <<: *deploy_test_rpm_definition
>    variables:
>      OS: 'fedora'
>      DIST: '29'
>  
>  fedora_30:
> -  <<: *deploy_test_definition
> +  <<: *deploy_test_rpm_definition
>    variables:
>      OS: 'fedora'
>      DIST: '30'
>  
>  ubuntu_14_04:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'ubuntu'
>      DIST: 'trusty'
>  
>  ubuntu_16_04:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'ubuntu'
>      DIST: 'xenial'
>  
>  ubuntu_18_04:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'ubuntu'
>      DIST: 'bionic'
>  
>  ubuntu_18_10:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'ubuntu'
>      DIST: 'cosmic'
>  
>  ubuntu_19_04:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'ubuntu'
>      DIST: 'disco'
>  
>  debian_8:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'debian'
>      DIST: 'jessie'
>  
>  debian_9:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'debian'
>      DIST: 'stretch'
>  
>  debian_10:
> -  <<: *deploy_definition
> +  <<: *deploy_deb_definition
>    variables:
>      OS: 'debian'
>      DIST: 'buster'
>  
>  static_build:
> -  <<: *deploy_test_definition
> +  <<: *release_only_definition
> +  stage: test
> +  tags:
> +    - deploy_test
>    variables:
>      RUN_TESTS: 'ON'
>    script:
> diff --git a/.gitlab.mk b/.gitlab.mk
> index bf64df88a..43caca43e 100644
> --- a/.gitlab.mk
> +++ b/.gitlab.mk
> @@ -98,14 +98,38 @@ vms_test_%:
>  vms_shutdown:
>  	VBoxManage controlvm ${VMS_NAME} poweroff
>  
> -# ########################
> -# Build RPM / Deb packages
> -# ########################
> -
> +# ###########################
> +# Sources tarballs & packages
> +# ###########################
> +
> +# Push alpha and beta versions to <major>x bucket (say, 2x),
> +# stable to <major>.<minor> bucket (say, 2.2).
> +GIT_DESCRIBE=$(shell git describe HEAD)
> +MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
> +MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
> +BUCKET=$(MAJOR_VERSION)_$(MINOR_VERSION)
> +ifeq ($(MINOR_VERSION),0)
> +BUCKET=$(MAJOR_VERSION)x
> +endif
> +ifeq ($(MINOR_VERSION),1)
> +BUCKET=$(MAJOR_VERSION)x
> +endif
> +REPODEB=build/pool/${DIST}/main/t/tarantool
> +
> +.PHONY: package
>  package: git_submodule_update
>  	git clone https://github.com/packpack/packpack.git packpack
>  	PACKPACK_EXTRA_DOCKER_RUN_PARAMS='--network=host' ./packpack/packpack
>  
> +package_deploy_deb: package
> +	mkdir -p ${REPODEB}
> +	for packfile in `ls build/*.deb build/*.dsc build/*.tar.*z 2>/dev/null` ; \
> +		do cp $$packfile ${REPODEB}/. ; done
> +	BUCKET=${BUCKET} OS=${OS} DISTS=${DIST} ./tools/pub_packs_s3_deb.sh build
> +
> +package_deploy_rpm: package
> +	BUCKET=${BUCKET} OS=${OS} RELEASE=${DIST} ./tools/pub_packs_s3_rpm.sh build
> +
>  # ############
>  # Static build
>  # ############
> diff --git a/.travis.mk b/.travis.mk
> index 42969ff56..1fcafc60b 100644
> --- a/.travis.mk
> +++ b/.travis.mk
> @@ -8,10 +8,6 @@ MAX_FILES?=65534
>  
>  all: package
>  
> -package:
> -	git clone https://github.com/packpack/packpack.git packpack
> -	./packpack/packpack
> -
>  test: test_$(TRAVIS_OS_NAME)
>  
>  # Redirect some targets via docker
> @@ -176,34 +172,34 @@ test_freebsd_no_deps: build_freebsd
>  
>  test_freebsd: deps_freebsd test_freebsd_no_deps
>  
> -####################
> -# Sources tarballs #
> -####################
> -
> -source:
> -	git clone https://github.com/packpack/packpack.git packpack
> -	TARBALL_COMPRESSOR=gz packpack/packpack tarball
> +###############################
> +# Sources tarballs & packages #
> +###############################
>  
>  # Push alpha and beta versions to <major>x bucket (say, 2x),
>  # stable to <major>.<minor> bucket (say, 2.2).
> -ifeq ($(TRAVIS_BRANCH),master)
>  GIT_DESCRIBE=$(shell git describe HEAD)
>  MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
>  MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
> -else
> -MAJOR_VERSION=$(word 1,$(subst ., ,$(TRAVIS_BRANCH)))
> -MINOR_VERSION=$(word 2,$(subst ., ,$(TRAVIS_BRANCH)))
> -endif
> -BUCKET=tarantool.$(MAJOR_VERSION).$(MINOR_VERSION).src
> +BUCKET=$(MAJOR_VERSION)_$(MINOR_VERSION)
>  ifeq ($(MINOR_VERSION),0)
> -BUCKET=tarantool.$(MAJOR_VERSION)x.src
> +BUCKET=$(MAJOR_VERSION)x
>  endif
>  ifeq ($(MINOR_VERSION),1)
> -BUCKET=tarantool.$(MAJOR_VERSION)x.src
> +BUCKET=$(MAJOR_VERSION)x
>  endif
>  
> +package:
> +	git clone https://github.com/packpack/packpack.git packpack
> +	./packpack/packpack
> +
> +source:
> +	git clone https://github.com/packpack/packpack.git packpack
> +	TARBALL_COMPRESSOR=gz packpack/packpack tarball
> +
>  source_deploy:
>  	pip install awscli --user
> -	aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 \
> -		cp build/*.tar.gz "s3://${BUCKET}/" \
> -		--acl public-read
> +	for tarball in `ls build/*.tar.gz 2>/dev/null` ; do \
> +		aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 \
> +			cp $$tarball "s3://tarantool_repo/${BUCKET}/sources/" \
> +			--acl public-read ; done
> diff --git a/tools/pub_packs_s3_deb.sh b/tools/pub_packs_s3_deb.sh
> new file mode 100755
> index 000000000..caa4644e4
> --- /dev/null
> +++ b/tools/pub_packs_s3_deb.sh
> @@ -0,0 +1,207 @@
> +#!/bin/bash
> +set -xe
> +
> +# manual configuration
> +if [ "$BUCKET" == "" ]; then
> +    echo "ERROR: need to set BUCKET env variable to a value like <1_10,2_2,2_3,2x>"
> +    exit 1
> +fi
> +branch=$BUCKET
> +update_dists=0
> +
> +# configuration
> +if [ "$OS" == "" ]; then
> +    echo "ERROR: need to set OS env variable to any of <ubuntu,debian>"
> +    exit 1
> +fi
> +os=$OS
> +if [ "$DISTS" == "" ]; then
> +    echo "ERROR: need to set DISTS env variable to any set of <bionic,cosmic,disco,trusty,xenial>"
> +    exit 1
> +fi
> +dists=$DISTS
> +component=main
> +debdir=pool
> +aws='aws --endpoint-url https://hb.bizmrg.com'
> +s3="s3://tarantool_repo/$branch/$os"
> +
> +# get packages from the given location or from the default Docker mirror path
> +repo=$1
> +if [ "$repo" == "" ] ; then
> +    repo=/var/spool/apt-mirror/mirror/packagecloud.io/tarantool/$branch/$os
> +fi
> +if [ ! -d $repo/$debdir ] ; then
> +    echo "Error: Current '$repo' has:"
> +    ls -al $repo
> +    echo "Usage: $0 [path to repository with '$debdir' directory in root path]"
> +    exit 1
> +fi
> +
> +# temporarily lock publishing to the repository
> +ws=/tmp/tarantool_repo_s3_${branch}_${os}
> +wslock=$ws.lock
> +lockfile -l 1000 $wslock
> +
> +# create temporary workspace with repository copy
> +rm -rf $ws
> +mkdir -p $ws
> +cp -rf $repo/$debdir $ws/.
> +cd $ws
> +
> +# create the configuration file
> +confpath=$ws/conf
> +rm -rf $confpath
> +mkdir -p $confpath
> +
> +for dist in $dists ; do
> +    cat <<EOF >>$confpath/distributions
> +Origin: Tarantool
> +Label: tarantool.org
> +Suite: stable
> +Codename: $dist
> +Architectures: amd64 source
> +Components: main
> +Description: Unofficial Ubuntu Packages maintained by Tarantool
> +SignWith: 91B625E5
> +DebIndices: Packages Release . .gz .bz2
> +UDebIndices: Packages . .gz .bz2
> +DscIndices: Sources Release .gz .bz2
> +
> +EOF
> +done
> +
> +# create standalone repository with separate components
> +for dist in $dists ; do
> +    echo =================== DISTRIBUTION: $dist =========================
> +    updated_deb=0
> +    updated_dsc=0
> +
> +    # 1(binaries). use reprepro tool to generate Packages file
> +    for deb in $ws/$debdir/$dist/$component/*/*/*.deb ; do
> +        [ -f $deb ] || continue
> +        locdeb=`echo $deb | sed "s#^$ws\/##g"`
> +        echo "DEB: $deb"
> +        # register the DEB file in the Packages file
> +        reprepro -Vb . includedeb $dist $deb
> +        # reprepro copies the DEB file into the local component dir, which is not needed
> +        rm -rf $debdir/$component
> +        # drop reprepro's own registry so the DEB file is not tracked there and all packages stay visible
> +        rm -rf db
> +        # keep a copy of the Packages file so it is not dropped when the next DEB version is registered
> +        for packages in dists/$dist/$component/binary-*/Packages ; do
> +            if [ ! -f $packages.saved ] ; then
> +                # get the latest Packages file from S3
> +                $aws s3 ls "$s3/$packages" 2>/dev/null && \
> +                    $aws s3 cp --acl public-read \
> +                        "$s3/$packages" $packages.saved || \
> +                    touch $packages.saved
> +            fi
> +            # check if the DEB file already exists in Packages from S3
> +            if grep "^`grep "^SHA256: " $packages`$" $packages.saved ; then
> +                echo "WARNING: DEB file already registered in S3!"
> +                continue
> +            fi
> +            # store the new DEB entry
> +            cat $packages >>$packages.saved
> +            # save the registered DEB file to S3
> +            $aws s3 cp --acl public-read $deb $s3/$locdeb
> +            updated_deb=1
> +        done
> +    done
> +
> +    # 1(sources). use reprepro tool to generate Sources file
> +    for dsc in $ws/$debdir/$dist/$component/*/*/*.dsc ; do
> +        [ -f $dsc ] || continue
> +        locdsc=`echo $dsc | sed "s#^$ws\/##g"`
> +        echo "DSC: $dsc"
> +        # register the DSC file in the Sources file
> +        reprepro -Vb . includedsc $dist $dsc
> +        # reprepro copies the DSC file into the local component dir, which is not needed
> +        rm -rf $debdir/$component
> +        # drop reprepro's own registry so the DSC file is not tracked there and all sources stay visible
> +        rm -rf db
> +        # keep a copy of the Sources file so it is not dropped when the next DSC version is registered
> +        sources=dists/$dist/$component/source/Sources
> +        if [ ! -f $sources.saved ] ; then
> +            # get the latest Sources file from S3
> +            $aws s3 ls "$s3/$sources" && \
> +                $aws s3 cp --acl public-read "$s3/$sources" $sources.saved || \
> +                touch $sources.saved
> +        fi
> +        # WORKAROUND: for an unknown reason reprepro doesn't save the Sources file
> +        gunzip -c $sources.gz >$sources
> +        # check if the DSC file already exists in Sources from S3
> +        hash=`grep '^Checksums-Sha256:' -A3 $sources | \
> +                tail -n 1 | awk '{print $1}'`
> +        if grep " $hash .*\.dsc$" $sources.saved ; then
> +            echo "WARNING: DSC file already registered in S3!"
> +            continue
> +        fi
> +        # store the new DSC entry
> +        cat $sources >>$sources.saved
> +        # save the registered DSC file to S3
> +        $aws s3 cp --acl public-read $dsc $s3/$locdsc
> +        tarxz=`echo $locdsc | sed 's#\.dsc$#.debian.tar.xz#g'`
> +        $aws s3 cp --acl public-read $ws/$tarxz "$s3/$tarxz"
> +        orig=`echo $locdsc | sed 's#-1\.dsc$#.orig.tar.xz#g'`
> +        $aws s3 cp --acl public-read $ws/$orig "$s3/$orig"
> +        updated_dsc=1
> +    done
> +
> +    # check if any DEB/DSC files were newly registered
> +    [ "$update_dists" == "0" -a "$updated_deb" == "0" -a "$updated_dsc" == "0" ] && \
> +        continue || echo "Updating dists"
> +
> +    # finalize the Packages file
> +    for packages in dists/$dist/$component/binary-*/Packages ; do
> +        mv $packages.saved $packages
> +    done
> +
> +    # 2(binaries). update Packages file archives
> +    for packpath in dists/$dist/$component/binary-* ; do
> +        cd $packpath
> +        sed "s#Filename: $debdir/$component/#Filename: $debdir/$dist/$component/#g" -i Packages
> +        bzip2 -c Packages >Packages.bz2
> +        gzip -c Packages >Packages.gz
> +        cd -
> +    done
> +
> +    # 2(sources). update Sources file archives
> +    cd dists/$dist/$component/source
> +    sed "s#Directory: $debdir/$component/#Directory: $debdir/$dist/$component/#g" -i Sources
> +    bzip2 -c Sources >Sources.bz2
> +    gzip -c Sources >Sources.gz
> +    cd -
> +
> +    # 3. update checksums of the Packages* files in *Release files
> +    cd dists/$dist
> +    for file in `grep " $component/" Release | awk '{print $3}' | sort -u` ; do
> +        sz=`stat -c "%s" $file`
> +        md5=`md5sum $file | awk '{print $1}'`
> +        sha1=`sha1sum $file | awk '{print $1}'`
> +        sha256=`sha256sum $file | awk '{print $1}'`
> +        awk 'BEGIN{c = 0} ; {
> +                if ($3 == p) {
> +                    c = c + 1
> +                    if (c == 1) {print " " md  " " s " " p}
> +                    if (c == 2) {print " " sh1 " " s " " p}
> +                    if (c == 3) {print " " sh2 " " s " " p}
> +                } else {print $0}
> +            }' p="$file" s="$sz" md="$md5" sh1="$sha1" sh2="$sha256" \
> +            Release >Release.new
> +        mv Release.new Release
> +    done
> +    # re-sign the self-signed InRelease file
> +    rm -rf InRelease
> +    gpg --clearsign -o InRelease Release
> +    # re-sign the Release file
> +    rm -rf Release.gpg
> +    gpg -abs -o Release.gpg Release
> +    cd -
> +
> +    # 4. sync the latest distribution path changes to S3
> +    $aws s3 sync --acl public-read dists/$dist "$s3/dists/$dist"
> +done
> +
> +# unlock the publishing
> +rm -rf $wslock
> diff --git a/tools/pub_packs_s3_rpm.sh b/tools/pub_packs_s3_rpm.sh
> new file mode 100755
> index 000000000..85422a8ea
> --- /dev/null
> +++ b/tools/pub_packs_s3_rpm.sh
> @@ -0,0 +1,129 @@
> +#!/bin/bash
> +set -xe
> +
> +# manual configuration
> +if [ "$BUCKET" == "" ]; then
> +    echo "ERROR: need to set BUCKET env variable to a value like <1_10,2_2,2_3,2x>"
> +    exit 1
> +fi
> +branch=$BUCKET
> +
> +# configuration
> +if [ "$OS" == "" ]; then
> +    echo "ERROR: need to set OS env variable to any of <centos,fedora>"
> +    exit 1
> +fi
> +os=$OS
> +if [ "$RELEASE" == "" ]; then
> +    echo "ERROR: need to set RELEASE env variable to any of <6,7,27-30>"
> +    exit 1
> +fi
> +release=$RELEASE
> +
> +aws='aws --endpoint-url https://hb.bizmrg.com'
> +s3="s3://tarantool_repo/$branch"
> +
> +# get packages from the given location, or from the current directory by default
> +repo=$1
> +if [ "$repo" == "" ] ; then
> +    repo=.
> +fi
> +if ! ls $repo/*.rpm >/dev/null 2>&1 ; then
> +    echo "Error: Current '$repo' has:"
> +    ls -al $repo
> +    echo "Usage: $0 [path with *.rpm files]"
> +    exit 1
> +fi
> +
> +# temporarily lock publishing to the repository
> +ws=/tmp/tarantool_repo_s3_${branch}_${os}_${release}
> +wslock=$ws.lock
> +lockfile -l 1000 $wslock
> +
> +# create temporary workspace with packages copies
> +rm -rf $ws
> +mkdir -p $ws
> +cp $repo/*.rpm $ws/.
> +cd $ws
> +
> +# set the paths
> +if [ "$os" == "centos" -o "$os" == "el" ]; then
> +    repopath=centos/$release/os/x86_64
> +    rpmpath=Packages
> +elif [ "$os" == "fedora" ]; then
> +    repopath=fedora/releases/$release/Everything/x86_64/os
> +    rpmpath=Packages/t
> +fi
> +packpath=$repopath/$rpmpath
> +
> +# prepare local repository with packages
> +mkdir -p $packpath
> +mv *.rpm $packpath/.
> +cd $repopath
> +
> +# copy the current metadata files from S3
> +mkdir repodata.base
> +for file in `$aws s3 ls $s3/$repopath/repodata/ | awk '{print $NF}'` ; do
> +    $aws s3 ls $s3/$repopath/repodata/$file || continue
> +    $aws s3 cp $s3/$repopath/repodata/$file repodata.base/$file
> +done
> +
> +# create the new repository metadata files
> +createrepo --no-database --update --workers=2 --compress-type=gz --simple-md-filenames .
> +mv repodata repodata.adding
> +
> +# merge metadata files
> +mkdir repodata
> +head -n 2 repodata.adding/repomd.xml >repodata/repomd.xml
> +for file in filelists.xml other.xml primary.xml ; do
> +    # 1. take the 1st line only, skipping the line with the packages count (it is rebuilt in step 2)
> +    zcat repodata.adding/$file.gz | head -n 1 >repodata/$file
> +    # 2. take 2nd line with metadata tag and update the packages number in it
> +    packsold=0
> +    if [ -f repodata.base/$file.gz ] ; then
> +        packsold=`zcat repodata.base/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
> +    fi
> +    packsnew=`zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed 's#.*packages="\(.*\)".*#\1#g'`
> +    packs=$(($packsold+$packsnew))
> +    zcat repodata.adding/$file.gz | head -n 2 | tail -n 1 | sed "s#packages=\".*\"#packages=\"$packs\"#g" >>repodata/$file
> +    # 3. take only the 'package' tags from the new file
> +    zcat repodata.adding/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
> +    # 4. take only the 'package' tags from the old file, if it exists
> +    if [ -f repodata.base/$file.gz ] ; then
> +        zcat repodata.base/$file.gz | tail -n +3 | head -n -1 >>repodata/$file
> +    fi
> +    # 5. take the last line with the closing metadata tag
> +    zcat repodata.adding/$file.gz | tail -n 1 >>repodata/$file
> +
> +    # get the new data
> +    chsnew=`sha256sum repodata/$file | awk '{print $1}'`
> +    sz=`stat --printf="%s" repodata/$file`
> +    gzip repodata/$file
> +    chsgznew=`sha256sum repodata/$file.gz | awk '{print $1}'`
> +    szgz=`stat --printf="%s" repodata/$file.gz`
> +    timestamp=`date +%s -r repodata/$file.gz`
> +
> +    # add info to repomd.xml file
> +    name=`echo $file | sed 's#\.xml$##g'`
> +    echo "<data type=\"$name\">" >>repodata/repomd.xml
> +    echo "  <checksum type=\"sha256\">$chsgznew</checksum>" >>repodata/repomd.xml
> +    echo "  <open-checksum type=\"sha256\">$chsnew</open-checksum>" >>repodata/repomd.xml
> +    echo "  <location href=\"repodata/$file.gz\"/>" >>repodata/repomd.xml
> +    echo "  <timestamp>$timestamp</timestamp>" >>repodata/repomd.xml
> +    echo "  <size>$szgz</size>" >>repodata/repomd.xml
> +    echo "  <open-size>$sz</open-size>" >>repodata/repomd.xml
> +    echo "</data>" >>repodata/repomd.xml
> +done
> +tail -n 1 repodata.adding/repomd.xml >>repodata/repomd.xml
> +gpg --detach-sign --armor repodata/repomd.xml
> +
> +# copy the packages to S3
> +for file in $rpmpath/*.rpm ; do
> +    $aws s3 cp --acl public-read $file "$s3/$repopath/$file"
> +done
> +
> +# update the metadata at S3
> +$aws s3 sync --acl public-read repodata "$s3/$repopath/repodata"
> +
> +# unlock the publishing
> +rm -rf $wslock
> -- 
> 2.17.1
> 

-- 
Best regards,
IM
