Tarantool development patches archive
* [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script
@ 2020-03-30  5:38 Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 1/3] Add metafiles cleanup routines at S3 pack script Alexander V. Tikhonov
                   ` (4 more replies)
  0 siblings, 5 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-03-30  5:38 UTC (permalink / raw)
  To: Oleg Piskunov, Sergey Bronnikov, Alexander Turenko; +Cc: tarantool-patches

The script for pushing packages to S3 has been extended with new
features and fixes:

1) added the ability to fix metafiles for missing/broken packages in S3
2) added the ability to work with packages consisting of only a binaries
   part or only a sources part
3) added usage instructions for the '-p|--product' option

Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
Issue: https://github.com/tarantool/tarantool/issues/3380

Alexander V. Tikhonov (1):
  Add metafiles cleanup routines at S3 pack script

Alexander.V Tikhonov (2):
  Enable script for saving packages in S3 for modules
  Add help instruction on 'product' option

 tools/update_repo.sh | 277 +++++++++++++++++++++++++++++++++----------
 1 file changed, 212 insertions(+), 65 deletions(-)

-- 
2.17.1

* [Tarantool-patches] [PATCH v1 1/3] Add metafiles cleanup routines at S3 pack script
  2020-03-30  5:38 [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Alexander V. Tikhonov
@ 2020-03-30  5:38 ` Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 2/3] Enable script for saving packages in S3 for modules Alexander V. Tikhonov
                   ` (3 subsequent siblings)
  4 siblings, 0 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-03-30  5:38 UTC (permalink / raw)
  To: Oleg Piskunov, Sergey Bronnikov, Alexander Turenko; +Cc: tarantool-patches

Added cleanup functionality for the meta files.
The script may encounter the following situations:

 - a package file was removed from S3, but is still registered:
   the script stores and registers the new package at S3 and
   removes all other registered blocks for the same file from
   the meta files.

 - a package file already exists at S3 with the same hash:
   the script skips it with a warning message.

 - a package file already exists at S3 with an old hash:
   the script fails without the force flag; otherwise it stores
   and registers the new package at S3 and removes all other
   registered blocks for the same file from the meta files.
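
As a minimal standalone sketch (not part of the patch; the file name
and hash are placeholders), this is how a stale 'Package:' block can
be dropped from a saved Debian 'Packages' index with the same pcregrep
multiline pattern the script uses:

    bad_hash=<stale sha256 checksum>
    # drop every "Package:" block that still registers the stale checksum
    pcregrep -Mi -v "(?s)Package: (\N+\n)+(?=SHA256: ${bad_hash}).*?^$" \
        Packages.saved >Packages.saved_new
    mv Packages.saved_new Packages.saved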

Added the '-s|--skip_errors' option to skip errors on changed
packages and avoid aborting the script run.
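
For example, a hypothetical invocation (bucket, distribution and local
path are placeholders; option syntax follows the script's usage text):

    # skip errors on changed packages instead of exiting
    ./tools/update_repo.sh -b=s3://<target repository> -o=debian \
        -d=stretch -s <local path with packages>
    # or overwrite the changed packages despite the checksum difference
    ./tools/update_repo.sh -b=s3://<target repository> -o=debian \
        -d=stretch -f <local path with packages>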

Follow-up #3380
---
 tools/update_repo.sh | 204 +++++++++++++++++++++++++++++++++----------
 1 file changed, 159 insertions(+), 45 deletions(-)

diff --git a/tools/update_repo.sh b/tools/update_repo.sh
index f49569b73..ddc44d118 100755
--- a/tools/update_repo.sh
+++ b/tools/update_repo.sh
@@ -9,6 +9,7 @@ ws_prefix=/tmp/tarantool_repo_s3
 alloss='ubuntu debian el fedora'
 product=tarantool
 force=
+skip_errors=
 # the path with binaries either repository
 repo=.
 
@@ -82,6 +83,8 @@ EOF
          Product name to be packed with, default name is 'tarantool'
     -f|--force
          Force updating the remote package with the local one despite the checksum difference
+    -s|--skip_errors
+         Skip failing on changed packages
     -h|--help
          Usage help message
 EOF
@@ -114,6 +117,9 @@ case $i in
     -f|--force)
     force=1
     ;;
+    -s|--skip_errors)
+    skip_errors=1
+    ;;
     -h|--help)
     usage
     exit 0
@@ -169,6 +175,9 @@ function update_deb_packfile {
 function update_deb_metadata {
     packpath=$1
     packtype=$2
+    packfile=$3
+
+    file_exists=''
 
     if [ ! -f $packpath.saved ] ; then
         # get the latest Sources file from S3 either create empty file
@@ -185,38 +194,94 @@ function update_deb_metadata {
         # find the hash from the new Sources file
         hash=$(grep '^Checksums-Sha256:' -A3 $packpath | \
             tail -n 1 | awk '{print $1}')
+        # check if the file already exists in S3
+        if $aws ls "$bucket_path/$packfile" ; then
+            echo "WARNING: DSC file already exists in S3!"
+            file_exists=$bucket_path/$packfile
+        fi
         # search the new hash in the old Sources file from S3
         if grep " $hash .* .*$" $packpath.saved ; then
             echo "WARNING: DSC file already registered in S3!"
-            return
+            echo "File hash: $hash"
+            if [ "$file_exists" != "" ] ; then
+                return
+            fi
         fi
         # check if the DSC file already exists in old Sources file from S3
         file=$(grep '^Files:' -A3 $packpath | tail -n 1 | awk '{print $3}')
-        if [ "$force" == "" ] && grep " .* .* $file$" $packpath.saved ; then
-            echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
-            echo "New hash: $hash"
-            # unlock the publishing
-            $rm_file $ws_lockfile
-            exit 1
+        if grep " .* .* $file$" $packpath.saved ; then
+            if [ "$force" == "" -a "$file_exists" != "" ] ; then
+                if [ "$skip_errors" == "" ] ; then
+                    echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    # unlock the publishing
+                    $rm_file $ws_lockfile
+                    exit 1
+                else
+                    echo "WARNING: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    return
+                fi
+            fi
+            hashes_old=$(grep '^Checksums-Sha256:' -A3 $packpath.saved | \
+                grep " .* .* $file" | awk '{print $1}')
+            # NOTE: a single file name may have more than one entry
+            #       in a damaged file; to fix it, all found entries
+            #       of this file need to be removed
+            # find and remove all package blocks for the bad hashes
+            for hash_rm in $hashes_old ; do
+                echo "Removing from $packpath.saved file old hash: $hash_rm"
+                pcregrep -Mi -v "(?s)Package: (\N+\n)+(?=^ ${hash_rm}).*?^$" \
+                    $packpath.saved >$packpath.saved_new
+                mv $packpath.saved_new $packpath.saved
+            done
         fi
         updated_dsc=1
     elif [ "$packtype" == "deb" ]; then
         # check if the DEB file already exists in old Packages file from S3
         # find the hash from the new Packages file
-        hash=$(grep '^SHA256: ' $packpath)
+        hash=$(grep '^SHA256: ' $packpath | awk '{print $2}')
+        # check if the file already exists in S3
+        if $aws ls "$bucket_path/$packfile" ; then
+            echo "WARNING: DEB file already exists in S3!"
+            file_exists=$bucket_path/$packfile
+        fi
         # search the new hash in the old Packages file from S3
         if grep "^SHA256: $hash" $packpath.saved ; then
             echo "WARNING: DEB file already registered in S3!"
-            return
+            echo "File hash: $hash"
+            if [ "$file_exists" != "" ] ; then
+                return
+            fi
         fi
         # check if the DEB file already exists in old Packages file from S3
         file=$(grep '^Filename:' $packpath | awk '{print $2}')
-        if [ "$force" == "" ] && grep "Filename: $file$" $packpath.saved ; then
-            echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
-            echo "New hash: $hash"
-            # unlock the publishing
-            $rm_file $ws_lockfile
-            exit 1
+        if grep "Filename: $file$" $packpath.saved ; then
+            if [ "$force" == "" -a "$file_exists" != "" ] ; then
+                if [ "$skip_errors" == "" ] ; then
+                    echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    # unlock the publishing
+                    $rm_file $ws_lockfile
+                    exit 1
+                else
+                    echo "WARNING: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    return
+                fi
+            fi
+            hashes_old=$(grep -e "^Filename: " -e "^SHA256: " $packpath.saved | \
+                grep -A1 "$file" | grep "^SHA256: " | awk '{print $2}')
+            # NOTE: a single file name may have more than one entry
+            #       in a damaged file; to fix it, all found entries
+            #       of this file need to be removed
+            # find and remove all package blocks for the bad hashes
+            for hash_rm in $hashes_old ; do
+                echo "Removing from $packpath.saved file old hash: $hash_rm"
+                pcregrep -Mi -v "(?s)Package: (\N+\n)+(?=SHA256: ${hash_rm}).*?^$" \
+                    $packpath.saved >$packpath.saved_new
+                mv $packpath.saved_new $packpath.saved
+            done
         fi
         updated_deb=1
     fi
@@ -248,9 +313,6 @@ function pack_deb {
         exit 1
     fi
 
-    # prepare the workspace
-    prepare_ws ${os}
-
     # set the subpath with binaries based on literal character of the product name
     proddir=$(echo $product | head -c 1)
 
@@ -297,7 +359,7 @@ EOF
             for packages in dists/$loop_dist/$component/binary-*/Packages ; do
                 # copy Packages file to avoid of removing by the new DEB version
                 # update metadata 'Packages' files
-                update_deb_metadata $packages deb
+                update_deb_metadata $packages deb $locpackfile
                 [ "$updated_deb" == "1" ] || continue
                 updated_files=1
             done
@@ -316,7 +378,8 @@ EOF
             echo "Regenerated DSC file: $locpackfile"
             # copy Sources file to avoid of removing by the new DSC version
             # update metadata 'Sources' file
-            update_deb_metadata dists/$loop_dist/$component/source/Sources dsc
+            update_deb_metadata dists/$loop_dist/$component/source/Sources dsc \
+                $locpackfile
             [ "$updated_dsc" == "1" ] || continue
             updated_files=1
             # save the registered DSC file to S3
@@ -398,11 +461,6 @@ EOF
         # 4. sync the latest distribution path changes to S3
         $aws_sync_public dists/$loop_dist "$bucket_path/dists/$loop_dist"
     done
-
-    # unlock the publishing
-    $rm_file $ws_lockfile
-
-    popd
 }
 
 # The 'pack_rpm' function especialy created for RPM packages. It works
@@ -426,9 +484,6 @@ function pack_rpm {
         exit 1
     fi
 
-    # prepare the workspace
-    prepare_ws ${os}_${option_dist}
-
     # copy the needed package binaries to the workspace
     ( cd $repo && cp $pack_rpms $ws/. )
 
@@ -460,29 +515,76 @@ function pack_rpm {
     for hash in $(zcat repodata/other.xml.gz | grep "<package pkgid=" | \
         awk -F'"' '{print $2}') ; do
         updated_rpm=0
+        file_exists=''
         name=$(zcat repodata/other.xml.gz | grep "<package pkgid=\"$hash\"" | \
             awk -F'"' '{print $4}')
+        file=$(zcat repodata/primary.xml.gz | \
+            grep -e "<checksum type=" -e "<location href=" | \
+            grep "$hash" -A1 | grep "<location href=" | \
+            awk -F'"' '{print $2}')
+        # check if the file already exists in S3
+        if $aws ls "$bucket_path/$repopath/$file" ; then
+            echo "WARNING: RPM file already exists in S3!"
+            file_exists=$bucket_path/$repopath/$file
+        fi
         # search the new hash in the old meta file from S3
         if zcat repodata.base/filelists.xml.gz | grep "pkgid=\"$hash\"" | \
             grep "name=\"$name\"" ; then
             echo "WARNING: $name file already registered in S3!"
             echo "File hash: $hash"
-            continue
+            if [ "$file_exists" != "" ] ; then
+                continue
+            fi
         fi
         updated_rpms=1
         # check if the hashed file already exists in old meta file from S3
-        file=$(zcat repodata/primary.xml.gz | \
-            grep -e "<checksum type=" -e "<location href=" | \
-            grep "$hash" -A1 | grep "<location href=" | \
-            awk -F'"' '{print $2}')
-        # check if the file already exists in S3
-        if [ "$force" == "" ] && zcat repodata.base/primary.xml.gz | \
+        if zcat repodata.base/primary.xml.gz | \
                 grep "<location href=\"$file\"" ; then
-            echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
-            echo "New hash: $hash"
-            # unlock the publishing
-            $rm_file $ws_lockfile
-            exit 1
+            if [ "$force" == "" -a "$file_exists" != "" ] ; then
+                if [ "$skip_errors" == "" ] ; then
+                    echo "ERROR: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    # unlock the publishing
+                    $rm_file $ws_lockfile
+                    exit 1
+                else
+                    echo "WARNING: the file already exists, but changed, set '-f' to overwrite it: $file"
+                    echo "New hash: $hash"
+                    continue
+                fi
+            fi
+            hashes_old=$(zcat repodata.base/primary.xml.gz | \
+                grep -e "<checksum type=" -e "<location href=" | \
+                grep -B1 "$file" | grep "<checksum type=" | \
+                awk -F'>' '{print $2}' | sed 's#<.*##g')
+            # NOTE: a single file name may have more than one entry
+            #       in a damaged file; to fix it, all found entries
+            #       of this file need to be removed
+            for metafile in repodata.base/other \
+                            repodata.base/filelists \
+                            repodata.base/primary ; do
+                up_full_lines=''
+                if [ "$metafile" == "repodata.base/primary" ]; then
+                    up_full_lines='(\N+\n)*'
+                fi
+                packs_rm=0
+                # find and remove all <package> tags for the bad hashes
+                for hash_rm in $hashes_old ; do
+                    echo "Removing from ${metafile}.xml.gz file old hash: $hash_rm"
+                    zcat ${metafile}.xml.gz | \
+                        pcregrep -Mi -v "(?s)<package ${up_full_lines}\N+(?=${hash_rm}).*?package>" | \
+                        gzip - >${metafile}_new.xml.gz
+                    mv ${metafile}_new.xml.gz ${metafile}.xml.gz
+                    packs_rm=$(($packs_rm+1))
+                done
+                # decrease the packages counter in the metafile
+                gunzip ${metafile}.xml.gz
+                packs=$(($(grep " packages=" ${metafile}.xml | \
+                    sed 's#.* packages="\([0-9]*\)".*#\1#g')-${packs_rm}))
+                sed "s# packages=\"[0-9]*\"# packages=\"${packs}\"#g" \
+                    -i ${metafile}.xml
+                gzip ${metafile}.xml
+            done
         fi
     done
 
@@ -554,22 +656,34 @@ EOF
 
     # update the metadata at the S3
     $aws_sync_public repodata "$bucket_path/$repopath/repodata"
-
-    # unlock the publishing
-    $rm_file $ws_lockfile
-
-    popd
 }
 
 if [ "$os" == "ubuntu" -o "$os" == "debian" ]; then
+    # prepare the workspace
+    prepare_ws ${os}
     pack_deb
+    # unlock the publishing
+    $rm_file $ws_lockfile
+    popd
 elif [ "$os" == "el" -o "$os" == "fedora" ]; then
     # RPM packages structure needs different paths for binaries and sources
     # packages, in this way it is needed to call the packages registering
     # script twice with the given format:
     # pack_rpm <packages store subpath> <patterns of the packages to register>
+
+    # prepare the workspace
+    prepare_ws ${os}_${option_dist}
     pack_rpm x86_64 "*.x86_64.rpm *.noarch.rpm"
+    # unlock the publishing
+    $rm_file $ws_lockfile
+    popd
+
+    # prepare the workspace
+    prepare_ws ${os}_${option_dist}
     pack_rpm SRPMS "*.src.rpm"
+    # unlock the publishing
+    $rm_file $ws_lockfile
+    popd
 else
     echo "USAGE: given OS '$os' is not supported, use any single from the list: $alloss"
     usage
-- 
2.17.1

* [Tarantool-patches] [PATCH v1 2/3] Enable script for saving packages in S3 for modules
  2020-03-30  5:38 [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 1/3] Add metafiles cleanup routines at S3 pack script Alexander V. Tikhonov
@ 2020-03-30  5:38 ` Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 3/3] Add help instruction on 'product' option Alexander V. Tikhonov
                   ` (2 subsequent siblings)
  4 siblings, 0 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-03-30  5:38 UTC (permalink / raw)
  To: Oleg Piskunov, Sergey Bronnikov, Alexander Turenko; +Cc: tarantool-patches

From: "Alexander.V Tikhonov" <avtikhon@tarantool.org>

Found that modules may provide only binary packages, without
source packages. The script was changed to be able to work with
only binary or only source packages.
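
For example, a hypothetical invocation for a binaries-only module
repository (bucket, product name and local path are placeholders;
option syntax follows the script's usage text):

    # the local path holds only *.deb binaries, no *.dsc / *.tar.*z sources
    ./tools/update_repo.sh -b=s3://<target repository> -o=debian \
        -d=stretch -p=<module product name> <local path with packages>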

Follow-up #3380
---
 tools/update_repo.sh | 51 +++++++++++++++++++++++++++-----------------
 1 file changed, 32 insertions(+), 19 deletions(-)

diff --git a/tools/update_repo.sh b/tools/update_repo.sh
index ddc44d118..975a76c00 100755
--- a/tools/update_repo.sh
+++ b/tools/update_repo.sh
@@ -306,20 +306,24 @@ function pack_deb {
     # debian has special directory 'pool' for packages
     debdir=pool
 
-    # get packages from pointed location
-    if ! ls $repo/*.deb $repo/*.dsc $repo/*.tar.*z >/dev/null ; then
-        echo "ERROR: files $repo/*.deb $repo/*.dsc $repo/*.tar.*z not found"
-        usage
-        exit 1
-    fi
-
     # set the subpath with binaries based on literal character of the product name
     proddir=$(echo $product | head -c 1)
 
     # copy single distribution with binaries packages
     repopath=$ws/pool/${option_dist}/$component/$proddir/$product
     $mk_dir ${repopath}
-    cp $repo/*.deb $repo/*.dsc $repo/*.tar.*z $repopath/.
+    file_found=0
+    for file in $repo/*.deb $repo/*.dsc $repo/*.tar.*z ; do
+        [ -f $file ] || continue
+        file_found=1
+        cp $file $repopath/.
+    done
+    # check that at least one file was found
+    if [ "$file_found" == "0" ]; then
+        echo "ERROR: files $repo/*.deb $repo/*.dsc $repo/*.tar.*z not found"
+        usage
+        exit 1
+    fi
     pushd $ws
 
     # create the configuration file for 'reprepro' tool
@@ -395,28 +399,36 @@ EOF
             continue || echo "Updating dists"
 
         # finalize the Packages file
-        for packages in dists/$loop_dist/$component/binary-*/Packages ; do
-            mv $packages.saved $packages
-        done
+        for packages in dists/$loop_dist/$component/binary-*/Packages ; do
+            # the Packages file may be absent for a sources-only run
+            [ -f $packages.saved ] || continue
+            mv $packages.saved $packages
+        done
 
         # finalize the Sources file
-        sources=dists/$loop_dist/$component/source/Sources
-        mv $sources.saved $sources
+        sources=dists/$loop_dist/$component/source/Sources
+        if [ -f $sources.saved ]; then
+            mv $sources.saved $sources
+        fi
 
         # 2(binaries). update Packages file archives
         for packpath in dists/$loop_dist/$component/binary-* ; do
             pushd $packpath
-            sed "s#Filename: $debdir/$component/#Filename: $debdir/$loop_dist/$component/#g" -i Packages
-            bzip2 -c Packages >Packages.bz2
-            gzip -c Packages >Packages.gz
+            if [ -f Packages ]; then
+                sed "s#Filename: $debdir/$component/#Filename: $debdir/$loop_dist/$component/#g" -i Packages
+                bzip2 -c Packages >Packages.bz2
+                gzip -c Packages >Packages.gz
+            fi
             popd
         done
 
         # 2(sources). update Sources file archives
         pushd dists/$loop_dist/$component/source
-        sed "s#Directory: $debdir/$component/#Directory: $debdir/$loop_dist/$component/#g" -i Sources
-        bzip2 -c Sources >Sources.bz2
-        gzip -c Sources >Sources.gz
+        if [ -f Sources ]; then
+            sed "s#Directory: $debdir/$component/#Directory: $debdir/$loop_dist/$component/#g" -i Sources
+            bzip2 -c Sources >Sources.bz2
+            gzip -c Sources >Sources.gz
+        fi
         popd
 
         # 3. update checksums entries of the Packages* files in *Release files
@@ -435,6 +447,7 @@ EOF
         #       'sha1' - at the 2nd and 'sha256' at the 3rd
         pushd dists/$loop_dist
         for file in $(grep " $component/" Release | awk '{print $3}' | sort -u) ; do
+            [ -f $file ] || continue
             sz=$(stat -c "%s" $file)
             md5=$(md5sum $file | awk '{print $1}')
             sha1=$(sha1sum $file | awk '{print $1}')
-- 
2.17.1

* [Tarantool-patches] [PATCH v1 3/3] Add help instruction on 'product' option
  2020-03-30  5:38 [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 1/3] Add metafiles cleanup routines at S3 pack script Alexander V. Tikhonov
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 2/3] Enable script for saving packages in S3 for modules Alexander V. Tikhonov
@ 2020-03-30  5:38 ` Alexander V. Tikhonov
  2020-03-30 14:40 ` [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Sergey Bronnikov
  2020-04-15 13:51 ` Kirill Yukhin
  4 siblings, 0 replies; 7+ messages in thread
From: Alexander V. Tikhonov @ 2020-03-30  5:38 UTC (permalink / raw)
  To: Oleg Piskunov, Sergey Bronnikov, Alexander Turenko; +Cc: tarantool-patches

From: "Alexander.V Tikhonov" <avtikhon@tarantool.org>

Added instructions for the 'product' option with examples.

Follow-up #3380
---
 tools/update_repo.sh | 22 +++++++++++++++++++++-
 1 file changed, 21 insertions(+), 1 deletion(-)

diff --git a/tools/update_repo.sh b/tools/update_repo.sh
index 975a76c00..85df7f795 100755
--- a/tools/update_repo.sh
+++ b/tools/update_repo.sh
@@ -80,7 +80,27 @@ EOF
     done
     cat <<EOF
     -p|--product
-         Product name to be packed with, default name is 'tarantool'
+         Product name to be packed with, default name is 'tarantool'.
+         The product name value may affect the location of *.deb, *.rpm
+         and related files relative to the base repository URL. It can
+         be provided or not: the script will generate correct repository
+         metainfo anyway.
+         However, providing a meaningful value for this option enables
+         grouping of a related set of packages into a subdirectory
+         named '${product}' (only for Deb repositories at the moment
+         of writing).
+         It is enabled here for consistency with the locations of other
+         Deb packages in our repositories, but in fact it is an
+         internal detail which does not lead to any change in the
+         user experience.
+         Example of usage for the 'tarantool-queue' product:
+           - for DEB packages:
+             ./$0 -b=s3://<target repository> -o=debian -d=stretch \\
+               <DEB repository>/debian/pool/stretch/main/t/tarantool-queue \\
+               -p=tarantool-queue
+           - for RPM packages:
+             # prepare a local path with only the needed "tarantool-queue-*" packages
+             ./$0 -b=s3://<target repository> -o=fedora -d=30 <local path>
     -f|--force
          Force updating the remote package with the local one despite the checksum difference
     -s|--skip_errors
-- 
2.17.1

* Re: [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script
  2020-03-30  5:38 [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Alexander V. Tikhonov
                   ` (2 preceding siblings ...)
  2020-03-30  5:38 ` [Tarantool-patches] [PATCH v1 3/3] Add help instruction on 'product' option Alexander V. Tikhonov
@ 2020-03-30 14:40 ` Sergey Bronnikov
  2020-04-11  5:04   ` Oleg Piskunov
  2020-04-15 13:51 ` Kirill Yukhin
  4 siblings, 1 reply; 7+ messages in thread
From: Sergey Bronnikov @ 2020-03-30 14:40 UTC (permalink / raw)
  To: Alexander V. Tikhonov; +Cc: Oleg Piskunov, tarantool-patches

LGTM for the series. Thanks.

On 08:38 Mon 30 Mar , Alexander V. Tikhonov wrote:
> The script for pushing packages to S3 has been extended with new
> features and fixes:
> 
> 1) added the ability to fix metafiles for missing/broken packages in S3
> 2) added the ability to work with packages consisting of only a binaries
>    part or only a sources part
> 3) added usage instructions for the '-p|--product' option
> 
> Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
> Issue: https://github.com/tarantool/tarantool/issues/3380
> 
> Alexander V. Tikhonov (1):
>   Add metafiles cleanup routines at S3 pack script
> 
> Alexander.V Tikhonov (2):
>   Enable script for saving packages in S3 for modules
>   Add help instruction on 'product' option
> 
>  tools/update_repo.sh | 277 +++++++++++++++++++++++++++++++++----------
>  1 file changed, 212 insertions(+), 65 deletions(-)
> 
> -- 
> 2.17.1
> 

-- 
sergeyb@

* Re: [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script
  2020-03-30 14:40 ` [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Sergey Bronnikov
@ 2020-04-11  5:04   ` Oleg Piskunov
  0 siblings, 0 replies; 7+ messages in thread
From: Oleg Piskunov @ 2020-04-11  5:04 UTC (permalink / raw)
  To: Sergey Bronnikov; +Cc: tarantool-patches


LGTM

>Monday, March 30, 2020, 17:40 +03:00 from Sergey Bronnikov <sergeyb@tarantool.org>:
> 
>LGTM for the series. Thanks.
>
>On 08:38 Mon 30 Mar , Alexander V. Tikhonov wrote:
>> The script for pushing packages to S3 has been extended with new
>> features and fixes:
>>
>> 1) added the ability to fix metafiles for missing/broken packages in S3
>> 2) added the ability to work with packages consisting of only a binaries
>>    part or only a sources part
>> 3) added usage instructions for the '-p|--product' option
>>
>> Github:  https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
>> Issue:  https://github.com/tarantool/tarantool/issues/3380
>>
>> Alexander V. Tikhonov (1):
>> Add metafiles cleanup routines at S3 pack script
>>
>> Alexander.V Tikhonov (2):
>> Enable script for saving packages in S3 for modules
>> Add help instruction on 'product' option
>>
>> tools/update_repo.sh | 277 +++++++++++++++++++++++++++++++++----------
>> 1 file changed, 212 insertions(+), 65 deletions(-)
>>
>> --
>> 2.17.1
>>
>--
>sergeyb@
--
Oleg Piskunov

* Re: [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script
  2020-03-30  5:38 [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Alexander V. Tikhonov
                   ` (3 preceding siblings ...)
  2020-03-30 14:40 ` [Tarantool-patches] [PATCH v1 0/3] extend packages pushing to S3 script Sergey Bronnikov
@ 2020-04-15 13:51 ` Kirill Yukhin
  4 siblings, 0 replies; 7+ messages in thread
From: Kirill Yukhin @ 2020-04-15 13:51 UTC (permalink / raw)
  To: Alexander V. Tikhonov; +Cc: Oleg Piskunov, tarantool-patches

Hello,

On 30 Mar 08:38, Alexander V. Tikhonov wrote:
> The script for pushing packages to S3 has been extended with new
> features and fixes:
> 
> 1) added the ability to fix metafiles for missing/broken packages in S3
> 2) added the ability to work with packages consisting of only a binaries
>    part or only a sources part
> 3) added usage instructions for the '-p|--product' option
> 
> Github: https://github.com/tarantool/tarantool/tree/avtikhon/gh-3380-push-packages-s3-full-ci
> Issue: https://github.com/tarantool/tarantool/issues/3380

I've checked your patchset into 1.10, 2.2, 2.3 and master.

--
Regards, Kirill Yukhin
