[Tarantool-patches] [PATCH v1] gitlab-ci: save sources to new S3 location

Sergey Bronnikov sergeyb at tarantool.org
Fri Sep 11 16:19:13 MSK 2020


LGTM

On 11.09.2020 15:54, Alexander V. Tikhonov wrote:
> Hi Sergey, thanks for the review, please check my comments below.
>
> On Fri, Sep 11, 2020 at 02:10:14PM +0300, Sergey Bronnikov wrote:
>> Hello, Alexander!
>>
>> Thanks for the patch. See my 5 comments.
>>
>> On 02.09.2020 14:46, Alexander V. Tikhonov wrote:
>>> Changed S3 location for source tarballs. Also added the ability to
>>> create the S3 directory for the tarballs if it did not exist.
>>> ---
>>>
>>> Github: https://github.com/tarantool/tarantool/tree/avtikhon/src-s3-new
>>>
>>>    .gitlab.mk | 10 ++++++++--
>>>    1 file changed, 8 insertions(+), 2 deletions(-)
>>>
>>> diff --git a/.gitlab.mk b/.gitlab.mk
>>> index 0ce978063..708c69082 100644
>>> --- a/.gitlab.mk
>>> +++ b/.gitlab.mk
>>> @@ -108,6 +108,7 @@ GIT_DESCRIBE=$(shell git describe HEAD)
>>>    MAJOR_VERSION=$(word 1,$(subst ., ,$(GIT_DESCRIBE)))
>>>    MINOR_VERSION=$(word 2,$(subst ., ,$(GIT_DESCRIBE)))
>>>    BUCKET="$(MAJOR_VERSION).$(MINOR_VERSION)"
>>> +S3_BUCKET="s3://tarantool_repo/sources/$(BUCKET)"
>> 1. Wrong parentheses used around BUCKET.
>>
> That's OK for the current line, because it depends on Makefile rules.
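>
> Just for illustration, a tiny sketch of the rule in question (the version
> value is only an example): in GNU Make, $(NAME) and ${NAME} both refer to
> the same make variable, so the parentheses here are valid:
>
>     BUCKET = 2.6
>     S3_BUCKET = "s3://tarantool_repo/sources/$(BUCKET)"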
>
>> 2. You already have variable BUCKET and add a new variable S3_BUCKET.
>>
> BUCKET is a simple bucket name;
> S3_BUCKET has been renamed to S3_BUCKET_URL to make it more informative.
>
>> For my taste it is difficult to understand what they are needed for when
>> looking at the names.
>>>    deploy_prepare:
>>>    	[ -d packpack ] || \
>>> @@ -130,8 +131,13 @@ source: deploy_prepare
>>>    	TARBALL_COMPRESSOR=gz packpack/packpack tarball
>>>    source_deploy: source
>>> -	aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 cp build/*.tar.gz \
>>> -		"s3://tarantool.${BUCKET}.src/" --acl public-read
>>> +	( aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" s3 ls "${S3_BUCKET}/" || \
>>> +		( rm -rf "${BUCKET}" ; mkdir "${BUCKET}" &&
>> 3. Didn't get the point of using the 'ls' operation here.
>>
> The AWS S3 tools are too limited, so 'ls' is used there to check the
> existence of the directory.
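>
> Just to illustrate the idea (the prefix below is only an example): 'aws s3
> ls' exits with a non-zero status when nothing is found under the given
> prefix, so its exit code can be used as the existence check:
>
>     if ! aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" \
>             s3 ls "s3://tarantool_repo/sources/2.6/"; then
>         echo "prefix does not exist yet, it has to be created first"
>     fi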
>
>> 4. Moreover, it is probably better to suppress the output of the listing
>> command, as it is not used here.
>>
> Here it is really needed to keep the result in the output.
>
>> 5. In the command chain above you check the existence of the remote bucket
>> and then remove the local directory.
>>
> Unfortunately AWS S3 doesn't have any command to create a directory.
> The only way to create one is to recursively copy an empty local
> directory, so that empty local directory is needed for it.
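>
> A rough sketch of the trick (paths are examples only); S3 itself has no
> real directories, the prefix just becomes visible once objects get
> uploaded under it:
>
>     mkdir -p 2.6
>     aws --endpoint-url "${AWS_S3_ENDPOINT_URL}" \
>         s3 mv 2.6 "s3://tarantool_repo/sources/2.6" \
>         --recursive --acl public-read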
>
>> Probably you should check the existence of the local directory too.
>> Otherwise 'rm' will fail when the directory does not exist.
>>
>>> +			aws --endpoint-url "${AWS_S3_ENDPOINT_URL}"            \
>>> +				s3 mv "${BUCKET}" "${S3_BUCKET}" --recursive   \
>>> +				--acl public-read ) ) &&                       \
>>> +		aws --endpoint-url "${AWS_S3_ENDPOINT_URL}"                    \
>>> +			s3 cp build/*.tar.gz "${S3_BUCKET}/" --acl public-read
>>>    # ###################
>>>    # Performance testing

