[Tarantool-patches] [PATCH v2 04/10] crc32: align memory access
Timur Safin
tsafin at tarantool.org
Thu May 28 23:11:56 MSK 2020
: -----Original Message-----
: From: Vladislav Shpilevoy <v.shpilevoy at tarantool.org>
: Subject: [PATCH v2 04/10] crc32: align memory access
:
:
: diff --git a/src/cpu_feature.c b/src/cpu_feature.c
: index 98567ccb3..9bf6223de 100644
: --- a/src/cpu_feature.c
: +++ b/src/cpu_feature.c
: @@ -50,7 +51,7 @@
:
:
: static uint32_t
: -crc32c_hw_byte(uint32_t crc, unsigned char const *data, unsigned int length)
: +crc32c_hw_byte(uint32_t crc, char const *data, unsigned int length)
: {
: while (length--) {
: __asm__ __volatile__(
: @@ -68,6 +69,26 @@ crc32c_hw_byte(uint32_t crc, unsigned char const *data, unsigned int length)
: uint32_t
: crc32c_hw(uint32_t crc, const char *buf, unsigned int len)
: {
: + const int align = alignof(unsigned long);
: + unsigned long addr = (unsigned long)buf;
: + unsigned int not_aligned_prefix =
: + ((addr - 1 + align) & ~(align - 1)) - addr;
Hmm, hmm...
Isn't it simply `addr % align`? Or even `addr & (align - 1)`?
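For reference, a tiny compile-and-run sketch of how these differ; the values
of `align` and `addr` below are made up purely for illustration, and `align`
is assumed to be a power of two:

#include <assert.h>
#include <stdint.h>

int
main(void)
{
	const uintptr_t align = 8;
	uintptr_t addr = 0x1003;
	/* The patch: number of bytes up to the next aligned address. */
	uintptr_t prefix = ((addr - 1 + align) & ~(align - 1)) - addr;
	assert(prefix == 5);
	/* Same value, spelled shorter. */
	assert(prefix == ((-addr) & (align - 1)));
	/* addr % align is the offset past the previous boundary instead. */
	assert(addr % align == 3);
	return 0;
}

So `addr % align` is not quite the same thing: it counts bytes past the
previous boundary, while the patch needs the distance to the next one,
i.e. `(-addr) & (align - 1)`.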
: + /*
: + * Calculate CRC32 for the unaligned prefix byte-by-byte, so
: + * that the rest can be processed in aligned words. That
: + * halves the number of loads, because each load then takes
: + * exactly one word from memory instead of two words which
: + * would have to be partially merged afterwards.
: + * But the main reason is that unaligned loads are simply
: + * unsafe: they are undefined behaviour.
: + */
: + if (not_aligned_prefix < len) {
: + crc = crc32c_hw_byte(crc, buf, not_aligned_prefix);
: + buf += not_aligned_prefix;
: + len -= not_aligned_prefix;
: + } else {
: + return crc32c_hw_byte(crc, buf, len);
: + }
: unsigned int iquotient = len / SCALE_F;
: unsigned int iremainder = len % SCALE_F;
: unsigned long *ptmp = (unsigned long *)buf;
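To make the discussion concrete, here is a self-contained sketch of the same
prefix / aligned-words / tail split, written with the SSE4.2 intrinsics
instead of the inline assembly in cpu_feature.c. The name `crc32c_sketch`
and the `memcpy()`-based word load are illustration choices of mine, not
what the patch does; it assumes x86_64 and -msse4.2:

#include <stdalign.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>
#include <nmmintrin.h>

static uint32_t
crc32c_sketch(uint32_t crc, const char *buf, size_t len)
{
	const size_t align = alignof(uint64_t);
	/* Bytes until the next aligned address; 0 if buf is aligned. */
	size_t prefix = (-(uintptr_t)buf) & (align - 1);
	if (prefix > len)
		prefix = len;
	for (; prefix > 0; prefix--, len--)
		crc = _mm_crc32_u8(crc, (unsigned char)*buf++);
	/* Whole aligned words, one 8-byte load each. */
	uint64_t crc64 = crc;
	for (; len >= sizeof(uint64_t); len -= sizeof(uint64_t)) {
		uint64_t word;
		memcpy(&word, buf, sizeof(word));
		crc64 = _mm_crc32_u64(crc64, word);
		buf += sizeof(word);
	}
	crc = (uint32_t)crc64;
	/* Tail bytes that do not fill a whole word. */
	for (; len > 0; len--)
		crc = _mm_crc32_u8(crc, (unsigned char)*buf++);
	return crc;
}

Since `buf` is aligned after the prefix loop, the `memcpy()` compiles down to
a single 64-bit load, so it keeps the one-load-per-word property the comment
talks about while sidestepping the undefined-behaviour concern.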