kasan, arm64: use ARCH_SLAB_MINALIGN instead of manual aligning
Instead of changing cache->align to be aligned to KASAN_SHADOW_SCALE_SIZE
in kasan_cache_create(), we can reuse the ARCH_SLAB_MINALIGN macro.

Link: http://lkml.kernel.org/r/52ddd881916bcc153a9924c154daacde78522227.1546540962.git.andreyknvl@google.com
Signed-off-by: Andrey Konovalov <andreyknvl@google.com>
Suggested-by: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Andrey Ryabinin <aryabinin@virtuozzo.com>
Cc: Christoph Lameter <cl@linux.com>
Cc: Dmitry Vyukov <dvyukov@google.com>
Cc: Mark Rutland <mark.rutland@arm.com>
Cc: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Will Deacon <will.deacon@arm.com>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Andrey Konovalov authored and Linus Torvalds committed Jan 9, 2019
1 parent 63f3655 commit eb214f2
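
For context, the slab core already enforces ARCH_SLAB_MINALIGN as a floor
on every cache's alignment at cache-creation time, before
kasan_cache_create() runs. A simplified sketch of that logic, loosely
based on calculate_alignment() in mm/slab_common.c (details abbreviated,
not the exact kernel code):

static unsigned int calculate_alignment(slab_flags_t flags,
					unsigned int align, unsigned int size)
{
	/* ... SLAB_HWCACHE_ALIGN handling elided ... */

	/* Never let a cache's alignment drop below the arch minimum. */
	if (align < ARCH_SLAB_MINALIGN)
		align = ARCH_SLAB_MINALIGN;

	return ALIGN(align, sizeof(void *));
}

Because this floor is applied before kasan_cache_create() is called,
raising ARCH_SLAB_MINALIGN under CONFIG_KASAN_SW_TAGS makes the manual
round_up() removed below unnecessary.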
Showing 2 changed files with 6 additions and 2 deletions.
6 changes: 6 additions & 0 deletions arch/arm64/include/asm/cache.h
@@ -58,6 +58,12 @@
  */
 #define ARCH_DMA_MINALIGN (128)
 
+#ifdef CONFIG_KASAN_SW_TAGS
+#define ARCH_SLAB_MINALIGN (1ULL << KASAN_SHADOW_SCALE_SHIFT)
+#else
+#define ARCH_SLAB_MINALIGN __alignof__(unsigned long long)
+#endif
+
 #ifndef __ASSEMBLY__
 
 #include <linux/bitops.h>
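
Under CONFIG_KASAN_SW_TAGS, KASAN_SHADOW_SCALE_SHIFT is 4, so the new
ARCH_SLAB_MINALIGN works out to 16 bytes, which is exactly
KASAN_SHADOW_SCALE_SIZE (1 << KASAN_SHADOW_SCALE_SHIFT), the granule of
software tag-based KASAN. A standalone check of that arithmetic
(illustrative userspace C, with the macro values inlined as an
assumption):

#include <assert.h>

#define KASAN_SHADOW_SCALE_SHIFT 4	/* value used by SW_TAGS KASAN */
#define KASAN_SHADOW_SCALE_SIZE (1UL << KASAN_SHADOW_SCALE_SHIFT)
#define ARCH_SLAB_MINALIGN (1ULL << KASAN_SHADOW_SCALE_SHIFT)

int main(void)
{
	/* The macro encodes the same 16-byte alignment the removed
	 * round_up() used to enforce. */
	assert(ARCH_SLAB_MINALIGN == KASAN_SHADOW_SCALE_SIZE);
	return 0;
}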
2 changes: 0 additions & 2 deletions mm/kasan/common.c
@@ -298,8 +298,6 @@ void kasan_cache_create(struct kmem_cache *cache, unsigned int *size,
 		return;
 	}
 
-	cache->align = round_up(cache->align, KASAN_SHADOW_SCALE_SIZE);
-
 	*flags |= SLAB_KASAN;
 }
 
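
With the arch minimum in place, the deleted round_up() had become a
no-op: by the time kasan_cache_create() runs, cache->align is already a
multiple of the 16-byte granule. A hypothetical demonstration (plain
userspace C, reimplementing power-of-two rounding equivalent in effect
to the kernel's round_up()):

#include <assert.h>

#define KASAN_SHADOW_SCALE_SIZE 16
/* Power-of-two rounding, same result as the kernel's round_up(). */
#define round_up(x, y) (((x) + (y) - 1) & ~((y) - 1))

int main(void)
{
	unsigned int align = 16;	/* floor set by ARCH_SLAB_MINALIGN */
	assert(round_up(align, KASAN_SHADOW_SCALE_SIZE) == align);

	align = 64;			/* e.g. a hardware-cache-aligned cache */
	assert(round_up(align, KASAN_SHADOW_SCALE_SIZE) == align);
	return 0;
}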
