powerpc: Optimise smp_wmb
Commit 2d1b202 ("powerpc: Fixup
lwsync at runtime") removed __SUBARCH_HAS_LWSYNC, causing smp_wmb to
revert back to eieio for all CPUs.  This restores the behaviour
introduced in 74f0609 ("powerpc:
Optimise smp_wmb on 64-bit processors").

Signed-off-by: Nick Piggin <npiggin@suse.de>
Signed-off-by: Paul Mackerras <paulus@samba.org>
Nick Piggin authored and Paul Mackerras committed Nov 19, 2008
1 parent a4e22f0 commit 46d075b
Showing 2 changed files with 6 additions and 2 deletions.
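
As context for the diffs below, here is a minimal standalone sketch of the
selection logic this patch restores (my illustration, not part of the
commit; it reuses the kernel's names but builds on its own with a
powerpc64 toolchain such as powerpc64-linux-gnu-gcc). On 64-bit CPUs
smp_wmb() becomes lwsync, a lighter-weight barrier that orders cacheable
stores more cheaply than eieio:

/* Illustrative only: mirrors the restored selection in asm/system.h. */
#if defined(__powerpc64__)
#define SMPWMB lwsync   /* lighter-weight write barrier on 64-bit CPUs */
#else
#define SMPWMB eieio    /* fallback store-ordering instruction */
#endif

#define __stringify_1(x) #x
#define __stringify(x)   __stringify_1(x)

#define smp_wmb() __asm__ __volatile__ (__stringify(SMPWMB) : : : "memory")

int main(void)
{
	smp_wmb();      /* emits "lwsync" on ppc64, "eieio" on 32-bit */
	return 0;
}
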
4 changes: 4 additions & 0 deletions arch/powerpc/include/asm/synch.h

@@ -5,6 +5,10 @@
 #include <linux/stringify.h>
 #include <asm/feature-fixups.h>
 
+#if defined(__powerpc64__) || defined(CONFIG_PPC_E500MC)
+#define __SUBARCH_HAS_LWSYNC
+#endif
+
 #ifndef __ASSEMBLY__
 extern unsigned int __start___lwsync_fixup, __stop___lwsync_fixup;
 extern void do_lwsync_fixups(unsigned long value, void *fixup_start,
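
Note that __SUBARCH_HAS_LWSYNC only chooses which macro smp_wmb() uses;
the LWSYNC macro it selects is defined elsewhere in this header. As a
rough reconstruction of that era's definition (from memory, so treat the
exact spelling as an assumption): it emits a plain lwsync on 64-bit, and
on E500 parts a sync recorded in a fixup section that do_lwsync_fixups()
can patch to lwsync at boot when the CPU supports it:

/* Reconstructed sketch, not part of this commit: */
#if defined(__powerpc64__)
#    define LWSYNC	lwsync
#elif defined(CONFIG_E500)
#    define LWSYNC					\
	START_LWSYNC_SECTION(96);			\
	sync;						\
	MAKE_LWSYNC_SECTION_ENTRY(96, __lwsync_fixup);
#else
#    define LWSYNC	sync
#endif
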
4 changes: 2 additions & 2 deletions arch/powerpc/include/asm/system.h

@@ -45,14 +45,14 @@
 #ifdef CONFIG_SMP
 
 #ifdef __SUBARCH_HAS_LWSYNC
-# define SMPWMB lwsync
+# define SMPWMB LWSYNC
 #else
 # define SMPWMB eieio
 #endif
 
 #define smp_mb() mb()
 #define smp_rmb() rmb()
-#define smp_wmb() __asm__ __volatile__ (__stringify(SMPWMB) : : :"memory")
+#define smp_wmb() __asm__ __volatile__ (stringify_in_c(SMPWMB) : : :"memory")
 #define smp_read_barrier_depends() read_barrier_depends()
 #else
 #define smp_mb() barrier()
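
The switch from __stringify() to stringify_in_c() in the second hunk is
needed because SMPWMB can now expand to the multi-part LWSYNC fixup
sequence, which contains a comma; the single-argument __stringify() of
this era would reject such an expansion. A sketch of the variadic helper
from asm/asm-compat.h (quoted from memory, so details may differ):

/* Variadic stringifier: an argument whose expansion contains commas
 * still collapses into a single quoted string.                       */
#define __stringify_in_c(...)	#__VA_ARGS__
#define stringify_in_c(...)	__stringify_in_c(__VA_ARGS__) " "
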
