[PATCH] x86_64: Fix VSMP build
Patch fixes a build problem with CONFIG_X86_VSMP.  The vSMP bits probably
gathered some fuzz on their way to mainline: safe_halt(), which used to sit
outside the #endif (CONFIG_X86_VSMP), somehow ended up inside the
!CONFIG_X86_VSMP condition, leaving it undefined and breaking
CONFIG_X86_VSMP builds.  The patch moves the safe_halt() and halt() macros
back out of the #endif.

Signed-off-by: Ravikiran Thirumalai <kiran@scalex86.org>
Signed-off-by: Andi Kleen <ak@suse.de>
Signed-off-by: Linus Torvalds <torvalds@osdl.org>
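
For context, below is a minimal, hypothetical sketch of the conditional the
message describes; the contents of the CONFIG_X86_VSMP branches in
include/asm-x86_64/system.h are only stand-in comments here, not code from the
tree.  It illustrates why defining the macros inside the #else branch leaves
them undefined whenever CONFIG_X86_VSMP=y, and why hoisting them past the
#endif (as the diff below does) fixes the build.

/* Sketch only -- the branch bodies are placeholders, not the real header. */

/* Broken layout: safe_halt()/halt() live inside the !CONFIG_X86_VSMP branch,
   so a CONFIG_X86_VSMP=y build never defines them and any user of
   safe_halt() fails to compile. */
#ifdef CONFIG_X86_VSMP
/* ... vSMP flavour of the irq-flag helpers ... */
#else
/* ... generic flavour of the irq-flag helpers ... */
#define safe_halt() __asm__ __volatile__("sti; hlt": : :"memory")
#define halt() __asm__ __volatile__("hlt": : :"memory")
#endif

/* Fixed layout: the macros follow the #endif, so both configurations see
   identical definitions. */
#ifdef CONFIG_X86_VSMP
/* ... vSMP flavour ... */
#else
/* ... generic flavour ... */
#endif
#define safe_halt() __asm__ __volatile__("sti; hlt": : :"memory")
#define halt() __asm__ __volatile__("hlt": : :"memory")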
Ravikiran G Thirumalai authored and Linus Torvalds committed Jan 17, 2006
1 parent c09b424 commit 2ddb55f
Showing 1 changed file with 5 additions and 5 deletions.
10 changes: 5 additions & 5 deletions include/asm-x86_64/system.h
@@ -354,11 +354,6 @@ static inline unsigned long __cmpxchg(volatile void *ptr, unsigned long old,
 #define local_irq_disable() __asm__ __volatile__("cli": : :"memory")
 #define local_irq_enable() __asm__ __volatile__("sti": : :"memory")
 
-/* used in the idle loop; sti takes one instruction cycle to complete */
-#define safe_halt() __asm__ __volatile__("sti; hlt": : :"memory")
-/* used when interrupts are already enabled or to shutdown the processor */
-#define halt() __asm__ __volatile__("hlt": : :"memory")
-
 #define irqs_disabled() \
 ({ \
 	unsigned long flags; \
@@ -370,6 +365,11 @@ static inline unsigned long __cmpxchg(volatile void *ptr, unsigned long old,
 #define local_irq_save(x) do { warn_if_not_ulong(x); __asm__ __volatile__("# local_irq_save \n\t pushfq ; popq %0 ; cli":"=g" (x): /* no input */ :"memory"); } while (0)
 #endif
 
+/* used in the idle loop; sti takes one instruction cycle to complete */
+#define safe_halt() __asm__ __volatile__("sti; hlt": : :"memory")
+/* used when interrupts are already enabled or to shutdown the processor */
+#define halt() __asm__ __volatile__("hlt": : :"memory")
+
 void cpu_idle_wait(void);
 
 extern unsigned long arch_align_stack(unsigned long sp);
