Merge tag 'locking-core-2024-05-13' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip
Pull locking updates from Ingo Molnar:

 - Over a dozen code generation micro-optimizations for the atomic and
   spinlock code

 - Add more __ro_after_init attributes

 - Robustify the lockevent_*() macros

* tag 'locking-core-2024-05-13' of git://git.kernel.org/pub/scm/linux/kernel/git/tip/tip:
  locking/pvqspinlock/x86: Use _Q_LOCKED_VAL in PV_UNLOCK_ASM macro
  locking/qspinlock/x86: Micro-optimize virt_spin_lock()
  locking/atomic/x86: Merge __arch{,_try}_cmpxchg64_emu_local() with __arch{,_try}_cmpxchg64_emu()
  locking/atomic/x86: Introduce arch_try_cmpxchg64_local()
  locking/pvqspinlock/x86: Remove redundant CMP after CMPXCHG in __raw_callee_save___pv_queued_spin_unlock()
  locking/pvqspinlock: Use try_cmpxchg() in qspinlock_paravirt.h
  locking/pvqspinlock: Use try_cmpxchg_acquire() in trylock_clear_pending()
  locking/qspinlock: Use atomic_try_cmpxchg_relaxed() in xchg_tail()
  locking/atomic/x86: Define arch_atomic_sub() family using arch_atomic_add() functions
  locking/atomic/x86: Rewrite x86_32 arch_atomic64_{,fetch}_{and,or,xor}() functions
  locking/atomic/x86: Introduce arch_atomic64_read_nonatomic() to x86_32
  locking/atomic/x86: Introduce arch_atomic64_try_cmpxchg() to x86_32
  locking/atomic/x86: Introduce arch_try_cmpxchg64() for !CONFIG_X86_CMPXCHG64
  locking/atomic/x86: Modernize x86_32 arch_{,try_}_cmpxchg64{,_local}()
  locking/atomic/x86: Correct the definition of __arch_try_cmpxchg128()
  x86/tsc: Make __use_tsc __ro_after_init
  x86/kvm: Make kvm_async_pf_enabled __ro_after_init
  context_tracking: Make context_tracking_key __ro_after_init
  jump_label,module: Don't alloc static_key_mod for __ro_after_init keys
  locking/qspinlock: Always evaluate lockevent* non-event parameter once
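Several of the commits above convert open-coded cmpxchg() loops to the try_cmpxchg() form, whose advantage is that on failure the CAS itself hands back the value actually found, so the loop needs no extra reload before retrying. A minimal user-space sketch of the calling convention, using GCC's __atomic builtins rather than the kernel's primitives (`try_cmpxchg_int` and `add_if_positive` are illustrative names, not kernel APIs):

```c
#include <stdbool.h>

/* Illustrative counter; not kernel code. */
static int counter = 5;

/* try_cmpxchg-style helper: returns true on success; on failure,
 * *expected is updated to the value actually found in *ptr. */
static bool try_cmpxchg_int(int *ptr, int *expected, int desired)
{
    return __atomic_compare_exchange_n(ptr, expected, desired,
                                       false,
                                       __ATOMIC_SEQ_CST, __ATOMIC_SEQ_CST);
}

/* Atomically add 'delta' if the counter is positive; returns the
 * old value.  Note there is no re-read inside the loop: a failed
 * CAS already refreshed 'old'. */
static int add_if_positive(int *ptr, int delta)
{
    int old = __atomic_load_n(ptr, __ATOMIC_RELAXED);

    do {
        if (old <= 0)
            return old;
    } while (!try_cmpxchg_int(ptr, &old, old + delta));

    return old;
}
```

On x86 this shape maps well onto CMPXCHG, which leaves the observed value in a register on failure; that is the code-generation win the conversions are after.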
Showing 17 changed files with 295 additions and 175 deletions

- arch/x86/include/asm/atomic.h (2 additions, 10 deletions)
- arch/x86/include/asm/atomic64_32.h (52 additions, 27 deletions)
- arch/x86/include/asm/atomic64_64.h (2 additions, 10 deletions)
- arch/x86/include/asm/cmpxchg_32.h (126 additions, 79 deletions)
- arch/x86/include/asm/cmpxchg_64.h (7 additions, 1 deletion)
- arch/x86/include/asm/qspinlock.h (9 additions, 4 deletions)
- arch/x86/include/asm/qspinlock_paravirt.h (3 additions, 4 deletions)
- arch/x86/kernel/kvm.c (1 addition, 1 deletion)
- arch/x86/kernel/tsc.c (1 addition, 1 deletion)
- include/asm-generic/sections.h (5 additions, 0 deletions)
- include/linux/jump_label.h (3 additions, 0 deletions)
- init/main.c (1 addition, 0 deletions)
- kernel/context_tracking.c (1 addition, 1 deletion)
- kernel/jump_label.c (53 additions, 0 deletions)
- kernel/locking/lock_events.h (2 additions, 2 deletions)
- kernel/locking/qspinlock.c (5 additions, 8 deletions)
- kernel/locking/qspinlock_paravirt.h (22 additions, 27 deletions)
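The final commit in the list ("Always evaluate lockevent* non-event parameter once") guards against a classic macro pitfall: when event counting is compiled out, a stub that discards its argument also discards any side effect the caller put there. A hedged user-space sketch of the hazard and the usual fix (macro names and shapes are illustrative, not the kernel's exact definitions):

```c
/* When counting is compiled out, a naive stub drops the count
 * argument entirely -- a side effect such as n++ never happens: */
#define lockevent_add_naive(ev, c)  do { } while (0)

/* Robust stub: still evaluates 'c' exactly once and discards the
 * result, so callers behave identically with counting on or off. */
#define lockevent_add_robust(ev, c)  do { (void)(c); } while (0)
```

With the naive stub, `lockevent_add_naive(ev, n++)` silently skips the increment under one config and performs it under another; the robust form keeps the two configurations behaviorally equivalent, which is the point of the robustification.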