[PATCH 2/3][AArch64][PR target/65697] Strengthen barriers for sync-compare-swap builtins.
Matthew Wahab
matthew.wahab@arm.com
Fri May 22 08:50:00 GMT 2015
[Added PR number and updated patches]
This patch changes the code generated for __sync_type_compare_and_swap to:

    ldxr reg; cmp; bne label; stlxr; cbnz; label: dmb ish; mov .., reg

This removes the acquire barrier from the load and ends the operation with a
full fence, preventing memory accesses that appear after the __sync operation
from being moved ahead of the store-release.
This also strengthens the acquire barrier generated for __sync_lock_test_and_set
(which, like compare-and-swap, is implemented as a form of atomic exchange):

    ldaxr; stxr; cbnz

becomes

    ldxr; stxr; cbnz; dmb ish
Tested with check-gcc for aarch64-none-linux-gnu.
Ok for trunk?
Matthew
2015-05-22 Matthew Wahab <matthew.wahab@arm.com>
PR target/65697
* config/aarch64/aarch64.c (aarch64_split_compare_and_swap): Check
for __sync memory models, emit appropriate initial and final
barriers.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 0002-AArch64-Strengthen-barriers-for-sync-compare-swap-bu.patch
Type: text/x-patch
Size: 2261 bytes
Desc: not available
URL: <http://gcc.gnu.org/pipermail/gcc-patches/attachments/20150522/e321409a/attachment.bin>