
OpenMP threadprivate on non-ELF systems


Attached is a patch-in-progress that implements the OpenMP threadprivate
directive in gfortran without relying on ELF Thread Local Storage (TLS)
support.  The current implementation requires TLS and therefore doesn't
work on systems without it (AIX, Darwin, etc.).  This patch doesn't use
TLS at all, but there's no reason the TLS-based implementation couldn't
be conditionally compiled in on systems that do support it.
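
For illustration, conditional selection between the two strategies might
look roughly like the sketch below.  HAVE_TLS, WORK_PTR, and
get_threadprivate_copy are names made up for this sketch (a toy
definition of the lookup routine follows the next paragraph); they are
not macros or symbols the patch itself defines.

#include <stddef.h>

/* Hypothetical runtime lookup used on targets without TLS; a toy
   definition is sketched further below.  */
extern void *get_threadprivate_copy (const void *key, size_t size);

/* Stand-in for a threadprivate common block /work/.  */
struct work_common { int count; };

#ifdef HAVE_TLS
/* TLS path: each thread gets its own copy directly.  */
static __thread struct work_common work;
#define WORK_PTR() (&work)
#else
/* Non-TLS path: one master copy plus a per-thread remapping lookup.  */
static struct work_common work;
#define WORK_PTR() \
  ((struct work_common *) get_threadprivate_copy (&work, sizeof work))
#endif

void
bump_count (void)
{
  WORK_PTR ()->count++;
}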

This patch implements the threadprivate directive with a runtime
library routine.  In a compiled procedure, each threadprivate variable
or common block gets a local integer variable that holds the address of
the current thread's private copy of that variable.  The value is set
by a runtime library call that looks up the remapping for the current
thread (malloc'ing space if necessary); this initialization is done at
the beginning of the routine and at the beginning of each parallel
region.  The copyin clause is handled similarly.
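
To make that concrete, here is a rough, self-contained sketch in C of
the shape of such a scheme.  The routine name get_threadprivate_copy,
the structures, and the pthread-based driver are assumptions invented
for this sketch, not the interface the patch adds to libgomp; it only
mirrors the idea of a per-thread remapping list consulted at procedure
entry and at the start of each parallel region.  How the patch actually
initializes fresh copies and implements copyin is not reproduced here.

#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* One remapping entry: (thread, master address) -> private copy.  A
   linear list keeps the sketch short; as noted below, the real patch
   also uses a linear list for now, with a balanced tree as a todo.  */
struct tp_entry
{
  pthread_t thread;
  const void *key;
  void *copy;
  struct tp_entry *next;
};

static struct tp_entry *tp_list;
static pthread_mutex_t tp_lock = PTHREAD_MUTEX_INITIALIZER;

/* Hypothetical runtime routine: return the calling thread's private
   copy of KEY, malloc'ing and seeding it from the master copy on first
   use.  */
void *
get_threadprivate_copy (const void *key, size_t size)
{
  pthread_t self = pthread_self ();
  struct tp_entry *e;

  pthread_mutex_lock (&tp_lock);
  for (e = tp_list; e; e = e->next)
    if (pthread_equal (e->thread, self) && e->key == key)
      {
        pthread_mutex_unlock (&tp_lock);
        return e->copy;
      }

  /* First reference from this thread: allocate and seed a private copy.  */
  e = malloc (sizeof *e);
  e->thread = self;
  e->key = key;
  e->copy = malloc (size);
  memcpy (e->copy, key, size);
  e->next = tp_list;
  tp_list = e;
  pthread_mutex_unlock (&tp_lock);
  return e->copy;
}

/* Master storage standing in for a threadprivate common block /work/.  */
struct work_common { int count; };
static struct work_common work;

/* A compiled procedure: a local pointer caches this thread's copy, set
   by one runtime call at procedure entry; every reference to /work/ in
   the body then goes through that pointer.  */
static void
some_procedure (void)
{
  struct work_common *work_p = get_threadprivate_copy (&work, sizeof work);
  work_p->count++;
  printf ("count in this thread's copy: %d\n", work_p->count);
}

/* Outlined body of a parallel region: the pointer is fetched again at
   region entry, since the thread running this body (and hence its
   private copy) may differ from the one that entered the enclosing
   procedure.  */
static void *
parallel_region_body (void *arg)
{
  struct work_common *work_p = get_threadprivate_copy (&work, sizeof work);
  work_p->count += 10;
  return arg;
}

int
main (void)
{
  pthread_t t;

  some_procedure ();                       /* main thread's copy: 1 */
  pthread_create (&t, NULL, parallel_region_body, NULL);
  pthread_join (t, NULL);                  /* worker's copy: 10 */
  some_procedure ();                       /* main thread's copy: 2 */
  return 0;
}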

The patch is mostly complete on the Fortran side, but I haven't yet had
time to implement the C front-end portion.  It passes the libgomp
testsuite on x86/x86_64 Linux, and I was also able to compile and
correctly run some non-trivial applications that rely heavily on
threadprivate common blocks.  I've tried to indicate bugs and todo
items in the comments.  (In particular, the linear list in
threadprivate.c should be replaced by a balanced tree of some sort.)

I'm going to be offline for a couple of days, and I'll probably be too
busy to do much with this until the end of August, but I'd appreciate
any feedback or contributions.

-Asher

Attachment: 2006-08-14_16 11 56.diff
Description: Binary data

