This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.


Re: [gomp] Fix omp_is_private (PR middle-end/27388)


On Wed, May 03, 2006 at 01:21:37PM -0700, Richard Henderson wrote:
> On Wed, May 03, 2006 at 07:16:04AM -0400, Jakub Jelinek wrote:
> > BTW: I really wonder when the "iteration variable %qs should be private"
> > error could trigger - omp for can't have an explicit shared clause and the
> > iteration variable should be scanned before the loop body.  Maybe
> > replacing
> > 	  if (ctx == gimplify_omp_ctxp)
> > 	    {
> > 	      error ("iteration variable %qs should be private",
> > 		     IDENTIFIER_POINTER (DECL_NAME (decl)));
> > 	      n->value = GOVD_PRIVATE;
> > 	      return true;
> > 	    }
> > 	  else
> > with
> > 	  gcc_assert (ctx != gimplify_omp_ctxp);
> > would be enough.
> 
> Because IIRC we're supposed to error for
> 
>   #pragma omp parallel for shared (i)
>    for (i = 0; i < 10; ++i) foo();
> 
> but something broke, and we no longer do...

That's a consequence of breaking the combined parallel omp constructs apart early.
But from my reading I'd say we are supposed to error on:
#pragma omp parallel for shared (i)
  for (i = 0; i < 10; ++i)
    ;
but not for
#pragma omp parallel shared (i)
#pragma omp for
  for (i = 0; i < 10; ++i)
    ;
(but both are represented the same at the GIMPLE level).
Certainly the error is not to be reported on, say,
#pragma omp parallel shared (i)
  {
#pragma omp for
    for (i = 0; i < 10; ++i)
      ;
    i = 1;
  }
but the middle example really involves two different nested directives: i is
shared in the outer context and private in the inner one.  Though, of course,
it doesn't make much sense to write it that way when the two directives are
tightly nested with nothing else around them.
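
To make the three cases concrete in one translation unit, here is a rough
consolidated testcase sketch (the function name is made up and it is not
taken from the PR's testsuite; the loops are the examples above, compiled
with -fopenmp):

/* Rough consolidated testcase sketch (hypothetical, not from
   PR middle-end/27388); compile with -fopenmp.  */

void
test_shared_iter (void)
{
  int i;

  /* Combined construct with an explicit shared (i) clause: per the
     reading above this is supposed to be rejected, since the iteration
     variable must be private for the loop (though, as noted, the
     diagnostic may be lost once the combined construct is broken apart).  */
#pragma omp parallel for shared (i)
  for (i = 0; i < 10; ++i)
    ;

  /* The same loop written as two tightly nested directives: i is shared
     on the parallel but predetermined private on the inner for, so no
     error should be reported.  */
#pragma omp parallel shared (i)
#pragma omp for
  for (i = 0; i < 10; ++i)
    ;

  /* Nested directives with other code in the parallel region: here
     shared (i) on the parallel is clearly meaningful and again no error
     should be reported for the inner for.  */
#pragma omp parallel shared (i)
  {
#pragma omp for
    for (i = 0; i < 10; ++i)
      ;
    i = 1;
  }
}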

	Jakub

