This is the mail archive of the mailing list for the GCC project.

Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]

Re: setrlimit()/C++ new() seg fault


> I've been experimenting with limiting the amount of memory available to
> a process by setting its data segment resource limit from within the
> process by using setrlimit().

If you really want to prevent excessive memory allocation, the limit
to set is RLIMIT_AS, not RLIMIT_DATA: RLIMIT_DATA only caps the
brk()-based data segment and is easily circumvented, since glibc's
malloc() can satisfy large requests via mmap() instead.  But this has
nothing to do with gcc.

> I've been testing if this technique
> would work by attempting to allocate more memory on the heap than the
> limit set in setrlimit() call.  Unfortunately, I get a seg fault in the
> standard C++ library (gcc 2.95.2 on a Debian GNU/Linux "potato" i686
> box).

I can reproduce this.  The mechanism for throwing exceptions itself
depends on malloc() still working.  If you reduce the data limit as
far as you have in your sample, malloc() will always return NULL.

There are several places in gcc/frame.c where malloc() is used without
checking for a NULL result, in this case it's

static inline void
start_fde_sort (fde_accumulator *accu, size_t count)
{
  accu->linear.array = (fde **) malloc (sizeof (fde *) * count);
  accu->erratic.array = (fde **) malloc (sizeof (fde *) * count);
  accu->linear.count = 0;
  accu->erratic.count = 0;
}

which leaves both arrays set to NULL, and therefore you get the crash
shortly thereafter.

I don't know whether it's possible or worthwhile to fix this.

