This is the mail archive of the mailing list for the GCC project.


Re: Multiplatform binary generation

On Friday, May 14, 2004, at 06:25 AM, deploy.kde wrote:
I am working on a research project to implement a single executable binary for all major Linux desktop platforms.

Great, you've just reinvented FAT executables. I'd suggest reading up on the NeXT legacy.

Check out the apple-ppc-branch, the `driver driver', and the -arch flag. lipo is the program that glues together multiple output files (.o and executable files) for multiple arches.

On Darwin, the kernel (the exec code) will select and load the right part of the file for the CPU, and the dynamic loader does likewise, based upon the current CPU setting.
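To make the selection step concrete, here is a minimal sketch of how a loader picks a slice out of a fat file. The header layout (FAT_MAGIC, a count, then per-arch cputype/cpusubtype/offset/size/align entries, all big-endian 32-bit) follows Darwin's <mach-o/fat.h>; the blob below is hand-built dummy data, not a real Mach-O executable.

```python
# Sketch of fat-file slice selection, per the <mach-o/fat.h> layout.
# The sample blob is hand-built for illustration; real slices would
# be whole Mach-O images, not short byte strings.
import struct

FAT_MAGIC = 0xCAFEBABE      # fat files begin with this magic, big-endian
CPU_TYPE_I386 = 7
CPU_TYPE_POWERPC = 18

def select_slice(blob, cputype):
    """Return (offset, size) of the slice matching cputype, or None."""
    magic, nfat_arch = struct.unpack_from(">II", blob, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat file")
    for i in range(nfat_arch):
        # struct fat_arch: cputype, cpusubtype, offset, size, align
        ct, cst, off, size, align = struct.unpack_from(">5I", blob, 8 + 20 * i)
        if ct == cputype:
            return off, size
    return None

# Hand-built fat file: 8-byte header, two 20-byte fat_arch entries,
# then the two dummy slices back to back.
slices = [(CPU_TYPE_POWERPC, b"ppc code"), (CPU_TYPE_I386, b"i386 code")]
blob = struct.pack(">II", FAT_MAGIC, len(slices))
off = 8 + 20 * len(slices)
for ct, data in slices:
    blob += struct.pack(">5I", ct, 0, off, len(data), 0)
    off += len(data)
for _, data in slices:
    blob += data

off, size = select_slice(blob, CPU_TYPE_I386)
print(blob[off:off + size])   # b'i386 code'
```

The kernel's exec path and dyld do essentially this lookup, keyed on the running CPU, before mapping the chosen slice.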

We can do things like rebuild software multiple times for different chips of the same general family and make the results FAT, so if you run on chip X you get code tuned for chip X; the same file can also carry code for totally different CPU architectures.

An example:

	gcc -arch ppc -arch ppc970 -arch 386 -arch 686 foo.c -c -o foo.o
	gcc -arch ppc -arch ppc970 -arch 386 -arch 686 foo.c -o foo

would create one foo.o (or foo) that has 4 parts. Internally, the driver driver runs the real gcc 4 times with 4 different output files, then glues them all together at the end.
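The final glue step (what lipo does with -create) amounts to writing a fat header, a table of fat_arch entries, and the per-arch outputs back to back. A minimal sketch, using the <mach-o/fat.h> constants and dummy byte strings in place of the real per-arch .o files (real lipo also handles slice alignment, which is omitted here):

```python
# Rough sketch of the "glue" step: combine per-arch output blobs into
# one fat file. Dummy bytes stand in for the real gcc output files;
# alignment handling is intentionally omitted.
import struct

FAT_MAGIC = 0xCAFEBABE
CPU_TYPE_I386 = 7
CPU_TYPE_POWERPC = 18

def lipo_create(arch_blobs):
    """arch_blobs: list of (cputype, bytes). Return the fat file bytes."""
    header = struct.pack(">II", FAT_MAGIC, len(arch_blobs))
    table = b""
    body = b""
    offset = 8 + 20 * len(arch_blobs)    # slices start after the table
    for cputype, data in arch_blobs:
        # struct fat_arch: cputype, cpusubtype, offset, size, align
        table += struct.pack(">5I", cputype, 0, offset, len(data), 0)
        body += data
        offset += len(data)
    return header + table + body

fat = lipo_create([(CPU_TYPE_POWERPC, b"\x01" * 16),
                   (CPU_TYPE_I386, b"\x02" * 16)])
print(len(fat))   # 8 + 40 + 32 = 80
```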

Most normal software can be built this way:

CFLAGS='-arch ...' configure && make && make install

gcc, being special, we build specially; see build_gcc in the branch for details.

However, having said all that, I think you should just use traditional techniques and not intermingle contents.

I would like to hear any of your suggestions or comments before I spend months developing it.

