8/16bit oddities on avr-gcc

Michael Kukat michael@unixiron.org
Fri Mar 2 13:03:00 GMT 2007


Hi,

2007/3/2, Andrew Haley <aph@redhat.com>:
> This is strange, and might be a missed optimization bug.  I can't see
> any explanation for the behaviour you see here.
>
> What does the output of -fdump-tree-optimized look like?

I can't extract anything useful from this...

$ avr-gcc -mmcu=atmega16 -Wall -Werror -O3 -fdump-tree-optimized -S -o - perftest.c
        .file   "perftest.c"
        .arch atmega16
__SREG__ = 0x3f
__SP_H__ = 0x3e
__SP_L__ = 0x3d
__tmp_reg__ = 0
__zero_reg__ = 1
        .global __do_copy_data
        .global __do_clear_bss
        .text
.global test
        .type   test, @function
test:
/* prologue: frame size=0 */
/* prologue end (size=0) */
        ldi r24,lo8(0)
        ldi r25,hi8(0)
.L2:
        sts xx,r24
        adiw r24,1
        cpi r24,64
        cpc r25,__zero_reg__
        brne .L2
/* epilogue: frame size=0 */
        ret
/* epilogue end (size=1) */
/* function test size 9 (8) */
        .size   test, .-test
        .comm xx,1,1
/* File "perftest.c": code    9 = 0x0009 (   8), prologues   0, epilogues   1 */

But optimization was also my suspicion, because with -O0 the compiler keeps ctr 8-bit:

        std Y+1,__zero_reg__
.L2:
        ldd r24,Y+1
        sts xx,r24
        ldd r24,Y+1
        subi r24,lo8(-(1))
        std Y+1,r24
        ldd r24,Y+1
        cpi r24,lo8(64)
        brlo .L2

Okay, this code isn't really good because of the heavy SRAM use, but I
tested all optimization levels (-Os and -O1 up to -O3); the "16-bit
problem" appears at every level, only -O0 seems to work in 8 bits.

I haven't yet found which option enabled by -O1 I could disable to get
back to the 8-bit code and track this down further.
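
(One thing that might be worth trying - untested, and the function name
below is just illustrative - is counting down so the loop test is an
8-bit compare against zero; whether avr-gcc then actually keeps the
counter in a single register at -O3 would still have to be checked.)

    /* Hypothetical down-counting variant - untested sketch.  It stores
       63..0 instead of 0..63, so the store order (and the final value
       left in xx) differs from the original loop. */
    extern volatile unsigned char xx;   /* same 1-byte global as above */

    void test_down(void)
    {
        unsigned char ctr = 64;

        do {
            ctr--;
            xx = ctr;
        } while (ctr != 0);
    }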

...Michael


