This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.


Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]

Switching from 16-bit to 32-bit mode


Hi,
I have a piece of assembly code which starts up in 16-bit mode and is then
supposed to switch into 32-bit mode. The GNU assembler documentation says to
use the .code16 directive to tell the assembler that the code that follows is
16-bit, and then to use .code32 when we are ready to switch to 32-bit mode.
However, .code32 does not appear to be working in this usage: after the
.code32 directive a whole lot of junk instructions are generated, and the
code crashes.
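For reference, the pattern I am trying to follow is roughly this (a simplified sketch, not my actual code; gdt_desc, the labels, and the 0x08/0x10 segment selectors are placeholders):

```asm
        .code16                 # assemble what follows as 16-bit code
start:
        cli                     # no interrupts while switching modes
        lgdtl   gdt_desc        # load the GDT (gdt_desc is a placeholder)
        movl    %cr0, %eax
        orl     $0x1, %eax      # set the PE bit in CR0
        movl    %eax, %cr0
        ljmp    $0x08, $pm32    # far jump reloads CS with a 32-bit selector

        .code32                 # from here on, assemble as 32-bit code
pm32:
        movw    $0x10, %ax      # placeholder data-segment selector
        movw    %ax, %ds
        movw    %ax, %ss
        # ... 32-bit code continues here ...
```

(As I understand it, .code32 only changes how the assembler encodes the following instructions; the CPU itself changes mode only via the CR0/far-jump sequence.)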
Could someone please point me to the correct way of using it?
Any help would be highly appreciated.

Thanks,
Deepa
**************************************************
Deepa Menon
Software Engineer, Cisco Systems
(408) 853-7723
**************************************************
