Tokenization and preprocessing
Probably the least well specified aspect of previous versions of C
was the set of operations that transformed each source file
from a stream of characters into a sequence of tokens
ready for parsing.
These operations included
recognition of white space (including comments),
bundling consecutive characters into tokens,
handling preprocessing directive lines,
and macro replacement.
However, the relative ordering of these operations was never guaranteed.
© 2005 The SCO Group, Inc. All rights reserved.
SCO OpenServer Release 6.0.0 -- 02 June 2005