The Internals of the Mono C# Compiler

The Mono C# compiler is a C# compiler written in C# itself. Its
goal is to provide a free, alternative implementation of the C#
language. The Mono C# compiler generates ECMA CIL images through
the System.Reflection.Emit API, which enables the compiler to be
platform independent.
* Overview: How the compiler fits together
The compilation process is managed by the compiler driver.

The compiler reads a set of C# source files and parses them. Any
assemblies or modules that the user wants to use with the project
are loaded after parsing is done.

Once all the files have been parsed, the type hierarchy is
resolved: interfaces are resolved first, then the remaining
types.

Once the type hierarchy is resolved, every type is populated:
fields, methods, indexers, properties, events and delegates are
entered into the type system.
At this point the program skeleton has been completed. The next
step is to actually emit the code for each of the executable
methods; the compiler drives this process.

Each type then has to populate its methods: populating a method
requires creating a structure that holds the state of the block
being emitted (the EmitContext class) and then generating code
for the topmost statement (the Block).

Code generation has two steps. The first step is semantic
analysis (the Resolve method), which resolves any pending tasks
and guarantees that the code is correct; the second phase is the
actual code emission. All errors are flagged during the semantic
analysis phase.
After all code has been emitted, the compiler closes all the
types (this basically tells the Reflection.Emit library to
finish them up); resources and the definition of the entry point
are handled at this point, and the output is saved to disk.

The following list will give you an idea of where the different
pieces of the compiler live:
        The compiler driver: this drives the compilation
        process; loading of command line options; parsing the
        input files; loading the referenced assemblies;
        resolving the type hierarchy and emitting the code.

        The state tracking for code generation.

        Code to do semantic analysis and emission of the
        attributes.

        Keeps track of the types defined in the source code, as
        well as the assemblies loaded.

        This contains the MCS type system.

        Error and warning reporting methods.

        Assorted utility functions used by the compiler.
        cs-tokenizer.cs:

        The tokenizer for the C# language; it also includes the
        pre-processor.

        cs-parser.jay, cs-parser.cs:

        The parser is implemented using a C# port of the
        Berkeley Yacc parser generator (Jay). The grammar lives
        in the cs-parser.jay file, and cs-parser.cs is the
        generated parser.

        The `Location' structure is a compact representation of
        the file, line and column where a token or a high-level
        construct appears. It is used to report errors.
        Basic expression classes and interfaces; most shared
        code and static methods are here.

        Most of the different kinds of expression classes live
        here.

        The assignment expression has its own file.

        The classes that represent the constant expressions.
        Literals are constants that have been entered manually
        in the source code, like `1' or `true'. The compiler
        needs to tell constants and literals apart during the
        compilation process, as literals sometimes have extra
        implicit conversions defined for them.
        cfold.cs:

        The constant folder for binary expressions.

        All of the abstract syntax tree elements for statements
        live in one file, which also drives the semantic
        analysis process.

        Contains the support for implementing iterators from
        the C# 2.0 specification.

* Declarations, Classes, Structs, Enumerations
        This contains the base class for members and
        declaration spaces. A declaration space introduces new
        names in types, so classes, structs, delegates and
        enumerations derive from it.

        Methods for holding and defining class and struct
        information, and every member that can appear in them
        (methods, fields, delegates, events, etc.).

        The most interesting type here is `TypeContainer',
        which is a derivative of `DeclSpace'.

        Handles delegate definition and use.

        Handles enumerations.

        Holds and defines interfaces. All the code related to
        interface declarations lives here.
        During the parsing process, the compiler encapsulates
        parameters in the Parameter and Parameters classes.
        These classes provide definition and resolution tools
        for parameters.

        Routines to track pending implementations of abstract
        methods and interfaces. These are used by the
        TypeContainer-derived classes to track whether every
        required method is implemented.
* The parsing process

All the input files that make up a program need to be read in
advance, because C# allows declarations to happen after an
entity is used; for example, the following is a valid fragment:

        a = "hello"; b = "world";
At the time the assignment expression `a = "hello"' is parsed,
it is not known whether `a' is a field of this class or one of
its parents, a property access, or a variable reference. The
actual meaning of `a' will not be discovered until the semantic
analysis phase.
** The Tokenizer and the pre-processor

The tokenizer is contained in the file `cs-tokenizer.cs', and
the main entry point is the `token ()' method. The tokenizer
implements the `yyParser.yyInput' interface, which is what the
Yacc/Jay parser uses when fetching tokens.

Token definitions are generated by jay during the compilation
process, and those can be referenced from the tokenizer class
with the `Token.' prefix.
Each time a token is returned, the location for the token is
recorded into the `Location' property, which can be accessed by
the parser. The parser retrieves the Location properties as it
builds its internal representation, to allow the semantic
analysis phase to produce error messages that can pinpoint the
location of the problem.

Some tokens have values associated with them; for example, when
the tokenizer encounters a string, it will return a
LITERAL_STRING token, and the actual string parsed will be
available in the `Value' property of the tokenizer. The same
mechanism is used to return integers and floating point
numbers.
C# has a limited pre-processor that allows conditional
compilation, but it is not as fully featured as the C
pre-processor; most notably, macros are missing. This makes it
simple to implement in very few lines, and to mesh it with the
tokenizer.

The `handle_preprocessing_directive' method in the tokenizer
handles all the pre-processing; it is invoked when the `#'
symbol is found as the first token in a line.
The state of the pre-processor is contained in a Stack called
`ifstack'; this is used to track the if/elif/else/endif nesting
and the current state. The state is encoded in the top of the
stack as a combination of the values `TAKING', `TAKEN_BEFORE',
`ELSE_SEEN' and `PARENT_TAKING'.
To debug problems in your grammar, you need to edit the
Makefile and make sure that the -ct options are passed to jay.
The current incarnation says:

        ./../jay/jay -c < ./../jay/skeleton.cs cs-parser.jay

During debugging, you want to change this to:

        ./../jay/jay -cvt < ./../jay/skeleton.cs cs-parser.jay
This generates a parser with debugging information and allows
you to activate verbose parser output in both the csharp
command and the mcs command by passing the "-v -v" flags.

When you do this, standard output will have a dump of the
tokens parsed and how the parser reacted to them. You can look
up the states in the y.output file, which contains the entire
parser state diagram in human readable form.
Locations are encoded as a 32-bit number (the Location struct)
that maps each input source line to a linear number. As new
files are parsed, the Location manager is informed of the new
file, to allow it to map back from an int constant to a file
plus line number.
Prior to parsing/tokenizing any source files, the compiler
generates a list of all the source files and then reserves the
low N bits of the location to hold the source file, where N is
large enough to hold at least twice as many source files as
were specified on the command line (to allow for a #line in
each file). The upper 32-N bits are the line number in that
file.

The value 0 is reserved for ``anonymous'' locations, i.e. when
we do not know the location (Location.Null).
The tokenizer also tracks the column number for a token, but
this is currently not being used or encoded. It could probably
be encoded in the low 9 bits, allowing for columns from 1 to
512 to be encoded.
The parser is written using Jay, which is a port of Berkeley
Yacc to Java, that I later ported to C#.

Many people ask why the grammar of the parser does not exactly
match the definition in the C# specification. The reason is
simple: the grammar in the C# specification is designed to be
consumed by humans, not by a computer program. Before you can
feed this grammar to a tool, it needs to be simplified to allow
the tool to generate a correct parser for it.
In the Mono C# compiler, we use a class for each of the
statements and expressions in the C# language. For example,
there is a `While' class for the `while' statement, a `Cast'
class to represent a cast expression, and so on.

There is a Statement class and an Expression class, which are
the base classes for statements and expressions.
* Internal Representation

Expressions in the Mono C# compiler are represented by the
`Expression' class. This is an abstract class that particular
kinds of expressions inherit from, overriding a few methods.

The base Expression class contains two fields: `eclass', which
represents the "expression classification" (from the C#
specification), and the type of the expression.
During parsing, the compiler will create the various trees of
expressions. These expressions have to be resolved before they
can be used. The semantic analysis is implemented by resolving
each of the expressions created during parsing and creating
fully resolved expressions.

A common pattern that you will notice in the compiler is this:

        expr = expr.Resolve (ec);
        if (expr == null)
                // There was an error, stop processing by returning
The resolution process is implemented by overriding the
`DoResolve' method. The DoResolve method has to set the
`eclass' field and the `type', and perform all the error
checking and computations that will be required for code
generation, at this stage.

The return value of DoResolve is an expression. Most of the
time an Expression-derived class will return itself (return
this) when it will handle the emission of the code itself, but
it can also return a new Expression.
For example, the parser will create an "ElementAccess"
expression for an element access such as `a [b]'. During the
resolution process, the compiler will learn whether this is an
array access or an indexer access, and will return either an
ArrayAccess expression or an IndexerAccess expression from
DoResolve.
All errors must be reported during the resolution phase
(DoResolve). If an error is detected, the DoResolve method
returns null, which is used to flag the error condition and to
stop compilation later on. This means that anyone who calls
Expression.Resolve must check the return value for null, which
indicates an error.
The second stage that Expressions participate in is code
generation; this is done by overriding the "Emit" method of the
Expression class. No error checking must be performed during
this stage.

We take advantage of the distinction between the expressions
that are generated by the parser and the expressions that are
the result of the semantic analysis phase for lambda
expressions (more information in the "Lambda Expressions"
section).
What is important is that expressions and statements generated
by the parser should implement the cloning functionality. This
is used because lambda expressions require the compiler to
attempt to resolve a given block of code with different
possible types for parameters that have their types omitted.
** Simple Names, MemberAccess

One of the most important classes in the compiler is
"SimpleName", which represents a simple name (from the C#
specification). During resolution, names are bound to field
names, parameter names or local variable names.
More complicated expressions like:

        Math.Sin

are composed using the MemberAccess class, which contains a
name (Math) and a SimpleName (Sin); this helps drive the
resolution process.

The parser creates expressions to represent types during
compilation. For example, a field declaration like:

        Version vers;

will produce a "SimpleName" expression for the "Version" word,
and in this particular case the parser will introduce "Version
vers" as a field declaration.
During the resolution process for the fields, the compiler will
have to resolve the word "Version" to a type. This is done by
using the "ResolveAsType" method in Expression instead of the
regular Resolve method.

ResolveAsType just turns on a different set of code paths for
things like SimpleNames, and does a different kind of error
checking than the one used by regular expressions.
Constants in the Mono C# compiler are represented by the
abstract class `Constant', which in turn derives from
Expression. The base constructor for `Constant' just sets the
expression class to `ExprClass.Value'. Constants are born in a
fully resolved state, so the `DoResolve' method only returns a
reference to itself.
Each Constant should implement the `GetValue' method, which
returns an object with the actual contents of the constant. A
utility virtual method called `AsString' is used to render the
value in a diagnostic message; the output of AsString is shown
to the developer when an error or a warning is triggered.

Constant classes also participate in the constant folding
process. Constant folding is invoked by those expressions that
can be folded, using the functionality provided by the
ConstantFold class (cfold.cs).
Each Constant has to implement a number of methods to convert
itself into a Constant of a different type. These methods are
called `ConvertToXXXX' and they are invoked by the wrapper
functions `ToXXXX'. These methods only perform implicit numeric
conversions; explicit conversions are handled by the `Cast'
expression class.

The `ToXXXX' methods are the entry point, and provide error
reporting in case a conversion cannot be performed.
The C# language requires constant folding to be implemented.
Constant folding is hooked up in the Binary.Resolve method: if
both sides of a binary expression are constants, the
ConstantFold.BinaryFold routine is invoked.

This routine implements all the binary operator rules; it
mirrors, at compile time, the code that would otherwise be
generated and evaluated at runtime for binary operators.
If the constants can be folded, a new constant expression is
returned; if not, null is returned (for example, the
concatenation of a string constant and a numeric constant is
deferred to the runtime).
*** Invariant meaning in a block

The seemingly small section in the standard entitled "invariant
meaning in a block" has several subtleties involved, especially
when we try to implement the semantics correctly.
Most of the semantics are trivial, and basically prevent local
variables from shadowing parameters and other local variables.
However, this notion is not limited to that, but affects all
simple name accesses within a block. And therein lies the rub
-- instead of just worrying about the issue when we arrive at
variable declarations, we need to verify this property at every
use of a simple name within a block.
The key notion that helps us is to note the bi-directional
action of a variable declaration. The declaration together with
anti-shadowing rules can maintain the IMiaB property for the
block containing the declaration and all nested sub-blocks.
But the IMiaB property also forces all surrounding blocks to
avoid using the name. We thus need to maintain a blacklist of
taboo names in all surrounding blocks -- and we take the
expedient of doing so simply: actually maintaining a (superset
of the) blacklist in each block data structure, which we call
the `known_variable' list.
Because we create the `known_variable' list during the parse
process, by the time we do simple name resolution all the
blacklists are fully populated. So we can enforce the rest of
the IMiaB property just by looking up a couple of lists.
This turns out to be quite efficient: when we used a block tree
walk, a test case took 5-10 minutes, while with this simple,
mildly redundant data structure the time taken for the same
test case came down to a couple of seconds.
The IKnownVariable interface is a small wrinkle. Firstly, the
IMiaB also applies to parameter names, especially those of
anonymous methods. Secondly, we need more information than just
the name in the blacklist -- we need the location of the name
and where it is declared. We use the IKnownVariable interface
to abstract out the parser information stored for local
variables and parameters.
* The semantic analysis

The compiler driver first has to parse all the input files.
Once all the input files have been parsed, and an internal
representation of the input program exists, the following steps
are taken:
        * The interface hierarchy is resolved first. As the
          interface hierarchy is constructed, TypeBuilder
          objects are created for each of the interfaces.

        * The class and struct hierarchy is resolved next, and
          TypeBuilder objects are created for them.

        * Constants and enumerations are resolved.

        * Method, indexer, property, delegate and event
          definitions are now entered into the TypeBuilders.

        * Elements that contain code are now invoked to perform
          semantic analysis and code generation.
The EmitContext class is created any time that IL code is to be
generated (methods, properties, indexers and attributes all
create EmitContexts).

The EmitContext keeps track of the current namespace and type
container; this is used during name resolution.
An EmitContext is used by the underlying code generation
facilities to track the state of code generation:

        * The ILGenerator used to generate code for this
          method.

        * The TypeContainer where the code lives; this is used
          to access the TypeBuilder.

        * The DeclSpace; this is used to resolve names through
          RootContext.LookupType in the various statements and
          expressions.
Code generation state is also tracked here:

        CheckState:

        This variable tracks the `checked' state of the
        compilation; it controls whether we should generate
        code that does overflow checking or code that ignores
        overflows.

        The default setting comes from the command line option
        to generate checked or unchecked code, plus any source
        code changes using the checked/unchecked statements or
        expressions. Contrast this with the ConstantCheckState
        flag.
        The constant check state is always set to `true' and
        cannot be changed from the command line. The source
        code can change this setting with the `checked' and
        `unchecked' statements and expressions.
        * Whether we are emitting code inside a static or an
          instance method.

        * The type of value that is allowed to be returned, or
          null if there is no return type.

        * A `Label' used by the code if it must jump to it;
          this is used by a few routines that deal with
          exception handling.

        * Whether we have a return label defined by the
          toplevel block.

        * The Type (extracted from the TypeContainer) that
          declares this body of code.

        * Whether this is generating code for a constructor.

        * The current block being generated.

        * The location where `return' has to jump to return the
          value.
A few variables are used to track the state for checking in
loops, or in try/catch statements:

        * Whether we are in a Finally block.

        * Whether we are in a Try block.

        * Whether we are in a Catch block.

        * Whether we are inside an unsafe block.
Methods exposed by the EmitContext:

        * EmitTopBlock ():

          This emits a toplevel block. The routine is very
          simple, to allow the anonymous method support to roll
          its own two-stage version of it.

        * NeedReturnLabel ():

          This is used to flag, during the resolution phase,
          that the driver needs to initialize the
          `ReturnLabel'.
The introduction of anonymous methods in the compiler changed
various ways of doing things. The most significant one is the
hard split between the resolution phase and the emission phase
of the compiler.
For instance, routines that reference local variables no longer
can safely create temporary variables during the resolution
phase: they must do so from the emission phase, since the
variable might have been "captured", in which case access to it
cannot be done with the local-variable operations of the
underlying VM.
The code emission is driven by EmitTopBlock, which first
resolves the topblock, then emits the required metadata (local
variable definitions) and finally emits the code.

A detailed description of anonymous methods and iterators is in
the new-anonymous-design.txt file in this directory.
Lambda expressions can come in two forms: those that have
implicit parameter types and those that have explicit parameter
types, for example:

        Foo (x => x + 1);
        Foo ((int x) => x + 1);
One of the problems that we faced with lambda expressions is
that lambda expressions need to be "probed" with different
types until a working combination is found.
An expression like `x => x + 1' could mean vastly different
things depending on the type of "x". The compiler determines
the type of "x" (the left-hand side "x") at the moment the
expression is "bound", which means that during the compilation
process it will try to match the lambda with all the possible
types available, for example:

        delegate int di (int x);
        delegate string ds (string s);
In the above example, overload resolution will try "x" as an
"int" and as a "string", and if only one of them "compiles",
that is the one it picks (it also copes with ambiguities, if
more than one method matches).
To compile this, we need to hook into the resolution process,
but since the resolution process has side effects (calling
Resolve can either return instances of the resolved expression
type, or alter field internals), it was necessary to
incorporate a framework to "clone" expressions before we
attempt to resolve them.
The support for cloning was added to Statements and
Expressions, and is only necessary for objects of those types
that are created during parsing. It is not necessary to support
it in the classes that are the result of calling Resolve. This
means that SimpleName needs support for cloning, but FieldExpr
does not (SimpleName is created by the parser, FieldExpr during
semantic analysis).
The work happens through the public "Clone" method, which
clones a given Statement or Expression. The base method in
Statement and Expression merely does a MemberwiseCopy of the
elements and then calls the virtual CloneTo method to complete
the copy. By default this method throws an exception, which is
useful to catch cases where we forgot to override CloneTo for a
given Statement/Expression.
With the cloning capability it became possible to call Resolve
multiple times (once for each cloned copy) and, based on this,
to pick the one implementation that would compile and that
would not be ambiguous.

The cloning process is basically a deep copy that happens in
the LambdaExpression class; it clones the top-level block for
the lambda expression. This has the side effect of cloning the
entire containing block as well.
This happens inside this method:

        public override bool ImplicitStandardConversionExists (Type delegate_type)

This is used to determine whether the current lambda expression
can be implicitly converted to the given delegate type.

It also happens as a result of generic method type parameter
inference.
** Lambda Expressions and Cloning

All statements that are created during parsing should implement
the CloneTo method:

        protected virtual void CloneTo (CloneContext clonectx, Statement target)

This method is called by the Statement.Clone method after it
has done a shallow copy of all the fields in the statement;
implementations should typically Clone any child statements.
Expressions should implement the CloneTo method as well:

        protected virtual void CloneTo (CloneContext clonectx, Expression target)
** Lambda Expressions and Contextual Return

When an expression is parsed as a lambda expression, the parser
inserts a call to a special statement, the contextual return.
So:

        a => a + 1

is actually compiled as:

        a => contextual_return (a + 1)
The contextual_return statement behaves differently depending
on the return type of the delegate to which the expression is
being converted. If the delegate return type is void, the above
basically turns into an empty operation; otherwise it becomes a
return statement that can infer return types.
The compiler can now be used as a library: the API exposed
lives in the Mono.CSharp.Evaluator class, and it can currently
compile statements and expressions passed as strings, and
either compile them, or compile and execute them immediately.
As of April 2009, this creates a new in-memory assembly for
each statement evaluated.
To support this evaluator mode, the evaluator API primes the
tokenizer with an initial character that would not appear in
valid C# code, one of:

        int EvalStatementParserCharacter = 0x2190;         // Unicode Left Arrow
        int EvalCompilationUnitParserCharacter = 0x2191;   // Unicode Up Arrow
        int EvalUsingDeclarationsParserCharacter = 0x2192; // Unicode Right Arrow

These characters are turned into the following tokens:

        %token EVAL_STATEMENT_PARSER
        %token EVAL_COMPILATION_UNIT_PARSER
        %token EVAL_USING_DECLARATIONS_UNIT_PARSER
This means that the first token returned by the tokenizer, when
used by the Evaluator API, is a special token that makes the
yacc parser go from the traditional parsing of a full
compilation unit to interactive parsing.

The entry production for the compiler basically becomes:
        compilation_unit
                //
                // The standard rules
                //
                : outer_declarations opt_EOF
                | outer_declarations global_attributes opt_EOF
                | global_attributes opt_EOF
                | opt_EOF /* allow empty files */

                //
                // The rule that allows interactive parsing
                //
                | interactive_parsing { Lexer.CompleteOnEOF = false; } opt_EOF
                ;

        //
        // This is where the Evaluator API drives the compilation
        //
        interactive_parsing
                : EVAL_STATEMENT_PARSER EOF
                | EVAL_USING_DECLARATIONS_UNIT_PARSER using_directives
                | EVAL_STATEMENT_PARSER
                  interactive_statement_list opt_COMPLETE_COMPLETION
                | EVAL_COMPILATION_UNIT_PARSER
                  interactive_compilation_unit
                ;
Since there is a little bit of ambiguity, for example between
the using directive and the using statement, a micro-predicting
parser with multiple token lookaheads is used in eval.cs to
resolve the ambiguity and produce the actual token that will
drive the compilation. This helps distinguish scenarios like:

        using System;
        using (var x = File.OpenRead) {}
This is the meaning of these new initial tokens:

        EVAL_STATEMENT_PARSER
                Used to parse statements or expressions as
                statements.

        EVAL_USING_DECLARATIONS_UNIT_PARSER
                This instructs the parser to merely do
                using-directive parsing instead of statement
                parsing.

        EVAL_COMPILATION_UNIT_PARSER
                Used to evaluate toplevel declarations like
                namespaces and classes. This feature is
                currently disabled because later stages of the
                compiler are not yet able to look up previous
                definitions of classes.
What happens is that between each call to Evaluate () we reset
the compiler state, and at this stage we also drop any existing
definitions, so evaluating "class X {}" followed by "class Y :
X {}" does not currently work. We need to make sure that new
type definitions used interactively are preserved from one
evaluation to the next.
In the evaluator, the expression or statement `BODY' is hosted
inside a wrapper class. If the statement is a variable
declaration, then the declaration is split from the assignment
into a DECLARATION and a BODY.

This is what the generated code looks like:

        public class Foo : $InteractiveBaseClass {
                DECLARATION

                static void Host (ref object $retval)
                {
                        BODY
                }
        }
Since both statements and expressions are mixed together, and
it is useful to use the Evaluator to compute expressions, we
return the value of expressions (for example "1+2") in the
`retval' parameter.
To support this, the by-reference retval parameter is set to a
special internal value that means "value was not set" before
the Host method is invoked. During parsing, the parser turns
expressions like "1+2" into:

        $retval = 1+2;

This is done using a special OptionalAssign
ExpressionStatement class.
When the Host method returns, if the value of retval is still
the special flag, no value was set; otherwise the result of the
expression is in retval.
The `InteractiveBaseClass' is the base class for the generated
method's class; this allows embedders to provide different base
classes that expose new static methods that could be useful
during expression evaluation.

Our default implementation is InteractiveBaseClass; new
implementations should derive from it and set the corresponding
property in the Evaluator.
In the future we will move to creating dynamic methods as the
wrapper for this code.
Support for code completion is available, to allow the compiler
to provide a list of possible completions at any given point in
the parsing process. This is used for tab-completion in an
interactive shell, or for visual aids in GUI shells, to suggest
possible completions.

This functionality is available as part of the Evaluator API,
where a special method, GetCompletions, returns a list of
possible completions given a partial input.
The parser and tokenizer work together so that the tokenizer,
upon reaching the end of the input, generates the following
tokens: GENERATE_COMPLETION, followed by as many
COMPLETE_COMPLETION tokens as needed, and finally the EOF
token.

GENERATE_COMPLETION needs to be handled in every production
where the user is likely to press the TAB key in the shell (or,
in the future, the GUI, or an explicit request in an IDE).
COMPLETE_COMPLETION must be handled throughout the grammar, to
provide a way of completing the parsed expression. See below
for details.
For the member access case, I have added productions that
mirror the non-completing productions, for example:

        | primary_expression DOT IDENTIFIER GENERATE_COMPLETION
          {
                LocatedToken lt = (LocatedToken) $3;
                $$ = new CompletionMemberAccess ((Expression) $1, lt.Value, lt.Location);
          }

compared with the non-completing production:

        | primary_expression DOT IDENTIFIER opt_type_argument_list
          {
                LocatedToken lt = (LocatedToken) $3;
                $$ = new MemberAccess ((Expression) $1, lt.Value, (TypeArguments) $4, lt.Location);
          }
The CompletionMemberAccess is a new kind of
Mono.CSharp.Expression that does the actual lookup. It
internally mimics some of the MemberAccess code, but has been
tuned for this particular use.
After this initial GENERATE_COMPLETION token is processed, the
tokenizer will emit COMPLETE_COMPLETION tokens.  This is done to
help the parser produce a valid result from the partial input it
received.  For example, it is able to produce a valid AST from
"(x" even though the parenthesis has not been closed.
1028 This is achieved by sprinkling the grammar with productions
1029 that can cope with this "winding down" token, for example this
1030 is what parenthesized_expression looks like now:
	parenthesized_expression
		: OPEN_PARENS expression CLOSE_PARENS
		  {
			$$ = new ParenthesizedExpression ((Expression) $2);
		  }
		| OPEN_PARENS expression COMPLETE_COMPLETION
		  {
			$$ = new ParenthesizedExpression ((Expression) $2);
		  }
		;
Once we have wrapped everything up, we generate the final EOF token.
1048 When the AST is complete we actually trigger the regular
1049 semantic analysis process. The DoResolve method of each node
1050 in our abstract syntax tree will compute the result and
1051 communicate the possible completions by throwing an exception
1052 of type CompletionResult.
So, for example, if the user types "T" and the completion is
"ToString", we return "oString".
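That is, the engine returns only the text that remains to be
typed, not the full identifier.  A minimal self-contained
illustration of that convention (the helper name is ours, not
part of the compiler):

```csharp
using System;

class CompletionConvention {
	// Given what the user has typed and a full candidate name,
	// return only the remaining characters, mirroring the
	// convention used by the completion engine.
	public static string Remainder (string typed, string candidate)
	{
		return candidate.Substring (typed.Length);
	}

	static void Main ()
	{
		Console.WriteLine (Remainder ("T", "ToString"));  // prints "oString"
	}
}
```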
1057 ** Enhancing Completion
Code completion is a feature that will be curated over time.
Just as producing good error reports and warnings is an
iterative process, the code completion engine in the compiler
will require tuning to find the right balance for the end user.
1065 This section explains the basic process by which you can
improve the code completion using a real-life example.
Once you add the GENERATE_COMPLETION token to your grammar
rule, chances are you will need to alter the grammar to support
COMPLETE_COMPLETION all the way up to the toplevel production.
1073 To debug this, you will want to try the completion with either
a sample program or with the `csharp' tool, passing the -v flag
twice:

	csharp -v -v

This will turn on the parser debugging output and will generate
a lot of data when parsing its input.
1083 To start with a new completion scheme, type your C# code and
1084 then hit the tab key to trigger the completion engine. In the
generated output you will want to look for the first time that
the parser got the GENERATE_COMPLETION token; it will look like
this:
1089 lex state 414 reading GENERATE_COMPLETION value {interactive}(1,35):
1091 The first word `lex' indicates that the parser called the
1092 lexer at state 414 (more on this in a second) and it got back
from the lexer the token GENERATE_COMPLETION.  If this is a new
kind of completion, chances are you will get an error
immediately, as the rules at that point do not know how to cope
with the stream of COMPLETE_COMPLETION tokens that will follow.
The errors will look like this:
1100 pop state 414 on error
1101 pop state 805 on error
1102 pop state 628 on error
1103 pop state 417 on error
The first line means that the parser has entered the error
state and will pop states until it can find a production that
can deal with the error.  At that point an error message will be
displayed.
1110 Open the file `y.output' which describes the parser states
1111 generated by jay and search for the state that was reported
1112 previously in `lex' that got the GENERATE_COMPLETION:
1115 object_or_collection_initializer : OPEN_BRACE . opt_member_initializer_list CLOSE_BRACE (444)
1116 object_or_collection_initializer : OPEN_BRACE . member_initializer_list COMMA CLOSE_BRACE (445)
1117 opt_member_initializer_list : . (446)
We now know that the parser was in the middle of parsing an
`object_or_collection_initializer' and had already seen the
OPEN_BRACE token.
1123 The `.' after OPEN_BRACE indicates the current state of the
1124 parser, and this is where our parser got the
GENERATE_COMPLETION token.  As you can see from the three rules
in this sample, support for GENERATE_COMPLETION did not exist
yet.

So we must edit the grammar to add a production for this case.
I made the code look like this:
	| GENERATE_COMPLETION
	  {
		$$ = new CompletionElementInitializer (GetLocation ($1));
	  }
This new production creates a CompletionElementInitializer
object and returns it as the value of the production.  The
following trivial implementation always returns "foo" and "bar"
as the two completions and illustrates how things work:
	public class CompletionElementInitializer : CompletingExpression {
		public CompletionElementInitializer (Location l)
		{
			this.loc = l;
		}

		public override Expression DoResolve (EmitContext ec)
		{
			string [] result = new string [] { "foo", "bar" };
			throw new CompletionResult ("", result);
		}

		// You should implement CloneTo if your CompletingExpression
		// keeps references to Statements or Expressions.  CloneTo
		// is used by the lambda engine, so you should always
		// implement it.
		protected override void CloneTo (CloneContext clonectx, Expression t)
		{
			// We do not keep references to anything interesting,
			// so cloning is an empty operation.
		}
	}
1173 We then rebuild our compiler:
1175 (cd mcs/; make cs-parser.jay)
1176 (cd tools/csharplib; make install)
and then run the csharp shell again with parser debugging
enabled:

	(cd tools/csharp; csharp -v -v)
1182 Chances are, you will get another error, but this time it will
1183 not be for the GENERATE_COMPLETION, we already handled that
1184 one. This time it will be for COMPLETE_COMPLETION.
The remainder of the process is iterative: you need to locate
the state where this error happens.  It will look like this:
1189 lex state 623 reading COMPLETE_COMPLETION value {interactive}(1,35):
Then make sure that the state can handle a COMPLETE_COMPLETION
at this point.  When receiving COMPLETE_COMPLETION, the parser
needs to finish constructing the parse tree, so productions that
handle COMPLETE_COMPLETION need to wrap things up with whatever
data they have available and just let the parser complete.
1199 To avoid rule duplication you can use the
1200 opt_COMPLETE_COMPLETION production and append it to an
1201 existing production:
	foo : bar opt_COMPLETE_COMPLETION {
		...
	}
1209 ** Error Processing.
1211 Errors are reported during the various stages of the
1212 compilation process. The compiler stops its processing if
1213 there are errors between the various phases. This simplifies
the code, because it is safe to assume that the data structures
the compiler is operating on are always in a consistent state.
1218 The error codes in the Mono C# compiler are the same as those
1219 found in the Microsoft C# compiler, with a few exceptions
(where we report a few more errors; those are documented in
mcs/errors/errors.txt).  The goal is to reduce confusion for
users, and also to help us track the progress of the
1223 compiler in terms of the errors we report.
1225 The Report class provides error and warning display functions,
1226 and also keeps an error count which is used to stop the
1227 compiler between the phases.
1229 A couple of debugging tools are available here, and are useful
1230 when extending or fixing bugs in the compiler. If the
1231 `--fatal' flag is passed to the compiler, the Report.Error
1232 routine will throw an exception. This can be used to pinpoint
1233 the location of the bug and examine the variables around the
error location.  If you pass a number to --fatal, the exception
will only be thrown when the error count reaches the specified
number.
1238 Warnings can be turned into errors by using the `--werror'
1239 flag to the compiler.
1241 The report class also ignores warnings that have been
1242 specified on the command line with the `--nowarn' flag.
1244 Finally, code in the compiler uses the global variable
1245 RootContext.WarningLevel in a few places to decide whether a
1246 warning is worth reporting to the user or not.
1248 ** Debugging the compiler
Sometimes it is convenient to find *where* a particular error
message is being reported from.  To do that, you might want to
use the --fatal flag to mcs.  The flag will instruct the
compiler to abort with a stack trace when the error is reported.

You can use this with -warnaserror to obtain the same effect for
warnings.
1258 ** Debugging the Parser.
1260 A useful trick while debugging the parser is to pass the -v
1261 command line option to the compiler.
1263 The -v command line option will dump the various Yacc states
1264 as well as the tokens that are being returned from the
1265 tokenizer to the compiler.
1267 This is useful when tracking down problems when the compiler
1268 is not able to parse an expression correctly.
1270 You can match the states reported with the contents of the
1271 y.output file, a file that contains the parsing tables and
1272 human-readable information about the generated parser.
1274 * Editing the compiler sources
The compiler sources are intended to be edited with 134 columns
of width.
1281 Once you have a full build of mcs, you can improve your
1282 development time by just issuing make in the `mcs' directory or
1283 using `make qh' in the gmcs directory.