Looking for lexical analyzers in .NET…

Maybe someone out there can help me out:

I need to parse a really complex data format which tends to change every now and then. Manually coding something to read the data and take it apart would probably result in an app that needs reworking for every new element in the data structure. Instead, I am looking for a parser which can take a grammar (rules, terminals, and character sets), calculate the DFA/LALR tables, and return a parse tree of the data file according to the given grammar.
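For the tokenization half of this, at least, the "grammar as data" idea can be sketched without any code generation: keep the terminal definitions in a plain list of (name, pattern) pairs and drive a generic loop off that table, so a new element in the format is just a new row. A minimal sketch (all the names here are hypothetical, and regexes stand in for real DFA tables):

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// A token: which terminal matched, and the matched text.
record Token(string Kind, string Text);

static class Tokenizer
{
    // Tokenize 'input' against an ordered list of (terminal name, regex) rules.
    // At each position the rules are tried in order; the first match wins.
    public static List<Token> Tokenize(string input, IList<(string Kind, string Pattern)> rules)
    {
        var compiled = new List<(string Kind, Regex Rx)>();
        foreach (var (kind, pattern) in rules)
            // \G anchors the match at the start position passed to Match().
            compiled.Add((kind, new Regex(@"\G(?:" + pattern + ")")));

        var tokens = new List<Token>();
        int pos = 0;
        while (pos < input.Length)
        {
            bool matched = false;
            foreach (var (kind, rx) in compiled)
            {
                var m = rx.Match(input, pos);
                if (m.Success && m.Length > 0)
                {
                    if (kind != "WS")            // skip whitespace terminals
                        tokens.Add(new Token(kind, m.Value));
                    pos += m.Length;
                    matched = true;
                    break;
                }
            }
            if (!matched)
                throw new FormatException($"No terminal matches at position {pos}");
        }
        return tokens;
    }
}
```

Feeding it a rule table like `("WS", @"\s+"), ("NUM", @"\d+"), ("ID", @"[A-Za-z_]\w*")` turns `"foo 42"` into an `ID` token and a `NUM` token. The parse-tree stage would sit on top of this, but the point is that the terminals live in data, not in generated code.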

There are several .NET ports of Lex/Yacc/Bison (such as Coco/R and CsLex), however all of them require embedding code templates into the grammar file, which is problematic for me since I am interested purely in lexical analysis and tokenization – without the code generation. Hacking up a code template just to get at the parse tree is too much work for a lazy guy like me 🙂 especially since it would probably take almost as long as coding the parser by hand.

Does anyone know of a reasonably accessible utility that can take a BNF grammar and do what I described? Is there a better way that I am not aware of yet?

Thanks guys!

Comments (2)

  1. I’m not sure what you mean by "really complex data format", but it’s possible that you could use MarkItUp (http://www.MarkItUp.com) to do the tokenizing for you.

    Feel free to contact me for more information about how this might work:

    showusyourcode (AT) hotmail (DOT) com
