Introduction
Lucene 2.0
An Analyzer builds TokenStreams, which analyze text. It thus represents a policy for extracting index terms from text. Typical implementations first build a Tokenizer, which breaks the stream of characters from the Reader into raw Tokens. One or more TokenFilters may then be applied to the output of the Tokenizer.
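A minimal sketch of that pattern against the Lucene 2.0 analysis API follows; the class name SimpleLetterAnalyzer is only an illustrative choice, and the Tokenizer/TokenFilter classes used are the standard ones shipped in org.apache.lucene.analysis.

    import java.io.Reader;

    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.LetterTokenizer;
    import org.apache.lucene.analysis.LowerCaseFilter;
    import org.apache.lucene.analysis.StopAnalyzer;
    import org.apache.lucene.analysis.StopFilter;
    import org.apache.lucene.analysis.TokenStream;

    // Illustrative custom Analyzer: a Tokenizer first, then TokenFilters
    // layered on top of its output, as described above.
    public class SimpleLetterAnalyzer extends Analyzer {
        public TokenStream tokenStream(String fieldName, Reader reader) {
            // Tokenizer: break the character stream into raw letter-only tokens.
            TokenStream result = new LetterTokenizer(reader);
            // TokenFilters: normalize case, then drop common English stop words.
            result = new LowerCaseFilter(result);
            result = new StopFilter(result, StopAnalyzer.ENGLISH_STOP_WORDS);
            return result;
        }
    }

Such an Analyzer is then passed to IndexWriter at indexing time and to the query parser at search time, so the same term-extraction policy is applied to both.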