[code] Oddity in trying to make a 7.0 lexer

From: Michael Richter <ttmrichter.att.gmail.com>
Date: Mon, 4 Nov 2013 12:05:24 +0800

I have this rule for keywords:

-- Keyword token: matches any of these reserved words.
local keyword = token(l.KEYWORD, word_match{
  'module', 'end_module', 'interface', 'implementation', 'pred', 'func',
  'mode', 'det', 'semidet', 'nondet', 'multi', 'pragma', 'foreign_proc',
  'impure', 'semipure', 'promise_pure', 'promise_semipure', 'foreign_type',
  'foreign_decl', 'type', 'import_module', 'include_module', 'cc_multi',
  'initialise', 'finalise', 'initialize', 'finalize', 'foreign_enum',
})

It works, but perhaps a bit too eagerly. While it highlights "module",
"import_module", and "include_module" as expected, it also highlights the
"module" fragment inside identifiers such as "general_module".
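
For reference, a word_match-style pattern is usually built along these lines
with LPeg: grab a maximal run of word characters, then accept it only if the
whole run is in the keyword set. Here is a standalone sketch under that
assumption (illustrative only, not the actual Scintillua source):

-- Sketch of the usual word_match idea, not Scintillua's implementation.
local lpeg = require('lpeg')
local R, S, C, Cmt = lpeg.R, lpeg.S, lpeg.C, lpeg.Cmt

local keywords = { module = true, import_module = true, include_module = true }
local word_char = R('az', 'AZ', '09') + S('_')

-- Match a maximal run of word characters, then accept only if that whole
-- run is a known keyword.
local keyword_patt = Cmt(C(word_char^1), function(subject, pos, word)
  return keywords[word] and pos or nil
end)

print(lpeg.match(keyword_patt, 'module'))             --> 7   (accepted)
print(lpeg.match(keyword_patt, 'general_module'))     --> nil (whole run rejected)
-- A match that starts just after the '_' sees a bare "module", though:
print(lpeg.match(keyword_patt, 'general_module', 9))  --> 15  (accepted)

If that is roughly what word_match does, the pattern itself rejects
"general_module" as a whole, so a fragment highlight would have to come from
the lexer retrying partway through the identifier.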

How would I get around this? I would think that word_match would match
whole words only, not word fragments.
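
One possible workaround, assuming the lexer has no rule that consumes
ordinary identifiers (the rest of the lexer isn't shown, so this is only a
guess): add an identifier rule after the keyword rule, so that
"general_module" is consumed in one piece and the scanner never restarts
right after the underscore. A sketch against the legacy lexer template, with
everything except keyword invented for illustration:

-- Hypothetical rule set; only 'keyword' comes from the snippet above.
local ws = token(l.WHITESPACE, l.space^1)
local identifier = token(l.IDENTIFIER, l.word)  -- eats "general_module" whole

_rules = {
  {'whitespace', ws},
  {'keyword', keyword},
  {'identifier', identifier},  -- after 'keyword', so real keywords still win
  {'any_char', l.any_char},
}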
