[code] Re: Debugging language modules

From: Arnel <jalespring.att.gmail.com>
Date: Mon, 28 Mar 2016 13:41:05 +0800

On Sat, 26 Mar 2016 14:28:05 -0400 (EDT), Mitchell <m.att.foicica.com> wrote:
> Hi Arnel,
>
> On Sat, 26 Mar 2016, Arnel wrote:
>
> > Hello,
> >
> > I am currently working on a language module for Racket (formerly PLT Scheme).
> > Here are the questions I have:
> >
> > - I have the following lines in the 'init.lua' file for the Racket module:
> >
> > textadept.file_types.extensions.rkt = 'racket'
> > [snip]
> >
> > However, if I open a Racket script and try testing the syntax checking, nothing
> > happens. The 'init.lua' file is located inside the 'racket' directory inside
> > '~/.textadept/modules'. If I run
> >
> > ui.print(textadept.file_types.extensions.rkt)
> >
> > from the command entry, I come up with plain 'nil'. Why is that?
>
> This is a bit of a Catch-22. I've snipped the relevant line from your
> Racket module's init.lua file. In order for Textadept to know the 'rkt'
> extension should load the 'racket' lexer (and eventually the 'racket'
> module), that line needs to be run earlier. Right now, you are expecting
> Textadept to load the 'racket' module, which then tells Textadept about
> the extension needed in order to load the 'racket' module... Put that
> single line in your `~/.textadept/init.lua` file and everything will work
> as expected.
>
> > If I place these lines in my regular '~/.textadept/init.lua', the syntax
> > checking works fine.
>
> This illustrates my point above :)

Bookmarked for any future language modules I write for TA :) Thanks.
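
For the archives, here's the relevant bit of my '~/.textadept/init.lua' now (a
minimal sketch; only the extension-mapping line comes from this thread, the
comments are mine):

```lua
-- ~/.textadept/init.lua
-- Register the .rkt extension *before* any Racket file is opened, so
-- Textadept knows to load the 'racket' lexer (and, from there, the
-- 'racket' module in ~/.textadept/modules/racket/).
textadept.file_types.extensions.rkt = 'racket'
```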
 
> > - Is there a better way to debug lexer modules? The Racket lexer I'm working on
> > is based on the Scheme lexer file provided with TA. I've read somewhere in
> > the API manual that troubleshooting lexers can be tricky and it's recommended
> > to run TA in the terminal to get the error messages. I tried this but I didn't
> > get any. Those who have written lexers for other languages before - any
> > pointers? Anything on seeing what's actually captured by the LPEG expressions
> > would be great.
>
> If you don't see any error messages in the terminal by default, then that
> means your lexer is well formed and is processing text just fine. However,
> that doesn't mean your lexer is processing text as you'd expect! Robert
> already mentioned using Scintillua as a library (which is an idea I hadn't
> thought of!). Normally I just use:
>
> lpeg.P(function(input, index)
>   _G.print(index, input:sub(index, index + 10))
>   return index
> end)
>
> and put that in a pattern I'm debugging. The "return index" line ensures
> the debug function "matches" so that text matching can continue.

Could you elaborate further on how I can add this to a pattern? Say I have
something like:

local keywords = token(l.KEYWORD, word_match({
   '#%app', '#%datum', '#%declare', '#%expression', '#%module-begin',
   -- ...
}, '!#%*+-./:=>?_'))

How do I use that to print the captured text (or index as it were)?
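
In case it helps to make the question concrete, here's how I currently imagine
the probe slotting in (just a sketch based on Mitchell's snippet; 'dbg' is a
helper name I made up, and I'm assuming the usual lexer-file setup where 'l'
is the lexer module and 'P' comes from lpeg):

```lua
local P = lpeg.P

-- Hypothetical helper: a zero-width "match" that prints where the
-- matcher currently is, plus a peek at the upcoming text.
local function dbg(tag)
  return P(function(input, index)
    _G.print(tag, index, input:sub(index, index + 20))
    return index  -- succeed without consuming, so matching continues
  end)
end

-- Sandwich the keyword list between two probes; 'after' only prints
-- when word_match actually matched something.
local keywords = token(l.KEYWORD,
  dbg('before') * word_match({
    '#%app', '#%datum', '#%declare', '#%expression', '#%module-begin',
    -- ...
  }, '!#%*+-./:=>?_') * dbg('after'))
```

Is that roughly the intended usage?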

(I tried the example given in Scintillua for using it as a library, but for
some reason I'm getting an error message about not seeing 'lpeg' even though
I've installed it via luarocks, so I thought I'd try this instead.)

----
Thank you,
Arnel
Received on Mon 28 Mar 2016 - 01:41:05 EDT

This archive was generated by hypermail 2.2.0 : Mon 28 Mar 2016 - 06:54:11 EDT