LuaTeX 1.0.0

tug.org

203 points by jasongrout 9 years ago · 50 comments

patrickg 9 years ago

LuaTeX has a great ability compared to regular TeX: you can access the internal data structures of TeX. See my article "TeX without TeX" at http://wiki.luatex.org/index.php/TeX_without_TeX.

With this access you can read, write, and create these "nodes". This makes it possible to completely circumvent TeX's input language, which can be a pain to program in (though sometimes it is very nice). That easy access to the data structures, combined with Lua's rather nice and simple programming language, makes it easy to build fairly complex programs on top of TeX.
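A minimal sketch of what that node access looks like, assuming a LuaTeX run (the field and function names follow the LuaTeX reference manual; this is illustrative, not taken from the linked article):

```lua
-- Run inside \directlua{...}: build a glyph node by hand
-- and hand it back to TeX's list builder.
local g = node.new("glyph")    -- allocate an empty glyph node
g.char = string.byte("H")      -- which character slot to typeset
g.font = font.current()        -- use the font currently in effect
node.write(g)                  -- append it to the current list
```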

A shameless plug is my database publishing software (see https://speedata.github.io/publisher/index.html), which I believe takes this approach to an extreme: I can use TeX's algorithms such as paragraph building, hyphenation, and PDF writing (they are of excellent quality) and build my own software around them. I'm not saying this can be done in a weekend or so, but the possibilities are much, much, much greater than they were before LuaTeX was available.

  • exDM69 9 years ago

I've got a project where it would be really nice to render some math formulae with *TeX typesetting and then output raster or vector data in memory (to be rendered in a graphics engine).

    Do you think such a thing would be possible with LuaTeX? I did glance over the article you linked, and it shows that this works pretty nicely for basic text, but can anyone shed some light on whether it works nicely for outputting equations, etc.?

    • patrickg 9 years ago

      (Lua)TeX only renders PDF, and only to disk. The great typesetting (especially of math formulae) would make it a good fit for such a job, though.

      Depending on your needs, LuaTeX can be quite fast. I have a project where I use a non-optimized version of LuaTeX to provide a 'live' preview of a web form. It takes less than half a second to run LuaTeX, generate a bitmap from the PDF, and push it to the user. In many cases that will be too slow, but for me it was a good solution.

      • iheartmemcache 9 years ago

        speedata looks absolutely amazing as an alternative to Scribus (itself an amazing piece of OSS) and Framemaker (in my mind the paragon).

        For technical documents, I've been using TeXnicCenter (for maintaining build scripts and asset management) + BibTeX + TikZ (for graphs/charts, etc.), and I can get a final PDF that's fairly close to something I consider nicely laid out (https://www.utdallas.edu/~herve/abdi-awPCA2010.pdf - John Wiley & Sons, Inc.'s template done in some variant of TeX; apparently Wiley, Elsevier, and Springer all use TeX for the masters that go to plate for press). scrbook, memoir, and book are all 'almost there', but not quite.

        Images have pretty much been the most difficult thing to render 'properly'. Using DOT has alignment and font-aesthetic issues, especially if I use RST or DocBook as my 'base' document and want to output both to a blog in long form, in the classic TeX output (e.g. https://www.mitpress.mit.edu/sicp/full-text/book/book-Z-H-1....), and in the "one-page" long form.

        I guess this is a long-winded way of asking if you have any suggestions. (I've read your site and it lacks an "I'm a TeX power user, here are the idioms/conventions/best practices for speedata" section.)

        FYI, I'm perfectly okay with half a second (in fact, that's way faster than my TeXnicCenter renders!), for what it's worth. Heck, I'm sure some of the old-timers remember the troff wait times. Believe you me, it's no trivial feat to get typesetting correct and output in under half a second! Good on you and your team.

    • lmcinnes 9 years ago

      You might want [PyX](http://pyx.sourceforge.net/), which takes TeX formula input and can return SVGs. You could also combine that with SymPy for formula manipulation, etc.

pikachu_is_cool 9 years ago

It's interesting to see the trajectory of Lua adoption. The determining factor in the popularity of most languages has been the backing of large companies (AT&T/C, Netscape/JavaScript, Microsoft/C++, Google/Python). On the flip side, Lua seems to slowly gain more and more popularity over the years. It's more "organic", for lack of a better term.

I really hope a big company doesn't pick Lua up, because I think the fact that it hasn't reached "Eternal September" yet is what makes it such a good language. It has room to breathe.

(The fact that it's beginner friendly, has an ANSI C89 implementation, and has the best/fastest JIT/FFI doesn't hurt either though :P)

  • etiene 9 years ago

    My favorite utopia is Lua becoming mainstream and killing JS. But I agree with you that one of the reasons Lua is so nice is exactly that this didn't happen, so it's a tough choice. This is precisely why many members of the community refuse to treat it as a general-purpose language and keep confining it to a niche: they're trying to protect their special snowflake. I do think Lua is growing faster outside the embedded-systems and games niches now, with Torch for machine learning and OpenResty for the web. I'm very excited and I hope this growth gets more support! We need more people pushing for this :D

  • andrepd 9 years ago

    >AT&T/C, Netscape/JavaScript, Microsoft/C++, Google/Python

    I wonder if you can provide more background on that, particularly those last 2. And by AT&T, you mean Unix, right? :p

    • diego_moita 9 years ago

      I am not the o.p., but: the success of Unix helped the success of C; JavaScript succeeded because Netscape made it the only language in the browser; C++ won the war against Objective-C as the "object-oriented successor of C" because Microsoft embraced it and had a lot more power than NeXT (which embraced Obj-C); and Python sustained its fight with Ruby (and PHP) for the role of "sane successor of Perl" because Google provided so many libraries and frameworks.

      • pekk 9 years ago

        Google's contribution to the success of Python was really minimal. Most of the top stuff was not from Google at all.

    • pjmlp 9 years ago

      Microsoft was the last C vendor on MS-DOS to add C++ support to their compilers...

    • pikachu_is_cool 9 years ago

      I don't have a source, but from what I remember, Microsoft was the main factor that pushed C++ into the mainstream. Same with Google and Python. I guess that's up for debate though. Feel free to correct me.

      • pjmlp 9 years ago

        I wonder why people keep pushing this, Microsoft was the last C vendor to support C++ on MS-DOS.

        Even on Windows, all the C++ alternatives were way better until they released the 32 bit version of Visual C++.

        • groovy2shoes 9 years ago

          Yep. The way I remember it is that until MSVC 6, everyone was still using Borland or Watcom tools. Even up until the early 2000s, Borland C++ Builder was giving MSVC a run for its money. There was also a plethora of quality but smaller-marketshare tools, like the Digital Mars stuff.

          The real reasons C++ "beat" Obj-C have more to do with a) AT&T aggressively marketing C++ for at least six years before NeXT showed up on the scene, b) C++ being a little older than Obj-C in the first place, and c) Obj-C offering all the expressiveness of C with all the efficiency of Smalltalk (thank you, James Iry).

          • pjmlp 9 years ago

            Other things that come to mind: CORBA on UNIX, Apple's move from Object Pascal to C++, and the whole Taligent project that ended up nowhere.

  • the_duke 9 years ago

    The project released its first public beta in 2007, so you probably can't count this as a recent Lua win. ^^

    • etiene 9 years ago

      That's true, but I do think it's interesting that someone thought about it, decided to post it here, and people are discussing it. I've been seeing Lua regularly on Hacker News recently, and that's awesome!

  • marvy 9 years ago

    So, you think it's a good thing (for Lua) that Netscape made JS instead of just using Lua?

    • etiene 9 years ago

      It's a good thing for Lua but not a good thing for the web and these things are partially mutually exclusive.

      One of the reasons Lua's evolution is better than JS's is that managing versions is way more difficult when you have to keep browser compatibility.

    • pikachu_is_cool 9 years ago

      Yeah, because it would have solidified Lua == 2.1, instead of Lua == 5.1, 5.2, 5.3, or LuaJIT.

  • jahewson 9 years ago

    Isn't LuaJIT now unmaintained?

    • etiene 9 years ago

      No, it's not. It was "adopted" by Cloudflare, and although Mike Pall is a special person, there are other contributors.

    • occamrazor 9 years ago

      No, but it has forked: LuaJIT is compatible with Lua 5.1 (and mostly compatible with 5.2), but not with 5.3.

      Both Lua and LuaJIT are maintained and in active development.

    • Spiritus 9 years ago

      LuaJIT != Lua

the_duke 9 years ago

Can anyone shine a light on what LuaTeX actually is?

"an extended version of pdfTeX using Lua as an embedded scripting language" doesn't help me that much.

What's the advantage over (La)TeX?

  • jabl 9 years ago

    From http://www.luatex.org/roadmap.html#tbp:

    There are currently three engines in use in the TeX community: pdfTeX, XeTeX and LuaTeX and they all serve an audience.

    The pdfTeX engine is a stable extension to TeX that has eTeX on board as well as a few extensions. It provides hz optimization and character protrusion and supports advanced PDF output. It is fast and reliable and more or less frozen with respect to functionality. It will stay around forever but has only limited provisions for dealing with OpenType fonts.

    Then there is XeTeX which supports Unicode as well as OpenType by means of third party libraries. It integrates nicely into the current infrastructure and support from macro packages is easy as there are no fundamental changes in interfaces and functionality. It uses the eTeX r2l typesetting while LuaTeX uses the Omega (Aleph) method. Support from macro packages does not demand changes in the core.

    The LuaTeX project uses a different approach. Instead of using libraries it provides an extension mechanism. This keeps the program independent and permits the flexibility that we like so much. However, it comes at a price. First of all LuaTeX does not have the performance of its cousins, although for instance the average ConTeXt MkIV run can be faster in LuaTeX. LuaTeX also demands extensive support at the macro level if you want to profit from its benefits. Just adding some support for scripting is nice but the power of LuaTeX only shows when it's tightly integrated. This integration is also needed in order to provide macro users some stability as most of them are no TeX programmers. If such integration is not what you want, you might consider sticking to the other engines as there are no real advantages to using LuaTeX then.

  • data_hope 9 years ago

    Having a more modern language for scripting? I remember that TikZ (a library for vector graphics) had to emulate floating-point math because it just wasn't supported by TeX (which meant that when I wrote my thesis, my MacBook's fans were going through the roof).

leephillips 9 years ago

A brief introduction to what you can do with LuaTeX:

http://lwn.net/Articles/657157/

haberman 9 years ago

If you think about it, TeX is just a compiler, right? It operates on trees, and translates some input to some output. I'd love to see the equivalent of LLVM for TeX: a modern, modular implementation that implements known best practices and is easy to integrate into other environments.

Maybe this is something like what LuaTeX is trying to do? It's hard to tell from their web page.

  • patrickg 9 years ago

    No, TeX is not a compiler in the common sense of the word (unless you say it compiles to PDF).

    TeX's input grammar can change during the TeX run, so I believe it is impossible to make an equivalent of LLVM (unless I misunderstand the concept of LLVM).

    • haberman 9 years ago

      > No, TeX is not a compiler in the common sense of the word (unless you say it compiles to PDF).

      That is exactly what I am saying.

      > TeX's input grammar can change during the TeX run, so I believe it is impossible to make an equivalent of LLVM

      The important part is not the grammar, but the internal representation. I don't know enough about TeX to know much about its internal representation post-parsing.

      • Kristine1975 9 years ago

        > I don't know enough about TeX to know much about its internal representation post-parsing.

        IIRC it's a stream of tokens. A token can be a character, a built-in command, a macro, etc. During processing of this token stream, macros are "expanded", i.e. replaced with their definitions (recursively). It is possible to control this expansion process using built-in commands.
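        A tiny plain-TeX sketch of that expansion process (illustrative only):

        ```tex
        \def\name{world}          % a macro whose body is plain text
        \def\greet{Hello \name}   % a macro whose body contains another macro
        \greet                    % expands recursively to: Hello world
        \edef\frozen{\greet}      % \edef forces full expansion at definition
                                  % time, so \frozen holds: Hello world
        ```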

  • ternaryoperator 9 years ago

    The problem with this is that TeX cannot be described using a context-free grammar (CFG). Knuth has discussed this in the past. So, building a compiler for TeX is almost impossible.

    • mmarx 9 years ago

      > The problem with this is that TeX cannot be described using a context-free grammar (CFG). Knuth has discussed this in the past. So, building a compiler for TeX is almost impossible.

      Neither can C++[0], yet there still are C++ compilers out there. It does raise the bar significantly, though.

      [0] http://stackoverflow.com/a/14589567

  • graphene 9 years ago

    Not entirely what you're describing, but pandoc goes a long way towards being a sort of LLVM for text documents. In order to do all the format conversions, it transforms inputs into a tree-based internal representation, and then translates that into the output format.

    Unfortunately it doesn't have a (pure) TeX reader yet, but that could be implemented relatively easily.

    • richard_todd 9 years ago

      If it could be implemented easily, chances are it would have been by now. One big issue is that TeX doesn't run in traditional compiler-like layers (lex, parse, etc.). In TeX, the meaning of the next token (at the lexer level) can be changed by something happening in the guts of the engine in response to the previous token. So, just as compiling Lisp requires the ability to interpret Lisp, compiling TeX into some sort of tree structure would require implementing a big chunk of the TeX engine itself in the process.
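      A classic illustration of this (a sketch, using TeX's category codes): running code can change how the very next characters are lexed.

      ```tex
      \catcode`\@=0   % make "@" an escape character, like "\"
      @def@foo{bar}   % the lexer now reads "@def" as the control word \def
      \catcode`\@=12  % and "@" is back to being an ordinary character
      ```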

      • graphene 9 years ago

        Well, yes and no. You are absolutely right that a complete implementation of TeX would be difficult, but you could read a subset of the language that is big enough to be useful, including simple macro definitions and commonly used commands, which is exactly what pandoc's LaTeX reader already does.

nikolay 9 years ago

Project's website: http://www.luatex.org/

  • sambe 9 years ago

    The "more details here" link is broken, if the maintainer is reading.

hyh1048576 9 years ago

Great news.

BTW, is there any way to make LuaTeX compile faster?

  • patrickg 9 years ago

    LuaTeX _is_ very fast. What makes it slow are luaotfload and fontspec (with LaTeX). LuaJITTeX improves things a lot, but much still needs to be done on the two packages mentioned.

    • hyh1048576 9 years ago

      So is there any way to speed it up? The font part is essential for me, though.

      • patrickg 9 years ago

        Short answer: no. In theory it would be possible to increase the speed, but that would require a rewrite of these two font packages. Font loading is not necessarily slow in Lua(TeX), but luaotfload, for example, deals with all kinds of complex scripts, which is not necessary in most cases.

  • jonathanstrange 9 years ago

    You mean compiling LuaTeX itself, or using LuaTeX to compile? I don't know about LuaTeX, but I've just noticed that pdfTeX gets orders of magnitude faster when you install it (including all fonts, etc.) on an SSD. It's very I/O intensive.

    I was amazed by the speed increase (although it's kind of obvious in retrospect).

    • JorgeGT 9 years ago

      I have to try this! It's also useful to split very long documents into separate .tex files using `\include` and then use `\includeonly{file_im_writing}` to compile only the part you're working on.
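      A minimal sketch of that setup (the file names are made up):

      ```latex
      % main.tex
      \documentclass{book}
      \includeonly{chapter2}   % comment this line out for a full build
      \begin{document}
      \include{chapter1}       % skipped, but its .aux data keeps numbering intact
      \include{chapter2}       % the only chapter actually recompiled
      \include{chapter3}       % skipped
      \end{document}
      ```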

    • hyh1048576 9 years ago

      Well, I mean LuaTeX (in fact LuaLaTeX) is too slow at compiling TeX files. I'm using a Mac and it has an SSD. A simple slide deck takes 5 seconds to compile, which is very slow compared to something like pdfLaTeX.

      • iheartmemcache 9 years ago

        Are you using Beamer? Which CTAN packages are you using? Those long render times could have quite a few causes, but fonts are certainly one of them.

        If you want absolutely instantaneous output (on the order of ~100 ms for full generation of an entire thesis, with multiple passes for proper TOC/index/list-of-figures enumeration), a RAM disk holding all of your content - the install, fonts, graphic assets, packages, etc. - is the only way to go (short of rewriting large chunks of LuaTeX). When I needed a quick render loop for final alterations, I had ~6 GB of RAM on a dedicated partition at /opt/tex/ on a FreeBSD setup just for that, and it was nothing short of a pleasure.
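        A sketch of such a setup (mount points and sizes are illustrative, and the commands need root):

        ```sh
        # FreeBSD: back /opt/tex with a 6 GB memory-backed filesystem
        mdmfs -s 6g md /opt/tex

        # Linux equivalent: a tmpfs mount of the same size
        mount -t tmpfs -o size=6g tmpfs /opt/tex
        ```

        After mounting, copy the TeX tree and project assets onto the RAM disk and point your build there.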
