Instrument wikitext.c for benchmarking purposes (wikitext, 7c884ae)

Created 2/2/2008, updated 3/5/2024

We want to measure the speed of raw tokenization (how fast the scanner can spit out tokens), so we prepare a special function just for this purpose.

Signed-off-by: Greg Hurrell <greg@hurrell.net>
