docs(topic): Split out a dedicated lexing section
epage committed Feb 28, 2025
1 parent 1efa453 commit ab9830d
Showing 4 changed files with 28 additions and 14 deletions.
15 changes: 1 addition & 14 deletions src/_topic/arithmetic.rs
@@ -1,19 +1,6 @@
//! # Arithmetic
//!
//! ## Direct evaluation
//!
//! This parses arithmetic expressions and directly evaluates them.
//! ```rust
#![doc = include_str!("../../examples/arithmetic/parser.rs")]
//! ```
//!
//! ## Parse to AST
//!
//! ```rust
#![doc = include_str!("../../examples/arithmetic/parser_ast.rs")]
//! ```
//!
//! ## Parse to Tokens then AST
//!
//! ```rust
#![doc = include_str!("../../examples/arithmetic/parser_lexer.rs")]
//! ```
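The `parser.rs` example above is only pulled in via `include_str!` and isn't expanded in this diff. For orientation, here is a minimal, hypothetical sketch of the "direct evaluation" idea it refers to, assuming a 0.7-era winnow API; unlike the bundled example it only handles `+`/`-`, and the function names are made up:

```rust
use winnow::ascii::{digit1, space0};
use winnow::combinator::{delimited, repeat};
use winnow::prelude::*;
use winnow::token::one_of;

// A number surrounded by optional whitespace, converted straight to `i64`.
fn number(input: &mut &str) -> winnow::Result<i64> {
    delimited(space0, digit1, space0)
        .try_map(str::parse)
        .parse_next(input)
}

// Evaluate `1 + 2 - 3` on the fly instead of building an AST first.
fn expr(input: &mut &str) -> winnow::Result<i64> {
    let init = number.parse_next(input)?;
    repeat(0.., (one_of(['+', '-']), number))
        .fold(
            move || init,
            |acc, (op, val): (char, i64)| if op == '+' { acc + val } else { acc - val },
        )
        .parse_next(input)
}

fn main() {
    assert_eq!(expr.parse("1 + 2 - 3").unwrap(), 0);
}
```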
23 changes: 23 additions & 0 deletions src/_topic/lexing.rs
@@ -0,0 +1,23 @@
//! # Lexing and Parsing
//!
//! ## Parse to AST
//!
//! The simplest way to write a parser is to parse directly to the AST.
//!
//! Example:
//! ```rust
#![doc = include_str!("../../examples/arithmetic/parser_ast.rs")]
//! ```
//!
//! ## Lexing
//!
//! However, there are times when you may want to separate lexing from parsing.
//! Winnow provides [`TokenSlice`] to simplify this.
//!
//! Example:
//! ```rust
#![doc = include_str!("../../examples/arithmetic/parser_lexer.rs")]
//! ```
#![allow(unused_imports)]
use crate::stream::TokenSlice;
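Both code blocks in this new section are pulled in via `include_str!`, so the diff doesn't show what the split looks like in practice. As a rough, hypothetical sketch (not the bundled `parser_lexer.rs`): lex with ordinary `&str` parsers into a `Vec<Token>`, then parse a `TokenSlice` over those tokens. The `Token`/`TokenKind` types here are made up, `winnow::Result` is assumed from the 0.7-era API, and the token-level parsers assume the token stream yields `&Token` items:

```rust
use winnow::combinator::{alt, delimited, repeat, separated};
use winnow::prelude::*;
use winnow::stream::TokenSlice;
use winnow::token::{any, take_while};

// Hypothetical token type; the real one lives in examples/arithmetic/parser_lexer.rs.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Value,
    Plus,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct Token<'i> {
    kind: TokenKind,
    text: &'i str,
}

// Comparing tokens against `&str` lets string literals like "+" act as token
// parsers (see the `TokenSlice` doc change further down in this commit).
impl PartialEq<&str> for Token<'_> {
    fn eq(&self, other: &&str) -> bool {
        self.text == *other
    }
}

// Lexing: ordinary `&str` parsers that produce a flat list of tokens.
fn lex<'i>(input: &mut &'i str) -> winnow::Result<Vec<Token<'i>>> {
    let number = take_while(1.., |c: char| c.is_ascii_digit())
        .map(|text| Token { kind: TokenKind::Value, text });
    let plus = "+".map(|text| Token { kind: TokenKind::Plus, text });
    repeat(
        1..,
        delimited(take_while(0.., ' '), alt((number, plus)), take_while(0.., ' ')),
    )
    .parse_next(input)
}

// Parsing: the input is now a slice of tokens instead of `&str`.
type Tokens<'i> = TokenSlice<'i, Token<'i>>;

fn value(input: &mut Tokens<'_>) -> winnow::Result<i64> {
    // Assumes the token stream yields `&Token` items.
    any.verify_map(|t: &Token<'_>| match t.kind {
        TokenKind::Value => t.text.parse::<i64>().ok(),
        TokenKind::Plus => None,
    })
    .parse_next(input)
}

fn sum(input: &mut Tokens<'_>) -> winnow::Result<i64> {
    // "+" works as a separator thanks to the `PartialEq<&str>` impl above.
    separated(1.., value, "+")
        .map(|terms: Vec<i64>| terms.into_iter().sum())
        .parse_next(input)
}

fn main() {
    let tokens = lex.parse("1 + 2 + 3").unwrap();
    let total = sum.parse(TokenSlice::new(&tokens)).unwrap();
    assert_eq!(total, 6);
}
```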
2 changes: 2 additions & 0 deletions src/_topic/mod.rs
@@ -15,6 +15,7 @@
//! - [Implementing `FromStr`][fromstr]
//! - [Performance][performance]
//! - [Parsing Partial Input][partial]
//! - [Lexing and Parsing][lexing]
//! - [Custom stream or token][stream]
//! - [Custom errors][error]
//! - [Debugging][crate::_tutorial::chapter_8]
@@ -32,6 +33,7 @@ pub mod http;
pub mod ini;
pub mod json;
pub mod language;
pub mod lexing;
pub mod nom;
pub mod partial;
pub mod performance;
2 changes: 2 additions & 0 deletions src/stream/token.rs
@@ -20,6 +20,8 @@ use crate::stream::UpdateSlice;
/// - Any `PartialEq` type (e.g. a `TokenKind` or `&str`) can be used with
/// [`literal`][crate::token::literal]
/// - A `PartialEq` for `&str` allows for using `&str` as a parser for tokens
///
/// See also [Lexing and Parsing][crate::_topic::lexing].
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
pub struct TokenSlice<'t, T> {
initial: &'t [T],
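A tiny, hypothetical illustration of the two bullets in that doc comment (separate from the bundled arithmetic examples). The direction of the `PartialEq` impls is a guess at the wiring the doc comment describes, and the `Token` type is made up:

```rust
use winnow::prelude::*;
use winnow::stream::TokenSlice;
use winnow::token::literal;

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Ident,
    Comma,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct Token<'i> {
    kind: TokenKind,
    text: &'i str,
}

// Bullet 1: comparing a token against its kind enables `literal(TokenKind::...)`.
impl PartialEq<TokenKind> for Token<'_> {
    fn eq(&self, other: &TokenKind) -> bool {
        self.kind == *other
    }
}

// Bullet 2: comparing a token against `&str` enables using `","` itself as a parser.
impl PartialEq<&str> for Token<'_> {
    fn eq(&self, other: &&str) -> bool {
        self.text == *other
    }
}

// Consume one identifier token followed by one `,` token.
fn ident_then_comma<'i>(input: &mut TokenSlice<'i, Token<'i>>) -> winnow::Result<()> {
    (literal(TokenKind::Ident), ",").void().parse_next(input)
}
```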
