# Tokenator

Tokenator is a simple, efficient library for parsing and serializing string tokens in Rust. It provides a lightweight solution for working with colon-delimited (or custom-delimited) string formats.

## Features

- Parse colon-delimited (or custom-delimited) string tokens
- Serialize data structures into token strings
- Robust error handling with descriptive error types
- Support for backtracking and alternative parsing routes (see the sketch after this list)
- Zero-copy parsing for improved performance
- Hex decoding utilities (see the sketch after the Quick Start)

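The backtracking and alternative-route support isn't exercised in the Quick Start below, and this README doesn't document a dedicated combinator for it, so the following is only a minimal sketch of the idea using the types shown here: each candidate route gets a fresh `TokenParser` over the same token slice, so a failed attempt can't leave the stream half-consumed. The `UserRef` type and the `parse_id`/`parse_name` routes are illustrative, not part of the crate.

```rust
use tokenator::{ParseError, TokenParser};

// Illustrative type (not part of tokenator): a reference that may arrive
// either as "id:<number>" or as "name:<string>".
enum UserRef {
    Id(u64),
    Name(String),
}

fn parse_id<'a>(parser: &mut TokenParser<'a>) -> Result<UserRef, ParseError<'a>> {
    parser.parse_token("id")?;
    let id = parser
        .pull_token()?
        .parse::<u64>()
        .map_err(|_| ParseError::DecodeFailed)?;
    Ok(UserRef::Id(id))
}

fn parse_name<'a>(parser: &mut TokenParser<'a>) -> Result<UserRef, ParseError<'a>> {
    parser.parse_token("name")?;
    Ok(UserRef::Name(parser.pull_token()?.to_string()))
}

// Try the "id" route first; if it fails, rewind by starting over from the
// beginning of the same token slice and try the "name" route instead.
fn parse_user_ref<'a>(tokens: &'a [&'a str]) -> Result<UserRef, ParseError<'a>> {
    parse_id(&mut TokenParser::new(tokens))
        .or_else(|_| parse_name(&mut TokenParser::new(tokens)))
}
```
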
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
tokenator = "0.1.0"
```

## Quick Start

```rust
use tokenator::{TokenParser, TokenWriter, TokenSerializable};

// Define a type that can be serialized to/from tokens
struct User {
    name: String,
    age: u32,
}

impl TokenSerializable for User {
    fn parse_from_tokens<'a>(parser: &mut TokenParser<'a>) -> Result<Self, tokenator::ParseError<'a>> {
        // Expect the token "user" first
        parser.parse_token("user")?;

        // Parse name and age
        let name = parser.pull_token()?.to_string();
        let age_str = parser.pull_token()?;
        let age = age_str.parse::<u32>().map_err(|_| tokenator::ParseError::DecodeFailed)?;

        Ok(Self { name, age })
    }

    fn serialize_tokens(&self, writer: &mut TokenWriter) {
        writer.write_token("user");
        writer.write_token(&self.name);
        writer.write_token(&self.age.to_string());
    }
}

fn main() {
    // Parsing example
    let tokens = ["user", "alice", "30"];
    let mut parser = TokenParser::new(&tokens);
    let user = User::parse_from_tokens(&mut parser).unwrap();
    assert_eq!(user.name, "alice");
    assert_eq!(user.age, 30);

    // Serializing example
    let user = User {
        name: "bob".to_string(),
        age: 25,
    };
    let mut writer = TokenWriter::default();
    user.serialize_tokens(&mut writer);
    assert_eq!(writer.str(), "user:bob:25");
}
```

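The hex decoding utilities listed under Features aren't shown above, and their exact names aren't documented in this README, so the helper below is only a sketch that decodes a hex token by hand with the standard library; swap in the crate's own utilities where they exist. `parse_hex_bytes` is an illustrative name, not a crate API.

```rust
use tokenator::{ParseError, TokenParser};

// Illustrative helper (not a crate API): pull the next token and decode it as
// hex bytes, e.g. "deadbeef" -> [0xde, 0xad, 0xbe, 0xef].
fn parse_hex_bytes<'a>(parser: &mut TokenParser<'a>) -> Result<Vec<u8>, ParseError<'a>> {
    let hex = parser.pull_token()?;
    if hex.len() % 2 != 0 || !hex.is_ascii() {
        return Err(ParseError::DecodeFailed);
    }
    (0..hex.len())
        .step_by(2)
        .map(|i| u8::from_str_radix(&hex[i..i + 2], 16).map_err(|_| ParseError::DecodeFailed))
        .collect()
}
```
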
## License

MIT