Sounds cool. Sum types like Haskell’s data types or Rust’s enums, combined with proper pattern matching, are pretty much a requirement for a good language imo. And the process/message passing is interesting.
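To illustrate what I mean by sum types plus proper pattern matching, here’s a rough sketch in Java 21 terms (sealed interface + record patterns standing in for a Haskell data declaration or a Rust enum with match; the Shape/Circle/Rect names are just made up for the example):

```java
// Sealed interface + records: roughly Java's take on Haskell data / Rust enums.
sealed interface Shape permits Circle, Rect {}
record Circle(double radius) implements Shape {}
record Rect(double width, double height) implements Shape {}

public class PatternDemo {
    static double area(Shape s) {
        // Exhaustive switch with record patterns: the compiler rejects a missing case.
        return switch (s) {
            case Circle(double r) -> Math.PI * r * r;
            case Rect(double w, double h) -> w * h;
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Circle(1.0)));    // ~3.14159
        System.out.println(area(new Rect(2.0, 3.0))); // 6.0
    }
}
```

The nice part is the exhaustiveness check: add a new variant and every match site that forgot about it fails to compile, instead of blowing up at runtime.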
Sticky lines look nice. Ideally I’d never encounter code where they’re really needed, but unfortunately sometimes it do be like that. The extra context would make it a lot easier to follow what I’m reading.
I’m not sure what the best approach would be, but for reading docx you might be better off using something like Apache POI. Docx may be XML, but it’s imo an absolute abuse of XML, and POI shields you a little bit from all the nonsense happening inside it. I could see ANTLR working for Typst, since there’s probably not another interface for it.
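For what it’s worth, pulling the text back out of a docx is roughly this much code with POI. Untested sketch: it assumes the poi-ooxml dependency is on the classpath and a hypothetical input.docx next to it.

```java
import java.io.FileInputStream;
import org.apache.poi.xwpf.usermodel.XWPFDocument;
import org.apache.poi.xwpf.usermodel.XWPFParagraph;

public class DocxDump {
    public static void main(String[] args) throws Exception {
        // "input.docx" is a placeholder path for the sketch.
        try (FileInputStream in = new FileInputStream("input.docx");
             XWPFDocument doc = new XWPFDocument(in)) {
            // POI hands you paragraphs and runs instead of the raw OOXML soup.
            for (XWPFParagraph p : doc.getParagraphs()) {
                System.out.println(p.getText());
            }
        }
    }
}
```

You’d still have to map paragraphs/runs/tables onto whatever Typst structure you want, but at least you’re not parsing the XML yourself.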
I don’t think it’ll support it, but you could also check if this can be done with pandoc.
Yep, I feel a bit more prepared now. I haven’t had time yet today, but from what I saw the first one shouldn’t be too hard.
Last year I decided to do it in Rust, in order to learn Rust. I found out pretty quickly that you can’t just jump from Java/Python/Haskell into Rust and expect to understand what’s going on. This year I feel more prepared, so if time permits I’ll make it right this time.
As a developer, I feel absolute pain for the people who had to convert these. There are quite a few edge cases and sensitive topics to dodge here, and getting something wrong might piss people off. They must’ve had some lengthy meetings about a few emoji.
I might be misunderstanding what you mean by “implementing” an LLM, but unless you have a good understanding of deep learning and the math behind it, I wouldn’t recommend implementing one from scratch; there’s a lot of complex math involved in that kind of thing. If you mean building an application around an existing LLM, for example a chat website that interfaces with ChatGPT or a local model, then it’s doable (depending on your current skills).
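For the second option, the core of such an app can be as small as one HTTP call to an OpenAI-compatible chat endpoint. A minimal sketch with plain Java 11+ HttpClient, no extra libraries; the endpoint and model name here are just examples, and the API key is assumed to be in the OPENAI_API_KEY environment variable:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ChatDemo {
    public static void main(String[] args) throws Exception {
        // Request body for the chat completions API; model name is an example.
        String body = """
            {"model": "gpt-4o-mini",
             "messages": [{"role": "user", "content": "Say hello"}]}
            """;
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The reply is JSON; the assistant's text sits under choices[0].message.content.
        System.out.println(response.body());
    }
}
```

A local LLM server that exposes the same API shape would work the same way, you’d just swap the URL. The rest of the project is ordinary web development: a frontend, some JSON parsing, maybe streaming.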