There are too many things off about the origin and the author not to be suspicious of it. I'm not sure what the motivation was, but LLM involvement seems likely. I do think they leaned heavily on the Zig source code, and either built a pipeline of some sort feeding relevant context into the LLM, or just pointed Codex or whatever at the source and told it to read it.
It seems like it must have taken quite a bit of effort to make, and it's interesting on its own. I would trust it more if I knew how it was made (LLMs or not).
As another suspicious data point, see this issue filed by the author: https://github.com/microsoft/vscode/issues/272725
Edit: https://news.ycombinator.com/item?id=45952581 points out some concrete issues.