That was a joke in the release video. The Pythia model is already released at [1], and the deltas for the LLaMA model should be up here [2] in the next few days.
It's also relatively cheap to make your own LLaMA-30B-based weights; the real value of OpenAssistant is in the training data, and all of that data has been made available.
The OpenAssistant effort gets an A+ for open source contributions.
Oh, ha, yeah, this is exactly the gag I fell for. I just noped out of the video and wrote the project off, since this was the first I'd ever heard of them, and their website just has a signup with no downloads that I could see.
Unless something changed, I thought they literally cannot legally release weights based on LLaMA (except maybe via an XOR delta), so they're going to train it on something else.
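For anyone unfamiliar with the XOR-delta trick mentioned above: the idea is to publish only the bytewise XOR of the fine-tuned weights against the original LLaMA weights. The published file is meaningless on its own; only someone who already obtained the base weights from Meta can recover the fine-tune. A minimal sketch (toy arrays standing in for real checkpoint tensors; function names are my own, not any project's API):

```python
import numpy as np

def make_delta(base: np.ndarray, finetuned: np.ndarray) -> np.ndarray:
    # XOR the raw bytes of the two weight tensors. The result looks like
    # noise without the base model, so it can be distributed without
    # redistributing the original weights.
    return np.bitwise_xor(base.view(np.uint8), finetuned.view(np.uint8))

def apply_delta(base: np.ndarray, delta: np.ndarray) -> np.ndarray:
    # XOR is its own inverse: base ^ (base ^ finetuned) == finetuned.
    return np.bitwise_xor(base.view(np.uint8), delta).view(base.dtype)

# Toy demonstration with random "weights".
rng = np.random.default_rng(0)
base = rng.standard_normal(4).astype(np.float32)
finetuned = rng.standard_normal(4).astype(np.float32)

delta = make_delta(base, finetuned)
recovered = apply_delta(base, delta)
assert np.array_equal(recovered, finetuned)  # exact bitwise round-trip
```

Whether an XOR delta is actually a derivative work of the base weights is a legal question, not a technical one, which is presumably why projects hedge on this.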