QAOA is a funny algorithm that co-opts a classical computer to perform high-dimensional, multi-parameter optimization of a quantum cost function. The "quantum cost function" is just a normal function that takes as input some number of float values and produces as output one (or several) float values—it just uses a quantum computer to calculate those output float values.
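The shape of that cost function can be sketched in plain Python. This is a toy stand-in with no quantum hardware: the function name, the single-qubit cos(theta) expectation, and the shot count are all illustrative, not the paper's actual circuit.

```python
import math
import random

def qaoa_cost(params, shots=1000):
    # Stand-in for a QAOA cost function: in the real algorithm `params`
    # are circuit angles (gamma_1, beta_1, ...) and the return value is
    # the expectation of a problem Hamiltonian, estimated by running the
    # circuit `shots` times and averaging the measurement outcomes.
    # Here a single-qubit expectation <Z> = cos(theta) plays that role.
    theta = params[0]
    p_one = (1 - math.cos(theta)) / 2                # P(measure |1>)
    ones = sum(random.random() < p_one for _ in range(shots))
    return 1 - 2 * ones / shots                      # noisy estimate of cos(theta)

# The classical optimizer only ever sees these noisy float outputs:
estimate = qaoa_cost([0.3])
```

Note that two calls with identical inputs return different floats, which is exactly what makes the classical outer loop hard.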
A huge problem with using the algorithm practically, especially with more qubits, is that noise really affects how well this multi-parameter optimization can succeed. How do you optimize
f(x1, x2, x3, ..., x49, x50)
when calculating f just once requires repeating a calculation umpteen times on a quantum computer, and this quantum computer is actually providing you
f(x1, ..., x50) + error
where error scales with the number of variables, the size of the program, and (roughly) the inverse square root of the number of calculation repetitions, the usual shot-noise scaling? 50- or 500-dimensional space is big, and there are tons of local optima that disappear and reappear in the presence of this error.
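The shot-noise part of that error term can be seen with a short Monte-Carlo sketch. This assumes only that each "calculation" estimates an expectation value in [-1, 1] from repeated binary measurements; the function name and values are illustrative.

```python
import random

def noisy_estimate(true_value, shots):
    # Estimate an expectation value in [-1, 1] from `shots` binary
    # measurement outcomes, as a quantum processor would report it.
    p_one = (1 - true_value) / 2
    ones = sum(random.random() < p_one for _ in range(shots))
    return 1 - 2 * ones / shots

random.seed(1)
for shots in (100, 10_000):
    errs = [abs(noisy_estimate(0.5, shots) - 0.5) for _ in range(200)]
    # Mean error shrinks roughly as 1/sqrt(shots): 100x more
    # repetitions buys only about a 10x smaller error bar.
    print(shots, sum(errs) / len(errs))
```

So driving the error down is expensive, and every evaluation the optimizer requests pays that full cost again.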
This is a personal opinion and not a statement of fact, but I don't believe QAOA can be scaled to ~50x the problem size the paper demonstrated (roughly what a practical RSA demo would need) on any near-term quantum processor.
And that assumes all the doubts about Schnorr's underlying classical work [1][2] are resolved in the first place.
[1] https://www.schneier.com/blog/archives/2023/01/breaking-rsa-...
[2] https://news.ycombinator.com/item?id=34235350