A powerful coding model has been released: Qwen2.5-Coder-32B-Instruct outperforms GPT-4o and comes close to Claude 3.5 Sonnet
The new model is particularly strong at generating and editing code. Its success is attributed to code-focused training on 5.2 trillion tokens; the size of GPT-4o's training data has not been disclosed, so the comparison is best judged by benchmarks, where Qwen2.5-Coder-32B-Instruct scores on par with or above GPT-4o on code generation and code repair tasks.
Link: huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct
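For reference, a minimal sketch of how one might query the model through the Hugging Face transformers library. The model ID comes from the link above; the prompt, dtype, and device settings are illustrative assumptions, and the 32B weights need a correspondingly large GPU setup (roughly 65 GB in bf16):

```python
# Minimal sketch: load Qwen2.5-Coder-32B-Instruct with transformers
# and ask it a coding question. Assumes transformers and enough GPU
# memory are available; settings below are typical, not prescribed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-32B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread layers across available GPUs
)

# Chat-style prompt; the instruct model expects its chat template.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same checkpoint can also be served through inference engines that support Qwen2.5 models, such as vLLM or Ollama, if loading it directly in transformers is impractical.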