Just tried the new open-source 20B-parameter OpenAI gpt-oss model on my laptop. Here’s what I got.
I have no idea why it needed to generate code for a multi-threaded Fibonacci calculator! The funny part is that it did all that, then just evaluated the original multiplication request in Python and printed the result.
On the plus side, the performance was pretty decent.
Thanks for the tip! Will check it out when back at desk. A bunch of us have been scratching our heads over this one.
Yeah, I can’t check at the moment, but it has been an issue for me for a while.
I was very confused about why it would randomly fetch the weather for Paris with some models, and it was even more confusing because I was learning MCP at the time by writing my own weather tool, so the LLMs were getting conflicting instructions.
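For anyone curious what "writing a weather tool" for MCP involves: a minimal sketch of the tool definition an MCP server advertises is below. The tool name, description, and `city` parameter are my own illustrative choices, not from the original post; the `name`/`description`/`inputSchema` shape follows the MCP spec's tool definition format.

```python
import json

# Sketch of a tool definition a hand-rolled MCP weather server might
# advertise to clients. Names and fields here are illustrative.
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# Once a schema like this is advertised, some models will call the tool
# even when the prompt doesn't ask for weather, which matches the
# behavior described above.
print(json.dumps(weather_tool, indent=2))
```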