I built a Python script that uses a local Ollama LLM to automatically find and add movies to Radarr.
It picks random films from your library, asks Ollama for similar suggestions based on theme and atmosphere, validates against OMDb, scores with plot embeddings, then adds the top results to Radarr automatically.
Examples:
- Whiplash → La La Land, Birdman, All That Jazz
- The Thing → In the Mouth of Madness, It Follows, The Descent
- In Bruges → Seven Psychopaths, Dead Man’s Shoes
Features:
- 100% local, no external AI API
- --auto mode for daily cron/Task Scheduler
- --genre "Horror" for themed movie nights
- Persistent blacklist, configurable quality profile
- Works on Windows, Linux, Mac
GitHub: https://github.com/nikodindon/radarr-movie-recommender
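The "scoring with plot embeddings" step can be sketched roughly like this. This is my own minimal illustration, not the repo's actual code: the function names are made up, I'm assuming cosine similarity over plot-summary embeddings, and in the real script the vectors would come from a local Ollama embedding model rather than the toy vectors used here:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(seed_embedding, candidates):
    # candidates: {title: embedding}. Returns titles sorted by
    # similarity to the seed movie's plot embedding, best first.
    scored = [(cosine_similarity(seed_embedding, emb), title)
              for title, emb in candidates.items()]
    return [title for _, title in sorted(scored, reverse=True)]

# Toy 3-dimensional vectors standing in for real plot embeddings.
seed = [0.9, 0.1, 0.0]
candidates = {
    "Close match": [0.8, 0.2, 0.1],
    "Far match": [0.0, 0.1, 0.9],
}
print(rank_candidates(seed, candidates))  # ['Close match', 'Far match']
```

The top-ranked titles would then be the ones handed off to Radarr.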


No LLM use is benign. The effects on the environment, the internet, and society are real, and that cannot be ignored.
You can make the argument that in some cases it is justified, e.g. for scientific research.
I saw someone already commented about CO2, so I thought I'd offer a counterpoint to your environment claim regarding water usage (since that's something I've seen a lot of too).
The ISSA had a call to action due to the AI water use “crisis”: https://www.issa.com/industry-news/ai-data-center-water-consumption-is-creating-an-unprecedented-crisis-in-the-united-states/
68 billion gallons of water by 2028! That's a lot…right? Well, what I found is that this is somewhat of a bad-faith argument. 68 billion gallons annually would be a lot for one town, but that's a national figure, and it isn't compared to usage from anything else. So, let's look at US agriculture (something that's tracked very well by the USDA): https://www.nass.usda.gov/Publications/Highlights/2024/Census22_HL_Irrigation_4.pdf
That's 26.4 trillion gallons of water annually. So, AI data centers would represent about 0.26% of agricultural consumption. If AI data center consumption is a crisis, why is agricultural consumption not a crisis? You could argue that agriculture produces "something useful", but usefulness doesn't factor into the scarcity of a resource. So, either it's not a crisis, or you're cherry-picking something that has no meaningful impact on solving the problem.
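For anyone who wants to check the 0.26% figure, the arithmetic is just the two numbers from the linked sources divided:

```python
ai_water_gal = 68e9        # projected annual AI data center water use (ISSA article)
agriculture_gal = 26.4e12  # annual US agricultural irrigation (USDA census)

share = ai_water_gal / agriculture_gal * 100
print(f"{share:.2f}%")  # 0.26%
```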
yeah, I think the whole “water” argument really dilutes the case against data centers.
On a serious note, the argument works for areas that already struggle to supply enough water to consumers. Otherwise, we should be focusing more on the stress on the power grid, and on the domino effect in the hardware supply chain, with cost increases happening across many industries. It started with GPUs; now it's CPUs, storage, networking equipment, and other components.
If these prices are too high for a couple of years, we’ll start seeing generalized price increases as companies need to pass along the costs to consumers.
I think the supply chain issue is probably the most pressing out of all of them. The other points people have are either non-issues or a result of dropping usage hogs into existing electrical infrastructure. Infrastructure can be updated, though.
Supply chain is different. There isn't a supply shortage of chips; it's that profitability dictates you should sell them to data centers or adjacent industries. Unlike infrastructure, where you can just build out more, adding more chip supply just means you have more to sell to data centers. Since the demand is there, at the end of the day profits will always win.
Didn't downvote you. I hear this line of complaint in conjunction with AI, especially if the person saying it is anti-AI. Even without factoring in AI, streaming and content consumption account for some 25 million metric tons of CO2 emissions annually. Computers, smartphones, and tablets can emit around 200 million metric tons of CO2 per year through their electrical consumption. Take data centers, for instance: if they are powered by fossil fuels, that can add about 100 million metric tons of CO2 emissions. Infrastructure contributes around 50 million metric tons of CO2 per year.
Now…who wants to turn off their servers and computers? Volunteers? While it is true that AI contributes, we're already pumping out significant CO2 without it. Until we start switching to renewable energy globally, this will continue to climb with or without AI. It seems, though, that we will have to deplete the global fossil fuel supply before renewables become the de facto standard.
chill, this is extracting text embeddings from a local model, not generating feature-length films
that's like saying "no jet use is benign" while comparing a private jet to a jet ski
the generative aspect is not even used here
So running a local model is unforgivable, but "scientific research" running on hyperscalers can be justified?