I miss Everything from Windows.
It’s a file search tool from voidtools with instant as-you-type results. It’s never out of date because it tracks filesystem events. FSearch simply does not compare.
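The trick that makes it instant: Everything watches filesystem change events (on NTFS, the USN change journal) and keeps an in-memory index, so a query never touches the disk. A toy sketch of that idea, with hypothetical paths and event hooks standing in for a real watcher:

```python
# Toy sketch (not Everything's actual code): keep an in-memory
# index of paths and update it on filesystem events, so searches
# never rescan the disk and results are never stale.

class FileIndex:
    def __init__(self):
        self.paths = set()

    # Event handlers, as a real watcher (inotify, USN journal) would call them.
    def on_created(self, path):
        self.paths.add(path)

    def on_deleted(self, path):
        self.paths.discard(path)

    def search(self, needle):
        # Substring match over the index only -- no disk access per query.
        return sorted(p for p in self.paths if needle in p)

index = FileIndex()
index.on_created("/home/me/notes/todo.txt")
index.on_created("/home/me/pics/cat.jpg")
index.on_deleted("/home/me/pics/cat.jpg")
print(index.search("todo"))  # ['/home/me/notes/todo.txt']
print(index.search("cat"))   # []
```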


The bubble continuing ensures the current paradigm soldiers on, meaning hideously expensive projects shove local models into people’s hands for free, because everyone else is doing that.
And once it bursts, there’s gonna be an insulating layer of dipshits repeating “guess it was nothing!” over the next decade of incremental wizardry. For now, tolerating the techbro cult’s grand promises of obvious bullshit means the unwashed masses are interpersonally receptive to cool things happening.
Already the big boys have pivoted toward efficiency instead of raw speed at all costs. The closer they get toward a toaster matching current tech with a model trained for five bucks, the better. I’d love for VCs to burn money on experimentation instead of scale.


This is the real future of neural networks. Trained on supercomputers - runs on a Game Boy. Even in comically large models, the majority of weights are negligible, and local video generation will eventually be taken for granted.
Probably after the crash. Let’s not pretend that’s far off. The big players in this industry have frankly silly expectations. Ballooning these projects to the largest sizes money can buy has been illustrative, but DeepSeek already proved LLMs can be dirt cheap. Video’s more demanding… but what you get out of ten billion weights nowadays is drastically different from six months ago. A year ago, video models barely existed. A year from now, the push toward training on less and running on less will presumably be a lot more pressing.
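The “majority of weights are negligible” claim is the intuition behind magnitude pruning: zero out the smallest weights and the output barely moves. A toy illustration in pure Python, with made-up numbers (real pruning operates on billions of weights, layer by layer):

```python
# Toy illustration of magnitude pruning: drop the smallest-magnitude
# half of the weights and check how little the output changes.
# The weight values here are invented for demonstration.

def prune(weights, keep_fraction):
    """Keep only the largest-magnitude fraction of weights, zero the rest."""
    k = max(1, int(len(weights) * keep_fraction))
    cutoff = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    return [w if abs(w) >= cutoff else 0.0 for w in weights]

def dot(ws, xs):
    return sum(w * x for w, x in zip(ws, xs))

weights = [0.91, -0.03, 0.005, 0.87, 0.01, -0.76, 0.002, 0.04]
inputs = [1.0] * len(weights)

pruned = prune(weights, keep_fraction=0.5)
print(dot(weights, inputs))  # full output
print(dot(pruned, inputs))   # half the weights zeroed; output barely changes
```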


Excellent news. It’s ridiculous that matrix algebra was turned into proprietary software.


There is oxygen on Mars.
Not much. But it’s there.


Search for lightning / 8steps / 4steps / turbo LORAs.
Try Chroma.


“Where do you think we are right now?”
Ooh, fair point. We don’t know that any of these options boot.
I cannot fathom having my shit together to such a degree that my bootloader has a theme.


Maybe they should visit a hospital Israel bombed.
Mods included, now.


Any reliance on remote compute is fragile, because you lack control of the model. Exact versions matter. Every thread that goes ‘how do I use the old [service name]?’ is someone learning this lesson, often too late.
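Running locally lets you pin the exact weights. One common defense is hashing the model file and refusing to load anything but the pinned version, so an upstream swap can’t silently change behavior. A minimal sketch, where the path and pinned digest are hypothetical:

```python
# Sketch: pin a local model to an exact version by content hash.
# PINNED_SHA256 is a placeholder digest, not a real model's hash.
import hashlib

PINNED_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def load_model(path):
    digest = sha256_of(path)
    if digest != PINNED_SHA256:
        raise RuntimeError(f"model at {path} is {digest[:12]}..., not the pinned version")
    return open(path, "rb").read()  # stand-in for the real loader
```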


Ehhh. Mixed bag. Integrating AI into every-damn-thing is half of why people act like a robot kicked their dog - but diffusion is the half that unambiguously does what it’s supposed to, and a popular FOSS tool is a decent place to offer an it-just-works installation for local models.
The images are generated on volunteer GPUs through AI Horde.
Nevermind, fuck this.


Cosmetics are the same abuse.
Fifteen people are mad they have the reputation they’ve proudly demonstrated.


Some people are never worth the time. Block and breathe easy.
Some replies have an untouched positive score because everyone’s already blocked them.


Analogies tend to have something to do with the topic at hand.
Russia invaded Ukraine. Is bickering about motives not good enough? You gotta pretend their army didn’t amass on the border, attempt to seize the seat of government, and claim permanent ownership of a shitload of territory?
You might wanna skim this thread for a second read on what you think I think.
Sleep is also when your brain finally goes “Hey, why didn’t you try this obvious solution?”
See also talking in the shower as a debugging method.