Not a bother at all! I have used uMatrix for several years now. It is no longer actively maintained, but has an absolutely unrivaled grid interface (hence Matrix) that comprehensively lays everything out into columns and rows.
Rows represent the different domains and subdomains that a webpage loads assets from.
Columns represent the different types of assets individually.
Sane, strict rules that can be set within the My rules page:
https-strict: * true
https-strict: behind-the-scene false
noscript-spoof: * true
referrer-spoof: * true
referrer-spoof: behind-the-scene false
no-workers: * true
* * * block
* 1st-party image allow
Or these can be set via the graphical matrix grid with the global scope selected; then click the lock icon to make them persistent.
What uMatrix does that uBlock Origin does not (or the authors refuse to integrate into uBlock Origin):
Unfortunately, uMatrix has been left to bitrot, so I’ve been closely watching the development of xiMatrix, which replicates the idea and extends it to also handle remote fonts and inline scripts. (It still needs further development before I’d consider it a drop-in replacement, IMO.)
It loads fine for me without CSS or JavaScript.
Why would you ever want to allow the execution of
adobeDatalayer_bridge.js
adobe_analytics_bridge.js
globalstore_bridge.js
?
Good example of third party trash hiding behind first party domain.
TIL jump hosts are an existing concept
What, you don’t want to interact with a CIA asset?
I do use ClamAV. Most users just run some sort of daily scan, but that is remedial, not preventative.
To truly harness ClamAV’s potential, you need to configure clamonacc for on-access scanning. It hands items off to clamd (which runs with lowered privileges) and blocks file access via fanotify until the realtime scan has cleared.
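The clamd side needs a few on-access directives first. A minimal sketch (the watched path and exclude user are assumptions; check your distro’s defaults for clamd.conf’s location):

OnAccessIncludePath /home
OnAccessPrevention yes
OnAccessExcludeUname clamav

Then start clamd and launch clamonacc as root so it can register the fanotify watches.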
I wonder sometimes if the advice against pointing DNS records to your own residential IP amounts to a big scare. Like you say, if it’s just a static page served on an up to date and minimal web server, there’s less leverage for an attacker to abuse.
I’ve found that ISPs all too often block ports 80 and 443. Did you luck out with a decent one?
I have to constantly remind myself that the average user today cannot tell the difference between a search engine and a web browser.
And we have the URL bar pulling double duty as a search engine entry form to thank for that.
Schneider Electric APC Back-UPS 1500VA, 900W.
They pass the power-on self-test okay, but then simply fail to switch over during outages. I’m still trying to figure out whether it’s a factor of cumulative age, running hours, or a fixed number of power failures they can tolerate, and whether it’s the battery or the UPS itself that’s at fault.
It feels like crashing your car, and then the airbags go off after you’re already mangled and bleeding out.
Retail UPS batteries don’t even last a single year, in my experience. The weekly brownouts and momentary blackouts probably aren’t helping.
At this point, I’m just thinking of building my own with a charge controller, inverter and a bank of car batteries.
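Before buying parts, it’s worth sketching the math. A rough runtime estimate; every figure here is an assumption for illustration (four 60 Ah car batteries in parallel, a 300 W load, 85% inverter efficiency, and only 50% usable depth of discharge, since starter batteries die fast when deep-cycled):

```python
voltage = 12.0           # V, nominal battery voltage
capacity_ah = 60.0 * 4   # Ah, four batteries in parallel
usable_fraction = 0.5    # starter batteries tolerate only shallow cycles
inverter_eff = 0.85      # typical consumer inverter efficiency
load_w = 300.0           # W, the load to carry through an outage

# usable energy delivered to the load, in watt-hours
energy_wh = voltage * capacity_ah * usable_fraction * inverter_eff
runtime_h = energy_wh / load_w
print(f"~{runtime_h:.1f} h of runtime")
```

With those numbers you get roughly four hours, which is why the bank has to be oversized so much: half the nameplate capacity is off-limits if you want the batteries to survive.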
Consider running some kind of file integrity monitoring. samhain, tiger, tripwire, to name a few.
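The core idea behind all of those tools is the same: snapshot a hash baseline, re-hash later, and diff. A bare-bones sketch (real tools also track permissions, owners, and inodes, and protect the baseline itself from tampering):

```python
import hashlib
import os

def snapshot(root):
    """Map every file under root to its SHA-256 digest."""
    digests = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests

def compare(baseline, current):
    """Report files whose hash changed, plus additions and removals."""
    changed = [p for p, h in current.items()
               if baseline.get(p) not in (None, h)]
    added = [p for p in current if p not in baseline]
    removed = [p for p in baseline if p not in current]
    return changed, added, removed
```

Run snapshot() once after a clean install, store the result somewhere read-only, and alert on any non-empty diff.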
I’ve considered containerization, but so far I don’t find it worth forgoing the benefits of a single package manager for the entire server.
Just do MAC with either AppArmor or SELinux.
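For a sense of what that looks like, here is a minimal AppArmor profile sketch; the binary name and every path are made up for illustration:

#include <tunables/global>

/usr/local/bin/myserver {
  #include <abstractions/base>
  network inet stream,

  /usr/local/bin/myserver mr,
  /var/www/** r,
  /var/log/myserver.log w,
  deny /home/** rwx,
}

Drop it in /etc/apparmor.d/ and load it with apparmor_parser -r; anything the profile doesn’t grant is denied, which is the whole point of MAC over plain DAC permissions.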
I’m so glad I grew up when I did and got to experience video games before they began injecting overt political messaging into games.
The New Yorker launched a redesigned home page in late 2023, having reached a similar conclusion.
Oh boy let’s check it out.
newyorker.com attempts to load JS and frames from eight third-party domains. Among them:
“conde.digital” - I’m assuming that means Conde as in Condé Nast, AKA Reddit DNA… and we all know what happens to anything Reddit touches.
“condenast.digital” - the above, confirmed. I can almost feel the bile welling up in my throat.
“cookielaw.org” - probably to serve cookie consent notices to the plebs who fail to block cookies and other trash.
“doubleclick.net” - known malware.
“googletagmanager.com” - so that Google can keep track of all their cattle.
And yet all of their articles are perfectly readable in plain HTML formatting, as expected. Not that I would ever spend any time reading articles from whatever this place is.
Block both JavaScript and CSS. Most of those nags are implemented via some combination of the two.
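If you’re on uMatrix, the equivalent My rules entries would be something like this (assuming its usual source-destination-type-action syntax):

* * css block
* * script block

Set them under the global scope, or flip the css and script columns in the grid per-site when a page genuinely needs them.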
Google workspace
Dystopia is real
Only a few more steps until “Google Government”
That’s good. It should serve as a reminder that you need to abandon dinosaur push media. It’s a subtle way of punishing the normies who haven’t caught on yet.