Let this be a lesson to you then. Checking the logs should be your first troubleshooting step, not installing a variety of distros until one “just works”. Good luck.
I’m not biased and I’m not picking a side, but there is a lot of whataboutism in this thread, and I stand by my stance that it is a weak argument and a logical fallacy.
Whataboutism isn’t a very convincing argument.
Just organize your library properly and pretty much any software will manage it better. There are options for organizing and renaming files mostly automatically, like EasyTAG or FileBot. Some people use Sonarr and Radarr to organize shows and movies, but those are probably overkill for you. The various *arrs will be more useful if you’re consuming new media through a server hosting Plex or Jellyfin. Kodi is also a waste if the library isn’t already meticulously organized and you don’t need a 10-foot interface.
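If you do go the FileBot route, it also has a CLI that can do the renaming in one pass. A rough sketch only; the paths and naming format here are placeholders, so check the docs for what fits your library:

```
# Hypothetical one-shot rename of a messy TV folder with FileBot's CLI
# (source/output paths and the format expression are placeholders)
filebot -rename ~/Videos/unsorted \
  --db TheTVDB \
  --output ~/Videos/TV \
  --format "{n}/Season {s}/{n} - {s00e00} - {t}" \
  --action move
```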
If you’re only consuming on desktop and you insist on being disorganized, then why even bother with anything other than VLC? It runs on Linux, Windows, iOS, and Android.
Those are the kinds of inconsistencies the GOP uses to accuse you of voter fraud, even if you correct it. Don’t be surprised if some GOP busybody adds your name to a list of voter registrations to purge. Whenever they talk about voting irregularities and a rigged election, this is exactly the kind of voter disenfranchisement they are trying to accomplish. Hell, question enough votes and you can get the votes of an entire region thrown out. You don’t need any proof to make the accusations; then the massive shit sandwich gets dropped in the lap of a few underpaid people in understaffed offices to substantiate. I recommend you vote as early as possible and follow up to make sure your vote gets counted.
Yes, I’ve done almost exactly this while traveling. You can even carry around a couple of differently configured SD cards for different use cases. I had one with Jellyfin for sharing locally and also Kodi for a direct HDMI connection to TVs. There is an Android app for Jellyfin called Findroid that allows offline copies from the media server, so I didn’t need the thing powered the entire time I wanted to watch something on my phone, just long enough to download it. Adding Samba shares adds another layer of accessibility. I had another SD card with video game ROMs for retro gaming, but that one got left at home because it requires controllers and I didn’t think I’d use it that much. I had another with “little backup box” installed for automatically backing up my photos and videos after a day out exploring with my camera.
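On the Samba part, the share definition is only a few lines. A minimal sketch of a read-only guest share, assuming the media lives under /mnt/media (the share name and path are placeholders):

```
# Append a hypothetical read-only guest share to the Samba config
sudo tee -a /etc/samba/smb.conf > /dev/null <<'EOF'

[media]
   path = /mnt/media
   read only = yes
   guest ok = yes
EOF

# Restart Samba so the new share shows up (the service is called smb on some distros)
sudo systemctl restart smbd
```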
I used a Raspberry Pi 5 for all of this, running from a battery backup, because I didn’t really need a keyboard once I had remote connections to my phone sorted out. Pick a rugged case and you can just toss it in your bag of chargers. It took up about as much space as a pack of cigarettes. Another option would be the Raspberry Pi 400, built into a keyboard. A little bulkier, but maybe more resilient in the face of technical difficulties.
Oh yeah, I totally support the local copy. That will save you in times of hardware failure or fuck-ups. I could just never keep up with the maintenance and kind of gave up making automatic backups and syncing. But reorganizing often translates to integrating deletions into rsync or whatever syncing protocol you use, and that has caused me headaches and heartaches.
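For what it’s worth, the thing that saved me the worst of those headaches was always doing a dry run before letting deletions propagate. A minimal sketch, assuming rsync and placeholder paths:

```
# Dry run (-n): preview exactly what --delete would remove on the backup side
rsync -avhn --delete /srv/music/ backupbox:/srv/music/

# Only after reviewing that output, run it for real
rsync -avh --delete /srv/music/ backupbox:/srv/music/
```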
Yeah, that was a typo. Thanks, I’ll fix it.
I have a very similar setup to yours: a relatively large music library, around 1.7TB of mostly FLAC files, on my server. I’m able to organize these files locally from my laptop, which at various times has run either OSX, various GNU/Linuxes, or Windows. However, I do not bother pushing the files themselves back and forth over the network.
Even if I did, I wouldn’t automate the syncing, I’d only run it manually after I’d done my organizing with Picard for that day. After all, if the organization with Picard isn’t automated, why should the syncing be? I’d probably use rsync for this.
In actual practice I do this: connect to my server from my laptop using ssh, forwarding X, and run Picard on the actual server through that remote connection. Picard runs just fine over ssh. Opening a browser from Picard for occasional MusicBrainz.org stuff is a little slower but works. When I’m done with Picard for the day, I use a tmux or screen session to run the rsync command for syncing to a backup if necessary.
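In case it’s useful, the whole routine boils down to a handful of commands like these (the hostname and paths are placeholders):

```
# Forward X so the Picard window from the server shows up on the laptop
ssh -X me@musicserver
picard &

# When done for the day, run the backup sync inside tmux so it survives a dropped connection
tmux new -s sync
rsync -avh /srv/music/ /mnt/backup/music/
```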
I don’t really bother keeping a whole copy of my music collection locally on my laptop or phone though, since it’s been bigger than is practical for a long time. Managing multiple libraries and keeping them in sync turned into such a hassle that I was spending more time organizing than actually listening (or making mixtapes/playlists). To listen to my music locally I’ve used either Plex or Jellyfin, sometimes MPD (like when my server was directly connected to my stereo receiver), or just shared the folder via Samba or NFS.
Then maybe you can tell me what “attempting to do more” means, because the author of the article certainly didn’t. Or why that’s bad. My only takeaway is that the author thinks the system should facilitate the running of applications and just get out of their way already. But that sounds a lot like building a road network and then failing to install traffic controls because the DOT should just stay out of the way of traffic.
Well there’s your problem. Public wifi is going to have systems in place to stop exactly the kind of thing you’re trying to do.
I have set up and run what are basically HTPCs for decades now. Kodi running on a Debian-based Linux distribution, or just Debian, is a solid recommendation and has lots of support for infrared remotes, but Kodi can be very fiddly to set up properly. It will work, but don’t expect it to work “out of the box”. You’ll probably still need a mouse and keyboard for anything outside Kodi. You’ll have to read a bunch of documentation and do some customizing to get the most out of Kodi. It’s still easier than most other setups, but it will feel very frustrating if it’s your introduction to Linux too.
I’ve moved to using my HTPC primarily as a server. Once you get comfortable with Linux and Docker, setting up new server services like Jellyfin, Plex, and an *arr stack is relatively trivial. The advantage here is that you can serve your media to any device that can connect to your server. For me that means one library of media to share with any TV in my house, any mobile device I own, and any friends and family computer-savvy enough to download the right apps and set up an account. If your network (and your Internet connection) isn’t reliable, this kind of setup may not work very well for you at all. For example, Plex account authentication will fail if you don’t have Internet. Jellyfin and Kodi fare better when Internet is only available occasionally or is unreliable.
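For reference, getting Jellyfin running under Docker is roughly one command. A minimal sketch using the official jellyfin/jellyfin image; the host paths are placeholders for wherever your config and media actually live:

```
# Jellyfin in a container: config persisted on the host, media mounted read-only,
# web UI on the default port 8096 (host paths are placeholders)
docker run -d \
  --name jellyfin \
  -p 8096:8096 \
  -v /srv/jellyfin/config:/config \
  -v /srv/media:/media:ro \
  --restart unless-stopped \
  jellyfin/jellyfin
```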
My least favorite part of using Kodi was setting up the remote. Even worse was trying to configure controllers for retro gaming. The situation is MUCH better than it was, but is still far from easy. I was kind of able to sidestep the remote problem because now I can just use the remote for the TV (if it supports the Plex or Jellyfin apps) or a streaming stick like a Fire Stick, Nvidia Shield, or Roku. My Nvidia Shield can pair with any Bluetooth controller and runs RetroArch, so that problem was sidestepped too. ROMs can be copied via Samba shares or loaded directly from a USB drive.
TLDR: Kodi has built-in support for IR, but streaming sticks are cheap, and in the long run I found setting up a server was more versatile, more reliable, and less stressful. I know, I also hate it when people ask for a specific solution and others recommend asking a different question. But in this case, my experience is that IR remotes suck, are flaky, and aren’t worth it if there is any other option.
Do you mind sharing what brand retail UPS weren’t lasting a year?
I’m dealing with similar brownouts, and I’m also in an area with lots of lightning. I got about 5 years out of my UPS batteries. Wondering if I’ve just been lucky.
Windows is never going to like an NTFS partition that has been touched by another OS, even if Windows was completely shut down during that time. Reading the NTFS partition might be okay, but last I checked none of the Linux drivers could write without Windows noticing and fouling things up. If that has changed it would be welcome news to me, despite my waning use of Windows.
If Windows (and to a lesser extent that other OS) came bundled with some ability to mount, read, and write filesystems popular with other operating systems, this wouldn’t be such a problem. One shouldn’t have to involve the network stack or third-party drivers just to share a partition on the same hardware, or a portable drive with a modern filesystem.
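For what it’s worth, the dance I’m aware of looks roughly like this; the big gotcha is Windows Fast Startup leaving NTFS flagged as hibernated, and the newer in-kernel ntfs3 driver (kernel 5.15+) is the part that may count as “that has changed”. The device and mountpoint below are placeholders:

```
# On the Windows side, disable Fast Startup/hibernation so NTFS isn't left in a
# hibernated, "dirty" state (run from an elevated prompt):
#   powercfg /h off

# On the Linux side, mount read-write with the FUSE ntfs-3g driver
sudo mount -t ntfs-3g /dev/sdb1 /mnt/windows

# Or, on kernels 5.15 and newer, with the in-kernel ntfs3 driver
sudo mount -t ntfs3 /dev/sdb1 /mnt/windows
```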
Funny. This is the same video I show to my friends to help them pronounce guanábana correctly.
Quick fixes run from crontab, the kind that don’t really solve the problem so much as reset the countdown to failure, are the real duct-tape Linux hacks.
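A hypothetical specimen of the genre, for anyone who hasn’t committed this sin yet (the service name is made up):

```
# crontab entry: restart the leaky service every night at 4am instead of finding the leak
0 4 * * * /usr/bin/systemctl restart mediaserver.service
```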
I’ve also found that the documentation online is much better, or at least easier to search, with Ubuntu in particular than with any other distro. This is probably mostly due to popularity at this point, as you said, but I think they got that popularity because of the straightforward and easy-to-digest documentation. And I’m not just talking about self-help support forums, I mean published and polished wikis and guides hosted by the distro itself.
Windows wasn’t my first operating system. I don’t even remember what my first was, but it ran on top of DOS and had a 5.25 inch floppy drive. I’ve used pretty much every Windows desktop version since 3.1, but really only installed or maintained XP, 2000 Server, and Windows 10 on my own hardware. But I’ve also installed and maintained various Linux and BSD distros since about the turn of the millennium, including a brief relationship with a Mac laptop running OSX.
There was never a switch. I always ran whatever I could get working that would get the job done. For some tasks that was Windows, either because it was good enough and came pre-installed, or because it was required by the software I needed to run for school or work. I’ve handed in many assignments on 3.5 inch floppies. I haven’t maintained a server with Windows since Windows 2000 Server. I’ve tried Slackware and Corel Linux. I bought SUSE Linux in a box from a big box store. I’ve gotten those brown Ubuntu install CDs in the mail. I remember being delighted with the development of BitTorrent, because now my downloads would check themselves for consistency as the ISO came down. No more getting to the end of a download only to discover the md5sum didn’t match. I’ve used Knoppix and Clonezilla for system recovery.
There was never a change. I’m a tech nerd that likes Linux, not a Linux nerd that likes tech. But it was the way Windows kept destroying my Linux partitions that drove me away from dual booting and from installing Windows on anything in general. The Windows situation with viruses, updates, and lack of security also drove me away unless I was compelled. Now Windows lives on its own hardware or in a VM for me.
It’s a damn shame that we haven’t built a microwave that actually listens to the pops and stops when the pops slow, just like every bag of popcorn instructs you to do. We’ve got gunshot detectors; you’d think we could build a chip to analyze popping popcorn.