• 0 Posts
  • 25 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • I use Zimbra with an external email gateway that only accepts authenticated email. Zimbra is pretty heavy (it’s intended to be a Microsoft Exchange replacement), but it has a huge amount of anti-spam protection built in and comes configured out of the box not to relay (beyond whatever aliases and lists you set up).

    That said, for most mail servers it’s not hard to find “incoming only” configurations that deliver to local mailboxes and nothing else. The thing to avoid is a single server configuration that tries to do both - accept external email and send locally originated email out. Such configurations do exist, but they’re confusing and tricky.
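    As a sketch of what “incoming only” looks like in Postfix (assuming Postfix; example.com is a placeholder for your own domain, and defaults vary by distribution), the key main.cf settings are roughly:

    ```
    # main.cf - accept mail for our own domains only, deliver locally, never relay
    mydestination = example.com, localhost
    mynetworks = 127.0.0.0/8 [::1]/128
    relayhost =
    smtpd_relay_restrictions = permit_mynetworks, reject_unauth_destination
    ```

    The last line is what makes the server refuse to relay: anything not addressed to a domain in mydestination, coming from outside mynetworks, is rejected.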

    External email gateways… that bit is hard. I use a mail server I set up myself on a VPS. It does not listen on incoming port 25, and it requires credentials. I did this largely because I was trying to send email out via Xfinity’s customer relay, but Xfinity kept upping the authentication requirements until one day Zimbra just couldn’t be configured to use it any more. And each time they changed something, I wouldn’t find out until I noticed people clearly hadn’t received the emails I’d sent.

    VPSes are problematic because many of their IPs end up on spam blocklists. There’s not much you can do about it if you’re stuck with a bad IP, so if you can find a way to send outgoing email via your ISP’s outgoing server, do that. For Postfix, you can relay authenticated email using something like the following in main.cf:

    relayhost = [smtp.office365.com]:587
    smtp_sasl_auth_enable = yes
    smtp_sasl_security_options = noanonymous
    smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
    smtp_use_tls = yes
    

    and in /etc/postfix/sasl_passwd (run postmap /etc/postfix/sasl_passwd afterwards, and chmod 600 both the file and the generated .db):

    [smtp.office365.com]:587 example@outlook.com:hunter2
    

    So in summary:

    • Consider an email-in-a-box solution like Zimbra. I understand the wish to go for something light, but it might make sense if your aim is just to control your own email.
    • Regardless of which route you take, use separate servers for incoming and outgoing email.
    • For incoming email, lock it down to deliver to local mailboxes only if you’re setting this up manually rather than using an email-in-a-box solution like Zimbra.
    • For outgoing email, require authentication and don’t listen on port 25. Consider either using your ISP’s relay directly, or, if that’s not practical, configuring your outgoing server to relay in turn to your ISP (see above for how to do this.)
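    For the outgoing side, a minimal sketch of that lockdown in Postfix’s master.cf (again assuming Postfix; the column flags shown here are typical but vary by distribution):

    ```
    # master.cf - comment out the public port-25 listener entirely...
    #smtp      inet  n       -       y       -       -       smtpd

    # ...and keep only the authenticated submission service on port 587
    submission inet  n       -       y       -       -       smtpd
      -o syslog_name=postfix/submission
      -o smtpd_tls_security_level=encrypt
      -o smtpd_sasl_auth_enable=yes
      -o smtpd_client_restrictions=permit_sasl_authenticated,reject
    ```

    With this, nothing can hand mail to the server without TLS and credentials, which is what keeps a VPS-hosted relay from becoming a spam cannon.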

    Good luck.


  • This feels like more of an operating system issue than a hardware issue. What you’re looking for is a way to reduce the power it sips while still allowing downloads to happen. Leaving aside the edge cases like OS updates others have mentioned, the major issue is that applications aren’t structured like that.

    If I have Firefox open with one tab displaying a website that runs 1,102 javascript routines all the time in an attempt to negotiate a really good advertising deal for each of the banner ads it’s showing - you know, the type you visit and your machine starts crawling and the fans spin up almost immediately - and another tab open on Ubuntu.com where I’ve just clicked the “Download Ubuntu desktop ISO” button, only Firefox knows which of those tasks can be backgrounded. And right now, as far as I can see, there’s no API in any of the major OSes that lets it say “Send me this signal and I’ll only do the thing that can’t be interrupted”, or “I’ve put the stuff that can’t be interrupted in this thread, so only run this when you’re trying to save power and nobody’s using the computer anyway.”

    Would it be a good idea? Well, that would depend on whether developers actually used the API if it ever came into existence. I’d like it; I just see it being one of those well-meaning things that devs would avoid using because it complicates their code and probably makes it easier to break.







  • That’s an orthogonal thing though. Downvotes and removals are generally triggered by different things, even if they intersect occasionally:

    • “I think The Mandalorian is the best sci-fi western ever” is a reasonable thing to post in c/tv, c/scifi, c/westerns, etc.
    • “I think The Mandalorian is the best sci-fi western ever” might be a reasonable thing in an open topic about comparing Firefly to other space westerns in c/firefly
    • “I think The Mandalorian is the best sci-fi western ever” is clearly an off topic troll if posted as a top level reply to a question “Do you think Mal will ever find his true love” in c/firefly

    In all three cases, it’s entirely possible the comment will get a lot of downvotes. But the third case is the only one where you’d want it removed, because there it’s off-topic and, in context, intended to troll.

    Indeed, it’s possible to envisage a highly upvoted comment that also ends up being removed because it’s off topic or an attempt to derail. Those are actually harder calls, and I’ve seen (Reddit) moderators get them wrong precisely because a successful attempt to derail isn’t always obvious.


  • Also the installer and compatibility. For years I recommended Ubuntu over others because, while the rest were six of one, half a dozen of the other, the installer was pretty much guaranteed to work on everything from the most standard white-box PC to the most finicky Thinkpad.

    Whereas virtually everything else I’d tried was hit or miss - worked with some hardware, had major problems on others. As an example, I recall five years ago trying to get Fedora to run on an old Dell laptop; I had to disable the AMD graphics in favor of the integrated Intel in the BIOS, otherwise it just wouldn’t display anything.

    (Right now I don’t recommend Ubuntu, but it’s only because they went too far with the snap thing.)

    People forget the importance of the installer and how it can be the difference between spending 15 minutes installing with everything set up, and spending hours finding the right set of hacks and BIOS settings, only to be left fighting Wifi drivers for the next six months.


  • There are a few problems here:

    1. Most of the disposable toothbrushes don’t have the ability to replace the heads. Some of them do, as the GP mentioned, but most don’t in my experience.

    2. The ability to replace the heads is not the same thing as actually being able to find the heads in the store that sold you the toothbrush.

    3. The entire assembly typically costs something in the same ballpark as a head replacement anyway.

    4. The entire assembly often costs less to replace on a regular basis than the heads for, say, the Sonicare: $24 for 3 heads at the time of writing, i.e. $8 per head (plus the cost of the rest of the system) for the “right” way, versus a $10 two-pack of disposable brushes, i.e. $5 per unit all inclusive.

    Most of these disposable systems are cheap in every sense of the word (cost and build quality) and not really intended to be used for a long period of time.

    From a consumer standpoint, they make a lot of sense. From an environmental standpoint, not so much. How did we get here? Well, Sonicare would probably argue they make a superior brush and therefore can charge more which may or may not be true. More likely the volumes involved combined with the “Upscale”/“Downscale” marketing associated with each brush makes it genuinely much, much, cheaper to create an all-in-one unit that’s only supposed to last a month compared to the alternatives.



  • This is compounded by the fact that people don’t take care of their teeth, so feedback from dentists is almost always poor.

    I love the way this conversation is usually “What type of toothbrush are you using again?” “Uh, the spinny one you get from the supermarket, it’s disposable so I have to buy one every month, but it seems OK”, “Ah no, what you need is the $250 Philips SuperScrubacare Plus, which has bristles on the end of the bristles, and on the end of those bristles are more bristles, and on the ends of those are little robots with tiny vacuum cleaners and flame throwers. Those really kill plaque. Also stop eating so much sugar.” “Ummm OK” “Anyway, we’re done. Here’s a cheap ass regular unpowered toothbrush. And a starlight mint.”



  • I’m not directly familiar with either, but Syncthing seems to be about file synchronization, so I’m not entirely surprised it’s file oriented, and Jellyfin looks like it’s a server of curated media rather than something for user-maintained content. So I’m not entirely surprised neither supports S3/MinIO.

    Yeah, it took me a while to realize what S3 is intended to be too. But you’ll find “blob storage” is now a major part of most cloud providers, whether they support the S3 protocol (which is Amazon’s) or their own, and it’s used precisely the way we’re talking about: user data. Things clicked for me when I was reading the Dovecot manuals and found S3 supported as a first-class back-end storage system alongside maildir.

    I’m old though, I’m used to this kind of thing being done (badly) by NFS et al…


  • It’s not always possible, but it’s generally good practice to configure your applications to use external storage services rather than the local filesystem - MySQL/PostgreSQL for indexable data, and S3 clones like MinIO for blob storage.

    One major reason for this is that these systems generally have data replication and failover redundancy built in. So you can have two or more physical servers, run an instance of each type of storage server on each, and have them stay synchronized. If one server goes down, its disks crash, or you need to upgrade, you can easily rebuild a redundant set without downtime, and all you need to preserve is the configuration (so take notes!)

    Like I said, not always possible, but in general the more an application needs to store “user data”, the more likely it is to support one of the above as a backend. That significantly reduces the number of application servers that need to be backed up, and may reduce your need for NFS etc to separate out the data.
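    One way to keep that option open even before you deploy MinIO is to put blob access behind a tiny interface in the application, so the backend can be swapped later without touching the rest of the code. A minimal sketch in Python (all names here are illustrative, not from any particular app):

    ```python
    # Sketch: abstract blob storage behind an interface so the backend can be
    # a local directory in development and S3/MinIO in production.
    from abc import ABC, abstractmethod
    from pathlib import Path

    class BlobStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class LocalDirStore(BlobStore):
        """Development backend: keys map to files under a root directory."""
        def __init__(self, root: str):
            self.root = Path(root)

        def put(self, key: str, data: bytes) -> None:
            path = self.root / key
            path.parent.mkdir(parents=True, exist_ok=True)
            path.write_bytes(data)

        def get(self, key: str) -> bytes:
            return (self.root / key).read_bytes()

    # A production S3/MinIO backend would implement the same two methods
    # using an S3 client's put-object/get-object calls against a bucket;
    # the rest of the application never knows which backend it's talking to.
    ```

    The application only ever sees put/get on keys, which is exactly the model S3-style blob stores expose, so migrating the data later is a copy job rather than a rewrite.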


  • Having done it before my honest advice to anyone planning this is:

    1. Start with a Mastodon account on a regular server.
    2. Build lists of friends etc.
    3. After a few months, once you’ve curated a feed you like, move to a self hosted one.

    That’s if you intend to use it “socially” as opposed to, say, “commercially” (e.g. a cartoonist publicizing their work, or the corporate Mastoverse account for a burger chain). In the commercial case it makes sense to have that account on a private server (where it’s essentially self-verifying and can’t be killed by a single confused, overworked instance admin - in the burger chain’s case, also by an instance admin who’d rather not host commercial accounts), plus a private account on one of the main servers for just being yourself.


  • I didn’t say it (directly) supported capitalism, I said the fact modern Christians accept it despite significant changes to biblical canon was a demonstration that modern Christians believe that power is given by God.

    Also Capitalism isn’t that new. The term is, but it’s always been used to describe pre-existing market based economies and concentrations of wealth, and pretty much every era has had a significant civilization that had that.

    Your point about English translations: nobody’s criticizing translation into English per se. But the King James edition included, for example, the “sodomite” language, which doesn’t appear to come from any legitimate translation of the source texts. So it did significantly change the meaning of the Bible in places, in fairly negative ways.


  • I’m aware various groups and individuals appeared at various times during the last two millennia that opposed abortion on Biblical grounds. But I was specifically referring to the Catholic church. The quote you’re responding to was “(…) the Catholic church didn’t adopt this position until the late 19th Century. It literally took nearly two millennia for anyone in the primary Christian religion to notice their book had these (supposedly) anti-abortion messages.”

    Now, true, “anyone in the (Catholic church)” is probably hyperbole, but certainly “anyone in position to make decisions in the (Catholic church)” is accurate. They didn’t adopt their current stance until the late nineteenth century.


  • I suspect you can find ways to read into the Bible whatever you want to read. As a basic example, modern Catholics are convinced the Bible outlaws abortion, and there’s a ton of road side billboards next to Catholic churches that supposedly quote Biblical anti-abortion statements. But the Catholic church didn’t adopt this position until the late 19th Century. It literally took nearly two millennia for anyone in the primary Christian religion to notice their book had these (supposedly) anti-abortion messages. What’s more likely, they missed them, they ignored them because it was inconvenient, or none of these quotes are as clear cut as the billboards would imply?

    Then you have the allegiance to the King James edition of the Bible, which most Christian churches maintain, and that generally feeds into a more direct answer to what you’re asking.

    Why King James? What makes him more of an authority on what the Bible means than Jesus, his disciples, and the other contemporaries and near contemporaries who put the Bible together? Well, he’s a King of course.

    …crickets…

    And God loves powerful people?

    …crickets…

    Uh, OK, well, what about if God didn’t want him to be King, he wouldn’t be a King, therefore, ergo, God thought King James was a pretty cool dude and should be able to do whatever he wanted? Including edit the Bible and put some stuff in there that wasn’t in there originally?

    Ding ding ding!

    NOW is it starting to make sense? Because if God didn’t want Elon Musk or Jeff Bezos or Rupert Murdoch or Peter Thiel or Sheldon Adelson or (long list of other rich jerks) to be rich and powerful, they wouldn’t be rich and powerful, right?

    Now, never mind the contradictions here, I mean, I’m pretty sure the Bible does, in fact, have some choice words to say about rich people, and they’re not positive, and it’s pretty anti-Roman Empire in parts, especially the bit about crucifixions, but that all requires reading the Bible, and not trying to find double meanings to justify the status quo.

    Add to that the fact the rich and powerful control the narrative and always will, and you’re left with Prosperity theology and all its ramifications becoming more and more a consensus in countries that allow people to become that rich and powerful.

    What the Bible says… well, “it’s not meant to be taken literally, it refers to any manufacturers of dairy products” The eye of a needle might be too small for a camel, but the loophole of not being meant to be taken literally certainly can be.


  • To own my own data and feed and have some control over what’s pushed at me?

    I mean, I get it. Some people hate X and Meta. I hate them too. But if my aim was to get away from those two, I’d be on Tumblr, not Mastodon. If I was concerned that my postings to “social media” can be abused, I wouldn’t use Mastodon either, it’s completely open and there’s very little concept of privacy.

    To put it bluntly, Meta doesn’t even need to join the Mastoverse with an ActivityPub instance to vacuum up your Mastoverse data. It just needs single accounts on the big instances following their “Federated” feeds, plus a little algorithmic work to link accounts to Facebook accounts. It’s actually easier for Meta to suck your data out of the Mastoverse than it was out of Twitter or Tumblr. (I deadnamed X because I assume X’s position is so dire that if Meta offered to pay for everyone’s feeds, Musk would sell it all. Twitter, for all of its faults, wouldn’t have done that.)

    What I’m hoping is that Meta will follow through and join properly, offering ActivityPub feeds and the ability to subscribe to them. Doing so will give Meta’s own users an off-ramp, making it easier for them to leave without losing their circle. And it’ll give the morons who insist that “OMG MASTODON IS TOO HARD YOU HAVE TO CHOOSE A SERVER!” (I can’t be polite about these people any more; the number who brag about their own idiocy is astonishing) a “simple” social network they can join, with that off-ramp available for the future.

    But no, in my case, I didn’t join Mastodon to get away from Meta. I joined so I have the network I want.