You could look into Prose: a Slack/Discord/Mattermost-style interface, built on XMPP, with E2EE.
Bitwarden’s local cache does not include attachments, though. If you rely on those, you still depend on the server being available.
While I like and appreciate the campaign, the issue is IMO bigger. IoT devices, for example, even have an environmental impact when the services behind them get discontinued.
I would therefore like a more general rule: whenever a product is discontinued, for whatever reason, all necessary documents, sources, etc. need to be released to allow third parties to take over maintenance (that also includes schematics for hardware repairs).
I don’t understand how that hybrid is supposed to work. Monospace is a binary attribute; either all chars have the same width or they don’t. So what is this font, then?
Why? What did Zenimax do to you?
Unreal Tournament
Most people in my company use macOS, followed by a few dozen Linux users (various distros; whatever each one prefers), followed by a few Windows users (for whatever reason they want that). So essentially: we can choose what we want to use.
Yes and no; you left out part of my quote. Stuff that can be put in a reminder is up to me (especially if I tell them “I’ll handle it”). But if for whatever reason that’s not possible and I tell them “you might have to remind me again next week” and they are fine with that, then they shouldn’t be pissed if I indeed need a reminder. That’s what I meant by “I warned them”.
This doesn’t seem reasonable… If you accept some responsibility
But … that was the point. “Telling them your boundaries” implies not accepting something you are not up to. My managers know that I am not a good manager myself. I have a lot of qualities, but being a driving force in a project is not among them. So they don’t utilize me for that. Which is good.
Yes, it would be on me if I constantly told them “sure, just let me handle it” and then didn’t handle it. But that would be the opposite of what I wrote above.
I mostly agree, but (what else ^^):
No one has the right to make their internal turmoil everyone else’s problem, even if it may be particularly burdensome. The world should be far more sympathetic and empathetic, but at some point you have to take responsibility for you.
IMO you do take responsibility when you tell others about your boundaries and how they can work around them. If they don’t want to, because it also costs them a little bit of energy and disrupts their typical workflows, they have (again: IMO) no right to blame it all on you. If I tell them “I can’t do X” or something and they again and again expect me to do X, it’s also on them.
Simple example: I tell colleagues, family, whoever, to please remind me again if they feel I missed something they expected of me. If they do, all is good. If they later are pissed that I missed something and immediately blame me … sorry my friend, I warned you. (If I had the ability to set a reminder, sure, that’s on me for not doing it. But it doesn’t always work that way.)
They also fuck over their own OS. I don’t think they deliberately broke dual-boot installs; they simply don’t put enough effort into QA. (See their recent problems with BitLocker after an update. Or that one update that fails because some internal partition is too small. And so on.)
Fli4l is still around?! Crazy. I used that back in 2002 or so to turn an old i386 with 3 ISA HP 100Mbit network cards into a router + fileserver combo. Good times.
glibc’s `malloc` creates per-thread arenas, and the number of arenas it allows scales with the number of CPU cores you have. The JVM might spawn a shitload of threads, which can increase the memory usage outside of the JVM’s heap considerably. You could try to run the JVM with tcmalloc (which will replace the `malloc` calls for the spawned process). Also, different JVMs bundle different memory allocators; I think Zulu could also improve the situation out of the box. tcmalloc might still help additionally.
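If you want to test whether the arenas are the culprit, here is a minimal sketch of both options (`app.jar` is a placeholder; the tcmalloc library path is distro-specific, the one below is typical for Debian/Ubuntu with the gperftools packages installed):

```sh
# Cap the number of glibc malloc arenas; glibc reads this env var directly.
MALLOC_ARENA_MAX=2 java -jar app.jar

# Or swap the allocator entirely by preloading tcmalloc.
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4 java -jar app.jar
```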
I wonder if that would be a genuine use case for “AI”. If the voice actor consents to having his voice represented in such a scene but doesn’t want to act it out in a studio, the computer model could take over that part.
It’s an okay game, but far worse than the first two. They forced an open world onto it and made it pretty repetitive. The DLC is more linear and feels a lot more like typical Mafia storytelling.
Software/Staff Engineer, as Architect and Solver. So I help design our system (from the technical side), I assist and to a degree coordinate teams, I jump in when know-how or manpower is needed, I rework or rebuild systems that have no clear team ownership, and so on. Oh, and I always have an opinion, no matter the (technical) topic.
It doesn’t say that devs need to target home machines; it says they need to provide the resources so people can host it themselves, period.
Before attacking me with such an arrogant rant, maybe read what I wrote.
I said:
Once they release the source, people can refactor or reengineer it to run on smaller scale, replace proprietary databases with free ones, etc.
So of course it’s about releasing anything (!) at all.
I simply said that you can’t compare a small fan project like a self-hosted WoW server with Blizzard’s infrastructure and the requirements of a highly available setup for millions of players.
ArenaNet is quite open about their infrastructure, and you can see that it is far from trivial, but it also allows them to do zero-downtime updates. That is a huge feat, but it also means that self-hosting that thing will be a pain in the ass. Yet I would not want them to forgo all that just so it could be easily (!) self-hosted some time in the distant future.
Such an architecture is typically shit. Building a system that is simple AND scales that high won’t work; complexity usually gets added to cope with scale. If we don’t allow companies to build scalable (i.e. complex) systems, we simply won’t get such games anymore.
Again: I am completely in favor of forcing devs to release everything necessary to host it. I am not in favor of forcing devs to target home machines for their servers, when their servers clearly have completely different requirements. That’s unrealistic.
Not a fair comparison. The private servers were written with small-scale hosting in mind. They would very likely never scale to what Blizzard has in place. For all I know, Blizzard could run their stuff on a mainframe with platform-specific optimizations against an IBM DB2.
But I also don’t think this has to be transferable to a local setup without effort either. Once they release the source, people can refactor or reengineer it to run on smaller scale, replace proprietary databases with free ones, etc.
It’s more comparable to Snikket. Both Snikket and Prose use Prosody as their server, with their own extensions.