Signal is fully open source! You can run it on-premises, if you know what you're doing!
Why are we not talking about it?
Unless something has drastically changed recently, the official Signal service won’t interoperate with anyone else’s instance. That makes its source code practically useless for general-purpose messaging, which might explain why few are talking about it.
My point is that you have all the open source software components needed to run secure communications, on your own premises, for your own users/community, if you don't trust Signal's infrastructure.
If you know any other similar alternative with strong encryption open source protocols please let me know! I love learning new things everyday!
on your own premises, for your own users/community, if you don't trust Signal's infrastructure.
Yes, that’s an example of data (and infrastructure) sovereignty. It’s good for self-contained groups, but is not general-purpose messaging, since it doesn’t allow communication with anyone outside your group.
If you know any other similar alternative with strong encryption open source protocols please let me know! I love learning new things everyday!
Matrix can do this. It also has support for communicating across different server instances worldwide (both public and private), and actively supports interoperability with other messaging networks, both in the short term through bridges and in the long term through the IETF’s More Instant Messaging Interoperability (MIMI) working group.
XMPP can do on-premise encrypted messaging, too. Technically, it can also support global encrypted messaging with fairly modern features, with the help of carefully selected extensions, server software, and clients, although this quickly becomes impractical for general-purpose messaging, mainly because of availability and usability: managed free servers with the right components are in short supply and often don't last long, and the general public doesn't have the tech skills to do it themselves. (Availability was not a problem when Google and Facebook supported it, but that support ended years ago.) It's still useful for relatively small groups, though, if you have a skilled admin to maintain the servers and help the users.
I know that Telegram has a lot of users, so I'm not describing all of them here. But I've noticed that it seems especially popular among people who kind of like to "play pretend" as underground hackers. You know, the kind of person who likes to imagine that the government would be after them.
This mudslinging feels like more of a marketing campaign than anything else. An info op that will work well on the Telegram users who like to imagine that they have outmaneuvered all the info ops.
Consider that the European Data Protection Supervisor (EDPS) recently found that the European Commission's use of Microsoft 365 breaches data protection law for EU institutions and bodies.
Lol, it took a while to see the mountain. Also, they should sue Microsoft if data was transferred and stored in the US… Ah, no. There's probably the usual default dialog that you click 'Agree' on blindly. Anyway, good news. I hope they invest the money in further development, sponsorship, support, and training of employees.
Instead I feel it's the opposite, because that memory is shared with the GPU. So if you're gaming, even with some old game, it's like having 4 GB for the system and 4 GB for the GPU. They might claim that their scheduler is magic and can predict memory usage with perfect accuracy, but even then it would be something like 6+2 GB. If a game has heavy textures, it will steal memory from the system. And maybe you want to have a browser open for watching a tutorial on YouTube during gaming, or a chat. That's another 1–2 GB taken from the CPU and GPU.
Their pricing for RAM is ridiculous: they're charging $300 for just 8 GB of additional memory! We're not in the 2010s anymore!
It was pretty obvious that Trump would try to scam Musk. Fortunately, given his own difficulties, Musk was unlikely to agree - we're just fortunate these bills are coming due so far in advance of the election. And also that, given Musk's foreign birth, he'll never be eligible to be president or vice president (something that I'm sure sticks in his craw).
There’s a learning curve, but really, every large org can save so much money that way.
Heck, most proprietary apps are web apps these days anyway, so it really doesn't hurt to run Linux except for a few specialty roles. Just run Mac or Windows for those areas, and everyone else gets Linux.
I would love it if my work computer was Linux, 90% of my work is on a terminal anyway, it would save me from having to SSH to a Linux server every day.
Some people only believe what they’re told by an authority, and some people only view massive corporations, politicians, and the church as legitimate authorities. So if a corporation tells them it’s bad, they’ll believe it’s bad.
The Apple M series is not ARM based. It's Apple's own RISC architecture. They get their performance in part from the proximity of the RAM to the GPU, yes, but not only. Unlike ARM, which has become quite bloated after decades of building upon the same instruction set (and adding new instructions to drive adoption, even if that's contrary to RISC's philosophy), the M series started anew with no technical debt. Also, Apple controls everything from the hardware to the software, as well as the languages and frameworks used by third-party developers for their platform. They therefore have 100% compatibility between their chips' instruction set, their system, and third-party apps. That allows them to make CPUs with excellent efficiency. Not to mention that speculative execution, a big driver of performance nowadays, works better on RISC, where all the instructions have the same size.
You are right that they do not cater to power users who need a LOT of power though. But 95% of the users don't care, they want long battery life, light and silent devices. Sales of desktop PCs have been falling for more than a decade now, as have the investments made in CISC architectures. People don't want them anymore. With the growing number of manufacturers announcing their adoption of the new open-source RISC-V architecture I am curious to see what the future of Intel and AMD is. Especially with China pouring billions into building their own silicon supply chain. The next decade is going to be very interesting. :)
The whole “Apple products are great because they control both software and hardware” always made about as much sense to me as someone claiming “this product is secure because we invented our own secret encryption”.
Here's an example of that: Apple needed to ship an x86_64 emulator for the transition, but emulation is slow and thus makes the new machines appear much slower than the older Intel-based ones. So what they did was come up with their own private instructions that greatly speed up an emulator's task, and add them to the chip. Now most people don't even know whether they're running native or emulated programs, because the difference in performance is so small.
The Apple M series is not ARM based. It’s Apple’s own RISC architecture.
M1s through M3s run ARMv8-A instructions. They’re ARM chips.
What you might be thinking of is that Apple has an architectural license, that is, they are allowed to implement their own logic for the ARM instruction set, not just permission to etch existing designs into silicon. Qualcomm, NVidia, Samsung, AMD, and Intel all hold such a license. How much use they actually make of it is a different question; e.g. AMD doesn't currently ship any ARM designs of their own, I think, and the platform processor that comes in every Ryzen etc. is a single "barely not a microcontroller" (Cortex-A5) core straight off ARM's design shelves; K12 never made it to market.
You’re right about the future being RISC-V, though, ARM pretty much fucked themselves with that Qualcomm debacle. Android and android apps by and large don’t care what architecture they run on, RISC-V already pretty much ate the microcontroller market (unless you need backward compatibility for some reason, heck, there’s still new Z80s getting etched) and android devices are a real good spot to grow. Still going to take a hot while before RISC-V appears on the desktop proper, though – performance-wise server loads will be first, and sitting in front of it office thin clients will be first. Maybe, maybe, GPUs. That’d certainly be interesting, the GPU being simply vector cores with a slim insn extension for some specialised functionality.
Thanks for the clarification. I wonder if/when Microsoft is going to hop on the RISC train. They did a crap job with their attempt at an ARM version a few years back and gave up. A RISC Surface with a compatible Windows 13 and a proper binary translator (like Apple did with Rosetta) would shake up the PC market real good!
The Mac Pro is a terrible deal, even compared to their own Mac Studio. It has the same specs but costs almost $1000 extra. Yes, the cheese-grater aluminum case is cool, but $1000 cool?
I think this is one of those words which has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.
The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in realtime and the cheapest i series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.
Yes, I get bent out of shape a bit over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solutions have gotten faster over the years, it’s still a CPU intensive process, and general (non-video) matrix operations aren’t really gaining all that much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics), and the speed of rendering hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling as well as other general compute-intensive tasks on what we used to call “workstations” are the professional applications which need real computational speed - and they’re/we’re just getting incremental speed improvements, plus roughly the square root of the number of cores when the software can parallelize at all. All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.
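To make the contrast concrete: here's a toy sketch of the kind of work a sparse matrix solver spends its time on, a matrix-vector product in CSR (compressed sparse row) format. This is illustrative pure Python, not any real FEM library; the point is that, unlike video encoding, there's no fixed-function hardware block for this pattern of scattered memory reads, so it runs on the ordinary CPU pipeline.

```python
def csr_matvec(data, indices, indptr, x):
    """Compute y = A @ x for a CSR-format sparse matrix A.

    data    -- nonzero values, row by row
    indices -- column index of each nonzero
    indptr  -- indptr[r]:indptr[r+1] is the slice of data/indices for row r
    """
    n = len(indptr) - 1
    y = [0.0] * n
    for row in range(n):
        acc = 0.0
        for k in range(indptr[row], indptr[row + 1]):
            # The x[indices[k]] access is the irregular, cache-unfriendly part
            # that fixed-function media pipelines don't help with.
            acc += data[k] * x[indices[k]]
        y[row] = acc
    return y

# 3x3 example: A = [[2, 0, 1], [0, 3, 0], [4, 0, 5]]
data = [2.0, 1.0, 3.0, 4.0, 5.0]
indices = [0, 2, 1, 0, 2]
indptr = [0, 2, 3, 5]
print(csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

Real solvers do this in compiled code, of course, but the memory-access pattern is the same, which is why core counts and clock speeds, not codec ASICs, are what matter for this kind of workload.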
Indeed! It makes the benchmarks that much more disingenuous since pros will end up CPU crunching. I find video production tedious (it’s a skill issue/PEBKAC, really) so I usually just let the GPU (nvenc) do it to save time. ;-)
Also, one of these days AMD or Intel will bolt 8GB on their CPUs too, and then they’ll squash M.
I can't remember who it is, but somebody is already doing this. It's primarily marketed as an AI training chip, though, so basically only Microsoft and Google are able to buy them; even if you had the money, there isn't any stock left.
Yeah, I gave Apple a try over the last two years, largely because I was annoyed with Google and wanted to ditch Android. I’ve been fed up since about 6 months in, but gave it some more time, which led to an eventual waiting game to get the replacements I want.
I just picked up a Thinkpad P14s g4 AMD with a 7840u, 64GB of RAM, and a 3 year onsite warranty for $1270 after taxes. I added a 4TB Samsung 990 Pro for another $270. I can’t imagine spending more than that and only getting 8GB RAM (and less warranty), which is what I have assigned to the GPU. Plus I get to run Linux, which I really didn’t realize how much MacOS would leave me wanting.
The thing I’ll miss is the iPhone 13 Mini size. I found iOS to be absolute trash, but there’s just not an Android phone that’s a reasonable size. But at least I can run Calyx/Graphene on a Pixel and get a decent OS without spying.
I do like the M1 MBA form factor, too, but I’ll grab the Thinkpad X13s successor for portability and get a better keyboard. I don’t need top end performance out of that, I really just want battery life and passive cooling.
And don’t even get me started on the overpriced mess that are the Airpods Max. I much prefer the Audeze Maxwell and Sennheiser Momentum 4 I replaced them with.
Brazil ended up with a third system: Pix. It boils down to the following:
1. The money receiver sends the payer either a “key” or a QR code.
2. The payer opens their bank’s app and uses it to either paste the key or scan the QR code.
3. The payer enters the amount, if the code is not dynamic (more on that later).
4. The payer confirms the transaction, and an electronic voucher is issued.
The “key” in question can be your cell phone number, your natural-person or legal-entity registry number (CPF/CNPJ), your e-mail, or even a random number. You can have up to five of them.
Regarding dynamic codes: it’s also possible to generate a key or QR code that applies to a single transaction, in which case the value to be paid is already included.
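The steps above can be sketched in a few lines. This is a hypothetical model, not the real Pix API — the `PixCode` class and `pay` function are made up for illustration — but it captures the key distinction: a dynamic code already carries the amount, while a static key leaves the payer to enter it.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PixCode:
    """What the payer gets after resolving a key or scanning a QR code."""
    receiver: str            # resolved from a key: phone, e-mail, CPF/CNPJ, or random ID
    amount: Optional[float]  # fixed for dynamic (single-transaction) codes, None for static keys

def pay(code: PixCode, amount_entered: Optional[float] = None) -> dict:
    # Dynamic codes already include the value; static keys need the payer's input.
    amount = code.amount if code.amount is not None else amount_entered
    if amount is None or amount <= 0:
        raise ValueError("static key: the payer must enter a positive amount")
    # The app shows receiver and amount before confirmation (the anti-scam step),
    # then issues an electronic voucher.
    return {"to": code.receiver, "amount": amount,
            "voucher": f"PIX-{code.receiver}-{amount:.2f}"}

# Static key: payer types the amount themselves.
print(pay(PixCode("maria@example.com", None), 25.0))
# Dynamic code: amount was fixed by the receiver when generating it.
print(pay(PixCode("+55 11 91234-5678", 120.0)))
```

The confirmation screen showing receiver and amount is exactly the safeguard discussed below: the system makes the mismatch visible, even if an inattentive customer can still approve it.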
Frankly the system surprised me. It’s actually good and practical; and that’s coming from someone who’s highly suspicious of anything coming from the federal government, and who hates cell phones. [insert old man screaming at clouds meme]
Brazil's PIX is revolutionary, really. It offers instant 24/7 transfers that don't depend on which bank you're using and doesn't need a third-party app. Pretty much, if you have a bank account, you already have PIX.
Yeah, it’s actually good. People use it even for trivial stuff nowadays; and you don’t need a pix key to send stuff, only to receive it. (And as long as your bank allows you to check the account through an actual computer, you don’t need a cell phone either.)
Perhaps the only flaw is one shared with the Asian QR codes - scams are a bit of a problem. You could, for example, tell someone the transaction will be for one amount and generate a code demanding a bigger one. But I feel like that's less of an issue with the system and more with the customer, given that the system shows you who you're sending money to, and how much, before confirmation.
I’m not informed on Tikkie and Klarna, besides one being Dutch and the other Swedish. How do they work?