Yeah, I mean nobody uses SIP as an open protocol with email-like addresses. You could call me with an unregistered softphone. It would have been way cooler if I had any use for it outside of like two other nerd friends of mine who run personal Asterisk servers.
Yeah, my experience with Element and a Matrix.org account is that it's sluggish. However, it's been better with Beeper, so I'm uncertain whether it's intrinsic to Matrix or merely Matrix.org and/or Element's servers.
I don't believe that its existence causes more fragmentation than it remediates. news.ycombinator.com/item?id=36939482 explains why I consider Matrix fundamentally superior for most (if not all) uses, although in practice it's because the clients (primarily Element and FluffyChat) are cross-platform and support a generally uniform set of features, in comparison to the aged (but glorious) Pidgin and its counterparts.
Your Hacker News post and the fact that you mention Pidgin show that you haven't used XMPP in the last 10 years. By the time Matrix was first released, XMPP already had history sync.
Which is why I can’t wrap my head around why a second protocol with no features that didn’t already exist in XMPP took over.
I used it yesterday, via Pidgin. I’m rokejulianlockhart@xmpp.jp. Why else would I have referenced it? Don’t tell me what I’ve done. That’s not a way to have productive conversations.
Regardless, I can't provide any more technical insight than that - I know only that the clients provide so much more functionality that, irrespective of the protocol, it's better in practice. Fedora, openSUSE, the Bundeswehr, NATO, and Beeper all chose Matrix over XMPP, not least because of Element (which they also all chose).
Others have said already, but XMPP and RSS. Also, nobody mentioned NNTP yet.
I wish everything were accessible via NNTP and we had better NNTP clients. NNTP is like RSS but for forums (so Lemmy, Reddit, or anything where you can reply to posts). Download for offline reading, read in your client, define your own formatting, sorting, filtering: your client, your rules.
If Lemmy was accessible via NNTP, I could just download all posts and comments I’m interested in and reply to them without any connection, and my replies would get synced with the server later when I connect to WiFi or something.
Probably it would be better to edit my comment, but I’ll go with a reply to myself.
To all fans of RSS: there's a service called FeedBase that is essentially an RSS-to-NNTP gateway. You add your RSS feed to it and it becomes a newsgroup on their server, which you can subscribe to using any NNTP client. New articles appear as new posts in that newsgroup, and you can post your own replies to them. So you get RSS, but with discussions or comments.
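If you want a feel for what the client side of that looks like, here's a rough sketch using Python's nntplib (removed from the standard library in Python 3.13, so this assumes an older interpreter or a backport); the server and group names are placeholders, not FeedBase's real ones:

```python
import nntplib  # stdlib up to Python 3.12

# Placeholder hostname and newsgroup -- substitute the gateway's real ones.
with nntplib.NNTP("nntp.example.org") as server:
    resp, count, first, last, name = server.group("rss.example.blog")

    # Fetch overviews of the last ten articles for offline reading.
    resp, overviews = server.over((max(first, last - 9), last))
    for artnum, fields in overviews:
        print(artnum, fields["subject"], fields["from"])
```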
Back in the day I was a big Usenet fan. What's the modern solution to the spam issue? At the time, folk wisdom was that the demise was being caused by spam, and that due to the nature of the protocol it was somewhat unsolvable.
I also wonder to what extent ActivityPub is the barrier to offline use. For Reddit, the Slide client had offline reading and, IIRC, posting. I've been disappointed it isn't available for Lemmy. My guess has been that it simply isn't a priority for the devs. Maybe eventually we will get it.
I think it would be cool if RSS got put into Lemmy clients. For example, you could make a unified inbox for all accounts by automatically fetching the private RSS feed for incoming messages for every logged-in account. I have manually set this up a couple of times, but it's tedious and completely lacks smoothness when it comes to clicking a link, replying, etc. But a client could add a little finesse to fix that.
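As a rough sketch of the manual setup I mean, using the third-party feedparser library; the private inbox feed URLs are made up for illustration, since the exact format and token depend on your instance:

```python
import time
import feedparser  # third-party: pip install feedparser

# Hypothetical private inbox feeds, one per logged-in account.
inbox_feeds = [
    "https://lemmy.example/feeds/inbox/ACCOUNT_ONE_TOKEN.xml",
    "https://lemmy.example/feeds/inbox/ACCOUNT_TWO_TOKEN.xml",
]

# Merge every account's inbox items into one list, newest first.
items = []
for url in inbox_feeds:
    items.extend(feedparser.parse(url).entries)

items.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0), reverse=True)
for entry in items:
    print(entry.get("published", "?"), entry.get("title"), entry.get("link"))
```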
Content-addressable protocols are better for asynchronous use. I'd like to see a proper fork of Bluesky's AT Protocol with "post lexicons" properly adapted for forums: it's built on top of content addressing and public-key-based account IDs, along with integrated support for third-party moderation tooling and custom third-party feeds/views.
Unbelievable that we have to rely on Google and co. for something as essential as push messages! Even among the open-source community, adoption is surprisingly limited.
Web Push is what's used for browser push messages and is already widely supported. It's compatible with existing push infrastructure and users, and it's end-to-end encrypted. IDK why UnifiedPush felt the need to create a new protocol when a perfectly good one already existed.
Although there is no "client side" spec, so the UnifiedPush client side could be useful. But they should throw away their custom backend protocol and just use Web Push.
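For comparison, sending a Web Push message from a server is already about this simple; a sketch using the third-party pywebpush library (the endpoint, keys, and VAPID details are placeholders you'd get from the browser's PushManager subscription and your own key pair):

```python
from pywebpush import webpush, WebPushException  # pip install pywebpush

# Placeholder subscription: a real one comes from the browser's PushManager.
subscription = {
    "endpoint": "https://push.example.com/send/abc123",
    "keys": {"p256dh": "BASE64_CLIENT_PUBLIC_KEY", "auth": "BASE64_AUTH_SECRET"},
}

try:
    webpush(
        subscription_info=subscription,
        data="New message in your inbox",
        vapid_private_key="path/to/vapid_private_key.pem",  # your own key pair
        vapid_claims={"sub": "mailto:admin@example.com"},
    )
except WebPushException as exc:
    print("Push failed:", exc)
```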
Also KDX. I was too young to use it, but I tried it and it's cool. Sadly, even the FOSS clients are all dead and don't build anymore. (I think I had limited success patching one called Fidelio to build, but that was a few years ago and I can't find any traces of that attempt.)
In the world of computers, why would remembering numbers be the thing that stops new technologies?
Do you remember anyone’s public key? Certificate?
I don't even remember (most) domain names; I just Google them or save them as bookmarks or something.
The reason IPv4 still exists is that ISPs benefit from its scarcity. Big ISPs already paid a lot of money to own IPv4 addresses; if they switched to IPv6, that investment would be worthless.
Try selling static IPv6 addresses as they do now with IPv4. People would laugh at them and just get a free IPv6 address from an ISP that wants to get new users and doesn’t charge for it.
The longer ISPs delay the adoption of IPv6, the longer they can milk IPv4 scarcity.
IPv6 addresses are practically endless, therefore their value is practically 0. ISPs justify charging extra for static IPv4 because IPv4 addresses do have a value.
If ISPs charge for static IPv6, then one of them could just give that service for free (while keeping the rest of the prices the same as their competitors). That would get them more customers while costing them nothing.
EDIT: I can’t give you an example of an ISP that offers free static IPv6 because there are no ISPs in my country that offer IPv6.
Damn, if only we had a service that, like, obfuscated and abstracted these hard-to-remember, user-unfriendly IPs and turned them into something more usable. That would be cool, I think. Someone should make that.
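And to be fair, that service already handles IPv6 just fine; a quick sketch of what the lookup looks like (example.com is just a stand-in host):

```python
import socket

# Ask the resolver for IPv6 (AAAA) results only; nobody has to remember these.
for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo(
    "example.com", 443, socket.AF_INET6, socket.SOCK_STREAM
):
    print(sockaddr[0])  # sockaddr[0] is the IPv6 address string
```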
I hear you on this! Took me a whole day to get my router to delegate IPv6 properly. I’m sure that had it been better adopted, I wouldn’t be having such a hard time.
Matrix tries to kill XMPP but the reality is that if you want to self-host, XMPP is much less of a hassle. Also, Matrix is an open standard as in “pay big money to participate in the openness”. matrix.org/…/funding-matrix-via-the-matrix-org-fo…
Membership comes at various levels, each with different rewards:
Individual memberships (i.e. today's Patreon supporters):
- Ability to vote in the appointment of up to 2 'community representatives' to the Foundation's governing board
- Name on the Matrix.org website

Silver member: between £2,000 and £80,000 per year, depending on organisation size:
- Ability to vote on the appointment of up to 2 'Silver representatives' to the Foundation's governing board
- Supporter logo on the front page of the new Matrix.org website

Gold member: £200,000 / year, adds:
- Ability to vote on the appointment of up to 3 'Gold representatives' to the Foundation's governing board
- Press release announcing the sponsorship
- 1 original post on the Matrix.org blog per year
- Participation in the internal Spec Core Team room
- Larger logo on the front page of Matrix.org

Platinum member: £500,000 / year, adds:
- Ability to vote on the appointment of up to 5 'Platinum representatives' to the Foundation's governing board
- 1 sponsored Matrix Live episode per year
- Largest logo on the front page of Matrix.org
IPv6, needed for the modern Internet not to collapse, would make many other important things easier: easier to become an ISP, to self-host, to build P2P networks, etc.
GNU Taler, a payment protocol. Just look at it go: 101010.pl/, or just imagine building a payment terminal out of a Raspberry Pi
Matrix, to unify chat, conference and calling apps
some self-arranging darknet protocol like I2P, GNUNet or Yggdrasil becoming the norm, so we could have a backup when mass Internet blockages happen
I really hope Matrix gets native VoIP. I saw like 2 years ago that it was in beta, but I haven't kept up with it. I'd also really like voice channels like Discord's, so my friends and I can replace Discord, but it seems like Matrix isn't interested in being a Discord replacement.
Matrix I have doubts about. The idea of Tox was nicer, but the implementation quality and the scandal at some point didn’t help.
Tox felt more fun to play with: piping files over it, running a remote shell over it (I know, bad associations, but still), or even using it for VPN. I think there were clients that allowed such stuff, and the protocol allows it.
EDIT: I mean, it's still alive, I just don't see it claiming the place of the old FOSS Skype replacement as it used to.
GNUNet - do all you people mentioning it actually have peers? I tried to set it up a few weeks ago and couldn't get any.
About Tox, I am not a fan of mixing up universal packet delivery and applications. Piping files or using it as a VPN feels like something that would be better done with a proper full network and not mixed with chat.
To be clear, I2P is not really intended for anything in particular; it's used for everything. It supports all kinds of things, and there are people doing all kinds of things on it. Though I could see potential technological limitations being a problem.
There are no IPv4 addresses left. So you either go IPv6-only, which would make many services not work, or wait in a long queue to repurpose address space marked as deprecated, which would soon run out too. And then you put clients behind double or triple NAT, giving them shitty service.
A lot of orgs (mine included) are sitting on large chunks of IPs they don’t need (we have a /16 and several /24s) because they adopted early, got an ASN and prefix assigned by ARIN, and their addressing scheme is now so disjointed and scattered that they can’t sell off anything bigger than a /22, and that makes setting up BGP a pain. Juice ain’t worth the squeeze.
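For a sense of scale on those prefixes, a quick sketch with Python's ipaddress module (generic documentation prefixes, not the actual allocations above):

```python
import ipaddress

# Address counts per prefix length: /16 = 65,536, /22 = 1,024, /24 = 256.
for prefix in ("203.0.113.0/24", "198.18.0.0/22", "10.0.0.0/16"):
    net = ipaddress.ip_network(prefix)
    print(prefix, "->", net.num_addresses, "addresses")
```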
Such a simple solution for the cookie banner issue. But it prevented websites from tricking users into allowing them to gather their data, so it had to go.
Most of those cookie banners aren't even needed; you only need them for tracking cookies, not login and session cookies. But of course everyone decided it is just easier to nag all the users with a big splash screen.
A lot of them aren't even doing it right: you're not allowed to hint to the user that "accept all" is the "correct" choice by giving it a different color than the others. And saying no to all should be as easy as accepting all; often it isn't.
Basically, cookie banners are usually not needed, and when they are, they are most often incorrectly designed (not by accident).
But of course everyone decided it is just easier to nag all the users with a big splash screen.
Nope, the thing is, you’ll very rarely find a website that only uses technically necessary session/login cookies. The reason every fucking website, yes, even the one from the barber shop around the corner, has a humongous cookie banner is that every fucking website helps google and other corporations to track users across the whole internet for no reason.
Yes, seen by people visiting EU websites or companies with an EU presence. And because whether or not they assign a cookie is easily verifiable by the person on the other end.
LaTeX. As someone in academia, I absolutely love it. It has some issues like package incompatibility, but it’s far far better than anything else I’ve used. It’s basically ubiquitous in academia, and I wish it were the case everywhere else as well.
It's not a standard, but it's still interesting software, so I'll post this here:
Joking aside, I love and hate it. Its paradigm is almost like using the C preprocessor to build a really awkward Turing-machine. TeX/LaTeX does a great job of what it was intended to do; it applies high quality typesetting rules to complex material and produces really good results. I love the output I can get with it and I will be eternally grateful that Donald Knuth decided to tackle this problem. And despite my complaints below, that gratitude is genuine. Being able to redefine something in a context-sensitive way, or to be able to rely on semantics to produce spacing appropriate to an operator vs a variable etc; these are beautiful things.
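To make that operator-vs-variable point concrete, a minimal sketch (the \relu operator name is just something made up for illustration):

```latex
\documentclass{article}
\usepackage{amsmath}
% Declaring an operator gives upright type and proper operator spacing;
% writing $relu(x)$ instead would typeset r, e, l, u as a product of variables.
\DeclareMathOperator{\relu}{relu}
\begin{document}
$\relu(x) = \max(0, x)$
\end{document}
```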
The problem is, at least once a day I’m left wishing I could just write a callable routine in a normal language with variables, types, arrays, loops and so on. You can implement all those things in TeX, but TeX doesn’t have a normal notion of strings, numbers or arrays, so it is rare that you can do a complicated thing in an efficient way, with readable code. So as a language, TeX frequently leads to cargo-cult programming. I’m not aware that you can invoke reflection after a page is output, to see what decisions on glue and breaks were made; but at the same time you can’t conditionally include something that is dependent on those decisions, since the decision will depend on what is included. This leads to some horrible conditionals combined with compiling twice, and the results are not always deterministic. Sometimes I find it’s quicker to work around things like that by writing an external program that modifies the resulting PDF output, but that seems perverse.
At the same time, there’s really nothing else out there that comes close to doing what LaTeX does, and if you have the patience, the quality of documents it can produce is essentially unbounded. The legacy of encodings, category codes, parameter limits, stack limits etc. just makes it very hard for package writers, and consumes a great deal of time for a lot of people. But maybe I am picky about things that a saner person would just live with.
A lot of very talented people have written a lot of very complex packages to save the user from these esoteric details, and as a result LaTeX is alive and well, and 99% of the time you can get the results you want, using off-the-shelf parts. The remaining 1% of the time, getting the result you want requires a level of expertise that is unreasonable to expect of users. (For comparison, I wrote an optimising C compiler and generally found it far easier to make that work as expected, than some of the things I’ve tried, and failed, to do properly in LaTeX. I now have a rule; if getting some weird alignment to work takes me more than an hour, I just fake it with a postscript file, an image, or write an external program to generate it longhand, in order to save my sanity.)
I think (and certainly hope) that LaTeX is here to stay, in much the same way that C and assembly language are. As time moves forward I think we’ll see more and more abstractions and fewer people dealing with the internals. But I will be forever grateful to the people who are experts in TeX, and who keep providing us with incredible packages.
I honestly just use it for my resume with a template I found, so my knowledge is extremely basic, but I really do love the concept that I can “compile” and actually see the source of my document’s formatting.
Nope and yep. It's an incredible tool, but it's got a Vim-sized learning curve to really leverage it, plus other significant drawbacks. Still my beloved one-and-only when I can get away with it, but it's a bit of a masochistic acquired taste for sure.
Template tweaking, as I imagine academia heavily relies on, is really the closest to practical it gets. You do still get beautiful results, it’s just hard to express yourself arbitrarily without really committing to the bit.
Markdown and LaTeX are meant for entirely different purposes. It’s somewhat analogous to HTML vs PDF. While it’s possible to write books with Markdown, it’s a vastly inferior solution compared to latex or typst (for fixed format docs like books).
They host a proprietary service that does all the stuff, but the compiler and spec are completely FOSS. So you'd need to create your own implementation, which is not hard.
I don't think they will close-source the compiler. And that's basically everything that's needed?
I have 0 problems with people creating a fancy proprietary implementation to get people hooked. I will never use an online editor, but why care?
Learning LaTeX and working around its quirks seems like a much better time investment than sidegrading to something that lives on terms set by a proprietary commercial project. If someone saw LaTeX and said "I want to make some version of this that is better", without ulterior motives, they would probably just work on improving LaTeX (which a whole lot of people do).
Fancy does not mean better, and often is in many ways worse than plain old boring.
Many projects need to be rewritten from scratch I think. But I also think an easier markup language for LaTeX could be possible, keeping all the nice templates etc.
The experience gained from the production and maintenance of LaTeX2e (the version you have been using for many years) had a major influence on our goals for future development and on new code which is now integrated into LaTeX.
A while ago we made the decision to drop the idea of a separate LaTeX3 format that would exist in parallel to LaTeX2e, but instead decided to gradually modernize LaTeX to keep it competitive in today’s world while maintaining compatibility methods for older documents.
I think this decision was pretty much a good one.
Overleaf does not modernize LaTeX in meaningful ways. It only adds cloud functionality and glossy appearance that you can get on dedicated editors anyways.
No, but Overleaf is just a proprietary fancy editor, like the Typst one. Meanwhile, Typst itself is just as usable for building an editor on.
I don't see any arguments against Typst, really. I use Markdown all the time and find it best, but lacking. As for LaTeX, honestly I don't want to learn it, as it must be a pain to write.
Now, in Typst you can write academic papers etc. just as well. All you need is free software, with good backing and modern tooling (Rust, Cargo), so it runs everywhere. It's pretty cool!
Overleaf isn't a benefactor that develops LaTeX for economic gain, unlike the situation with Typst, which relies on that (to my knowledge). LaTeX is also cross-platform, supported in tons of editors, and can easily be converted to other formats with pandoc. It's also somewhat supported in other formats through implementations such as KaTeX for Markdown and MathJax in HTML, due to being the de facto standard for math typesetting.
Writing papers in LaTeX is a joy, not a pain. The end result is also a beautifully typeset document rivalled by none.
I wrote my master's thesis in LaTeX, and while I appreciated the structuredness and the fact I could use Vim, it was so quirky. Having to spend half an hour fixing a non-obvious compile error, more than once, was a big distraction. I'm sure it gets better the more you use it, but I don't think I have used it since. I'm not in academia, and I don't need to solve compile problems when creating an invoice or writing a letter to local government.
I was actually surprised to find out QUIC is fairly close to being default.
Wikipedia
HTTP/3 uses QUIC, a multiplexed transport protocol built on UDP.
HTTP/3 is (at least partially) supported by 97% of tracked web browser installations (including 98% of tracked mobile web browsers) and by 29% of the top 10 million websites.