
stardreamer

@stardreamer@lemmy.blahaj.zone


Canonical Announces Availability of Real-Time Kernel for Ubuntu 24.04 LTS - 9to5Linux (9to5linux.com)

To get started with the real-time kernel for Ubuntu 24.04, check out the official documentation. One thing to keep in mind if you’re an NVIDIA GPU user is that the real-time Ubuntu kernel does not support the proprietary NVIDIA graphics drivers.

stardreamer,

An alternative definition: a real-time system is a system where the correctness of the computation depends on a deadline. For example, if I have a drone checking “with my current location + velocity will I crash into the wall in 5 seconds?”, the answer will be worthless if the system responds 10 seconds later.

A real-time kernel is an operating system that makes it easier to build such systems. The main difference is that it offers lower latency than a usual OS for your one critical program. The OS will give that program as much priority as it wants (to the detriment of everything else) and handle every signal/interrupt as soon as it arrives (instead of coalescing them to reduce overhead).

Linux has real-time priority scheduling as an optional feature. Since lowering latency does not always result in reduced overhead or higher throughput, keeping it opt-in lets system builders design RT systems (audio processing, robots, drones, etc.) around these features without annoying the hell out of everyone else.
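For the curious, here's a minimal sketch of what opting into that scheduling looks like from user space on Linux (the priority value of 50 is just an arbitrary example, and you need root or CAP_SYS_NICE for the call to succeed):

```python
import os
import time

def make_me_realtime(priority: int = 50) -> None:
    """Ask the kernel to run this process under the SCHED_FIFO policy.

    A SCHED_FIFO task preempts all normal (SCHED_OTHER) tasks and keeps
    the CPU until it blocks or yields -- the "my one critical program
    beats everything else" behaviour described above. Needs root or
    CAP_SYS_NICE; otherwise this raises PermissionError.
    """
    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))  # 0 = this process

if __name__ == "__main__":
    make_me_realtime()
    deadline = 5.0  # seconds, as in the drone example
    start = time.monotonic()
    # ... "will I crash into the wall?" computation goes here ...
    elapsed = time.monotonic() - start
    if elapsed > deadline:
        print("answer arrived too late to be useful")
```

A stock kernel will honour that priority; what a real-time (PREEMPT_RT) kernel adds is making the kernel itself preemptible, so the task also isn't stuck waiting behind long non-preemptible kernel sections.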

Stopping a badly behaved bot the wrong way.

I host a few small low-traffic websites for local interests. I do this for free - and some of them are for a friend who died last year but didn’t want all his work to vanish. They don’t get so many views, so I was surprised when I happened to glance at munin and saw my bandwidth usage had gone up a lot....

stardreamer,

I’ve recently moved from fail2ban to CrowdSec. It’s nice and modular and seems to fit your use case: set up an HTTP 404/rate-limit filter and a Cloudflare bouncer to ban the IP address at the Cloudflare level (instead of iptables). Though I’m not sure if the Cloudflare tunnel would complicate things.

Another good thing about it is the crowdsourced IP reputation list: too many blocks from other users = preemptive ban.
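To make the 404/rate-limit-then-ban idea concrete, here's a toy sketch of the logic. This is not CrowdSec's scenario format or API, and the window/threshold numbers are made up; in CrowdSec the detection half lives in a scenario and the enforcement half in a bouncer.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60      # illustrative: look at the last minute of traffic
MAX_404S = 20            # illustrative: this many 404s in the window triggers a ban
BAN_SECONDS = 4 * 3600   # illustrative: ban for four hours

recent_404s: dict[str, deque] = defaultdict(deque)
bans: dict[str, float] = {}  # ip -> unix timestamp when the ban expires

def on_request(ip: str, status: int) -> bool:
    """Return True if this request should be blocked."""
    now = time.time()
    if bans.get(ip, 0.0) > now:
        return True                          # still banned
    if status == 404:
        hits = recent_404s[ip]
        hits.append(now)
        while hits and hits[0] < now - WINDOW_SECONDS:
            hits.popleft()                   # forget 404s outside the window
        if len(hits) >= MAX_404S:
            bans[ip] = now + BAN_SECONDS     # the "decision" a bouncer would enforce
            return True
    return False
```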

stardreamer,

Pretty sure expiry is handled by the local CrowdSec daemon, so it should automatically revoke rules once the set time is reached.

At least that’s the case with the iptables and nginx bouncers (4-hour ban for probing). I would assume it’s the same for the Cloudflare one.
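Continuing the toy sketch from above (again, not CrowdSec's actual mechanism, just the shape of it): each decision carries an expiry timestamp, and revocation is just dropping anything past it.

```python
import time

def prune_expired(bans: dict[str, float]) -> list[str]:
    """Drop bans whose expiry time has passed and return the revoked IPs.

    A real bouncer would translate each revocation into "delete the
    iptables/Cloudflare rule for this IP"; here we simply forget it.
    """
    now = time.time()
    revoked = [ip for ip, expires_at in bans.items() if expires_at <= now]
    for ip in revoked:
        del bans[ip]
    return revoked

# Run periodically, e.g. once a minute:
# for ip in prune_expired(bans):
#     print(f"unbanning {ip}")
```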

Alternatively, maybe look into running two bouncers (1 local, 1 CF)? The CF one filters out most bot traffic, and if some still get through then you block them locally?

stardreamer,

IIRC the bad-UA filter is bundled with either base-http-scenarios or nginx. That might help, assuming they aren’t trying to mask that UA.

stardreamer, (edited )

According to this post, the person involved exposed a different name at one point.

boehs.org/…/everything-i-know-about-the-xz-backdo…

Cheong is not a Pinyin name; it comes from a different romanization system. Assuming this isn’t a false trail (unlikely: why would you expose a fake name once instead of using it all the time?), that rules out mainland China and Singapore, which use the Pinyin system. Or somebody has a time machine and grabbed this guy before 1956.

Likely sources of the name would be a country/Chinese administrative zone that uses Chinese with a non-Pinyin romanization. That gives us Taiwan, Macau, or Hong Kong, all of which are in GMT+8. Note that two of these are technically under PRC control.

Realistically I feel this is just a rogue attacker instead of a nation state. The probability of China (1) hiring someone from these specific regions and (2) exposing a non-Pinyin full name once on purpose is extremely low. Why bother with this when you have plenty of graduates from Tsinghua in Beijing? Especially with so many people desperate for jobs after COVID.

stardreamer,

Been playing it since release and I have to say I quite like it. The MTX is less intrusive than Dragon Age: Origins’ DLC (no mention in-game at all, versus “There’s a person bleeding out on the road, if you want to help him please go to the store page”).

So far the game is a buttery-smooth 60 fps at 4K max graphics + FSR3 without ray tracing, except inside the capital city (running a 7800X3D with a 7900 XTX). The only graphics complaint I have is that the FSR implementation is pretty bad, with small amounts of ghosting under certain lighting conditions. There’s also a noticeable amount of input lag compared to the first game: not game-breaking, but if you do a side-by-side comparison it’s pretty obvious.

Sure the game has its issues, but right now this looks like something I enjoy. Games don’t need to be masterworks to be fun (my favorite games are some old niche JRPGs that were absolutely demolished by reviewers at the time), and right now I think it’s money well spent.

stardreamer,

My personal complaints (despite enjoying the gameplay):

  1. Input lag. It’s negligible compared to other games, but next to DDDA it feels much higher (“meh” vs “oh wow, this is smooth!”)
  2. FSR. There is definitely something wrong with the FSR implementation here, because there are minor traces of ghosting that are not present in other games. Rotate your character in the character selection screen, or look at a pillar with water as the backdrop and light rays nearby. That being said, it becomes less obvious during actual gameplay. I do hope that this will be fixed though.

stardreamer,

Both Bluetooth and BLE are perfectly fine protocols. You won’t be able to design something much better for short-range links with that much power savings. The main issue is that any protocol like this most likely has to sit in the 2.4 GHz unlicensed band, and that band is predominantly used by Wi-Fi these days.

stardreamer, (edited )

It doesn’t have to be turn-based. FFXI and FFXII are also great. I feel the bigger issue is that making a story-heavy game while everyone else is also making story-heavy games means it’s no longer unique.

I wouldn’t mind going back to ATB, but I don’t think that would win back an audience except for nostalgia points.

Maybe more FF:T though? Kinda miss that.

stardreamer,

You Is Into

Baba IS Money

Take The Breach

:)

Also is anyone reminded of Final Fantasy: Tactics by the small isometric maps?

stardreamer,

Back in the 90s we had the Flash as well.

Somehow I still have that theme song stuck in my head…

And that scene where a brainwashed Flash destroys an entire row of parking meters…

stardreamer,

The MH series always does one big (console) title and one small (mobile) one, in that order. Last gen, World was the big one and Rise was the small one.

This is probably gonna be the big one :)

stardreamer,

When I said small I was referring to portable (kinda forgot the word), as hunts can be completed in 15min or less. I think I would still prefer World though, probably because I did 300 Narwa hunts in one week before they fixed the “loot drop tables” bug.

stardreamer,

Some people play games to turn their brains off. Other people play them to solve a different type of problem than they do at work. I personally love optimizing, automating, and min-maxing numbers while doing the least amount of work possible. It’s relatively low-complexity (compared to the bs I put up with daily), low-stakes, and much easier to show someone else.

Also shout-out to CDDA and FFT for having some of the worst learning curves out there along with DF. Paradox games get an honorable mention for their wiki.

stardreamer, (edited )

8 GB of RAM and 256 GB of storage are perfectly fine for a pro-ish machine in 2023. What’s not fine is the price point they’re offering it at (but if idiots still buy it, that’s on them and not Apple). I’ve been using an 8 GB/256 GB ThinkPad for lecturing, small code demos, and light video editing (e.g. Zoom recordings) this past year, and it works perfectly fine. But as soon as I have to run my own research code, it’s back to the 2022 Xeon I go.

Is it Apple’s fault people treat browser tabs as a bookmarking mechanism? No. Is it unethical for Apple to say that their 8GB model fits this weirdly common use case? Definitely.

stardreamer,

I don’t think either of us is the target audience here. I can see a “cheaper” (questionable) Pro laptop being useful for students going into college with a limited budget. An undergrad CS/graphic design degree shouldn’t tax an 8 GB machine too much, assuming students shut down everything else when doing their once-a-semester major rendering/compiling/model-training job. If people just want MacBook Pro software with more ports, a “cheaper” machine is better than none. Personally, I would still get a used/refurbished machine though.

That being said, my current laptop workload tends to be emacs, qpdfview, Firefox, and tmux on EL9. For the remaining stuff, I usually just spin up a VM and then ssh/xrdp into it. As for Slack, Teams, Jabber, etc., I’m happy to report I’ve been out of industry/IT for 1+ years and don’t plan on going back anytime soon. For all I care, Apple can call their models the unicorn edition. As long as it sells, it’s not stupid.

stardreamer,

Oh great, a human failed the Turing Test…

stardreamer, (edited )

Am I the only idiot that read X as X11 then realized it was referring to Twitter?

stardreamer,

A more recent example:

“Nobody needs more than 4 cores for personal use!”

stardreamer,

Yep, it’s Intel.

They said it right up until their competitor started offering more than 4 cores as standard.

stardreamer, (edited )

The problem is that hardware has come a long way and is now much harder to understand.

Back in the old days you had consoles with custom MIPS processors, usually augmented with special vector ops and that was it. No out-of-order memory access, no DMA management, no GPU offloading etc.

These days you have all of that on x86, plus branch predictors, complex cache hierarchies, various on-chip interconnects, etc. It’s gotten so bad that most CS undergrad degrees only teach a simplified subset of actual computer architecture. How many people actually write optimized inline assembly these days? You need to be a crazy hacker to pull off what game devs in the 80s-90s used to do. And crazy hackers aren’t in the game industry anymore; they get paid way better working on high-performance simulation software, networking, or embedded programming.

Are there still old-fashioned hackers who make games? Yes, but you’ll want to look at the modding scene. People have been modifying Java bytecode and MS CIL for ages to patch compiled functions, and a lot of that work is extremely technically impressive (e.g. splicing a function at runtime). It’s just that none of the devs who can do this want to do it for a living on AAA titles; they do it as a hobby through modding instead.
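The Python analogue of that kind of runtime splicing is far more pedestrian, but it shows the shape of the trick: swap a function out from under its callers while the program is running. The logging wrapper here is just a made-up example, not anything a real mod framework does.

```python
import functools
import math

def splice(module, name, wrapper_factory):
    """Replace module.<name> at runtime with a wrapped version.

    This is the (much easier) Python cousin of bytecode splicing:
    every later caller that looks the function up through the module
    gets the patched version instead of the original.
    """
    original = getattr(module, name)
    setattr(module, name, wrapper_factory(original))
    return original  # keep the original around so the patch can be undone

def logging_wrapper(original):
    """Wrap a function so every call gets printed before returning."""
    @functools.wraps(original)
    def patched(*args, **kwargs):
        result = original(*args, **kwargs)
        print(f"{original.__name__}{args} -> {result}")
        return result
    return patched

if __name__ == "__main__":
    splice(math, "sqrt", logging_wrapper)
    math.sqrt(2.0)  # prints: sqrt(2.0,) -> 1.4142135623730951
```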

stardreamer,

If it helps you avoid users it’s a plus.

I’d take deciphering the Rosetta code over that any day.
