deadbeef

@deadbeef@lemmy.nz


deadbeef,

It might help the folks here to know which brand and model of SSDs you have, which SATA controllers the SATA drives are plugged into, and what CPU and motherboard the NVMe drive is connected to.
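If it helps gathering that info, something like the following works on most distros ( lspci comes from pciutils, so install that if it's missing ):

```shell
# Drive models, sizes and how they are attached (TRAN shows sata vs nvme)
lsblk -d -o NAME,MODEL,TRAN,SIZE

# SATA / NVMe controllers on the PCI bus
lspci | grep -iE 'sata|nvme'

# CPU model (x86; other arches lay /proc/cpuinfo out differently)
grep -m1 'model name' /proc/cpuinfo
```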

What I can say is Ubuntu 22.04 doesn’t have some mystery problem with SSDs. I work in a place where we have in the order of 100 Ubuntu 22.04 installs running with SSDs, all either older intel ones or newer samsung ones. They go great.

deadbeef,

If you haven’t already, try running hdparm on your drives to get an idea of whether they can at least do large raw reads straight off the disk at an appropriate performance level.

This is output from the little NUC I’m using right now:


# lsblk
NAME   MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
sda      8:0    0 465.8G  0 disk 
├─sda1   8:1    0   512M  0 part /boot/efi
├─sda2   8:2    0 464.3G  0 part /
└─sda3   8:3    0   976M  0 part [SWAP]

# hdparm -i /dev/sda

/dev/sda:

 Model=Samsung SSD 860 EVO 500GB, FwRev=RVT02B6Q, SerialNo=S3YANB0KB24583B
...

# hdparm -t /dev/sda

/dev/sda:
 Timing buffered disk reads: 1526 MB in  3.00 seconds = 508.21 MB/sec

If your results are really poor for this test, then it points more at the drive, the cable, the controller, or the Linux controller driver.

If the results are okay, then the issue is probably something more like a logical partitioning or filesystem driver issue.

I’m not sure what a good benchmark application is for Linux that also tests the filesystem layer, other than bonnie++, which has been around forever. Someone else might have a more current suggestion for this.
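fio is another common option for going through the filesystem layer. A hypothetical sequential-read job might look like the following; the directory path and sizes are made up, so adjust to suit:

```shell
# Sequential 1 MiB reads through the filesystem, bypassing the page cache
# (--direct=1). /mnt/test is a placeholder path on the filesystem to measure.
fio --name=seqread --directory=/mnt/test --rw=read --bs=1M \
    --size=1G --direct=1 --numjobs=1 --group_reporting
```

Swap --rw=read for randread to see how the drive handles small random I/O as well.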

deadbeef,

I just read the update to the post saying that the issue has been narrowed down to the NTFS driver. I haven’t used NTFS on Linux since the NTFS FUSE driver was brand new and still wonky as hell, something like 15 years ago, so I don’t know much about it.

However, it sounds like the in-kernel driver was still pretty fresh in 5.15, so doing as you have suggested and trying out a 6.5 kernel instead is a pretty good call.
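For reference, on kernels that have the in-kernel driver you can pick it explicitly at mount time; the device name here is hypothetical:

```shell
# -t ntfs3 selects the in-kernel driver (5.15+); plain ntfs / ntfs-3g
# usually gets the old FUSE driver, which is much slower.
sudo mount -t ntfs3 /dev/sdb1 /mnt/windows

# Confirm which driver existing NTFS mounts are actually using
# (fuseblk means the FUSE driver)
findmnt -t ntfs3,ntfs,fuseblk -o TARGET,SOURCE,FSTYPE
```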

deadbeef,

Last week I did an install of Debian 12 on a little NUC7CJYH to use for web browsing and ssh sessions into work and ended up with wayland by default. Seems to work great.

From what I have experienced, it goes great with Intel integrated graphics, great with a Radeon card, and can be made to work with Nvidia if you are lucky or up for a fight.

deadbeef,

Which workflows? Asking because I’d like to experiment with some edge case stuff.

I’m running KDE with Wayland on multiple different vintage machines with AMD and Intel graphics and it would take a lot for me to go back to the depressing old mess that was X.

The biggest improvement in recent times was absolutely pulling out all my Nvidia cards and putting in second hand Radeon cards, but switching to Wayland also fixed all the dumb interactions between VRR ( and HDR ) capable monitors with mixed refresh rates.

Even the little NUC that drives the three 4K TVs for the security cameras at work is a little happier with Wayland, running for weeks now with hardware decoding, rather than X crashing pretty much every few days.

deadbeef,

Appreciate the reply. Which desktop environment are you using?

My only experience with Wayland is also with KDE. Whereas for the 27-ish years before that I’ve used all sorts of stuff with X.

I’ve scripted the machine that drives the frontend for our video surveillance system to place windows exactly where I want them when it comes up.

I use a couple of dbus triggers that make the TV on the wall in my garage go to sleep from the shell, though perhaps not tested via ssh. They were pretty much the functional equivalent of some xset dpms commands that I used to use. Not sure if that is what you were meaning. I think I also had something working that disabled the output altogether. That was pretty clunky as it used some sort of screen ID that would occasionally change. Sorry I’m hazy on the details, I’m old.
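On Plasma Wayland, the rough equivalents I’m aware of go through kscreen-doctor rather than raw dbus calls ( the output name below is hypothetical; list yours with the first command ):

```shell
# List outputs and their current names / modes
kscreen-doctor -o

# Blank all outputs, roughly what `xset dpms force off` used to do
# (the --dpms switch needs Plasma 5.27 or newer)
kscreen-doctor --dpms off

# Disable one output entirely; output names can change between sessions
kscreen-doctor output.HDMI-A-1.disable
```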

I’ll try it all out when I get home, I’ve got to find some old serial crap for a coworker in the garage anyway.

deadbeef,

Agreed, it seems like they should have put just a little bit more in the standard feature set so every little window manager doesn’t have to reinvent the wheel.

deadbeef,

I have an A1502 Macbook that I have been using for work since it was new in 2014. It triple boots Windows, Linux and OSX, but I only really use Linux.

Mine has the same CPU, an i5-4308U, but 16GB of memory; I think it was a custom order at the time.

If I recall I did the regular bootcamp process you would do to install Windows, installed Windows on a subset of the free space and Linux on the rest.

I’ve got Linux Mint 21 on it currently, but I have had vanilla Ubuntu at different times. I can’t think of anything on it that doesn’t just work, off hand.

I tried, I really did

I’ve been an IT professional for 20 years now, but I’ve mainly dealt with Windows. I’ve worked with Linux servers through out the years, but never had Linux as a daily driver. And I decided it was time to change. I only had 2 requirements. One, I need to be able to use my Nvidia 3080 ti for local LLM and I need to be able...

deadbeef,

Sorry to hear about that mess.

I posted here lemmy.nz/comment/1784981 a while back about what I went through with the Nvidia driver on Linux.

From what I can tell, people who think Linux works fine on Nvidia probably only have one monitor, or maybe two that happen to be the same model ( with unique EDID serials FWIW ). My experience with a whole bunch of mixed monitors / refresh rates was absolutely awful.

If you happen to give it another go, get yourself an AMD card, perhaps you can carry on using the Nvidia card for the language modelling, just don’t plug your monitors into it.

deadbeef,

I have two AMD Radeon cards for Linux that I’m pretty happy with that replaced a couple of Nvidia cards. They are an RX6800 and an RX6700XT. They were both ex mining cards that I bought when the miners were dumping their ethereum rigs, so they were pretty cheap.

If I had to buy a new card to fill that gap, I’d probably get a 7800XT, but if you don’t game on them you could get a much lower end model like an RX7600.

deadbeef,

It isn’t something that is in the distro vendors’ control. Nvidia do not disclose programming info for their chipsets. They distribute an unreliable proprietary driver that is obfuscated to hell so that no one can help fix their problems.

If you use an AMD card it will probably work fine in Windows and Linux. If you use an Nvidia card you are choosing to run Windows or have a bad time in Linux.

deadbeef,

The support for larger numbers of monitors, mixed resolutions and odd layouts in KDE vastly improved in the Ubuntu 23.04 release. I wouldn’t install anything other than the latest LTS release for a server ( and generally a desktop ), but KDE was so much better that it was worth running something newer with the short term support on my desktops.

We aren’t too far off the next LTS that will include that work anyway I guess. I’m probably going to be making the move to debian rather than trying that one out though.

deadbeef, (edited )

If you go back a bit further, multi monitor support was just fine. Our office in about 2002 was full of folks running dual ( 19 inch tube! ) monitors off Matrox G400s with Xinerama on Red Hat 6.2 ( might have been 7.0 ). I can’t recall that being much trouble at all.

There were even a bunch of good years of the proprietary Nvidia drivers; the poor quality is something that I’ve only really noticed in the last three or so years.

deadbeef,

Oh yeah. That video of Linus Torvalds giving Nvidia the finger linked elsewhere in this thread was the result of a ton of frustration around them hiding programming info. They also popularised a dodgy system of LGPL’ing a shim which acted as the licence go-between for the kernel driver API ( drivers are supposed to be GPL’d ) and their proprietary obfuscated code.

Despite that, I’m not really that anti them as a company. For me, the pragmatic reality is that spending a few hundred bucks on a Radeon is so much better than wasting hours performing arcane acts of fault finding and trial and error.

deadbeef,

I’m not the PR department for desktop Linux for everyone man.

People who only have Windows experience see an Nvidia card that is premium priced product with a premium experience and think that this will translate to a Linux environment, it does not. I’ve been using Linux for like 27 years now and that was my opinion until a couple of years ago.

Hopefully the folks that might read this thread ( like the OP, a 20 year IT veteran ) can take away that Nvidia cards on Linux are the troublesome / subpar choice and are only going to get worse going forward ( because of the Wayland migration that Nvidia are ignoring ).
