
OfCourseNot, to technology in Japan forces Apple and Google to open their mobile platforms • The Register
@OfCourseNot@fedia.io avatar

Apple: 'Mobile platform? Nah this is just a game console' winks at Nintendo and Sony

nxdefiant,

Japan: Checkmate

:: Reveals 10X more laws regulating game consoles ::

OfCourseNot,
@OfCourseNot@fedia.io avatar

How so? Honest question, I can't seem to find anything that is not super pro-corporation, like the prohibition on modding consoles with tens of thousands of dollars in fines or even prison sentences...

nxdefiant,

haha no source, just a dumb joke.

kevincox,
@kevincox@lemmy.ml avatar

I would pay a lot of money to see Nintendo’s conniption over having to allow homebrew and non-approved software on their game consoles. I would love to release emulators for older Nintendo consoles for the Switch so that they don’t get to keep charging people again to play old games on newer consoles.

FoxBJK, to linux in Debian spices up APT package manager with a dash of color
@FoxBJK@midwest.social avatar

Kinda miffed they didn’t include a screenshot of the colors, but I’m guessing the readability will be vastly better!

runswithjedi,

deleted_by_author

    FooBarrington,

    I hate it, makes me look much less like a hacker while installing pre-built software from other people

    lemmyreader,

    😂

    fossphi,

    Just switch to green on black. Immediate street cred boost

    umbrella,
    @umbrella@lemmy.ml avatar

    computer people have upvotes not street cred

    LeFantome,

    Just do it on ARM or RISC-V. RISC architecture changes everything.

    shotgun_crab,

    But it has more green, isn’t that the hacker color?

    Jordan_U,

    Just keep “hollywood” running in another terminal at all times.

    eager_eagle,
    @eager_eagle@lemmy.world avatar

    That is vastly more readable, and not only thanks to the colors; the indentation, new lines, and straightforward section titles are a huge improvement.

    42yeah,

    And proper package name alignment!

    azertyfun,

    I’m just surprised the purists aren’t all up in arms that this isn’t KISS and that it doesn’t fit in their 80x24 teletype.

    … sorry, guess I’m not over that whole systemd “debate”.

    BCsven,

    Looks like how zypper does it

    Nyanix,
    @Nyanix@lemmy.ca avatar

    I love this change, actually, I’m not a boring-text purist. Proper categorizing of data allows me to spot things at a glance much more easily, and I’m all in favor of anything that can improve efficiency and understanding, especially for new folks, so we can improve adoption.

    9488fcea02a9,

    I’m using sid and I’m loving this change. It’s an obvious visual cue to check if I’m about to remove something important like my whole desktop environment lol

    Routhinator,
    @Routhinator@startrek.website avatar

    I love it, but as someone with a red-green colourblind coworker, I always try to use blue for positive feedback and orange for negative, as it’s a better representation for most colourblind types.

    Nyanix,
    @Nyanix@lemmy.ca avatar

    That’s a great callout, and something we should be considering more often

    rokejulianlockhart,
    @rokejulianlockhart@lemmy.ml avatar

    I wish FreeDesktop would standardize CLIs taking their application colours from the user theme so that colourblindness is catered for.

    AceFuzzLord,

    I didn’t know I needed multi-coloured terminal text until I saw the 2nd image. It looks so much more readable!

    Jordan_U,

    Install and run “btop”.

    You could scroll down to the screenshots on the GitHub page, but I had a friend recommend btop to me and seeing it for the first time running on my own machine was an experience. Highly recommend.

    github.com/aristocratos/btop

    llii,

    It’s not only more readable because of the color, they also rearranged everything.

    737,

    looks splendid :3

    1984,
    @1984@beehaw.org avatar

    There are some screenshots in this article:

    www.phoronix.com/news/Debian-APT-2.9-Released

    Haggunenons, to privacy in Majority of Americans now use ad blockers
    @Haggunenons@lemmy.world avatar

    SponsorBlock - for anyone interested in taking their ad blocking hobby to the next level.

    mynamesnotrick,

    Can confirm. Been using it with FreeTube, Invidious and Clipious, where it’s built in, and it’s fantastic.

    Scolding0513,

    SponsorBlock is uncannily effective

    possiblylinux127, to privacy in Microsoft CEO of AI: Online content is 'freeware' for models • The Register

    So the Windows XP source code leak is now freeware?

    nasi_goreng, to technology in Japan forces Apple and Google to open their mobile platforms • The Register
    @nasi_goreng@lemmy.zip avatar

    Japan has so many unique stores that operate in the country with region-locked apps/games.

    As far as I remember, even DMM and DLsite already have their own game stores on Android.

    This is truly a win for Japanese customers and companies.

    SNFi, to technology in Apple slams Android as a 'massive tracking device' in internal slides revealed in Google antitrust battle

    Ah! Just yesterday I configured my router to block all the Apple tracking requests (via DNS)… My Android doesn’t have Google, so they are technically wrong; there is no Apple OS without tracking (as it is closed source).

    EDIT: Also, we don’t need to listen to them, we have proof: www.scss.tcd.ie/doug.leith/apple_google.pdf 😼

    4am, to linux in Fedora 40 boasts more spins and flavors than ever

    Compared to the rise of LLMs, containers are positively old hat now

    You know, this statement makes the author sound like they think LLMs should replace containers, or that development of better containers is passé because of New and Shiny Things.

    Please take care not to sound like a project manager when doing tech journalism.

    vrighter, to opensource in Open Source Initiative tries to define Open Source AI

    It is only open source if I can build it myself. Which I can’t if you just give me the weights.

    The weights are the “compiled” version of the dataset. It’s the dataset that’s the source, not the weights

    chebra,
    @chebra@mstdn.io avatar

    @vrighter @ylai
    That is a really bad analogy. If the "compilation" takes 6 months on a farm of 1000 GPUs and the results are random, then the dataset is basically worthless compared to the model. Datasets are easily available, always were, but if someone invests the effort in the training, then they don't want to let others use the model as open-source. Which is why we want open-source models. But not "openwashed" where they call it "open" for non-commercial, no modifications, no redistribution

    vrighter,

    The results are random, therefore the dataset is useless.

    Tell that to any FPGA toolchain

    delirious_owl,
    @delirious_owl@discuss.online avatar

    So the cover art I made for a friend’s album isn’t open source, even though I released it as CC BY-SA… because you can’t make it yourself?

    bitfucker,

    I think technically, the source should be the native format of whatever image manipulation program you use. For vector graphics, there is the SVG format, but the native editor format is still preferable. Otherwise, whoever gets the end copy cannot easily modify or reproduce it, only copy it. But it of course depends on the definition of “easy” and a lot of other factors. Licensing is hard, and that is because I am not a lawyer.

    sweng,

    What counts as source, and what doesn’t, would depend on the format.

    You can create a picture by hand, using no input data.

    I challenge you to do the same for model weights. If you truly just sit down and type away numbers in a file, then yes, the model would have no further source. But that is not something that can be done in practice.

    delirious_owl,
    @delirious_owl@discuss.online avatar

    I challenge you to recreate the Mona Lisa.

    My point is that these models are so complex that they’re closer to art than anything reproducible

    sweng, (edited )

    I don’t see your point? What is the “source” for the Mona Lisa I would use? For LLMs I could reproduce them given the original inputs.

    Creating those inputs may be an art, but so could any piece of code. No one claims that code being elegant disqualifies it from being open source.

    delirious_owl,
    @delirious_owl@discuss.online avatar

    Are you sure that you can reproduce the model, given the same inputs? Reproducibility is a difficult property to achieve. I wouldn’t think LLMs are reproducible.

    sweng,

    In theory, if you have the inputs, you have reproducible outputs, modulo perhaps some small deviations due to non-deterministic parallelism. But if those effects are large enough to make your model perform differently you already have big issues, no different than if a piece of software performs differently each time it is compiled.
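    For what it’s worth, here is a minimal, illustrative sketch of what pinning down that non-determinism usually looks like in a PyTorch training script (not from the thread; the seed value is arbitrary and the calls shown are PyTorch’s standard determinism switches):

```python
import random

import numpy as np
import torch


def make_deterministic(seed: int = 42) -> None:
    """Pin the common sources of randomness so that two training runs
    over the same inputs produce (near-)identical weights."""
    random.seed(seed)                 # Python's RNG
    np.random.seed(seed)              # NumPy's RNG
    torch.manual_seed(seed)           # CPU and current-GPU RNGs
    torch.cuda.manual_seed_all(seed)  # all GPUs
    # Prefer deterministic kernels; ops with no deterministic
    # implementation raise an error instead of silently drifting.
    torch.use_deterministic_algorithms(True)
    torch.backends.cudnn.benchmark = False


make_deterministic()
# ... build the model, load the fixed/versioned dataset, train ...
```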

    delirious_owl,
    @delirious_owl@discuss.online avatar

    That’s the theory for some paradigms that were specifically designed to have the property of determinism.

    Most things in the world, even computers, are non-deterministic

    Nondeterminism isn’t necessarily a bad thing for systems like AI.

    leopold,

    I would consider the “source code” for artwork to be the project file, with all of the layers intact and whatnot. The Photoshop PSD, the GIMP XCF or the Krita KRA. The “compiled” version would be the exported PNG/JPG.

    You can license a compiled binary under CC BY if you want. That would allow users to freely decompile/disassemble it or to bundle the binary for their purposes, but it’s different from releasing source code. It’s closed source, but under a free license.

    vrighter,

    You released it under a non-open-source license. So very clearly: no, it is not

    leopold,

    CC BY-SA is considered open source. CC BY-NC is not.

    delirious_owl,
    @delirious_owl@discuss.online avatar

    Wut. That license is literally compatible with the GPL

    ylai,

    The situation is somewhat different and nuanced. With weights there are tools for fine-tuning, LoRA/LoHa, PEFT, etc., which presents a different situation than with binaries for programs. You can see that despite e.g. LLaMA being “compiled”, others can still use it to make models that surpass the previous iteration (see e.g. recently WizardLM 2 in relation to LLaMA 2). Weights are also to a much larger degree architecture-independent compared to binaries (you can usually cross train/inference on GPU, Google TPU, Cerebras WSE, etc. with the same weights).
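    To make that ecosystem concrete, here is a minimal sketch of attaching LoRA adapters to someone else’s published weights with the Hugging Face peft library (illustrative only; the model id and hyperparameters are placeholders, not anything specified in this thread):

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Load somebody else's published weights (placeholder model id).
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Attach low-rank adapters to a few attention projections; the base
# weights stay frozen and only the small A/B matrices are trained.
config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically a small fraction of the base model
```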

    sweng,

    How is that different from e.g. patching a closed-source binary? There are plenty of community patches for old games to e.g. make them work on newer hardware. Architectural independence seems irrelevant; it’s no different than e.g. Java bytecode.

    ylai, (edited )

    This is a very shallow analogy. Fine-tuning is rather the standard technical approach to reduce compute, even if you have access to the code and all training data. Hence there has always been a rich and established ecosystem for fine-tuning, regardless of “source.” Patching closed-source binaries is not the standard approach, since compilation is far less computationally intensive than today’s large-scale training.

    Java bytecode is a far-fetched example. The JVM does assume a specific architecture that is particular to the CPU-dominant world in which it was developed, and Java bytecode cannot be trivially executed (efficiently) on a GPU or FPGA, for instance.

    And by the way, the issue of weight portability is far more relevant than the forced comparison to (simple) code can accomplish. Usually today’s large scale training code is very unique to a particular cluster (or TPU, WSE), as opposed to the resulting weight. Even if you got hold of somebody’s training code, you often have to reinvent the wheel to scale it to your own particular compute hardware, interconnect, I/O pipeline, etc… This is not commodity open source on your home PC or workstation.

    sweng,

    The analogy works perfectly well. It does not matter how common it is. Patching binaries is very hard compared to e.g. LoRA. But it is still essentially the same thing, making a derivative work by modifying parts of the original.

    ylai, (edited )

    How does this analogy work at all? LoRA is chosen by the modifier to be low ranked to accommodate some desktop/workstation memory constraint, not because the other weights are “very hard” to modify if you happen to have the necessary compute and I/O. The development in LoRA is also largely directed by storage reduction (hence not too many layers modified) and preservation of the generalizability (since training generalizable models is hard). The Kronecker product versions, in particular, were first developed in the context of federated learning, and not for desktop/workstation fine-tuning (also LoRA is fully capable of modifying all weights, it is rather a technique to do it in a correlated fashion to reduce the size of the gradient update). And much development of LoRA happened in the context of otherwise fully open datasets (e.g. LAION) that are just not manageable in desktop/workstation settings.
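    For reference, the standard LoRA parameterization (from the original LoRA paper) shows how a low-rank update still reaches every entry of a weight matrix while keeping the trained and stored update small:

```latex
% Every entry of W is modified, but only r(d + k) parameters are
% trained and stored instead of the full d \cdot k:
W' = W + \Delta W = W + \frac{\alpha}{r}\, B A,
\qquad B \in \mathbb{R}^{d \times r},\quad
A \in \mathbb{R}^{r \times k},\quad
r \ll \min(d, k)
```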

    This narrow perspective of “source” is taking away the actual usefulness of compute/training here. Datasets from e.g. LAION to Common Crawl have been available for some time, along with training code (sometimes independently reproduced) for the Imagen diffusion model or GPT. It is only when e.g. GPT-J came along that somebody invested into the compute (including how to scale it to their specific cluster) that the result became useful.

    heckypecky, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    Seems fair, you pay 1000 for the logo and 600 for the hardware.

    Overzeetop,

    It’s a very nice logo. And it lights up. Hard to argue with their pricing, really.

    ebc,

    It actually doesn’t light up anymore…

    intensely_human,

    For $375 you can get an iFlashlight to point at the logo

    TheFriendlyArtificer,

    Ordered the iFleshlight. Looking forward to seeing the jealous looks I get at the coffee shop.

    Thorny_Insight,

    It’s actually just the display backlight, which is why I had to cover it with aluminium tape instead of just disconnecting the wire. Not only do I not want an ad on my computer, I especially don’t want an illuminated one.

    intensely_human,

    Apple laptops and gaming headphones, keeping it classy

    vahtos, to technology in Multi-day DDoS storm batters Internet Archive. Think this is bad? See what Big Media wants to do to us, warns founder

    Losing the Internet Archive would be a huge loss. Unfortunately, greedy companies don’t want us to have nice things.

    GrindingGears,

    It would be a massive loss for sure. One that will be felt for a long time. It’s the only way I can get around our thoroughly enshittified press up here in Canada. I mean I’d gladly pay, if it was worth paying for, which it’s not.

    gregorum, to privacy in End-to-end encryption may be the bane of cops, but they can't close that Pandora's Box

    Yeah, well, they couldn’t “shut it down” before E2E encryption, either, so, obviously, the problem isn’t necessarily the encryption, but that the cops suck at their jobs.

    “We couldn’t really catch them before, but now we can’t read their text messages! Merde!”

    Stop blaming encryption, and do a better job.

    asexualchangeling, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    For the last time, PC means personal computer, not Windows computer; if a Mac isn’t a personal computer then what is it?

    kusivittula,

    poop. it’s poop.

    janguv,

    True by the letter but not really by practice. PC is synonymous with a computer running Windows, or Linux at a push. I don’t know whether that’s because of Microsoft’s early market dominance or because Apple enjoys marketing itself as a totally different entity, or some combination of the two. But yeah, usage determines meaning more than what the individual words mean in a more literal sense.

    gnuplusmatt,

    Originally “PC” meant IBM PC or PC compatible (as in compatible with IBM without using their trademark). An IBM PC could have run DOS, Windows or even OS/2

    asexualchangeling,

    It’s funny to me because these days, with all the remote software reinstallation and asking why you want to close OneDrive and things, Windows isn’t exactly very personal either

    DrownedAxolotl,

    I agree with you, but you know how Apple operates, slapping a shiny new name on an already existing concept and making it sound premium.

    Syldon,
    @Syldon@lemmy.one avatar

    You can install your own software on a personal computer; there is freedom of choice. Apple tells you what you can install on a Mac. archive.ph/ks4uO

    source link

    asexualchangeling,

    Well then here you go, something more open to install on a modern Mac: asahilinux.org

    Syldon,
    @Syldon@lemmy.one avatar

    Can you run that outside of a virtual box?

    Will this make Apple Silicon Macs a fully open platform?

    No, Apple still controls the boot process and, for example, the firmware that runs on the Secure Enclave Processor. However, no modern device is “fully open” - no usable computer exists today with completely open software and hardware (as much as some companies want to market themselves as such). What ends up changing is where you draw the line between closed parts and open parts. The line on Apple Silicon Macs is when the alternate kernel image is booted, while SEP firmware remains closed - which is quite similar to the line on standard PCs, where the UEFI firmware boots the OS loader, while the ME/PSP firmware remains closed. In fact, mainstream x86 platforms are arguably more intrusive because the proprietary UEFI firmware is allowed to steal the main CPU from the OS at any time via SMM interrupts, which is not the case on Apple Silicon Macs. This has real performance/stability implications; it’s not just a philosophical issue.

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    asexualchangeling,

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    100%, but I have family that uses Apple, and if I got an old Mac from any of them I wouldn’t complain, I’d just quietly install a new OS

    Syldon,
    @Syldon@lemmy.one avatar

    But you still cannot do it outside of a virtual box right?

    So you will still be at the behest of the AppleOS.

    BarryZuckerkorn,

    Can you run that outside of a virtual box?

    It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.

    Asahi Linux is configured so that Apple’s firmware loads a Linux bootloader instead of booting MacOS.

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPU/GPUs are so much cheaper than similar Intel/AMD/Nvidia chips. The expense comes from exorbitant prices for additional memory or storage, and the fact that they simply refuse to use cheaper display tech even in their cheapest laptops. The entry level laptop has a 13 inch 2560x1600 screen, which compares favorably to the highest end displays available on Thinkpads and Dells.

    If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.

    Syldon,
    @Syldon@lemmy.one avatar

    It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.

    Except the boot process on a non-Apple PC is open software. You can create a custom BIOS revision. The firmware on an Apple computer is not open source. AFAIK you cannot create a custom BIOS on an Apple computer.

    Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPU/GPUs are so much cheaper than similar Intel/AMD/Nvidia chips.

    No idea what you mean by this. You cannot buy Apple’s hardware due to the restrictions Apple places on any purchases. Any hardware you can buy from Apple has a premium.

    Apple leans heavily on the display being good on an Apple machine, but imo it does not make up for the pricing. There is a good guide on better alternatives here.

    If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.

    I think you mean that Apple uses its own memory more effectively than a Windows PC does. Yes it does, but memory is not that expensive to make. To increase the storage space from 256GB to 512GB is £200. I can buy a 2TB drive for that. More importantly, it can be replaced when it wears out. Apple gives you a replacement price that means you need a new computer.

    Apple computers are designed to make repairs expensive. They may have pseudo adopted the right to repair, but let us see how that goes before believing the hype.

    BarryZuckerkorn,

    Except the boot process on a non-Apple PC is open software.

    For the most part, it isn’t. The typical laptop you buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware. They all end up supporting the open UEFI standard, but the implementation is usually closed source. Having the ability to flash new firmware that is mostly open source with closed-source binary blobs (like coreboot) or fully open source (like libreboot) gets you closer to the hardware at startup, but it still sits on proprietary implementations.

    There’s some movement to open source more and more of this process, but it’s not quite there yet. AMD has the OpenSIL project and has publicly committed to open sourcing a functional firmware for those chips by 2026.

    Asahi uses the open source m1n1 bootloader to load U-Boot, which in turn loads desktop Linux bootloaders like GRUB (which generally expect UEFI compatibility), as described here:

    • The SecureROM inside the M1 SoC starts up on cold boot, and loads iBoot1 from NOR flash
    • iBoot1 reads the boot configuration in the internal SSD, validates the system boot policy, and chooses an “OS” to boot – for us, Asahi Linux / m1n1 will look like an OS partition to iBoot1.
    • iBoot2, which is the “OS loader” and needs to reside in the OS partition being booted to, loads firmware for internal devices, sets up the Apple Device Tree, and boots a Mach-O kernel (or in our case, m1n1).
    • m1n1 parses the ADT, sets up more devices and makes things Linux-like, sets up an FDT (Flattened Device Tree, the binary devicetree format), then boots U-Boot.
    • U-Boot, which will have drivers for the internal SSD, reads its configuration and the next stage, and provides UEFI services – including forwarding the devicetree from m1n1.
    • GRUB, booting as a standard UEFI application from a disk partition, works like GRUB on any PC. This is what allows distributions to manage kernels the way we are used to, with grub-mkconfig and /etc/default/grub and friends.
    • Finally, the Linux kernel is booted, with the devicetree that was passed all the way from m1n1 providing it with the information it needs to work.

    If you compare the role of iBoot (proprietary Apple code) to the closed source firmware in the typical Dell/HP/Acer/Asus/Lenovo booting Linux, you’ll see that it’s basically just line drawing at a slightly later stage, where closed-source code hands off to open-source code. No matter how you slice it, it’s not virtualization, unless you want to take the position that most laptops can only run virtualized OSes.

    I think you mean that Apple uses its own memory more effectively than a Windows PC does.

    No, I mean that when you spec out a base model Macbook Air at $1,199 and compare to similarly specced Windows laptops, whose CPUs/GPUs can deliver comparable performance on benchmarks, and a similar quality display built into the laptop, the Macbook Air is usually cheaper. The Windows laptops tend to become cheaper when you’re comparing Apple to non-Apple at higher memory and storage (roughly 16GB/1TB), but the base model Macbooks do compare favorably on price.

    Syldon,
    @Syldon@lemmy.one avatar

    The typical laptop you buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware.

    FTFY: The typical laptop MOST people buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware. Though, I totally agree there are some PC suppliers with shitty practices. Where we disagree is whether, if the firmware is fixed by the hardware manufacturer, you have control over everything on the system. It is only when you have control of the base functionality of the system that you can say you are in charge. This may be too literal for you, but I just see that as a trust level you have in the manufacturer not to abuse that control.

    As for the comparison, I disagree.

    This is a £1400 laptop from Scan vs the £1500 MacBook Air currently:

    17 inch screen (2560x1440) over the 15.3 inch (2880x1864)

    16GB memory over 8GB (upgrade to 16GB = +£200)

    1TB SSD over 256GB (upgrade to 1TB = +£400)

    8 full cores / 16 threads (AMD 5900HX) over an 8-core CPU without hyperthreading, 4 cores of which are cheaper variants.

    All of the PC components can be upgraded at the cost of the part + labour. Everything on the Apple will cost the same price as a new computer to replace. Mainly because it is all soldered onto the board to make it harder to replace.

    BarryZuckerkorn,

    This is a £1400 laptop from Scan vs the £1500 MacBook Air currently.

    Ah, I see where some of the disconnect is. I’m comparing U.S. prices, where identical Apple hardware is significantly cheaper (that 15" Macbook Air starts at $1300 in the U.S., or £1058).

    And I can’t help but notice you’ve chosen a laptop with a worse screen (larger panel with lower resolution). Like I said, once you actually start looking at High DPI screens on laptops you’ll find that Apple’s prices are actually pretty cheap. 15 inch laptops with at least 2600 pixels of horizontal resolution generally start at higher prices. It’s fair to say you don’t need that kind of screen resolution, but the price for a device with those specs is going to be higher.

    That laptop’s CPU benchmarks are also slightly behind the 15" MacBook Air’s, even though the Air is held back by not having fans for managing thermals.

    There’s a huge market for new computers that have lower prices and lower performance than Apple’s cheapest models. That doesn’t mean that Apple’s cheapest models are a bad price for what they are, as Dell and Lenovo have plenty of models that are roughly around Apple’s price range, unless and until you start adding memory and storage. Thus, the backwards engineered pricing formula is that it’s a pretty low price for the CPU/GPU, and a very high price for the Storage/Memory.

    All of the PC components can be upgraded at the cost of the part + labour.

    Well, that’s becoming less common. Lots of motherboards are now relying on soldered RAM, and a few have started relying on soldered SSDs, too.

    Syldon,
    @Syldon@lemmy.one avatar

    Amazon has this one for $1200. I would still pay the extra for the features over an Apple Mac.

    I can’t help but notice you’ve chosen a laptop with a worse screen (larger panel with lower resolution).

    I would choose a larger screen over that marginal difference in dpi every day of the week. People game on TV screens all the time with lower resolution because it is better.

    That laptop’s CPU benchmarks are also slightly behind the 15" MacBook Air’s, even though the Air is held back by not having fans for managing thermals.

    You cannot compare an app that runs on two different OSes. That is just plain silly. Cinebench only tests one feature of a system: using the CPU to render a graphic. Apple is built around displaying graphics. A PC is a lot more versatile. There is more to a system than one component. Let’s see you run some raytracing benchmarks on that system.

    Well, that’s becoming less common. Lots of motherboards are now relying on soldered RAM

    I wouldn’t buy one. You will always find some idiotic willing victim. In the future, though, RAM is moving onto the CPU package, but that will be done for speed gains. Until then only a bloody fool would buy into this.

    An Apple system has one major benefit over a PC system: battery life. Other than that I would not recommend one, and even then I would give stern warnings over repair costs.

    BarryZuckerkorn,

    I would choose a larger screen over that marginal difference in dpi every day of the week.

    Yes, but you’re not addressing my point that the price for the hardware isn’t actually bad, and that people who complain would often just prefer to buy hardware with lower specs for a lower price.

    The simple fact is that if you were to try to build a MacBook killer and try to compete on Apple’s own turf by matching specs, you’d find that the entry level Apple devices are basically the same price as other laptops you could configure with similar specs, because Apple’s baseline/entry level has a pretty powerful CPU/GPU and high resolution displays. So the appropriate response is not that they overcharge for what they give, but that they make choices that are more expensive for the consumer, which is a subtle difference that I’ve been trying to explain throughout this thread.

    You cannot compare an app that runs on two different OSes.

    Why not? Half of the software I use is available on both Linux and MacOS, and frankly a substantial amount of what most people do is in the browser anyway. If the software runs better on one device over another, that’s a real-world difference that can be measured. If you’d prefer to use Passmark or whatever other benchmark you’d like to use, you’ll still be able to compare specific CPUs.

    Syldon,
    @Syldon@lemmy.one avatar

    you’re not addressing my point that the price for the hardware isn’t actually bad,

    I disagree. It is not only that the hardware is cheaper and a lower spec with the exception of the CPU; the design is geared around making upgrades and repairs near impossible or unfeasible. Software has much more support on Windows. Video editing has been the Mac’s bread and butter for many years now, but Windows has caught up due to improvements in hardware and software. In my mind this negates the case for buying a Mac currently, but I can easily see it was a good buy in the past.

    The outlier is that Macs are good on battery life. Therefore there is a niche market for which it is an exceptionally good return on your investment.

    Why not? Half of the software I use is available on both Linux and MacOS, and frankly a substantial amount of what most people do is in the browser anyway. If the software runs better on one device over another, that’s a real-world difference that can be measured. If you’d prefer to use Passmark or whatever other benchmark you’d like to use, you’ll still be able to compare specific CPUs.

    Because you cannot use Cinebench unless you are comparing the same system setup. Comparing two OSes is just stupid and cherry-picking. Apple has a very trimmed-down OS compared to the complexity of Windows. Apple’s OS dumps the need for legacy code with a closed system designed for specific hardware. Windows still caters for code written for DX CPUs under the x86 architecture. This, as well as many other reasons, is why not. I noticed you ignored my offer of comparing back-to-back raytracing results, and now fail to even mention it.

    You are obviously enamoured with the Apple model; I am not. There really is nothing that you could say that would convince me otherwise. I will wish you a good day, and hope you agree to disagree.

    schnurrito,

    I think the history is such that a “PC” is a computer compatible with the “IBM PC”, which Macs historically were not, and modern ones aren’t either.

    But I still like “Windows computer”, we can abbreviate that to “WC”.

    tal,
    @tal@lemmy.today avatar

    Another complication is that, at one point in time, DOS-using machines weren’t always running Windows.

    sanzky,

    I doubt it’s the last time. Also, while “PC” means personal computer, it was a very specific brand name by IBM, not a general-purpose term. Their computers (and later the clones) became synonymous with x86 Windows machines.

    Even Apple themselves have always distanced themselves from the term (“I’m a Mac, and I’m a PC”…).

    boredsquirrel, to linux in Fedora 40 boasts more spins and flavors than ever

    Tbh I am fully behind KDE as the flagship desktop. Dealing with GNOME users’ problems all day in the forum, KDE is just better for usability?

    GNOME is reduced beyond the amount that makes sense. KDE could use a bit of reduction, but not as much as GNOME’s. People need the terminal or random extensions for basic things; this is not a good experience.

    On the other hand, GNOME and KDE both have really nice features, GNOME with their Microsoft integrations being particularly powerful (their account system works at all, unlike KDE’s, which I think nobody uses. But when using Thunderbird, which has standalone Exchange support, you don’t use that account system anyway, so it doesn’t matter).

    Also GNOME has like all their apps on Flathub. GNOME Boxes is particularly crazy, having sandboxed virtualization. This means you can mix and match GNOME Flatpaks on a KDE desktop without any problems; KDE even handles the theming for you. On GNOME on the other hand… it actively breaks Qt apps, it’s insane.

    So I think GNOME has some great apps (snapshot, decoder, simplescan, carburetor, celluloid …) but you can install them anywhere.

    barbara,

    GNOME looks better out of the box and configuring KDE can be very tricky. There are also a lot of outdated “addons” for KDE and you need some in order to get what you want. Extensions are better integrated in KDE, but it’s not like KDE has everything out of the box. I’d love to see more KDE support.

    boredsquirrel,

    True. KDE’s virtual desktops are also basically unusable for me; idk, I just don’t see them, so they don’t get used.

    There are pros and cons. It’s simply a tie; I stay with KDE because the lack of some things (like close buttons with the hitbox at the very edge) would annoy me.

    jjlinux,

    This is my issue with KDE. Virtual desktops are unnecessarily convoluted to use. Even Alt-Tabbing is a pain if you have anything over a single workspace. I decided to daily-drive KDE for a few months to give it a good chance, because before I would usually just go back to Gnome after a few days. It’s been 2 months now, and I don’t think I can take much more of it.

    boredsquirrel,

    Their Plasma 6 overview is great, just needs the panel displayed or even an app menu and it could be similar to GNOME.

    jjlinux,

    I actually tweaked it to be more “gnome-like”, but the desktops are a hot mess. At the end of the day, it’s a matter of taste, and I’m a huge fan of Gnome’s simplicity.

    octopus_ink,

    I don’t really get this but I’m going to assume it’s that my workflow is just different than yours.

    I have keyboard shortcuts I’m happy with that let me navigate my virtual desktops as desired and place windows on them. If I wasn’t happy with those shortcuts I could change them. I can see having different preferences, etc., but what makes it a hot mess exactly?

    jjlinux,

    When I Alt-tab it always goes to the apps open on the next desktop, and never shows the apps on the current desktop. So, say I have Vivaldi and KWrite on desktop 1, and Brave and LibreOffice Calc on desktop 2.

    If I’m on desktop 1 on Vivaldi and Alt-tab, it’ll move to desktop 2 and cycle between Brave and Calc, but will never show anything from desktop 1, until I release the Alt key and Alt-tab again.

    Now, for me it’s even worse since I have 3 Desktops instead of 2.

    octopus_ink,

    Have you dug into the options at all? If I’m visualizing what you are describing correctly, I think spending some time here should solve your issues.

    edit - specifically the options in the lower right

    https://lemmy.ml/pictrs/image/d55921c4-90bd-4451-b052-b302779d3d17.png

    andrewd18,

    I think KDE looks great out of the box, includes all the extensions I want, and is easy to configure.

    barbara,

    That’s good :)

    Sentau,

    includes all the extensions I want

    This is what people don’t get. Different DEs best serve different people. We should always push to have a better experience, but sniping between DEs makes no sense

    Vincent,

    Dealing with GNOME users’ problems all day in the forum, KDE is just better for usability?

    It seems not unimaginable that whichever is more popular (/the default) will have more people reporting problems in the forum, regardless of how good it is?

    boredsquirrel,

    Yeah okay. I don’t deny that I would also prefer maintaining and QA-ing GNOME over KDE, as it’s just so much smaller.

    But stuff like “there are no right-click options for zip files” is pretty crazy. Or the total lack of templates by default, for stuff like text files.

    helenslunch,
    @helenslunch@feddit.nl avatar

    I think Gnome is great. I use KDE on my Steam Deck and it’s fine, but very dated and ugly. Looks too much like Windows. Same reason I won’t recommend Mint.

    boredsquirrel,

    Agree on the looks. Even though GNOME is literally a “no blur” macOS clone, which I also don’t find really inspired

    Sentau,

    My father uses a Mac and it is plenty different. Maybe the design philosophies of MacOS and GNOME are similar, but the implementation is very different.

    boredsquirrel,

    What is different? I think GNOME diverged a bit more, by removing window buttons, desktop icons, the dock etc. And they don’t use blur and transparency at all.

    But with Dash to Dock, Blur My Shell and some decoration manipulation it is very similar.

    Not that I don’t think this makes sense (I don’t, as having a dock but also a top panel wastes space), but it is not really a unique workflow

    Hadriscus,

    Removing window buttons ? the trio of buttons for controlling window size ? or is this something else

    boredsquirrel,

    Yep. And removing the maximize button doesn’t even make sense, apart from “looking better”. Not everyone can easily double-click, I guess

    Hadriscus,

    But what. This is completely dumb. How do you do those actions then?

    boredsquirrel,

    Double click somewhere on the oversized titlebar

    Hadriscus,

    But there are 3 actions, right? Is there a way to minimize and close too? Triple click? That sounds so counter-functional on paper. I guess I’d have to try it

    boredsquirrel,

    There is a close button, that’s it.

    You won’t believe me, but minimize is not a thing as there is no panel or dock. You open stuff, move it somewhere else, and you never use a dock as a container, just as a quick launcher.

    I think that is fair, but it for sure forces many people to adapt their workflows.

    Sentau,

    Well, the way the workspaces and the overview work is completely different, which means the workflow is night-and-day different. Not to mention the differences in how floating windows work, what role the top panel plays, and things like that.

    They might look similar, just like how KDE ‘looks’ similar to Windows, but that is only true at the surface level. The way the desktops behave, and hence the workflow, is very different in each case

    boredsquirrel,

    Okay, that may be true. GNOME is very usable (with extensions); macOS is hell

    TheGrandNagus,

    I never understand the “Gnome is a MacOS clone” thing.

    Other than a black bar at the top which has the time and a few system icons, what do they really have in common?

    The workflow is entirely different, the dock is almost always hidden in Gnome, MacOS has no activities view, Gnome doesn’t even use the icon in the top left as a start-menu.

    boredsquirrel, (edited )

    Yes, it is MacOS with the dock hidden. And without window buttons. And the buttons are not on the left and not damn colorblind-unfriendly.

    I mean the top bar is the exact same, the app drawer, the workspaces, the quick settings. They just removed even more stuff.

    Edit: there are many things about them that are different, but the overall design seems similar to me. I think GNOME is way more usable and makes more sense. But still, having a top bar at all is kinda odd and I think using one already makes you “macOS-like”.

    TheGrandNagus,

    No it isn’t.

    The top bar isn’t the exact same, it’s extremely different. Gnome doesn’t use a global menu, doesn’t have a start menu, doesn’t have the clock on the right. The only similarity is the bar being at the top and containing stuff like WiFi and battery icons.

    The window decorations are different. The UI looks different. Gnome doesn’t have a permanent dock, doesn’t have stuff on the desktop. Window management works in a very different way, MacOS doesn’t have the activities view, etc.

    They are not alike.

    mexicancartel,

    Nah, Mac OS looks far uglier than GNOME imo

    lemmytellyousomething, to privacy in 96% of US Hospital Websites Share Visitor Data with Google, Meta, Data Brokers, and Other Third Parties, Study Finds

    Glad to live in Europe…

    OpenStars, to privacy in Majority of Americans now use ad blockers
    @OpenStars@startrek.website avatar

    What I wonder is… how?! A quick search shows that half of people in the USA use Chrome, another 30% Safari, 8% use Edge, and only 5% Firefox. This study was done by Ghostery so perhaps they chose a biased subset of the population? It just seems weird to me to think that more than half of average users use ad-blocking, these days.

    viking,
    @viking@infosec.pub avatar

    My mom knows nothing about adblock, and is still blocking ads. You better believe all of the kids having to fix their relatives’ computers will set up some free antivirus and ad blocking right away.

    Can’t comment on the sample size though; Ghostery might indeed be somehow biased and measure devices where their software is installed vs. the total number of internet users or something? But users of Ghostery are more likely to be tech-savvy, so there’s a higher chance of them having more devices that are equally sanitized.

    I’d have to dig through the study and see if the sampling mechanism is made public.

    OpenStars,
    @OpenStars@startrek.website avatar

    Yes, it is available. It in turn points to another site, Censuswide, but does say:

    The figures are representative of all US adults aged 16+

    morrowind,
    @morrowind@lemmy.ml avatar

    will set up some free antivirus and ad blocking right away.

    Those mfs have got a way to go if they’re setting up free antiviruses. Free antivirus will probably hurt your system more on average than actual viruses

    viking,
    @viking@infosec.pub avatar

    Bitdefender Free is great and doesn’t nag users to upgrade.

    min_fapper,

    Just use Windows Defender already. It’s been good enough for ages. All the others downplay this to justify their existence.

    Outtatime,
    @Outtatime@sh.itjust.works avatar

    Exactly. If you run Windows, the default Defender antivirus is just as capable. Don’t use 3rd-party antivirus software or the “free” ones.

    viking,
    @viking@infosec.pub avatar

    I have an inherent distrust of all things Microsoft. And their firewall is so terrible that I don’t want to find out whether they were as negligent when it came to developing their antivirus.

    Zerush,
    @Zerush@lemmy.ml avatar

    Some years ago, Windows Defender certainly was a joke, but currently it is very capable, with a detection rate of 100%. That is because Windows, as the most used OS, was always also the most attacked by malware, but the devs at MS have at least done a good job. Windows is certainly a privacy nightmare, at least if used with default settings, but in terms of security it is currently maybe the best protected, with secure boot, a good sandbox system and Defender. And, well, the firewall is good, but sometimes overreacts with the need to whitelist some downloads and apps. But all in all, there isn’t any more need for 3rd-party AVs.

    https://file.coffee/u/LD0kX40fx45fga_HCvw_j.png

    0oWow,

    Why are people recommending Microsoft spyware in a privacy thread??

    HumanPerson,

    The better option is not to use Windows at all, but if you are, I don’t think disabling Windows Defender will stop them from getting whatever they want anyway.

    Zerush,
    @Zerush@lemmy.ml avatar

    Here we are speaking about this spyware being pretty resistant against all kinds of malware, not about the need to gut it of all kinds of telemetry, bloatware and unneeded services before first use; that is another thing.

    TheAnonymouseJoker,

    Kaspersky Free is top-grade stuff. Bitdefender Free is good but has false positives. Defender is a joke against ransomware and without an internet connection. The rest are bad.

    Fleppensteijn,
    @Fleppensteijn@feddit.nl avatar

    According to statistics on my server, it’s 57% Chrome, 14% Safari and 12% Firefox. Also 10% use Linux. I’m not hosting anything tech related though.

    Anyways, adblocking is kind of essential. Even the boomers ask what’s wrong when ads start showing. The only people I’ve seen browsing without adblock are Apple users.

    stewie3128,

    iPhone or Mac?

    Vendetta9076,
    @Vendetta9076@sh.itjust.works avatar

    Yes
