theregister.com

Moonrise2473, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

Instead I feel it's the opposite, because that memory is shared with the GPU. So if you're gaming, even with some old game, it's like having 4 GB for the system and 4 GB for the GPU. They might claim that their scheduler is magic and can predict memory usage with perfect accuracy, but even then it would be more like a 6+2 GB split. If a game has heavy textures, it will steal memory from the system. Maybe you want to have a browser open for watching a tutorial on YouTube during gaming, or a chat. That's another 1-2 GB stolen from the CPU and GPU.
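To put rough numbers on that (a hypothetical budget using the figures above, not measurements):

```swift
// Hypothetical budget for an 8 GB unified-memory machine: GPU textures and
// a browser all come out of the same pool the OS and game logic need.
let totalGB = 8.0
let gpuTexturesGB = 2.0    // the optimistic "6+2" split above
let browserChatGB = 1.5    // YouTube tutorial + chat, ~1-2 GB

let leftForOSAndGame = totalGB - gpuTexturesGB - browserChatGB
print("Left for the OS and the game's CPU side: \(leftForOSAndGame) GB")
// Prints: Left for the OS and the game's CPU side: 4.5 GB
```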

Their pricing for the RAM is ridiculous: they're charging $300 for just 8 GB of additional memory! We're not in the 2010s anymore!

echodot,

Maybe you want to have a browser open for watching a tutorial on YouTube during gaming, or a chat. That's another 1-2 GB stolen from the CPU and GPU.

Or five times that amount if you’re running Chrome

pbjamm,

The most expensive 8GB DDR5 stick I can find on Amazon is about US$35. There are 64GB sets that are under US$200!

Apple should be ashamed.

Blackmist, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

Just upgrade the RAM yourself.

Oh wait, you can’t because it’s 2023 and it’s become inexplicably acceptable to solder it to the motherboard.

monsieur_jean,

Not even soldered, it's part of the CPU/GPU package now.

Blackmist,

Ah yes, it’s the SSD that’s soldered.

Just 300 of your English pounds to upgrade from 512GB to 1TB.

Meanwhile, a 2TB drive at PS5 speeds is under £100.

For unupgradable kit, the pricing is grotesque.

DaDragon,

I mean, the NAND chips can be replaced fairly effectively if you know what you’re doing

Kidplayer_666,

Actually no. There's some pairing trickery going on at the SoC level, so if you swap the NAND chips for higher-capacity ones without Apple's special sauce, you'll just get an unbootable system.

DaDragon,

I was under the impression that had been solved by third parties? Or is chip cloning not enough?

Skirmish,

And paging in & out of RAM frequently is probably one of the quickest ways to wear out the NAND.

Put it all together and you have a system that breaks itself and can’t be repaired. The less RAM you buy the quicker the NAND will break.

NattyNatty2x4,

Apple has put a lot of effort into (successfully) creating a customer base that thinks overpriced goods and different-colored texts make them part of a special club, so I'm not surprised that an exec thought this excuse would fly.

monsieur_jean,

It's a bit more complex than that (and you probably know it).

When you enter the Apple ecosystem you basically sign a contract with them: they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board. Their UX is excellent. Their support is good. Things work well, applications are easy to use and pretty stable and well built. And if they violate your privacy like the others, at least they don't make the open-bar sale of your data their fucking business model (wink wink Google).

Of course there's a price to pay. Overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there's that one thing that doesn't work, no luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It's a nice golden cage with soft pillows.

I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment: MacBook, iPhone, iPad... I just don't have time to "manage" my hardware anymore. Nor the urge to do it. I need things to work out of the box in a predictable way. I don't want a digital mental load. Just a simple UX, consistency across my devices and good apps (and no Google, fuck Google). Something I wouldn't have with an Android + PC setup. :)

The whole "special club" argument is bullshit, and I hope we grow out of it. Neither the Apple nor the Google/Microsoft environments are satisfactory. Not even speaking of Linux and FOSS. We must aim higher.

rwhitisissle, (edited )

I’m gonna have to argue against a few of these points:

When you enter the Apple ecosystem you basically sign a contract with them: they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board.

Consistent: yes. Every Apple device leverages a functionally very similar UI. That said, the experience is, in my opinion, not very coherent or well thought out, especially if you are approaching their technology as a Linux power user would. The default user experience is frustratingly warped around the idea that the end user is an idiot who has no idea how to use a terminal and who only wants access to the default applications provided with the OS.

Things work well

Things work…okay. But try installing, uninstalling, and then reinstalling a MySQL DB on a MacBook, then spend an hour figuring out why your installation is broken. Admittedly, that's because you're probably installing it with Homebrew, but that's the other point: if you want to do anything of value on it, you have to use a third-party application like Homebrew to do it. The fact that you have to install and leverage a third-party package manager is unhinged for an ecosystem where everything is so "bundled" together by default.

Of course there's a price to pay. Overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there's that one thing that doesn't work, no luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It's a nice golden cage with soft pillows.

I guess the ultimate perspective is one in which you have to be happy surrendering control over so much to Apple. But then again, you could also just install EndeavourOS with KDE Plasma or any given flavor of Debian distribution with any DE of your choice, install KDE Connect on your PC and phone, and get 95 percent of the experience Apple offers right out of the box, with about 100x the control over your system.

I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment: MacBook, iPhone, iPad… I just don't have time to "manage" my hardware anymore.

I don’t know of anyone who would describe themselves as a hardcore “PC/Linux user,” or what this means to you. I’m assuming by PC you mean Windows. But people who are really into Linux generally don’t like MacOS or Windows, and typically for all the same reasons. I tolerate a Windows machine for video game purposes, but if I had to use it for work I’d immediately install Virtualbox and work out of a Linux VM. For the people who are really into Linux, the management of the different parts of it is, while sometimes a pain in the ass, also part of the fun. It’s the innate challenge of something that can only be mastered by technical proficiency. If that’s not for you, totally fine.

The whole “special club” argument is bullshit, and I hope we grow out of it.

It's less an argument and more a general negative sentiment people hold towards Apple product advocates. You can look up the phenomenon of "green bubble discrimination." It's a vicious cycle in which the ecosystem works seamlessly for people who are a part of it, but Apple intentionally makes leaving that ecosystem difficult and intentionally draws attention to outsiders who interact with the people inside it. Apple products are also often associated with a higher price tag: they're status symbols as much as they are functional tools. People recognize a 2000 dollar MacBook instantly. Only a few people might recognize a comparably priced ThinkPad. In a lot of cases, they'll just assume the MacBook was expensive and the non-MacBook was cheap. And you might say, "yeah, but that's because of people, not because of Mac." But it would be a lie to say that Apple isn't a company intensely invested in brand recognition and that it doesn't know it actively profits from these perceptions.

monsieur_jean,

Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)

You are right and make good points. But you are not 99% of computer users. Just considering installing a Linux distro puts you in the top 1% most competent.

(Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in, my system is half broken because of infrequent updates. I could fix it, I just don't have the motivation to do so. Or the time. I'll probably just reinstall Mint.)

rwhitisissle,

Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)

Wow. Talk about coincidences…

you are not 99% of computer users. Just considering installing a Linux distro puts you in the top 1% most competent.

I'm a dumbass and if I can do it anyone can. But, yes, technology is a daunting thing to most people. Intuition and experience go far. That said, it's literally easier today than it ever has been. You put in the installation USB, click next a whole bunch, reboot, and you have a working machine. Is it sometimes more complicated than that, with BIOS/UEFI bullshit to deal with? Sure, but past that hurdle it's smooth sailing.

(Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in, my system is half broken because of infrequent updates. I could fix it, I just don't have the motivation to do so. Or the time. I'll probably just reinstall Mint.)

Ah, the joys of rolling release distros. EndeavourOS has been stable for me so far. I'm running it on an X1 Thinkpad. Generally works more reliably than my own vanilla Arch installs and more low-profile tiling window managers. I've found myself sticking to KDE Plasma for a DE because it's so consistent and has enough features to keep me happy without having to spend all my time fine-tuning my own UX, which I just don't care about. My realization has been that Arch distros are best suited for machines running integrated graphics and popular DEs, rather than ones with separate cards and more niche or highly customizable DEs. Prevents you from having to futz about with things like Optimus, with graphics drivers being the primary cause of headaches for that distro, per my experience. That said, I used to run an old Acer laptop with Arch and a tiling window manager called Qtile. Qtile was great, but every other update completely altered the logic and structure of how it read its config file, so the damn thing broke constantly. I'm like…just decide how you want the config to look and keep that. Or at least allow for backwards compatibility. But they didn't.

mvpts,

Unbelievable …

RickRussell_CA,

It’s not “inexplicable”.

DIMM sockets introduce significant limitations on maximum bandwidth. SoC (on-package) RAM offers huge benefits in bandwidth improvement and latency reduction. Memory bandwidth on the M2 Max is 400 GB/s, compared to a max of about 64 GB/s for DDR5 DIMMs.
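The arithmetic behind those numbers is simple; here's a sketch (the bus widths and transfer rates are assumed typical values, not Apple-published specs for every model):

```swift
// Peak theoretical bandwidth = (bus width in bytes) x (transfer rate).
// busBits / 8 gives bytes per transfer; dividing MT/s by 1000 yields GB/s.
func peakGBps(busBits: Double, mtPerSec: Double) -> Double {
    (busBits / 8.0) * mtPerSec / 1000.0
}

print(peakGBps(busBits: 512, mtPerSec: 6400)) // ~409.6 GB/s (M2 Max class: 512-bit LPDDR5-6400)
print(peakGBps(busBits: 64,  mtPerSec: 8000)) // 64.0 GB/s (one channel of very fast DDR5)
```

The gap comes almost entirely from bus width; a socketed DIMM interface can't practically be made that wide.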

It may not be optimizing for the compute problem that you have, and that’s fine. But it’s definitely optimizing for compute problems that Apple believes to be high priority for its customers.

Send_me_nude_girls, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

The best part is people complaining at those pointing out that 8 GB is laughably little. Ah, the sweet fanboys.

miss_brainfart, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

Pairing a chip this capable with just 8GB of shared memory is also just a waste of good silicon. Which makes the price all the more insulting to me.

Like, this is the equivalent of Usain Bolt losing one of his legs

smileyhead,

“His one leg is still more capable than regular person’s two legs”

miss_brainfart,

That is exactly what Apple would say, isn’t it

echodot,

The thing is even if that were true, which it isn’t, I’d still prefer him with two legs. Especially if I’m paying the amount of money I would normally pay for 50 legs.

Somewhat stretching the analogy there

miss_brainfart,

They could just sell appropriately specced computers and make absolute bank like they do anyway, but nooo, that would be too nice of them.

echodot,

I don't necessarily even begrudge them a profit margin on RAM. Sure it's kind of a scam, but I guess it's just the price you pay for convenience. If you want a better price, you upgrade the RAM yourself (assuming that was actually possible).

But the markup they have on RAM isn't reasonable; it's totally insane.

If you went to McDonald's and a cheeseburger was $0.99 while a cheeseburger with extra cheese was $2, you'd think something was up, but that's essentially what Apple is doing. The extra cheese does not cost $1.01; you are literally almost doubling the price for a subcomponent.

miss_brainfart,

Apple fans have a very different definition of the word convenience than I do, then.

It’s so annoying. They have the whole design industry by their balls with their great displays and perfect colour management in MacOS.

Putting more RAM in those models, or just cutting the lower-end models out entirely would do them no harm at all.

toothpicks,

It’s always nice to have as many legs as possible. I love legs

janguv,

Somewhat stretching the analogy there

Your analogy is looking a bit leggy at this point.

thingsiplay, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

I felt ripped off just reading the article. My recent PC build has 32 GB, is cheaper, and the upgrade to 64 GB (meaning an additional pair of 16 GB sticks) only costs me around 100 euros. It's nice that their devices are probably more efficient and need less RAM, as the iPhones have shown. But that does not mean the additional RAM units cost more to make. Apple chose to make them expensive.

echodot, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

Apple exec doesn't actually understand how computers work and thinks that might actually be a reasonable argument.

It doesn't matter how good your processor is: if you can only fit 8 GB into memory, it's going to be slow. The only way an 8 GB device would beat a 16 GB device would be if the 16 GB device had the world's slowest processor. Like something from 2005. Paging stuff out of RAM is the single slowest thing you can do short of loading from a hard drive.
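For a rough sense of scale (order-of-magnitude latency figures, assumed rather than measured on any particular machine):

```swift
// Approximate access latencies in nanoseconds (order of magnitude only).
let dramAccessNs = 100.0        // main memory hit
let nvmePageInNs = 100_000.0    // paging back in from a fast NVMe SSD

print("A page-in from swap is ~\(Int(nvmePageInNs / dramAccessNs))x slower than a RAM access")
// Prints: A page-in from swap is ~1000x slower than a RAM access
```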

LoganNineFingers,

Apple exec doesn't actually understand how computers work and thinks that might actually be a reasonable argument

I think a lot of Apple users fit this bill too, so if this is the messaging it doesn't matter much; a fair number of people will believe it.

narc0tic_bird, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.

Even if macOS were more lightweight than Windows - which might well be true, with all the bs processes running in Windows 11 especially - third-party multiplatform apps will use similar amounts of memory no matter the platform they run on. Even for simple use cases, 8 GB is on the limit (though it'll likely still be fine), as Electron apps tend to eat RAM for breakfast. Love it or hate it, people often (need to) use these memory-hogging apps like Teams or even Spotify; they are not native Swift apps.

I love my M1 Max MacBook Pro, but fuck right off with that bullshit, it’s straight up lying.

Kazumara,

Pied Piper middle out compression for your RAM

But seriously, it's so ridiculous, especially since he said it in an interview with a machine learning guy. Exactly the type of guy who needs a lot of RAM for his own processes working on his own data using his own programs, where the OS has no control over precision, access patterns or the data streaming architecture.

echodot,

Apple executives haven't actually been computer guys for years now. They're all sales and have no idea how computers work. They're constantly saying stupid things that make very little sense, but no one ever calls them on it because Apple.

abhibeckert, (edited )

Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.

As a Mac programmer I can give you a real answer… there are three major differences… but before I go into those, almost all integers in native Mac apps are 64 bit. 128 bit is probably more common than 32.

First of all, Mac software generally doesn't use garbage collection. It uses "Automatic Reference Counting", which is far more efficient. Back when computers had kilobytes of RAM, reference counting was the standard, with programmers painstakingly writing code to clear things from memory the moment they weren't needed anymore. The automatic version of that is the same, except the compiler writes the code for you… and it tends to do an even better job than a human, since it doesn't get sloppy.

Garbage collection, the norm in modern Windows and Linux code, frankly sucks. Code that, for example, reads a bunch of files on disk might store all of those files in RAM for ten seconds even if it only needs one of them in RAM at a time. That can burn 20GB of memory and push all of your other apps out into swap. Yuck.
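A minimal sketch of the difference being described, with a hypothetical FileBlob class (not an Apple API): under ARC the buffer is released the moment its last reference disappears, rather than whenever a collector next runs.

```swift
// Hypothetical example: ARC frees each buffer deterministically.
final class FileBlob {
    let bytes: [UInt8]
    init(sizeInBytes: Int) { bytes = [UInt8](repeating: 0, count: sizeInBytes) }
    deinit { print("freed \(bytes.count) bytes") } // runs as soon as the refcount hits 0
}

for _ in 1...3 {
    let blob = FileBlob(sizeInBytes: 1_000_000)
    _ = blob.bytes.count
    // blob's last reference dies here, so ARC frees the 1 MB *before* the
    // next iteration allocates: peak usage stays ~1 MB instead of ~3 MB.
}
```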

Second, swap, while it’s used less (due to reference counting), still isn’t a “last resort” on Macs. Rather it’s a best practice to use swap deliberately for memory that you know doesn’t need to be super fast. A toolbar icon for example… you map the file into swap and then allow the kernel to decide if it should be copied into RAM or not. Chances are the toolbar doesn’t change for minutes at a time or it might not even be visible on the screen at all - so even if you have several gigabytes of RAM available there’s a good chance the kernel will kick that icon out of RAM.
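In code, that pattern looks roughly like this (a sketch; the path is a placeholder and error handling is elided):

```swift
import Foundation

// .alwaysMapped asks for mmap-backed Data: pages only need to be resident
// while they're actually being touched, and the kernel can evict them
// under memory pressure instead of keeping them in RAM for the app.
let iconURL = URL(fileURLWithPath: "/tmp/toolbar-icon.png") // placeholder path
if let mapped = try? Data(contentsOf: iconURL, options: .alwaysMapped) {
    print("mapped \(mapped.count) bytes without copying them into RAM up front")
}
```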

And before you say “toolbar icons are tiny” - they’re not really. The tiny favicon for beehaw is 49kb as a compressed png… but to draw it quickly you might store it uncompressed in RAM. It’s 192px square and 32 bit color so 192 x 192 x 32 = 1.1MB of RAM for just one favicon. Multiply that by enough browser tabs and… Ouch. Which is why Mac software would commonly have the favicon as a png on disk, map the file into swap, and decompress the png every time it needs to be drawn (the window manager will keep a cache of the window in GPU memory anyway, so it won’t be redrawn often).

Third, modern Macs have really fast flash memory for swap. So fast it’s hard to actually measure it, talking single digit microseconds, which means you can read several thousand files off disk in the time it takes the LCD to refresh. If an app needs to read a hundred images off swap in order to draw to the screen… the user is not going to notice. It will be just as fast as if those images were in RAM.

Sure, we all run a few apps that are poorly written - e.g. Microsoft Teams - but that doesn’t matter if all your other software is efficient. Teams uses, what, 2GB? There will be plenty left for everything else.

Of course, some people need more than 8GB. But Apple does sell laptops with up to 128GB of RAM for those users.

rasensprenger, (edited )

Almost all programs use both 32-bit and 64-bit integers, sometimes even smaller ones, if possible. Being memory efficient is critical for performance, as L1 caches are still very small.

Garbage collection is a feature of programming languages, not an OS. Almost all native Linux software is written in systems programming languages like C, Rust or C++, none of which have a garbage collector.

Swap is used the same way on both Linux and Windows, but kicking toolbar items out of RAM is not actually a thing. It needs to be drawn to the screen every frame, so it (or a pixel buffer for the entire toolbar) will kick around in VRAM at the very least. A transfer from disk to VRAM can take hundreds of milliseconds, which would limit you to like 5 fps; no one retransfers images like that every frame.

Also, your icon is 1.1 Mbit, not 1.1 MB.
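Spelled out, since bits and bytes are easy to mix up:

```swift
let bits = 192 * 192 * 32   // 1_179_648 bits: the "1.1M" figure
let bytes = bits / 8        // 147_456 bytes, i.e. ~144 KiB uncompressed
print(bits, bytes)
```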

I have a Gentoo install that uses 50 MB of RAM for everything including its GUI. A web browser will still eat up gigabytes of RAM; the OS has literally no say in this.

narc0tic_bird,

My 32/16-bit integer example was just that: an example where one was half the size of the other. Take 128/64 or whatever; it doesn't matter, as it doesn't work like that (which was my point).

Software written in non-GC based languages runs on other operating systems as well.

I used MS Teams as an example, but it's hardly an exception when it comes to Electron/WebView/CEF apps. You have Spotify running, maybe a password manager (even 1Password uses Electron for its GUI nowadays), and don't forget about all the web apps you have open in the browser, like maybe Gmail and some Google Docs spreadsheet.

And sure, Macs have fast flash memory, but so do PC notebooks in this price range. A 990 Pro also doesn't set you back $400 per terabyte, but more like… $80, if even that. A fifth. Not sure where you got the idea that they're so fast it's hard to measure.

There are tests out there that clearly show why 8 GB are a complete joke on a $1600 machine.

So no, I still don’t buy it. I use a desktop Windows/Linux machine and a MacBook Pro (M1 Max) and the same workflows tend to use very similar amounts of memory (what a surprise /s).

chemicalwonka, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

8GB for this price in 2023 is a SCAM. All Apple devices are a SCAM. Many pay small fortunes for luxurious devices full of spyware and which they have absolutely no control over. It’s insane. They like to be chained in their golden shackles.

trevron,

Agree with you on the price, disagree with the sentiment. Unless you're comparing to a Linux machine it's a bad take. You can do plenty with macOS, and it isn't constantly trying to reinstall fucking OneDrive or hijack my search bar or reset my privacy settings after an update.

But yeah, they can fuck off with the prices.

chemicalwonka,

I don’t trust MacOS, its proprietary code obviously hides evil spying and control functions over the user. Apple has always been an enemy of the free software community because it is not in favor of its loyal customers but only its greedy shareholders. There is no balance, Apple has always adopted anti-competitive measures. That’s just to say the least.

PenguinTD,

It took EU legislation to force them to adopt a USB-C charging port. Their consumer base is their cash cow.

chemicalwonka,

Exactly lol

dan,

And even though they have USB-C ports now, they're not even proper USB 3 ports, as the lower-end models only support USB 2 speeds (480 Mbps max)!

chemicalwonka,

USB 2.0 in 2023 LOL LOL LOL LOL

dan,

Lightning was also 480 Mbps, so I wonder if they just changed the port but kept most of the internals the same.

echodot,

They claim that the die they use for the base model's chip doesn't support USB 3 speeds but the die they use for the Pro's chip does.

Which is probably true, but they also made the chip so it’s not much of a defence.

abhibeckert, (edited )

they also made the chip so it’s not much of a defence

It’s a pretty old chip though. They shipped it over a year ago and even that was mostly just an upgrade from LPDDR4 to LPDDR5. Which is a substantial upgrade, real world performance wise, but most of the engineering work would’ve been done by whoever makes the memory - not Apple’s own chip design team who presumably were working on something else (I’d guess desktop/laptop chips, and those certainly do have USB-3).

Apple certainly could have included USB 3 support in those chips… but three years ago there wasn't any pressing reason to do so, and this year they've added support to the models with the most expensive camera. And anyone who cares about data transfer speeds will buy the one with the best camera.

PenguinTD,

yeah, I forgot about that, it’s a USB-C type port.

echodot,

It's not even USB 3; it's USB 2 delivered via USB-C. Because that's something everybody wants, isn't it: slow transfers on a modern connector that should be faster, and indeed is faster on every other budget Android phone.

abhibeckert, (edited )

Apple has always been an enemy of the free software community

Apple is one of the largest contributors to open source software in the world, and they've been a major contributor to open source since the early 1980s. Yes, they have closed source software too… but it's all built on an open foundation and they give a lot back to the open source community.

LLVM, for example, was a small project nobody had ever heard of in 2005, when Apple hired the university student who created it and gave him an essentially unlimited budget to hire a team of more people. Fast forward almost two decades and it's by far the best compiler in the world, used by both modern languages (Rust/Swift/etc.) and old languages (C, JavaScript, Fortran…), and it's still not controlled in any way by Apple. The uni student they hired was Chris Lattner; he is still president of LLVM now even though he's moved on (currently CEO of an AI startup called Modular AI).

chemicalwonka,

Well, look at the annual contribution that Apple makes to the BSD team and you'll see that Apple uses a lot of open source software in its products but makes minimal financial contributions. Even more so for a company of this size. Apple only "donates" when it is in its interest that such software is ready for it to use.

chemicalwonka,

Just an example: If Apple simply wants to turn your iPhone into a brick, it can do that and there is no one who can reverse it.

abhibeckert,

Um. No they can’t. The class action lawyers would have a field day with that.

Honytawk,

They already do so with apps.

If Apple deems the app too old, then it won’t be compatible and is as useful as a brick.

Stormyfemme,

You know, I have software on my PC old enough that I can't run it even in compatibility mode; I'd need to spin up a VM to run it, or a pseudo-VM like DOSBox. It's not unheard of; it's not even uncommon.

chemicalwonka,

they have the power to do it, is what I’m saying

SkepticalButOpenMinded,

That’s too simplistic. For example, the entry level M1 MacBook Air is hands down one of the best value laptops. It’s very hard to find anything nearly as good for the price.

On the high end, yeah, you can save $250-400 buying a similarly specced HP Envy or Acer Swift or something. These are totally respectable, with more ports, but they have two-thirds the battery life, worse displays, and tons of bloatware. Does that make them "not a scam"?

(I’m actually not sure what “spyware” you’re referring to, especially compared to Windows and Chromebooks.)

chemicalwonka,

I'm not referring to Windows or ChromeOS (which are full of spyware too). The first generation of M1 Macs had a reasonably more "accessible" price precisely to encourage users to migrate to ARM technology, and consequently also encourage developers to port their software, not because Apple was generous. Far from it. Everything Apple does in the short or long term is to benefit itself.

Not to mention it is known that Apple limits both hardware and software on its products to force consumers to pay the "Apple Idiot Tax". There is no freedom whatsoever in these products; they are true gilded cages. Thank you, but I don't need it. Software and hardware freedom are more important.

SkepticalButOpenMinded,

I didn’t claim that Apple is doing anything to be “generous”. That seems like it’s moving the goal posts. Say, are other PC manufacturers doing things out of generosity? Which ones?

Even the M2 and M3 Macs are a good value if you want the things they’re good at. For just a few hundred more, no other machine has the thermal management or battery life. Very few have the same build quality or displays. If you’re using it for real professional work, even just hours of typing and reading, paying a few extra hundred over the course of years for these features is hardly a “scam”.

You didn’t elaborate on your “spyware” claim. Was that a lie? And now you claim it’s “known” that Apple limits hardware and software. Can you elaborate?

chemicalwonka,

MacBooks do have excellent screens, software integration and everything else, that’s a fact and I don’t take that away from Apple. But the problem is that it’s not worth paying for this in exchange for a system that is completely linked to your Apple ID, tracking all your behavior for advertising purposes and whatever else Apple decides. Privacy and freedom are worth more. If you can’t check the source code you can’t trust what Apple says, they can lie for their own interests. Have you ever read Apple’s privacy policy regarding Apple ID, for example? If not, I recommend it.

SkepticalButOpenMinded,

I think that decision makes sense.

What you said got me worried, so I looked into the claim that it is “tracking all your behavior for advertising purposes and whatever else Apple decides”. That’s a convincing concern, and you’ve changed my mind on this. I don’t see any evidence that they’re doing anything close to this level of tracking — the main thing they seem to track is your Mac App Store usage — but they may have the potential to do so in the enshittified future. That gives me pause.

Stormyfemme,

Apple has repeatedly stressed that they're privacy focused in the past; while a major departure from that could absolutely happen, it feels a bit like borrowing trouble to assume it will happen soon. Google is an advertising company first, Microsoft is just a mess, but Apple is a luxury hardware producer; they have minimal reason to damage their reputation in a way that would make those sorts of consumers upset.

Please note that I'm not saying it's impossible, just unlikely in the near future.

SkepticalButOpenMinded,

That assessment sounds right. I think we just need to stay vigilant as consumers. We have defeasible reason to trust Apple right now. But we’ve seen, especially recently, what happens when we let corporations take advantage of that hard earned trust for short term gain.

lemillionsocks,

When compared to other professional-level laptops, MacBooks do put up a good fight. They have really high quality displays, which accounts for some of the cost, and of course compared to a commercial-grade laptop like a ThinkPad the prices get a lot closer (when the ThinkPads aren't on sale, as they frequently are).

That said, even then the M1 MacBook is over a thousand dollars after tax, and that gets you just 256 GB of storage and 8 GB of RAM. They're annoyingly not as easy to find as Intel offerings, but you can find modern Ryzen laptops that still give you into the teens of hours of screen-on time for less, with way more RAM and storage space. The M1 is still the better chip in terms of power per watt and battery life overall, but getting the RAM and storage up to spec can make it $700 more than a consumer-grade Ryzen.

SkepticalButOpenMinded,

I agree with that. I think there are cheaper laptops, where you can spend less to get less. Not everyone needs a metal body and all day battery life.

java,

They have really high quality displays, which accounts for some of the cost, and of course compared to a commercial-grade laptop like a ThinkPad

Is that important for a professional laptop? I mean, if you use it for work every day, you probably want a screen that is at least 27 inches, preferably two. It should be capable of adjusting its height for better ergonomics.

lemillionsocks,

If you work in something that involves graphic design or imaging, absolutely.

abhibeckert, (edited )

One of the features they highlighted is that the built-in display has very similar specs to their 6K 32" professional display (which, by the way, costs more than this laptop). So when you're not working at your desk you'll still have a great display (and why are you buying a laptop unless you occasionally work away from your desk?)

  • Both have a peak brightness of 1600 nits (a Dell XPS will only do ~600 nits, and that's brighter than most laptops).
  • Both have 100% P3 color gamut (Dell XPS only gets to 90%, so it just can't display some standard colors)
  • Even though it's an LCD, black levels are better than a lot of OLED laptops
  • Contrast is also excellent
  • 120 Hz refresh rate, which is better than their desktop display (that only runs at 60 Hz, same as the Dell XPS)
  • 245 dpi (again, slightly better than 218 dpi on the desktop display, although you sit further away from a desktop… Dell XPS is 169 dpi)

I love Dell displays, I’ve got two on my desk. But even the Dell displays that cost thousands of dollars are not as good as Apple displays.

echodot,

The bloatware really isn't an argument, because it takes all of 30 seconds to uninstall it all with a script you get off GitHub. Yeah it's annoying and it shouldn't be there, but it's not exactly going to alter my purchase decision.

The M1 is OK value for money, but the problem is that invariably you'll want to do more and more complex things over the lifetime of the device (if only because basic software has become more demanding); while it might be fine at first, it tends to get in the way 4 or 5 years down the line. You can pay ever so slightly more money and future-proof your device.

But I suppose if you’re buying Apple you’re probably going to buy a new device every year anyway. Never understood the mentality personally.

My cousin gets the new iPhone every single year, and he was up for it at midnight as well. I don't understand why, because it's not better in any noticeable sense than it was last year; it's got a good screen and a nice camera, but so did the model 3 years ago. Apple customers are just weird.

janguv,

But I suppose if you’re buying Apple you’re probably going to buy a new device every year anyway. Never understood the mentality personally.

My cousin gets the new iPhone every single year, and he was up for it at midnight as well. I don't understand why, because it's not better in any noticeable sense than it was last year; it's got a good screen and a nice camera, but so did the model 3 years ago. Apple customers are just weird.

I think you’re basing your general estimation of the Apple customer on the iPhone customer a bit too heavily. E.g., I have never had an iPhone and wouldn’t ever consider buying one, considering how locked down and overpriced it is, and how competitive Android is as an alternative OS.

Meanwhile, I’ve been on MacOS for something like 7 or so years and cannot look back, for everyday computing needs. I have to use Windows occasionally on work machines and I cannot emphasise enough how much of an absolute chore it is. Endless errors, inconsistent UX, slow (even on good hardware), etc. It is by contrast just a painful experience at this point.

And one of the reasons people buy MacBooks, myself included, is to have longevity, not to refresh it after a year (that’s insane). It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later. My partner has a MacBook bought in 2014 and it still handles everyday tasks very well.

echodot,

It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later.

I think you missed my point.

You want to keep laptops for ages regardless of what OS they run (really not sure how that would have any bearing on spec fall-off), but the MacBook M1 is only competitive now; it won't be competitive in 4 to 5 years. The chip is good for its power consumption, but it isn't a particularly high-performance chip in terms of raw numbers. Yet the laptop is priced as if it were.

There's no such thing as a Windows laptop; you just buy a laptop and that's the specs you get, so I'm not quite sure what you're comparing the MacBook to.

java,

All Apple devices are a SCAM.

True. Sometimes I look at the specs and prices of Apple devices while visiting large electronics stores. I don't understand how people who aren't rich can rationalize buying an Apple device. While it's true that Windows has become increasingly plagued by invasive ads recently, and macOS seems like the only alternative for many, this issue is relatively recent. On the other hand, MacBooks have been overpriced for years.

echodot,

I bought a PC the other day and it only had 6 gigabytes of RAM, which is pathetic for what I paid for it, but there you go. The thing is, for a fraction of the price Apple is asking to upgrade to 16, I upgraded it to 32 gig.

I honestly think I could upgrade it to 64 and still come in under the Apple price. They're charging something like a 300% markup on commercially available RAM; it's ridiculous.

SuperSpruce,

On storage, the markup is about 2000%.

And on RAM if we compare to DDR5 (not totally fair because of how Apple’s unified memory works), it’s about 800% marked up.
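The rough math behind those figures (assumed street prices of about $25 per 8 GB of DDR5 and about $20 per terabyte of raw NAND, against upgrade steps of roughly $200 per 8 GB and the $400-per-terabyte figure quoted earlier in the thread):

```swift
// Markup = (price - part cost) / part cost, expressed as a percentage.
func markupPercent(applePrice: Double, partCost: Double) -> Double {
    (applePrice - partCost) / partCost * 100.0
}

print(markupPercent(applePrice: 200, partCost: 25)) // RAM: +8 GB step -> 700%
print(markupPercent(applePrice: 400, partCost: 20)) // storage: +1 TB -> 1900%
```

Slightly different assumed part costs land you on the ~800% and ~2000% figures above.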

Auzy, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

I switched back to Apple recently, but used to sell them.

One week before Boot Camp was released, I was selling Apple gear, and I showed a visiting sales manager how we got Windows running on the new Intel Mac Mini, and explained why this was great as a transition technology.

In front of customers, as I was explaining, he basically called me an idiot, and said “why would anyone want to run windows on a mac”.

A week or so later, Boot Camp was released, and he was back… He was now using the arguments I'd made a week earlier as a template for bragging about Boot Camp to us and explaining the benefits. No apology for any of the previous discussion.

They make decent products otherwise, and management doesn’t even need to act like wankers or be deceptive either

I'm only using Apple again now because Microsoft finally pushed me over the edge with Windows (literally, when they started hijacking my Chrome tabs EVERY bootup and opening Edge automatically), and because my Xbox Series X wouldn't even do Remote Play on Windows (their own OS).

Neil,

deleted_by_author

    Auzy,

    Absolutely agree. Unfortunately, Apple attracts the kind of idiots who think they know what they're talking about. When I was selling them, I had a customer tell another that Macs can't get viruses as I was talking to them.

    I used a lot of Linux in the past too (everything from playing Unreal Tournament on Gentoo in the 3dfx days to Ubuntu more recently), and unfortunately Linux also tended to attract the stuck-up crowd in the past.

    But, slowly, the Linux culture does seem to be changing. We still regularly see people argue about things like systemd vs init scripts (and anyone who has ever written an init script knows exactly what a pile of crap they are to write) and PulseAudio vs ALSA/OSS/ESounD/etc. (whereas any old-school user also remembers the pain of sound servers conflicting with each other). Linux does finally appear to be on the right path to improving things and interoperability, and the general common-sense crowd finally seems to be drowning out the others (new technologies like Wayland or Pipewire are no longer getting heavy blowback). It may also be because Linux developers these days tend to be a lot better at communicating the benefits (Compiz was another case where the benefits were well communicated).

    There are a lot of things, honestly, that Apple should be fixing.

    Cethin,

    Why did you decide to go back to Apple instead of giving Linux a try? It’s free so it literally would have cost nothing to try, and you could keep your other OS(s).

    JustARegularNerd,

    Good grief, I had a lady behind the counter try to push me onto the store's rewards card and she wasn't as pushy as this comment.

    Cethin,

    It’s pushy to ask why someone made a large purchase when there’s a free alternative they might not have tried that they may or may not like better? Unlike buying an Apple product, it takes little effort and no cost to just boot up Linux and give it a shot. Some people won’t like it and that’s fine. It’d be pushy to say you will like it better, which is not what I said.

    Auzy,

    I used to use Linux exclusively (I was actually the top poster on a few major Linux news sites, and my Linux project once got published in LinuxWorld Magazine).

    Whilst it has certainly gotten better, I still feel some parts of Linux need refining. Also, one thing both Microsoft and Apple do have is integration of mobile apps… I thought Apple could do both via Parallels (Android in Windows, iPhone in macOS), but it turns out Android in Windows on Parallels won't work.

    For the type of development I do, Windows and macOS are unfortunately still the best options too. If Linux had more seamless mobile app integration, I probably would have highly considered it, to be honest.

    Cethin,

    By mobile app integration, do you mean a connection between your mobile phone and your computer? KDE Connect is pretty good from my experience. It has more features than the Windows alternative at least (and I think there’s even a Windows version oddly enough).

    If you mean running a mobile app in the system, I have no experience with that.

    Auzy,

    Running mobile apps on the computer. It's really the one use case Apple does extremely well, and it's a pity, because Linux could actually do it well if distros sorted themselves out.

    asexualchangeling, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    For the last time, PC means personal computer, not Windows computer. If a Mac isn't a personal computer, then what is it?

    kusivittula,

    poop. it’s poop.

    janguv,

    True to the letter but not in practice. PC is synonymous with a computer running Windows, or Linux at a push. I don't know whether that's because of Microsoft's early market dominance or because Apple enjoys marketing itself as a totally different entity, or some combination of the two. But yeah, usage determines meaning more than what the individual words mean in a more literal sense.

    gnuplusmatt,

    Originally "PC" meant IBM PC or PC compatible (as in compatible with IBM without using their trademark). An IBM PC could have run DOS, Windows or even OS/2.

    asexualchangeling,

    It's funny to me because these days, with all the remote software reinstallation and asking why you want to close OneDrive and things, Windows isn't exactly very personal either.

    DrownedAxolotl,

    I agree with you, but you know how Apple operates, slapping a shiny new name on an already existing concept and making it sound premium.

    Syldon,

    You can install your own software on a personal computer; there is freedom of choice. Apple tells you what you can install on a Mac. archive.ph/ks4uO

    source link

    asexualchangeling,

    Well then, here you go, something more open to install on a modern Mac: asahilinux.org

    Syldon,

    Can you run that outside of a virtual box?

    Will this make Apple Silicon Macs a fully open platform?

    No, Apple still controls the boot process and, for example, the firmware that runs on the Secure Enclave Processor. However, no modern device is “fully open” - no usable computer exists today with completely open software and hardware (as much as some companies want to market themselves as such). What ends up changing is where you draw the line between closed parts and open parts. The line on Apple Silicon Macs is when the alternate kernel image is booted, while SEP firmware remains closed - which is quite similar to the line on standard PCs, where the UEFI firmware boots the OS loader, while the ME/PSP firmware remains closed. In fact, mainstream x86 platforms are arguably more intrusive because the proprietary UEFI firmware is allowed to steal the main CPU from the OS at any time via SMM interrupts, which is not the case on Apple Silicon Macs. This has real performance/stability implications; it’s not just a philosophical issue.

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    asexualchangeling,

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    100%, but I have family that uses Apple, and if I got an old Mac from any of them I wouldn't complain, I'd just quietly install a new OS.

    Syldon,

    But you still cannot do it outside of a virtual box, right?

    So you will still be at the mercy of Apple's OS.

    BarryZuckerkorn,

    Can you run that outside of a virtual box?

    It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.

    Asahi Linux is configured so that Apple’s firmware loads a Linux bootloader instead of booting MacOS.

    And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?

    Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPU/GPUs are so much cheaper than similar Intel/AMD/Nvidia chips. The expense comes from exorbitant prices for additional memory or storage, and the fact that they simply refuse to use cheaper display tech even in their cheapest laptops. The entry level laptop has a 13 inch 2560x1600 screen, which compares favorably to the highest end displays available on Thinkpads and Dells.

    If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.

    Syldon,

    It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.

    Except the boot process on a non-Apple PC is open software. You can create a custom BIOS revision. The firmware on an Apple computer is not open source. AFAIK you cannot create a custom BIOS on an Apple computer.

    Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPU/GPUs are so much cheaper than similar Intel/AMD/Nvidia chips.

    No idea what you mean by this. You cannot buy Apple's hardware on its own due to the restrictions Apple places on any purchases. Any hardware you can buy from Apple has a premium.

    Apple leans heavily on the display being good on a Mac, but imo it does not make up for the pricing. There is a good guide on better alternatives here.

    If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.

    I think you mean that Apple uses its own memory more effectively than a Windows PC does. Yes it does, but memory is not that expensive to make. To increase the storage space from 256GB to 512GB costs £200. I can buy a 2TB drive for that. More importantly, that drive can be replaced when it wears out. Apple quotes a replacement price that means you might as well buy a new computer.

    Apple computers are designed to make repairs expensive. They may have pseudo-adopted the right to repair, but let us see how that goes before believing the hype.

    BarryZuckerkorn,

    Except the boot process on a non apple PC is open software.

    For the most part, it isn't. The typical laptop you buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware. They all end up supporting the open UEFI standard, but the implementation is usually closed source. Having the ability to flash new firmware that is mostly open source but with closed-source binary blobs (like coreboot) or fully open source (like libreboot) gets closer to the hardware at startup, but still sits on proprietary implementations.

    There’s some movement to open source more and more of this process, but it’s not quite there yet. AMD has the OpenSIL project and has publicly committed to open sourcing a functional firmware for those chips by 2026.

    Asahi uses the open source m1n1 bootloader to load U-Boot, which loads desktop Linux bootloaders like GRUB (which generally expect UEFI compatibility), as described here:

    • The SecureROM inside the M1 SoC starts up on cold boot, and loads iBoot1 from NOR flash
    • iBoot1 reads the boot configuration in the internal SSD, validates the system boot policy, and chooses an “OS” to boot – for us, Asahi Linux / m1n1 will look like an OS partition to iBoot1.
    • iBoot2, which is the “OS loader” and needs to reside in the OS partition being booted to, loads firmware for internal devices, sets up the Apple Device Tree, and boots a Mach-O kernel (or in our case, m1n1).
    • m1n1 parses the ADT, sets up more devices and makes things Linux-like, sets up an FDT (Flattened Device Tree, the binary devicetree format), then boots U-Boot.
    • U-Boot, which will have drivers for the internal SSD, reads its configuration and the next stage, and provides UEFI services – including forwarding the devicetree from m1n1.
    • GRUB, booting as a standard UEFI application from a disk partition, works like GRUB on any PC. This is what allows distributions to manage kernels the way we are used to, with grub-mkconfig and /etc/default/grub and friends.
    • Finally, the Linux kernel is booted, with the devicetree that was passed all the way from m1n1 providing it with the information it needs to work.

    If you compare the role of iBoot (proprietary Apple code) to the closed source firmware in the typical Dell/HP/Acer/Asus/Lenovo booting Linux, you’ll see that it’s basically just line drawing at a slightly later stage, where closed-source code hands off to open-source code. No matter how you slice it, it’s not virtualization, unless you want to take the position that most laptops can only run virtualized OSes.

    I think you mean that Apple uses its own memory more effectively than a Windows PC does.

    No, I mean that when you spec out a base model Macbook Air at $1,199 and compare to similarly specced Windows laptops, whose CPUs/GPUs can deliver comparable performance on benchmarks, and a similar quality display built into the laptop, the Macbook Air is usually cheaper. The Windows laptops tend to become cheaper when you’re comparing Apple to non-Apple at higher memory and storage (roughly 16GB/1TB), but the base model Macbooks do compare favorably on price.

    Syldon,

    The typical laptop you buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware.

    FTFY: The typical laptop MOST buy from the major manufacturers (Lenovo, HP, Dell) has closed-source firmware. Though I totally agree there are some PC suppliers with shitty practises. Where we disagree is whether having the firmware fixed by the hardware manufacturer leaves you in control of everything on the system. It is only when you have control of the base functionality of the system that you can say you are in charge. This may be too literal for you, but I just see that as a trust level you have in the manufacturer not to abuse that control.

    As for the comparison I disagree.

    This is a £1400 laptop from Scan vs a £1500 MacBook Air currently.

    17 inch screen (2560x1440) vs the 15.3 inch (2880x1864)

    16 GB memory vs 8 GB (upgrade to 16 GB = +£200)

    1 TB SSD vs 256 GB (upgrade to 1 TB = +£400)

    8 full cores/16 threads (AMD Ryzen 9 5900HX) vs an 8-core CPU without hyperthreading, where 4 of the cores are cheaper variants

    All of the PC components can be upgraded at the cost of the part plus labour. Everything on the Apple will cost the same as a new computer to replace, mainly because it is all soldered onto the board to make it harder to replace.

    BarryZuckerkorn,

    This is a £1400 laptop from Scan vs a £1500 MacBook Air currently.

    Ah, I see where some of the disconnect is. I’m comparing U.S. prices, where identical Apple hardware is significantly cheaper (that 15" Macbook Air starts at $1300 in the U.S., or £1058).

    And I can’t help but notice you’ve chosen a laptop with a worse screen (larger panel with lower resolution). Like I said, once you actually start looking at High DPI screens on laptops you’ll find that Apple’s prices are actually pretty cheap. 15 inch laptops with at least 2600 pixels of horizontal resolution generally start at higher prices. It’s fair to say you don’t need that kind of screen resolution, but the price for a device with those specs is going to be higher.

    The CPU benchmarks on that laptop’s CPU are also slightly behind the 15" Macbook Air, too, even held back by not having fans for managing thermals.

    There’s a huge market for new computers that have lower prices and lower performance than Apple’s cheapest models. That doesn’t mean that Apple’s cheapest models are a bad price for what they are, as Dell and Lenovo have plenty of models that are roughly around Apple’s price range, unless and until you start adding memory and storage. Thus, the backwards engineered pricing formula is that it’s a pretty low price for the CPU/GPU, and a very high price for the Storage/Memory.

    All of the PC components can be upgraded at the cost of the part + labour.

    Well, that’s becoming less common. Lots of motherboards are now relying on soldered RAM, and a few have started relying on soldered SSDs, too.

    Syldon,

    Amazon has this one for $1200. I would still pay the extra for the features over an Apple Mac.

    I can’t help but notice you’ve chosen a laptop with a worse screen (larger panel with lower resolution).

    I would choose a larger screen over that marginal difference in DPI every day of the week. People game on lower-resolution TV screens all the time because bigger is better.

    The CPU benchmarks on that laptop’s CPU are also slightly behind the 15" Macbook Air, too, even held back by not having fans for managing thermals.

    You cannot compare an app that runs on two different OSes. That is just plain silly. Cinebench only tests one feature of a system: the CPU rendering a graphic. Apple is built around displaying graphics. A PC is a lot more versatile. There is more to a system than one component. Let's see you run some raytracing benchmarks on that system.

    Well, that’s becoming less common. Lots of motherboards are now relying on soldered RAM

    I wouldn't buy one. You will always find some idiotic willing victim. In the future, though, RAM is moving into the CPU package, but that will be done for speed gains. Until then only a bloody fool would buy into this.

    An Apple system has one major benefit over a PC system: battery life. Other than that I would not recommend one, and even then I would give stern warnings about repair costs.

    BarryZuckerkorn,

    I would choose a larger screen over that marginal difference in dpi every day of the week.

    Yes, but you’re not addressing my point that the price for the hardware isn’t actually bad, and that people who complain would often just prefer to buy hardware with lower specs for a lower price.

    The simple fact is that if you were to try to build a MacBook killer and try to compete on Apple’s own turf by matching specs, you’d find that the entry level Apple devices are basically the same price as other laptops you could configure with similar specs, because Apple’s baseline/entry level has a pretty powerful CPU/GPU and high resolution displays. So the appropriate response is not that they overcharge for what they give, but that they make choices that are more expensive for the consumer, which is a subtle difference that I’ve been trying to explain throughout this thread.

    You cannot compare an app that runs on two different OSes.

    Why not? Half of the software I use is available on both Linux and MacOS, and frankly a substantial amount of what most people do is in browser anyway. If the software runs better on one device over another, that’s a real world difference that can be measured. If you’d prefer to use Passmark or whatever other benchmark you’d like you use, you’ll still see be able to compare specific CPUs.

    Syldon,

    you’re not addressing my point that the price for the hardware isn’t actually bad,

    I disagree. It is not only that the hardware is cheaper and lower spec with the exception of the CPU; the design is geared around making upgrades and repairs near impossible or unfeasible. Software has much more support on Windows. Video editing has been Apple's bread and butter for many years now, but Windows has caught up due to improvements in hardware and software. In my mind this negates the case for buying a Mac currently, but I can easily see it was a good buy in the past.

    The outlier is that Macs are good on battery life, so there is a niche market for which they are an exceptionally good return on investment.

    Why not? Half of the software I use is available on both Linux and MacOS, and frankly a substantial amount of what most people do is in browser anyway. If the software runs better on one device over another, that’s a real world difference that can be measured. If you’d prefer to use Passmark or whatever other benchmark you’d like you use, you’ll still see be able to compare specific CPUs.

    Because you cannot use Cinebench unless you are comparing the same system setup. Comparing two OSes is just stupid and cherry-picking. Apple has a very trimmed-down OS compared to the complexity of Windows. Apple's OS dumps the need for legacy code with a closed system designed for specific hardware; Windows still caters for code written for DX-era CPUs under x86 architecture. That, as well as many other reasons, is why not. I noticed you ignored my offer of comparing back-to-back raytracing results, and now fail to even mention it.

    You are obviously enamoured with the Apple model; I am not. There really is nothing that you could say that would convince me otherwise. I will wish you good day, and hope you agree to disagree.

    schnurrito,

    I think the history is such that a “PC” is a computer compatible with the “IBM PC” which Macs were historically not and modern ones aren’t either.

    But I still like “Windows computer”, we can abbreviate that to “WC”.

    tal,

    Another complication was that DOS-using machines weren’t always running Windows at one point in time.

    sanzky,

    I doubt it's the last time. Also, while "PC" means personal computer, it was a very specific brand name by IBM, not a general-purpose term. Their computers (and clones later) became synonymous with x86 Windows machines.

    Even apple themselves have always distanced themselves from the term (I’m a Mac, and I’m a PC…).

    happyhippo, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    My 16GB XPS running Linux almost fills up entirely when running several Docker containers, IDEA, Firefox, Teams, Postman and a few other, smaller apps, but it still fits, and I can work with it (tho I can't wait to get my 32GB Framework laptop).

    Now gimme an 8GB MBP and I'll show you that I wouldn't get shit done on that configuration. And at $1,600 it's just crazy.

    echodot,

    11 gigabytes of that is probably being used up by Teams, it’s a memory hog.

    coffeejunky,

    Teams is the absolute worst, and it's the only app that can sometimes crash my Linux machine.

    SNFi,

    It crashes on Apple too.

    coffeejunky,

    It sucks that better alternatives like Slack went from being the big player to a small player only because of Microsoft's power over businesses. If Teams had won because it was actually a better product I'd be fine with that. But Teams is just a pile of shit we are forced to use.

    soulfirethewolf, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    I still hate that they killed the mid-range model. Your options are the lower-end MacBook Air with no fan, or the higher-end MacBook Pro. There is no in-between.

    I absolutely love the snappiness of the M1 chip in my current 2020 MBP, and how much more efficient ARM is compared to x86, but it seems really hard to justify paying an extra $300 in the future.

    I really just wish they would bring back the original MacBook (with no suffixes at the end)

    soulfirethewolf,

    I kind of want to go for the Framework laptop, but I still do like ARM, and given that I want to do more stuff around machine learning in the future (running large language models is already kind of difficult with only 8 gigabytes of RAM), it at least kind of runs on ARM. On my basement PC, it will barely do anything.

    tal,

    There are some external GPUs that can be USB-attached. Dunno about the Mac. There's a latency hit, but it's probably less significant for current LLM use than for games, as you don't have a lot of data being pushed over the bus once the model is up.

    soulfirethewolf,

    Those don't work on Apple silicon Macs, sadly.

    monobot, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    My GeoTIFFs do not agree.

    slowbyrne, to technology in Apple exec defends 8GB $1,599 MacBook Pro, claims it's like 16GB on a PC

    MacRumors just released an article talking about how the 8GB is a bottleneck in the new M3 models lol

    lenguen, to technology in Lawyer guilty of arrogance after ignoring tech support

    Expertise in one area of knowledge does not equate to omniscience.

    wjrii,

    Having read some software developers' attempts at writing, or, god help me, contract drafting, I agree completely.

    Doctors are the worst about this though, and everybody WebMD'ing themselves before coming in is simply collective karma.
