Overzeetop

@Overzeetop@beehaw.org


Overzeetop,

That was a nice term report by a precocious 5th grader or, more likely, an AI generated article.

Overzeetop,

“live and work and build and pay in that world on an ongoing basis”

There, that’s more what they’re envisioning.

Overzeetop,

They’d better not be playing all my free games before I get to them.

Overzeetop,

Surge pricing to level demand is a potentially valid strategy when you’re trying to optimize your off-peak manpower or have limited production capacity. Surge pricing to increase profits is going to be detrimental to their business.

Overzeetop,

Fascinating interview around the technology. As someone who is generally skeptical of wild “zero carbon” claims, this was interesting enough that I would definitely go out of my way to see the process in person, just to learn more about it.

Overzeetop,

Funny - I’m going to be in Boston in three weeks. Unfortunately I think all my time is booked from the time my plane touches down to the time I head back to BOS.

Overzeetop,

T-Mo’s general coverage outside of city centers and interstates is trash (they’re all pretty bad, but T-Mo is very binary). I’d get it over Xfinity, but it’s not even offered in my major university town due to coverage limitations. And it’s not like there aren’t big pipes nearby - the university consumes more than 100TB of data traffic a day; its Netflix traffic alone was so large just 3 years ago that it was on the edge of getting a co-located Netflix rack on campus.

Overzeetop,

Just to be clear: stock buybacks are generally not meant to increase revenue or dividends, but to increase the stock price by creating artificial scarcity. Potential dividend increases from corporate stock ownership are a shell game, as the corporation receives the dividend and it is simply added to the cash on hand and book value.

Nearly all growth in stocks is capital based. Every corporation wants to increase revenue and profits because that forms the basis for valuation. Yes, there are young companies who are “forward looking” and trading on factors based on revenue and not net income, but most of the market is based on a net income multiplier (which varies by industry).

As much pressure as the boomers (and soon GenXers) will place on revenue, it will never be enough to support the lifestyle to which they have become accustomed. Rather, they will be selling capital to fund their retirements. This will lead to long-term stagnation of stock prices (in the best scenario) or a collapse of market value as retirees try to sell their stock for the next 9-month round-the-world cruise. It’s a self-reinforcing loop, too: the more people sell, the lower the value of their stock, requiring them to sell even more shares to get to a fixed value in cash. I think of this as just one more Fuck You (added to the collapse of public health and public retirement subsidies) the boomers will be handing Millennials and GenZ. I thought you might catch a break with housing - that home values would drop with the glut of units coming to market as the boomers all move into retirement homes. Alas, corporations have found they can buy those units and rent them back at exorbitant rates, so they’ll be tag-teaming the boomers in fucking over the youth of today.

Overzeetop,

And, IME, a lot of corporations are serving content through third-party (or at least non-native) servers, which means that any blocker which touches any of those servers breaks content completely. I’ve experienced major travel, banking, and retail sites which simply don’t work unless most blacklisted sites are allowed. That means either turning blocking off for that main site entirely, or spending an hour testing every one of their 30 off-site connections to see which ones break. I don’t have that kind of bullshit time, and the rest of my family doesn’t have the patience or skill to do that troubleshooting. PiHole turned out to be multiple hours a week of frustration so I gave up - I already have a full-time job and a full slate of hobbies. In-browser blockers are, at least, easier to toggle on and off.

Google's Chrome Browser Analyzing Your Browsing History with so-called "Privacy Sandbox" Feature

For nearly two years now, Google has been gradually rolling out a feature to all Chrome users that analyzes their browsing history within the browser itself. This feature aims to replace third-party cookies and individual tracking by categorizing you into an interest category and sharing that category with advertisers. It’s...

Overzeetop,

I’m about 99% sure that this is exactly what credit card companies do.

Overzeetop,

Whether you take the stick out of your dog’s mouth or you tell the dog to give it to you, you’re still taking the stick. Breaking up and selling off IP is exceedingly commonplace.

We’ve already established they are whores; Tencent has simply been unsuccessful, so far, in negotiating their price.

Editing memories, spying on our bodies, normalising weird goggles: Apple’s new Vision Pro has big ambitions (theconversation.com)

Apple Vision Pro is a mixed-reality headset – which the company hopes is a “revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” – that begins shipping to the public (in the United States)....

Overzeetop,

Wait until people find out what their smart watches are already cataloging. 🙄

Microsoft stole my Chrome tabs, and it wants yours, too (www.theverge.com)

Last week, I turned on my PC, installed a Windows update, and rebooted to find Microsoft Edge automatically open with the Chrome tabs I was working on before the update. I don’t use Microsoft Edge regularly, and I have Google Chrome set as my default browser. Bleary-eyed at 9AM, it took me a moment to realize that Microsoft...

Overzeetop,

I’m not a programmer. Open source garbage isn’t in my control either.

Overzeetop,

I sat in a room of probably 400 engineers last spring and they all laughed and jeered when the presenter asked if AI could replace them. With the right framework and dataset, ML almost certainly could replace about 2/3 of the people there; I know the work they do (I’m one of them), and the bulk of my time is spent recreating documentation, using 2-3 computer programs to facilitate calculations, and looking up and applying manufacturers’ data to the situation. Mine is an industry of high repeatability, and the human judgement part is, at most, 10% of the job.

Here’s the real problem. The people who will be fully automatable are those with less than 10 years’ experience. They’re the ones doing the day-to-day layout and design, and their work is monitored, guided, and checked by an experienced senior engineer to catch their mistakes. Replacing all of those people with AI will save a ton of money, right up until all of the senior engineers retire. In a system which maximizes corporate/partner profit, that will come at the expense of training the future senior engineers until, at some point, there won’t be any (/enough), and yet a substantial amount of oversight will still be needed. Unfortunately, ML is based on human learning, and replacing the “learning” stage of human practitioners with machines is going to eventually create a gap in qualified human oversight. That may not matter too much for marketing art departments, but for structural engineers it’s going to result in a safety or reliability issue for society as a whole. And since failures in my profession only occur in marginal situations (high loads - wind, snow, rain, mass gatherings), my suspicion is that it will be decades before we really find out that we’ve been whistling past the graveyard.

Overzeetop,

The future is already here. This will sound like some old man yelling at clouds, but the tools available for advanced structural design (automatic environmental loading, finite element modeling) are used by young engineers as magical black boxes which spit out answers. That’s little different than 30 years ago, when the generation before me would complain that calculators, unlike slide rules, were so disconnected from the problem that you could put in two numbers, hit the wrong operation, and get a nonsensical answer but believe it to be correct because the calculator told you so.

This evolution is no different; it’s just that the process of design (whether programming or structures or medical evaluation) will be further along before someone realizes that everything that’s being offered is utter shit. I’m actually excited about the prospect of AI/ML, but it still needs to be handled like a tool. Modern machinery can do amazing things faster, and with higher precision, than hand tools - but when things go sideways it can also destroy things much quicker and with far greater damage.

Overzeetop,

I’m assuming you’re being facetious. If not…well, you’re on the cutting edge of MBA learning.

There are still some things that just don’t get into books, or drawings, or written content. It’s one of the drawbacks humans have - we keep some things in our brains that just never make it to paper. I say this as someone who has encountered conditions in the field that have no literature on the effect. In the niches and corners of any practical field there are just a few people who do certain types of work, and some of them never write down their experiences. It’s frustrating as a human doing the work, but it would not necessarily be so to an ML assistant unless there is a new ability to understand and identify where solutions don’t exist and go perform expansive research to extend the knowledge. More importantly, it needs the operators holding the purse to approve that expenditure, trusting that the ML output is correct and not asking it to extrapolate in lieu of testing. Will AI/ML be there in 20 years to pick up the slack, put its digital foot down stubbornly, and point out that lives are at risk? Even as a proponent of ML/AI, I’m not convinced that kind of output is likely - or even desired by the owners and users of the technology.

I think AI/ML can reduce errors and save lives. I also think it is limited in the scope of risk assessment where there are no documented conditions on which to extrapolate failure mechanisms. Heck, humans are bad at that, too - but maybe more cautious/less confident and aware of such caution/confidence. At least for the foreseeable future.

Thoughts on BOOX Tab Ultra C? (shop.boox.com)

Does anyone here have a BOOX e-paper tablet? I’m a big fan of e-paper devices—I love my Pebble smartwatch, Kindle Paperwhite, and Light Phone II. I’ve been eyeing the Tab Ultra C for quite a while, and I am considering the pros and cons. Mostly, I intend to use it for browsing the web and maybe some light note taking and...

Overzeetop,

The use of the term backlight is common, but even Amazon refers to it as a “front light” (it’s edge-lit, of course, as you say). Bit like using a floppy disk as the “save” icon, or calling wireless networks “wi-fi” despite having nothing to do with “fidelity”. We all know what it means.

Overzeetop,

I get it…it’s hard to say something you know is incorrect - accuracy of language going to shit and other modern problems - and I feel that. I think of it as more of an “internal light that illuminates the device interface.” In dealing with non-technical people on a daily basis, I find it’s much more productive to allow/ignore this sort of colloquialism unless it’s the specific thing I’m trying to fix/undo/explain. I barely even flinch now when people refer to the large box on their desk as the “CPU.” ;-)

Overzeetop,

It’s not that at all. I keep tabs on several far-flung friends and relatives on FB. Zero spam. TBF, I make it a point to click on ads for things I don’t need but don’t mind seeing (rockets, 3D printers, vocal jazz stuff). Of course, I’m on IPv4 with my whole household, so if I search for hiking shoes, everyone in the house gets FB ads for hiking shoes. I got a bunch of ads for Merino Wool outerwear in mid December. My wife was kind enough to get me several base layers for Christmas. There is no good and bad, just poor internet management and hygiene (IMHO).

Overzeetop,

So - honest advice. I remember magazines - some with more ads than articles. You just flip past them. It’s different now because websites know your scrolling rates and FB wants you to engage. It’s why I actually click through a couple of ads every so often. Merrell? Great shoes. Osprey? Yeah, nice backpacks. Anycubic? I’ll probably want a new resin printer some day. Sure, with 2-3 clicks I can - and do - switch to the chronological “friends” feed that is exclusively friend-posted content with some paid ads (not engagement content) to pay the bills.

As for private browsing, I hear you. But, also, your IP is part of your online fingerprint. You don’t need cookies or tracking pixels from previous sessions active for FB to know - through the aggregated data they buy (possibly even from your Internet provider) - what you’re looking at.

[Disclaimer - this next bit is anecdotal; no data to support the following theory.] I had a friend who suddenly was getting a ton of MAGA and alien-conspiracy ads on his FB page. He doesn’t track his outgoing IP, but I suspect he was just re-assigned an outgoing IP that had previously been used by someone else (his locality is very red, politically, though he is not). I know I’ve had my current IPv4 address for at least 7 months. It’s one reason that my wife, daughter and I all get intertwined ads on what we search.

To attempt to get around this, one option is a VPN. Add to that a separate private browser (it’s how I did my online Christmas shopping, and it’s kind of a pain). You’re still in danger of machine fingerprinting, but it’s usually too much hassle for just marketing to wind its way back to you.

Overzeetop,

in the code that was talking to the ink cartridge.

So the flaw is in the printer or driver, and HP has just admitted to shipping an insecure, nay negligently dangerous, product to consumers?

Overzeetop,

This is like people abandoning a stick shift and rigid frames/chassis for a modern automatic/CVT and unibody with crumple zones. The latter are complicated, expensive, and inefficient - but substantially more forgiving to the average driver who merely wants to get from A to B with the minimum amount of effort. Linux will be there for people who choose to dedicate hundreds of hours a year to the hobby of computers. For everyone else who doesn’t want to open their laptop to replace the keyboard, update their wireless card, clean or replace the system fans, and solder in a new power connector, buying a new laptop with the extra horsepower (to overcome the code creep) will offer them all those things at a price cheaper than even taking the old one to the corner repair shop to get the mechanical failures fixed.

Overzeetop,

Many Linux distributions have very good user experience for beginners

I 100% agree. The issue isn’t beginners, it’s people who already know Windows, and only Windows. They’d be just as lost switching to OSX. Kids pick up Chromebooks easily, but most adults - the ones who have 5-8 year old machines with only 8GB - are completely lost. I tried to get my mother onto LibreOffice (okay, OpenOffice…it’s been that long) and it lasted less than a week and one panicked old-lady newsletter deadline. She was utterly lost, and no amount of help would get her out. To be fair, she gets lost when her phone updates to the newest major OS revision.

I choose one of those niche distributions since I have advanced requirements.

I chose Windows for the same reason - a specialized industry where nearly all tools are written for Windows. I have $15k in software, $200k in setup and procedures, and $100-200k in training I would have to redo to switch to Linux, and while that was happening I would have zero income, so double those numbers for net losses. That’s assuming I could even find perfect analogs in the Linux world, which is unlikely, and that I was willing to receive and send non-standard files to all of my colleagues. I could consider Wine/Proton, but then I’d have to learn it or risk losing $2000/day plus the cost of tracking down repairs if anything (like an update) broke a critical piece of software. It’s simply not worth the financial risk.

Overzeetop,

Oh, it didn’t take Israel for the venture capitalists among them to want to harvest defense money. AI support has always had a contingent that intends to use it for military purposes. Same with remote vehicles. Same with robots. Same with, well, pretty much every advance in science - chemical agents, biological agents, lasers, space, nuclear power. Practically anything we create has a military use if you’re morally bankrupt or thirsty for power or money. Nearly every project starts out with “to serve mankind” as its goal.

Overzeetop,

Humans are so massively susceptible to gamification. It’s nice for providing motivation, but it ends up being like an addiction the way companies leverage it.

Overzeetop,

Yeah, I saw it the other day. I would bet a lot of money that the information was already stored. In fact, if their data group didn’t already store that information (link clicks to external websites) they should all be fired. This is just a way that you can find something that you’d previously looked at on facebook (which, oddly, may be the only site on the planet with a worse search function than reddit).

Overzeetop,

Since even inadvertent or unintentional copying can be punished as infringement, any takedown should be subject to the same level of scrutiny, and false claims should be awarded statutory damages matching infringement of registered works - collectible by the party whose non-infringing work was blocked, but actionable by any party who is denied access. So if you can’t get to content due to a DMCA takedown you can sue, but you cannot recover damages or expenses - if you win, the $175k(?) per fraudulent takedown is payable to the content owner. In that way, it’s recognized that an individual looking for content is injured by the takedown, but there is no financial incentive for takedown vigilantism.

Overzeetop,

You think the old bird is bad, you should see the rabid slobbering of neckbeards still obsessed with reddit, or masticating over Meta connecting Threads to ActivityPub. The impotent outrage is palpable.

Overzeetop,

AI does have little to do with it, but we can’t do housing the way people want housing. The land does not exist in sufficient quantity, in the desired areas, without other strings attached (such as private ownership). And it would still take a decade to build it all because there aren’t enough tradespeople in the places where you want the housing built.

Overzeetop,

“Y’all come here an’ look at 'dis 'fore I calculate it!”

Overzeetop,

I won’t argue that AI won’t solve the housing problem. And I agree that we can build a bunch of housing. But it won’t be where people want to live, or it won’t be affordable. I’ve got people in my town screaming for affordable housing. Even with subsidies it’s hard to get things going when the local municipality is practically bending over backwards. Why? Because it has to be on a bus line. It has to be within walking distance of X services. And all the land that fits those criteria is millions of dollars an acre. Even if you could find the land, the contractors can’t find enough qualified, reliable workers at premium rates to service their million-dollar home builds. I’m in the industry, and I don’t care how much “will power” you have; short of taking land through eminent domain and using it for free, you won’t have anyplace that meets any kind of criteria for livability. Hell, I could go buy 1000 acres just an hour down the road for $1M and put up 10,000 houses that only cost $50k each to build. Thing is, nobody is going to buy them. There is literally no demand, even for cheap housing, that takes an hour’s drive to get anywhere useful - and if you get closer in, you won’t find land that’s affordable. Heck, by the time I extended infrastructure to them or built it out, it would be 3-4 years before the first resident could move in, and that’s with zero delay on any governmental paperwork.

Overzeetop,

Severability is standard boilerplate. As is waiving of all liability (essentially in perpetuity, even if not stated as such), incidental and consequential damage, and indemnification of the writing party against any and all claims. This is a mole hill on the landscape of click-through licensing fuckery.

Overzeetop,

At this point, I think my Pavlovian reaction to Thursdays and grabbing the free games is the game now. I know full well I’ll never play these games.

Overzeetop,

Yeah, it’s high on my list. Along with a half dozen other AAAs from the last decade. I think Cyberpunk is next on my list, though there’s a Fallout languishing on my Deck I keep meaning to go back to.

Overzeetop,

Still cheaper than switching to Linux, even if the price were high.

Overzeetop,

lol- yup. It’s free in the same way that building your own house is free. Which is true if you already have the skills and free time. And even if you’re light on skills there’s tons of YouTube videos and forums where you can get any technical data you need. Right?

Overzeetop,

You can’t have a transaction without mining. Mining is the work done to solve a batch of transactions, so the exact cost of a transaction is easy to determine provided that you don’t include the cost of plant (buildings and IT to run the miners, though this is usually very minor compared to the actual calculation consumption). Each block contains (typically) between 3000 and 4000 transactions and is solved every 10 minutes. As of today, it takes 2.6GWh to solve a block, given the current number of miners (137TWh/yr per digiconomist.net/bitcoin-energy-consumption), which is 744kWh per transaction at 3500 transactions per block.

The cost of a Visa transaction is more difficult to pin down because there are people involved and other plant costs (buildings to house the people who work for Visa). The actual cost to process a Visa transaction, in direct transactional power usage, is trivial, because a Raspberry Pi can “process” hundreds of thousands of transactions a second locally - it’s literally a couple hundred bytes of login/query/reply data, and adding or subtracting from a ledger which is mirrored to distributed servers. Spread across a server with enough transactions to keep it busy, it’s probably a few hundred milliseconds on 1/8 of a 50W processor - call it 0.001Wh at the server, versus the ~744kWh per bitcoin transaction. If we say that there are 10 machines all doing the same virtual transaction on each physical transaction (incl. POS, backup, billing, etc.) and we figure a 5:1 ratio of total power to compute power (a/c, losses, memory, storage), then we’re all the way up to 0.00005kWh (0.05Wh, or 180 watt-seconds) per transaction. That means the overall cost for Visa to process your charge - roughly 1.5kWh all-in - versus the 0.00005kWh for the computers is about 30,000:1, the overhead of having humans involved in the process.
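Since the numbers above bounce between GWh, kWh and Wh, here’s a quick sanity check of the arithmetic in Python - every constant is just a figure quoted above, not an independent measurement:

```python
# Sanity-check of the per-transaction energy figures quoted above.
# All inputs are the comment's own numbers, not fresh measurements.

BTC_TWH_PER_YEAR = 137        # network consumption per digiconomist.net
BLOCKS_PER_HOUR = 6           # one block solved roughly every 10 minutes
TX_PER_BLOCK = 3500           # midpoint of the 3000-4000 range

blocks_per_year = BLOCKS_PER_HOUR * 24 * 365
wh_per_block = BTC_TWH_PER_YEAR * 1e12 / blocks_per_year
wh_per_btc_tx = wh_per_block / TX_PER_BLOCK

print(f"per block: {wh_per_block / 1e9:.1f} GWh")   # -> 2.6 GWh
print(f"per tx:    {wh_per_btc_tx / 1e3:.0f} kWh")  # -> ~745 kWh

# Visa side: 0.001 Wh of direct compute, x10 machines touching each
# transaction, x5 total-power-to-compute overhead = 0.05 Wh.
visa_compute_wh = 0.001 * 10 * 5
VISA_ALL_IN_KWH = 1.5         # all-in figure, including people and buildings

print(f"visa compute: {visa_compute_wh:.2f} Wh")
print(f"human/plant overhead: {VISA_ALL_IN_KWH / (visa_compute_wh / 1e3):,.0f}:1")  # -> 30,000:1
```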

Here’s the thing, though: Bitcoin gets harder (more compute intensive) as time goes on, and the rate of increase is faster than the ability to solve, on a Wh basis - i.e., Bitcoin transactions will get more expensive over time unless Bitcoin changes its code, and there is always resistance to that because there is a financial disincentive to reduce the work in proof-of-work systems. This is mitigated on other blockchains by using proof of stake, but that has other implications. Visa, OTOH, is taking advantage of AI and drops in processor and storage costs to lower their per-transaction cost, because there is a financial incentive to reduce processing costs: the fees charged are fixed (nominally 3% of the transaction cost) and anything left over is profit.

Overzeetop,

Again, it would take a substantial change to the code - or to reality. The options are to change the block size (more transactions per block), alter the difficulty curve (which is intended to limit growth in the limited bitcoin supply), alter the way blocks are solved (a massive theoretical mathematical breakthrough or, possibly, a move from ASICs to quantum computing), or switch away from proof of work. The first substantially increases the storage requirements of the blockchain in exchange for a substantial reduction; the second requires rewriting - and getting approval to change - the difficulty steps which have been a hallmark of the system; the third is magical thinking; and the fourth completely undermines the egalitarian ethos of the coin.

I’ve heard of no substantive move on any front to alter the plan because, for now, it’s working. And the true believers are generally libertarians who have faith that market forces will correct any shortcomings organically. This usually results in everything working perfectly right up until it doesn’t, at which point the wheels come off and the bus slams into a class of kindergarteners crossing the road.

Overzeetop,

the whole network could be run with a cleverly configured raspberry pi

Which would defeat the entire purpose of a distributed blockchain. I’m ribbing you, of course, on that ;-) Bitcoin was not built for efficiency and the very basis of distributed proof of work trades efficiency for security. The more “successful” it gets, the larger the incentive to waste power in a fight to win each block reward becomes - by design.

Overzeetop,

I’m always surprised that nobody worries about the random long-chain polymers created in the seasoning process which are then released into your food as you cook.

Overzeetop,

Yes, many long-chain polymers are carcinogens. That makes them bad. Long-chain polymers are also what make commercial non-stick pans non-stick. Note: they are different long-chain polymers, but still just a bunch of polymer hydrocarbons, because…that’s what makes both of them non-stick.

Overzeetop,

Thanks for the great ride!

Dragged 20 feet vs a beautiful golden parachute. They really do live in an alternate reality.

Overzeetop,

Yeah, I’m with you and it’s keeping me from really starting a new game. I got back into gaming with Elite Dangerous and got a kick out of the hours of offline research (because the in-game tools were fucking terrible when they even existed). It took me a while to get past the cool graphics and flight, but it got boring and tedious managing stuff. I failed to start Witcher 3 twice before just diving in and deciding I was going to not figure out anything and just play. It’s a far more forgiving system than most, and the gameplay benefits from it (to the suffering of realism).

While I enjoy the games, I loathe the min-max and inventory management necessary in most games. That’s not technically necessary if you spend a couple hundred hours perfecting technique. While that’s less than a month for a full-time gamer, it’s about 5 years of play time in my life, so I end up looking up some obscure bit online and chasing crafting for no good reason except to make my gaming time no fun. As a result, most of my SteamDeck time has been on simple arcade shooters and a couple of card-combat games. It’s frustrating to know there are good games out there if I just had 20-30 hours to get into them, and also knowing that I’ll have 20-30 hours free on a regular basis only when I retire some day. I guess my nursing home days will have lots of content, so I’ve got that going for me.

Overzeetop,

It’s the same as Apple devices. You save something and it puts it where it can be found the same way you saved it…but not necessarily where you think it should be or where it makes sense. Both ecosystems are designed to be insular - you stay in the box and things just work. Yes, people have lost stuff in both cases - usually through their own fault, or the fault of someone who doesn’t actually understand how the system was set up to work.

If we treated MS the way most users treat Apple, there would be little concern. You turn on the device, do things using the MS core apps, and when you go to set up a new device all your stuff auto-populates. It’s just that Windows users tend to muck around with things, use non-Microsoft software, and - especially long-time users - expect things to be where they used to be. In trying to make the system more streamlined (and Apple-like: both insular and user-friendly) while still allowing it to be used in a more traditional, manual fashion, Microsoft has created something that can go bad. It’s like adding an automatic transmission to a car but leaving the manual clutch - it can only end in tears.

Overzeetop,

advanced settings

I agree. However, for the “what’s a computer?” crowd, manually adjusting the screen brightness is pretty advanced. They intentionally obfuscate settings they (a) don’t want people messing with - like the ability to show the entire right-click context menu or uninstalling Candy Crush - and (b) which are likely to lead to screwing up the system - like entering their own nameserver IP address or opening ports in the firewall. Still, if I were in a room with Hitler and the person who decided to create the Settings app without all the Control Panel functions included, and had a gun with only one bullet, Hitler would still be alive.

Overzeetop,

If you were to send him a get-well basket of fruit, would you include or exclude apples?

Overzeetop, (edited)

a toy for professional workloads

[rant]

I think this is one of those words which has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.

The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general-purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in real time, and the cheapest i-series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.

Yes, I get bent out of shape a bit over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solutions have gotten faster over the years, it’s still a CPU-intensive process, and general (non-video) matrix operations aren’t really gaining all that much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics), and the speed of rendering hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling, as well as other general compute-intensive tasks on what we used to call “workstations”, are the professional applications which need real computational speed - and they’re/we’re just getting incremental clock-speed improvements plus core scaling that goes roughly with the square root of the number of cores, when the software can even parallelize at all (see the sketch after the rant for the kind of solve I mean). All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.

[/rant]
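To make the sparse-solve point concrete, here’s a minimal, generic example in Python/SciPy - a 2D Poisson problem standing in for the kind of FEM system I mean, not my actual workload:

```python
# Minimal sparse direct solve: 5-point Laplacian on an n x n grid.
# Generic stand-in for an FEM stiffness matrix, ~250k unknowns.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

n = 500
N = n * n                                   # total unknowns

main = 4.0 * np.ones(N)
off1 = -1.0 * np.ones(N - 1)
off1[np.arange(1, N) % n == 0] = 0.0        # no coupling across grid rows
offn = -1.0 * np.ones(N - n)

A = diags([main, off1, off1, offn, offn],
          [0, 1, -1, n, -n], format="csr")  # sparse: ~5 nonzeros per row
b = np.ones(N)

x = spsolve(A, b)                           # direct factorization; largely serial,
                                            # which is why core counts barely help
print(x.shape, float(x.max()))
```

The factorization step dominates, and it doesn’t spread across cores anywhere near linearly - which is the gap between the marketing “pro” benchmarks and this kind of workload.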

Overzeetop,

Indeed! It makes the benchmarks that much more disingenuous since pros will end up CPU crunching. I find video production tedious (it’s a skill issue/PEBKAC, really) so I usually just let the GPU (nvenc) do it to save time. ;-)
