Apple literally rolled out the feature 13 months ago, with 24 months of free use included with the purchase of a compatible device.
How can you claim any statistics on the topic?
But yeah, I think the really interesting thing is what’s going to happen with the LEO constellations, though I also get why Apple isn’t keen on relying on a Musk-driven enterprise.
All other LEO constellations are probably a decade away from having enough coverage.
I think Apple wants to get in the game now, and they have the money to spend on differentiating themselves.
And for those who have stumbled into a situation where they needed it and were rescued, it’s great; on the other hand, the majority of the planet is still not served as of now.
Apple has shown that the market could be willing to adapt.
But then again, they’ve always had more leverage than the Wintel crowd.
But what people seem to ignore is that there is another option as well: hardware emulation.
If I recall correctly, old AMD CPUs, notably the K6, were actually RISC cores with a translation layer turning x86 instructions into the necessary chains of RISC instructions.
That could also be a potential alternative to swapping architectures outright. If 80% of your code runs natively and the remaining 20% passes through this hardware layer, where the energy cost is bigger than the performance loss, you might have a compelling product.
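A crude back-of-the-envelope for that 80/20 split (all the penalty numbers below are made-up assumptions, just to show the shape of the trade-off):

```python
# Made-up numbers, purely to illustrate the shape of the trade-off.
native_fraction = 0.80      # share of code running natively
translated_fraction = 0.20  # share going through the translation layer

perf_factor = 0.90    # assume translated code runs at 90% of native speed
energy_factor = 1.50  # assume translated code burns 1.5x the energy

effective_speed = native_fraction + translated_fraction * perf_factor
effective_energy = native_fraction + translated_fraction * energy_factor

print(f"~{effective_speed:.0%} of native speed")    # -> ~98% of native speed
print(f"~{effective_energy:.0%} of native energy")  # -> ~110% of native energy
```

Under those assumptions you keep ~98% of native performance for a ~10% energy premium on the whole workload, which is the kind of trade that could sell.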
I’m saying that anyone singling out Apple for planned obsolescence and disregarding the rest of the market is playing into someone’s playbook.
I’m also fully aware of the so-called batterygate (oh, how I loathe how people slap a “-gate” suffix onto things to manufacture a “scandal”, completely clueless to the fact that Watergate was the name of a fucking hotel. Anyways…), and while we can only speculate whether or not Apple was trying to push people to buy new phones, from appearances it would seem they were acting in the (somewhat*, I’ll get back to that later) best interest of consumers, but simply failing to communicate it well.
Now let me get back to my asterisk:
*: There are different types of battery chemistries. Apple thumped their own chests back in the day about their MacBook batteries taking 1000 charge cycles to drop to 80% of factory capacity, yet they willingly chose cheaper chemistries for iPhone batteries than they could have used if longevity had been the priority.
So yes, in that regard you can argue planned obsolescence. For the amount of money Apple charges for their phones they could definitely put better batteries in, but on the other hand there are likely arguments for why they chose these batteries, such as capacity or other characteristics. I’m not going to claim to be an expert on battery chemistries, and will leave that to someone else.
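Still, some crude math (not chemistry) shows why cycle life matters; the 500-cycle figure for the cheaper cell below is purely an assumption for contrast, as is the linear-fade model:

```python
def capacity_after(cycles, cycles_to_80pct):
    """Linear fade assumption: the cell hits 80% after `cycles_to_80pct` cycles."""
    fade_per_cycle = 0.20 / cycles_to_80pct
    return max(1.0 - fade_per_cycle * cycles, 0.0)

# 1000 cycles is the MacBook figure cited above; 500 for the cheaper
# cell is a pure assumption for comparison.
for label, cycle_life in [("MacBook-grade cell", 1000), ("cheaper cell", 500)]:
    print(f"{label}: {capacity_after(730, cycle_life):.0%} capacity "
          f"after two years of daily charging")
```

With one full cycle a day, the better cell is still at ~85% after two years while the cheaper one is down around ~71%, which is roughly when people start complaining about their phone.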
With regards to some of your comments on longevity then and now: note that we used to use the best materials to make something, regardless of their impact on people and the environment. Some environmental concerns do actually reduce product longevity; lead-free solder, for instance, was long blamed for reliability problems compared to the leaded solder it replaced.
Combined with increased technological complexity, and a higher rate of improvement in the digital era than in the analog era, it’s been a long period where I don’t think it was too bad to replace a device after a few years.
However, we’re now seeing such good performance from a lot of our tech products that an upgrade feels much more incremental than it used to.
I definitely think we should demand more lifetime from our products, but this needs to be through regulation and not just left to consumers.
Louis Rossmann also had some good points here: https://youtu.be/l27_75pDvd4
We should be able to use cloud features without being locked to the manufacturer. Especially if they go belly-up.
He mentions a Chinese car manufacturer, and Arlo cameras, but it could just as well be Norwegian EV charge box manufacturer Easee, or a cell phone manufacturer like RIM (BlackBerry) or a TV manufacturer, etc.
So many products today depend on cloud services for basic functionality, and for a lot of those devices their planned obsolescence will be the cloud service they’re connected to.
Should Apple support their products longer?
Yes, definitely.
But there’s a big difference between not supporting old devices with software updates and designing them to stop working, which is what you’re alleging.
If you ask me there are way worse fish out there than Apple, and if you look at phone support Apple is the gold standard by a mile, with most Android devices still not being supported for more than a year or two, tops.
What we should have is a requirement to support devices for at least ten years.
Yes, I know, ten years is a long time, but we’ve gotten to a point where we should expect a device that’s been treated well to last that long.
My 2013 MBP runs just fine, so does my 2011 MBA, my dad’s Fujitsu-Siemens laptop from 2008 even still works. But only one of those is running an updated operating system. Guess which one?
Doesn’t mean that the product is designed to fail, just that Apple chose not to support them any longer.
In my experience it’s very varied. I feel students lean more towards Android, and if you develop on a Mac you’re also more likely to have an iPhone, but the one place where it’s somehow been consistently Android in my team is among the app developers.
While I don’t mind it at all, somehow the Android build of our app still has the most issues. Consistently over almost six years now. Which I find a bit ironic.
A friend of mine, also a former colleague, has always been an Android guy. A year ago he switched employers, and the new company is iPhone only - but he can’t get the latest models, and it’s basically just the base model too. So he’s still running his Galaxy S21, but with no e-mail or calendar sync.
I think he’d switch if he could put some of his own cash in and upgrade to the top model.
People can have whatever preference they want in life, but there’s no need to be obnoxious about it.
“Known to scam people”, “designed to stop working”.
I am fully aware that people can say anything on the internet, but clearly you are not objective at all.
Obviously any further attempt at discussion is pointless. Enjoy your fruit-less life, may it treat you with software updates until the next flagship device is launched.
The fact that you can audit it has zero value.
People don’t audit anything, and pretending that they do is hopeful at best, deceitful at worst.
Even if you audit it, you likely don’t understand the code well enough to figure out whether it is vulnerable.
Which leads back to my original point, and it still stands: there’s no smart way to choose non-vulnerable plugins. One can obviously avoid things that don’t meet certain standards (popularity, lines of code, known issues, how they’re resolved, etc.), but that still doesn’t guarantee anything.
This means that your statement that “smart Wordpress sites don’t pick vulnerable plugins” is frivolous. May I suggest “smart Wordpress sites choose plugins carefully and limit them to those strictly necessary, but still pay attention to updates patching issues”. Because that’s the difference between smart and dumb. Dumb sites are just left running whatever they shipped with, PHP or not, and smart devs make sure to keep their system and/or CMS and plugins up to date.
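If you want to make “choose carefully” concrete, a minimal sketch like this pulls basic health signals from the public WordPress.org plugin-info API (field names are from memory, so verify them before relying on this):

```python
import json
import urllib.request

def plugin_health(slug):
    """Fetch basic health signals for a WordPress.org-hosted plugin."""
    url = f"https://api.wordpress.org/plugins/info/1.0/{slug}.json"
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)
    return {
        "last_updated": info.get("last_updated"),        # stale = red flag
        "active_installs": info.get("active_installs"),  # popularity signal
        "tested_up_to": info.get("tested"),              # WP compatibility
    }

print(plugin_health("akismet"))  # checking a well-known plugin as an example
```

None of those signals guarantee anything, which is exactly my point, but a plugin that hasn’t been updated in years is an easy one to cross off the list.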
And if you still want to argue that people actually review the code they depend upon I have one word for you: Heartbleed.
Heck, sometimes someone comes to me and asks if some system can solve something they just thought of. Sometimes, albeit very rarely, it just works perfectly, no code changes required.
Not going to argue that my code is artificial intelligence, but huge AI models obviously have higher odds of getting something random correct, just because it correlates.
I like how you just keep on talking about what we all agree on.
Would you like to imagine how you would argue if the first sentence you wrote was true?
That’s when the interesting scenarios start showing up, including how humans are ready to grab the pitchforks when an automated system kills someone, but when humans do it 10x more it’s perfectly fine.
If the weight of the car stopped her from breathing it would have been a very different thing.
You are adapting your arguments to the situation.
It should be clear that no self-driving car will ever know what “the right thing” is in cases like this and it would require human interaction/intervention to resolve*. This is simply because the car would be unable to gather the necessary information about the situation.
That should not deter us from adopting self-driving, as self-driving vehicles will be the biggest boon to pedestrian safety seen since the advent of urbanization.
* One could obviously imagine a future where other vehicles could contribute information about the situation so that the vehicle in question could take actions and react based on what happens around it and seeing different perspectives than its own. Interactions with robots or drones could potentially also contribute information or actively aid in the situation.
If the vehicle were intelligent enough to converse with other humans, or even the human in question, or at least use a human voice to gather information to aid its decision making, this could also be different. But the vehicle itself will always struggle with the lack of information about what is actually going on in a situation like this.
Where I work we haven’t really shut down any projects in the last six years.
We’ve had some smaller projects which got parked due to shifting priorities, but other than that we’ve shipped everything else.
But inevitably, over a career in software there will be projects that don’t make it to production for one reason or another.
Personally I’m very pragmatic about it, but I know people who get very attached to the code they write.
I’m the kind of guy who is passionate about what I’m doing while I’m doing it, not necessarily for all eternity. I’ve written stuff that I’d be more than happy for someone to come and replace, but the thing about revenue-generating systems (most people say “legacy”, but I prefer this term) is that they aren’t always easy to replace.
I know we’re not all wired that way, and some people find it harder to see an older system get retired. A consultant I use is more attached to my code than I am, for instance.
Author also seems to think that the starting salary for developers at Google is representative. The average computer science graduate does not get a job at Google.
People who learn to code because it means job security are not the ones we look to hire. We look for people who are passionate about it, whose interest in the subject is more than skin deep.
We’re not looking for people who live and breathe code, but you need to like solving problems and learning new things.
No idea where the other guy is getting his Lenovos. We’ve been buying Pis because of form factor and GPIO.
We had a fair share of those Lenovos in the office when I started (I think they were around $500 in our config back then), but they’ve all been replaced with laptops now.
In my department we run around with $3000 MacBook Pros, so not very budget minded at all.
eBay is not where we buy new hardware.
The Pi was ridiculously expensive and hard to get from 2020 to 2023, and we had applications where we kept deploying them throughout.
I think we’ve seen costs of up to $200 for a complete kit.
You need a power supply, an SD card, a case, and depending on the application also a micro-HDMI adapter. It all adds up.
Slight difference if you are just upgrading in place, but comparing the unit price of a bare Pi to a computer with everything you need is not apples to apples.
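Quick illustrative arithmetic, with assumed list prices (real prices varied wildly during the shortage):

```python
# Rough, assumed prices in dollars; not quotes from any specific vendor.
kit = {
    "Raspberry Pi 4 (4 GB)": 55,
    "Official power supply": 10,
    "SD card (32 GB)":       12,
    "Case":                  10,
    "Micro-HDMI adapter":     9,
}
print(f"Bare board: $55, complete kit: ${sum(kit.values())}")
# -> Bare board: $55, complete kit: $96 -- and that's at list price,
# not the scalped board prices we saw from 2020 to 2023.
```

So even at list price the real cost is nearly double the headline board price, before you add a display or any HATs.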
I mean, it’s all fine on paper.
But… how… the… fuck… do… we… get… there???
Communism is fine on paper. Fuck. Even capitalism is fine on paper.
However, through empirical data we can learn that humanity is full of shitheads who want to be in power and have control.
Sadly, as I see it, that is incompatible with any form of utopia.
I’m from Norway and we used to be fucking close to having a utopia for a short while. Politics were civil, the differences between low and high incomes were small, and we actually pooled our oil money into a pension fund so that we would be wealthy when the oil age ended.
On top of that we were rich on natural resources and had abundant renewable electricity from harnessing our mountains (read: damming up valleys and putting rivers and falls in pipes) to create hydro power.
Combine that with a socialist government (“the Scandinavian model”) with free education for the masses, affordable housing, free healthcare, some of the best employee protections in the world, and great consumer protection, with the law basically granting consumers a 5-year warranty on everything from cars to phones or TVs.
Sadly, since everyone was feeling so wealthy, everyone stopped caring. Housing is now anything but affordable. The electricity we paid for by destroying beautiful nature is no longer a resource for the Norwegian people: thanks to numerous new export cables to Europe, and the fact that production is sold on a fucked-up “stock market” where the most expensive bid to produce electricity for any hour of the day sets the price for everyone, we now have extremely high and volatile electricity prices, driving inflation and reducing the competitiveness of Norwegian businesses.
On top of that politicians keep getting caught with their hands in the cookie-jar at an ever increasing rate, and I think it must have been 20-30 years since we had a prime minister with actual work experience.
Call me cynical, but good things don’t last if we even get them at all.
The Romans knew it: the masses simply need to be entertained with bread and circuses, and you can do what you want.
Social media is the best circus so far, and while everyone is busy debating pronouns or whatever flavor of distraction there is this week, the political decisions that actually affect us get made without anyone paying attention.
Sincerely though, best of luck with your utopian society. I hope for all of us that we get what you describe.
I sadly suspect we will keep doing what we are doing until it kills the planet.
Obviously I’m doing a poor job at getting my points through if you think I’m arguing for the current state of affairs.
It doesn’t mean I’m against copyright.
The principle of copyright is important, and so is copyleft (e.g. the GPL).
Being for copyright doesn’t mean I am against artists being paid their fair share. These are not contradictory principles.
There are certainly huge problems with parts of copyright legislation, especially in the US, and in particular the DMCA.
I always recommend the TED Talk where Larry Lessig talks about the issues with the DMCA; even though it’s starting to get old now, it’s still just as relevant and he is still just as on point:
However, the fact that you don’t care about how business works means you ignore the root of the problem - how business works.
I’m not going to argue for communism, but when politicians are for sale to the highest bidder the rest of us lose out.
Feel free to dive into other videos with Larry Lessig if the first one hits home.
I would particularly recommend these two:
Your point of view needs corrective lenses.
Streaming (as a legal business model) is not violating copyright, but streaming changed the business model for a lot of artists negatively.
That’s because in the old days people would buy an album just to listen to a song or two, so the artist basically got paid up-front for an infinite number of playbacks.
With streaming, artists and copyright holders are paid after the fact, based on the number of playbacks.
This means singles are much more important than albums, because people don’t really listen to albums like they used to, and if I really like a song and play it a lot, it will take a long time before the artist makes an amount equivalent to me buying the album.
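Quick back-of-the-envelope, where every figure is an assumption, since payouts and royalty splits vary a lot by service and contract:

```python
# All figures are rough assumptions, purely illustrative.
album_price = 10.00           # assumed album price
artist_album_cut = 0.10       # assumed artist royalty on an album sale
per_stream_payout = 0.004     # assumed payout per stream
artist_stream_cut = 0.20      # assumed artist share of that payout

album_income = album_price * artist_album_cut              # $1.00
income_per_stream = per_stream_payout * artist_stream_cut  # $0.0008

print(f"~{album_income / income_per_stream:.0f} streams "
      "to match one album sale")  # -> ~1250 streams
```

Even with these generous round numbers, I’d have to stream a song over a thousand times before the artist sees what one album purchase would have paid them up-front.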
It should be fairly obvious that the big record companies come out of this change of business model a lot better because they have a continuous stream of revenue across their played/consumed portfolio, but smaller labels face the same difficulty as the artists.
This has nothing to do with copyright law - which you decide to focus on.
But remove copyright law and no-one is getting paid for anything.
The problem you are complaining about is how labels are milking artists, for lack of a better analogy. A cow gets fed and cared for just enough to make sure milk production keeps going and the cow stays healthy.
A farmer doesn’t cry when a cow gets old and slaughtered, he’ll get a new cow to replace her. That’s just how the business works.
While musical artists are obviously more sentient than cows, record labels follow a fairly similar business model: help them become creators, and make money on the produce.
Obviously not a perfect analogy, but the discrepancy between what the label earns and the artist is nothing new and anyone who was around before streaming should know this.
It’s a common misconception that blockchain creates trust. If you control a majority of a blockchain’s consensus power (hash rate, stake, or simply nodes, depending on the scheme), you decide what the truth is.
This opens the door for illicit players to manipulate things their way.
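A toy model of why majority control decides the truth (one-node-one-vote here for simplicity; real chains weight by hash power or stake, but the majority principle is the same):

```python
from collections import Counter

def consensus(votes):
    """Whatever the majority of voting power reports becomes 'the truth'."""
    return Counter(votes).most_common(1)[0][0]

honest = ["Alice paid Bob"] * 49
attacker = ["Alice never paid Bob"] * 51  # attacker controls 51 of 100 nodes

print(consensus(honest + attacker))  # -> "Alice never paid Bob"
```

The honest minority can see the manipulation, but the protocol still settles on the attacker’s version of history.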
Lack of trust doesn’t replace trust.
Central, provable/accountable, trust is needed for financial systems to work.
Everything else is “Wild West”.
The biggest problem is people trying to peddle it as currency.
It isn’t currency, and it never will be. It’s much more akin to bonds.
It’s an investment object with a speculative value, and no tangible value. The only value it has is what the next guy is willing to pay for it.
While ordinary currency is inflationary by design, crypto is entirely based on supply and demand, and sure, as long as people think it will be worth more tomorrow, the sky’s the limit.
Like any pyramid scheme it pays out to get in early, and get out before it collapses.
Relying on crypto is high stakes gambling, and people being people is the only reason I can find for it not having collapsed totally already.
All technological advancements have caused changes, and many have made entire professions obsolete.
One could even be allowed to imagine that science itself ought to have put priests out of a job, yet that hasn’t happened either.
“AI” is a generic term that’s being thrown around a lot.
There’s a huge distance from today’s AI, which at its best is generative AI based on large language models, to actual general AI that is able to learn, understand, and adapt.
Sure, you can train a language model further, but that doesn’t make it “smarter” in the same instant; the learning happens during training, not while it’s talking to you.
There’s no downside to selling a smart TV to someone who doesn’t want one or doesn’t use the features.
The features we “want” from modern TVs, like Dolby Vision and all the shit they do to the image to make it stand out in the store, require a significant amount of processing power.
It’s simply better business to sell smart TV’s to everyone than to make dumb TV’s that compete for a tiny fraction of the market when people buy Smart TV’s in every price segment.
As a Mac user at work I just close the lid and put the laptop in my bag. Windows users shut down and power up again the next day.
Whenever I bring this topic up IRL, people inundate me with stories about how many issues arise if they just sleep their computers.