How far can Apple go?

Rumblings are surfacing about Apple’s recent pricing strategy, and I can understand why. An iPhone for £1,000 and up, an iPad for £1,800 and up, and the cheapest Mac starting above £1,000 all feel high to me, with the occasional offering (the iPhone 7, iPad and Xr) priced at more reasonable levels.

The above does not look accurate, and it isn’t, but consider Apple’s marketing: the Air, iPad Pro and iPhone Xs are spoken about the most, which feeds the sense that Apple products are getting more expensive. It’s hard to work out exactly what Apple’s strategy is, but the high-end products are easing upwards quickly while the low end stays fairly static. Throw in the Series 4 Apple Watch and the cost of accessories such as the Apple Pencil and Smart Keyboard, and you again start to believe that pricing is moving up across the board.

Rumours persist that the latest iPhones are not selling as well as Apple would like, and the same has been said of the iPads and Macs, which makes all the sense in the world to me. From what I can see, Apple is gaining customers and maintaining serious loyalty, but it is not selling at the same frequency as before, which is a problem.

I know a few people, myself included, who are happy with their iPhones and see no need to upgrade. They easily last five years, mostly without issue, and that may stop the standard user from upgrading (that said, many people upgrade through mobile contracts anyway). iPads last particularly well and so do Macs, and in these three products Apple may be a victim of its own success. Products that are expensive and, in most cases, specified for longevity, built to work well through multiple software updates, will stop owners from upgrading. It’s as simple as that.

Add to this the fact that people no longer tend to show off their new phones, and that the ‘fashion’ bubble has deflated a little, and the problem worsens. Let’s be honest, all phones look the same in 2018, and the Xr is an X is an Xs in terms of how they look from more than a few inches away. Phones are in everyone’s hand most of the time, so they have become invisible, and all that now matters to an increasing number of people is reliability, good-enough features and pricing.

It’s a shame really, because we live in a time when a company like Apple has to grow, and grow and grow, every quarter. Under this pressure comes the need to increase prices to make up for the lack of numbers, but that merely sharpens the downward curve and shrinks the number of people who are able to afford the products.

I have a MacBook, an iPhone X and a Series 4 Apple Watch, and even I have spent some time looking at Android phones recently. The ecosystem has kept me firmly where Apple wants me to be, but I will tell you that if you have not looked at Android for a while, you may be in for a surprise. Some of the phones are a distance ahead of any iPhone in specific areas, though I still get a sense that the software has not changed much at all.

At some point, however, Apple may push a little too far, and people may make the call that the advantages no longer justify the extra cost. It’s by far the biggest threat I see to Apple at this time, and it’s not an easy one to stop, because of the external pressures to always sell more and to make more profit every three months.

I see a utopia in which the company could concentrate simply on making the very best products it can at prices that are perfectly justifiable, as it has in the past, but alas I suspect that utopia will not be visited any time soon.

Categories: Apple, Articles

9 replies

  1. – Gruber made an interesting, rather elitist argument about the Mac: “In some ways, the worst thing that ever happened to the Mac is that it got so much more popular a decade ago. In theory, that should have been nothing but good news for the platform — more users means more attention from developers. The more Mac users there are, the more Mac apps we should see. The problem is, the users who really care about good native apps — users who know HIG violations when they see them, who care about performance, who care about Mac apps being *right* — were mostly already on the Mac. A lot of newer Mac users either don’t know or don’t care about what makes for a good Mac app.”

    As I look at that through a Kirk-lens, it’s that many techies just like a system that has a Unix-y feel for doing work, and in general having far fewer viruses and other problems.

    Also, it all reminds me of how much I hate people saying “oh, the new version of this gadget is only incremental improvements – where’s the innovation?” without really being able to say what kind of innovation that would be. (I mean, they would say “well, it’s not my job to think that up”, but to ask for features without thinking about what constraints have prevented them so far is just daydreaming.)

    But Moore’s law seems to have petered out – my 2013 MacBook Air still seems to be chugging along just fine, which, when combined with some “modernizations” in terms of losing power ports, old-USB slots and SD card readers, takes away the incentive to upgrade.

    I’ll see how long my iPhone Xs lasts. I have a weird feeling like it could be a good while, but given that it’s only a few months old that’s pretty premature 😀

  2. Apple devices are more expensive for what one would consider the equivalent. Either that, or the options you want are ridiculously overpriced, as I have commented previously about memory and SSD upgrades.

    I don’t think Moore’s Law has stopped. But what we have is, for the most part, up to the tasks at hand. Unless you’re pushing the boundaries, what you bought a few years ago will suffice. I have a 2009 (OMG, it’s almost 10 years old) MacBook Pro. I replaced the hard drive with an SSD, use patches to install Sierra, and it runs well for what I need a laptop for. Likewise, I expect my 2014 Retina iMac to do almost everything I need it to do for a few more years. Granted, it’s top of the line everything, so I paid the price up front. Except for RAM, which was thankfully user-upgradeable. But wasn’t that part of Apple’s appeal? A device that was expensive, but well built, so it would last longer than your average PC.

    Kirk’s comment on people asking “where’s the innovation” reminded me of a comment I read saying that people don’t know what true innovation they want. True innovation solves problems that people don’t know they had or problems they took for granted that they would always have. I’m not positive, but I believe it was Steve Jobs.

  3. “what one would consider the equivalent” – only true if you consider hardware only!

    I believe Moore’s law has stopped, but you’re right, it’s more that things have become “sufficient”.

    Yeah, it’s hard to know exactly what Jobs’ role was. Like putting a screen in front of a capacitive touchpad – did he come up with it, or “merely” champion it?

    • To be picky, Moore’s Law actually talks about the number of transistors that can be packed into a given space doubling every 2 years. It is usually spoken of to mean that computing power doubles every 2 years. So far, both have been generally true, at least at the technology’s leading edge, but it may be slowing down as this article explains.
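
      That doubling rule compounds quickly; here is a rough sketch of the arithmetic (the starting count is a made-up round number, purely illustrative):

      ```python
      # Moore's Law as usually stated: transistor counts double roughly every 2 years.
      def projected_transistors(start_count: int, years: float, doubling_period: float = 2.0) -> int:
          """Project a count forward assuming a fixed doubling period."""
          return int(start_count * 2 ** (years / doubling_period))

      # Ten years at a 2-year doubling period compounds to a 32x increase:
      print(projected_transistors(1_000_000, 10))  # 32000000
      ```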

      As for Jobs, I doubt he came up with most of what Apple innovated, at least not directly. But I don’t know that I’d call championing an innovation a “merely”. After all, it has to be explained and marketed to a public who had little if any prior knowledge. Plus it has to be a smart business move. At that, he was one of the best.

      • Yeah, mixed feelings about Apple’s love of removing things. The optical drive was probably the right call – but I don’t think USB-C has built the infrastructure Apple must have assumed it would have by now. Still, without that drive to eliminate things, we’d still have laptops with VGA ports.

        I guess that urge to simplify and make things standalone goes way back to Jobs and the first Mac, where I think that was kind of the intention – the opposite of Woz’s take on the Apple II, with all its great slots.

        I mostly got out of PC gaming in the 90s, when the choice was spending tons on a PC (and not loving mouse and keyboard) or less on an N64 I could then play with 3 of my friends in front of a couch 😀 But now, like you, I’m more or less set in my ways, though not so much that I can’t play “what mighta been” had I stuck with PCs.

        • I wonder how much of removing/simplifying was a matter of control. With PCs, the end user could control what they wanted, which was usually what was readily available. So we ended up with parallel ports and serial ports among others, and various sizes of floppy disks and diskettes. Apple pretty much decided and decides what you’re getting. Of course there are always expansion hubs.

          • I guess I’d say yes, but there are more noble and less noble reasons for control. The classic is “prevent the user from doing ugly things” (assuming that the tastes of the user and the company correspond). Then there’s the control needed for “design wankery”.

            There’s also a company protection thing – when you have a wild west add-on system, you run the risk of things being less stable and in general less a culture of reliability, and often the consumer will blame the maker of the big thing, not all the sketchy add-ons…

            I guess you see the same thing with the concept of the app store. I know free software people hate the gatekeeping that goes on – but it does make for a much friendlier and safer world for the common folk.

  4. Yeah, I knew the technical-ish version of Moore’s Law, but may have just skimmed some articles that were a little more pessimistic. We at least agree a sufficiency was reached a while back – probably over a decade – for basic browser and office-y tasks, and waiting 5 years to upgrade a personal computer is no longer anywhere near the big deal it was in the 90s, say.

    I’m a little worried it’s worse than that. Right now I’m cynical about some gaming bits – I just got a PS4 (released 5 years ago, come to think of it) and haven’t found it doing much that an Xbox 360 (released 13 years ago) couldn’t. The places for improvement are obvious.

    It’s fun, if well-worn territory, to think about what Jobs-era Apple products really set apart, and how. Come to think of it, the 80s Lisa and Mac are probably the biggest exemplars of doing it so much better than anyone else… though 2001’s iPod is a good one too – there were other MP3 players, but I guess it was the generous (for the time) screen and click wheel that secured its place in history. As for the iPhone, I’d say no one else realized you could put a browser you’d actually want to use on a phone like that, and that with the right screen-sensing tech you could even get a “good enough” onscreen keyboard. The MacBook Air is in the conversation too – that you could get people to live without an optical drive.

    So, where do we see Jobs’ absence? I dunno. Maybe the Apple Watch could be more set apart?

    • I wonder how much Jobs was involved in the Apple Watch. I know these things take years to develop, but…

      Interesting that you brought up the MacBook Air. Very few people would consider the removal of something an innovation, but it is in terms of forcing people to use the replacement technology. I can’t remember when I last used my external optical drive. Although I haven’t digitized my movie library. At least not completely.

      As for gaming, other than higher end audio/video/graphics processing, which few consumers use, gaming is what takes the most computing and graphics power for the average consumer. I play RPGs, so my iMac is fine for most of them, assuming I can get them on the Mac. I ran The Witcher 3 under Bootcamp, but after finishing, I had problems with Windows and ended up deleting it. I played Fallout New Vegas under WINE. I’ve tried using the latest virtual machines, but they just aren’t good enough if you want high graphics quality.

      My son keeps bugging me to get a PS4, but as you say, it’s 5 years old. And I’m not wild about using a controller. Too much old muscle memory. Plus you can’t seriously mod games on a PS4. I know Divinity: Original Sin II is coming out on the Mac, but I may be forced to get a PS4 or a PC to play Cyberpunk 2077 and The Outer Worlds.
