If you are a tech junkie, then Apple’s Worldwide Developers Conference (WWDC) must hold a special place in your heart. It is that time of the year when curious eyes from across the world turn to Apple for the latest releases. At this conference, Apple announces the new features, software, and technologies it has spent years perfecting.
The most satisfying part is when something you anticipated, or even wished for, becomes part of your favorite Apple product. To be fair, it is now pretty easy to predict what Apple will cook up next, especially for the iPhone.
People who keep a keen eye on this event know that most of the time (practically every WWDC), a new feature Apple launches for the iPhone is something we have already seen somewhere else, perhaps on Android devices or at Google I/O.
Ever wondered why the iPhone is always late to the party?
Let’s try to map out the rationale behind Apple’s procrastination.
Reason #1: Apple Takes Time Perfecting Each Feature Before the Launch
There is nothing wrong with copying from anyone, especially in the tech world, when you know you can do it better than the competition. And there is not a shred of doubt that Apple does it better than all its competitors. So Apple takes some time to perfect things for the iPhone before bringing them to market.
Some Examples of the iPhone Perfecting Others’ Ideas
The Live Text feature enables your iPhone to recognize any text in the camera view or in images saved in your gallery. It feels like magic: you can extract handwritten and printed text alike and paste it anywhere you want, or look it up, translate it, search it on the web, and even share it with a simple long press.
No matter how impressive this feature may seem, it was already available on Android phones through Google Lens.
Yet, despite getting there first, Google Lens does not work as seamlessly as Live Text does on the iPhone.
For example, if an image contains a phone number, you can long-press it to initiate a call or send a message right from your iPhone. See how far Apple has refined Google’s original idea.
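For the technically curious, the building blocks behind a Live-Text-style experience are public Apple APIs: the Vision framework recognizes text in images, and Foundation’s NSDataDetector picks out actionable items like phone numbers. Here is a minimal Swift sketch of that idea; it is not Apple’s actual Live Text implementation, just an illustration of the same pipeline:

```swift
import Vision
import Foundation

// A minimal sketch of a Live-Text-style pipeline built on public APIs.
// Not Apple's actual Live Text implementation, just the same idea:
// 1) recognize text in an image, 2) detect actionable data such as phone numbers.
func recognizeLiveTextStyle(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Join the top candidate string from each detected text region.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")

        // NSDataDetector finds actionable data (here, phone numbers) in the text,
        // which is what makes a "long-press to call" interaction possible.
        let types = NSTextCheckingResult.CheckingType.phoneNumber.rawValue
        guard let detector = try? NSDataDetector(types: types) else { return }
        let range = NSRange(text.startIndex..., in: text)
        for match in detector.matches(in: text, options: [], range: range) {
            print("Found phone number:", match.phoneNumber ?? "")
        }
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    // Run the text-recognition request on the image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```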
Now, let’s turn to FaceTime’s screen-sharing and shared-activities feature, SharePlay, introduced in iOS 15 to cater to the virtual-meeting habits of a post-pandemic world. Screen sharing already existed in the most popular video-conferencing platforms, Zoom and Google Meet.
SharePlay builds in syncing to match Apple’s goal of more tightly integrated products in its ecosystem. You can watch a movie or series, listen to music, or use supported apps together with anyone on your call, all perfectly in sync, with shared playback controls.
In simple words, this feature gives every party on the FaceTime call full control, e.g., anyone can pause a movie, and it pauses on everyone else’s screen too. SharePlay also runs at a higher FPS (frames per second) than your typical screen-sharing feature.
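On the developer side, this syncing is exposed through Apple’s GroupActivities framework. Here is a minimal Swift sketch of defining and starting a shared “watch together” activity; the activity name and title are hypothetical, chosen purely for illustration:

```swift
import GroupActivities

// A minimal sketch of a SharePlay activity using the GroupActivities framework.
// "MovieWatchingActivity" and its title are hypothetical names for illustration.
struct MovieWatchingActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = "Watch a Movie Together"
        metadata.type = .watchTogether
        return metadata
    }
}

func startSharedViewing() async {
    let activity = MovieWatchingActivity()
    // Ask the system whether SharePlay should activate for the current FaceTime call.
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate() // starts the shared session for everyone on the call
    case .activationDisabled, .cancelled:
        break // no active FaceTime call, or the user declined
    @unknown default:
        break
    }
}
```

Once the session is active, every participant’s app receives the same activity, which is how a pause on one screen becomes a pause on all of them.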
Another feature, Apple’s on-device voice recognition, which powers Voice Control, treads the same path as Google Assistant’s voice recognition. The good thing is that it works even when you are offline and responds faster than server-based Siri, since your speech never has to make a round trip to a data center.
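Third-party apps can reach the same on-device engine through Apple’s Speech framework. A minimal sketch, assuming you already have an audio file to transcribe (a real app would also need to request speech-recognition authorization first):

```swift
import Speech

// A minimal sketch of offline transcription using the Speech framework.
// The file URL is a placeholder; a real app must also call
// SFSpeechRecognizer.requestAuthorization before starting a task.
func transcribeOffline(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is not available for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true // never send audio to Apple's servers

    recognizer.recognitionTask(with: request) { result, _ in
        if let result = result, result.isFinal {
            print("Transcript:", result.bestTranscription.formattedString)
        }
    }
}
```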
There are many more examples we could write about: feature after feature that Apple keeps perfecting to catch up with Google and others, and even surpass them.
Reason #2: Apple Cares for Its Ecosystem
There is no doubt that Google is a very innovative company, churning out cutting-edge technologies for us. Apple, on the other hand, is intensely focused on perfecting its ecosystem. Whatever it creates, it makes sure it blends well into that ecosystem, which it flexes proudly.
There is nothing quite like Apple’s highly integrated ecosystem, where everything works seamlessly together. With it, Apple aims to make it difficult for any user to step outside. Perfecting such a system requires time, as Apple’s teams have to coordinate and work together.
So, if Apple launches something late, it does not necessarily mean its engineers were slow to envision the idea. The main thing for Apple is to make sure everything eventually plugs into its ecosystem.
Take the “Focus” feature on the iPhone as an example. You can set Sleep, Do Not Disturb, Work, Personal, or custom modes, and whichever mode you select, you will only receive notifications from the apps or people tied to that mode. What makes the experience even more unique is that it works simultaneously across all your devices.
For example, if you select Work mode, only your work apps and contacts will send notifications, while the rest go silent on your iPhone, iPad, Mac, and even your Apple Watch, as long as they are signed in with the same Apple ID.
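Third-party apps can even play along with this system: the Intents framework’s Focus Status API lets an app check whether the user currently has a Focus enabled. A minimal sketch (a real app needs the Communication Notifications capability and user authorization):

```swift
import Intents

// A minimal sketch of checking the user's Focus state via the Focus Status API.
// A real app needs the Communication Notifications capability and user consent.
func checkFocusStatus() {
    INFocusStatusCenter.default.requestAuthorization { status in
        guard status == .authorized else { return }
        // isFocused is nil when the state is unknown (e.g., not authorized).
        if INFocusStatusCenter.default.focusStatus.isFocused == true {
            print("A Focus is enabled: hold non-urgent notifications")
        } else {
            print("No Focus active: deliver notifications normally")
        }
    }
}
```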
On the other hand, Google’s teams do not have such aspirations. They create things for innovation’s sake, without giving much precedence to plugging everything into one ecosystem. So, generally, some things fit well within Google’s ecosystem, while the rest are not very well coordinated.
Whatever Apple Does, It Does It Better
It is a known fact that Apple creates dope products, and it proudly flexes its perfectly coordinated ecosystem. Apple is later than Google to offer new features, but when it does, everyone knows just how polished and carefully designed those features will be.
We hope Google also starts giving more thought to perfecting its ecosystem, so it can provide its users a seamless experience like Apple does.