Apple CEO Tim Cook once said, “Technology should serve humanity, not the other way around.” It’s a simple point, but an increasingly relevant one. As technology advances, sometimes at an alarming rate, it often seems to tighten its grip on us rather than loosen it.

Look at smart home technology, for example. It has the potential to be genuinely transformative, and when it works, it can make our home lives far more convenient.

However, the market is currently so fragmented, with so many competing standards, protocols, devices and even hubs, that new consumers are often left confused. For smart home tech, at what point do we stop and say: OK, let’s quit adding and instead improve what we already have?

Say what you will about Apple and its products, but that’s what the company has done for decades. It releases a product, perfects it, and then releases it again with better, more powerful hardware.

In the case of the new iPhone X, we have nearly the same scenario as always. There are a few aesthetic changes, but for the most part, the iPhone is still the iPhone, except for one innovative element born from modern artificial intelligence and machine learning.

Thanks to these powerful technologies, now found in the iPhone X, the consumer electronics landscape is about to change considerably. As Cook so accurately pointed out, we are on the precipice of seeing our technology serve us, not the other way around.

On stage, Cook brazenly claimed the iPhone X will “set the path for technology for the next decade.” Let’s take a closer look at what he means by that, and how it’s going to happen.

  1. Introducing the Neural Engine

Inside the new iPhone X is the A11 processor, which includes embedded AI hardware Apple calls the neural engine. The component’s goal is to accelerate AI processing in software, particularly work involving images and speech. It is, in effect, dedicated silicon for running artificial neural networks, hence the name.

In layman’s terms, the neural engine is a small portion of the internal hardware dedicated specifically to artificial intelligence.

This opens up many new opportunities for the technology, especially when it comes to innovative features. For example, the neural engine powers the iPhone’s new facial recognition authentication, as well as customized emojis that mimic your facial expressions.
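Conceptually, facial-recognition authentication boils down to comparing a stored mathematical representation of the owner’s face against a freshly captured one. The toy sketch below (plain Python with made-up embedding vectors, illustrating the general idea rather than Apple’s actual Face ID pipeline) uses cosine similarity to decide whether to unlock:

```python
import math

def cosine_similarity(a, b):
    # Measure how closely two face "embeddings" point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled, captured, threshold=0.95):
    # Unlock only when the captured face closely matches the enrolled one.
    return cosine_similarity(enrolled, captured) >= threshold

# Hypothetical embeddings: the owner's enrolled face vs. two capture attempts.
enrolled_face = [0.9, 0.1, 0.4, 0.8]
owner_capture = [0.88, 0.12, 0.41, 0.79]   # same person, slight variation
stranger_capture = [0.1, 0.9, 0.8, 0.2]    # a different face

print(authenticate(enrolled_face, owner_capture))     # → True
print(authenticate(enrolled_face, stranger_capture))  # → False
```

Real systems derive those embeddings from a neural network, which is exactly the kind of computation the neural engine exists to accelerate.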


  2. Machine Learning in Your Pocket

The A11’s “bionic” neural engine is capable of handling up to 600 billion operations per second, 70 percent faster than the previous generation of Apple’s mobile processors, the A10.
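Taking those headline figures at face value (and assuming the 70 percent applies to the same throughput metric), a quick back-of-the-envelope calculation shows what they imply about the previous generation:

```python
a11_ops_per_sec = 600e9   # 600 billion operations per second (claimed)
speedup = 1.70            # "70 percent faster" than the A10

implied_a10_ops = a11_ops_per_sec / speedup
print(f"Implied A10 throughput: {implied_a10_ops / 1e9:.0f} billion ops/sec")
# → roughly 353 billion operations per second
```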

This boost in processing power and performance means machine learning is not just possible but genuinely viable in such a small device. You see, machine learning requires several conditions to work reliably: access to lots of data, significant processing power and an audience. That last one allows the system to gather enough intelligence and stats to build actionable insights.

With the new iPhone, Apple has combined all three in a tiny, handheld device. At any given time, the system can draw on Apple’s vast network of data, plenty of processing power thanks to the new A11 dedicated chip and, finally, Apple’s large customer base.
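To make “machine learning in your pocket” concrete, here is a minimal, self-contained sketch (a toy perceptron in plain Python, nothing Apple-specific) showing that the core loop of learning from data is small enough to run on almost any device:

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    # Learn weights for a linear rule: predict 1 if w·x + b > 0, else 0.
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            prediction = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
            error = target - prediction
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

# Toy data: learn the logical AND of two inputs from examples.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train_perceptron(samples, labels)
print([predict(w, b, x) for x in samples])  # → [0, 0, 0, 1]
```

Production models are vastly larger, of course, which is why dedicated hardware like the neural engine matters: the loop stays simple, but the arithmetic volume explodes.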

  3. Accessibility Is Now Seemingly Limitless

To work with machine learning and AI technologies, developers and organizations once had to invest significant money, time, hardware and resources. An indie software development company building a mobile app, for instance, might have found AI or machine learning entirely out of reach. At least, that’s how it was before Apple’s latest move.

Now, the new chipset, new focus and new technologies make the field accessible to nearly everyone. Mobile app developers can tap into inexpensive platforms and tools offered by Google, Amazon, Apple, IBM and many others, building fully operational, robust and capable mobile apps and services without deploying the necessary hardware themselves. That’s a huge leap forward, especially in putting the technology into the hands of a large community.
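As an illustration of this “use the platform, skip the hardware” pattern, the sketch below assembles a JSON request body for a hypothetical cloud image-labeling endpoint. The field names and structure are invented for illustration (each vendor’s real API reference defines its own), but the shape of the work is the same: the developer sends data and parameters; the provider supplies the models and compute.

```python
import json

def build_label_request(image_id, max_labels=5):
    # Shape a request body for a hypothetical cloud vision API.
    # Field names here are illustrative, not any vendor's actual schema.
    return {
        "image": {"id": image_id},
        "features": [{"type": "LABEL_DETECTION", "maxResults": max_labels}],
    }

payload = json.dumps(build_label_request("photo-001"))
print(payload)
```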

  4. Siri Is About to Get a Whole Lot Smarter

Siri supporters and enthusiasts already know the personal assistant is quite accurate, but there’s always room for improvement. Thanks to the technology embedded in the new iPhone, that improvement is going to happen, and faster than ever before.

Siri will be able to tap into the boost in processing power, the systems and data these technologies are built upon, and Apple’s gigantic install base. Imagine the assistant answering queries with little to no delay and with increasing accuracy. Furthermore, imagine Siri reacting to natural speech, as opposed to specific phrases or commands.
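The gap between “specific commands” and “natural speech” is easy to see in a toy intent matcher. The minimal sketch below (plain Python with invented intents, not Siri’s actual pipeline) scores an utterance by keyword overlap instead of demanding one exact phrase:

```python
INTENTS = {
    "set_alarm": {"wake", "alarm", "morning"},
    "weather": {"weather", "rain", "forecast", "umbrella"},
    "play_music": {"play", "music", "song"},
}

def match_intent(utterance):
    # Pick the intent whose keywords overlap most with the spoken words.
    words = set(utterance.lower().split())
    best_intent, best_score = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

# Natural phrasing still resolves, with no exact command string required.
print(match_intent("will it rain tomorrow or should I take an umbrella"))  # → weather
print(match_intent("wake me up for my morning meeting"))                   # → set_alarm
```

Modern assistants replace the keyword sets with learned language models, which is precisely where on-device machine learning horsepower pays off.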

One thing is certain: Siri and the OS behind her are both about to get a whole lot smarter. Thanks to AI and machine learning, that’s not only possible, it will happen at unprecedented speed.