Why natural language processing is A.I.’s silver bullet

Artificial intelligence (A.I.), in some form or fashion, is all around us; its presence pervades our lives in ways seen, hidden and overlooked. Its influence is growing and will continue to do so for years to come. As currently constructed, though, it will not lead to the idyllic future researchers and enthusiasts predict, but rather to an ideal-adjacent alternative, owing to one missing element: natural language processing (NLP).

Why is that so important? Well, beyond making the user interface undeniably more natural and efficient, true NLP would upend the way we interact with and use machines.

We all love the concept of Siri, Google Assistant, or Alexa, especially if it one day yields each of us a copy of Tony Stark’s personal A.I. assistant Jarvis. And even though our current NLP platforms are powerful and getting better all the time, they’re nowhere close to true natural language cognition, processing or analytics. If your syntax or pronunciation differs even slightly from what Siri is programmed to handle, the system goes on the fritz. That’s because, at its heart, current NLP is simply a massively complex set of programmed triggers and scripted responses, as opposed to a self-reliant system that actually understands what you’re saying and responds accordingly.
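To make that distinction concrete, here is a toy sketch of our own (not any vendor’s actual code) of the trigger-and-response pattern: a hand-written rule fires a canned behavior, and a phrasing that falls outside the rule simply fails, no matter how obvious the intent would be to a human.

```python
import re

# A toy "assistant": each trigger is a hand-written pattern mapped to a canned behavior.
# Real platforms are vastly more sophisticated, but the underlying idea is similar:
# match the input against what the programmers anticipated, then run the scripted response.
TRIGGERS = [
    (re.compile(r"^set (?:an? )?alarm for (\d{1,2}) ?(am|pm)$", re.I),
     lambda m: f"Alarm set for {m.group(1)} {m.group(2).upper()}"),
    (re.compile(r"^what(?:'s| is) the weather(?: like)? today\??$", re.I),
     lambda m: "Today's forecast: sunny, 75 degrees"),
]

def respond(utterance: str) -> str:
    for pattern, action in TRIGGERS:
        match = pattern.match(utterance.strip())
        if match:
            return action(match)
    return "Sorry, I didn't understand that."  # anything off-script falls through

print(respond("Set an alarm for 7 am"))             # matches a trigger -> works
print(respond("Wake me up at 7 tomorrow, please"))  # same intent, different phrasing -> fails
```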

But who really cares if we don’t get a Jarvis in our lives (even if it would be super cool…)?

Is that really the holy grail of A.I.?

It turns out that no, a personal Jarvis is nowhere near the limit of what NLP makes possible.

The idea behind NLP and why it’s A.I.’s silver bullet is that it could totally remake how humans interact with machines.

From the beginning of the personal computer until today, humans have had to modify our behavior to interact with a computer. We’ve had to learn a series of protocols, a different language and a whole set of behaviors just to use one. At first, it was memorizing DOS commands to get the computer to do what you wanted. Then it was learning how to use a mouse within a graphical user interface. Then, as things became pocket-sized, we learned how to tap, pinch, drag and double-tap. Those iPhone-imbued gestures now dominate our interaction with technology.

That’s not a wholly natural way to interact with something. Sure, we’ve adapted to it rather adeptly, but it’s nowhere close to the most natural or efficient way to work with a machine. There’s a reason humans communicate by talking to one another: it’s the fastest way to convey the most information accurately between two people.

If machines could truly understand what we were saying and then act on that input, it would make every aspect of our technological lives more efficient and seamless. But, even more revolutionary would be what we could then do with software.

As it stands now, getting a computer program to do what we want requires programmers to learn what is essentially a foreign language. For many programmers and software projects, the architects have to know and understand multiple programming languages to bring the product to fruition. And there’s no single right way to write a given piece of code; there are tons of different ways to engineer a solution and get the computer to do what you want. Some of those options are far more elegant, simple or intuitive than others, but that doesn’t change the fact that writing code is often as much art as it is science.
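As a trivial illustration (our own example, not drawn from any particular project), here are three equally valid ways to compute the same thing in Python. All of them work, but they differ in clarity and elegance:

```python
# Three ways to sum the squares of the even numbers from 1 to 10.
numbers = range(1, 11)

# 1. Explicit loop: verbose, but every step is spelled out.
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# 2. Generator expression: compact and idiomatic.
total_expr = sum(n * n for n in numbers if n % 2 == 0)

# 3. Functional style with map/filter: valid, but arguably harder to read.
total_func = sum(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))

assert total == total_expr == total_func == 220
```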

But, what if instead of having to learn those new languages and figure out how to translate what you want into something the computer can interpret and act on, you could literally just talk to the computer system? Theoretically, the machine would simply understand what you asked of it, and it would then act on that information and build the piece of software exactly as you requested. It could then run that request through a neural network, iterate rapidly, learn from its mistakes, and arrive at the ideal version of the thing you asked for. Imagine how much faster, more creative, and more powerful such a system would be; it truly would change the very nature of our technological present and future.
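Purely as a thought experiment, the loop described above might be sketched like this. Every function here is a hypothetical placeholder that just names a stage from the paragraph; none of this exists today in this form.

```python
from dataclasses import dataclass

# A purely hypothetical natural-language-to-software loop. The stubs below only
# stand in for capabilities true NLP would have to provide.

@dataclass
class EvalResult:
    meets_spec: bool
    feedback: str

def understand(request: str) -> str:
    return request  # stand-in for true NLP: plain English -> precise specification

def generate_program(spec: str) -> str:
    return f"# program generated from spec: {spec}"  # stand-in for code generation

def evaluate(candidate: str, spec: str) -> EvalResult:
    return EvalResult(meets_spec=True, feedback="")  # stand-in for testing against the spec

def refine(candidate: str, results: EvalResult) -> str:
    return candidate  # stand-in for learning from mistakes and iterating

def build_software_from_request(request: str, max_iterations: int = 10) -> str:
    spec = understand(request)
    candidate = generate_program(spec)
    for _ in range(max_iterations):
        results = evaluate(candidate, spec)
        if results.meets_spec:
            break
        candidate = refine(candidate, results)
    return candidate

print(build_software_from_request("Build an app that tracks my team's sales calls"))
```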

That’s why it’s important for us to be clear: we do not poo-poo Alexa or Siri or Google Assistant or any of the other systems in research or development at the moment. We think they’re already powerful tools that are getting better all the time (we build out Alexa Skills for clients regularly because we’re so bullish on the platform), and we want to encourage their respective companies to plow forward undeterred. But true NLP is the holy grail of A.I., because achieving it will mean we’ve built machines with true cognition and intelligence, and fundamentally altered the way we interact with the technology in our lives.




