What is natural language processing (NLP) and how it helps

Have you ever wondered how Alexa works? Or how the search bar on a website auto-fills and tries to predict what you want to search for? That's natural language processing at work.

What is natural language processing? In a nutshell, natural language processing (NLP) is a subfield of artificial intelligence. Its primary purpose is to enable computers to interact with human languages. It can be used in a variety of applications, from voice-to-text processing and language translation to finding insights by churning through large amounts of natural language data.

Natural language processing is an enormous topic spanning speech, translation, chatbots and analytics. First off, let’s run through what the typical NLP pipeline looks like and what it takes to get computers to understand language. (Hint: a lot.)

In this article we’ll cover the basics of NLP, how you can apply it to your business and where the technology is not quite ready to be applied yet.

Without further ado:

What is Natural language processing?

Computers have always had a reputation for handling structured data. Stuff like database tables, spreadsheets, mathematics.

Where the human brain currently excels and where computers really struggle is handling unstructured data such as language, text and multimedia content. Everything from emails and Word documents to videos and photos.

Language is especially complex as we’ll soon cover with some of my favourite (and hilarious) examples. 

Primarily, natural language processing is used for:

1. Text & document analytics (for example, helping companies find keywords to identify common themes and patterns within customer service complaints)

2. Allowing humans to communicate with computers in an intuitive ‘natural’ way, whether that’s via text or speech. 

Due to recent advancements in NLP, we’re now able to give virtual assistants like Alexa and Siri important commands such as: “Hey Alexa, what does the fox say?” or play pranks on our friends.

An example of putting the latest NLP technology to good use.

NLP does this, and resolves a lot of the ambiguity in language, by using statistics and probability to map frequently irrational human language into logical data structures that computers can understand.

Why are chatbots and NLP becoming so prevalent? 

Thanks to wide-scale internet access, we now have an unfathomable amount of information and online tools at our fingertips.

This is causing huge changes in our culture and way of working.

To give you two examples: 

People are working faster and becoming more efficient:

This is because of unprecedented access to cheap, powerful computers that can run the algorithms needed for accurate natural language processing – something that would have been impossible even 5-10 years ago.

Thanks to these developments, NLP has seen a recent boom in popularity and is used every day by billions of people in applications such as: 

  • Auto-predicting your sentences in Gmail.
  • Writing tools such as Microsoft Word and Grammarly that check your spelling and grammar.
  • Personal assistant applications such as Alexa and Siri.
  • Machine translation platforms like Google Translate.

Changes in buying behaviours

At Enterprise Bot Manager (EBM) we use NLP to facilitate automated, intelligent customer service chatbots.

Because of all this access to information, our shopping behaviours are changing. We’re now able to do our own research into products we’re considering and make more informed decisions.

We take longer to reach out to customer service teams now, and when we do, we want convenience, speed and a personal touch.

Obviously, it’s hard for companies to keep up with this, which is why everyone is looking to embrace artificial intelligence (AI) to reduce repetition, improve efficiency and provide better user experiences.

Natural language processing is a key pillar of the AI technology stack.

Why do computers have such a hard time understanding human language?

As I previously said. 

Language is hard.

Logical flows and button-based hybrid systems often perform better than the best NLP systems in most business applications at the moment. (However, NLP systems are VERY close and will overtake soon.)

Let’s run through some fun examples to elaborate on the challenge chatbot and NLP developers have ahead of them:

One sentence. Many meanings.

Language translation

As you will have experienced first hand when using translation tools, we still haven’t cracked the nuances of language.

For example, there is a mythical tale that a first attempt at translating some biblical documents from English to Russian went a bit… wrong.

Here is the biblical sentence that required translation:

“The spirit is willing, but the flesh is weak.”

Which got translated to:

“The vodka is good, but the meat is rotten.”

Fortunately, we’ve made advances, but we’re still not quite there:

Homonyms

Words that are spelt and sound the same, but carry different meanings, create an interesting NLP problem.

Such as:

“Paris Hilton listens to Paris Hilton at the Paris Hilton”

is something you understand without a problem.

But how do you get a computer to understand the difference?
When does “Paris” refer to a person, and when does it mean a hotel’s location in France?

Conversational implicature:

When you have a conversation with a close friend, you will find that you can transfer ideas at a much faster rate. It’s almost like you’re reading each other’s minds. You predict what they’ll say next, or what they mean, from very little verbal information.

We use metaphors and past experiences to give unspoken context in a conversation.

For another human being, it’s second nature. 

Of course we “know” what the other person wants. We don’t need people to use a specific set of words.

This is “conversational implicature” at work. 

We see this a lot in customer service. Customers who call about a broken internet connection complain about the same problem in different ways. “I’m not able to get online” and “I think my internet is broken” are both indicative of “a problem”, and the problem is “lack of internet”. This is why you need to write alternate queries for the same piece of information.
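To make this concrete, here’s a toy sketch of mapping differently worded queries onto one intent using simple keyword sets. The intent names and keywords are invented for this example – this is not how EBM’s production system works:

```python
# Toy intent matcher: map differently worded customer queries onto the
# same intent via shared keywords. (Invented example, not a real system.)
INTENTS = {
    "no_internet": {"online", "internet", "connection", "wifi", "broadband"},
    "billing": {"bill", "invoice", "charge", "payment"},
}

def detect_intent(query):
    # Normalise apostrophes and case, then split into a set of words.
    words = set(query.lower().replace("’", "").replace("'", "").split())
    for intent, keywords in INTENTS.items():
        if words & keywords:  # any shared keyword counts as a match
            return intent
    return None

print(detect_intent("I’m not able to get online"))     # no_internet
print(detect_intent("I think my internet is broken"))  # no_internet
```

Real systems use statistical models rather than hand-written keyword lists, but the goal is the same: many surface forms, one underlying intent.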

Most of these problems are conversation design issues as much as NLP, which we cover more here.

What does the NLP process actually look like? 

As we’ve mentioned, natural language processing is a very broad topic. Generally speaking, however, there are a handful of common steps in the NLP process.

Sentence segmentation 

First off, we need to break down paragraphs into separate sentences. Sadly, it’s not as simple as finding punctuation marks. Documents are often formatted incorrectly, contain errors (yes, full stops in the wrong place!) and so on.

For example:

We use full stops at the end of sentences, but also for things such as “e.g.” and numbers: “1.2”.

Fortunately, modern NLP systems have sophisticated techniques to get around this. 
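To see why naive rules struggle, here’s a minimal sketch of a punctuation-based splitter and the kind of sentence that defeats it:

```python
import re

# Naive sentence splitter: break on ., ! or ? followed by whitespace.
def naive_split(text):
    return re.split(r'(?<=[.!?])\s+', text.strip())

print(naive_split("NLP is hard. Computers prefer tables."))
# ['NLP is hard.', 'Computers prefer tables.']

# Abbreviations defeat the rule: "e.g." wrongly ends a sentence here,
# while "1.2" survives only because it has no space after the full stop.
print(naive_split("See e.g. section 1.2 for details. Then continue."))
# ['See e.g.', 'section 1.2 for details.', 'Then continue.']
```

Modern segmenters use trained models precisely because rules like this break on abbreviations, decimals and messy formatting.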

Word Tokenisation

Also known as lexical analysis. A lexicon is the collection of words or phrases of the language used. To carry out lexical analysis, we analyse the structure of words and convert each sentence – essentially a sequence of characters – into a sequence of tokens (or strings) that is easier for the computer to work with. (That’s a bit oversimplified, but let’s not go full PhD in this article.)
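As a rough illustration, a minimal word tokeniser might look like this (production tokenisers also handle contractions, hyphenation, emoji and much more):

```python
import re

# Minimal word tokeniser: lowercase the sentence, then pull out runs of
# letters and digits, discarding punctuation.
def tokenise(sentence):
    return re.findall(r"[a-z0-9]+", sentence.lower())

print(tokenise("The spirit is willing, but the flesh is weak."))
# ['the', 'spirit', 'is', 'willing', 'but', 'the', 'flesh', 'is', 'weak']
```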

Parsing

Parsing is the process of analysing these tokens, figuring out what they mean in a sentence and showing the relationships between words.

Dependency parsing 

We also need to figure out how all the words in a sentence relate to each other; this is a subset of the parsing process.

Semantic Analysis

Once parsed, we then use semantic analysis to derive meaning from these syntactic structures. Syntax is the set of rules and principles that govern a language.

Now that we’ve built an algorithm that understands the basic structure of a sentence, we can start deriving ideas and insight from the text – and talk about some of the popular NLP methods.

What are the most popular NLP methods?

Since NLP is a large topic and a portion of NLP methods are still research projects at universities, or are used to support larger tasks, we’ll just cover the important methods for business applications that you need to be aware of.

We can summarise this further by breaking it down into two categories:

  • Speech
  • Text

Text

Text classification (intent recognition & spam detection) 

Naive Bayesian classifiers are a popular example of a text classification model. Have you ever wondered how your email inbox knows when something is a social email, spam or worth going into your main inbox?
You’re seeing Bayesian spam filtering in action: a statistical NLP technique that compares the words in an email and then decides whether it’s valid or junk.
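Here’s a toy sketch of the Bayesian idea on a tiny hand-made corpus – nothing like a production filter, but it shows the statistics at work:

```python
import math
from collections import Counter

# Toy Naive Bayes spam filter trained on a tiny invented corpus.
spam = ["win free money now", "free prize claim now"]
ham = ["meeting moved to monday", "lunch plans for monday"]

def train(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(words, counts, total):
    # Laplace smoothing so unseen words don't zero out the product.
    return sum(math.log((counts[w] + 1) / (total + len(vocab))) for w in words)

def classify(text):
    words = text.lower().split()
    spam_score = log_prob(words, spam_counts, spam_total)
    ham_score = log_prob(words, ham_counts, ham_total)
    return "spam" if spam_score > ham_score else "ham"

print(classify("claim your free money"))  # spam
print(classify("monday meeting"))         # ham
```

A real filter trains on millions of messages and adds many more features, but the core idea – compare word probabilities under each class – is exactly this.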

Sentiment analysis

Sentiment analysis is another popular example of text classification.

It involves identifying the mood or subjective opinions within large amounts of text, including average sentiment and opinion mining. When you type “this service is awful!” into the customer support chat, it’s sentiment analysis that detects “awful” and grades the sentence on a scale of how emotional it is.
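A minimal sketch of the idea, using a tiny hand-made sentiment lexicon (real systems learn these word weights from data rather than hard-coding them):

```python
# Toy lexicon-based sentiment scorer: sum hand-assigned word weights.
LEXICON = {"awful": -3, "broken": -2, "bad": -1, "good": 1, "great": 2, "love": 3}

def sentiment(text):
    # Strip trailing punctuation so "awful!" still matches "awful".
    score = sum(LEXICON.get(w.strip("!.,?"), 0) for w in text.lower().split())
    return "negative" if score < 0 else "positive" if score > 0 else "neutral"

print(sentiment("this service is awful!"))  # negative
print(sentiment("I love this great app"))   # positive
```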

Named entity recognition (NER)

Named Entity Recognition is a technique used to extract “entities” from text. Some of the entities that these algorithms are typically trained on include:

  • Company names
  • People’s names
  • Geographic locations (Both physical and political)
  • Product names
  • Dates and times
  • Amounts of currency

NER models don’t just look up definitions from a big list. They gather the whole context of the sentence and how the word is used, then make a statistical prediction about what that word/noun means.

Note: machine learning algorithms work on a certainty scale from 0 to 1 and are rarely absolutely certain about anything – e.g. 0.99 certain or 0.74 certain.

A good NER algorithm can tell you the difference between Paris Hilton the person and Paris Hilton the hotel.

NER is immensely valuable in NLP as it makes it easy to extract structured data from vast quantities of text and get bottom-line value out of an NLP process.
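As a crude illustration of context-based disambiguation, here’s a toy labeller for the Paris Hilton sentence. The cue-word lists are invented for this example; real NER models learn such patterns statistically rather than from hand-written rules:

```python
# Toy context-based disambiguation for "Paris Hilton": a sketch of the
# idea that NER uses surrounding words, not a dictionary lookup.
PERSON_CUES = {"listens", "said", "wore", "sings"}
PLACE_CUES = {"at", "in", "near"}

def label_entity(tokens, i):
    """Label the two-token entity starting at index i."""
    before = {t.lower() for t in tokens[max(0, i - 2):i]}
    after = tokens[i + 2].lower() if i + 2 < len(tokens) else ""
    if before & PLACE_CUES:   # preceded by a place preposition
        return "LOCATION"
    if after in PERSON_CUES:  # followed by a "person" verb
        return "PERSON"
    return "UNKNOWN"

tokens = "Paris Hilton listens to Paris Hilton at the Paris Hilton".split()
print(label_entity(tokens, 0))  # PERSON   (followed by the verb "listens")
print(label_entity(tokens, 8))  # LOCATION (preceded by "at the")
```

Note the middle mention (the album she listens to) falls through to UNKNOWN – a good example of why real models need far richer context than two neighbouring words.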

Translation: 

Simply: translating text or speech from one language to another. Everyone’s used Google Translate at some point.

A favourite example of mine:

One place where many of these subsets of NLP are used and combined is the app Duolingo. If you haven’t given it a go yet, I highly recommend it.

That way you can see for yourself the progress we’ve made with NLP technology. It’s a great demonstration of how NLP helps to create a much more fun and personalised experience for the user.

Natural language generation:

The most recent and politically charged example of natural language generation (NLG) has to be GPT-2, created by OpenAI. GPT-2 can create an entire article from a small sentence prompt. You can give it a go yourself here!

NLG has a very wide set of applications, from creating eCommerce product descriptions to predicting what you’re going to type next in your email in Gmail!

NLG is still some way from being indistinguishable from human writing, and some academics think it will never be as good as a human.

It’s likely the commercial uptake of NLG is still quite limited as you still need humans in the loop to proof-read and edit generated text. 

The best NLG models like GPT-2 are still very resource hungry and expensive to run.

When you couple the expense of running these models with the need for humans to sense-check the outputs, they probably won’t give you much of an edge in your business just yet.

Text analytics. 

Text analytics is used to explore textual content and derive new variables from raw text that may be visualized, filtered, or used as inputs to predictive models or other statistical methods. 
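A small sketch of what “deriving variables from raw text” can mean in practice – turning a free-text message into a handful of structured features:

```python
from collections import Counter

# Derive simple structured variables from raw text: the kind of features
# text analytics can feed into dashboards or predictive models.
def text_features(text):
    words = text.lower().split()
    return {
        "word_count": len(words),
        "exclamations": text.count("!"),
        "top_word": Counter(words).most_common(1)[0][0],
    }

print(text_features("the internet is down and the router blinks"))
# {'word_count': 8, 'exclamations': 0, 'top_word': 'the'}
```

Real pipelines would add stop-word removal, sentiment scores, entity counts and so on, but the principle is the same: unstructured text in, structured variables out.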

One example is using it in social media analytics to track awareness and sentiment about specific topics and identify key influencers.

We cover some examples of text analytics later in this article. 

Automatic summarisation

An example of this is Summly, an app that a teenager sold to Yahoo! for almost $30 million. Automatic summarisation takes an extensive text document and boils it down to an easy-to-read summary. It uses a combination of other machine learning techniques alongside NLP.

There are 2 flavours of automatic summarisation: extractive and abstractive.

Extractive summarisation:

This tries to extract and combine a limited number of key sentences from a long document. Abstractive summarisation, by contrast, is a newer and more experimental type of system that ingests and parses a long document before generating a summary from scratch – not always a perfect process.

Abstraction-based summarisation

The weakness of extraction techniques is that they only copy information the algorithm deems most important to the summary it makes.

Meanwhile, abstraction involves paraphrasing sections of the source document.

Overall, abstraction creates a more condensed and higher quality summary compared to extraction. However, the programs to do this are more expensive to develop and require natural language generation algorithms, a field still in its infancy.
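Here’s a minimal sketch of the extractive approach, scoring each sentence by the corpus-wide frequency of its words (real extractive systems use far more sophisticated scoring):

```python
import re
from collections import Counter

# Minimal extractive summariser: rank sentences by how frequent their
# words are across the whole document, then keep the top n.
def summarise(text, n=1):
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    score = lambda s: sum(freq[w] for w in re.findall(r'[a-z]+', s.lower()))
    ranked = sorted(sentences, key=score, reverse=True)
    return " ".join(ranked[:n])

doc = ("NLP turns language into data. Language data powers chatbots. "
       "Cats are nice.")
print(summarise(doc))  # NLP turns language into data.
```

Note this only copies existing sentences – exactly the limitation of extraction described above – whereas an abstractive system would paraphrase.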

Some examples of how you can use these methods in your own business:

  • Making executive summaries for board members
  • Creating reports and gathering insights from large text datasets. 

Speech

Speech recognition 

I recently signed up for a Halifax travel credit card. During the process, they asked if I wanted to use my voice as part of the security process to make my bank account more secure. This is speech recognition in a nutshell.

Speech-to-text & text-to-speech (TTS) conversion: 

Basically, transforming voice commands into written text, and vice versa. Have you ever missed a phone call and read the automatic transcript of the voicemail in your email inbox or smartphone app? That’s speech-to-text conversion.

What is Natural language understanding?

Natural language understanding (NLU) is a subset of NLP. Many of the semantic NLP examples we gave above, such as Duolingo, translation and virtual assistants, will all be using it.

Natural language understanding goes beyond just recognising language: it also identifies intent and resolves the ambiguity in sentences.

As we briefly covered, NLU tackles the immensely complex features of language. Such as conversational implicature and homonyms.

As a side note, NLU is still a controversial term, especially in the academic field. Many believe it’s redundant!

What are the top 3 NLP business applications that drive the highest ROI?

Virtual assistants

There are more virtual assistant examples than you can shake a stick at. For us at EBM, our most successful chatbot in terms of return on investment has to be our work with HSBC and their chatbot AiDA, which handles the workload of 6-10 full-time equivalent customer service employees!

AiDA is evolving into a banking assistant in the customer’s pocket, completing a variety of tasks such as helping them get a bird’s-eye view of their spending habits or letting them know what benefits are available from their card.

Best of all, it ties neatly into the current HSBC app, cutting down the expenditure of developing a separate app.

Text & automated analytics

Approximately 90% of all data in an organisation is unstructured. 

Text analytics can be used to explore raw textual content and derive insight that can be visualized, filtered, or used as inputs to predictive models, such as we did here.

Some other examples are:

  • Accumulating reviews: By gathering all of your reviews for products and services and summarising the content into a word clustering diagram, you can see how frequently different pros and cons are mentioned.
  • Social media analytics. Track awareness and sentiment about specific topics and identify key influencers or judge how happy or sad your customers are.
  • Automated analytics: as NLP technology continues to evolve and improve, we’re currently looking into a feature that brings all of the above examples together in an automated fashion. We’re not able to say much more about it for the time being, as it’s still in the workshop, but be very excited for the coming months!

Customer service automation 

Tying in very closely with virtual assistants, chatbots and NLP are extremely useful for helping customers navigate knowledge bases.
Today, NLP systems are capable of generating support tickets and answers to incoming questions. As with our AiDA example, we were able to automate 24,213 conversations (over 6 months); 14% of conversations started with AiDA were completed without human intervention.

Most exciting of all, this kind of automated support doesn’t just save businesses money. It also expedites help for customers, who come away feeling more satisfied. 

Related questions

What are your favourite courses in natural language processing?

If you’re looking to really sink your teeth into NLP and want further reading, we highly recommend these sources:
