The global AI market size was estimated at $62.35 billion in 2020. That number is expected to reach $93.53 billion by the end of 2021 (GrandView Research). While advances in robotics and consumer-facing applications are often what comes to mind when we think of AI, there is one subset of the technology that has been a huge driver of this growth in recent years: Natural Language Processing (NLP).
Put simply, NLP helps computers communicate with humans. It enables algorithms to understand text, human speech, and even images, and to scour documents, academic papers, customer records, and other data sources to determine which parts are important. Unlike basic speech-to-text, NLP can analyze sentiment, sarcasm, and other nuances of human language, making it effective for a wide range of tasks.
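To make the sentiment-analysis idea concrete, here is a deliberately minimal, lexicon-based sketch in Python. The word lists are invented for illustration; production systems use trained models that also handle negation, sarcasm, and context, which a lookup like this cannot.

```python
# Toy lexicon-based sentiment scorer (illustrative only).
# Real NLP systems use statistical models rather than fixed word lists.
POSITIVE = {"great", "love", "excellent", "helpful"}
NEGATIVE = {"bad", "hate", "terrible", "broken"}

def sentiment(text):
    """Classify text by counting positive vs. negative words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great product"))   # positive
print(sentiment("the device arrived broken"))   # negative
```

A customer-service chatbot of the kind mentioned below could use a score like this to route angry messages to a human agent, though in practice the scoring model would be far more sophisticated.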
For example, in a medical setting, NLP can be used to screen clinical trial candidates to accelerate drug development — something that’s clearly important in light of the global COVID-19 pandemic. In retail, NLP helps power customer service chatbots and in media, it can help identify fake news online — another area that’s gaining importance on a global scale. The use cases for NLP span industries and companies and there are no signs of slowing.
In fact, new research from John Snow Labs and Gradient Flow found that 90% of global respondents reported that their NLP budgets grew by 10-30% compared to last year. This is significant, given many markets are still recovering from the economic aftermath of 2020. It’s not just the technology sector either; NLP has been growing significantly in healthcare and financial services. This is a sign that even industries with strict regulatory bodies are finding ways to put their data to work.
Some of the forces driving this uptick in use are Named Entity Recognition (NER) and document classification, the most popular use cases of the technology according to survey respondents. NER can automatically scan entire articles and reveal the relevant people, organizations, and places discussed in them. While it sounds basic, knowing the major parties within large bodies of text makes it possible to categorize information automatically, improving content discovery quickly and efficiently.
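The core idea behind NER can be sketched with a dictionary lookup, as below. This is an assumption-laden toy: the entity table is invented, and real NER models are statistical, recognizing entities they have never seen rather than matching a fixed list.

```python
# Minimal dictionary-based sketch of Named Entity Recognition.
# Real NER uses trained sequence models; this only illustrates the
# input/output shape: text in, (entity, label) pairs out.
KNOWN_ENTITIES = {
    "John Snow Labs": "ORG",   # hypothetical entries for illustration
    "Gradient Flow": "ORG",
    "London": "LOC",
}

def extract_entities(text):
    """Return (entity, label) pairs found in the text."""
    return [(name, label) for name, label in KNOWN_ENTITIES.items()
            if name in text]

article = "New research from John Snow Labs and Gradient Flow surveyed NLP users."
print(extract_entities(article))
```

Once the entities in a document are known, tagging and categorizing it for content discovery becomes a straightforward indexing task.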
Cited among both general respondents and tech leaders as the top NLP use case, it will be interesting to see if NER remains the darling of NLP or whether more advanced use cases inch ahead over the next several years. While that is unlikely in the near term, question answering, in which humans ask questions in plain language and receive relevant responses, is another use case experiencing growth. It's an interesting concept, one that aligns more squarely with our perception of AI, but there's something to be said for the practicality NER brings to enterprise organizations looking for insights within their data.
Despite the increased investments and broad applications of NLP, like any technology, NLP has its shortcomings. While current tools on the market are getting smarter and more sophisticated over time, ensuring accurate results still requires a level of finesse only humans can provide, and not just any humans. Intervention from a data scientist is often needed to get NLP models from research to production, and then to monitor and tune them appropriately to prevent them from degrading over time.
For humans and machines to successfully deploy NLP projects together, a reliable solution is a necessary intermediary, and there is no shortage of options to choose from. A large majority (83%) of survey respondents said they use at least one of the major cloud providers, despite difficulty tuning models and cost being two big concerns. As such, many technologists use a mix of solutions, from the aforementioned cloud services to libraries and other open source tools.
Developers and companies building language applications will benefit from recent progress in transfer learning, data augmentation, and model testing and validation. Transfer learning involves taking knowledge extracted from one setting and applying it in another setting. Researchers are actively pursuing this research, so we expect tools for developers will begin to appear over the next few years.
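The transfer-learning idea described above can be sketched in miniature: features learned in one setting are frozen and reused, and only a small new component is trained on the target task. Everything below is invented for illustration; the "pretrained" vectors are toy numbers, whereas a real system would reuse embeddings learned from large corpora.

```python
# Sketch of transfer learning: frozen "pretrained" word vectors
# (toy values, for illustration) feed a tiny classifier trained
# only on the new target task.
PRETRAINED = {                 # frozen features from the source setting
    "refund": [1.0, 0.0],
    "broken": [1.0, 0.2],
    "thanks": [0.0, 1.0],
    "great":  [0.1, 1.0],
}

def featurize(text):
    """Average the frozen word vectors; unknown words map to zero."""
    vecs = [PRETRAINED.get(w, [0.0, 0.0]) for w in text.lower().split()]
    n = max(len(vecs), 1)
    return [sum(v[i] for v in vecs) / n for i in range(2)]

def train_head(examples):
    """Train only the new 'head': here, one centroid per label."""
    by_label = {}
    for text, label in examples:
        by_label.setdefault(label, []).append(featurize(text))
    return {label: [sum(col) / len(vs) for col in zip(*vs)]
            for label, vs in by_label.items()}

def predict(centroids, text):
    f = featurize(text)
    return min(centroids, key=lambda lbl: sum((a - b) ** 2
               for a, b in zip(f, centroids[lbl])))

model = train_head([("refund broken", "complaint"), ("thanks great", "praise")])
print(predict(model, "broken refund please"))  # complaint
```

The point of the sketch is the division of labor: the expensive knowledge (the vectors) comes from the first setting, so the second task needs only a handful of labeled examples.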
With a consistently growing AI market and increasing investments in NLP specifically, it’s safe to say this is not just a flash in the pan. Businesses that want to use their data strategically should consider how NLP can benefit their organization and start to explore some of the proven use cases and solutions that make it possible. It’s an exciting time to be in the field of AI, but we still have a long way to go before we realize its potential.