Our world is flooded with acronyms, and there is no shortage of these language shortcuts in the technology sector. NLP is a hot acronym of late, referring to natural language processing, a market projected to exceed $22b by 2025. But what exactly is NLP?
NLP powers personal assistants like Siri and Alexa, interprets language in Google Translate, and filters and organizes your email based on subject lines. Contact centers use NLP to analyze millions of bytes of text and voice data, looking for patterns in caller sentiment. And NLP is used to search datasets large and small to find answers to questions spoken or typed into devices, including those not-always-so-popular VRUs many service providers use.
According to SAS, “Natural language processing (NLP) helps computers communicate with humans in their own language and scales other language-related tasks. For example, NLP makes it possible for computers to read text, hear speech, interpret it, measure sentiment and determine which parts are important.”
NLP uses machine learning and other AI technologies to perform incredibly complex interpretation tasks quickly. Typically, when you speak into a device, the NLP engine processes what you have said, determines how best to respond and then returns the response within a few seconds. That might not sound complicated, but the variances of human language make it difficult for a computer to decipher. I once learned this challenge firsthand while conversing with a student to whom I was teaching English as a second language. I said, “After we finish this, I’ll drop you on my way to my next appointment.” Imagine my surprise at learning that my student was deeply disturbed at the notion of being dropped. Of course, most English speakers would understand the shortcut to mean not that I would drop her literally, but that I would take her to her destination. Understanding this variance means processing the context of the sentence as well as the speech shortcut (sometimes thought of as “slang” or “vernacular”). While humans who speak a language can generally process those nuances quickly, it can be difficult for a machine. Add to that the sheer lexicon of the English language alone: the Oxford English Dictionary has more than 200,000 entries, and an average English speaker’s vocabulary contains around 20,000 of these words.
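The hear-interpret-respond loop described above can be sketched with a toy, keyword-based intent matcher. This is purely illustrative — the intent names, keyword lists and responses below are invented for the example, and real NLP engines rely on trained statistical models rather than hand-written keyword rules:

```python
# Toy sketch of the "process what you said -> determine a response" loop.
# All intents, keywords and replies here are hypothetical illustrations.

INTENTS = {
    "weather": ["weather", "rain", "sunny", "forecast"],
    "time": ["time", "clock", "hour"],
    "greeting": ["hello", "hi", "hey"],
}

RESPONSES = {
    "weather": "Let me check the forecast for you.",
    "time": "Here is the current time.",
    "greeting": "Hello! How can I help?",
    None: "Sorry, I didn't understand that.",
}

def interpret(utterance: str):
    """Pick the intent whose keywords overlap most with the utterance."""
    words = set(utterance.lower().split())
    best_intent, best_score = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & set(keywords))
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

def respond(utterance: str) -> str:
    """Map the interpreted intent to a canned reply."""
    return RESPONSES[interpret(utterance)]
```

A keyword matcher like this immediately hits the very problem the “drop you” anecdote illustrates: it has no sense of context, so any phrasing outside its keyword lists falls through to the fallback reply. Bridging that gap is exactly what machine-learned NLP models are for.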
Even with all of this complexity, in the few seconds between your asking your device a question and its returning a response, various NLP techniques are applied to parse, tag, detect and identify the language, whether in text or the spoken word. As SAS explains, “NLP tasks break down language into shorter, elemental pieces, try to understand relationships between the pieces and explore how the pieces work together to create meaning.” And NLP is backed by computers with incredible power: a lexicon the size of the English language is no problem for a computer, and machine learning can apply algorithms to look for patterns in the way similar words are used for the same purpose.
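The first of those elemental steps — breaking language into shorter pieces and looking for patterns in how words are used — can be illustrated with a minimal sketch using only the Python standard library (a crude stand-in for the tokenizers and statistical models a real NLP engine would use):

```python
import re
from collections import Counter

def tokenize(text: str) -> list:
    """Break text into lowercase word tokens -- the 'shorter, elemental
    pieces' step, vastly simplified."""
    return re.findall(r"[a-z']+", text.lower())

def word_frequencies(text: str) -> Counter:
    """Count how often each token appears: the kind of usage pattern a
    learning algorithm might exploit to group words used the same way."""
    return Counter(tokenize(text))
```

For example, `tokenize("I'll drop you on my way.")` splits the sentence into six tokens, and counting frequencies across a large corpus is one simple way patterns in word usage start to emerge.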
The future market for NLP technology in Government
As noted earlier, the projected market for all the components that make up NLP — software, hardware and services — is nearly $23b by 2025. Government is expected to be part of that user base: with urbanization projected to add 2.5 billion more people to cities by 2050, governments are looking for ways to process big data and leverage AI to more efficiently manage and deliver government services.
According to SAS, “Considering the staggering amount of unstructured data that’s generated every day, from medical records to social media, automation will be critical to fully analyze text and speech data efficiently.”
Use cases at the government level can include analyzing data from multiple reports and databases to produce actionable insights, detecting trends in citizen engagement, and automating multiple service delivery points, such as permit requests, issue reporting and language interpretation. Managing transportation and parking assets through smart parking applications is another opportunity to leverage AI, including NLP, to better manage city government services. Sensors used to monitor surface and garage parking can also manage city street light usage, saving energy, and collect data about traffic patterns to assist city planners.
“Cities are looking for ways to leverage AI, including NLP, to better manage services and engage with their citizens, who are increasingly looking for a digital interaction,” said Steve Denney, Co-Founder and CTO of CityFront Innovations, a provider of AI technology for mobile city applications. “The smart cities of tomorrow are being built today.”
About CityFront Innovations: CityFront partners with cities, municipalities and community organizations to deliver the first smart city integration platform, powering an artificially intelligent (AI) citizen engagement mobile app that enables citizens to engage with the city intuitively and intelligently. To learn more, contact us.