The Power of Conversational Intelligence
Alongside technology innovation, enterprises are looking for ways to track, oversee, and organize customer journeys and to identify the problems within them, creating new opportunities for a greater customer experience.
The blend of innovation and execution is Peter Relan's key success factor, driving him to design and deliver the best for his customers at all times. Through GICRM AI, Peter and his team focus on making users and knowledge workers highly productive by leveraging AI plus human-in-the-loop approaches.
Transformers have taken the AI research and product community by storm. We have seen them advance multiple fields in AI, such as natural language processing (NLP), computer vision, and robotics. In this blog, I will share some background on conversational AI, NLP, and transformer-based large-scale language models such as BERT and GPT-3, followed by examples of popular applications and how to build NLP apps.
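At the heart of every transformer model mentioned above (BERT, GPT-3) is scaled dot-product attention. As a minimal sketch of that core operation, the following toy example computes attention over three random token embeddings with NumPy; the shapes and values are illustrative only, not taken from any particular model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity of queries and keys
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Three toy "token" embeddings of dimension 4 (self-attention: Q = K = V)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, attn = scaled_dot_product_attention(X, X, X)
```

Each output row is a weighted mix of all input rows, with weights determined by how similar each token is to every other token; stacking this operation with learned projections is what gives transformers their power.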
I'll start with historical context before explaining my view on how AI should get ready to become democratized for implementation and use by any business professional, much like SaaS.
Chandra Khatri, Chief Scientist and Head of AI Research
The Surge of No-Code AI Platforms, Products, and Startups: The past few years were spent building powerful Deep Learning/AI toolkits such as PyTorch and TensorFlow. Engineers are now ready to build No-Code AI platform and product layers on top of these toolkits, where users simply provide their data and select a model through a config or UI. Models are not only trained and served; REST APIs are also exposed to applications. Got-It AI's No-Code, Self-Discovering, Self-Training, and Self-Managing platform is an effort toward democratizing conversational AI. Microsoft's recent "Lobe" app, which lets anyone train an AI model, is an effort in that direction as well.
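The config-driven flow described above can be sketched in a few lines: the user supplies labeled data plus a config, and the platform selects, trains, and returns a servable model. Everything here is a hypothetical illustration using only the standard library; the model name, config keys, and registry are invented and do not reflect Got-It AI's or Lobe's actual interfaces.

```python
# Hypothetical no-code flow: data + config in, trained model out.
from collections import Counter

MODEL_REGISTRY = {}  # maps config names to model classes

def register(name):
    """Decorator so new model types plug in without changing the platform."""
    def wrap(cls):
        MODEL_REGISTRY[name] = cls
        return cls
    return wrap

@register("majority_class")
class MajorityClass:
    """Trivial baseline: always predict the most frequent training label."""
    def fit(self, texts, labels):
        self.label = Counter(labels).most_common(1)[0][0]
        return self
    def predict(self, text):
        return self.label

def train_from_config(config, data):
    """The 'platform' step: look up the configured model and train it."""
    model = MODEL_REGISTRY[config["model"]]()
    texts, labels = zip(*data)
    return model.fit(texts, labels)

config = {"model": "majority_class"}  # what a user would pick in a UI
data = [("refund please", "billing"), ("card declined", "billing"),
        ("app crashes", "tech_support")]
served = train_from_config(config, data)
```

In a real platform the trained `served` object would sit behind a REST endpoint; the point of the sketch is that the user touched only data and config, never model code.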
Generating queries corresponding to natural language questions is a long-standing problem. Traditional methods lack language flexibility, while newer sequence-to-sequence models require large amounts of data. Schema-agnostic sequence-to-sequence models can be fine-tuned for a specific schema using a small dataset, but these models have relatively low accuracy. We present a method that transforms the query generation problem into an intent classification and slot filling problem. This method works with small datasets.
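The reformulation above can be sketched end to end: rather than generating a query token by token, classify the question's intent, extract slot values, and render a fixed query template. The intents, slot patterns, and SQL templates below are invented examples, and keyword/regex matching stands in for the learned classifiers the method would actually train.

```python
# Hedged sketch: question -> intent + slots -> templated query.
import re

INTENTS = {  # intent label -> trigger phrases (stand-in for a classifier)
    "count_orders": ["how many orders", "number of orders"],
    "list_orders":  ["show orders", "list orders"],
}

SLOT_PATTERNS = {"year": r"\b(19|20)\d{2}\b"}  # stand-in for slot filling

TEMPLATES = {  # one fixed query template per intent
    "count_orders": "SELECT COUNT(*) FROM orders WHERE year = {year}",
    "list_orders":  "SELECT * FROM orders WHERE year = {year}",
}

def classify_intent(question):
    q = question.lower()
    for intent, phrases in INTENTS.items():
        if any(p in q for p in phrases):
            return intent
    raise ValueError("unknown intent")

def fill_slots(question):
    slots = {}
    for name, pattern in SLOT_PATTERNS.items():
        match = re.search(pattern, question)
        if match:
            slots[name] = match.group(0)
    return slots

def question_to_query(question):
    intent = classify_intent(question)
    return TEMPLATES[intent].format(**fill_slots(question))
```

Because the model only has to pick one intent out of a small set and tag a few slot spans, far less labeled data is needed than for free-form sequence-to-sequence query generation, which is the crux of the method.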