
SalesTech Interview with Vaidya J.R., Senior Vice President and Global Head of Data and AI Business at Hexaware

In this catch-up with SalesTechStar, Vaidya J.R., Senior Vice President and Global Head of Data and AI Business at Hexaware, shares a few thoughts on the growing benefits and impact of AutoML in today’s B2B-Tech marketplace:

_______

Hi Vaidya, welcome to this SalesTechStar chat! Tell us about yourself and your role at Hexaware?

Thanks for having me here!

I am deeply passionate about technology and business, and I love bringing the two together in a strategic manner. I am fortunate to be doing exactly that in my current role as SVP and Global Head of BI, Big Data, AI, and Analytics at Hexaware Technologies.

Leading the Data and AI business, I take great pride in building, guiding, and motivating our immensely talented team of experts in designing, creating, and deploying powerful, transformational Data, Analytics, and AI solutions and services.

In my 25-plus years of experience, I have had the opportunity to work with a gamut of technologies (from PaaS to Cloud to Data & Analytics to IoT). I have been fortunate to provide thought leadership, develop strategies for several industries, and create impact in different roles (from consulting organizations to building them, and from product management to delivery management, across geographies). This has allowed me to lead the latest Data & AI technology transformations, both for Hexaware and for our clients, and I strive to give my best to both.

As an analytics evangelist and strategic change catalyst, I feel lucky and empowered to bring Hexaware’s unique transformational solutions, like AMAZE for Data & AI, to our diverse client base spanning dozens of industries and geographies. The ultimate reward is when our teams deliver results and value to enterprises across the globe, fulfilling our purpose statement at Hexaware: “Creating smiles through great people and technology.”


We’d love to hear more about Hexaware’s recent partnership with DataRobot. How does it impact your core offering and end users?

The DataRobot and Hexaware partnership enables institutions to break through the barrier of low deployment success rates with the powerful capabilities of the DataRobot AI Cloud.

The partnership is a coherent union of Hexaware’s data science expertise in open-source statistical tools, NLP, and NLG with a fast-paced AutoML platform that drastically accelerates ML model building and deployment.

The partnership promises ML solutions that are “2x faster in data preparation and 5x faster in AI deployment,” leveraging data expertise and proficiency to productionize AI at scale in record time.

As part of the partnership, we have been working on multiple verticalized offerings. One such offering is a KYC (Know Your Customer) process for the banking industry that leverages Artificial Intelligence to help financial services businesses meet strict regulatory KYC standards.

How have you observed the need for and growth of AI-driven platforms and cloud systems across the industry, and what, in your view, are some of the base-level challenges industries still face when implementing new systems?

The exit from the pandemic will be through digital transformation. The world has been witnessing a boom in data for a while now, and the pandemic has only accelerated this further. The world will inevitably need smarter and more efficient ways to store and manage that data with cloud systems. In addition, there is a growing opportunity to utilize and make sense of this data through the growth of AI. This digital transformation will be enabled by the speedy adoption of cloud systems and AI-driven platforms.

However, there are some universal challenges that we have observed in cloud and AI adoption. Some of the common critical questions I am usually asked across our engagements are:

  • How does one find top technical talent? AI/ML is a space with a steep learning curve, and there is a dearth of AI/ML experts. Hiring or training talent with the ability to solve problems through data science is a difficult and costly affair.
  • How do I find the ‘right’ technology partner for my AI journey? Identifying the right AI vendor partner plays a huge role in how contextually AI is implemented for your business and the impact it delivers.
  • Is there more to the AI adoption process? Just having a team of data scientists working on your data does not, by itself, lead anywhere. AI teams working in silos fail to realize the desired value. What is required, over and above quality data scientists and ML experts, is a community of people with multiple skills, so that the best business insights can be derived from the entire exercise.

There is clearly more to AI platform and cloud adoption than meets the eye. Moreover, there is no ‘one-size-fits-all’ AI solution. Every industry has different data and different business problems to solve, in different contextual realities.

It is here that trusted and experienced technology partners like us can deliver immense value. 

In fact, our latest AutoML platforms in particular can be impactful in driving business value. With the times of black-box AI behind us, we have developed technology and platforms that adopt an explainable and open approach to AI, which is exactly what the ‘democratic’ AutoML platforms we offer here at Hexaware deliver.

How, in your view, can industries enhance and optimize the use and adoption of AI to power overall production?

I strongly believe that the process of initiating or even optimizing AI adoption can vary significantly from one industry to another, but the fundamental building blocks will essentially be the same. The following steps are relevant in my view:

  • Identifying the right use cases: The first thing I would like to say is that no organization should adopt AI just for the sake of it. It is undoubtedly one of the most powerful technologies businesses have ever had access to. But to achieve an optimal deployment, teams should identify the business drivers, in terms of both challenges and opportunities, and how AI can be applied effectively.
  • Building libraries: Industry-focused, sub-function-focused libraries and taxonomies hugely enhance the quality of AI/ML model output. Hence, building domain knowledge and these libraries is one of the key steps.
  • Enabling citizen data scientists: Gone are the days when AI deployment was solely the concern of high-profile data scientists with specialized degrees. Today, organizations need to realize the immense potential their entire workforce possesses. With the help of complementary roles such as business translators, developers, data engineers, and machine learning architects, citizen data scientists can accelerate the AI lifecycle by reducing the time to build and deploy models.
  • Improving deployment and monitoring: As pointed out earlier, AI/ML deployment success rates are still not where they need to be in every enterprise. MLOps (DevOps for ML) brings best practices to the deployment and monitoring of AI/ML. I highly recommend the use of MLOps processes, tools, and technologies so that enterprises can quickly scale up in terms of deploying, monitoring, and retraining models; a minimal sketch of such a monitor-and-retrain loop follows this list.
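
To make the MLOps point concrete, here is a minimal sketch of a monitor-and-retrain loop, using scikit-learn on synthetic data. The model, the accuracy threshold, and the data sources are illustrative assumptions, not the tooling of Hexaware or DataRobot.

```python
# Minimal sketch of an MLOps-style monitor-and-retrain loop (illustrative only).
# Assumes scikit-learn; the drift threshold and data sources are hypothetical.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for historical training data and a fresh batch arriving in production.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_new, y_train, y_new = train_test_split(X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(random_state=42).fit(X_train, y_train)

# Monitoring step: score the deployed model on newly labelled production data.
live_accuracy = accuracy_score(y_new, model.predict(X_new))
print(f"Accuracy on new data: {live_accuracy:.3f}")

# Retraining trigger: if performance degrades below a (hypothetical) threshold,
# retrain on the combined data and redeploy.
ACCURACY_THRESHOLD = 0.90
if live_accuracy < ACCURACY_THRESHOLD:
    model = RandomForestClassifier(random_state=42).fit(X, y)
    print("Model retrained on combined data; ready for redeployment.")
```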


For industries looking to build out their AI capabilities, what kinds of teams, structures, and processes can come in handy in ensuring fair deployment and optimized efforts across levels and functions?

I would like to share an important lesson learned from our various engagements in building and deploying models at scale for our customers across industries. The first thing I would like to mention is that you cannot just throw a data scientist at a problem and expect it to be solved; it takes a lot more than that. The entire community needs to come together. You need your Statisticians, Data Engineers, MLOps Engineers, Business Analysts, Cloud Architects, Domain Specialists, Visualization Specialists, and even customer representatives to solve a problem effectively.

This is exactly why I have formed and nurtured what I call the Decision Sciences Community, scaled to solve problems across industries. This also ensures a human-centric design and approach towards AI.

Implementing smart automation will help you enhance your entire AI value chain, right from exploratory data analysis to model deployment. AutoML platforms can help you optimize exploratory data analysis, feature engineering, identification of suitable models, deployment, model explainability, monitoring, and retraining; a minimal sketch of the model-search step is shown below.
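
As a rough illustration of the model-search step that an AutoML platform automates, the sketch below cross-validates a few candidate scikit-learn models on synthetic data and keeps the best one. The candidate list and metric are assumptions made for the example, not the workings of any specific platform.

```python
# Toy illustration of the model search an AutoML platform automates:
# try several candidate models, cross-validate each, and keep the best.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Evaluate each candidate with 5-fold cross-validation and track the winner.
best_name, best_score, best_model = None, -1.0, None
for name, model in candidates.items():
    score = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {score:.3f}")
    if score > best_score:
        best_name, best_score, best_model = name, score, model

# Fit the winning model on all data, ready for deployment and monitoring.
best_model.fit(X, y)
print(f"Selected model: {best_name} (cv accuracy {best_score:.3f})")
```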

As you know, there are established processes for ML model building, like CRISP-DM and SEMMA, which lay out the process steps for any AI/ML model build in a sequential fashion: business understanding, data understanding, data preparation, model building, evaluation, and a feedback loop back to business understanding. The skeleton below illustrates that sequence.
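
A minimal skeleton of those CRISP-DM-style phases, chained sequentially, might look like the following; the phase functions and their return values are placeholders for illustration only, not a real framework.

```python
# Minimal skeleton of the CRISP-DM phases as a sequential, repeatable loop.
# Each phase is a placeholder that passes its findings on to the next one.

def business_understanding():
    # Define the business question and success criteria (e.g. reduce churn).
    return {"target": "churn", "success_metric": "recall"}

def data_understanding(spec):
    # Profile available data sources against the business question.
    return {"rows": 100_000, "columns": 42, **spec}

def data_preparation(profile):
    # Clean, join, and engineer features for modelling.
    return {"features": ["tenure", "usage", "complaints"], **profile}

def model_building(prepared):
    # Train candidate models (the phase AutoML can automate).
    return {"model": "gradient_boosting", **prepared}

def evaluation(model_info):
    # Evaluate against the success metric; feed findings back to the business.
    return {"meets_criteria": True, **model_info}

# Feedback loop: evaluation results inform the next round of business understanding.
result = evaluation(model_building(data_preparation(data_understanding(business_understanding()))))
print(result["meets_criteria"])
```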

A few thoughts on the future of AI? 

We know how AI is rapidly transforming the way we all work and live. We are calling data the new ‘oil.’ But what’s the point of having all the data if we can’t put it to effective use? So, what we need is an ‘engine’ that consumes the data and ‘powers’ the systems. That ‘engine’ is AI and ML.

In this context, I’d like to share 5 prominent trends that we are observing, amongst others, when it comes to leveraging AI & ML:

  • Democratization of Data Science – Rise of AutoML: With the data explosion happening all around us, data science cannot remain the domain of experts alone. In fact, democratizing data use is the most powerful way of ensuring that data is effectively utilized enterprise-wide. And this can be done only by democratizing AI – and that is where AutoML comes in.

    Let’s take an example. Imagine you are a retailer that sells thousands of products across the globe, and you want to forecast demand for every item. It’s impossible to achieve that with manual data models, which at best help with aggregate category planning! But AutoML models can help you forecast at the individual item level; a toy sketch of this per-item approach follows this list. That’s the power AutoML holds!
  • Small Data and TinyML – Edge Computing, re-imagined: The world is all gaga about Big Data but forgets that big data comes with big algorithms too, with as many as billions of model parameters! It is thus that the paradigm of small data and TinyML emerges – to facilitate fast cognitive analysis of vital data where time and bandwidth are critical. It’s essentially ‘edge computing’ re-imagined. A perfect example would be self-driving cars. In an emergency, we need quick data processing – and we can’t rely on the cloud, since the decision must be taken in microseconds! It is here that TinyML, operating at the ‘edge,’ saves the day. We see a similar trend in wearables, home appliances, etc.
  • Deepfake and Synthetic Data – Generative AI at work: Generative AI is the technology for creating new content by utilizing existing text, audio files, or images. The viral prank videos of Obama taking a jab at Trump or the TikTok videos of Tom Cruise – all were ‘created’ using the Deepfake capabilities of Generative AI. From creating something as light-hearted as a ‘new’ piece of art, music, or a prank, to something as serious as impersonation videos of world leaders or actors saying or doing anything you want them to – it can all be done with Generative AI! On the positive side, the synthetic data generated can be used to train more powerful ML models; to create speech for people with speech impairments; and to build language-and-concept-to-image capabilities, where we could create the architectural plan of a building just by listening to descriptions of its design.
  • Data-driven Customer Experience – AI Hyper-personalization: Hyper-personalization (or “Personalization 2.0”) is the use of data and AI to deliver more personal and tailored products, services, and information. Businesses can use omni-channel data to create “clusters of one” and customize customer journeys in real time. From Spotify playlists to Netflix suggestions – it’s all hyper-personalization at work. Let’s move to one of our more serious hyper-personalization use cases: personalized drugs for one of our major US life sciences clients. The problem was to help medical practitioners monitor individual medicine responses and needs, and to help make individualized recommendations. The data was collected from ingestible tablets; it was monitored, and dosage recommendations were made based on individual bodily reactions, needs, progress, etc.
  • The Convergence of AI, IoT, Cloud Computing and 5G: Convergence refers to the coming together of today’s keystone technologies (6T), which are poised to transform all industries. These technologies feed off each other’s energy and enhance each other’s impact on the world. Each is quite powerful in its own right, but the combination of these technologies is poised to take the game to a whole new level. The prominent examples here would be smart cities, connected cars, smart homes, etc. – where the confluence of all these technologies is transforming lives in ways never imagined before, and will continue to do so in the future.
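
Returning to the retailer example in the first bullet, a toy sketch of per-item forecasting could look like the following: one lightweight trend model is fitted per SKU in a loop over synthetic weekly sales data. The SKUs, horizon, and model choice are assumptions made for illustration; a production AutoML platform would search far richer models automatically.

```python
# Toy sketch of per-item forecasting from the retailer example above:
# fit one small model per SKU automatically, rather than hand-building each.
# Uses pandas and scikit-learn on synthetic data; the SKUs and horizon are made up.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
weeks = np.arange(1, 53)

# Synthetic weekly sales history for a handful of (hypothetical) SKUs.
history = pd.DataFrame({
    "sku": np.repeat(["SKU-001", "SKU-002", "SKU-003"], len(weeks)),
    "week": np.tile(weeks, 3),
    "units": np.concatenate([
        100 + 2 * weeks + rng.normal(0, 5, len(weeks)),   # growing item
        500 - 3 * weeks + rng.normal(0, 10, len(weeks)),  # declining item
        250 + rng.normal(0, 8, len(weeks)),               # flat item
    ]),
})

# One lightweight trend model per item, fitted and queried in a loop.
for sku, item_history in history.groupby("sku"):
    model = LinearRegression().fit(item_history[["week"]], item_history["units"])
    forecast = model.predict(pd.DataFrame({"week": [53]}))[0]
    print(f"{sku}: forecast for week 53 = {forecast:.0f} units")
```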

The good news is that we at Hexaware are continuously monitoring the latest trends and actively preparing to bring them to our customers. We are sure that AI promises some truly exciting times ahead for all of us!

Some last thoughts and takeaways before we wrap up? 

Enterprises are creating huge volumes of data today, and they will continue to accumulate it in the future. Very soon, every single enterprise will have zettabytes and yottabytes of data in all forms, shapes, and sizes. The biggest challenge for these enterprises will be how to put this humongous data to effective use and how to drive business outcomes from it.

In an environment where the data explosion is already hitting us hard, it is not feasible to rely on manual data crunching, model building, or deployment. One can thus imagine the huge impact AutoML can make; it will be a game changer!

The most prominent impact of AutoML will be that data science will no longer remain the domain of an elite few in the organization. The ‘common masses’ will have the ability to develop and deploy models at scale, with everyone being able to use data to drive business impact. We call this the rise of ‘Citizen Data Scientists’.

With AutoML, soon everybody in the organization will be a Data Scientist in their own right!

 

Hexaware

Hexaware is a fast-growing, automation-led, next-generation service provider delivering excellence in IT, BPO, and Consulting services.

Vaidya J.R. is Senior Vice President and Global Head of Data and AI Business at Hexaware.

