Vijaya Shanker, Senior Vice President & CTO – Technology and Product Management, Symphony SUMMIT, speaks to Shanosh Kumar from EFY about how analytics, cloud and IoT platforms have evolved over time, and the complex challenges faced during their development and deployment.
Q. Could you please take us through the evolutionary path of data analytics from the last decade till now?
A. Perhaps the greatest change over the past decade is the emergence of data as the ‘new natural resource’. While the gaming industry and consumer- and service-oriented businesses were early adopters of data analytics, the emergence of social media platforms like Facebook and Twitter fuelled the technology disruption that paved the way for data analytics to become the norm rather than the exception. At present, with the third wave of the data analytics revolution kicking in, an entire science has emerged to streamline the vast amount of data generated through technology and mobile telephony use into newer revenue channels. Without doubt, this is an exciting time for the IT operations field. IT operations management (ITOM) has seen a drastic change in role, from a mere support function to an important enabler and enhancer of productivity and cost efficiency. It has also spurred the development of operational intelligence, predictive technologies and agile systems that help companies stay ahead of the curve in ever-changing market scenarios. To cite an example of this inevitable trend, even traditional industries like oil and manufacturing are investing in and adopting data analytics to redefine their business models, ramp up productivity and reduce human errors.
Q. Could you please talk about the fundamental changes that software dealing with analytics has undergone?
A. There has been a sea change in the way we look at data and the analytics industry. Fortunately, the fundamental knowledge hierarchy is still valid: Data, Information, Knowledge and Wisdom remain the building blocks of the system. The quality of data has consistently deteriorated as we move from structured to unstructured data, and the volume of data has increased substantially. Data quality can no longer be enhanced by manual methods; automated processes are needed to address it. The information to be extracted from data is so vast that it can no longer be confined to extraction rules defined in a piece of code. The interpretation of data needs to be derived from the data itself, very much like how we humans develop knowledge from the learning we accumulate over the years. Strict facts-and-rules, logic-based Artificial Intelligence (AI), which was very popular in the 1980s, suddenly fell out of practice. Analytics so far has focused on visualisation and reporting, but now it has to evolve into creating new knowledge. When AI systems began proving theorems such as Fermat’s last theorem using rule-based axiomatic mathematics, there were claims that theorem proving is still not intelligence, as it is knowledge validation – validating a theorem against a set of known axioms – and not knowledge creation.
Q. What challenges did you witness as an architect of this fully functioning cloud platform you built?
A. Cloud did not have a great start. Originally envisioned in 2003 as a form of distributed computing, cloud failed to impress users with its unfriendly interface, poor response time and low data processing power. There was no virtualisation or option for multi-tenanted usage, resulting in a cumbersome and expensive on-premise service, bogged down by low internet speed and connectivity. The period between 2003 and 2008 saw a phenomenal transformation of the digital landscape, which catapulted cloud into the most promising technology enabler of today. The rapid evolution of technology in this short span – faster internet speeds, greater computing power to enable quicker movement of data traffic and, most importantly, a leap in software virtualisation – has powered the evolution of cloud computing, shedding most of the disadvantages of the preceding years. Cloud computing has not only altered the technology domain but has revolutionised business models as well. The success of e-commerce ventures rests squarely on the power of cloud computing. Nobody doubts that cloud is the future. Legacy IT systems are not suited to the modern IT landscape. Companies have understood that if businesses are to flourish and grow, they have to incorporate cloud as a key strategy. This shift of focus from the traditional IT service platform towards cloud marks a major shift in IT operations management in the modern scenario. Cloud adoption and deployment of IT services on cloud is now more the rule than the exception, and in future this trend will only continue to grow.
Q. Could you share some insights on operational intelligence, contextual problem and contextual intelligence?
A. Operational Intelligence is a key driver in IT operations management. As a real-time, dynamic analytical engine, it can drastically reduce the time, complexity and cost involved in issue resolution by providing a contextual framework for analysis. For instance, whenever an IT analyst is working on or trying to resolve an issue, they can get information regarding all related previous scenarios, the resolutions provided, associated problems, and the associated work orders and change processes carried out as part of the solution. This helps the analyst reuse relevant solutions provided by other analysts and thus gain a holistic view of the issue rather than working in isolation. Operational Intelligence is not a static framework; it correlates data from disparate systems to develop a situational awareness that is accessible to all analysts in the organisation.
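The contextual retrieval described above can be sketched in a few lines. This is only a toy illustration – the incident records, the Jaccard similarity measure and the threshold are assumptions for the example, not Symphony SUMMIT's actual implementation:

```python
# Toy sketch: surface past incidents related to a new issue by comparing
# description tokens with Jaccard similarity. Records and threshold are
# illustrative assumptions, not a real ITOM product API.

def tokens(text):
    return set(text.lower().split())

def related_incidents(new_issue, history, threshold=0.3):
    """Return past incidents whose descriptions overlap with the new issue."""
    query = tokens(new_issue)
    matches = []
    for incident in history:
        past = tokens(incident["description"])
        overlap = len(query & past) / len(query | past)  # Jaccard similarity
        if overlap >= threshold:
            matches.append((incident["id"], incident["resolution"], round(overlap, 2)))
    return sorted(matches, key=lambda m: -m[2])

history = [
    {"id": "INC-101", "description": "email server down after patch",
     "resolution": "rolled back patch KB123"},
    {"id": "INC-102", "description": "printer queue stuck on floor 3",
     "resolution": "restarted spooler service"},
]

print(related_incidents("email server down again", history))
```

A production engine would correlate far richer signals (work orders, change records, topology), but the principle is the same: the new issue is matched against organisational history so the analyst starts with context rather than a blank slate.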
Q. Is machine learning, computer vision, etc. leading IoT to a whole new level?
A. Several studies have shown that machine learning will propel significant innovation in IoT. For traditional industries like manufacturing in particular, the transformative benefits of machine learning have made companies venture into IoT, even if on a project basis, to test the waters. Most leading IoT providers – Microsoft Azure, Amazon Web Services (AWS), Google – have machine learning embedded in their offerings. With machine learning, IoT systems develop cognitive intelligence, with the ability to analyse sensor data, search for correlations and make accurate predictions. Another advantage is that deep learning algorithms enable machines to ‘self-learn’ by constantly analysing the accuracy of their own predictions. Today, e-commerce is a booming industry with strong competition, and it makes use of machine learning and artificial intelligence on a large scale to stay competitive; packaging, tracking of products, delivery and more are made possible with these technologies. As this industry grows, AI and machine learning will take IoT to a whole new level.
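The idea of analysing sensor data for correlations and predictions can be illustrated with a minimal sketch. The readings and the z-score threshold below are made-up assumptions; real deployments would use streaming models or the managed ML services named above:

```python
# Toy sketch: flag anomalous IoT sensor readings with a simple z-score
# test. Values and threshold are illustrative, not from any real device.
from statistics import mean, stdev

def anomalies(readings, z_threshold=2.0):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mu, sigma = mean(readings), stdev(readings)
    return [(i, v) for i, v in enumerate(readings)
            if abs(v - mu) / sigma > z_threshold]

temps = [21.0, 21.3, 20.8, 21.1, 35.6, 21.2, 20.9]  # one faulty spike
print(anomalies(temps))  # flags the 35.6 reading at index 4
```

Self-learning systems take this a step further by comparing each prediction against what actually happened and adjusting the model, rather than relying on a fixed threshold.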
Q. What is your take on AI and cognitive technology? How do mechanisms like text extraction and entity extraction work where service or solution providing organisations are concerned?
A. As mentioned earlier, AI is going to be the next big thing in the IT industry in 2017. Like e-commerce, many other industries will adopt AI to scale up their business and attain a competitive advantage in the market. AI will make it possible to have one-on-one chats with customers to understand their concerns and challenges, and thereby address them.
Historically, AI did not put a large dent in the computing market over the previous two decades, because we tried to make humans machine-like rather than focusing on the basic premise of AI: making machines do what a human can and a machine cannot. We defined metadata and asked humans to create data in the information frameworks we provided – a database, an XML format with a well-defined schema and so on – and asked people to fill up rigid forms for everything. With advances in electronics, computing is now focusing on raising the cognitive power of machines to human levels of achievement. For example, in e-commerce we have moved from superstores and megastores to online marketplaces. This essentially means thousands of sellers and millions of buyers transacting on a virtual exchange platform. No platform can practically monitor this activity in a consistent manner for the accuracy, content, security, legality and morality of the exchange. The information needs to be extracted from text, images or other available data and presented to the buyer in a consistent manner that makes them trust the platform. There will be differences in the data extracted from different stores. For example, the colour of a mobile phone may be displayed as black while the text may mention the phone’s colour as white. Entities of this nature need to be extracted, and claims need to be normalised, to provide a consistent output. Another classic example is the geographic effect: an American may pay the ‘check’ at a restaurant with dollar bills, while someone in India may write a cheque to settle a restaurant bill. The underlying entity behind the word extracted from the two sentences is exactly opposite. Systems need to understand this context to make meaningful decisions.
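The colour-mismatch example above can be sketched as a minimal entity extraction and consistency check. The colour vocabulary and the listing below are hypothetical, and real marketplaces would use statistical NER models rather than a word list:

```python
# Toy sketch: extract a colour entity from free-text and compare it with
# the structured attribute of a product listing. Vocabulary and listing
# are illustrative assumptions only.
import re

COLOURS = {"black", "white", "silver", "gold", "blue"}

def extract_colour(text):
    """Return the first known colour word mentioned in the text, if any."""
    for word in re.findall(r"[a-z]+", text.lower()):
        if word in COLOURS:
            return word
    return None

def consistent(listing):
    """Check the structured colour attribute against the description text."""
    claimed = listing["attributes"].get("colour")
    mentioned = extract_colour(listing["description"])
    return mentioned is None or mentioned == claimed

listing = {
    "attributes": {"colour": "black"},
    "description": "Brand new white smartphone, 64 GB",
}
print(consistent(listing))  # False: the text contradicts the attribute
```

Handling the ‘cheque versus check’ case would require word-sense disambiguation from surrounding context, which is exactly where cognitive techniques go beyond rule-based matching like this.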