Q. I read that NetApp worked closely with CERN on the LHC. Could you describe the kind of work involved?
A. The LHC produces extraordinary volumes of data, on the order of petabytes per hour. It has close to 150 million sensors constantly monitoring everything that happens in the collider, and all of those sensors feed data back into databases that are stored for future analysis. The storage architecture therefore has to be able to handle that kind of data ingest rate and then make the data available for analysis at a later date. They chose us for their data storage solutions because of our scaling capabilities and because we can handle those kinds of ingest and bandwidth rates.
Q. Tell us a bit about Agile Data Infrastructure (ADI).
A. What ADI really represents is a break from the traditional approaches to storing data, in which data was locked into physical boxes and disparate architectures. People took the view that mission-critical data belonged in a tier 1 architecture, non-mission-critical data in a tier 2 architecture and archive data in yet another architecture. They chopped their storage up into many different pockets, viz. tier 1, tier 2, tier 3, SAN (storage area network), NAS (network attached storage) and so on. What that really did was make the data captive to the physical architecture it was stored on.
What ADI was conceived to convey is that the data infrastructure should not place any limits on the ability to store all the different data types, regardless of the service levels the applications are looking for. It should scale seamlessly, it should outlive the physical hardware, and it should have all the data management intelligence integrated into the data architecture itself. So we at NetApp call it Infinite, Immortal and Intelligent: Infinite because of the scaling, Immortal because of its ability to preserve data for long periods of time, and Intelligent because of all the data management capabilities.
Q. What are your thoughts on Big Data being accepted throughout the Indian industry?
A. In information technology, big data is a collection of data sets so large and complex that they become difficult to process using on-hand database management tools. As companies in the country, along with government organisations, take advantage of advances in technology, I think the global trends apply as much to India as they do to any other country.
Even in India there are enough examples, such as the UID card project, where an enormous amount of data is being created, and that requires different approaches to storing and managing it. So I would say the global trend of Big Data permeating data collection will apply to India sooner rather than later. Ultimately, you need to store data in the most cost-effective manner: it should be cheaper to store, it should give you the level of protection you are looking for in terms of data integrity and, most importantly, it should allow you to take the data and convert it into business value. That is the NetApp promise: we enable our customers to become more flexible in their infrastructure, so they can do things much more easily and with a high degree of cost effectiveness.
Q. What are your thoughts on the security concerns regarding placing information out on the cloud?
A. There are clearly security concerns, and different approaches are being tried out to address them. NetApp, in conjunction with Cisco and VMware, has developed a secure multi-tenancy architecture to give people putting their data in the cloud the assurance that each tenant's data sets and infrastructure are isolated from a security standpoint.
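To make the multi-tenancy idea concrete, here is a minimal conceptual sketch of logical tenant isolation on shared infrastructure: every dataset is keyed by tenant, so one tenant can never address another tenant's data. This is purely illustrative; the class and method names are hypothetical and this is not the actual NetApp/Cisco/VMware design.

```python
# Conceptual sketch of logical tenant isolation in a shared storage pool.
# All names here are hypothetical, for illustration only.

class TenantIsolationError(Exception):
    """Raised when a tenant tries to reach a dataset outside its namespace."""


class SharedStoragePool:
    """Shared infrastructure where each tenant sees only its own datasets."""

    def __init__(self):
        # Every dataset is keyed by (tenant_id, dataset_name), so the
        # tenant identity is part of the address itself.
        self._datasets = {}

    def write(self, tenant_id, name, data):
        self._datasets[(tenant_id, name)] = data

    def read(self, tenant_id, name):
        key = (tenant_id, name)
        if key not in self._datasets:
            # Another tenant's dataset is simply not addressable.
            raise TenantIsolationError(
                f"tenant {tenant_id!r} has no dataset {name!r}")
        return self._datasets[key]


pool = SharedStoragePool()
pool.write("tenant_a", "orders", b"records...")
print(pool.read("tenant_a", "orders"))   # tenant_a reads its own data
try:
    pool.read("tenant_b", "orders")      # cross-tenant access is refused
except TenantIsolationError as exc:
    print("blocked:", exc)
```

In a real secure multi-tenancy stack the isolation spans the network, compute and storage layers rather than a single lookup table, but the principle is the same: tenancy is enforced by the architecture, not by convention.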
Q. What factors are driving the increasing demand for virtualisation techniques in India?
A. Essentially, what virtualisation promises is a much higher degree of capital utilisation, and that is really what got virtualisation started. People had a whole bunch of under-utilised x86 servers, and virtualisation enabled them to drive the capital efficiency of that infrastructure up dramatically. As virtualisation moves to its next stage of maturity, people are beginning to focus on the flexibility it provides, because virtualisation creates a separation between the physical state of the hardware and the actual state of the application running in the machine.
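The capital-utilisation argument is easy to see with a little arithmetic. The sketch below uses hypothetical numbers (they are not from the interview): twenty physical servers each running one workload at roughly 10% CPU utilisation are consolidated onto virtualised hosts kept below an assumed 80% capacity ceiling.

```python
# Illustrative consolidation arithmetic with hypothetical figures.
import math


def hosts_needed(workload_utilisations, host_capacity=0.80):
    """Hosts required to run all workloads, packing by total CPU demand
    while keeping each host under the host_capacity ceiling."""
    total_demand = sum(workload_utilisations)
    return math.ceil(total_demand / host_capacity)


# 20 physical servers, each ~10% utilised by a single workload.
workloads = [0.10] * 20

n = hosts_needed(workloads)       # 20 * 0.10 / 0.80 = 2.5 -> 3 hosts
print(n)                          # 3

avg_util = sum(workloads) / n     # utilisation after consolidation
print(f"{avg_util:.0%}")          # 67%
```

Under these assumptions, twenty boxes idling at 10% become three hosts averaging roughly two-thirds utilisation, which is the capital-efficiency gain the answer describes.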
Q. How do you maintain visibility in the Indian market?
A. Our marketing strategy is two-fold. One is to work with the multinationals who are already customers of ours and be a part of their global infrastructure. The second is to focus on Indian enterprise and government customers and help them understand the same value proposition that has made us successful globally.