Tuesday, March 3, 2015

Solving the Last Mile Challenge in Big Data

As we ease into the Big Data economy, business intelligence needs big changes to stay effective all the way to the last mile, where results matter the most: the end user. While the end user is the last mile where the most important business decisions are made, current business intelligence poses two huge obstacles: cost and complexity. Let us understand why businesses aren't getting the most out of BI and what can be done to overcome the last mile challenges.

What Businesses are Not Getting Right with Current BI
In any typical enterprise, BI projects are mammoth undertakings that carry huge price tags, armies of implementation consultants, and specialized, full-time staff for ongoing support. According to Gartner, over 60% of enterprises state they have a BI strategy, but despite the many years of experience most enterprises have with BI, they have made little progress in addressing the fundamental challenges of BI from an end user's perspective.

BI technology needs to refocus on the following points:
  • Adopt Search-based, Simple Interfaces: If you had to take a training course or rely on outside support to use Google, WhatsApp, or Facebook, would they have attracted billions of users? Of course not! The simplicity of the user experience is paramount. Search is already a common and effective model for accessing consumer information (just the way you use Facebook, Amazon, or LinkedIn), so a simple, intuitive search-based experience is a natural fit for BI. Following this model, more and more BI providers can add features that let users simply search across billions of rows of data to gain instant insights.
  • Enable Ad-hoc Reporting: The next natural step in the Big Data & Analytics evolution is enabling ad-hoc reporting for the broader organizational audience. Yet this turns out to be more difficult than anticipated. Less technical personnel face a blizzard of arcane data names and a mountain of hard-to-understand tables, leaving them dependent on technical report writers for any new reporting. Gartner reports that it takes IT an average of six months to generate a single report on any new data source. The nimble nature of the Big Data & Analytics opportunity is lost on them.

Crafting a BI strategy doesn't mean just choosing which mega-vendor your enterprise will work with. Gartner estimates that no more than 20% of business users actually use BI proactively, which indicates that BI is not being widely used to manage performance. IT must look for ways to align BI initiatives with business objectives, keeping in mind the increasing speed of technology change.

How Search Insights Solve the Last Mile through a Search Layer

While a quickly evolving Big Data revolution ripples across multiple industries, enterprises are still using archaic business intelligence. They are forced to rely solely on scarce technical resources to accomplish anything with Big Data & Analytics. Big Data adoption will remain limited if last-mile challenges continue to obstruct users from accessing data. A focused shift back to the BI end user via search will be a promising step forward.

What enterprises really need is an easy-to-use, simple interface that gives non-technical users self-service reporting and actionable insights. A layer with a Facebook- or Google-like consumer-grade experience can bridge the last mile. Non-technical users should no longer struggle to derive meaning from data structures or programming languages. To deliver useful search insights, a search layer should have the following characteristics:
  • A search layer should understand the complexities of the data structure.
  • It should translate technical jargon into recognizable business names, and organize them in a format that seems logical to the front-line business person.
  • It should enable non-technical users to create queries and build reports based on terminology that is familiar and meaningful.
  • It should enable natural language search.
  • It should have a more real-time, interactive, and iterative user interface for data exploration and analysis.
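The translation step above can be sketched in a few lines. This is only a hypothetical illustration of how a search layer might map familiar business terms onto a technical schema; the glossary, table names, and join key below are all made up for the example.

```python
# Hypothetical sketch of a search layer's translation step: business terms
# are mapped to technical schema names, so the user never writes SQL.
# All table and column names below are invented for illustration.

BUSINESS_GLOSSARY = {
    "sales": ("fact_orders", "order_total"),
    "region": ("dim_geo", "region_name"),
}

def build_query(metric: str, dimension: str) -> str:
    """Turn a 'metric by dimension' request into SQL behind the scenes."""
    m_table, m_col = BUSINESS_GLOSSARY[metric]
    d_table, d_col = BUSINESS_GLOSSARY[dimension]
    return (
        f"SELECT {d_col}, SUM({m_col}) "
        f"FROM {m_table} JOIN {d_table} USING (geo_key) "
        f"GROUP BY {d_col}"
    )

# A business user types "sales by region"; the layer produces the query:
print(build_query("sales", "region"))
```

A real search layer would of course add natural language parsing and metadata discovery on top, but the essence is the same: the user speaks in business terms, and the layer speaks SQL.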
Let’s look at some use cases of the search layer.
  • The sales team can easily access everyday sales data by asking ad hoc questions.
  • The Accounting and Finance group can use the search layer to review the general ledger.
  • The Operations team can benefit from a version that focuses on branches.
  • The Marketing team may need search insights that focus on both campaign management and geo-demographic analysis.
Once in place, the beauty of the search layer is readily apparent. In most cases, typical reporting and analytics tools can easily access the underlying data; but the people costs of everyday report generation, professional services, and the configuration required to run data reports will easily outweigh the licensing costs of any past-generation business intelligence solution.

Creating a search layer is a virtual necessity for any enterprise to push the power of Big Data & Analytics out to the wider organization. It is better described as a technique for bringing data closer to the user in a meaningful way. With a user-focused BI technology, enterprises can overcome the last mile challenges and discover search insights for important decision making.

Wednesday, January 28, 2015

Building user oriented enterprise solutions, not just products

While discussing why retailers cannot keep up with their customers in terms of information and expectations, several enterprise businesses told me that the single biggest reason is the lack of easy and timely access to business insights! While this is the exact problem we are trying to solve at Drastin, we observed several (not-so) interesting aspects along the way in how software vendors have been approaching customer needs.

The Piecemeal Approach for Enterprise Products

Traditionally, IT-facing software vendors believed that selling to enterprises is very different from selling to consumers, due to the following differences:
  • Consumer products are mostly end-to-end solutions, because the buyers are the users. For example, while selling a car, the car’s frame, engine, seats, electronics, road permit, etc. are not sold separately. They are sold together as a car to the consumer.
  • In the case of enterprises, only layered products are built, not solutions, because the buyers and users of the product are different. This leads to the unfortunate thinking that the buyers are more important than the users.

This belief led them to deliver their software to enterprises piecemeal: something that can be sold easily, rather than something that can be consumed as an end-to-end solution and that solves an entire use case for the enterprise user.


The net result of this thinking, the lack of design orientation, and the lack of focus on user experience is that enterprise buyers have always had to buy their day-to-day tools separately. Take the BI space, for instance. Enterprises have to buy half a dozen layered products separately, from ETL products and BI layers to reporting and visualization tools, to fulfill their BI requirements. BI is not alone; I observed the same piecemeal approach in the infrastructure and cloud space as part of my venture. Each of our pilot customers is going through the same issues in the world of decision making. In designing products for enterprise buyers rather than consumers, software vendors face evaluation teams and lengthy approval times because of the top-down thinking of enterprises. This ultimately results in attention going to building a great technology rather than a great solution.

Is this an issue with the thinking, or the design, or the focus that is only on technology and not on the user?

Our Approach at Drastin
We are taking a very different approach: providing an integrated "solution" to all customers, rather than a "technology product". An approach that solves a use-case scenario end-to-end for an enterprise user. An approach that saves the enterprise user both time and cost. An approach that gives an enterprise user a consumer experience.

I would like to share some of our key learnings that any enterprise product can take away to create a better experience.
  • Build a solution, not a product. A solution that gives an advantage in everyday work, reduces dependencies, and enables self-empowerment. 
  • Build something that the users would love to use, and not because it has been authorized by their boss.
  • Build something that both the user and the buyer understand. Both have different dimensions to the same problem – while one may focus on the functionality and usability, the other may focus on governance, compliance, service, and TCO.
  • Build something to which the user can attach emotionally and use as an efficiency booster. The enterprise product should let users see the gains in their own productivity.
  • Build something that solves not only a technical problem, but also a business problem. 

An enterprise product built as a solution, with the end user in mind, may be a lot more complex technically, but it needs to focus constantly on the point of consumption so that the user can get value out of it easily. We need to build solutions with the consumer in mind: what they want, how they want to use it, and how it fits into their lives. It isn't the technology layers that matter. It should always be about the people who use them to derive value.

Sunday, December 28, 2014

Selling to Customers Who Know More Than You!

The word revolution is overused, but in the past five-odd years there has been a significant one: a technology revolution that has empowered customers and changed the way they engage with product and service providers. Thanks to a combination of technologies, the customer of 2015 is vastly different from the customer of 2010.

The Encyclopedia Effect: The Customer Knows More
When a customer walks into a retail or online store today, the smart money says he or she knows more about the product than the available product literature. Apart from high turnover and the relative inexperience of store staff, the major factor driving this trend is on-demand information. Customers today have every means to research their purchase thoroughly in terms of features, price comparisons, technologies, accessories, and performance. Contributing to this is the ease of gathering information via social media. For example, just observe the amount of information Amazon provides for any product, and the information users post in the reviews section.

How ready is a business to deal with this customer? When customers are empowered with on-demand information, why are businesses still stuck with outdated information?

You the Customer vs. You the Business Decision-maker
You, as the customer, carry information on the go. You have Wikipedia, Amazon, Google, and every possible product database in your pocket. But as the enterprise decision-maker, you probably have only one source of information and can rarely combine information from other data sources. Have you ever taken a printout of a product brochure to a retail store to make a purchase? Yet in your office, you print BI reports before taking them to meetings. Even if the report is not printed, the mere fact that it is a 'report' means it is already outdated by the time it is saved as a file! Do you see the contrast? As the customer, you are empowered with on-demand information, but as a business decision-maker, you are restricted and forced to depend on outdated information for important decisions. Remember that your business is facing customers armed with information on the go!

Are you making it easier for yourself to find that information? Are you enabling or constricting this behavior? Does your sales process factor in the always addressable customer? 

The Real-time Effect: Decision-making cycles in seconds
About five years ago, you heard a song and tried to find out what it was: maybe you heard it again on the radio, somebody told you what it was if you could hum it, or you searched for the lyrics on the internet. Then you went to the music store or Amazon and bought the CD, at whatever arbitrary price point the CD was selling for.

Now, when you hear and like a song you've never heard before, you "Shazam" it, and it tells you the song and artist and offers you the chance to buy it with a single click off iTunes. In 30 seconds, you've gone from never having heard the song to owning it.

This telescoping of the marketing and sales cycle is what the mobile world is accelerating. Real time is in; waiting is out. Customers are starting to expect this in more and more areas. Whether it's your bank account, your energy bill, or an itemized breakdown of the estimate for fitting out a new nursery, there is a growing expectation that it be available now. How real-time is your business? How long do your customers have to wait for information about your products and services? How much self-service do you enable in the information buffet?

The journey isn't over yet, but failing to recognize and adapt to these changes now could mean you are out of step with the customer of today.

Tuesday, December 2, 2014

Is your enterprise suffering from “Data Divide”?

Businesses today rely heavily on all forms of data for crucial decision-making. With increasing data sources and continually changing technology, data gets spread and stored in different silos as the business grows. There comes a point where the divide between collecting or recording data and utilizing it efficiently for business goals grows wider, obstructing access to quality data.

Data Divide broadly covers the enterprise-level data issues that businesses face:
  • Inability to use data in a timely manner
  • Cumbersome process to access high quality data
  • Inability to offer continuous innovation




Technology defines the capabilities of an enterprise to compete in a global arena. As the era of big data takes hold, it is time for businesses and IT to collaborate and harness technology to create a fully supportive platform for business insights for the future. Just relying upon the traditional BI products, which are meant for IT and not for business users, is not enough to bridge the data divide.

Effective data analysis defines market winners
Effective analysis and reporting can elevate data from mere information to knowledge to competitive edge. Besides making the right decisions, business executives may actually create new business opportunities with the right type of data. Working with poorly defined analytics, incomplete data sources, and an inability to handle mixed data types can lead to partial information and knowledge, and to the false belief among business executives that the right decision has been made.

Use technology to simplify data
Technology should make data simple and understandable, rather than technical and complicated. With the use of technical terminology and jargon, IT loses focus on the issues at hand. Businesses are not concerned with debates over SQL versus NoSQL databases or Hadoop as a persistent store. Whether IT regards the data sources as just another data problem, or lumps them in with the big data debate, makes no difference. Businesses are interested in creating strategies and solutions based on the data to attain their enterprise objectives.

The key for IT and business is to communicate in the same language, the “human” language to understand and support each other. IT must ensure that it has the right tools to serve the business needs.

Blend data across data sources
As new data sources and types emerge, they have to be assimilated to prevent data silos. Old methods of data analytics can no longer keep pace with these requirements. Merely analyzing data and generating hard-coded reports is not enough to convince customers and stakeholders about information security, or to comply with the laws of different geographies. Businesses must promptly demonstrate that governance is being actively monitored and maintained, instead of paying it mere lip service. A business-centric view of the problem and a result-oriented view of the present and future can bring true differentiation in the market.

Identify capabilities to handle mixed data types
Traditional databases are passé. Today, it is important to work with mixed data sources – internal or external, structured or unstructured, relational or reference-only. When enterprises use several data sources to satisfy specific application needs, they hardly gain any contextual insights. All these data sources must be pulled together and analyzed through a single system that business executives themselves can use effectively.

A true analytics solution can provide the much needed environment where decisions are made effectively, timely, and optimally. The right tools can make data fluid, ensure easy and right access for everyone in the enterprise, and take the business and its operations to the next level.

Wednesday, October 29, 2014

Quality of Big Data – Turning Volumes into Insights

Executives love to talk about big data and the amount of data being collected. However, big data is more than just the big talk of the town. Wall Street is now planning to assign a dollar value to the data assets of enterprises. Everyone presumes that big data can make businesses more agile and responsive. But the reality is that within a few months of launching big data projects, management realizes that more fundamental changes are needed in organizing and analyzing this data, irrespective of the technology platform. Unless a strategy is applied to find deep insights from this data, in a natural way, the efforts invested to harvest big data are unlikely to reap long-term benefits.


After a series of recent meetings with multiple customers, I realized that different business units in an organization often do not agree on a common definition of a customer. While a common lexicon seems basic, the discrepancy arises because the value of data depends on the use case it serves. Though most people like to talk about good data, today's bad data can become tomorrow's good data, and the value of today's good data does not last forever.

While social and mobile platforms have been generating volumes of data, it is important to note that the variety of data sources, aka islands, is also vastly increasing. Aggregating these different sets of information is the key to arriving at better insights. For instance, imagine that the sales data is in one data source, and the customer loyalty information is in another. How do you easily find which stores get the most repeat customers?
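The blending this question calls for can be sketched in miniature. The records below are made-up sample data; in practice the two silos would be real systems joined on a customer identifier.

```python
# A minimal sketch of blending two data silos, sales transactions and
# loyalty membership, to answer "which stores get the most repeat
# customers?". All records below are made-up sample data.
from collections import Counter, defaultdict

sales = [  # (store, customer_id) pairs from the sales system
    ("downtown", "c1"), ("downtown", "c1"), ("downtown", "c2"),
    ("uptown", "c3"), ("uptown", "c3"), ("uptown", "c4"), ("uptown", "c4"),
]
loyalty_members = {"c1", "c3", "c4"}  # customer ids from the loyalty system

visits = defaultdict(Counter)
for store, customer in sales:
    if customer in loyalty_members:      # the blend: join on customer id
        visits[store][customer] += 1

# A repeat customer visited the same store more than once
repeat_customers = {
    store: sum(1 for n in counts.values() if n > 1)
    for store, counts in visits.items()
}
best = max(repeat_customers, key=repeat_customers.get)
print(best, repeat_customers[best])  # prints: uptown 2
```

The point is not the few lines of code but that neither silo can answer the question alone; the insight only appears once the sources are brought together.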

It is important to note that as long as data is not made available to those who make decisions, it will be deemed unusable, aka data noise. Traditionally, data resides only with IT, and businesses never get to play with it to find deep insights. Common sense makes you laugh at this problem, but that is how enterprises have always been: business and IT point fingers at each other and each thinks it is not their problem to solve. Most IT administrators manage either the infrastructure or the tools required for data, rather than managing the data itself.

The source of truth ends up a questionable topic in most organizations, since every business transformation project or M&A activity introduces changes to the data infrastructure, operations, and underlying business applications. Depending on organizational maturity, either the changes are not consistently applied or different businesses are left with copies of out-of-sync data.

In order to realize true insights from big data and turn it into good data, it is important to identify a business function or a set of problems and opportunities to get things off the ground and stay focused. Instead of focusing too much on data governance or data definitions, adopt technologies that drive up data usage by the business teams. That usage in turn boosts the feedback and inputs for governance and data management. The key is an ongoing Data Quality program, driven by data usage, that monitors constant improvement. Given the rapid growth rate of data, there will never be a single system of truth. Instead, consider having a single system of reference, which becomes the single reference point for analyzing data and deriving business insights.

There is no question that data is one of the most valuable strategic resources for any organization, and big data presents a great opportunity to leverage it. The strategy should not focus on implementing big data and analytics platforms; the real strategy is to identify the new white-space opportunities and customer insights that big data (along with analytics) will make available.


Thursday, August 14, 2014

System of play: Transition from push marketing to pull marketing of products

What makes a good product build an ever-lasting business? While most entrepreneurs think about building good products, they don't necessarily think about building ever-lasting businesses. Products are normally intended for a target audience and address a specific use. Once this intended use, or use-case, changes, the product is no longer valuable to the customer. Excellent product marketing addresses a variety of long-lasting use-cases. This can only be achieved by putting a 'system' into play.

The system of play massively increases the use-cases and elevates the net value of marketing efforts through the roof. Customers get hooked on the product after the first buy, resulting in constant recurring revenues. We can also say that push marketing gets transitioned to pull marketing.

Achieving the system of play is not easy for a product; in most cases it needs to be built from the ground up. This requires a tremendous amount of thought leadership and a supporting team that thinks far ahead. As the system has to survive generations of technology, people, and processes, this is not an easy journey. Whenever these systems of play come into action, there are multiple revenue sources, and all the business has to do is adjust pricing so that the cumulative revenues are highest. The key lesson in these cases is not to look at any revenue stream in isolation.

The system of play matters most whenever the cost of marketing is very high and the customer use-cases cannot be accurately predicted, such as in the toy industry, where the supply chain must start a year in advance to fill store shelves for the holiday season. How do you ensure you hook a customer exactly a year later? In my earlier entrepreneurial stint, we did both horizontal and vertical integration by building an ecosystem of solutions, and saw tremendous success. I recommend this thinking not only at the early stage of any business, but also for reinventing an ongoing one.

There is a very interesting video on how Lego put a 'System' into play and transformed the company to run across multiple generations. Watch it here:


Saturday, July 26, 2014

AWS' cloud business is growing so fast, at such low profits, that it's scaring the shareholders

Well, finally. Almost 10 years after the first AWS service launched, Amazon shareholders are looking at AWS revenues, now growing past $5B, and at its profitability.

Yesterday Amazon said that while its cloud business almost doubled (up over 90%) compared to last year, it was significantly less profitable. Amazon's AWS cloud business makes up the majority of the income-statement line it labels "other" (along with its credit card and advertising revenue). Amazon is piling on customers faster than it's adding dollars to its bottom line.


While most technology companies like VMware and Cisco operate at 60-85% gross margins, AWS' significantly low-margin business (an estimated 14-20%) makes it impossible for the other companies to find a level playing field. While AWS is capturing market share and investing heavily in new technology services, do the shareholders expect AWS to stay in this low-margin business forever? What happens when the margins need to go up? How much do the shareholders expect them to rise?


The market advantage AWS currently holds over its closest competitors is about 12 months, and the moment prices become comparable, at least 2 or 3 of the closest competitors will baseline within a year. Customer loyalty in the cloud business is very thin, and massive movements would be triggered all over the place.


While AWS has great products, Amazon's shareholders have the undue expectation that only Amazon can build the future. To me, it looks as if Wall Street does not encourage building profitable businesses (or companies) any more!




Sunday, June 1, 2014

Quick peek into Gartner's Magic Quadrant for Cloud IaaS 2014

I had been waiting to see whether there would be any surprises in this year's Magic Quadrant for Cloud IaaS from Gartner, and there are none. Well, except for one: CenturyLink! In spite of confusing the market with several parallel cloud offerings and frequently deprecated solutions, they still got onto the chart for the first time (moved downwards in ability to execute, forward as a visionary, and with a name change from Savvis!).

AWS leads with great mind share as well as market share across the entire ecosystem, and Microsoft has caught up with its strong enterprise solutions and partner-led customer value creation.

As for predictions for 2015: I believe we will see IBM and Google moving up into the Leaders quadrant. VMware will be hoping to move up in terms of execution, but may end up faring better as a visionary. The report and Lydia's blog cautioned that CSC is becoming more of a platform-neutral provider with its ServiceMesh capability; I believe that is why CSC acquired it, as the services market is huge. We may see more of CSC's services play, similar to the Dell announcements last year, rather than continued building on its infrastructure play. While CSC may not climb further on this chart in 2015, I believe it will perform better in the services business in due course.

Latest Magic Quadrant for May 2014:



Last year's Magic Quadrant, May 2013:


Monday, May 19, 2014

Data purging and log rotation are now "dead"

In my early days as a developer on AT&T SVR4 sub-systems, I worked on several new features and improvements related to the core OS as well as ecosystem solutions. One of the most used ecosystem solutions at that point covered Backup, Archival and Recovery (a.k.a. BAR) use-cases. Fast forward 15 years: the value of OS up-time depreciates, the cost of storage falls from $1 per KB to 1c per GB, and the use-cases change totally.

In the past decade of business application development, one of the core operational elements has been handling data purging and log rotation in applications. Just think for yourself: how many times have you seen web applications where only the past 3 months of data is available to view? While most developers don't realize it and devops don't pay attention to it, the programs written by younger developers don't consider this operational requirement at all, because data storage is infinite for them; and for the business teams it is all about the data: the more the better, the bigger the merrier (a.k.a. big data!). There is a generation gap you can see if you just think about this one aspect.

In the past decade of IT operations, one of the core monitoring elements was storage and whether volumes were becoming full. The typical managed-storage price was 28c per GB even a year ago, and one must believe this is not a one-off case; every enterprise was paying it, whether IT was done in-house or outsourced. While most CIOs don't realize it, cloud infrastructures and cloud applications are built to operate on a never-ending pool of storage volumes. In fact, it will be foolish in the future to monitor storage capacity without considering the ecosystem involved. Unless either the application or the business mindset is legacy, the purging of data will be dead, and the application designers looking to rotate log files will be dead too.

What makes sense is to build applications so that data is always fetched in pages, using good data architectures and operational tools that handle auto-sizing and tiering. Whether storage costs 1c per GB or 1c per PB, tiering will always play a role in maintaining the competitive position of business applications. Summary… forget data purging and plan for data tiering.
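The "page, don't purge" idea can be sketched as follows. This is only an illustration under assumed field names (`ts`, `id`) and an assumed 90-day hot window: the application reads a window of data at a time, and a tiering policy moves old records to cheaper storage instead of deleting them.

```python
# A sketch of page-based access plus age-based tiering: data is fetched
# in pages, and old records move to a cheaper tier rather than being
# purged. Field names and the 90-day window are illustrative assumptions.

DAY = 86400  # seconds in a day

def fetch_page(records, page, page_size=100):
    """Return one page of records, newest first, instead of all history."""
    ordered = sorted(records, key=lambda r: r["ts"], reverse=True)
    start = page * page_size
    return ordered[start:start + page_size]

def tier_for(record, now_ts, hot_days=90):
    """Pick a storage tier by age; nothing is ever deleted."""
    age_days = (now_ts - record["ts"]) / DAY
    return "hot" if age_days <= hot_days else "cold"

# Sample data: one record per day for 200 days
records = [{"id": i, "ts": i * DAY} for i in range(200)]
now = 200 * DAY

first_page = fetch_page(records, page=0, page_size=10)   # ten newest records
cold = [r for r in records if tier_for(r, now) == "cold"]  # aged-out records
```

The application code only ever asks for a page; whether that page is served from hot or cold storage is the tiering layer's concern, which is exactly the separation the post argues for.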

Thursday, April 10, 2014

451 Research - Cloud360 is the leader in both cloud management product and services

The report I shared last week focused primarily on Cloud360 as a product. Now it's time to see how we fared in services.

It is with great pleasure that I announce the findings published in a recent report by 451 Research on "IT Management as a Service", where we were placed in the leader quadrant for 'Service Delivery and Excellence'.
451 Research says: "With Cloud360, Cognizant can manage the existing customer estate, as well as the virtualized layers. It is a good fit for customers looking to experiment with cloud in public, private and on-premises scenarios because it keeps processes consistent." By further developing its "blended brokering and managed support services for applications and infrastructure", Cognizant can strengthen its foothold in the global market and gain sustained popularity.

“Tooling, automation, process and policy” have gained a lot of importance for businesses looking at long-term benefits and sustainability in maintaining service portfolios. This has led to the growing requirement for service integration and management, positioning “IT as a Service” as a desired goal. The report by 451 Research clearly shows that “Cognizant has come to market more quickly with an ITMaaS offering – in the form of OnTarget and Cloud360 – than many of the tier one Western-headquartered service providers and is managing to grow a respectable installed base for the offering among its customers.”

A SWOT analysis of ITaaS and ITMaaS vendors has also established Cognizant as one of the best and leading service providers. The 451 Research report states that while some vendors have disjointed go-to-market strategies, no standalone ITMaaS offerings, old-fashioned tooling, and unsatisfactory workload and business-process management cycles, the only possible disadvantage for Cognizant is a lack of visibility as an infrastructure service provider.

We believe that 451 Research validates our approach of building a comprehensive cloud management platform and delivering services for our customers on their journey to IT as a Service (ITaaS) and IT Management as a Service (ITMaaS).

Thursday, April 3, 2014

Cognizant Cloud360 Leads Cloud Management in Enterprise Customer Survey from 451 Research

I am happy to share that in a recent report published by 451 Research on Cloud Management and Automation, IT practitioners across the globe selected Cognizant's Cloud360 as the leading vendor in five of the six measured categories.


The 451 Research report showed that Cloud360 (listed as Cognizant in the report) excelled among competing software vendors in the following categories:
  • Unified Cloud Management Console
  • Cloud Governance
  • Multi-cloud Management
  • Metering/Billing on Hybrid Clouds
  • Cloud Brokerage
The findings were based on interviews with a “…network of IT professionals and key decision-makers at large and midsized enterprises.” These were supported by additional interviews with IT stakeholders such as IT managers, technology vendors, managed service providers, telcos and VCs, as well as primary research.
The report looks at the adoption of “…tools that are emerging to support the selection, provisioning, scheduling and dispatch of work to BEVs (Best Execution Venues) and the acquisition of services from them.” The research found that “… sophisticated management and automation tools are going to be required to ensure that a cloud can deliver on its promise of faster and more flexible services, on more devices and on-demand.”

According to 451 Research's survey, cloud management technologies such as unified cloud management consoles and cloud governance tools are in the early stages of gaining traction with users. The report notes that the “…next few years, however, will see a significant change in these areas, with close to one-third of respondents planning to implement these technologies in their cloud environments. Spending plans are beginning to gain momentum, with 16% and 10% of respondents citing increased spending on cloud management consoles and governance technologies, respectively, in 2013.”

We believe that this report from 451 Research validates our approach of building a comprehensive cloud management platform. For years now, our focus has been to guide our customers on their journey to IT as a Service (ITaaS), and IT Management as a Service (ITMaaS) by enabling Enterprises to use best of breed cloud services.

At Cloud360, we recognize that if IT is to operate at the speed of business, and reach that level of agility, efficiency and control, then a new approach to management is needed for the cloud era. One that delivers automated operations, intelligent operational analytics, cost transparency, on-demand access to any service, and the flexibility to support any platform or cloud.


Wednesday, March 5, 2014

No enterprise is a greenfield: Cloud360 manages and brokers across what you already have

Traditional IT environments have separate operational layers for running different applications, operating systems, networks and platforms. Provisioning an environment in such a scenario requires customers to reach out to separate teams that work in silos and have different policies, processes and compliance needs. Moreover, deploying these environments manually carries its own risks and setbacks, especially the possibility of business-impeding errors. The single view of application availability that Cloud360 provides not only minimizes the risk of delayed service delivery or erroneous deployment of environments, but also simplifies the cloud computing experience with its completely virtualized and automated services.

The 451 Take on Cloud360: ... will be successful because it provides the kind of service-oriented delivery that its existing customers expect. Cloud360 customers – most of which will already have accounts with Cognizant – typically regard AWS and other clouds as technology platforms, rather than service organizations...

The key automation function in Cloud360's ‘Hyperplatform’ is provided by the ‘Xi’ automation layer (language and APIs), which abstracts terminology, methods and access controls from AWS, VMware and other cloud environments to monitor and manage those environments from a single console. The interesting bit is that a command that could signify different things to different clouds has the same effect across all supported clouds when issued in ‘Xi’.
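The idea of one command having the same effect across all supported clouds can be sketched in code. The following is a purely illustrative Python sketch of such an abstraction layer; the real ‘Xi’ language and its APIs are not public, so every class and method name here is an assumption, not the actual interface.

```python
# Hypothetical sketch of a Xi-style abstraction layer: one abstract verb
# ("provision") is translated into the provider-specific call for each
# supported cloud. All names here are illustrative, not the real Xi API.

class AWSDriver:
    def run_instances(self, count):
        # AWS's native verb for creating VMs
        return [f"aws-vm-{i}" for i in range(count)]

class VMwareDriver:
    def clone_vms(self, count):
        # VMware's native verb for creating VMs
        return [f"vmw-vm-{i}" for i in range(count)]

class UnifiedConsole:
    """Maps one abstract command onto each cloud's native operation."""

    def __init__(self):
        self.drivers = {"aws": AWSDriver(), "vmware": VMwareDriver()}

    def provision(self, cloud, count):
        driver = self.drivers[cloud]
        # Same abstract command, different native call per provider.
        if cloud == "aws":
            return driver.run_instances(count)
        return driver.clone_vms(count)

console = UnifiedConsole()
aws_vms = console.provision("aws", 2)      # ['aws-vm-0', 'aws-vm-1']
vmw_vms = console.provision("vmware", 2)   # ['vmw-vm-0', 'vmw-vm-1']
```

The point of the sketch is the single console: the caller issues `provision` and never touches provider terminology directly.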


Cognizant's Cloud360 offers a wide range of services, from management and brokering to provisioning, orchestration, monitoring and auto-remediation. It follows a strict, policy-based cloud deployment architecture and supports customers with its advisory services. The strength of the platform is that it builds on what enterprises already run: IT organizations have their own platforms and tools for day-to-day business, and Cloud360 plays a pivotal role in transforming these existing IT environments by integrating them with other ecosystem components and facilitating management across multi-cloud platforms to yield optimum business benefits.

... There are some 150 staff members in the Cloud360 unit, which is funded, Cognizant says, like a startup. With around 52 Fortune 2000 enterprise customers across the globe (a dozen or so using the Managed OS operations and applications capability), the Cloud360 hyperplatform manages thousands of virtual environments and performs millions of cloud operations every day. Cloud360’s revenue has grown 250% each year, and by the end of this year the mix may be more diverse as it picks up business in Singapore and other regions as well. ...

Thursday, February 20, 2014

Towards a secure cloud... QualysGuard and Cloud360 join hands for "SecureApp"

Enterprise IT is becoming increasingly dependent on hybrid cloud infrastructures for executing day-to-day business operations. But when enterprises shift from traditional and static compute environments to dynamic IT services, they often face challenges related to configuration, data residency, data privacy and compliance. As a result, it is absolutely necessary for enterprises to be geared up against any risks that come with cloud adoption.

The World Economic Forum has identified cyber threat as the 4th most significant global trend in 2014, and executives consistently cite security as the single biggest challenge in ensuring a safe environment across clouds. The security of enterprise applications has therefore become a matter of great urgency. Keeping these factors in mind, Cognizant has developed a joint solution, 'SecureApp', by integrating Cloud360 with QualysGuard to safeguard application environments from security vulnerabilities and to expose detected threats to Cloud360's automated systems for auto-remediation.

The best thing about this solution (infographic) is that it facilitates on-demand, periodic, and continuous scans to identify vulnerabilities and takes auto-corrective actions whenever necessary. It offers a holistic management console that safeguards applications from both internal and external security threats. It scans the entire application stack, database, operating system, and platform services, and auto-remediates according to configurable policies. Beyond security issues, it also safeguards enterprise applications from compliance vulnerabilities, protecting the cloud environment from internal compliance threats as well as threats related to industry and regulatory compliance.
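The scan-then-auto-remediate loop described above can be sketched in a few lines. This is a hedged illustration of the general pattern only; the function names, the stack model, and the policy shape are all assumptions for the sake of the example, not the actual SecureApp or QualysGuard API.

```python
# Illustrative scan-and-remediate loop: a scanner reports vulnerable
# layers, and a configurable policy decides which findings are
# auto-remediated versus escalated. All names are hypothetical.

def scan(stack):
    """Pretend scanner: report every layer not marked as patched."""
    return [layer for layer, patched in stack.items() if not patched]

def remediate(stack, findings, policy):
    """Auto-fix only the layers the policy allows; return what was fixed."""
    fixed = []
    for layer in findings:
        if policy.get(layer, False):
            stack[layer] = True      # mark layer as patched
            fixed.append(layer)
    return fixed

# One application stack: True means the layer is already patched.
stack = {"os": False, "database": True, "platform": False}
# Policy: auto-fix the OS layer, but escalate platform issues to a human.
policy = {"os": True, "platform": False}

findings = scan(stack)                       # ['os', 'platform']
fixed = remediate(stack, findings, policy)   # ['os']
```

Anything the policy declines to auto-fix stays in the findings for manual follow-up, which is how configurable policies keep humans in control of sensitive layers.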

“The Cognizant Cloud360 platform offers a seamless cloud-based service management layer between applications and infrastructure helping enterprises achieve agility, operational efficiency, and better IT governance, ” said Philippe Courtot, chairman and CEO for Qualys. “The QualysGuard Cloud Platform seamlessly integrates with Cloud360, helping customers to continuously assess their security and compliance posture with no software to install and maintain.”

“Cognizant is committed to helping clients worldwide find the best solutions to meet their most critical cloud security challenges,” said Ramesh Panuganty, Founder and Managing Director for Cognizant Cloud360. “Now, partnering with Qualys, we are pleased to offer our customers industry leading QualysGuard solutions to enable them to secure their multiple cloud environments and meet compliance regulations easily and seamlessly through the Cloud360 platform.”

QualysGuard is used by 6,700 customers in over 100 countries, performing over 1 billion IP scans/audits every year. Fortune 1000 enterprise customers across the globe rely on Cloud360 to manage thousands of virtual environments and perform millions of cloud operations every day. Together, Cloud360 and QualysGuard help you define, orchestrate, and operate multiple IT environments through one console, backed by an across-the-board security vendor, Qualys, to ensure best-in-class security services.


Thursday, February 6, 2014

Making Bugs Ineffective

“Let’s make him ineffective.” In the movie Speed, the hero makes this statement when the villain threatens to blow up the elevator cable and kill its occupants. He saves the situation by securing the elevator with a parallel cable, which buys him enough time to rescue the occupants even after the main cable is blown up.

We apply a similar strategy to Cloud360: the strategy of making any bug ineffective. Since it is not feasible to control every datacenter situation, the next best thing we do is neutralize the effects of the bad elements that have the potential to derail the solution. As a result, bugs are not allowed to disrupt work; if the Management Console is not connected, the Service Console continues to do its job.

The other approach that we take is to identify any potential issue that may affect the health of the environment so that necessary measures can be taken before any real damage is done. In software testing and compliance, any unintended change is the enemy of quality. Without confidence that each virtual machine has been properly configured and there is consistency across the environments, it is impossible to know if a system crash or slowdown is due to a bug or merely an incorrect patch level or system setting. Without proper and consistent configurations, there is no way to ensure that known security vulnerabilities (a leading cause of security breaches) have been properly closed in all affected systems.

Cloud360 deployments are loosely coupled, so that if one component fails, the impact is confined to just one or two use cases and does not affect the whole system. This holds even if network connectivity fails across components, where queuing kicks in. Since Cloud360 configurations work across provider platforms, bugs that would otherwise result from differences between staging and production hardware are also eliminated. Cloud360 thus delivers improved software quality and compliance through increased consistency in the testing environment and standardization of the platform.
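The queuing behaviour mentioned above, where a connectivity failure between components degrades one use case rather than the whole system, can be sketched as follows. This is a minimal, hypothetical illustration of the pattern; the class and method names are my own, not Cloud360 internals.

```python
# Minimal sketch of queue-backed loose coupling: when the remote side is
# unreachable, commands are queued locally and replayed in order once
# connectivity returns, so a single failure does not lose work.
from collections import deque

class ResilientDispatcher:
    def __init__(self):
        self.connected = True
        self.pending = deque()   # commands waiting for connectivity
        self.delivered = []      # commands the remote side has received

    def send(self, command):
        if self.connected:
            self.delivered.append(command)
        else:
            self.pending.append(command)   # queue instead of failing

    def reconnect(self):
        self.connected = True
        while self.pending:                # replay queued commands in order
            self.delivered.append(self.pending.popleft())

d = ResilientDispatcher()
d.send("start-vm")
d.connected = False      # simulate a network partition
d.send("stop-vm")        # queued, not lost
d.reconnect()            # replayed on reconnect
```

After `reconnect()`, both commands have been delivered in their original order, which is the property that keeps a partial outage from becoming a system-wide one.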


Sunday, January 5, 2014

Story of Quantity vs Quality… from Conception to Execution to Perfection!

There is always a new technology in the market claiming to override its predecessor. Yesterday’s solutions and experience may not be relevant today. It is important that we analyze the trends and make continuous improvements to the solution we offer. We may make some mistakes along the way, but the lessons learned from these mistakes, together with the endeavor to continuously enhance the solution, result in the creation of a flawless product. Let me start with a story.

A school teacher recently conducted a very interesting experiment that corroborates this point. He wanted to see whether we need to change the way we set goals, make progress and become better at what we do. He divided his class into two groups: one that would be graded solely on the quantity of work produced, and the other solely on quality. Both groups were asked to create pots. The ‘Quantity’ group would be graded on the number of pots it produced in a day, while the ‘Quality’ group needed to produce only one pot, albeit a perfect one, to get an A. The outcome was remarkable: the best pots were those created by the ‘Quantity’ group. While the ‘Quantity’ group was churning out piles of work, learning from its mistakes, and thereby producing ever better work, the ‘Quality’ group sat theorizing about perfection and had little more than grandiose theories to show for it at the end of the day.


We followed a similar approach in the development of Cloud360: we conceptualized it, developed it, implemented it in active engagements during every stage of the product, and made multiple enhancements that refined it into the solution it is today.

Around five years ago, we saw that the industry was moving towards fluid computing models, with infrastructure increasingly available in as-a-service models. We noticed that economies of scale could no longer justify legacy infrastructure models. Even before the term ‘cloud’ was coined, we had started exploring the idea of building a solution that would address the changing landscape at customer organizations, preserve their investments, and create new modes of infrastructure delivery. We eventually developed Cloud360, a comprehensive, service-oriented cloud management platform that integrates consulting, professional and IT services to fundamentally transform a client’s environment.

We had customers using Cloud360 at a time when the very concept was new in the market. We were not merely reacting to the need of the moment or theorizing about a solution; we were actively managing customer environments and handling real challenges in real time. Continuous analysis of user feedback and requirements enabled us to constantly enhance its features. These early customer engagements helped us refine our concept, design approach, iterations, product roadmap, and our approach to the reliability, availability and serviceability of the product.

Envisioned at a time when the word ‘cloud’ did not yet exist, the hyperplatform (now Cloud360), with its ability to deploy, manage, and operate modern, dynamic, and scalable architectures swiftly and cost-effectively within a seamless, holistic environment, has come a long way toward being the preferred choice for organizations planning operations on the cloud.

Friday, December 20, 2013

Cognizant's Cloud360 is OnTarget to deliver ITMaaS market share... 451 Research

IT-management-as-a-service (ITMaaS) is a promising space in the cloud computing market that is expected to grow significantly in the coming years. Cognizant is already leveraging Cloud360 to deliver ITMaaS, and what makes Cloud360 a key differentiator is its ability to integrate application and infrastructure services.

Cloud360 offers an application management module at levels 1 and 2 of the ITIL workflow, making the application infrastructure completely traceable. It helps deliver integrated services by managing the workflow seamlessly between the application and infrastructure layers and by providing application awareness for holistic management of environments.

Cloud360 plays a vital role in the integration process, whereby the cloud can be integrated into the main management stack so that services can be monitored in a non-linear way. It is useful whether customers want to use public, private, or on-premises platforms, and acts as an accelerator that can be used for one or multiple VMs. Cognizant offers Cloud360 as a product, providing licenses on a SaaS model.

Cloud360’s key automation capability is facilitated by Cognizant’s Xi automation layer, which abstracts VMs from AWS and VMware and manages those environments through a single console. Deployment and integration are supported by Cognizant’s managed services team or through the self-service portal. Apart from the provisioning, orchestration, monitoring, auto-remediation, metering and chargeback, analytics, and advisory services that customers can access through the self-service portal, what makes the product unique is the customer’s ability to plug existing tools into Cloud360’s service management layer. The service management layer, coupled with the operations management layer, results in a unique way of delivering ITMaaS.

Both Cognizant and its customers have benefited from the real-time dashboard, which eliminates dependence on periodic reports for measuring SLA performance. Cloud360’s customer portfolio is growing steadily compared with some of its competitors, indicating its rising popularity, and Cloud360 is considered one of the key determinants of Cognizant’s success in the ITMaaS market.

Get insights into ITMaaS in this news report from 451 Research: “The company's Cloud360 offering appeals to the market because of its 'tenancy agnostic' architecture. Could this be a theme for 2014 ITMaaS market development?” by Katy Ring & William Fellows.


Thursday, December 5, 2013

Winner of UP-CON Best Cloud Automation Solution, 2013

The Cloud360 success story continues…

I am happy to announce that Cognizant's Cloud360 has been named by UP-CON as the Best Cloud Automation Solution of 2013. Being selected for this award from among some of the best next-generation cloud computing companies is a very prestigious recognition for us. The cloud automation feature of Cloud360 is creating quite a stir in the cloud world, giving us the impetus to enhance it even further.

The ability to automate application environments is one of the primary features drawing more and more customers to Cloud360. Its customizable automation policies make it a unique offering, giving customers full control of their environments, while its Instance and Application profiles enable automated deployment, allowing customers to standardize and simplify the automation of application deployments. The automation policies cover a range of activities: how provisioning happens, where it happens, how a monitoring event is handled, and how cost control is enforced in an environment. This is a luxury IT users never enjoyed before.

The technology behind Cloud360’s automation capability is "Xi - Experience Intelligible", an easy-to-use, English-like language that enables IT users to automate their everyday operations. It empowers administrators and business users to dynamically control the behavior of their enterprise applications. We recognized that different enterprises have different management, monitoring and automation needs, and one type of policy may not work for all. With the Xi Configurator, users have the flexibility to create policies based on their requirements, giving them greater decision-making power and control over their environment. Users can also keep a check on resource utilization and consumption by different groups and take timely action to avoid misuse and contain costs.
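To make the policy idea concrete, here is a small Python sketch of how user-defined automation policies might be evaluated: each policy pairs a condition on the environment with an action, roughly mirroring the configurable monitoring and cost-control policies described above. This is not the real Xi language or Configurator, just an illustration of the concept, and every name in it is an assumption.

```python
# Hypothetical policy engine: each policy is a (condition, action) pair,
# and every policy whose condition matches the environment fires.

def make_policy(condition, action):
    return {"condition": condition, "action": action}

def evaluate(policies, env):
    """Run the action of every policy whose condition matches env."""
    triggered = []
    for p in policies:
        if p["condition"](env):
            triggered.append(p["action"](env))
    return triggered

policies = [
    # Monitoring policy: scale out when CPU is hot.
    make_policy(lambda e: e["cpu"] > 80,
                lambda e: "scale-out"),
    # Cost-control policy: alert the owner when spend exceeds budget.
    make_policy(lambda e: e["monthly_cost"] > e["budget"],
                lambda e: "notify-owner"),
]

env = {"cpu": 91, "monthly_cost": 500, "budget": 400}
actions = evaluate(policies, env)   # ['scale-out', 'notify-owner']
```

Because policies are data rather than code baked into the platform, users can add, remove, or tune them per group without touching the engine, which is the flexibility the Configurator idea points at.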

Cloud360’s automation capability enables business users as well as administrators to collaborate and customize the hosting of applications for their end users. There was a time when business users had no control over applications at run time even though they owned the business. With Cloud360, enterprises are experiencing greater synergy between business users and administrators. With the help of policies, users can easily automate processes, saving the time spent configuring and troubleshooting environments, reducing manual error, and freeing time and resources for more productive activities.

This technological innovation has significantly simplified the use of cloud technology and the management of environments. Making automation easily accessible and customizable puts greater decision-making power at users' fingertips, providing a powerful virtual canvas for automation. It is a breakthrough that has redefined the way cloud environments are managed and monitored, and it is extremely rewarding to see this technology gain more recognition each day through a growing number of accolades and awards. This spurs us on to further enhance the Cloud360 platform with new innovations.


Friday, November 29, 2013

Brazil - Technology and Business Innovations

I was in Brazil earlier this week to speak at the Cognizant Innovation Forum in Sao Paulo, and was struck by the tremendous potential the country holds for start-ups and business innovators. While I knew that the country’s economy is gradually rising, creating excellent investment opportunities, I was fascinated by the perspectives and geography-specific requirements of a country that ranks #2 in Facebook and Twitter usage. I met with various customers and investors, and spoke about a diverse set of topics, from banking innovations to emerging technologies in card payments, online insurance, retail analytics, and of course cloud computing use cases.

One interesting fact I noted was the rise of online sales from 0 to 28 billion within a span of just 12 years, indicating phenomenal economic growth. I happened to visit a hippie fair and was quite surprised to find that most sellers accepted credit cards; that spoke volumes about the country’s economy. The steady economic growth in Brazil is producing a growing middle class, which is creating demand for different kinds of insurance products. During the meet, one of the discussions revolved around the challenges, related not just to technology but also to people, process and social elements, of bringing a SaaS-based insurance offering to a country with more than 70,000 insurance agents. The demand for insurance products in Brazil, in fact, is said to be as diverse as it is in Europe and the U.S.A.

We also discussed a unique challenge in retail and online stores. Close to 25% of credit card transactions in Brazil time out and translate into lost business opportunities; in the U.S.A. this number is less than 3%. In this context, we spoke about the possible underlying issues in the infrastructure. I met with a start-up in Brazil that is trying to solve this technology problem and decrease the percentage of failed transactions for businesses. We had addressed similar lost-opportunity problems for several of our online Cloud360 customers using auto-remediation and other platform features. It was interesting to see a different perspective on the business problem, the technology, and the solution as well.

My stay was a fruitful one: I learned a lot about the country’s prospects and concluded that, in spite of some of its challenges, this is a place with plenty of scope for start-ups and companies planning business innovations.

Thursday, October 3, 2013

Developer-centric IT World

With cloud computing now a well-established option for IT operations, cloud providers have traditionally ensured that they delivered a varied cloud experience to a wide range of enterprise customers. Be it a private or a virtual private cloud solution, or one built solely for security and performance, providers made sure their IT operations clientele got what they needed - or what the providers thought they needed!

Most cloud providers today realize that the cloud world is becoming more and more developer-centric. Many developers are launching applications without depending on IT. There are times when IT operations cannot provide the required infrastructure within a set timescale, and at times like these, developers turn to the cloud for competitive advantage. Businesses have ever more aggressive deadlines for delivering new or improved offerings, and those deadlines more or less become the common denominator for timelines.

Apart from reliability and stability, what developers really look for from their cloud providers is flexibility, affordability and speed in turning up new services. In essence, every customer today comes with a different, unique requirement. Cloud users now have the luxury of not relying on a single cloud provider to meet their specific requirements. They can choose system integrators that are, as Gartner puts it, “combining application development and application management with managed services on a strategic cloud IaaS provider’s platform”.

Cognizant’s Cloud360, for instance, enables users to provision, monitor, and manage their application environments faster and more efficiently than before. With its self-service portal, development teams spend less time provisioning new environments and more time understanding business needs and developing applications. It enables agile development and test environments, empowers the engineering team to meet the demands of a tighter development life cycle, and enables the business to deliver faster, higher-quality products to users.

Thursday, September 19, 2013

How Disaster-proof is your Data?

When we talk about disaster recovery strategy and business continuity planning, ‘Cloud’ is bound to feature on the agenda. Apart from the flexible backup and recovery solutions offered by cloud providers, the cloud itself is sometimes used as a backup repository. What other disaster recovery plan should one have in place? Well, imagine this scenario:
Your cloud provider is going out of business due to financial difficulties and has given you two weeks’ time to vacate all your data.

What would you do in a situation like this? Even if you managed to restore all your data in time, would you be able to continue your business with another cloud provider with the same level of confidence as before? Maybe not!

Cloud computing is undoubtedly one of the best IT innovations in recent times, having redefined the way we operate today. Then again, it is not the solution to all our IT problems. Like every new technology, cloud too comes with its set of challenges. It is important to realize its benefits and challenges so as to leverage it in a way that would best suit your company’s needs.  


Storing data in the cloud might seem like the ideal disaster recovery plan in today’s age. There is no doubt that backup and recovery of data are much simpler in the cloud than with traditional methods of data storage. While this holds true for any kind of natural disaster, what does one do in the case of a business disaster like the one in the scenario above? Is there a recovery plan in place for business disasters such as this?

A foolproof way to alleviate all cloud-related concerns is to invest in a strategy that supports multiple cloud platforms: a hyperplatform that supports industry-leading cloud platforms through a single interface and enables you to switch between clouds, add or remove clouds, and change platforms smoothly as and when you require. Its ability to manage multiple cloud platforms provides the flexibility to leverage them all while eliminating vendor lock-in. Not only does it help protect your previous IT investments, it also future-proofs your environment against challenges emerging from advances in technology.
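The lock-in argument above boils down to a design choice: write against a small provider-agnostic interface rather than any one provider's API, so data and workloads can be moved when a provider fails you. Here is a minimal Python sketch of that idea, with entirely hypothetical provider names and methods; it illustrates the pattern, not any real hyperplatform API.

```python
# Sketch of provider-agnostic storage: application code talks only to the
# CloudStore interface, so backups can be migrated between providers
# without changing application logic. All names are illustrative.

class CloudStore:
    """Minimal common interface each provider adapter would implement."""

    def __init__(self, name):
        self.name = name
        self.objects = {}   # stand-in for the provider's object storage

    def put(self, key, data):
        self.objects[key] = data

    def get(self, key):
        return self.objects[key]

def migrate(source, target):
    """Copy every object from one provider to another via the interface."""
    for key, data in source.objects.items():
        target.put(key, data)

old = CloudStore("provider-a")
old.put("backup-2013-09", b"critical data")

# The failing provider gives two weeks' notice: vacate to a new one.
new = CloudStore("provider-b")
migrate(old, new)
restored = new.get("backup-2013-09")   # b'critical data'
```

In a real multi-cloud setup, each `CloudStore` would wrap a different provider's SDK behind the same `put`/`get` surface; the point is that `migrate` and the rest of the application never need to know which provider is underneath.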