Cloud · IoT

Understanding the five tiers of IoT core architecture

The Internet of Things (IoT) is all the rage today. Just tagging something as part of the IoT family brings quite a bit of attention. However, this tagging has also created quite a bit of noise in the industry for organizations trying to sort through how best to leverage IoT. Call it IoT marketing overload. Or IoT-washing.

That being said, just about every industry can leverage IoT in a meaningful way today. But where does one begin? There are many ways to consider where to start your IoT journey. The first is to understand the basic fundamentals of how IoT solutions are architected. The five tiers of IoT core architecture are: Applications, Analytics, Data, Gateway and Devices. Using this architecture, one can determine where any given IoT solution fits…and the adjacent components required to complete the solution.

THE FIVE TIERS OF IOT CORE ARCHITECTURE

  • DEVICE TIER

The device tier is the physical hardware that collects telemetry (data) about a given situation. Devices can range from small sensors to wearables to large machines. The data itself may be presented in many forms, from electrical signals to IP-based data.

The device may also display information (see Application tier).

  • GATEWAY TIER

The sheer number of devices and interconnection options creates a web of complexity to connect the different devices and their data streams. Depending on the streams, they may come in such diverse forms as mechanical signals or IP-based data streams. On the surface, these streams are completely incompatible. However, when correlating data, a common denominator is needed. Hence, the need for a gateway to collect and homogenize the streams into manageable data.
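The homogenizing step can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's API: the payload field names and the 10 mV-per-degree conversion factor are assumptions for the example.

```python
from datetime import datetime, timezone

def normalize(reading):
    """Map a raw device payload to one common record shape.

    Handles three hypothetical device formats: imperial sensors,
    raw electrical signals, and IP/JSON devices reporting Celsius.
    """
    if "temp_f" in reading:                     # imperial sensor
        celsius = (reading["temp_f"] - 32) * 5 / 9
    elif "millivolts" in reading:               # raw electrical signal
        celsius = reading["millivolts"] / 10.0  # assumed 10 mV per degree C
    else:                                       # IP-based device, already Celsius
        celsius = reading["celsius"]
    return {
        "device_id": reading["id"],
        "celsius": round(celsius, 2),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

print(normalize({"id": "s1", "temp_f": 212})["celsius"])  # 100.0
```

Once every stream is reduced to the same record shape, the data tier can store and correlate readings without caring which device produced them.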

  • DATA TIER

The data tier is where data from gateways is collected and managed. Depending on the type of data, different structures may be called for. The management, hygiene and physical storage of data is a whole discipline unto itself, simply due to the four V’s of data (Volume, Variety, Velocity, Veracity).

  • ANALYTICS TIER

Simply managing the sheer amount of data coming from IoT devices creates a significant hurdle when converting data into information. Analytics are used to automate that process for two reasons: manageability and speed. Together, these two provide insights into the varied and complex data coming from devices. As the number and type of devices grow and become increasingly complex, so will the demand for analytics.
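A toy sketch of the manageability-and-speed point: a detector that keeps only a small rolling window, so each reading is checked in constant time no matter how large the stream grows. The window size and threshold are arbitrary choices for illustration.

```python
from collections import deque

def make_detector(window=5, threshold=2.0):
    """Flag readings that deviate from the recent rolling mean.

    Each reading is processed in O(1) as it arrives; only `window`
    values are retained, regardless of stream length.
    """
    recent = deque(maxlen=window)

    def check(value):
        anomaly = bool(recent) and abs(value - sum(recent) / len(recent)) > threshold
        recent.append(value)
        return anomaly

    return check

check = make_detector()
readings = [20.0, 20.2, 19.9, 20.1, 27.5, 20.0]
flags = [check(v) for v in readings]
print(flags)  # only the 27.5 spike is flagged
```

Real analytics tiers apply far more sophisticated models, but the pattern is the same: automate the judgment so it keeps pace with the stream.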

  • APPLICATION TIER

Applications may come in multiple forms. In many cases, the application is the user interface that takes information coming from the analytics tier and presents it to the user in a meaningful way. In other cases, the application may be an automation routine that interfaces with other applications as part of a larger function.

Interestingly, the application may reside on the device itself (e.g., a wearable).

IoT Architecture


Today, many IoT solutions cover one or more of the tiers outlined above. It is important to understand which tiers are covered by any given IoT solution.

CLOUD-BASED IOT SOLUTIONS

Several major cloud providers are developing IoT solutions that leverage their core cloud offering. One thing that is great about these solutions is that they help shorten IoT development time by providing fundamental offerings that cover many of the tiers outlined above. Most of the solutions focus on the upper tiers to manage the data coming from devices. Three such platforms are: Amazon AWS IoT, IBM Watson IoT, and Microsoft Azure IoT Suite. Each emphasizes a different suite of ancillary solutions. All three allow a developer to shorten the development time for an IoT solution by eliminating the need to develop for all five tiers.

THE SECURITY CONUNDRUM

One would be remiss to discuss IoT without mentioning security. Security of devices, data elements and data flows is an issue today that needs greater attention. Instead of a one-off project or add-on solution, security needs to be part of the DNA infused in each tier of a given solution. Judging by current solutions, there is still a long way to go on this front.

That being said, IoT has a promising and significant future.

Business · CIO · Cloud · Data

HPE clarifies their new role in the enterprise


Last week, Hewlett Packard Enterprise (HPE) held their annual US-based Discover conference in Las Vegas. HPE has seen quite a bit of change in the past year with the split of HP into HPE & HP Inc. They shut down their Helion Public Cloud offering and announced the divestiture of their Enterprise Services (ES) business, which will merge with CSC into a $26B business. With all of the changes and 10,000 people in attendance, HPE sought to clarify their strategy and position in the enterprise market.

WHAT IS IN AND WHAT IS OUT?

Many of the questions attendees were asking circled around the direction HPE was taking considering all of the changes just in the past year alone. Two of the core changes (shutting down Helion Public Cloud and splitting off their ES business) have raised many eyebrows wondering if HPE might be cutting off their future potential.

While HPE telegraphs that their strategy is to support customers with their ‘digital transformation’ journey, the statement might be a bit overreaching. That is not to say that HPE is not capable of providing value to enterprises. It is to say that there are specific areas where they provide value, along with a few significant gaps. We are talking about a traditional hardware-focused company shifting more and more toward software. Not a trivial task.

There are four pillars that support the core HPE offering for enterprises. Those include Infrastructure, Analytics, Cloud and Software.

INFRASTRUCTURE AT THE CORE

HPE’s strength continues to rest on their ability to innovate in the infrastructure space. I wrote about their Moonshot and CloudSystem offerings three years ago here. Last year, HPE introduced their Synergy technology that supports composability. Synergy, and the composable concept, is one of the best opportunities to address the evolving enterprise’s changing demands. I delve a bit deeper into the HPE composable opportunity here.

Yet, one thing is becoming painfully clear within the industry. The level of complexity for infrastructure is growing exponentially. For any provider to survive, there needs to be a demonstrable shift toward leveraging software that manages the increasingly complex infrastructure. HPE is heading in that direction with their OneView platform.

Not to be outdone in supporting the ever-changing software platform space, HPE also announced that servers will come ready to support Docker containers. This is another example of where HPE is trying to bridge the gap between traditional infrastructure and newer application architectures including cloud.

CLOUD GOES PRIVATE

Speaking of cloud, there is quite a bit of confusion about where cloud fits in the HPE portfolio of solutions. After a number of conversations with members of the HPE team, their solutions are focused on one aspect of cloud: Private Cloud. This makes sense considering HPE’s core infrastructure background and their challenges reaching escape velocity with the Helion Public Cloud offering. Keep in mind that HPE’s private cloud solutions are heavily based on OpenStack. This will present a challenge for those considering a move from their legacy VMware footprint. But it does open the door to new application architectures that are specifically looking for an OpenStack-based Private Cloud. However, there is already competition in this space from companies like IBM (BlueBox) and Microsoft (AzureStack). And unlike HPE, both IBM & Microsoft have established Public Cloud offerings that complement their Private Cloud solutions (BlueBox & Azure respectively).

One aspect in many of the discussions was how HPE’s Technical Services (TS) are heavily involved in HPE Cloud deployments. At first, this may present a red flag for many enterprises concerned with the level of consulting services required to deploy a solution. However, when considering that the underpinnings are OpenStack-based, it makes more sense. OpenStack, unlike traditional commercial software offerings, still requires a significant amount of support to get it up and running. This could present a challenge to the broad appeal of HPE’s cloud solutions, except for those few that understand, and can justify, the value proposition.

It does seem that HPE’s cloud business is still in a state of flux and finding the best path to take. With the jettison of Helion Public Cloud and HPE’s support of composability, there is a great opportunity to appeal to the masses and leverage their partnership with Microsoft to support Azure & AzureStack on a Synergy composable stack. Yet the current focus still appears to be on OpenStack-based solutions. Note: HPE CloudSystem does support Synergy via the OneView APIs.

SOFTWARE

At the conference, HPE highlighted their security solutions with a few statistics. According to HPE, they “secure nine of the top 10 software companies, all 10 telcos and all major branches of the US Department of Defense (DoD).” While those are interesting statistics, one should delve a bit further to determine how extensively this applies.

Security sits alongside the software group’s Application Lifecycle Management (ALM), Operations and BigData software solutions. As time goes on, I would hope to see HPE mature the significance of their software business to meet the changing demands from enterprises.

THE GROWTH OF ANALYTICS

Increasingly, enterprise organizations are growing their dependence on data. A couple of years back, HP (prior to the HPE/HP Inc. split) purchased Autonomy and Vertica. HPE continues to mature their combined Haven solution beyond addressing BigData into the realm of Machine Learning. To that end, HPE is now offering Haven On-Demand (http://www.HavenOnDemand.com) for free. Interestingly, the solution leverages HPE’s partnership with Microsoft and is running on Microsoft’s Azure platform.

IN SUMMARY

HPE is bringing into focus those aspects they believe they can do well. The core business is still focused on infrastructure, but also supporting software (mostly for IT focused functions), cloud (OpenStack focused) and data analytics. After the dust settles on the splits and shifts, the largest opportunities for HPE appear to come from infrastructure (and related software), and data analytics. The other aspects of the business, while valuable, support a smaller pool of prospective customers.

Ultimately, time will tell how this strategy plays out. I still believe there is an untapped potential from HPE’s Synergy composable platform that will appeal to the masses of enterprises, but is often missed. Their data analytics strategy appears to be gaining steam and moving forward. These two offerings are significant, but only provide for specific aspects of an enterprise’s digital transformation.

CIO · Cloud

IT transformation is difficult, if not impossible, without cloud

Information Technology (IT) transformation is all the rage these days. It started as a lofty objective among Chief Information Officers (CIOs) and shifted to a stark requirement for businesses to remain competitive. Even those beyond the IT organization are pushing IT transformation, including the rest of the C-Suite and Board of Directors. Why? Without it, companies struggle to remain competitive and potentially suffer catastrophic failure. Simply put, IT has become so important to a business’ success that transformation is now a requirement for remaining competitive in business.

At the same time, the maturity of cloud-based solutions leads to a fundamental requirement for IT transformation. Cloud is no longer just a discussion among IT professionals. Cloud is now a discussion among C-Suite executives and the Board of Directors. Essentially, IT transformation relies on cloud as a significant lever in a company’s arsenal.

THE RIGHT CLOUD CONVERSATION

However, not all cloud conversations are the same. While many in IT will focus on the technical merits (and hurdles) that cloud provides, C-Suite executives and Boards are looking at the leverage it provides for economic growth and business agility. If the CIO and IT organization are only focused on cloud for technical merit, it will inevitably fail. A conversation pitting one technology against another is missing a key component: context. What is the context in which one technology provides value over the other? And the answer needs to be in terms that convey clear business value.

The reality is that cloud is nothing more than a tool that provides significant leverage. The real question is: What leverage can cloud provide in terms of business advantage, not technical merit?

TRADITIONAL IT FAILURE

Historically, IT managed most of its solutions internally due to a lack of alternatives. Now it is time to get beyond doing everything internally. Even in a heavily regulated, compliance-driven industry such as Financial Services or Healthcare, mature solutions exist. In addition, those regulations and compliance requirements do not apply to every system and piece of data that IT manages.

In addition, new requirements coming from Internet of Things, Machine Learning, data integration and mobile will continue to rip apart traditional IT architectures. In essence, traditional architectures have no hope of keeping up with the increasing flow of data and complexity of solutions. IT desperately needs to change to keep up and get ahead of this onslaught.

GETTING PAST THE BASICS

In order for CIOs to build trust for transformation, they need to get the basics under control. This statement is non-negotiable. Fundamental functions like email, phone systems and file sharing need to work without incident. These solutions are becoming more complex, but not business-differentiating for any given organization. Yet many IT organizations continue to insist on running these functions internally. Sadly, many of the reasons given for this approach no longer hold true.

At the same time, mature cloud-based alternatives exist that provide greater stability, function and agility. Not only does running commodity functions create a distraction for the organization from business-differentiating functions, it also creates an incredible amount of risk to basic business functionality. Unfortunately, failures to get the basics right will continue to plague the CIO and rest of the IT organization by extension.

FOCUS ON THE RIGHT OBJECTIVES

To be clear, I am not saying cloud for cloud’s sake. There is a right and wrong place to leverage cloud. The IT organization needs to take a holistic approach to identify how best to leverage cloud. However, for commodity services, cloud should be a mandatory requirement at this point. And those organizations still trying to run commodity services internally…and failing…are only hindering their company’s progress.

It is time we (as IT leaders) take a serious look at our role and consider how best to leverage the tools at our disposal. Transformation is a requirement. Cloud is a requirement. The question is really how to chart the path forward. What we have done in the past will not serve us well in the future. And remember…time is not your friend.

Cloud · Data · IoT · Mobile

Intel playing a key role with Cloud, Mobile, IoT & Analytics

In the past couple of weeks, I spent time with the Intel team in Oregon to see their work in leading areas including Cloud, Mobile, Internet of Things (IoT) and Analytics. Before I get too far down the path, one may be asking what Intel, a chip manufacturer, is doing in some of these areas. As it turns out, Intel is actually one of the largest software developers today. Intel also plays a leadership role in driving adoption and bridging the gaps in these leading areas.

SUPPORTING THE MOVE TO CLOUD

Today, 75% of current cloud demand comes from consumer services. By 2020, 65-85% of applications will be delivered via cloud infrastructure. The key for Intel (and others) is to move from consumer applications to enterprise applications. Intel’s approach is to leverage the Jevons Paradox: the easier computing is to access, the faster the adoption. One of the key areas Intel is working on is orchestration software that is transparent vs. opaque.

Simply put, the industry is not moving fast enough. Friction exists in several key areas of adoption:

  • Fragmented solution stacks
  • Complexity in deploying solutions
  • Lack of key features

While these may seem straightforward, the path is not always the most direct. Intel IT is a great test bed of methodology, technology and culture. Today, any developer in Intel IT can request their own compute and storage instances.

One of the areas related to cloud is Intel IT’s move to Software Defined Networking (SDN). Prior to SDN, the process of Landing, Security Setup (ACL), Load Balancing and Auto IP Provisioning took an average of 31.99 days(!). After SDN, the process is nearly instant. The biggest challenges were Immature Technology (71%), Existing Network/Processes (64%), Lack of Knowledge/Training (29%) and Cost (25%).

To Intel, cloud is not the end-game, and Intel does not see enterprises completely divesting themselves of data centers. Intel’s perspective is that every CIO wants to get to a hybrid cloud scenario.

ENGAGING IN THE MOBILE ECOSYSTEM

Today, there are 1.9 billion smartphones. Each smartphone has an average of 26 applications. Each application has (on average) 20 transactions with a data center every day. That turns into 1 trillion data center transactions…every day!
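The arithmetic behind that figure, using the numbers quoted above:

```python
smartphones = 1.9e9          # smartphones in use
apps_per_phone = 26          # average applications per phone
txns_per_app_per_day = 20    # data center transactions per app per day

daily_txns = smartphones * apps_per_phone * txns_per_app_per_day
print(f"{daily_txns:,.0f}")  # 988,000,000,000 — roughly 1 trillion per day
```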

Imagine the challenges of scale using traditional data center technologies. The sheer amount of data, let alone transactions, is massive. And this is just what we see from the mobile endpoints.

THE INTERNET OF THINGS (IoT)

There is a significant opportunity for any of the IoT players by turning data into value. With 50 billion ‘things’ and 35 zettabytes of data, there is quite a bit of upside for even the most narrowly focused of companies. Intel is working with companies to enable the two categories of the IoT.

THE DATA AND ANALYTICS OF CANCER RESEARCH

One example is Intel’s partnership with Oregon Health & Science University (OHSU) to assist with their cancer research programs. OHSU is one of the country’s leading cancer research institutions. Intel has engaged with OHSU on multiple levels. However, one of the core activities when doing cancer research is genome sequencing.

Today, a single patient genome generates more than 1 terabyte of data. That’s 1TB+ per patient. With 1.65 million cancer patients in the US alone, that equates to 4 exabytes of data for genome sequencing. Today, <1% of cancer patients are actually sequenced due to a number of issues including costs. Imagine if all cancer patients were sequenced. Now imagine if patients for other diseases were sequenced. One can quickly see that we are just scratching the surface on data analytics in healthcare and have a long way to go!
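A quick back-of-the-envelope check: the 4-exabyte figure implies roughly 2.4 TB per patient, which is consistent with "more than 1 terabyte" each. The 2.4 TB average is inferred for this sketch, not stated in the figures above.

```python
patients = 1.65e6        # US cancer patients, quoted above
tb_per_patient = 2.4     # assumed average; the post says "more than 1 TB"

exabytes = patients * tb_per_patient * 1e12 / 1e18
print(f"{exabytes:.2f} EB")  # ~3.96 EB, i.e. roughly 4 exabytes
```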

SUPPORTING THE OVERALL EFFORT

As the scale for workloads moves from rack-scale to larger, specialized implementations, Intel is ready with custom silicon. Cloud providers, such as Amazon AWS, have already taken this approach to leverage a myriad of features that best support their service offering. Expect others to follow suit as their scale increases.


Today, the breakdown of Intel’s market by workload is as follows:


It is impressive to see how much of the workload pie is squarely focused on technical computing today. Consider how this will change as the adoption rate of cloud and analytics increases.

All that being said, Intel’s core is still building infrastructure technology. Their new 3D XPoint memory technology is about to turn the industry on its head. Consider that 3D XPoint addresses many of the concerns with NAND memory today and presents significant opportunities for applications in need of low latency, fast system recovery and high endurance. Large in-memory databases, gaming and genomics analysis are just a few of the leading contenders that will benefit from 3D XPoint memory technology.


In summary, Intel is far from just a ‘chip manufacturer’. They are constantly innovating on their silicon expertise while taking a leadership role in many of the hot technology areas. While many still struggle with the basic blocking and tackling of cloud adoption, there are many significant opportunities that lie ahead of us both commercially and personally.

IoT

IoT is squarely about two things


Internet of Things (IoT) is one of the hottest topics today. Everyone is talking about it and how their products and services fit in with IoT. With so much discussion around the subject, a healthy amount of confusion will follow. In a nutshell, IoT presents a significant opportunity for both companies and consumers.

The purpose of this post is to break down the subject into two core areas. From there, one can delve deeper into how companies are engaging within the two categories. Within those, there are opportunities and challenges that both companies and consumers need to consider.

The two categories of IoT:

MECHANICS

The first category of IoT solutions is what I call the mechanics. These are the solutions that connect the different devices together. It involves the end device, data collection and transfer. Management and security layers provide governance of both the devices themselves and the data they collect and transfer.

ANALYTICS

The second category of IoT solutions moves to working with the data itself. A fundamental opportunity for IoT is to gain greater insights from the data itself through analytics. Consider the sources of data and how they can provide greater insights on customers, their services and the ‘Market of One’. More data sources can lead to better decision making.


Granted, IoT is a complex subject with many facets. Not all tools (mechanics or analytics) are the same. There are a huge number of very specialized solutions on the market today. Before delving into IoT, understand what components are most useful for your ultimate objective. For most this will lead to a clearer path of execution and better decision making.

Business · CIO · Cloud · Data

Are the big 5 enterprise IT providers making a comeback?

Not long ago, many would have written off the likes of the big five large enterprise IT firms as slow, lethargic, expensive and out of touch. Who are the big five? IBM (NYSE: IBM), HP (NYSE: HPQ), Microsoft (NASDAQ: MSFT), Oracle (NYSE: ORCL) and Cisco (NASDAQ: CSCO). Specifically, they are companies that provide traditional enterprise IT software, hardware and services.

Today, most of the technology innovation is coming from startups, not the large enterprise providers. Over the course of 2015, we have seen two trends pick up momentum: 1) Consolidation in the major categories (software, hardware, and services) and 2) Acquisitions by the big five. Each of them is making huge strides in different ways.

Here’s a quick rundown of the big five.

IBM guns for the developer

Knowing that the developer is the start of the development process, IBM is shifting gears toward solutions that address the new developer. Just look at the past 18 months alone.

  • February 2014: Dev@Pulse conference showed a mix of Cobol developers alongside promotion of Bluemix. The attendees didn’t resemble those at your typical developer conference. More details here.
  • April 2014: Impact conference celebrated 50 years of the mainframe. Impact also highlighted the SoftLayer acquisition and brought the integration of mobile and cloud.
  • October 2014: Insight conference goes further to bring cloud, data and Bluemix into the fold.
  • February 2015: InterConnect combines a couple of previous conferences into one. IBM continues the drive with cloud, SoftLayer and Bluemix while adding their Open Source contributions specifically around OpenStack.

SoftLayer (cloud), Watson (analytics) and Bluemix are strengths in the IBM portfolio. And now with IBM’s recent acquisition of BlueBox and partnership with Box, it doesn’t appear they are letting up on the gas. Add their work with Open Source software and it creates an interesting mix.

There are still significant gaps for IBM to fill. However, the message from IBM supports their strengths in cloud, analytics and the developer. This is key for the enterprise both today and tomorrow.

HP’s cloudy outlook

HP has long had a diverse portfolio that addresses the needs of the enterprise today and into the future. Of all the big five providers, HP’s portfolio is among the best matched to those enterprise needs.

  • Infrastructure: HP’s portfolio of converged infrastructure and components is solid. Really solid. Much of it is geared for the traditional enterprise. One curious point is that their server components span the enterprise and service provider market. However, their storage products are squarely targeting the enterprise to the exclusion of the service providers. You can read more here.
  • Software: I have long felt that HP’s software group has a good bead on the industry trends. They have a strong portfolio of data analytics tools with Vertica, Autonomy and HAVEn (being rebranded). HP’s march to support the Idea Economy is backed up by the solutions they’re putting in place. You can read more here.
  • Cloud: I have said that HP’s cloud strategy is an enigma. Unfortunately, discussions with the HP Cloud team at Discover this month further cemented that perspective. There is quite a bit of hard work being done by the Helion team, but the results are less clear. HP’s cloud strategy is directly tied to OpenStack and their contributions to the projects support this move.

HP will need to move beyond operating in silos and support a more integrated approach that mirrors the needs of their customers. While HP Infrastructure and Software are humming along, Helion cloud will need a renewed focus to gain relevance and mass adoption.

Microsoft’s race to lose

Above all other players, Microsoft still has the broadest and deepest relationships across the enterprise market today. Granted, many of those relationships are built upon their productivity apps, desktop and server operating systems, and core applications (Exchange, SQL, etc). There is no denying that Microsoft probably has relationships with more organizations than any of the others.

Since Microsoft Office 365 hit its stride, enterprises are starting to take a second look at Azure and Microsoft’s cloud-based offerings. This still leaves a number of gaps for Microsoft; specifically around data analytics and open standards. Moving to open standards will require a significant cultural shift for Microsoft. Data analytics could come through the acquisition of a strong player in the space.

Oracle’s comprehensive cloud

Oracle has long been seen as a strong player in the enterprise space. Unlike many other players that provide the building blocks to support enterprise applications, Oracle provides the blocks and the business applications.

One of Oracle’s key challenges is that the solutions are heavy and costly. As enterprises move to a consumption-based model by leveraging cloud, Oracle found itself flat-footed. Over the past year or so, Oracle has worked to change that position with their cloud-based offerings.

On Monday, Executive Chairman, CTO and Founder Larry Ellison presented Oracle’s latest update in their race for the enterprise cloud business. Oracle is now providing the cloud building blocks from top to bottom (SaaS, PaaS, IaaS). The message is strong: Oracle is out to support both the developer and business user through their transformation.

Oracle’s strong message to go after the entire cloud stack should not go unnoticed. In Q4 alone, Oracle cloud cleared $426M. That is a massive number. Even if they did a poor job of delivering solutions, one cannot deny the sheer girth of opportunity that overshadows others.

Cisco’s shift to software

Cisco has long been the darling of the IT infrastructure and operations world. Their challenge has been to create a separation between hardware and software while advancing their position beyond the infrastructure realms.

In general, networking technology is one of the least advanced areas when compared with advances in compute and storage infrastructure. As cloud and speed become the new mantra, the emphasis on networking becomes more important than ever.

As the industry moves to integrate both infrastructure and developers, Cisco will need to make a similar shift. Their work on SDN with ACI, along with their thought-leadership pieces, is making significant inroads with enterprises.

Summing it all up

Each is approaching the problem in its own way with varying degrees of success. The bottom line is that each of them is making significant strides to remain relevant and support tomorrow’s enterprise. Equally important is how quickly they’re making the shift.

If you’re a startup, you will want to take note. No longer are these folks in your dust. But they are your potential exit strategy.

It will be interesting to watch how each evolves over the next 6-12 months. Yes, that is a very short timeframe, but echoes the speed in which the industry is evolving.

CIO · Data

HP Software takes on the Idea Economy

The Idea Economy is being touted pretty heavily here at HP Discover in CEO Meg Whitman’s keynote. Paul Muller (@xthestreams), VP of Strategic Marketing in HP Software, took us on a journey of how HP Software is thinking about solving today’s problems and preparing for the future state. Unlike many of the other presentations, the journey is just as important as the projects. It helps organizations, partners, customers and providers align their vision and understand how best to respond to the changing business climate.

The combination of non-digital natives looking at new technology in one way while millennials approach technology in a completely fresh way creates a bit of a challenge. Millennials often create and support disruption. Quite a different approach from that of their non-digital-native counterparts. According to HP’s Muller, a full “25% of organizations will fail to make it to the next stage through disruption.” If you’re an existing legacy enterprise, how do you embrace the idea economy while at the same time running existing systems? This presents a serious, but real challenge for any established enterprise today.

Muller then took the conversation of ‘bi-modal IT’ as a potential answer to the problem. Bi-modal IT is being discussed as ‘hybrid IT’ or two-speed IT to address the differences between running existing core systems while innovating with new products and services. In addition to the technology challenges, bi-modal IT creates a number of other challenges that involve process and people. Side note: Look for an upcoming HP Discover Performance Weekly episode we just recorded on the subject of bi-modal IT with Paul Muller and Paul Chapman, CIO of HP Software. In the episode, we take a deeper dive from a number of perspectives.

HP Software looks at five areas that people need to focus on:

  1. Service Broker & Builder: Recognize that the problem is not a buy vs. build question any longer. Today, both are needed.
  2. Speed: The speed in which a company innovates by turning an idea into software is key. Most companies are just terrible at this process. DevOps plays a key role with improving the situation.
  3. BigData & Connected Intelligence: Understand the differences between what customers ask for vs. what they use. BigData can provide insights here.
  4. User Experience: What is the digital experience considering the experience, platforms and functions?
  5. Security: Securing the digital assets is key. 33% of successful break-ins have been related to a vulnerability that had been known for two years (e.g., Stuxnet).

HP leverages their Application Lifecycle Management process to address each of these five areas with data playing a fundamental role.

There was some discussion about the maturity cycle of companies regarding BigData. Trends show that companies start with experimentation on data outside the enterprise in the cloud. The data used is not sensitive or regulated. When it’s time to move into production, the function is brought back in-house. The next step in the maturity cycle is taken by those that then move production BigData functions back out into the cloud. Very few folks are doing this today, but this is the current trend.

And finally a core pain point that is still…still not managed well by companies: Backup and Disaster Recovery. This is nothing new, but an area ripe for disruption.

Overall, it was refreshing to hear more about the thought leadership that goes into the HP Software machine rather than a rundown of products and services.