CIO · Cloud

3 ways enterprises can reduce their cybersecurity risk profile

If you are an executive (CIO, CISO, CEO) or board member, cybersecurity is top of mind. A comment I often hear is: “I don’t want our company on the front page of the Wall Street Journal.” Ostensibly, the comment refers to a breach. Yet many gaps still exist between that aspiration and reality. Just saying the words is not enough.

The recent Equifax breach has prompted many conversations with enterprises and executive teams about shoring up their security posture. The sad reality is that cybersecurity spending often spikes immediately after a breach occurs. Why is that? Let us delve into several of the common reasons and what can be done.

ENTERPRISE SECURITY CHALLENGES

There are a number of reasons why enterprises are challenged with cybersecurity issues. Much of it stems from the perspective of what cybersecurity solutions provide. To many, the investment in cybersecurity teams and solutions is seen as an insurance policy. In order to better understand the complexities, let us dig into a few of the common issues.

Reactive versus Proactive

The first issue is how enterprises think about cybersecurity. Enterprises often want to be secure, but are unwilling or unable to provide the funding to match. That is, until a breach occurs. This dynamic has created a behavior within IT organizations where they leverage breaches to gain cybersecurity funding.

Funding for Cybersecurity Initiatives

Spending on cybersecurity is often seen in a similar vein as insurance and comes back to risk mitigation. Many IT organizations are challenged to get adequate funding to appropriately protect the enterprise. It should be noted that no enterprise will ever be fully secure; attempting it creates a level of complexity and cost that would greatly impact the operations and bottom line of the enterprise. Therefore, a healthy balance is called for. Any initiative should follow a risk mitigation approach, but also consider the business impact.

Shifting to Cybersecurity as part of the DNA

Enterprises often treat cybersecurity as an afterthought to a project or core application. The problem with this approach is that, as an afterthought, the project or application is well on its way to production. Any required changes would be ancillary and rarely get granular in how they could be applied. More mature organizations are making cybersecurity part of their core DNA. In this culture, cybersecurity becomes part of the conversation early and often…and at each stage of development. By making it part of the DNA, each participant in the process is encouraged to consider how to secure their part of the project.

Cybersecurity Threats are getting more Sophisticated

The sophistication of cybersecurity threats is growing astronomically. No longer are traditional tools adequate to protect the enterprise. Enterprises are fighting an adversary that is gaining ground exponentially faster than they are. In essence, no single enterprise is able to adequately protect itself and must rely on the expertise of others that specialize in this space.

Traditional thinking need not apply. The level of complexity and skills required is growing at a blistering clip. If your organization is not willing or able to put the resources behind staying current and actively engaged, trouble is not far away.

THREE WAYS TO REDUCE CYBERSECURITY RISK

While the risks are increasing, there are steps that every enterprise, large and small, can take to reduce their risk profile. Sadly, many of these are well known, yet not as well enacted. The first step is to change your paradigm regarding cybersecurity: get proactive and do not assume you know everything.

Patch, Patch, Patch

Even though regular patching is a requirement for most applications and operating systems, enterprises are still challenged to keep up. There are often two reasons for this: 1) disruption to business operations and 2) the resources required to update the application or system. In both cases, the best advice is to get into a regular rhythm of patching. When you make something routine, it builds muscle memory into the organization, which increases accuracy, lessens disruption and speeds up the effort.

Regular Validation from Outsiders

Over time, organizations get complacent with their operations. Cybersecurity is no different. A good way to avoid this is to bring in a trusted outside organization to spot check and ‘tune up’ your cybersecurity efforts. They can more easily spot issues without being affected by your blind spots. Depending on your situation, you may choose to engage a third party to provide cybersecurity services outright. Each enterprise will need to evaluate its specific situation to determine the right approach.

Challenge Traditional Thinking

I still run into organizations that believe perimeter protections are the best defense. Others believe it is enough to conduct security audits with some frequency. Two words: Game Over. While both are required, security threats today are constant and unrelenting. Constant, evolving defenses are required to match.

As we move to a more complicated mix of IT services (SaaS, Public Cloud, Private Cloud, On Premises, Edge Computing, Mobile, etc.), the level of complexity grows. Now layer in the fact that the data we view as gold is spread across those services. The complexity is growing and traditional thinking will not protect the enterprise. Leveraging outsiders is one approach to infuse different methods to address this growing complexity.

One option is to move to a cloud-based alternative. Most cloud-based offerings have methods to update their systems and applications without disrupting operations. This does not absolve the enterprise of responsibility, but it does offer an approach that leverages more specialized expertise.

The bottom line is that our world is getting more complex and cybersecurity is just one aspect. The complexity and sophistication of cybersecurity attacks is only growing, making it ever more challenging for enterprises to keep up. Change is needed, the risks are increasing and now is the time for action.

Cloud · IoT

Understanding the five tiers of IoT core architecture

Internet of Things (IoT) is all the rage today. Just tagging something as belonging to the IoT family brings quite a bit of attention. However, this tagging has also created quite a bit of noise in the industry for organizations trying to sort through how best to leverage IoT. Call it IoT marketing overload. Or IoT-washing.

That being said, just about every single industry can leverage IoT in a meaningful way today. But where does one begin? There are many ways to consider where to start your IoT journey. The first is to understand the basic fundamentals of how IoT solutions are architected. The five tiers of IoT core architecture are: Applications, Analytics, Data, Gateway and Devices. Using this architecture, one can determine where any given IoT solution fits…and the adjacent components required to complete the solution.

THE FIVE TIERS OF IOT CORE ARCHITECTURE

  • DEVICE TIER

The device tier is the physical hardware that collects telemetry (data) about a given situation. Devices can range from small sensors to wearables to large machines. The data itself may be presented in many forms, from electrical signals to IP data.

The device may also display information (see Application tier).

  • GATEWAY TIER

The sheer number of devices and interconnection options creates a web of complexity in connecting the different devices and their data streams. Depending on the source, streams may arrive in forms as diverse as mechanical signals or IP-based data. On the surface, these streams are completely incompatible. However, when correlating data, a common denominator is needed. Hence the need for a gateway to collect and homogenize the streams into manageable data.

  • DATA TIER

The data tier is where data from gateways is collected and managed. Depending on the type of data, different structures may be called for. The management, hygiene and physical storage of data is a whole discipline unto itself simply due to the four V’s of data (Volume, Variety, Velocity, Veracity).

  • ANALYTICS TIER

Simply managing the sheer amount of data coming from IoT devices creates a significant hurdle in converting data into information. Analytics automate the process for two reasons: manageability and speed. Together, they surface insights from the varied and complex data coming from devices. As the number and type of devices grow and become increasingly complex, so will the demand for analytics.

  • APPLICATION TIER

Applications may come in multiple forms. In many cases, the application is the user interface that takes information from the analytics tier and presents it to the user in a meaningful way. In other cases, the application may be an automation routine that interfaces with other applications as part of a larger function.

Interestingly, the application may reside on the device itself (i.e., a wearable).

IoT Architecture

Today, many IoT solutions cover one or more of the tiers outlined above. It is important to understand which tiers are covered by any given IoT solution.
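To make the tiers concrete, here is a minimal Python sketch of a reading flowing from device to application. All names, units and thresholds are illustrative assumptions for this post, not drawn from any particular IoT platform.

```python
# Illustrative sketch of the five IoT tiers; every identifier here is hypothetical.

def device_read():
    """Device tier: raw telemetry, here a temperature reading in Fahrenheit."""
    return {"sensor_id": "temp-01", "unit": "F", "value": 98.6}

def gateway_normalize(raw):
    """Gateway tier: homogenize diverse streams into a common schema (Celsius)."""
    value = raw["value"]
    if raw["unit"] == "F":
        value = (value - 32) * 5.0 / 9.0
    return {"sensor_id": raw["sensor_id"], "unit": "C", "value": round(value, 2)}

data_store = []  # Data tier: collected and managed readings

def store(reading):
    data_store.append(reading)

def analyze(readings, threshold_c=35.0):
    """Analytics tier: turn data into information (a simple threshold alert)."""
    return [r for r in readings if r["value"] > threshold_c]

def application_view(alerts):
    """Application tier: present the information to the user meaningfully."""
    if not alerts:
        return "All sensors nominal."
    return "%d sensor(s) above threshold: %s" % (
        len(alerts), ", ".join(a["sensor_id"] for a in alerts))

reading = gateway_normalize(device_read())
store(reading)
print(application_view(analyze(data_store)))
```

The point of the sketch is the layering: each tier only consumes the output of the tier below it, which is why any given IoT product can cover some tiers and leave the rest to adjacent components.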

CLOUD-BASED IOT SOLUTIONS

Several major cloud providers are developing IoT solutions that leverage their core cloud offering. These solutions help shorten IoT development time by providing fundamental offerings that cover many of the tiers outlined above. Most of the solutions focus on the upper tiers to manage the data coming from devices. Three such platforms are: Amazon AWS IoT, IBM Watson IoT, and Microsoft Azure IoT Suite. Each of these emphasizes a different suite of ancillary solutions. All three allow a developer to shorten the development time for an IoT solution by eliminating the need to develop all five tiers.

THE SECURITY CONUNDRUM

One would be remiss to discuss IoT without mentioning security. The security of devices, data elements and data flows is an issue that needs greater attention today. Instead of a one-off project or add-on solution, security needs to be part of the DNA infused in each tier of a given solution. Based on current solutions, there is still a long way to go on this front.

That being said, IoT has a promising and significant future.

Data · IoT

The intimacy of Internet of Things along with security and privacy

Internet of Things (IoT) is hot. Really hot! Every single industry, from buildings to healthcare to financial services, is looking at IoT. As more and more organizations look to capitalize on the blistering IoT market, they, and consumers, risk the equivalent of ‘running with scissors’ when it comes to security and privacy.

PERSONALIZING MECHANICAL DATA

IoT presents a number of new sources of data. Some of that data is mechanical and non-personal in nature. A good example is machine data from sensors in a data center. The sensors tell a story about how the data center is performing. Those sensors may monitor electrical equipment, temperature, airflow and water consumption.

However, some machine data can be personalized to provide insights into a person’s behaviors. An example in the Retail industry might be sensing how many times a person enters a store, the path they take within the store and ultimately what they purchase. In an office building, it may be telemetry around movements of people in and out of the building, and use of restrooms, meeting rooms and the like. At home, it might be temperature settings, entering through the garage door, or turning lights on and off.

The fact is: sensors are everywhere and organizations are starting to correlate the data from those sensors to people’s behaviors. While most have the best intentions in integrating this data (creating better work environments, automating efficiencies, understanding purchase decisions, etc.), it does raise the question of how the data is used.

THE INTIMACY OF WEARABLES

Adding to the streams of machine data, a large source of IoT data comes from personal devices, including those we wear: wearables. In some cases, this data is being correlated with machine data to provide even more personal automation.

The data coming from wearable devices is interesting…and personal. How personal? Very personal. Sure, wearable devices can register your exercise, activity patterns and sleep patterns. They can also tell when you have an elevated heart rate or make swift movements. The data essentially starts to identify patterns…even intimate ones. Marry this data with other data around location and purchase habits, and you start to see how telling the data streams can be. For example, one could surmise with accuracy what products were purchased before and after certain activities, which in turn could provide a very personal perspective on the person.

Now imagine if that data or those patterns were made public. One starts to see how the privacy concerns about our behaviors (intimate or otherwise) quickly become apparent. That is not to say that we should avoid IoT and wearables.

SHIFTING FROM AN AFTERTHOUGHT TO CORE

The obvious solution is to take care with the data and understand how it is used. As with many technology solutions, the security of the device and the data often comes as an afterthought. Unfortunately, IoT is following in those well-understood footsteps.

Security is often thrown to the wind to avoid constricting innovation and speed to delivery. This is true across the strata of IoT, from devices through gateways all the way to applications. Privacy and security lapses have become a routine subject for IoT, and privacy is not a new challenge for innovation in its infancy. The level and intensity of interest in privacy is starting to reach a fever pitch as device users begin to consider the implications.

The issues are not just around wearables either. The same issues reside in corporate IoT solutions that start to understand and react to user behaviors. Even historically mundane things, like a building, are starting to get a personality: the building is beginning to understand the behaviors of its users. No longer are security and privacy relegated only to newer solutions.

Now is the time to stop thinking of security and privacy as an afterthought. It is possible to infuse both into each step of the process. It requires a change in the culture and way that solutions are developed. Consumers can help drive these changes through their buying habits: look for solutions that take security and privacy seriously. End-users, whether corporate or consumer, have the best opportunity to effect change.

Business · CIO · Cloud · Data

HPE clarifies their new role in the enterprise

Last week, Hewlett Packard Enterprise (HPE) held their annual US-based Discover conference in Las Vegas. HPE has seen quite a bit of change in the past year with the split of HP into HPE & HP Inc. They shut down their Helion Public Cloud offering and announced the divestiture of their Enterprise Services (ES) business to merge with CSC into a $26B business. With all of the changes and 10,000 people in attendance, HPE sought to clarify their strategy and position in the enterprise market.

WHAT IS IN AND WHAT IS OUT?

Many of the questions attendees were asking circled around the direction HPE was taking considering all of the changes just in the past year alone. Two of the core changes (shutting down Helion Public Cloud and splitting off their ES business) have raised many eyebrows wondering if HPE might be cutting off their future potential.

While HPE telegraphs that their strategy is to support customers with their ‘digital transformation’ journey, the statement might be a bit overreaching. That is not to say that HPE is not capable of providing value to enterprises. It is to say that there are specific areas where they provide value, along with a few significant gaps. We are talking about a traditionally hardware-focused company shifting more and more toward software. Not a trivial task.

There are four pillars that support the core HPE offering for enterprises. Those include Infrastructure, Analytics, Cloud and Software.

INFRASTRUCTURE AT THE CORE

HPE’s strength continues to rest on their ability to innovate in the infrastructure space. I wrote about their Moonshot and CloudSystem offerings three years ago here. Last year, HPE introduced their Synergy technology that supports composability. Synergy, and the composable concept, is one of the best opportunities to address the evolving enterprise’s changing demands. I delve a bit deeper into the HPE composable opportunity here.

Yet, one thing is becoming painfully clear within the industry. The level of complexity for infrastructure is growing exponentially. For any provider to survive, there needs to be a demonstrable shift toward leveraging software that manages the increasingly complex infrastructure. HPE is heading in that direction with their OneView platform.

Not to be outdone in supporting the ever-changing software platform space, HPE also announced that servers will come ready to support Docker containers. This is another example of where HPE is trying to bridge the gap between traditional infrastructure and newer application architectures including cloud.

CLOUD GOES PRIVATE

Speaking of cloud, there is quite a bit of confusion about where cloud fits in the HPE portfolio of solutions. After a number of conversations with members of the HPE team, their solutions are focused on one aspect of cloud: Private Cloud. This makes sense considering HPE’s challenges to reach escape velocity with their Helion Public Cloud offering and their core infrastructure background. Keep in mind that HPE’s private cloud solutions are heavily based on OpenStack. This will present a challenge for those considering a move from their legacy VMware footprint, but it does open the door to new application architectures that are specifically looking for an OpenStack-based Private Cloud. However, there is already competition in this space from companies like IBM (BlueBox) and Microsoft (AzureStack). And unlike HPE, both IBM and Microsoft have established Public Cloud offerings that complement their Private Cloud solutions (BlueBox and Azure, respectively).

One aspect in many of the discussions was how heavily HPE’s Technical Services (TS) are involved in HPE Cloud deployments. At first, this may raise a red flag for enterprises concerned with the level of consulting services required to deploy a solution. However, when considering that the underpinnings are OpenStack-based, it makes more sense. OpenStack, unlike traditional commercial software offerings, still requires a significant amount of support to get it up and running. This could present a challenge to the broad appeal of HPE’s cloud solutions, except for those few that understand, and can justify, the value proposition.

It does seem that HPE’s cloud business is still in a state of flux, searching for the best path to take. With the jettison of Helion Public Cloud and HPE’s support of composability, there is a great opportunity to appeal to the masses and leverage their partnership with Microsoft to support Azure & AzureStack on a Synergy composable stack. Yet the current emphasis still appears to be on OpenStack-based solutions. Note: HPE CloudSystem does support Synergy via the OneView APIs.

SOFTWARE

At the conference, HPE highlighted their security solutions with a few statistics. According to HPE, they “secure nine of the top 10 software companies, all 10 telcos and all major branches of the US Department of Defense (DoD).” While those are interesting statistics, one should delve a bit further to determine how extensive this applies.

Security sits alongside the software group’s Application Lifecycle Management (ALM), Operations and BigData software solutions. As time goes on, I would hope to see HPE mature the significance of their software business to meet the changing demands from enterprises.

THE GROWTH OF ANALYTICS

Increasingly, enterprise organizations are growing their dependence on data. A couple of years back, HP (prior to the HPE/HP Inc. split) purchased Autonomy and Vertica. HPE continues to mature the combined Haven solution beyond BigData into the realm of Machine Learning. To that end, HPE is now offering Haven On-Demand (http://www.HavenOnDemand.com) for free. Interestingly, the solution leverages HPE’s partnership with Microsoft and runs on Microsoft’s Azure platform.

IN SUMMARY

HPE is bringing into focus those aspects they believe they can do well. The core business is still focused on infrastructure, but also supporting software (mostly for IT focused functions), cloud (OpenStack focused) and data analytics. After the dust settles on the splits and shifts, the largest opportunities for HPE appear to come from infrastructure (and related software), and data analytics. The other aspects of the business, while valuable, support a smaller pool of prospective customers.

Ultimately, time will tell how this strategy plays out. I still believe there is untapped potential in HPE’s Synergy composable platform that will appeal to the masses of enterprises, but it is often missed. Their data analytics strategy appears to be gaining steam and moving forward. These two offerings are significant, but only address specific aspects of an enterprise’s digital transformation.

CIO · Cloud · Data · IoT

2016 is the year of data and relevance

Over the past few years, I have outlined my thoughts for the upcoming year. You can read those from past years here:

Cloud Predictions for 2013 (12/14/12)

CIO Predictions for 2014 (12/20/13)

5 things a CIO wishes for this holiday season (12/26/14)

In 2016, data and relevance will play the leading role across the industry. Data and relevance are not necessarily new, but are providing something new to each of the hot technology areas.

THE YEAR OF DATA AND RELEVANCE

Two core trends came to light during the latter half of 2015: data and relevance. While innovation and experimentation are hallmarks of growth, data and relevance lead to one clear message for 2016: focus. And it is this very focus that will provide opportunity to buyers and providers alike.

In the past couple of years, we saw new solutions and methods to collect data. In 2016, organizations will hone their craft in how data is collected and used. This is easier said than done as there are still technological and cultural boundaries to overcome. However, 2016 brings a renewed focus on the importance of data from all aspects.

Over the same period, a myriad of novel and clever solutions overwhelmed us. 2016 brings forth a drive to focus on those most relevant to our needs, both today and in the future. As part of this focus, look for continued consolidation.

2016 SIDENOTES

There are two areas that I suspect will influence activity among buyers and providers alike in 2016. One of those is the economic impact of changes in economies, industries and geopolitical areas. The second has more to do with the specific matchups, consolidations, mergers, acquisitions and folds that take place. The early half of 2016 should be interesting as many were already teed up in the latter half of 2015.

The hot areas for 2016 continue to be Cloud, Data, IoT, Software, Infrastructure and Security.

  • Cloud: Look for enterprises to continue their adoption of cloud as they 1) get out of the data center business and 2) provide leverage to do things not otherwise possible.
  • Data: As mentioned above, the drive for greater data consumption will continue. Look for enterprises to start to understand the relevance of data as it pertains to their business (today and in the future).
  • IoT: Enterprises across a wide range of industries are looking to capitalize on the Internet of Things movement. The value and interest in IoT will only continue to grow through 2016 as enterprises find novel ways to leverage IoT to grow their business.
  • Software: The world of enterprise software continues to evolve. Enterprises continue to move away from large, monolithic applications to applications more directly focused on their industry or area of need. This presents a great opportunity for incumbent enterprise players as much as new entrants.
  • Infrastructure: The infrastructure world is being turned on its head and for good reason. Look for changes in the paradigm of how infrastructure is leveraged. The need for infrastructure is not going away…but how it is consumed is completely changing.
  • Security: While a perennial subject, look for security to weave a path through each of the areas above as organizations begin to focus on how to best leverage (and protect) their assets. For many, their core asset is data.

HAPPY NEW YEAR! HERE’S TO A PROSPEROUS 2016!!!

We all have quite a bit to look forward to in 2016! Change is in the wind and it will continue to provide us with opportunity. Here’s to it bringing great tidings in 2016!

Cloud

Containers in the Enterprise

Containers are all the rage right now, but are they ready for enterprise consumption? It depends on whom you ask, but here’s my take. Enterprises should absolutely be considering container architectures as part of their strategy…but there are some considerations before heading down the path.

Container conferences

Talking with attendees at Docker’s DockerCon conference and Red Hat’s Summit this week, you hear a number of proponents and live enterprise users. For those not familiar with containers, the fundamental concept is a fully encapsulated environment that supports application services. Containers should not be confused with virtualization. Nor should they be confused with microservices, which can leverage containers but do not require them.

A quick rundown

Here are some quick points:

  • Ecosystem: I’ve written before about the importance of a new technology’s ecosystem here. In the case of containers, the ecosystem is rich and building quickly.
  • Architecture: Containers allow applications to break apart into smaller components. Each of the components can then spin up/ down and scale as needed. Of course automation and orchestration comes into play.
  • Automation/ Orchestration: Unlike typical enterprise applications that are installed once and run 24×7, the best architectures for containers spin up/ down and scale as needed. Realistically, the only way to efficiently do this is with automation and orchestration.
  • Security: There is quite a bit of concern about container security. With potentially thousands or tens of thousands of containers running, a compromise might have significant consequences. If containers are architected to be ephemeral, the risk footprint shrinks exponentially.
  • DevOps: Container-based architectures can run without a DevOps approach, but with limited success. DevOps brings a different methodology that works hand-in-hand with containers.
  • Management: There are concerns the short lifespan of a container creates challenges for audit trails. Using traditional audit approaches, this would be true. Using newer methods provides real-time audit capability.
  • Stability: The $64k question: Are containers stable enough for enterprise use? Absolutely! The reality is that legacy architecture applications would not move directly to containers. Only those applications that are significantly modified or re-written would leverage containers. New applications are able to leverage containers without increasing the risk.
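The spin up/down and scaling behavior described in the points above can be sketched in a few lines of Python. This is a toy illustration of the reconciliation idea behind orchestration; the function names, thresholds and capacity figures are hypothetical, and real orchestrators implement far richer policies.

```python
# Toy autoscaler sketch: decide how many container replicas should run
# based on queued work, then reconcile toward that target.
# All numbers and names are illustrative, not from any real orchestrator.

def desired_replicas(queue_depth, per_container_capacity=10,
                     min_replicas=1, max_replicas=50):
    """Scale replicas so queued work fits capacity, within bounds."""
    needed = -(-queue_depth // per_container_capacity)  # ceiling division
    return max(min_replicas, min(max_replicas, needed))

def reconcile(current, target):
    """Spin containers up or down to reach the target count."""
    if target > current:
        return "spin up %d" % (target - current)
    if target < current:
        return "spin down %d" % (current - target)
    return "steady"

# Example: 3 containers running, 72 items queued.
target = desired_replicas(queue_depth=72)
print(target, reconcile(3, target))
```

Automation like this is also what makes the ephemeral-container security argument work: because containers are created and destroyed continuously by the control loop, no single instance lives long enough to be a durable foothold.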

Cloud-First, Container-First

Companies are looking to move faster and faster. In order to do so, problems need to be broken into smaller components. As those smaller components become microservices (vs. large monolithic applications), containers start to make sense.

Containers represent an elegant way to leverage smaller building blocks. Some have equated containers to the Lego building blocks of enterprise application architecture. The days of large, monolithic enterprise applications are past. Today’s applications may be complex in sum, but they are a culmination of much smaller building blocks. These smaller blocks provide the nimbleness and speed that enterprises are clamoring for today.

Containers are more than Technology

Containers alone are not enough; other components are needed for success. Containers represent the technology building blocks. Culture and process are needed to support the change in technology. DevOps provides the fluid that lubricates the integration of the three.

Changing the perspective

As with any new technology, other aspects of the IT organization must change too. Whether you are a CIO, IT leader, developer or member of an operations team, the very fundamentals of how we function must change in order to truly embrace and adopt these newer methodologies.

Containers are ready for the enterprise…if the other aspects are considered as well.