Business · Cloud · Data

Shifting the thinking of enterprise applications

Enterprise applications have been around for decades. Over that time, they have grown steadily more sophisticated and automated more of the business. That sophistication, however, comes with a significant degree of complexity.

Historically, enterprises were in the position of needing to build everything themselves, largely because there were few options to consider. Fast forward to today and there is a myriad of options for how an enterprise can consume an enterprise application.

However, getting from here to there is not trivial. Practically every enterprise application carries a strong degree of complexity that is directly tied to the intricacies of the business's specific operations. For decades, enterprises have taken the approach of customizing the application to match their existing business processes. Due to the degree of customization, every enterprise Information Technology (IT) organization essentially created its own enterprise application snowflake.

CHANGING THE ENTERPRISE APPLICATION PARADIGM

One of the challenges for enterprise applications is the cost to upgrade. All of the unique customizations significantly increase the cost and complexity of upgrading the system. The customizations, related programming, configuration and testing involved turn each upgrade into the equivalent of a new implementation.

For many enterprises, it is common practice to skip versions rather than maintain currency due to the cost and disruption of each upgrade. This also means that many enterprises delay their access to new functionality.

Cloud computing and Artificial Intelligence (AI) present unique opportunities for enterprise applications. With cloud-based enterprise applications, the enterprise is no longer required to install, manage and operate the underlying application. As applications increase in complexity, this takes an increasingly large burden off the shoulders of the IT organization.

AI presents a different type of opportunity. Enterprises are increasingly reliant on data to gain greater insights. The volume and variety of data are adding pressure on traditional methods of analysis. AI presents a unique opportunity to automate the process and surface insights not previously possible. And the more data available to the AI algorithm, the more useful it becomes. That is where cloud comes in: it provides additional resources in a meaningful way when needed, without the enterprise having to build a fortress internally.

TRADITIONAL VERSUS TRANSFORMATIONAL

Of late, enterprise IT organizations are shifting their focus from a traditional IT organization to that of a transformational IT organization. That is to say, their focus is shifting from technology-centric to business-centric. As part of this shift, IT organizations are looking for ways to streamline their technical operations and focus more on data and insights.

The shift to transformational IT organizations is having an impact on even the most sacred applications within the IT portfolio, including enterprise applications.

MATURING THE THINKING ABOUT ENTERPRISE APPLICATIONS

More mature enterprises are starting to shift their thinking about enterprise applications. This is due to a number of factors: 1) IT organizations are shifting their focus to business-centric outcomes, 2) mature alternatives exist for even the largest of implementations, 3) the pressure to implement advanced functions is increasing and 4) the speed with which IT organizations must respond to change is increasing.

Each of these poses a significant challenge to the traditional approach of maintaining enterprise applications. The only real solution is to change the thinking around enterprise applications to avoid proliferating snowflakes.

 

This post was sponsored by SAP: https://www.sap.com/intelligentdata

Business · Cloud

Riverbed extends into the cloud

One of the most critical, yet often overlooked, components of any system is the network. Enterprises continue to spend considerable amounts of money on network optimization as part of their core infrastructure. Traditionally, enterprises have controlled much of the network between application components. Most of the time, the different tiers of an application were collocated in the same data center, or spread across multiple data centers over dedicated network connections that the enterprise controlled.

The advent of cloud changed all of that. Now, different tiers of an application may be spread across different locations, running on systems that the enterprise does not control. This lack of control presents a new challenge for network management.

Applications are not the only things moving; the data moves too. As applications and data move beyond the bounds of the enterprise data center, so does the need to address increasingly dispersed network performance requirements. The question is: How do you address network performance management when you no longer control the underlying systems and network infrastructure components?

Riverbed is no stranger to network performance management. Their products are widely used across enterprises today. At Tech Field Day's Cloud Field Day 3, I had the chance to meet up with the Riverbed team to discuss how they are extending their technology to address the changing requirements that cloud brings.

EXTENDING NETWORK PERFORMANCE TO CLOUD

Traditionally, network performance management involved hardware appliances that would sit at the edges of your applications or data centers. Unfortunately, in a cloud-based world, the enterprise has access to neither the cloud data center nor its network egress points.

Network optimization in cloud requires an entirely different approach. Add to this that application services are moving toward ephemeral behaviors and one can quickly see how this becomes a moving target.

Riverbed takes a somewhat traditional approach to addressing the network performance management problem in the cloud. Riverbed gives the enterprise the option to run their software either as a 'sidecar' to the application or as part of the cloud-based container.
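
To make the pattern concrete, here is a minimal sketch of how a monitoring 'sidecar' works in principle: a small proxy deployed alongside the application intercepts traffic, records timing data and forwards requests on. This is purely an illustration of the general pattern, not Riverbed's implementation; the ports, single-request handling and metric format are assumptions made for the example.

```python
# Minimal sketch of the 'sidecar' pattern for network performance
# monitoring: a proxy next to the app measures request latency and
# forwards traffic. Illustrative only -- not Riverbed's implementation.
import socket
import threading
import time

APP_HOST, APP_PORT = "127.0.0.1", 8080  # assumed application endpoint
PROXY_PORT = 9090                       # assumed sidecar listen port

def handle(client: socket.socket) -> None:
    """Forward one request to the app and record the round-trip latency."""
    with client, socket.create_connection((APP_HOST, APP_PORT)) as app:
        request = client.recv(65536)
        start = time.monotonic()
        app.sendall(request)
        response = app.recv(65536)
        latency_ms = (time.monotonic() - start) * 1000
        print(f"app_latency_ms={latency_ms:.2f}")  # exported metric (assumed format)
        client.sendall(response)

def main() -> None:
    with socket.socket() as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", PROXY_PORT))
        server.listen()
        while True:
            conn, _ = server.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```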

EXTENDING THE DATA CENTER OR EMBRACING CLOUD?

There are two schools of thought on how one engages a mixed environment of traditional data center assets and cloud. The first is to extend the existing data center so that the cloud is viewed as simply another data center. The second is to change the perspective so that the constraints are reduced to the application, or better yet, the service level. The latter construct is typical of cloud-native applications.

Today, Riverbed has taken the former approach: they view the cloud as another data center in your network. To this point, Riverbed's SteelFusion product treats the cloud as just another data center in the network. Unfortunately, this only works when you have consolidated your cloud-based resources into specific locations.

Most enterprises are taking a very fragmented approach to their use of cloud-based resources today. A given application may consume resources across multiple cloud providers and locations due to specific resource requirements. This shows up in how enterprises are embracing a multi-cloud strategy. Unfortunately, consolidation of cloud-based resources works against one of the core value propositions of cloud: the ability to leverage different cloud solutions, resources and tools.

UNDERSTANDING THE RIVERBED PORTFOLIO

During the session with the Riverbed team, it was challenging to understand how the different components of their portfolio work together to address varied enterprise requirements. The portfolio does contain extensions to existing products that start to bring cloud into the network fold. Riverbed also discussed their SteelHead SaaS product, but it was unclear how it fits into a cloud-native application model. On the upside, Riverbed already supports multiple cloud services by allowing their SteelConnect Manager product to connect to both Amazon Web Services (AWS) and Microsoft Azure. On AWS, SteelConnect Manager can run within an AWS Virtual Private Cloud (VPC).

Understanding the changing enterprise requirements will become increasingly difficult as the persona of the Riverbed buyer changes. Historically, the Riverbed customer was a network administrator or infrastructure team member. As enterprises move to cloud, the buyer shifts to the developer and, in some cases, the business user. These new personas are looking for quick access to resources and tools in an easy-to-consume way, much as existing cloud resources are consumed. They are not accustomed to working with infrastructure, nor do they have an interest in doing so.

PROVIDING CLARITY FOR THE CHANGING CLOUD CUSTOMER

Messaging and solutions geared to these new buyer personas need to be clear and concise. Unfortunately, the session with the Riverbed team was very much focused on their traditional customer: the network administrator. At times, they seemed somewhat confused by questions about cloud-native application architectures.

One positive indicator is that Riverbed acknowledged that the end-user experience is what really matters, not network performance. In Riverbed parlance, this is End User Experience Management (EUEM). In a cloud-based world, this will serve the Riverbed team well as their North Star.

As enterprises embrace cloud-based architectures more fully, Riverbed will need to evolve the model that drives their product portfolio, architecture and go-to-market strategy. Based on the current state, they have made some inroads, but have a long way to go.

Further Reading: The difference between hybrid and multi-cloud for the enterprise

Business

Droplet Computing makes a big splash at Cloud Field Day 3

Every so often a company catches your eye, not because of flashy marketing, but because it is solving a really interesting problem with a clever approach. That happened at Tech Field Day's Cloud Field Day 3, where Droplet Computing officially came out of stealth and publicly launched the company.

LIBERATION OF APPLICATIONS

Droplet Computing's core value proposition is the containerization of applications in an effort to modernize infrastructure. Essentially, Droplet Computing provides the ability to take an application and create a container around it. This creates an abstraction layer between the application and the underlying operating system (OS). Droplet Computing notes that some components of the original OS are needed inside the container to support the application, but not the full OS. Once the application is containerized, it can move to other platforms: a newer OS, a different platform, even a mobile device.

The underlying technology uses a combination of Wine and WebAssembly to containerize the application.

There are many applications still in use across the globe that cannot be upgraded for a myriad of reasons. Unfortunately, this limits the operator's ability to modernize their entire infrastructure while these older applications remain in use.

The solution is not limited to just older applications. Other applications could use the same technology to provide mobility between different system types, operating systems and the like. However, there are a number of competing products that provide similar functions for current applications.

Several of the use-cases the Droplet Computing team mentioned involved custom applications tied to CNC machines, MRI devices and the like. One thought: consider all the Windows XP-based applications and older custom machines still in use today.

REDUCING THE CYBERSECURITY FOOTPRINT

There is one clever side-effect to encapsulating the application so that the underlying hardware and OS can be upgraded without touching the application: it reduces the cybersecurity footprint and risk for that system. Does it eliminate the risk completely? No. But by leveraging modern hardware and a modern OS, it makes a real dent in the potential risk.

IN SUMMARY

Droplet Computing is not the silver bullet that will magically modernize your entire environment, but it does give a level of abstraction to those older applications still widely used. This allows enterprises to bring legacy applications forward as part of a broader modernization effort.

At the same time, it addresses a core goal that all enterprises share: reducing the cybersecurity footprint. In today's world, where the risks from cyber-attacks are increasing, anything that reduces the footprint is a welcome approach.

Droplet Computing's product is currently in 'pre-GA' and slated to reach general availability (GA) by mid-May.

Business · Cloud

Oracle works toward capturing enterprise Cloud IaaS demand

The enterprise cloud market still shows widely untapped potential. A significant portion of this potential comes from the demand generated by legacy applications sitting in a myriad of corporate data centers. The footprint of these legacy workloads alone is staggering. Add in the workloads sitting in secondary data centers, which often do not get counted in many metrics, and one can quickly see the opportunity.

ORACLE STARTS FROM THE GROUND UP

At Tech Field Day's Cloud Field Day 3, I had the opportunity to meet with the team from Oracle Cloud Infrastructure to discuss their Infrastructure as a Service (IaaS) cloud portfolio. Oracle is trying to attract current Oracle customers to their cloud-based offerings, which range from IaaS up through Software as a Service (SaaS) for their core back-office business applications.

The conversation with the Oracle team was pretty rough, as it was hard to determine what, exactly, they did in the IaaS space. A number of buzzwords and concepts were thrown around without covering what the Oracle IaaS portfolio actually offered. Eventually, during a demo, a configuration page made the true offerings clear: virtual machines and bare metal. That is a good start for Oracle, but unfortunate in how it was presented. Oracle's offering is hosted infrastructure that is more similar to IBM's SoftLayer (now called IBM Cloud) than to Microsoft Azure, Amazon AWS or Google Cloud.

ORACLE DATABASE AS A SERVICE

Beyond the hardware, applications are one of the strengths of Oracle's enterprise offerings, and a core piece of the puzzle has always been the database. One of the highlights of the conversation was their Database as a Service (DBaaS) offering. For enterprises that use Oracle DB, the database is a core sticking point that keeps their applications firmly planted in the corporate data center. With the Oracle DBaaS offering, enterprises can move workloads to cloud-based infrastructure without losing fidelity in the Oracle DB offering.

Digging deeper into the details, a couple of interesting functions supported by Oracle's DBaaS stood out. A very cool feature was the ability to dynamically change the number of CPUs allocated to a database without taking an outage. This provides the ability to scale DB capacity up and down, as needed, without impacting application performance.
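
To make the operational value concrete, the sketch below scales a database's CPU allocation up ahead of a batch window and back down afterward, with no outage in between. The OracleDbClient class, its methods and the scaling factors are hypothetical placeholders invented for this illustration, not Oracle's actual SDK or API.

```python
# Hypothetical sketch of online CPU scaling for a managed database.
# The OracleDbClient class and its methods are invented for this
# illustration; Oracle's real SDK and API will differ.
import time

class OracleDbClient:
    """Stand-in for a DBaaS management client (hypothetical)."""
    def __init__(self, db_id: str):
        self.db_id = db_id
        self.cpu_count = 4

    def set_cpu_count(self, count: int) -> None:
        # In the real service this is an online operation: the database
        # keeps serving queries while capacity changes underneath it.
        print(f"{self.db_id}: scaling from {self.cpu_count} to {count} CPUs")
        self.cpu_count = count

def run_batch_window(db: OracleDbClient) -> None:
    baseline = db.cpu_count
    db.set_cpu_count(baseline * 4)   # scale up for the nightly batch load
    try:
        time.sleep(1)                # placeholder for the batch job itself
    finally:
        db.set_cpu_count(baseline)   # scale back down; stop paying for peak

run_batch_window(OracleDbClient("prod-finance-db"))
```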

Now, it should be noted that while a hosted Oracle DB sounds good on paper, the actual migration will be complicated for any enterprise. That is less a statement about Oracle and more about the fact that enterprise application workloads are a complicated web of interconnects and integrations. Not surprisingly, Oracle mentioned that the most common use-case driving legacy footprints to Oracle Cloud is the DB. This shows how much pent-up demand there is to move even the most complicated workloads to cloud. Today, Oracle's DB offering runs on Oracle Cloud Infrastructure (OCI), and it was mentioned that the other Oracle Cloud offerings are moving to run on OCI as well.

Another use-case mentioned was High-Performance Computing (HPC). HPC environments need large scale and low latency, and both play to the strengths of Oracle's hardware designs.

While these are two good use-cases, Oracle will need to attract a broader base of use-cases moving forward.

THE CIO PERSPECTIVE

Overall, there seem to be some glimmers of light coming from the Oracle Cloud offering. However, it is hard to pin down the true differentiators. Granted, Oracle is playing a bit of catch-up compared with other, more mature cloud-based offerings.

The true value appears to be focused on existing Oracle customers looking to make a quick move to cloud. If so, and the two fundamental use-cases are DBaaS and HPC, that is a fairly limited pool of customers when there is significant potential still sitting in the corporate data center.

It will be interesting to see how Oracle evolves their IaaS messaging and portfolio to broaden the use-cases and provide fundamental services that other cloud solutions have offered for years. Oracle does have the resources to put a lot of effort toward making a bigger impact. Right now, however, it appears that the Oracle Cloud offering is mainly geared for existing Oracle customers with specific use-cases.

Business · Cloud · Data

Delphix smartly reduces the friction to access data

Today's CIO is looking for ways to tap the potential in their company's data. We have all heard the phrase that data is the new oil. Except that data, like oil, is just a raw material; it must be refined into a finished good, which is where the value ultimately resides.

At the same time, enterprises are concerned with regulatory and compliance requirements to protect data. Recent data breaches at globally recognized companies have raised concern around data privacy. Historically, the financial services and healthcare industries were the ones to watch when it came to regulatory and compliance requirements. Today, the regulatory net is widening with the EU's General Data Protection Regulation (GDPR), the US Government's FedRAMP and the NY State DFS Cybersecurity Requirements.

Creating greater access to data and protecting that data in compliance with regulations sit at opposite ends of the privacy and cybersecurity spectrum. Add to this the interest in moving data to cloud-based solutions and one can quickly see why this is one of the core challenges for today's CIO.

DELPHIX REDUCES THE FRICTION TO DATA ACCESS

At Tech Field Day’s Cloud Field Day 3, I had the opportunity to meet with the team from Delphix.

Fundamentally, Delphix is a cloud-based data management platform that helps enterprises reduce the friction to data access by automating data management. Today, one-third of Fortune 500 companies use Delphix.

Going back to the core issue, users have a hunger for data access, but regulatory and compliance requirements often hinder that process. Today's methods of managing data are heavily manual and somewhat archaic compared with solutions like Delphix.

Delphix's approach is to package the data into what they call a Data Pod. Unlike most approaches, which mask data when it is shared, Delphix masks the data during the intake process. The benefit of this approach is that it removes the risk of accidentally sharing protected data.

In terms of sharing data, one clever part of the Delphix Dynamic Data Platform is its ability to replicate data smartly. Considering that Delphix works in the cloud, this is key to avoiding unnecessary costs; otherwise, enterprises would see a significant uptick in data storage as masked data is replicated to the various users. Beyond structured, transactional data, Delphix is also able to manage (and mask) databases, along with unstructured data and files.
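
To see why masking at intake matters, consider the simplified sketch below: once sensitive fields are masked as data enters the platform, every downstream copy is safe by construction, rather than depending on each share operation to remember to mask. The record format, field list and masking function are assumptions made for this illustration, not Delphix's implementation.

```python
# Simplified illustration of mask-at-intake versus mask-on-share.
# The record format and masking rules are invented for this example;
# Delphix's actual platform is far more sophisticated.
import hashlib

SENSITIVE_FIELDS = {"ssn", "email"}

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return "MASKED-" + hashlib.sha256(value.encode()).hexdigest()[:8]

def intake(record: dict) -> dict:
    """Mask at intake: the platform never stores the raw values."""
    return {k: mask(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

# Every copy handed to a developer or analyst derives from the
# already-masked record, so an accidental share cannot leak raw data.
raw = {"name": "Pat", "ssn": "123-45-6789", "email": "pat@example.com"}
pod = intake(raw)
dev_copy = dict(pod)   # replicate for a dev environment: still masked
print(dev_copy)
```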

THE CIO PERSPECTIVE

From the CIO perspective, Delphix appears to address an increasingly complicated space with a clever yet simple approach. The three key takeaways are: a) the ability to mask data (databases, unstructured data, files) at intake rather than when pulling copies, b) the ability to smartly replicate data and c) the potential to centrally manage data policies. Lastly, this is not a solution that must run in the corporate data center. Delphix supports running in public cloud services including Microsoft Azure and Amazon AWS.

In summary, Delphix appears to have decreased the friction to data access by automating the data protection and management processes, all while supporting an enterprise's move to cloud-based resources.

Business · Cloud

Morpheus Data brings the glue to multi-cloud management

Enterprises across the globe are starting to leverage cloud-based resources in a multitude of ways. However, there is no one-size-fits-all approach to cloud that makes sense for the enterprise portfolio. This leads to a rise in multi-cloud deployments: any given enterprise will use a variety of cloud-based services depending on the specific requirements of each workload. It is important to understand the difference between multi-cloud and hybrid cloud.

This cloud 'sprawl' creates an increasingly complicated management problem, as each cloud provider uses a different approach to managing its services. Layer in management processes, automation routines and management tools and one can quickly understand the challenge. Add to this that any given application may use a different combination of cloud services, and one can see how the problem gets exponentially more complicated with each workload.

MORPHEUS DATA PROVIDES THE GLUE

At Tech Field Day’s Cloud Field Day 3, I had the opportunity to meet with the team from Morpheus Data.

Morpheus Data addresses this complicated web of tools and services by providing an abstraction layer on top of them. More specifically, Morpheus Data creates an abstraction between provisioning and the underlying infrastructure. To date, they support 49 out-of-the-box service integrations covering a variety of cloud services, governance tools, management tools and infrastructure.
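
In spirit, such an abstraction layer resembles the sketch below: a common provisioning contract with per-cloud adapters behind it, so the caller never touches provider-specific APIs. The interface, adapter names and return values are invented for this illustration; Morpheus Data's actual integration model is far richer.

```python
# Conceptual sketch of a provisioning abstraction layer over multiple
# clouds. The interface and adapters are invented for illustration;
# they are not Morpheus Data's API.
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Common provisioning contract each cloud integration implements."""
    @abstractmethod
    def provision_instance(self, name: str, size: str) -> str: ...

class AwsAdapter(CloudAdapter):
    def provision_instance(self, name: str, size: str) -> str:
        return f"aws:ec2:{name}:{size}"      # would call the AWS API here

class AzureAdapter(CloudAdapter):
    def provision_instance(self, name: str, size: str) -> str:
        return f"azure:vm:{name}:{size}"     # would call the Azure API here

ADAPTERS: dict[str, CloudAdapter] = {"aws": AwsAdapter(), "azure": AzureAdapter()}

def provision(cloud: str, name: str, size: str) -> str:
    """Callers pick a cloud by name; provider details stay behind the adapter."""
    return ADAPTERS[cloud].provision_instance(name, size)

print(provision("aws", "web-01", "small"))
print(provision("azure", "web-02", "small"))
```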

Providing governance and automation is key to any multi-cloud or hybrid-cloud deployment. Leveraging a solution like Morpheus Data will help streamline CloudOps & DevOps efforts through their integration processes.

One interesting aspect of Morpheus Data's solution is the ability to establish application templates that span a number of different tools, services and routines. The templates assist with deployment and can set time limits on specific services. This is especially handy for avoiding one form of sprawl known as service abandonment, where a service is left running, and accruing cost, even though it is no longer used.
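
A minimal sketch of the time-limit idea: each provisioned service carries a lease set by its template, and a periodic reaper tears down anything whose lease has expired. The data structures and names here are assumptions for illustration, not Morpheus Data's implementation.

```python
# Minimal sketch of lease-based cleanup to prevent service abandonment.
# The data structures are invented for this illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Service:
    name: str
    expires_at: datetime   # lease set by the application template

def reap_expired(services: list[Service], now: datetime) -> list[Service]:
    """Tear down services whose lease has lapsed; return the survivors."""
    alive = []
    for svc in services:
        if svc.expires_at <= now:
            print(f"deprovisioning {svc.name} (lease expired)")  # stop the spend
        else:
            alive.append(svc)
    return alive

now = datetime.now()
fleet = [
    Service("demo-env", now - timedelta(days=1)),   # abandoned: past its lease
    Service("prod-api", now + timedelta(days=30)),
]
fleet = reap_expired(fleet, now)
```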

Much of Morpheus Data's effort is geared toward 'net-new' deployments to cloud. Moving legacy workloads will require re-working before they can fully take advantage of cloud-based resources. I have written about the challenges of moving legacy workloads to public cloud in previous posts.

LOOKING BEYOND THE TOOL

While Morpheus Data provides technology to address systemic technical complexity, it does not address the people component. To be fair, it is not clear that any tool can fix the people component. Specifically, in order to truly leverage good governance and automation routines, one needs to come to grips with the organizational and cultural changes required to support such approaches.

In order to address the people component, it is helpful to break down the personas. The key three are the developer, the infrastructure administrator and the executive. Each of these personas has different requirements and interests that will impact how services are selected and consumed.

IN SUMMARY

Morpheus Data is going after a space that is both huge and highly complicated. A big challenge for the team will be to focus on the most critical areas without trying to cover every tool, process and model. This is really a question of going broad or going deep; you cannot do both.

In addition, it is clear that Morpheus Data has a good start, but the solution would benefit from bringing operational data and costs into the factors that drive decisions about which services to use. The team already includes some cost components, but they are not as dynamic as enterprises will need moving forward.

In summary, the Morpheus Data solution looks like a strong entry into the increasingly complicated multi-cloud space. Every enterprise will face some form of complexity in dealing with multi-cloud and hybrid cloud, and as such could benefit from a solution that helps streamline the processes. It will be interesting to see how the company and solution evolve over time to address this space.

Cloud

Four expectations for AWS re:Invent

This week brings Amazon Web Services' (AWS) annual re:Invent conference, where thousands will descend upon Las Vegas to learn about cloud and the latest AWS innovations. Having attended the conference for several years now, I have come to expect certain things from an AWS event, one of which is the sheer number of products AWS announces. Aside from that, there are a number of specific things I am looking for at this week's re:Invent conference.

ENTERPRISE ENGAGEMENT

AWS has done a stellar job of attracting the startup and web-scale markets to their platform. The enterprise market, however, has proven to be an elusive customer aside from a relatively small number of case examples. This week, I am looking to see how things have changed for enterprise adoption of AWS. Has AWS found the secret sauce to engage the enterprise in earnest?

PORTFOLIO MANAGEMENT

Several years back, AWS made a big point of not being one of "those" companies with a very large portfolio of products and services. Yet, several years later, AWS has indeed become a behemoth with a portfolio a mile long. This is a great thing for customers, but it has a few downsides too. Customers, especially enterprise customers, tend to make decisions that last longer than those of startup and web-scale customers. Service deprecation is therefore a real concern with companies that a) do not have a major enterprise focus and b) have a very large portfolio. Unfortunately, that is where AWS is today, and to date, AWS has not done much in the way of portfolio pruning.

HYBRID CLOUD SUPPORT

For the enterprise, hybrid is reality. In the past, AWS has taken the position that hybrid is simply a way to onboard customers into the AWS public cloud. Hybrid, a combination of on-premises and cloud-based resources, can indeed serve that purpose. The question is: How is AWS evolving its thinking on hybrid cloud? And how has that thinking evolved to encompass hybrid cloud from the perspective of the enterprise?

DEMOCRATIZATION OF AI & ML

Several of AWS' competitors have done a great job of democratizing artificial intelligence (AI) and machine learning (ML) tools as a means to make them more approachable. AWS was one of the first out of the gate with a strong showing of AI and ML tools a few years back. The question is: How have they evolved in the past year to make the tools more approachable for the common developer?

BONUS ROUND

As a bonus, it would be interesting if AWS announced the location of their second headquarters. Will they announce it at re:Invent rather than on a financial analyst call? We shall see.

In summary, AWS never fails to put on a great conference with a good showing. This year should not disappoint.