Business · Cloud · Data

Shifting the thinking of enterprise applications

Enterprise applications have been around for decades. Since their start, they have steadily increased in sophistication and business automation. However, that sophistication comes with a significant degree of complexity too.

Historically, enterprises had to build everything themselves, largely because there were few alternatives to consider. Fast forward to today, and there is a myriad of options for how an enterprise can consume an enterprise application.

However, getting from here to there is not trivial. Practically every enterprise application carries a degree of complexity directly tied to the intricacies of the specific business it supports. For decades, enterprises have customized the application to match their existing business processes. Due to the degree of customization, every enterprise Information Technology (IT) organization has essentially created its own enterprise application snowflake.

CHANGING THE ENTERPRISE APPLICATION PARADIGM

One of the challenges for enterprise applications is the cost to upgrade. All of the unique customizations significantly increase the cost and complexity of upgrading the system. The customizations, along with the related programming, configuration and testing, turn each upgrade into the equivalent of a new implementation.

For many enterprises, it is common practice to skip versions rather than stay current, due to the cost and disruption of each upgrade. This also means that many enterprises delay their ability to leverage new functionality.

Cloud computing and Artificial Intelligence (AI) present unique opportunities for enterprise applications. With cloud-based enterprise applications, the enterprise is no longer required to install, manage and operate the underlying application. As applications increase in complexity, this takes an increasingly heavy burden off the shoulders of the IT organization.

AI presents a different type of opportunity. Enterprises are increasingly reliant on data to gain greater insights. The volume and variety of data are adding pressure on traditional methods of analysis. AI presents a unique opportunity to automate the process and surface insights not previously possible. And the more data available to the AI algorithm, the more useful it can be. That is where cloud comes in: it provides additional resources in a meaningful way when needed, without the enterprise having to build a fortress internally.

TRADITIONAL VERSUS TRANSFORMATIONAL

Of late, enterprise IT organizations are shifting their focus from a traditional IT organization to that of a transformational IT organization. That is to say, their focus is shifting from technology-centric to business-centric. As part of this shift, IT organizations are looking for ways to streamline their technical operations and focus more on data and insights.

The shift to transformational IT is having an impact on even the most sacred applications within the IT portfolio, including enterprise applications.

MATURING THE THINKING ABOUT ENTERPRISE APPLICATIONS

More mature enterprises are starting to shift their thinking about enterprise applications. This is due to a number of factors: 1) IT organizations are shifting their focus to business-centric outcomes, 2) mature alternatives exist for even the largest of implementations, 3) the pressure to implement advanced functions is increasing, and 4) the speed with which IT organizations must respond to change is increasing.

Each of these poses a significant challenge to the traditional approach of maintaining enterprise applications. The only real solution is to change the thinking around enterprise applications to avoid proliferating snowflakes.

 

This post was sponsored by:

SAP

Business · Data

Understanding the value of data integration

To understand the value of data integration, one first has to understand the changing data landscape. In the past few years, more data has been created than existed in all of prior history. In 2014, I penned a post asking ‘Are enterprises prepared for the data tsunami?’ When it comes to data, enterprises of all sizes and maturity face two core issues: 1) how to effectively manage the sheer volume of data in a meaningful way, and 2) how to extract insights from the data. Unfortunately, the traditional ways to manage data start to break down under these new challenges.

DIVERSE DATA SETS

In the above-mentioned post, I referenced an IDC report suggesting that by 2020, the total amount of data would reach 40,000 exabytes, or 40 trillion gigabytes. That is more than 5,200 gigabytes for every man, woman and child on the planet in 2020.
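
The per-capita figure is easy to sanity-check. A quick back-of-the-envelope calculation, assuming a rough 2020 world population of 7.6 billion, reproduces it:

```python
total_gb = 40_000 * 1_000_000_000   # 40,000 exabytes expressed in gigabytes (1 EB = 10^9 GB)
population_2020 = 7.6e9             # rough world population estimate for 2020
per_capita_gb = total_gb / population_2020
print(f"{per_capita_gb:,.0f} GB per person")  # ~5,263 GB, consistent with 'more than 5,200'
```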

However, unlike data in the past, this new data will come from an increasingly varied list of sources. Some of the data will be structured; other data will be unstructured. And then there is metadata derived through analysis of these varied data sets. All of it needs to be leveraged by the transformational enterprise.

In the past, one might have pooled all of this data into a classic data warehouse. Unfortunately, many of the new data types do not fit nicely into that approach. Then came the data lake as a way to simply pool all of the data. That approach has met with its own challenges, as many enterprises are watching their data lakes turn into data swamps.

Even beyond data generated internally, enterprises are increasing their reliance on externally sourced data. Since this data is not created by the enterprise, there are limits on how it can be leveraged. In addition, simply bringing all of this data into the enterprise is not that simple. Nor is it always feasible.

Beyond their diversity, these new data sets create ‘data gravity’ as they grow in size, essentially forging a stronger bond between the data set and the application that leverages it. As the size of the data set grows, so does its ‘gravity’, which resists movement. All of these factors create significant friction against moving data at all.
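
To put a number on that friction, consider what it takes to move a single petabyte over a dedicated 10 Gbps link at full theoretical throughput; the assumption here is a best case with no protocol overhead or retries:

```python
data_bits = 1e15 * 8    # one petabyte expressed in bits
link_bps = 10e9         # dedicated 10 Gbps link, best-case throughput
seconds = data_bits / link_bps
print(f"{seconds / 86_400:.1f} days")  # ~9.3 days of continuous transfer
```

And that is the optimistic scenario; real-world links shared with production traffic stretch it much further.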

VALUE OF INTEGRATING DATA

The solution rests with data integration: essentially, leave data where it resides and use integration methods to connect the various data sets in order to create insights. There are two components to consider when integrating data.

There is a physical need for data integration and one that is more logical in nature. The physical component is how to physically connect the different data sources together. This is easier said than done. It was already challenging when all of the data was managed within the enterprise; today, the data resides in the hands of many other players and platforms, which adds complexity to the integration effort. Modern data integration methods rely on Application Programming Interfaces (APIs) to create these integration points. There are security ramifications to consider as well.
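
As a minimal sketch of the physical side, the snippet below pulls records from an external data source over a REST API rather than copying the full data set in-house. The endpoint, token and response shape are hypothetical placeholders, not any specific vendor's API:

```python
import requests

API_BASE = "https://partner.example.com/api/v1"  # hypothetical external data source
API_TOKEN = "..."  # assume a bearer token issued by the data provider

def fetch_external_records(resource: str, limit: int = 100) -> list[dict]:
    """Query an external data set in place instead of replicating it internally."""
    response = requests.get(
        f"{API_BASE}/{resource}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"limit": limit},
        timeout=30,
    )
    response.raise_for_status()  # surface auth and security failures early
    return response.json()["items"]  # hypothetical response envelope
```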

The logical integration of data often centers around the customer. One of the core objectives for enterprises today is customer engagement. Enterprises are finding ways to learn more about their customers in an effort to build a more holistic profile that ultimately leads to a stronger relationship. Not all of that data is sourced internally. This really is a case of 1+1=3, where even small insights can lead to a larger impact when combined.
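
Continuing the sketch, the logical side might join internally held customer records with externally sourced attributes on a shared identifier to build that holistic profile. The field names here are again hypothetical:

```python
def build_customer_profiles(internal: list[dict], external: list[dict]) -> dict[str, dict]:
    """Merge internal and external views of the same customers into unified profiles."""
    profiles = {row["customer_id"]: dict(row) for row in internal}
    for row in external:
        profile = profiles.setdefault(row["customer_id"], {})
        profile.update(row)  # external attributes enrich the internal record
    return profiles

# Usage: the 1+1=3 effect comes from attributes neither source holds alone
profiles = build_customer_profiles(
    internal=[{"customer_id": "c1", "lifetime_value": 1200}],
    external=[{"customer_id": "c1", "social_sentiment": "positive"}],
)
```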

THE INTERSECTION OF DATA INTEGRATION AND ADVANCED FUNCTIONS

Data integration is a deep and complicated subject that is evolving quickly. Newer advancements in the Artificial Intelligence (AI) space are leading enterprises to insights they had not even thought to look for. Imagine a situation where you thought you knew your customer, but the system suggested aspects you had not considered. AI has the potential to significantly augment the human capability to create more accurate insights, faster.

Beyond AI, other newer functions such as Machine Learning (ML) and Internet of Things (IoT) present new sources of data to further enhance insights. It should be noted that neither ML nor IoT is able to function in a meaningful way without leveraging data integration.

DATA INTEGRATION LEADS TO SPEED AND INSIGHTS…AND CHALLENGES

Enterprises that leverage AI and ML to augment their efforts find increased value from both the insights and the speed with which they respond. In today’s world, where speed and accuracy are becoming strong differentiators among competitors, leveraging as much data as possible is key. To harness the sheer amount of data, enterprises must embrace data integration to remain competitive.

At the same time, enterprises are facing challenges from new regulations such as the General Data Protection Regulation (GDPR). There are many facets and complexities to GDPR that only add to the complexity of data integration and management.

While enterprises may have leveraged custom approaches to solve the data integration problem in the past, today’s complexities demand a different approach. The combination of these challenges pushes enterprises toward advanced tools that assist in the integration of data to gain greater insights.

 

This post was sponsored by:

SAP

Business · Cloud · Data

How cloud is changing the business and IT landscape

 

Enterprises around the globe are facing disruption from a number of different directions. To combat this disruption, they increasingly turn to cloud computing.

Cloud computing is the single biggest opportunity for organizations to dramatically change how their business operates through changes in their Information Technology (IT) landscape. Organizations are evaluating how to leverage cloud-based solutions in differentiated ways, and as they do, there are a number of factors to consider when moving into a cloud-based world.

SHIFTING HOW WE THINK ABOUT ORGANIZATIONS

The move to cloud-based technology causes more than just a shift in technology. It also causes a shift in how we think about strategy, organization and culture. In the past, enterprises were required to build their strategy, organization and support around the entire technology platform. There were few to no other reasonable options.

In the cloud-based era, those aspects change. Today, there are mature, viable solutions that allow the enterprise to leverage alternatives. By shifting out functions that are important but neither strategic nor differentiating, organizations are able to shift their focus as well.

By making this shift, organizations are able to up-level conversations and focus on business outcomes rather than technology features. These changes open the door to other shifts as well.

CHANGING THE WAY WE WORK

Two of those shifts directly impact a business’s advantage and its customers. The first shift is leverage. Cloud computing provides leverage in ways not previously feasible. IT organizations now have access to a bevy of solutions and virtually unlimited resources. Prior to cloud computing, organizations were limited in their ability to quickly scale resources and leverage new technology. For many, cost constraints were a significant limitation on potential growth.

Beyond technology, cloud provides the organization with the ability to leverage the expertise of others. As mentioned, previous methods required IT organizations to be experts across a myriad of strategic and non-strategic areas. With cloud, organizations can focus on the aspects that are strategic and differentiating for their business and customers. This shifts the organization and culture to focus more directly on solutions that provide business advantage.

The second shift is speed. Unlike past options, where applications and resources might take weeks or months to become available, cloud shortens the wait to a matter of minutes. As businesses look for ways to respond quickly to ever-changing customer requirements, this rapid flexibility gives the organization the ability to respond in a timely fashion.

OPENING UP NEW OPPORTUNITIES

Aside from speed, cloud opens up a whole new world of opportunity for organizations. Historically, there was a minimum cost of entry for many of the more advanced systems. Unfortunately, these costs were far outside the realm of possibility for Small and Medium Businesses (SMB). With cloud, SMB organizations now have access to the same solutions as the largest of enterprises. Essentially, this levels the playing field by lowering the barriers to entry of these advanced systems.

As organizations sharpen their focus on customers through quicker and more accurate business decisions, their reliance on data grows. At the same time, organizations are facing an explosion of data. Traditional reporting and analytics are collapsing under the weight of this new influx.

To combat the data deluge, organizations are increasing their reliance on Artificial Intelligence (AI) and Machine Learning (ML) to automate the insights gleaned from the data. AI and ML require large quantities of data, and traditional solutions are simply unable to provide the resources their algorithms demand. Again, cloud computing opens the door for organizations, large and small, to leverage these advanced functions.

BRINGING IT TOGETHER

The combination of cloud-based resources with advanced functions provides organizations with new opportunities to glean greater insights much more quickly than previously possible. Cloud’s leverage gives organizations the ability to shift the IT landscape toward business outcomes and strategic initiatives rather than deep technical expertise.

In summary, cloud computing presents not only an alternative option for technology consumption, but a required pillar for tomorrow’s business.

 

This post is sponsored by:

SAP

Business · Cloud

Riverbed extends into the cloud


One of the most critical, but often overlooked, components in a system is the network. Enterprises continue to spend considerable amounts of money on network optimization as part of their core infrastructure. Traditionally, enterprises have controlled much of the network between application components. Most of the time, the different tiers of an application were collocated in the same data center, or spread across multiple data centers linked by dedicated network connections the enterprise controlled.

The advent of cloud changed all of that. Now, different tiers of an application may be spread across different locations, running on systems the enterprise does not control. This lack of control presents a new challenge for network management.

And it is not just applications that are moving; so is the data. As applications and data move beyond the bounds of the enterprise data center, so does the need to address increasingly dispersed network performance requirements. The question is: how do you address network performance management when you no longer control the underlying systems and network infrastructure components?

Riverbed is no stranger to network performance management. Their products are widely used across enterprises today. At Tech Field Day’s Cloud Field Day 3, I had the chance to meet up with the Riverbed team to discuss how they are extending their technology to address the changing requirements that cloud brings.

EXTENDING NETWORK PERFORMANCE TO CLOUD

Traditionally, network performance management involved hardware appliances that would sit at the edges of your applications or data centers. Unfortunately, in a cloud-based world, the enterprise has access to neither the cloud data center nor its network egress points.

Network optimization in cloud requires an entirely different approach. Add to this that application services are moving toward ephemeral behaviors, and one can quickly see how this becomes a moving target.

Riverbed takes a somewhat traditional approach to the network performance management problem in the cloud, giving the enterprise the option to run their software either as a ‘sidecar’ to the application or as part of the cloud-based container.

EXTENDING THE DATA CENTER OR EMBRACING CLOUD?

There are two schools of thought on how one engages a mixed environment of traditional data center assets and cloud. The first is to extend the existing data center so that the cloud is viewed as simply another data center. The second is to change the perspective so that the constraints are reduced to the application…or better yet, the service level. The latter is a construct typical of cloud-native applications.

Today, Riverbed has taken the former approach: they view the cloud as another data center in your network. To this point, Riverbed’s SteelFusion product works as if the cloud were another data center. Unfortunately, this only works when you have consolidated your cloud-based resources into specific locations.

Most enterprises are taking a very fragmented approach to their use of cloud-based resources today. A given application may consume resources across multiple cloud providers and locations due to specific resource requirements. This shows up in how enterprises are embracing a multi-cloud strategy. Unfortunately, consolidation of cloud-based resources works against one of the core value propositions of cloud: the ability to leverage different cloud solutions, resources and tools.

UNDERSTANDING THE RIVERBED PORTFOLIO

During the session with the Riverbed team, it was challenging to understand how the different components of their portfolio work together to address the varied enterprise requirements. The portfolio does contain extensions to existing products that start to bring cloud into the network fold. Riverbed also discussed their Steelhead SaaS product, but it was unclear how it fits into a cloud-native application model. On the upside, Riverbed already supports multiple cloud services by allowing their SteelConnect Manager product to connect to both Amazon Web Services (AWS) and Microsoft Azure. On AWS, SteelConnect Manager can run within an AWS Virtual Private Cloud (VPC).

Understanding the changing enterprise requirements will become increasingly difficult as the persona of the Riverbed buyer changes. Historically, the Riverbed customer was a network administrator or infrastructure team member. As enterprises move to cloud, the buyer shifts to the developer and, in some cases, the business user. These new personas are looking for quick access to resources and tools in an easy-to-consume way, much as existing cloud resources are consumed. They are not accustomed to working with infrastructure, nor do they have an interest in doing so.

PROVIDING CLARITY FOR THE CHANGING CLOUD CUSTOMER

Messaging and solutions geared to these new buyer personas need to be clear and concise. Unfortunately, the session with the Riverbed team was very much focused on their traditional customer: the network administrator. At times, they seemed somewhat confused by questions about cloud-native application architectures.

One positive indicator is that Riverbed acknowledged that the end-user experience is really what matters, not network performance. In Riverbed parlance, they call this End User Experience Management (EUEM). In a cloud-based world, this will guide the Riverbed team well as they consider what serves as their North Star.

As enterprises embrace cloud-based architectures more fully, so must the model that drives Riverbed’s product portfolio, architecture and go-to-market strategy. Based on the current state, they have made some inroads but have a long way to go.

Further Reading: The difference between hybrid and multi-cloud for the enterprise

Business · Cloud · Data

Rubrik continues their quest to protect the enterprise


Data protection is all the rage right now. With data moving beyond the corporate data center to multiple locations, including cloud, the complexity has increased significantly. What is data protection? It generally covers a combination of backup, restore, disaster recovery (DR) and business continuity (BC). While not new, most enterprises have struggled for decades to effectively back up their systems in a way that ensures a) the data is protected, b) it can be restored if needed, and c) it can be restored in a timely fashion. Put a different way: BC/DR is still one of the most poorly managed parts of an IT operation. Add cloud to this, and one can see where the wheels start to fall off.

The irony is that, while the problem is not new, enterprises still struggle to balance the needs of DR/BC in a meaningful way. The reasons for this would take longer than this blog permits. This is an industry screaming for disruption. Enter Rubrik.

RUBRIK BRINGS A FRESH PERSPECTIVE TO AN OLD PROBLEM

A couple of weeks back, I caught up with the Rubrik team at Tech Field Day’s Cloud Field Day 3. Rubrik came into the market a few years back and has continued their drive to solve this old, but growing problem.

Unlike traditional solutions, Rubrik takes a modern approach to their architecture: everything Rubrik does calls an API. This API-centric architecture gives their approach modularity and flexibility. API-centric architectures are a must in a cloud-based world.
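
To illustrate what that design enables, a short script can drive the same operations the product’s UI performs. The endpoint path, field names and token handling below are illustrative assumptions rather than documented Rubrik calls:

```python
import requests

CLUSTER = "https://rubrik.example.com"  # hypothetical cluster address
TOKEN = "..."  # assume an API token for a service account

def list_protected_vms() -> list[dict]:
    """Enumerate VMs through the same API surface the UI itself consumes."""
    response = requests.get(
        f"{CLUSTER}/api/v1/vmware/vm",  # illustrative endpoint, not confirmed documentation
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])
```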

At Cloud Field Day, the Rubrik team walked through their new SaaS-based solution called Polaris. With enterprise data increasingly spread across multiple data centers and cloud providers, enterprises need a cohesive way to visually manage that data. Polaris is a SaaS-based solution that does just that: it becomes the overarching management platform for taming the growing complexity.

COMPLEXITY DRIVES THE NEED FOR A NEW APPROACH

There are two dynamics driving these changes: 1) the explosion in data growth and 2) the need to effectively manage data. As applications and their data move to a myriad of different solutions, so does the need to effectively manage the underlying data.

An increase in compliance and regulatory requirements is adding further complexity to data management. As the complexity grows, so does the need for systemic automation. No longer are we able to simply throw more resources at the problem. It is time to turn the problem on its head and leverage new approaches.

DATA PROTECTION IS NOT IMPORTANT…UNTIL IT IS

During the discussion, Rubrik’s Chief Technologist Chris Wahl made a key observation that everyone in IT painfully understands: data protection is not important…until it is. To many enterprises, data protection is seen as an insurance policy that you hope you will not need. However, in today’s world of increasingly regulated and highly complicated architectures, with data spreading out at scale, the risks are simply too great to ignore.

While data protection may have been less important in the past, today it is critical.

GOING BEYOND SIMPLE BACKUP AND RECOVERY

If the story about Rubrik stopped with just backup and recovery, it would still be impressive. However, Rubrik is venturing into the complexity that comes with integration into other systems and processes. One of the first areas is their integration with ServiceNow: Rubrik ingests CMDB data from ServiceNow, providing a cohesive look at the underlying components Rubrik has visibility into.
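
To make the ingestion concrete, the sketch below reads configuration items from ServiceNow’s Table API, which is a documented ServiceNow interface; what happens to the records on the Rubrik side is assumed, so the final step is a placeholder:

```python
import requests

SN_INSTANCE = "https://example.service-now.com"  # hypothetical ServiceNow instance
SN_AUTH = ("integration_user", "...")            # basic auth for an integration account

def fetch_cmdb_servers(limit: int = 100) -> list[dict]:
    """Pull server configuration items (CIs) from the ServiceNow CMDB."""
    response = requests.get(
        f"{SN_INSTANCE}/api/now/table/cmdb_ci_server",
        auth=SN_AUTH,
        headers={"Accept": "application/json"},
        params={"sysparm_limit": limit, "sysparm_fields": "name,ip_address,os"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["result"]

for ci in fetch_cmdb_servers():
    print("Would hand off to Rubrik:", ci["name"])  # placeholder for the actual ingestion step
```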

Looking into the crystal ball, one can start to see that Rubrik fully understands that backup and recovery is just the start. The real opportunity comes from full integration into business processes, and integrations like ServiceNow are the prerequisite. Expect to see more as Rubrik continues their quest to provide a solid foundation for the enterprise when it needs it most.

Business

Droplet Computing makes a big splash at Cloud Field Day 3


Every so often there is a company that catches your eye. Not because of flashy marketing, but because they are solving a really interesting problem with a clever approach. That happened at Tech Field Day’s Cloud Field Day 3, where Droplet Computing officially came out of stealth and held the public launch of the company.

LIBERATION OF APPLICATIONS

Droplet Computing’s core value proposition is the containerization of applications in an effort to modernize infrastructure. Essentially, Droplet Computing provides the ability to take an application and create a container around it. This creates an abstraction layer between the application and the underlying operating system (OS). Droplet Computing does note that components of the original OS are needed inside the container to support the application, but not the full OS. Once the application is containerized, it can move to other platforms: a newer OS, a different platform, even a mobile device.

The underlying technology uses a combination of Wine & WebAssembly to containerize the application.

There are many applications still in use across the globe that cannot be upgraded for a myriad of reasons. Unfortunately, this limits the operator’s ability to modernize the rest of the infrastructure while these older applications remain in use.

The solution is not limited to just older applications. Other applications could use the same technology to provide mobility between different system types, OS and the like. However, there are a number of competing products that provide similar functions for current applications.

Several of the use-cases the Droplet Computing team mentioned included custom applications tied to CNC machines, MRI devices and the like. One thought: consider all the Windows XP-based applications and older custom machines still in use today.

REDUCING THE CYBERSECURITY FOOTPRINT

There is one clever side-effect to encapsulating the application: the ability to upgrade the underlying hardware and OS without having to upgrade the application reduces the cybersecurity footprint and risk for that system. Does it eliminate the risk completely? No. But by leveraging modern hardware and OS, it does make a dent in reducing the potential risk.

IN SUMMARY

Droplet Computing is not the silver bullet that will magically modernize your entire environment, but it does give a level of abstraction to those older applications still widely used. This allows enterprises to bring legacy applications forward through modernization.

At the same time, it addresses a core issue that all enterprises are seeking: reduce your cybersecurity footprint. In today’s world where the risks from cyber-attacks are increasing, anything that reduces the footprint is a welcome approach.

Droplet Computing’s product is currently in ‘pre-GA’ and slated to move to GA by mid-May.

Business · Cloud

Oracle works toward capturing enterprise Cloud IaaS demand


The enterprise cloud market still shows widely untapped potential. A significant portion of this potential comes from the demand generated by the legacy applications sitting in the myriad of corporate data centers. The footprint from these legacy workloads alone is staggering. Add in the workloads that sit in secondary data centers, which often go uncounted in many metrics, and one can quickly see the opportunity.

ORACLE STARTS FROM THE GROUND UP

At Tech Field Day’s Cloud Field Day 3, I had the opportunity to meet with the team from Oracle Cloud Infrastructure to discuss their Infrastructure as a Service (IaaS) cloud portfolio. Oracle is trying to attract the current Oracle customer to their cloud-based offerings, which range from IaaS up through Software as a Service (SaaS) for their core back-office business applications.

The conversation with the Oracle team was pretty rough, as it was hard to determine what, exactly, they did in the IaaS space. A number of buzzwords and concepts were thrown around without covering what the Oracle IaaS portfolio actually offered. Eventually, during a demo, a configuration page made the true offerings clear: virtual machines and bare metal. That is a good start for Oracle, but unfortunate in how it was presented. Oracle’s offering is hosted infrastructure that is more similar to IBM’s SoftLayer (now called IBM Cloud) than to Microsoft Azure, Amazon AWS or Google Cloud.

ORACLE DATABASE AS A SERVICE

Beyond the hardware, applications are one of the strengths of Oracle’s enterprise offerings, and a core piece of the puzzle has always been their database. One of the highlights from the conversation was their Database as a Service (DBaaS) offering. For enterprises that use Oracle DB, the database is a core sticking point that keeps their applications firmly planted in the corporate data center. With the Oracle DBaaS offering, enterprises can move workloads to a cloud-based infrastructure without losing fidelity in the Oracle DB offering.

Digging deeper into the details, there were a couple of interesting functions supported by Oracle’s DBaaS. A very cool feature was the ability to dynamically change the number of CPUs allocated to a database without taking an outage. This provides the ability to scale DB capacity up and down, as needed, without disrupting the application.
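
For a sense of what that looks like operationally, the sketch below uses Oracle’s oci Python SDK to request a new CPU core count for a running DB system. The OCID is a placeholder, and the exact model and method names are my reading of the SDK’s UpdateDbSystem operation, so treat this as a sketch rather than verified code:

```python
import oci

# Credentials loaded from the standard ~/.oci/config file
config = oci.config.from_file()
db_client = oci.database.DatabaseClient(config)

DB_SYSTEM_OCID = "ocid1.dbsystem.oc1..example"  # placeholder identifier

def scale_db_cpus(cpu_core_count: int) -> None:
    """Ask OCI to re-shape the DB system's CPU allocation without an outage."""
    details = oci.database.models.UpdateDbSystemDetails(cpu_core_count=cpu_core_count)
    response = db_client.update_db_system(DB_SYSTEM_OCID, details)
    print("Work request:", response.headers.get("opc-work-request-id"))

scale_db_cpus(8)  # scale up ahead of peak load, back down afterward
```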

Now, it should be noted that while a hosted Oracle DB sounds good on paper, the actual migration will be complicated for any enterprise. That is less a statement about Oracle and more about the fact that enterprise application workloads are a complicated web of interconnects and integrations. Not surprisingly, Oracle mentioned that the most common use-case driving legacy footprints to Oracle Cloud is the DB. This shows how much pent-up demand there is to move even the most complicated workloads to cloud. Today, Oracle’s DB offering runs on Oracle Cloud Infrastructure (OCI), and it was mentioned that the other Oracle Cloud offerings are moving to run on OCI as well.

Another use-case mentioned was that of High-Performance Computing (HPC). HPC environments need large scale and low latency. Both are positive factors for Oracle’s hardware designs.

While these are two good use-cases, Oracle will need to attract a broader base of use-cases moving forward.

THE CIO PERSPECTIVE

Overall, there seem to be some glimmers of light coming from the Oracle Cloud offering. However, it is hard to pin down the true differentiators. Granted, Oracle is playing a bit of catch-up compared with other, more mature cloud-based offerings.

The true value appears to be focused on existing Oracle customers looking to make a quick move to cloud. If true, and the two fundamental use-cases are DBaaS and HPC, that is a fairly limited pool of customers when significant potential is still sitting in the corporate data center.

It will be interesting to see how Oracle evolves their IaaS messaging and portfolio to broaden the use-cases and provide fundamental services that other cloud solutions have offered for years. Oracle does have the resources to put a lot of effort toward making a bigger impact. Right now, however, it appears that the Oracle Cloud offering is mainly geared for existing Oracle customers with specific use-cases.