
Understanding the value of data integration

To understand the value of data integration, one first has to understand the changing data landscape. In the past few years, more data has been created than existed in all of prior history. In 2014, I penned a post asking ‘Are enterprises prepared for the data tsunami?’ When it comes to data, enterprises of all sizes and maturity levels face two core issues: 1) how to effectively manage the sheer volume of data in a meaningful way and 2) how to extract insights from that data. Unfortunately, the traditional ways of managing data start to break down against these new challenges.

DIVERSE DATA SETS

In the above-mentioned post, I referenced an IDC report suggesting that by 2020, the total amount of data would equate to 40,000 exabytes, or 40 trillion gigabytes. That is more than 5,200 gigabytes for every man, woman and child in 2020.

However, unlike data in the past, this new data will come from an increasingly varied list of sources. Some of the data will be structured. Other data will be unstructured. And then there is metadata, derived through analysis of these varied data sets. All of it needs to be leveraged by the transformational enterprise.

In the past, one might have pooled all of this data into a classic data warehouse. Unfortunately, many of the new data types do not fit nicely into this approach. Then came the data lake as a solution to simply pool all of this data. Unfortunately, this approach has met with its own challenges, as many enterprises are watching their data lakes turn into data swamps.

Beyond data generated internally, enterprises are increasing their reliance on externally sourced data. Since this data is not created by the enterprise, there are limits on how it can be leveraged. In addition, bringing all of this data into the enterprise is not that simple. Nor is it always feasible.

Beyond their diversity, these new data sets create ‘data gravity’ as they grow in size, forming a stronger bond between the data set and the applications that leverage it. As the size of a data set grows, so does its ‘gravity’, which resists movement. All of these factors create significant friction against any movement of data.

VALUE OF INTEGRATING DATA

The solution rests with data integration: essentially, leave data where it resides and apply integration methods across the various data sets in order to create insights. There are two components to consider when integrating data.

There is a physical need for data integration and one that is more logical in nature. The physical component is how to physically connect the different data sources together. This is easier said than done. It was challenging enough when all of the data was managed within the enterprise; today, data resides in the hands of many other players and platforms, which adds complexity to the integration efforts. Modern data integration methods rely on Application Programming Interfaces (APIs) to create these integration points. In addition, there are security ramifications to consider.
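To make that concrete, here is a minimal sketch of physical integration over APIs, assuming two hypothetical REST endpoints; the URLs, field names and join key are invented for illustration and do not refer to any specific vendor’s API.

```python
# A minimal sketch of API-based "physical" integration across two sources.
import requests

def fetch(url: str) -> list[dict]:
    """Pull a JSON collection from one data source over its API."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

orders = fetch("https://erp.example.com/api/orders")        # internal, structured
tickets = fetch("https://support.example.com/api/tickets")  # external SaaS

# Join the two sources in memory on a shared customer identifier.
tickets_by_customer: dict[str, list[dict]] = {}
for t in tickets:
    tickets_by_customer.setdefault(t["customer_id"], []).append(t)

insights = [
    {"customer_id": o["customer_id"],
     "order_value": o["total"],
     "open_tickets": len(tickets_by_customer.get(o["customer_id"], []))}
    for o in orders
]
```

Note that only the query results move; each data set stays where it resides, which is exactly the point of integrating in place.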

The logical integration of data often centers on the customer. One of the core objectives for enterprises today is customer engagement. Enterprises are finding ways to learn more about their customers in an effort to build a more holistic profile that ultimately leads to a stronger relationship. Not all of that data is sourced internally. This really is a case of 1+1=3, where even small insights can lead to a larger impact when combined.
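As a toy illustration of the 1+1=3 effect, consider merging a small internal record with externally sourced attributes; every field name and value below is invented for the example.

```python
# Combining internal and external customer attributes into one profile.
internal = {"customer_id": "C42", "last_purchase": "2018-03-01", "segment": "SMB"}
external = {"customer_id": "C42", "region_growth": "high", "industry": "retail"}

profile = {**internal, **external}

# Neither source alone flags this customer as an expansion candidate;
# the combined profile does.
if profile["segment"] == "SMB" and profile["region_growth"] == "high":
    profile["next_best_action"] = "offer expansion plan"

print(profile)
```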

THE INTERSECTION OF DATA INTEGRATION AND ADVANCED FUNCTIONS

Data integration is a deep and complicated subject that is evolving quickly. Newer advancements in the Artificial Intelligence (AI) space are leading enterprises to insights they had not even thought to look for. Imagine a situation where you thought you knew your customer, but the system suggested aspects that had not been considered. AI has the opportunity to significantly augment human capability, creating more accurate insights, faster.

Beyond AI, other newer functions such as Machine Learning (ML) and the Internet of Things (IoT) present new sources of data to further enhance insights. It should be noted that neither ML nor IoT is able to function in a meaningful way without leveraging data integration.

DATA INTEGRATION LEADS TO SPEED AND INSIGHTS…AND CHALLENGES

Enterprises that leverage AI and ML to augment their efforts find increased value from both the insights and the speed with which they respond. In today’s world, where speed and accuracy are becoming strong competitive differentiators, leveraging as much data as possible is key. And to leverage that sheer amount of data, enterprises must embrace data integration to remain competitive.

At the same time, enterprises are facing challenges from new regulations such as the General Data Protection Regulation (GDPR). There are many facets and complexities to GDPR that will only add to the complexity of data integration and management.

While enterprises may have used custom approaches to solve the data integration problem in the past, today’s complexities demand a different approach. The combination of these challenges pushes enterprises toward advanced tools that assist in integrating data to gain greater insights.

 

This post is sponsored by:

SAP

https://www.sap.com/intelligentdata


How cloud is changing the business and IT landscape

 

Enterprises around the globe are facing disruption from a number of different directions. To combat this disruption, they increasingly turn to cloud computing.

Cloud computing is the single biggest opportunity for organizations to dramatically change how their business operates through changes in their Information Technology (IT) landscape. Organizations are evaluating how to leverage cloud-based solutions in differentiated ways. As they do, there are a number of factors to consider when moving into a cloud-based world.

SHIFTING HOW WE THINK ABOUT ORGANIZATIONS

The enterprise move to cloud-based technology causes more than just a shift in technology. It also causes a shift in how we think about strategy, organization and culture. In the past, enterprises were required to build their strategy, organization and support around the entire technology platform. There were few, if any, reasonable alternatives.

In the cloud-based era, those aspects change. Today, there are mature, viable solutions that allow the enterprise to leverage alternatives. By shifting out functions that are important but neither strategic nor differentiating, organizations are able to shift their focus as well.

By making this shift, organizations are able to up-level conversations and focus on business outcomes rather than technology features. These changes open the door to other shifts as well.

CHANGING THE WAY WE WORK

Two of those shifts directly impact the advantage to a business and their customers. The first shift is leverage. Cloud computing provides leverage in ways not previously feasible. IT organizations now have access to a bevy of solutions and virtually unlimited resources. Prior to cloud computing, organizations were limited in their ability to quickly scale resources and leverage new technology. For many, cost constraints were a significant limitation to potential growth.

Beyond technology, cloud provides the organization with the ability to leverage the expertise of others. As mentioned, previous methods required IT organizations to be an expert across a myriad of strategic and non-strategic areas. With cloud, organizations can focus on those aspects strategic and differentiated for their business and customers. This shifts the organization and culture to focus more directly on solutions that provide business advantage.

The second shift is speed. Unlike past options that may take weeks or months for the availability of applications and resources, cloud shortens the time to a matter of minutes. As businesses are looking for ways to respond quickly to ever-changing customer requirements, this rapid flexibility provides the organization with the ability to respond in a timely fashion.

OPENING UP NEW OPPORTUNITIES

Aside from speed, cloud opens up a whole new world of opportunity for organizations. Historically, there was a minimum cost of entry for many of the more advanced systems. Unfortunately, these costs were far outside the realm of possibility for Small and Medium Businesses (SMB). With cloud, SMB organizations now have access to the same solutions as the largest of enterprises. Essentially, this levels the playing field by lowering the barriers to entry of these advanced systems.

As organizations shift toward a greater focus on customers through quicker and more accurate business decisions, so does their reliance on data. At the same time, organizations are facing an explosion of data. Traditional reporting and analytics are collapsing under the weight of this new influx of data.

To combat the data deluge, organizations are increasing their reliance on Artificial Intelligence (AI) and Machine Learning (ML) to automate the insights gleaned from the data. AI & ML require large quantities of data, and traditional solutions are simply unable to provide the resources these algorithms demand. Again, cloud computing opens the door for organizations, large and small, to leverage these advanced functions.

BRINGING IT TOGETHER

The combination of cloud-based resources with advanced functions provides organizations with new opportunities to glean greater insights far more quickly than previously possible. Cloud’s leverage gives organizations the ability to shift the IT landscape to focus on business outcomes and strategic initiatives rather than deep technical expertise.

In summary, cloud computing presents not only an alternative option for technology consumption, but a required pillar for tomorrow’s business.

 

This post is sponsored by:

SAP

https://www.sap.com/intelligentdata


Rubrik continues their quest to protect the enterprise


Data protection is all the rage right now. With data moving beyond the corporate data center to multiple locations including cloud, complexity has increased significantly. What is data protection? It generally covers a combination of backup, restore, disaster recovery (DR) and business continuity (BC). While none of this is new, most enterprises have struggled for decades to back up their systems in a way that ensures that a) the data is protected, b) it can be restored if needed and c) it can be restored in a timely fashion. Put a different way: BC/DR is still one of the most poorly managed parts of an IT operation. Add cloud to the mix and one can see where the wheels start to fall off.

The irony is that, while the problem is not new, enterprises still struggle to balance the needs of DR/BC in a meaningful way. The reasons for this are more numerous than this blog will permit. This is an industry screaming for disruption. Enter Rubrik.

RUBRIK BRINGS A FRESH PERSPECTIVE TO AN OLD PROBLEM

A couple of weeks back, I caught up with the Rubrik team at Tech Field Day’s Cloud Field Day 3. Rubrik came into the market a few years back and has continued their drive to solve this old, but growing problem.

Unlike traditional solutions, Rubrik takes a modern approach to their architecture: everything that Rubrik does calls an API. This API-centric architecture gives their approach modularity and flexibility. API-centric architectures are a must in a cloud-based world.
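The practical upshot of an API-centric design is that anything an operator can do in the UI can also be scripted. The sketch below shows the general shape of such a call; the base URL, resource path and token are placeholders rather than Rubrik’s actual API, which you would take from their documentation.

```python
# Hypothetical example of driving an API-centric platform from a script.
import requests

BASE = "https://rubrik.example.local/api/v1"   # placeholder base URL
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

# Trigger an on-demand snapshot for one virtual machine (illustrative path).
resp = requests.post(f"{BASE}/vm/vm-123/snapshot", headers=HEADERS, timeout=30)
resp.raise_for_status()
print(resp.json())  # e.g., a job ID to poll for completion
```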

At Cloud Field Day, the Rubrik team walked through their new SaaS-based solution called Polaris. With enterprise data increasingly spread across multiple data centers and cloud providers, enterprises need a cohesive way to visually manage that data. Polaris is a SaaS-based solution that does just that, becoming the overarching management platform with which to manage the growing complexity.

COMPLEXITY DRIVES THE NEED FOR A NEW APPROACH

There are two dynamics driving these changes: 1) the explosion in data growth and 2) the need to effectively manage data. As applications and their data move to a myriad of different solutions, so does the need to effectively manage the underlying data.

An increase in compliance and regulatory requirements is adding further complexity to data management. As the complexity grows, so does the need for systemic automation. No longer are we able to simply throw more resources at the problem. It is time to turn the problem on its head and leverage new approaches.

DATA PROTECTION IS NOT IMPORTANT…UNTIL IT IS

During the discussion, Rubrik’s Chief Technologist Chris Wahl made a key observation that everyone in IT painfully understands: data protection is not important…until it is. To many enterprises, data protection is seen as an insurance policy that you hope you will not need. However, in today’s world of increasingly regulated and highly complicated architectures, with data spreading out at scale, the risks are simply too great to ignore.

While data protection may have been less important in the past, today it is critical.

GOING BEYOND SIMPLE BACKUP AND RECOVERY

If the story about Rubrik stopped with just backup and recovery, it would still be impressive. However, Rubrik is venturing into the complexity that comes with integration into other systems and processes. One of the first areas is their integration with ServiceNow: Rubrik ingests ServiceNow CMDB data into its system, providing a cohesive look at the underlying components that Rubrik has visibility into.
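For context, ServiceNow exposes CMDB records through its documented Table API, and a pull from it might look like the sketch below. The instance name, credentials and field list are placeholders, and how Rubrik actually ingests and models this data is their implementation detail, not shown here.

```python
# Hedged sketch: reading configuration items via the ServiceNow Table API.
import requests

INSTANCE = "https://example.service-now.com"  # placeholder instance
AUTH = ("api_user", "api_password")           # placeholder credentials

resp = requests.get(
    f"{INSTANCE}/api/now/table/cmdb_ci_server",
    params={"sysparm_limit": 100, "sysparm_fields": "name,ip_address,os"},
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
for ci in resp.json()["result"]:
    print(ci["name"], ci.get("ip_address"), ci.get("os"))
```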

Looking into the crystal ball, one can see that Rubrik fully understands that backup and recovery are just the start. The real opportunity comes from full integration into business processes. However, to get there, integrations like ServiceNow are needed. Expect to see more as Rubrik continues their quest to provide a solid foundation for the enterprise when it is needed most.


Delphix smartly reduces the friction to access data


Today’s CIO is looking for ways to unlock the potential in their company’s data. We have all heard the phrase that data is the new oil. Except that data, like oil, is just a raw material; it must be refined into a finished good, which is ultimately where the value resides.

At the same time, enterprises are concerned with regulatory and compliance requirements to protect data. Recent data breaches at globally recognized companies have raised concern around data privacy. Historically, the financial services and healthcare industries were the ones to watch when it came to regulatory and compliance requirements. Today, the regulatory net is widening with the EU’s General Data Protection Regulation (GDPR), the US Government’s FedRAMP and the NY State DFS Cybersecurity Requirements.

Creating greater access to data while staying in compliance and protecting data sit at opposite ends of the privacy and cybersecurity spectrum. Add to this the interest in moving data to cloud-based solutions and one can quickly see why this is one of the core challenges for today’s CIO.

DELPHIX REDUCES THE FRICTION TO DATA ACCESS

At Tech Field Day’s Cloud Field Day 3, I had the opportunity to meet with the team from Delphix.

Fundamentally, Delphix is a cloud-based data management platform that helps enterprises reduce the friction to data access through automation of data management. Today, one-third of Fortune 500 companies use Delphix.

Going back to the core issue, users have a hunger for accessing data. However, regulatory and compliance requirements often hinder that process. Today’s methods to manage data are heavily manual and somewhat archaic compared with solutions like Delphix.

Delphix’s approach is to package the data into what they call a Data Pod. Unlike most approaches, which mask data when it is shared, Delphix masks the data during the intake process. The benefit of this approach is that it removes the risk of accidentally sharing protected data.
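A minimal sketch of the mask-at-intake pattern, with made-up field names; it mirrors the idea rather than Delphix’s actual implementation.

```python
# Mask sensitive fields once, at intake, so every downstream copy is safe.
import hashlib

SENSITIVE_FIELDS = {"name", "email", "ssn"}

def mask(value: str, salt: str = "per-dataset-secret") -> str:
    """Deterministic pseudonym: the same input yields the same token,
    so masked data can still be joined across data sets."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def ingest(record: dict) -> dict:
    """Only the masked record is ever stored or shared."""
    return {k: (mask(v) if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

raw = {"name": "Ada Lovelace", "email": "ada@example.com", "plan": "gold"}
print(ingest(raw))  # downstream consumers never see the raw values
```

The deterministic masking is a deliberate design choice in this sketch: it preserves referential integrity, so analysts can still join masked records without ever touching the protected values.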

In terms of sharing data, one clever part of the Delphix Dynamic Data Platform is its ability to replicate data smartly. Considering that Delphix works in the cloud, this is key to avoiding unnecessary costs; otherwise, enterprises would see a significant uptick in data storage as masked data is replicated to the various users. Beyond structured, transactional data, Delphix is also able to manage (and mask) databases, along with unstructured data and files.
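One common way to ‘replicate smartly’ is content-addressed, deduplicated replication: only blocks the target does not already hold are shipped. The toy sketch below illustrates that general idea only; it is not a description of Delphix’s actual virtualization mechanism.

```python
# Toy content-addressed replication: ship only blocks the target lacks.
import hashlib

def block_ids(data: bytes, size: int = 4096) -> dict[str, bytes]:
    """Split data into fixed-size blocks keyed by content hash."""
    return {hashlib.sha256(data[i:i + size]).hexdigest(): data[i:i + size]
            for i in range(0, len(data), size)}

def replicate(source: bytes, target_store: dict[str, bytes]) -> int:
    """Copy missing blocks to the target; return bytes actually transferred."""
    sent = 0
    for digest, block in block_ids(source).items():
        if digest not in target_store:
            target_store[digest] = block
            sent += len(block)
    return sent

data = b"A" * 4096 + b"B" * 4096
store: dict[str, bytes] = {}
print(replicate(data, store))  # first consumer: ships all 8192 bytes
print(replicate(data, store))  # second consumer: ships 0 bytes
```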

THE CIO PERSPECTIVE

From the CIO perspective, Delphix appears to address an increasingly complicated space with a clever yet simple approach. The three key takeaways are: a) the ability to mask data (databases, unstructured data, files) at intake rather than when pulling copies, b) the ability to smartly replicate data and c) the potential to centrally manage data management policies. Lastly, this is not a solution that must run in the corporate data center: Delphix supports running in public cloud services including Microsoft Azure and Amazon AWS.

In summary, Delphix appears to have decreased the friction to data access by automating the data protection and management processes, all while supporting an enterprise’s move to cloud-based resources.


Microsoft empowers the developer at Connect


This week at Microsoft Connect in New York City, Microsoft announced a number of products geared toward bringing intelligence and the computing edge closer together. The tools continue Microsoft’s support of a varied and growing ecosystem of evolving solutions. At the same time, Microsoft demonstrated their insatiable drive to woo the developer with a number of tools geared toward modern development and advanced technology.

EMBRACING THE ECOSYSTEM DIVERSITY

Microsoft has tried hard in the past several years to shed their persona of Microsoft-centricity and a .NET-and-Windows world. Similar to their very vocal support for inclusion and diversity in culture, Microsoft brings that same perspective to the tools, solutions and ecosystems they support. The reality is that the world is diverse, and it is this very diversity that makes us stronger. Technology is no different.

At the Connect conference, similar to their recent Build & Ignite conferences, .NET almost became a footnote as much of the discussion was around other tools and frameworks. In many ways, PHP, Java, Node and Python appeared to get mentioned more than .NET. Does this mean that .NET is being deprecated in favor of newer solutions? No. But it does show that Microsoft is moving beyond just words in their drive toward inclusivity.

EXPANDING THE DEVELOPER TOOLS

At Connect, Microsoft announced a number of tools aimed squarely at supporting the modern developer. This is not the developer of years past. Today’s developer works in a variety of tools, with different methods and potentially in separate locations, yet needs the ability to collaborate in a meaningful way. Enter Visual Studio Live Share. What makes VS Live Share interesting is how it supports collaboration between developers in a seamless way, without the cumbersome screen-sharing approach previously used. The level of sophistication VS Live Share brings is impressive: it allows each developer to walk through code in their own way as they debug and collaborate. While VS Live Share is only in preview, other recently announced tools are already seeing significant adoption, with downloads running into the millions in a short period of time.

In the same vein of collaboration and integration, DevOps is of keen interest to most enterprise IT shops. Microsoft showed how Visual Studio Team Services embraces DevOps in a holistic way. While the demonstration was impressive, the question of scalability often comes into the picture for large, integrated teams. It was mentioned that VS Team Services is currently used by the Microsoft Windows development team and their whopping 25,000 developers.

Add to that scale the ability to build ‘safe code’ pipelines, with automation that triggers in-process code evaluation, and one can quickly see how Microsoft is taking the modern, sophisticated development process to heart.

POWERING DATA AND AI IN THE CLOUD

In addition to developer tools, time was spent talking about Azure, data and Databricks. I had the chance to sit down with Databricks CEO Ali Ghodsi to talk about how Azure Databricks brings the myriad of data sources together for the enterprise. The combination of Databricks on Azure provides the scale and ecosystem that highlight the power of Databricks to integrate the varied data sources that every enterprise is trying to tap into.

MIND THE DEVELOPER GAP

Developing applications that leverage analytics and AI is incredibly important, but it is not a trivial task. It often requires a combination of skills and experience to fully realize the value that comes from AI. Unfortunately, developers often have neither the data science skills nor the business context needed in today’s world. I spoke with Microsoft’s Corey Sanders after his keynote about how Microsoft is bridging this gap for the developer. Both Sanders and Ghodsi agree that the gap is an issue. However, through increasingly sophisticated tools such as Databricks and Visual Studio, they believe Microsoft is making a serious attempt at bridging it.

It is clear that Microsoft is getting back to its roots and considering the importance of the developer in an enterprise’s digital transformation journey. While there are still many gaps to fill, it is interesting to see how Microsoft is approaching the evolving landscape and complexity that is the enterprise reality.


Salesforce bridges the customer engagement gap for growth at Dreamforce

Last week was Salesforce’s Dreamforce conference in San Francisco, with a whopping 170,000+ attendees. So what were the key takeaways?

Today, many enterprises either are Salesforce customers or follow the space closely, as it pertains to a key focus for executive teams today: customer engagement. One of the top issues that executive teams and boards of directors face is how to create a deeper relationship with customers. Salesforce sits at this nexus. Here are the top takeaways from the conference:

UPSIDES:

  1. Partnership with Google: Salesforce announced their partnership with Google. While much of the discussion was about integration with Google Cloud and G Suite, there are benefits that both companies (and customers) could gain from the relationship. The data that Google maintains on user behavior and ad-related impact could prove useful to Salesforce customers. Salesforce, in turn, could provide integration and insights to Google AdWords. The potential from this symbiotic relationship could prove significant.
  2. Democratizing Einstein & AI: Last year, Einstein provided an interesting opportunity for Salesforce and their customers. This year, Salesforce showed how providing customers with an easy way to leverage Einstein provides a powerhouse of potential to support customer engagement. Plus, proactively predicting outcomes provides insights not previously possible.
  3. myTrailhead: Personalization has long been a key success factor in engaging users. myTrailhead provides a level of personalization that allows users to work as they work best. Too often, we require all users to work from a single console or interface; myTrailhead allows users to customize their experience.

DOWNSIDES:

  1. Fewer Feature/Function Announcements: There was quite a bit of discussion around the number of feature and functionality announcements made at Dreamforce, suggesting that innovation may be slowing down at Salesforce. One data point does not make a trend; however, there are several indicators that this may simply reflect a maturing of the innovation cycle.
  2. Expansion of Platform to Verticals: Salesforce supports a number of verticals with their solution. However, the depth to which they support the ecosystem around those verticals pales in comparison with newer startups focused on specific verticals in the CRM space.
  3. Lack of New Data Sources: Unlike its competition, Salesforce takes a partnership approach to data integration into the platform. That is, they rely on partners to bring data sources for customers to leverage. Examples are financial services, traffic, weather, and other common data elements.

REVENUE GUIDANCE

Another key question that came up was around Salesforce’s revenue guidance. Can they (essentially) double their revenue to match guidance? And if so, how? There are a number of factors that I believe will support this.

All in, Salesforce is faced with significant headwinds from both competition and adoption of innovation by enterprises. Bringing partnerships with Google and democratization of newer technologies will do well to carry them forward. There is still a significant amount of potential upside for Salesforce.


Why are enterprises moving away from public cloud?


We often hear of enterprises that move applications from their corporate data center to public cloud. This may come in the form of lift and shift. But then something happens that causes the enterprise to move it out of public cloud. This yo-yo effect and the related consequences create ongoing challenges that contribute to several of the items listed in Eight ways enterprises struggle with public cloud.

In order to better understand the problem, we need to work backwards to the root cause…and that often starts with the symptoms. For most, it starts with costs.

UNDERSTANDING THE ECONOMICS

The number one reason enterprises pull workloads back out of cloud is economics. For public cloud, the economics come in the form of a monthly bill for services. In the post referenced above, I refer to a cost differential of 4x; that is to say, public cloud services can cost four times the corporate data center alternative for the same services. These calculations use fully loaded total cost of ownership (TCO) numbers on both sides over a period of years to normalize capital costs.

4x is a startling number and seems to fly in the face of a generally held belief that cloud computing is less expensive than the equivalent on-premises corporate data center. Does this mean that public cloud is not less expensive? Yes and no.

THE IMPACT OF LEGACY THINKING

To break down the 4x number, one has to understand how heavily legacy thinking influences it. While many view public cloud as less expensive, they often compare apples to oranges when comparing public cloud to corporate data centers. Many do not consider the fully loaded corporate data center costs that include server, network and storage, along with power, cooling, space, administrative overhead, management, real estate and more. Unfortunately, many of these costs are not exposed to the CIO and IT staff. For example, do you know how much power your data center consumes, or what its real estate costs? Few IT folks do.

There are five components that influence legacy thinking:

  1. 24×7 Availability: Most corporate data centers and systems are built around 24×7 availability. There is a significant amount of data center architecture that goes into the data center facility and systems to support this expectation.
  2. Peak Utilization: Corporate data center systems are built for peak utilization whether they reach it regularly or not. This capacity sits idle until needed and is only fully used at peak times.
  3. Redundancy: Corporate infrastructure from the power subsystems to power supplies to the disk drives is designed for redundancy. There is redundancy within each level of data center systems. If there is a hardware failure, the application ideally will not know it.
  4. Automation & Orchestration: Corporate applications are not designed with automation & orchestration in mind. Applications are often installed on specific infrastructure and left to run.
  5. Application Intelligence: Applications assume that availability is left to other systems to manage. Infrastructure manages the redundancy and architecture design manages the scale.

Now take a corporate application built with this legacy thinking and move it directly into public cloud. It will need peak resources in a redundant configuration running 24×7, because that is how it was designed. Yet public cloud rewards a very different model. Running an application at peak, in a redundant configuration, 24×7 leads to an average of 4x the cost of the traditional data center.
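A back-of-the-envelope illustration of that penalty, using invented numbers purely to show the shape of the math:

```python
# Illustrative only: every number below is an assumption, not a benchmark.
PEAK_UNITS = 8          # instances needed at peak load
AVG_UTILIZATION = 0.25  # average demand as a share of peak
REDUNDANCY = 2          # a full duplicate stack for failover
UNIT_COST = 1.0         # normalized cost per instance-month

# Legacy thinking, lifted and shifted: peak capacity, fully duplicated, 24x7.
lift_and_shift = PEAK_UNITS * REDUNDANCY * UNIT_COST  # 16.0 units

# Refactored: scale to average demand with modest headroom, resilience
# handled in the application design rather than through idle duplicates.
HEADROOM = 2
refactored = PEAK_UNITS * AVG_UTILIZATION * HEADROOM * UNIT_COST  # 4.0 units

print(f"cost ratio: {lift_and_shift / refactored:.0f}x")  # -> 4x
```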

This is the equivalent of renting a car every day for a full year whether you need it or not. Used that way, the shared model comes at a premium.

THE SOLUTION IS IN PLANNING

Is this the best way to leverage public cloud services? Knowing the details of what to expect leads one to a different approach. Can public cloud benefit corporate enterprise applications? Yes. Does it need planning and refactoring? Yes.

By refactoring applications to leverage the benefits of public cloud rather than assume legacy thinking, public cloud has the potential to be less expensive than traditional approaches. Obviously, each application will have different requirements and therefore different outcomes.

The point is to shed legacy thinking and understand where public cloud fits best. Public cloud is not the right solution for every workload. From those applications that will benefit from public cloud, understand what changes are needed before making the move.

OTHER REASONS

There are other reasons that enterprises exit public cloud services beyond just cost. Those may include:

  1. Scale: Either due to cost or significant scale, enterprises may find that they are able to support applications within their own infrastructure.
  2. Regulatory/Compliance: Enterprises may use test data with applications in the cloud but then move the application back to the corporate data center when shifting into production with regulated data. Or compliance requirements may force data resources to remain local. Sovereignty issues also drive decisions in this space.
  3. Latency: There are situations where public cloud may be great on paper, but in real-life latency presents a significant challenge. Remote and time-sensitive applications are good examples.
  4. Use-case: The last catch-all is where applications have specific use-cases where public cloud is great in theory, but not the best solution in practice. Remember that public cloud is a general-purpose infrastructure. As an example, there are application use-cases that need fine-tuning that public cloud is not able to support. Other use-cases may not support public cloud in production either.

The bottom line is to fully understand your requirements, think ahead and do your homework. Enterprises have successfully moved traditional corporate applications to public cloud…even those with significant regulatory & compliance requirements. The challenge is to shed legacy thinking and consider where and how best to leverage public cloud for each application.