Business · Cloud

Morpheus Data brings the glue to multi-cloud management


Enterprises across the globe are leveraging cloud-based resources in a multitude of ways. However, there is no one-size-fits-all approach to cloud that makes sense across the enterprise portfolio. The result is a rise in multi-cloud deployments: any given enterprise will use a variety of cloud-based services depending on the specific requirements of each workload. It is also important to understand the difference between multi-cloud and hybrid cloud.

This cloud ‘sprawl’ creates an increasingly complicated management problem, as each cloud provider takes a different approach to managing its services. Layer in management processes, automation routines and management tools, and the challenge becomes clear. Add that any given application may use a different combination of cloud services, and the problem grows exponentially more complicated with each workload.

MORPHEUS DATA PROVIDES THE GLUE

At Tech Field Day’s Cloud Field Day 3, I had the opportunity to meet with the team from Morpheus Data.

Morpheus Data addresses this complicated web of tools and services by providing an abstraction layer on top of them. More specifically, Morpheus Data creates an abstraction between the provisioning layer and the underlying infrastructure. To date, they support 49 out-of-the-box service integrations covering a variety of cloud services, governance tools, management tools and infrastructure.

Governance and automation are key to any multi-cloud or hybrid-cloud deployment. Leveraging a solution like Morpheus Data can help streamline CloudOps and DevOps efforts through these integrations.

One interesting aspect of Morpheus Data’s solution is the ability to establish application templates that span a number of different tools, services and routines. The templates assist with deployment and can set time limits on specific services. This is especially handy for avoiding one form of sprawl known as service abandonment, where a service is left running and accruing cost even though it is no longer used.
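To illustrate how a time limit attached to a templated service can curb abandonment, here is a minimal sketch. The `ServiceLease` class, service names and TTLs are all hypothetical, not Morpheus Data’s actual API:

```python
from datetime import datetime, timedelta

# Hypothetical illustration (not Morpheus Data's actual API): a template
# attaches a time-to-live to each provisioned service, so anything past
# its lease can be flagged for teardown instead of silently accruing cost.

class ServiceLease:
    def __init__(self, name, provisioned_at, ttl_hours):
        self.name = name
        self.provisioned_at = provisioned_at
        self.expires_at = provisioned_at + timedelta(hours=ttl_hours)

    def is_expired(self, now):
        return now >= self.expires_at

def find_abandoned(leases, now):
    """Return the names of services whose lease has lapsed."""
    return [lease.name for lease in leases if lease.is_expired(now)]

t0 = datetime(2018, 4, 1, 9, 0)
leases = [
    ServiceLease("dev-db", t0, ttl_hours=24),
    ServiceLease("load-test-cluster", t0, ttl_hours=4),
]
# Six hours in, only the short-lived test cluster has expired.
print(find_abandoned(leases, t0 + timedelta(hours=6)))  # ['load-test-cluster']
```

A real platform would pair this check with automated teardown or an approval workflow rather than a simple report.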

Much of Morpheus Data’s effort is geared toward ‘net-new’ deployments to the cloud. Moving legacy workloads will require re-working before they can fully take advantage of cloud-based resources. I have written previously about the challenges of moving legacy workloads to the public cloud.

LOOKING BEYOND THE TOOL

While Morpheus Data provides technology to address the systemic complexities of multi-cloud, it does not address the people component. To be fair, it is not clear that any tool can fix the people component. Specifically, in order to truly leverage good governance and automation routines, one needs to come to grips with the organizational and cultural changes needed to support such approaches.

In order to address the people component, it is helpful to break down the personas. The key three are Developer, Infrastructure Administrator and Executive. Each of these personas has different requirements and interests that will impact how services are selected and consumed.

IN SUMMARY

Morpheus Data is going after a space that is both huge and highly complicated. A big challenge for the team will be to focus on the most critical areas without trying to cover every tool, process and model. This is really a question of going broad or going deep. You can’t do both.

In addition, it is clear that Morpheus Data has a good start but would benefit from bringing operational data and costs into the factors that drive decisions on which services to use. The team already includes some cost components, but they are not as dynamic as enterprises will need moving forward.

The Morpheus Data solution looks like a great start in the increasingly complicated multi-cloud space. Every enterprise will face some form of multi-cloud and hybrid-cloud complexity, and as such could benefit from a solution that helps streamline the processes. It will be interesting to see how the company and solution evolve over time to address this space.

Business · Cloud · Data

Microsoft empowers the developer at Connect


This week at Microsoft Connect in New York City, Microsoft announced a number of products geared toward bringing intelligence and the computing edge closer together. The tools continue Microsoft’s support of a varied and growing ecosystem of evolving solutions. At the same time, Microsoft demonstrated their insatiable drive to woo the developer with a number of tools geared toward modern development and advanced technology.

EMBRACING THE ECOSYSTEM DIVERSITY

Microsoft has tried hard over the past several years to shed its Microsoft-centric persona of a .NET-and-Windows world. Similar to its very vocal support for inclusion and diversity in culture, Microsoft brings that same perspective to the tools, solutions and ecosystems it supports. The reality is that the world is diverse, and it is this very diversity that makes us stronger. Technology is no different.

At the Connect conference, similar to their recent Build & Ignite conferences, .NET almost became a footnote as much of the discussion was around other tools and frameworks. In many ways, PHP, Java, Node and Python appeared to get mentioned more than .NET. Does this mean that .NET is being deprecated in favor of newer solutions? No. But it does show that Microsoft is moving beyond just words in their drive toward inclusivity.

EXPANDING THE DEVELOPER TOOLS

At Connect, Microsoft announced a number of tools aimed squarely at supporting the modern developer. This is not the developer of years past. Today’s developer works with a variety of tools, with different methods and potentially from separate locations. Yet developers need the ability to collaborate in a meaningful way. Enter Visual Studio Live Share. What makes VS Live Share interesting is how it supports collaboration between developers more seamlessly, without the cumbersome screen-sharing approach previously used. The level of sophistication VS Live Share brings is impressive: it allows each developer to walk through code in their own way while they debug and collaborate. While VS Live Share is only in preview, other recently-announced tools are already seeing significant adoption, with downloads ranging in the millions in a short period of time.

In the same vein of collaboration and integration, DevOps is of keen interest to most enterprise IT shops. Microsoft showed how Visual Studio Team Services embraces DevOps in a holistic way. While the demonstration was impressive, the question of scalability often comes into the picture for large, integrated teams. It was mentioned that VS Team Services is currently used by the Microsoft Windows development team and their whopping 25,000 developers.

Add to scale the ability to build ‘safe code’ pipelines with automation that creates triggers to evaluate code in-process and one can quickly see how Microsoft is taking the modern, sophisticated development process to heart.
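The ‘safe code’ idea above amounts to gating a pipeline on automated, in-process checks. Here is a minimal sketch of that pattern; the check names and data shapes are hypothetical, not the Visual Studio Team Services API:

```python
# Hypothetical sketch of a "safe code" pipeline gate: each change
# triggers automated checks, and the change only proceeds through the
# pipeline when every check passes. Not the VS Team Services API.

def run_checks(change, checks):
    """Run every check against the change; collect the names of failures."""
    return [name for name, check in checks if not check(change)]

def pipeline_gate(change, checks):
    """Promote the change only if no check failed."""
    failures = run_checks(change, checks)
    return ("promote" if not failures else "block", failures)

checks = [
    ("tests_pass", lambda c: c["tests_passed"]),
    ("no_secrets", lambda c: not c["contains_secrets"]),
]

good = {"tests_passed": True, "contains_secrets": False}
bad = {"tests_passed": True, "contains_secrets": True}
print(pipeline_gate(good, checks))  # ('promote', [])
print(pipeline_gate(bad, checks))   # ('block', ['no_secrets'])
```

Real pipelines would wire such checks to commit and pull-request triggers, but the gating logic is the same.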

POWERING DATA AND AI IN THE CLOUD

In addition to developer tools, time was spent talking about Azure, data and Databricks. I had the chance to sit down with Databricks CEO Ali Ghodsi to talk about how Azure Databricks brings myriad data sources together for the enterprise. The combination of Databricks on Azure provides the scale and ecosystem that highlight the power of Databricks to integrate the varied data sources every enterprise is trying to tap into.

MIND THE DEVELOPER GAP

Developing applications that leverage analytics and AI is incredibly important, but not a trivial task. It often requires a combination of skills and experience to fully appreciate the value that comes from AI. Unfortunately, developers often have neither the data science skills nor the business context needed in today’s world. I spoke with Microsoft’s Corey Sanders after his keynote about how Microsoft is bridging the gap for the developer. Both Sanders and Ghodsi agree that the gap is an issue. However, through the use of increasingly sophisticated tools such as Databricks and Visual Studio, they believe Microsoft is making a serious attempt at bridging it.

It is clear that Microsoft is getting back to its roots and considering the importance of the developer in an enterprise’s digital transformation journey. While there are still many gaps to fill, it is interesting to see how Microsoft is approaching the evolving landscape and complexity that is the enterprise reality.

CIO

#CIOitk: Is IT still relevant? The conversation is heating up!

The relevance of IT is a hotly contested subject. According to Forrester, 41% of business decision-makers believe that IT is an impediment to accelerating business success. In a recent Twitter chat, participants shared their thoughts on the relevance and value of IT. Disconnects, disruption and shadow IT were on their minds. Not far behind…the subjects of organization and culture took center stage.

Closing the chasm

It is apparent that IT needs to change…and quickly. The chasm between IT and the rest of the business is growing to the breaking point. In many ways, this is where shadow IT starts. If such a chasm exists, where does the process of closing the gap begin? The key is for the CIO to step up and take the lead in changing the culture across the organization. Ironically, many outside IT do not believe the CIO has the capacity to make the shift. I wrote about this in Transforming IT Requires a Three-Legged Race over two years ago.

Awareness is the first step

Aside from ‘who’ starts the conversation, awareness of the problem is the first step toward resolution. The shared concern is that many IT leaders 1) do not recognize the chasm and 2) if they do, they struggle to identify the first steps to close the gap.

For too long, IT focused on history as a predictor of the future. In other words, IT leveraged best practices as a means to future guidance. Today, the game is completely different. Today, leveraging history is almost the worst thing an IT organization can do. The past is not an indicator of the future.

Today’s organization needs to heed the phrase: Disrupt or be disrupted. IT organizations that are comfortable are those most likely to get disrupted by competitors.

Respecting the legacy

Don’t get me wrong. Legacy thinking (i.e., culture, process, etc.) and the legacy footprint are a reality within IT organizations today. That does not go away or change overnight. An equal issue is that enterprise vendors face similar situations and are challenged to turn the corner too. This is about an evolution that needs to start picking up momentum…and quickly!

Quoting the quotable

During the chat, a number of key quotable moments popped up. Here are just a few:

Bob Egan: If you’re not seeking change, there’s an issue. Drive change and live on the edge.

Mark Thiele: There is no comfort zone in IT.

Stuart Appley: Culture and trust are key points to today’s IT leaders.

After the chat, I asked the participants on Twitter two questions: 1) What was their one takeaway from the conversation and 2) what recommendation would they have?

Brian Katz

Takeaway: Shadow IT is only a problem if you look at it as Shadow IT versus Shadow Innovation. People just want to get their work done.

Recommendation: Move away from the idea that everything is a sacred cow, you have to be willing to move stuff when it makes sense.

Ryan Fay

Takeaway: IT is in a great position today to become change agents and partner with other business leaders to create value and lead innovation. The tools to use are collaboration, authenticity, empathy and openness to change on the way to becoming a modern CIO business authority.

Recommendation: Help give big picture recommendations to help all modern CIOs. Become business leaders and not just another IT “guy.” In order to have a seat at the ‘table’ you must be able to think and act like a business leader.

Philippe Abdoulaye

Takeaway: Consensus on agile collaboration between the business and IT is the key competitive advantage.

Recommendation: A model of the new style of IT is needed. An example of such a model is the Complete ITaaS Delivery Model.

Kong Yang

Takeaway: If disruptive innovation creates new business opportunities, then IT must realize innovation without disrupting themselves in order to succeed.

Recommendation: IT leaders must bridge the gap from IT Ops to business utility and show tangible ROI. Plus, have a plan to deal with industry changing technologies.

Charles Dunkley

Takeaway: Be willing to take risks to grow.

Recommendation: The C-suite could better leverage IT staff’s hands-on insight into social and cloud tools.

The bottom line

The bottom line is that IT has a significant challenge ahead. With the advent of cloud, DevOps, containers and new service offerings, IT now has the tools to leverage a change in process and culture. To be clear, these are just tools for leverage, not the solution. Look for opportunities to drive change and disruption.

Scroll the credits…

First, thank you to the four colleagues who developed the idea and questions for the #CIOitk discussion on IT relevance:

Stuart Appley (@sappley) Post: Yes – IT is still relevant

Bob Egan (@bobegan)

Amy Hermes (@amyhermes)

Mark Thiele (@mthiele10) Post: IT is more relevant than ever – at least it can be

Second, another thank you to all that joined the #CIOitk conversation on Twitter. Without you, there would be no conversation. Speaking of conversations, let’s keep it going! What are your thoughts? Is IT relevant and how is that changing over time? Join the chat at #CIOitk.

Business · CIO

Outages happen. How prepared are you for the next one?

Significant outages hit several major firms today including United Airlines, New York Stock Exchange (NYSE) and the Wall Street Journal. And that was just this morning alone. While many suggest a correlation between the three, I will leave that to the experts.

The point is, outages happen. How prepared you are and how you respond to the next outage is what matters.

DR/ BC: Not new and still broken

The bottom line is that disaster recovery and business continuity are not new and are still broken. IT organizations have cobbled together solutions for decades with limited success. There are shining lights, but they are often startups built on an entirely different culture.

Are you only using backups? How do you know you’re backing up the right information? Is that enough? Generally, the answers are less flattering than most would want.

Redundancy doesn’t cut it either. Organizations continue to build redundancy into systems to the point of over-complicating them. And complication leads to a greater risk of missing a step along the way.

Today, backups and redundancy are not enough. Building toward ever more 9’s of uptime is not the answer either. Organizations need to go beyond that.

Time to streamline

There are a number of ways to solve this problem. DR/BC is a must, but not via traditional means. Investment is needed, and there are clever solutions available today. This is more than just a technology problem. It involves process, organization and culture. Think about DevOps. Think about how to streamline.

 

Remember: the more complicated the system, the longer it takes to troubleshoot, and the greater the risk to both the company and its customers. How prepared are you for the next outage?

Business · CIO · Cloud · Data

Are the big 5 enterprise IT providers making a comeback?

Not long ago, many would have written off the likes of the big five large enterprise IT firms as slow, lethargic, expensive and out of touch. Who are the big five? IBM (NYSE: IBM), HP (NYSE: HPQ), Microsoft (NASDAQ: MSFT), Oracle (NYSE: ORCL) and Cisco (NASDAQ: CSCO). Specifically, they are companies that provide traditional enterprise IT software, hardware and services.

Today, most technology innovation is coming from startups, not the large enterprise providers. Over the course of 2015, we have seen two trends pick up momentum: 1) consolidation in the major categories (software, hardware and services) and 2) acquisitions by the big five. Each of the five is making huge strides in different ways.

Here’s a quick rundown of the big five.

IBM guns for the developer

Knowing that the developer is at the start of the development process, IBM is shifting gears toward solutions that address the new developer. Just look at the past 18 months alone.

  • February 2014: Dev@Pulse conference showed a mix of COBOL developers alongside promotion of Bluemix. The attendees didn’t resemble your typical developer conference. More details here.
  • April 2014: Impact conference celebrated 50 years of the mainframe. Impact also highlighted the SoftLayer acquisition and brought the integration of mobile and cloud.
  • October 2014: Insight conference goes further to bring cloud, data and Bluemix into the fold.
  • February 2015: InterConnect combines a couple of previous conferences into one. IBM continues the drive with cloud, SoftLayer and Bluemix while adding their Open Source contributions specifically around OpenStack.

SoftLayer (cloud), Watson (analytics) and Bluemix are strengths in the IBM portfolio. And now with IBM’s recent acquisition of BlueBox and partnership with Box, it doesn’t appear they are letting up on the gas. Add their work with Open Source software and it creates an interesting mix.

There are still significant gaps for IBM to fill. However, the message from IBM supports their strengths in cloud, analytics and the developer. This is key for the enterprise both today and tomorrow.

HP’s cloudy outlook

HP has long had a diverse portfolio addressing the needs of the enterprise. Of all the big five providers, HP’s portfolio is among the best matched to enterprise needs both today and in the future.

  • Infrastructure: HP’s portfolio of converged infrastructure and components is solid. Really solid. Much of it is geared for the traditional enterprise. One curious point is that their server components span the enterprise and service-provider markets, while their storage products squarely target the enterprise to the exclusion of service providers. You can read more here.
  • Software: I have long felt that HP’s software group has a good bead on industry trends. They have a strong portfolio of data analytics tools with Vertica, Autonomy and HAVEn (being rebranded). HP’s march to support the Idea Economy is backed up by the solutions they’re putting in place. You can read more here.
  • Cloud: I have said that HP’s cloud strategy is an enigma. Unfortunately, discussions with the HP Cloud team at Discover this month further cemented that perspective. There is quite a bit of hard work being done by the Helion team, but the results are less clear. HP’s cloud strategy is directly tied to OpenStack and their contributions to the projects support this move.

HP will need to move beyond operating in silos and support a more integrated approach that mirrors the needs of their customers. While HP Infrastructure and Software are humming along, Helion cloud will need a renewed focus to gain relevance and mass adoption.

Microsoft’s race to lose

Above all other players, Microsoft still has the broadest and deepest relationships across the enterprise market today. Granted, many of those relationships are built upon their productivity apps, desktop and server operating systems, and core applications (Exchange, SQL, etc.). There is no denying that Microsoft probably has relationships with more organizations than any of the others.

Since Microsoft Office 365 hit its stride, enterprises are starting to take a second look at Azure and Microsoft’s cloud-based offerings. This still leaves a number of gaps for Microsoft; specifically around data analytics and open standards. Moving to open standards will require a significant cultural shift for Microsoft. Data analytics could come through the acquisition of a strong player in the space.

Oracle’s comprehensive cloud

Oracle has long been seen as a strong player in the enterprise space. Unlike many other players that provide the building blocks to support enterprise applications, Oracle provides the blocks and the business applications.

One of Oracle’s key challenges is that the solutions are heavy and costly. As enterprises move to a consumption-based model by leveraging cloud, Oracle found itself flat-footed. Over the past year or so, Oracle has worked to change that position with their cloud-based offerings.

On Monday, Executive Chairman, CTO and founder Larry Ellison presented Oracle’s latest update in its race for enterprise cloud business. Oracle now provides cloud building blocks from top to bottom (SaaS, PaaS, IaaS). The message is strong: Oracle is out to support both the developer and the business user through their transformation.

Oracle’s strong message about going after the entire cloud stack should not go unnoticed. In Q4 alone, Oracle cloud cleared $426M. That is a massive number. Even if they did a poor job of delivering solutions, one cannot deny the sheer size of the opportunity, which overshadows the others.

Cisco’s shift to software

Cisco has long been the darling of the IT infrastructure and operations world. Its challenge has been to create a separation between hardware and software while advancing its position beyond the infrastructure realm.

In general, networking technology is one of the least advanced areas when compared with compute and storage infrastructure. As cloud and speed become the new mantra, the emphasis on networking becomes more important than ever.

As the industry moves to integrate infrastructure and developers, Cisco will need to make a similar shift. Their work on SDN with ACI, along with their thought-leadership pieces, is making significant inroads with enterprises.

Summing it all up

Each of the five is approaching the problem in its own way, with varying degrees of success. The bottom line is that each is making significant strides to remain relevant and support tomorrow’s enterprise. Equally important is how quickly they’re making the shift.

If you’re a startup, take note. No longer are these folks in your dust; they are also your potential exit strategy.

It will be interesting to watch how each evolves over the next 6-12 months. Yes, that is a very short timeframe, but echoes the speed in which the industry is evolving.

Cloud

Containers in the Enterprise

Containers are all the rage right now, but are they ready for enterprise consumption? It depends on whom you ask, but here’s my take. Enterprises should absolutely be considering container architectures as part of their strategy…but there are some considerations before heading down the path.

Container conferences

Talking with attendees at Docker’s DockerCon conference and Red Hat’s Summit this week, you hear a number of proponents and live enterprise users. For those not familiar with containers, the fundamental concept is a fully encapsulated environment that supports application services. Containers should not be confused with virtualization. Nor should they be confused with microservices, which can leverage containers but do not require them.

A quick rundown

Here are some quick points:

  • Ecosystem: I’ve written before about the importance of a new technology’s ecosystem here. In the case of containers, the ecosystem is rich and building quickly.
  • Architecture: Containers allow applications to be broken apart into smaller components. Each component can then spin up/down and scale as needed. Of course, automation and orchestration come into play.
  • Automation/ Orchestration: Unlike typical enterprise applications that are installed once and run 24×7, the best architectures for containers spin up/ down and scale as needed. Realistically, the only way to efficiently do this is with automation and orchestration.
  • Security: There is quite a bit of concern about container security. With potentially thousands or tens of thousands of containers running, a compromise could have significant consequences. If containers are architected to be ephemeral, however, the risk footprint shrinks dramatically.
  • DevOps: Container-based architectures can run without a DevOps approach, but only with limited success. DevOps brings a different methodology that works hand-in-hand with containers.
  • Management: There are concerns that the short lifespan of a container creates challenges for audit trails. Using traditional audit approaches, this would be true. Newer methods provide real-time audit capability.
  • Stability: The $64k question: Are containers stable enough for enterprise use? Absolutely! The reality is that legacy applications would not move directly to containers. Only applications that are significantly modified or re-written would leverage containers, and new applications are able to leverage containers without increasing risk.
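The spin-up/spin-down behavior described above comes down to a scaling decision driven by demand. Here is a toy sketch of that decision; the capacity figures and function are hypothetical, not any specific orchestrator’s API:

```python
import math

# Toy scaling decision in the spirit of container orchestration: given
# current demand, decide how many container replicas to run. Real
# orchestrators use richer signals than requests-per-container; this is
# a hypothetical illustration only.

def desired_replicas(requests_per_sec, capacity_per_container,
                     min_replicas=1, max_replicas=100):
    """Scale the container count up or down with demand, within bounds."""
    needed = math.ceil(requests_per_sec / capacity_per_container)
    return max(min_replicas, min(needed, max_replicas))

# Demand spikes: scale up. Demand falls: scale back down.
print(desired_replicas(900, capacity_per_container=100))  # 9
print(desired_replicas(20, capacity_per_container=100))   # 1
```

An automation loop would evaluate this continuously, which is why the bullet above argues that orchestration is the only realistic way to run containers efficiently.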

Cloud-First, Container-First

Companies are looking to move faster and faster. In order to do so, problems need to be reduced into smaller components. As those smaller components become microservices (vs. large monolithic applications), containers start to make sense.

Containers represent an elegant way to leverage smaller building blocks. Some have equated containers to the Lego building blocks of enterprise application architecture. The days of large, monolithic enterprise applications are past. Today’s applications may be complex in sum, but they are a culmination of much smaller building blocks. These smaller blocks provide the nimbleness and speed that enterprises are clamoring for today.

Containers are more than Technology

Beyond containers themselves, other components are needed for success. Containers represent the technology building block. Culture and process are needed to support the change in technology. DevOps provides the fluid that lubricates the integration of the three components.

Changing the perspective

As with other newer technologies coming along, other aspects of the IT organization must change too. Whether you are a CIO, IT leader, developer or part of an operations team, the very fundamentals of how we function must change in order to truly embrace and adopt these newer methodologies.

Containers are ready for the enterprise…if the other aspects are considered as well.

CIO · Cloud · Data

What to watch for at HP Discover this week

This week marks HP’s annual Discover conference in Las Vegas. HP has come a long way in the past couple of years and this year should prove interesting in a number of ways. Here is a list of items to watch in the coming couple of days:

Announcements: There are a couple of significant announcements planned this week. While the announcement itself is interesting, the long term impact should prove a more interesting opportunity for HP’s strategy post-split. Watch the keynotes for more details Tuesday and Wednesday.

Split Update: News about the HP split into two companies is not new. Look for more details on the progress of the split and what it means for each of the two entities. On the surface and through a number of ‘hallway conversations’ I’ve had, it seems that the split is bringing greater focus to the enterprise teams. This is good for HP and for customers.

Software: The HP Software team is a large and diverse bunch. The areas I’m particularly interested in are the progress around HAVEn, Vertica and Autonomy. Initial conversations point to some really interesting progress for customers. As Big Data, analytics and data in general become front-and-center for organizations, look for this area to explode. We have only scratched the surface, and many more opportunities abound. I’m looking at the ways HP is educating customers on the value opportunities in a way they can consume. While there are themes, we are moving to a ‘market of one’.

Cloud: The HP Helion Cloud has a number of things happening at the conference. I’m particularly interested in the progress they’ve made around commercial offerings of OpenStack and private cloud. Overall, cloud adoption is still very anemic (not just for HP). I’m looking for ways HP is creating onramps to cloud that reduce apprehension and increase adoption rates. Many of the challenges span beyond the technology itself. Look for ways HP is engaging customers in new and different ways. In addition, watch for changes in how the solutions are shifting from supporting enterprises directly to supporting service providers. Bridging the gap here is key, and the needs are very different.

Infrastructure: Many enterprise customers still maintain a large infrastructure presence. Even if their strategy is to shift toward a cloud-first methodology, there are reasons to support internal infrastructure. Look for ways HP is evolving its infrastructure offerings to support today’s enterprise along with its evolution to a cloud-first model. As the sophistication of data increases, so will the storage solutions needed to meet ever-changing requirements. Similarly, it will be interesting to watch how solutions like Software-Defined Networking (SDN) address networking complexity.

Wild Cards: There are a number of wild cards to watch for as well. The first is DevOps. DevOps is critical to today’s IT organization and moving forward. It applies differently to different orgs. Watch for the subject addressed in Keynotes. The second wild card is an update from HP Labs. HP Labs has a number of really interesting…and innovative solutions in the works. Look for an update on where things stand and how HP sees innovation changing.

Finally, I have a number of video interviews scheduled over the next couple of days in which I dive deeper into each of these areas. Plus, I will cover an update on the state of the CIO. Look for links to those either via the #HPDiscover hashtag or on the blog after the show.

As always, feel free to comment and join the conversation on Twitter. The hashtag to follow is: #HPDiscover