It’s an issue that has been nagging at me for quite a while. Who gets to decide what functionality belongs in a software-defined storage stack and what should belong to the array controller or the operating system? I found my discussion with Chandra Mukhayala, IBM’s Portfolio Marketing Manager for File and Object Storage, on this topic so interesting that I had to get it on video.
Here is the result. I can’t thank Chandra enough for taking time to share his thoughts with us. Very insightful commentary.
This interview was shot at IBM Interconnect 2016, where I was contracted to live tweet sessions and support other social media efforts. IBM did not edit this video in any way. Opinions are those of the interviewees, and hopefully — in this case — of Big Blue as well!
It isn’t every day that we score an interview with the General Manager of zSystems and LinuxONE, Ross Mauri. He is a very busy guy working to bridge the constituencies — AppDev and Ops — within the data center, while adapting IBM’s mainframe narrative to the brave new world of hybrid clouds.
So, when you get the chance to pull him into a video interview, you take it. The extraordinary thing about Mauri is how readily he switches his focus to the topic at hand and frames his ideas as though he has had ample time to prepare for questions that were not provided to him in advance. What also comes through is how personable the fellow is. You find yourself wanting to hear more of his thoughts whenever he talks.
We have divided Mauri’s interview into five short parts. Each one contains insights and wit that you may just find useful as you build your own hybrid cloud initiative. So, enjoy!
Ross begins by addressing the cultural gap between the mainframers and the appdev folks that exists in many data centers. He notes how IBM Interconnect 2016 is providing a forum for the two communities to interact…
The mainframe remains a fixture of the contemporary hybrid cloud datacenter. Mauri lists some of the mainframe attributes that keep Big Iron so relevant.
This year’s Interconnect covered Systems of Insight in greater depth than we have heard in prior events. A large ecosystem of vendors and technologies appears to be taking shape. We asked Mauri what the challenges were for IBM in wrangling so many players, technologies and APIs.
Finally, we asked what IBM sees itself as becoming in the hybrid data center. Hardware technology innovator? Essential software and services provider? Trusted advisor? Mauri gives his view.
For the record, these interviews were recorded at IBM Interconnect 2016. I was engaged by IBM to live tweet from some sessions, and the company picked up the expenses for my transportation and lodging, as well as the cost of the show ticket. This interview, the questions I posed, and the edits I made of the answers are completely my own.
Special thanks to Ross Mauri, Mary Hall, and all the gang at IBM’s Social Media Organization for making these interviews possible.
I have to admit, I am not very good at remembering brand names. I am even less good at remembering brand names when the vendor changes them. A year or two back, IBM decided to regroup and rebrand its storage offerings under the “Spectrum” umbrella. It made for pretty artwork, but it confused me…and probably many of Big Blue’s customers.
IBMers have a tendency to drop “Spectrum Accelerate” or “Spectrum Protect” or “Spectrum Virtualize” or “Spectrum XYZ” into their dialog from time to time, requiring trade press writers and bloggers to do a Google-quest to find out which products they are talking about. But, at IBM Interconnect 2016, Eric Herzog, Vice President of Marketing for IBM Storage Systems (and long-time friend, even in pre-Big Blue years), took the time to sit for a video interview and straighten me out, once and for all, on the whole Spectrum thing. Here he is, decked out in his colorful Hawaiian shirt!
I attended IBM Interconnect 2016 as a guest of IBM. For the record, they covered my attendee fee, room, board and transportation and also provided a stipend for live tweeting sessions I attended at the show. This video and its edited content are entirely my own.
Thank you, Eric, for doing the interview. And thanks for clarifying the Spectrum Storage family of products from IBM.
To all of the pundits, analysts and vendors of disk and flash who once again claim that tape is dead, killed this time by cloud, I want you all to take a deep breath. Hold it for a second. Release it. Repeat three or four times.
Then watch the video interview below with Shawn Brume and Ed Childers of IBM. These guys know something about tape and its continuing role — especially with respect to hybrid clouds. Give them a listen, and don’t just use the time to think up your nonsensical responses. The facts are the facts: we need a storage ecosystem if we are going to deliver even half of the promise of cloud service delivery models.
This video interview was shot at IBM Interconnect 2016, where IBM paid for my expenses as a guest blogger. I was also compensated for tweeting at sessions during the show, but this video interview, the questions asked and the edited final were all my own work. IBM was not involved with my video work.
Thanks to Shawn and Ed for giving a comprehensive overview of the continuing role of tape in hybrid cloud environments. You guys are the best!
Okay. So we are bringing this up again after about eight or nine years. Here is the base article behind today’s post. Apparently, Hollywood is looking into using DNA to store digital movie data. Cheap, capacious, durable data storage is the lure and at least one start-up is now striving to perfect the technology to […]
At IBM Interconnect 2016, I enjoyed picking up a conversation where I last left it with Kathryn Guarini, Vice President of Offering Management for z Systems and LinuxONE at IBM Systems. We first met when she took me on the grand tour of the z13 platform when it was released last year. This time, with […]
I am really happy about a trend I am seeing in start-ups and established players talking about the need for analytics to support the management, availability and delivery of resources like storage. I plan to write an Infrastruggle column about it at Virtualization Review. Please stand by. In preparation, you might want to take a […]
Okay, so I admit it. I really like what IBM is doing with Linux. At a macro level, it is a really smart strategy for refreshing the IBM brand and associating it with the next generation of computer geeks coming through the ranks, while maintaining the strengths of IBM technology and its cadre of dedicated […]
I have said many times here and in other articles and columns that there is really very little “new” in the concept of cloud. I have also quipped to anyone who would listen that mainframes are now and always have been clouds “in a box.” I didn’t need to attend IBM Interconnect 2016 to have […]
Interesting, that. A recent IDG research study, sponsored by DataLink, surveyed a few hundred of the world’s biggest companies. One finding is that disaster recovery/business continuity planning ranks #2 on everyone’s list of key projects underway. I hope that’s true; I had been getting a bit perturbed by the BS claims that High Availability trumps Disaster […]