When I went to my first OpenStack summit, in Paris in 2014, it was a huge thing. The vision of an open-source private cloud was out of the bottle and the hype exploded. But in the two years since, in the market where I work, I haven’t seen much adoption. So this year I went to the summit in Barcelona with low to moderate expectations. Luckily, I was wrong, because OpenStack is still a huge thing. Here are just a few reasons why.
First of all, numerous improvements make OpenStack more robust and easier to maintain and upgrade. Auto-remediation, for example, will automatically add more hypervisors or evacuate VMs in case of hardware failure, resolve RabbitMQ problems, clean up log files, and so on. With the Newton release, you can upgrade the cloud without taking it down. Another interesting feature is that you can now create pools of external IP addresses, or create a compute node without an IP address for later addition.
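The evacuate-on-failure part of auto-remediation can be illustrated with a small sketch. This is not actual OpenStack tooling (real deployments drive the Nova API from a monitoring pipeline); the `Hypervisor` class and `remediate` function below are hypothetical, showing only the decision logic:

```python
from dataclasses import dataclass, field

@dataclass
class Hypervisor:
    # Hypothetical model of a compute node, for illustration only
    name: str
    healthy: bool
    vms: list = field(default_factory=list)

def remediate(hypervisors):
    """Evacuate VMs from failed hypervisors to the least-loaded healthy one."""
    healthy = [h for h in hypervisors if h.healthy]
    actions = []
    if not healthy:
        return actions  # nowhere to evacuate to
    for failed in (h for h in hypervisors if not h.healthy):
        while failed.vms:
            vm = failed.vms.pop()
            # Pick the healthy node currently hosting the fewest VMs
            target = min(healthy, key=lambda h: len(h.vms))
            target.vms.append(vm)
            actions.append(f"evacuated {vm}: {failed.name} -> {target.name}")
    return actions
```

In a real cloud, each `actions` entry would correspond to a live-migrate or evacuate call against the compute API, triggered automatically when monitoring flags a node as down.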
Next, a whole set of new projects. For example, Murano, which facilitates application deployment: developers can package and publish their applications in a catalog and deploy them at the push of a button. Or Sahara, for automated deployment of Hadoop clusters for big-data analytics.
I recently had the opportunity to learn a lot about SoftLayer and get certified in SoftLayer Solution Design, so I would like to share my insights about some features that may prove very useful in the post-digital world.
First of all, I should say that I work at IBM, and SoftLayer is an IBM company. The SoftLayer cloud offering is, together with the cognitive computing and Watson division, one of Big Blue’s strategic imperatives. But I’m not directly attached to the Cloud Unit, so this certification was a challenge. IBM is known for its rigor in professional certifications: getting certified takes a lot of learning, practical exercises and experience, considerable time to prepare your certification package, and several levels of technical and peer reviews. All this hard work and learning has to fit into your regular work schedule, so in the end, being certified at IBM is something IBMers are usually very proud of.
From April 11 to April 13, I attended the Annual Investment Meeting (AIM) in Dubai, one of the world’s most important forums on the promotion of Foreign Direct Investment (FDI).
I had the honor of moderating a session on outsourcing at this conference, whose objective was to advise governments on how to capture opportunities in the dynamic, global outsourcing market. Here is a brief account of the session.
My panelists were renowned experts and global industry thought leaders:
- Christina Knutsson, Director at GDP Global
- Jon Becker, Legal Counsel at Clearly Legal Ltd
- Michael Ansah, Vice President of Dell
- Roland Jabbour, Chairman at Jabbour Holding Group
- Zhengping Yu, Executive Director of IBM China GTS GCG
Outsourcing was selected as one of the hot topics at AIM because more and more money is being channeled into alternative forms of investment, such as mergers and acquisitions, licensing, subcontracting and outsourcing.
Foreign Direct Investments
Several thousand participants, ministers from more than 100 countries, government officials, industry experts and consulting companies convened for a large number of sessions and country presentations. Here are just a few figures to illustrate the current state of foreign direct investment in the world:
The year 2016, marked by Google DeepMind’s victory at the strategy game Go, is definitely the year of artificial intelligence. Cognitive computing, machine learning, neural networks, natural language processing: these new concepts offer tremendous opportunities for bringing intelligence and judgment capabilities to our everyday technical and business applications.
In January, we at IBM gave a presentation to the FinTech community in Geneva on cognitive computing and how to use the Watson APIs. This article summarizes what cognitive computing is and how to put it into practice easily.
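To give a taste of how approachable this is, here is a minimal sketch of assembling a request for a Watson-style text-analysis service. The payload shape, endpoint and credentials below are illustrative placeholders, not the exact official API; consult the Watson service documentation for the real fields and URL of your service version:

```python
import json

def build_analyze_request(text, features=("sentiment", "keywords")):
    """Assemble a JSON body for a Watson-style text-analysis call.
    The field names here are illustrative placeholders."""
    return {"text": text, "features": {name: {} for name in features}}

body = build_analyze_request("Cognitive computing brings judgment to business applications.")
print(json.dumps(body, indent=2))

# Sending it would look roughly like this (URL and key are placeholders):
# import requests
# resp = requests.post("https://<service-url>/v1/analyze",
#                      auth=("apikey", "<your-api-key>"), json=body)
```

The point is that the heavy lifting (language models, training, infrastructure) lives behind the API; the application side reduces to building a request and interpreting a JSON response.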
Geneva, January 11, 2016: Sasha Lazarevic, Alexandre Gaillard (Swiss Fintech Leader, InvestGlass CEO), and Pierre Kauffmann (IBM Cognitive Solutions Architect)
But first of all, I feel I need to clarify some concepts:
The Future of Project Management in the Digital Age
I am tempted to conclude that project management is in decline. There is saturation in the number of project managers, in the activities called projects, in the number of projects managed by professional project managers, and in many other metrics used to measure the effectiveness of the project management discipline. It seems that this practice, as designed and used more than 15 years ago, needs to redefine itself to produce strategic advantage in the era of digital transformation.
If so, what other methods and concepts should companies use to execute work efficiently? And what should project managers do?
To answer these questions, we first need to remember how project management achieved such high visibility and popularity over the last 15 years. Recall that this was the time of big investments in IT infrastructure and in-house-developed IT applications. These investments required very strict scope management, time scheduling, resource deployment and cost management skills to make sure company resources were used wisely.
But nowadays data is moving to cloud platforms, and business value is created by using readily available APIs from an external ecosystem to run business and application experiments. And it seems we don’t need project managers for that.
Let me then outline some advice on how to navigate through this new digital world without traditional project management.
User Experience for Project Managers, or UX-PM, is a new aspect of project management that emphasizes the effective management of a specific kind of new project in the domain of digital transformation, particularly those geared toward the end customer’s user experience. Project managers who work on such projects need to master a very specific methodology and understand the various tools and techniques that facilitate communication around UX.
User Experience (UX) is defined as the collection of a person’s emotions and attitudes about using a particular product, system or service. As such, it is a critical success factor for web and digital projects, since human-computer interaction is increasingly at the center of all modern IT activities.
Everybody is talking about sustainability, but we don’t see clear thought leadership here. I want to share some thoughts on how a business organization could be built around thought leadership in sustainability, with the very noble mission of propelling the whole movement forward.
October 27, 2014
The owner of Alibaba (阿里巴巴) is known in the West as Jack Ma, but his real name is Ma Yun (马云), which translates literally as Cloud Ma. Considering that the word “cloud” appears one hundred times in Alibaba’s IPO prospectus, this anecdote about Ma Yun’s name shows very clearly that the world’s biggest e-commerce company, based in Hangzhou, sees itself as a major cloud services provider.
For those who would like to know more about this world-class business, I will analyze the structure of Alibaba Group and then focus on its IT operations. The story of Alibaba is a blueprint for successful cloud and e-commerce companies.
Alibaba’s main source of revenue is a group of e-commerce businesses:
My ideas on how IT Outsourcing should be organized.
September 7, 2014
Storage virtualization is not a new concept, but in light of disaster recovery requirements, cloud computing and big data, it is still a hot topic. Storage virtualization lets your applications see data stored on a logical volume, while the configuration and location of the physical storage resources remain unknown to you and completely irrelevant. This allows data to be moved transparently from one location to another, uninterrupted service delivery during infrastructure migrations, and increased data protection during disasters.
A storage virtualization controller is the device that actually does all the work: mapping the underlying disk systems to logical volumes, replicating data to your secondary data center, providing data snapshots for database backups, and deciding when frequently accessed data should move to faster flash-based SSD disks and moving it automatically. It is a very sophisticated piece of equipment that has to perform flawlessly, since all applications and business processes depend on the underlying data.
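The core mapping trick can be sketched in a few lines. The class below is a toy model, not any vendor’s implementation: it shows how the controller can move data between physical arrays while the logical address that applications use stays fixed.

```python
class VirtualizationController:
    """Toy logical-to-physical extent map (illustration, not a real product)."""

    def __init__(self):
        # (volume, logical_extent) -> (physical_array, physical_extent)
        self._map = {}

    def provision(self, volume, logical_extent, array, physical_extent):
        """Back a logical extent with a physical location."""
        self._map[(volume, logical_extent)] = (array, physical_extent)

    def migrate(self, volume, logical_extent, new_array, new_extent):
        # Data moves to new hardware; the logical address never changes,
        # so applications are unaware that a migration happened.
        self._map[(volume, logical_extent)] = (new_array, new_extent)

    def resolve(self, volume, logical_extent):
        """Translate the address an application uses into a physical location."""
        return self._map[(volume, logical_extent)]

ctrl = VirtualizationController()
ctrl.provision("vol1", 0, "spinning-array-A", 42)
ctrl.migrate("vol1", 0, "ssd-array-B", 7)   # e.g. hot data promoted to flash
print(ctrl.resolve("vol1", 0))
```

Everything the controller does, from replication to automatic tiering, rests on owning this indirection layer: as long as `resolve` keeps answering for the same logical address, the physical side can change freely.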