The International Data Corporation (IDC) has stated that the enterprise software market achieved $342bn in revenue in 2012, growth of 3.6 per cent on the previous year. While overall market growth is slowing – it is less than half the rate seen in 2010 and 2011 – evidence suggests that the increased adoption of big data and the cloud has contributed to the improved turnover of the last 12 months.
The findings were presented in the organisation’s ‘Worldwide Semi-annual Software Tracker’. It revealed that data access, analysis and delivery, collaborative applications, CRM applications, security software, and system and network management software represent the fastest-expanding areas of the market; each grew by six to seven per cent last year, about double the rate for enterprise software as a whole.
IDC has said that the larger investments in tools that enable organisations to manage, access and analyse their data indicate the prominent role big data now plays in this market, as greater emphasis is placed on uncovering value from vast data sets. Furthermore, the ever-increasing number of cloud deployments, which often underpin the way big data technologies are procured, has also become integral to the market’s success.
In terms of the success of software vendors, the top five based on revenue last year were Microsoft, IBM, Oracle, SAP and Symantec. SAP witnessed the largest growth at 5.1 per cent, followed by Oracle at 3.9 per cent.
Commenting on the research, Henry Morris, senior vice president for worldwide software, services and executive advisory research at IDC, said in a statement: “The global software market, comprised of a multi-layered collection of technologies and solutions, is growing more slowly in this period of economic uncertainty. Yet there is strong growth in selective areas.
“The management and leveraging of information for competitive advantage is driving growth in markets associated with big data and analytics.
“Similarly, rapid growth in cloud deployments is fuelling growth in application areas associated with social business and customer experience. The combination of these forces is advancing the growth to what IDC has termed the third platform.”
Recent years have seen considerable development in the cloud computing sphere. Big data is widely believed to be here to stay, and substantial investment is expected in this particular area. The trend is an exciting one, as new, better and more powerful infrastructure will be needed in support, so a great deal of further development is on the way to accommodate these computing paradigms.
Does agility matter?
Your company might have been in the business for many years, or it may be a newcomer to the field – just-in-time (JIT) deployment of services is critical to the success of both types of organization. While small business owners may sometimes feel that traditional IT gives them better cost control, they should also consider the agility the cloud brings to their businesses. Cloud computing, when adopted and executed properly, can help companies tap market opportunities to the best possible effect, thanks to the extended flexibility and agility it has to offer. Citrix’s recent acquisition of Cloud.com clearly demonstrates the growing interest in cloud computing technology, and in the networking space many emerging SDN players are likewise touted as acquisition targets for major companies keeping an eye on developments in the sphere.
The most important standards in the cloud
There are a number of important standards that need to be provisioned into the cloud computing sphere, one of which is the use of a standard API. With projects like CloudStack and OpenStack, a lot of progress is being made in this regard. However, there is still work to be done on how the abstraction of the various layers is defined, especially the security and networking layers. The work being done with OpenStack is quite commendable, and it could serve as a guiding platform for SDN to become aligned with this vision of abstraction in networking and security.
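The value of a standard API is that application code can target one abstract interface while the provider underneath is swapped freely. The sketch below illustrates the idea in Python; the class and method names (`CloudProvider`, `create_instance`, `open_port`) are hypothetical and do not correspond to any real CloudStack or OpenStack API.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Hypothetical provider-agnostic interface, not a real vendor SDK."""

    @abstractmethod
    def create_instance(self, name: str, flavor: str) -> str:
        """Provision a compute instance and return its identifier."""

    @abstractmethod
    def open_port(self, instance_id: str, port: int) -> None:
        """Adjust the security/networking layer for the instance."""

class FakeProvider(CloudProvider):
    """Toy in-memory stand-in used only to illustrate the abstraction."""

    def __init__(self):
        self.instances = {}

    def create_instance(self, name, flavor):
        instance_id = f"vm-{len(self.instances) + 1}"
        self.instances[instance_id] = {"name": name, "flavor": flavor, "ports": []}
        return instance_id

    def open_port(self, instance_id, port):
        self.instances[instance_id]["ports"].append(port)

def deploy_web_server(cloud: CloudProvider) -> str:
    # Application code depends only on the abstract API, not on any one vendor.
    vm = cloud.create_instance("web-1", flavor="small")
    cloud.open_port(vm, 443)
    return vm
```

Because `deploy_web_server` only sees the abstract interface, a different provider class could be substituted without changing the deployment logic – which is exactly the portability a standard API is meant to buy.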
Big data – the buzz
Big data is not new. It has existed for ages and can be traced back even to the earliest years of computing. However, it is worth considering why there is an increased buzz around it now. The answer is quite simple: significant advances in x86 hardware have helped bring computing power to the masses, and with new technologies such as vPlane, cloud computing has extended this power further. Users now have extended reach while still being able to control costs effectively. Cloud computing platforms have achieved performance standards that were previously only possible using purpose-built ASICs.
The 3Vs were discussed in Doug Laney’s 2001 paper, ‘3D Data Management: Controlling Data Volume, Velocity and Variety’. The paper predicted trends in data warehousing techniques that evolved from 2001 to 2006, particularly in e-commerce-based big data.
3Vs is a data management model that was conceived to help organizations recognize and cope with the emergence of big data. The 3Vs characterize the storage, utilization and consumption of data along these three base dimensions, and they encompass all data forms, regardless of storage location or format, that are eventually compiled into a big data repository.
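The three dimensions can be made concrete with a small sketch. The thresholds and the `DataSource` fields below are purely illustrative assumptions, not figures from Laney’s paper: the point is only that a workload can qualify as "big data" by exceeding any one of the three dimensions.

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    volume_tb: float      # Volume: how much data accumulates
    velocity_mb_s: float  # Velocity: how fast new data arrives
    variety: int          # Variety: number of distinct formats/schemas

def is_big_data(src: DataSource,
                volume_tb: float = 100.0,
                velocity_mb_s: float = 50.0,
                variety: int = 3) -> bool:
    """Illustrative heuristic: exceeding any one dimension qualifies."""
    return (src.volume_tb > volume_tb
            or src.velocity_mb_s > velocity_mb_s
            or src.variety > variety)
```

A high-velocity clickstream and a tiny payroll database would land on opposite sides of such a heuristic, even before volume is considered.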
There is no clear-cut verdict in this domain. Cloud computing has been a boon to network infrastructures, supporting their ever-expanding needs while keeping costs in check. The agility it brings is undeniable, but we have not yet reached the time when end-to-end cloud computing operating systems are a common phenomenon and fewer articles stressing its importance need to be written – perhaps like this one!
Big data’s impact will influence cloud computing in 2013
Cloud computing and big data were two of the most important trends to shape the IT industry in 2012. Looking ahead, both will continue to impact the landscape in a number of ways. According to a recent Red Hat report, many businesses implemented cloud-based environments last year as a way to manage the influx of structured and unstructured data.
The report suggested that corporate storage will evolve in 2013 into a “data platform,” rather than a “data destination.”
“As a platform for big data and not just a destination for data storage, enterprise storage solutions will need to deliver cost-effective scale and capacity; eliminate data migration and incorporate the ability to grow without bound; bridge legacy storage silos; ensure global accessibility of data; and protect and maintain the availability of data,” the report explained.
Cloud/big data relationship goes deeper
There is a great deal of industry research that continues to point to the growing importance of both cloud computing and big data, and in many cases each is impacting the other. A recent MarketsandMarkets report indicated that the global public and private cloud storage market will expand at a compound annual growth rate of 40.2 percent between 2012 and 2018, approaching $47 billion.
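To see what a 40.2 percent CAGR over six years implies, the arithmetic can be run backwards from the reported figure. The 2012 base value below is back-calculated for illustration only; the report itself is not quoted as stating it.

```python
def cagr_project(start_value: float, rate: float, years: int) -> float:
    """Project a value forward at a compound annual growth rate."""
    return start_value * (1 + rate) ** years

# Back-calculate the 2012 base implied by ~$47bn in 2018 at 40.2% CAGR
# (illustrative arithmetic only, not a figure from the report).
implied_2012 = 47e9 / (1 + 0.402) ** 6

# Compounding that base forward for six years recovers the 2018 figure.
projected_2018 = cagr_project(implied_2012, 0.402, 6)
```

The striking part is how small the implied starting point is: at 40.2 percent compounded, the market grows roughly 7.6-fold in six years, so a base in the single-digit billions is enough to approach $47 billion by 2018.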
The research firm explained that the cloud storage market will reach this level as businesses respond to digital trends that have increased the volume at which unstructured data is generated. The cloud, which is not only cost-effective but also scalable, is an ideal technology to help companies address this need.
US firms using cloud for storage
The influx of big data has reached an important point that has made it necessary for companies to migrate information to hosted environments. A recent Redwood Software survey found that only 35 percent of U.K. firms are using the cloud for private data storage, compared to nearly 60 percent of U.S. organizations.
Looking ahead to the global cloud storage industry, U.S. companies will have a major role in the growing market. MarketsandMarkets predicted that North America will account for nearly $22 billion worth of the global cloud storage landscape by 2018.
There are a few constants expected to remain throughout the entire IT industry moving forward. Companies are producing more data than ever, and businesses must ensure they do not become overwhelmed when storing this information.
Distributed, or grid, computing has been powering many big data projects, from gene-cracking to the search for extraterrestrial intelligence. The idea is simple: take huge amounts of data that need to be analyzed, break them into small chunks, and let individual computers tackle those chunks. It is the basis of the popular Hadoop platform for big data. Hadoop is designed for server clusters, and companies like Amazon have huge numbers of servers that can tackle any given problem; hence Amazon Elastic Compute Cloud (Amazon EC2) is a prime way for companies to tackle big data and cloud computing. And while there is little doubt that servers handle such massive data throughput better than any smartphone, let’s face it – Amazon does not have a billion (and growing) servers either; the cost of such expansion would be enormous. Consumers, however, continue to purchase additional phones and constantly upgrade their hardware.
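The chunk-and-combine idea that Hadoop popularized can be sketched with the classic word-count example. This is a toy single-process illustration in Python, with threads standing in for cluster nodes; it is not Hadoop or MapReduce code.

```python
from collections import Counter
from multiprocessing.pool import ThreadPool

def count_words(chunk: str) -> Counter:
    """'Map' step: each worker counts words in its own chunk."""
    return Counter(chunk.split())

def merge_counts(partials) -> Counter:
    """'Reduce' step: combine the per-chunk partial results."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

def distributed_word_count(text: str, n_chunks: int = 4) -> Counter:
    # Split the big input into small chunks along line boundaries...
    lines = text.splitlines()
    chunks = ["\n".join(lines[i::n_chunks]) for i in range(n_chunks)]
    # ...let each worker tackle its chunk independently...
    with ThreadPool(n_chunks) as pool:
        partials = pool.map(count_words, chunks)
    # ...then merge the partial results into one answer.
    return merge_counts(partials)
```

Because each chunk is processed without reference to the others, the same pattern scales from threads on one machine to thousands of cluster nodes – which is precisely what makes it attractive for big data.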
There are already grid computing models that take advantage of the fact that a desktop computer may remain on, idle and connected for extended periods of time. The popular BOINC platform is one such example: download the client software and choose a project, and when the computer is “idle” it is actually working – analyzing genes, astronomical data, or whatever project you have volunteered your computer’s cycles to.
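The cycle such a client runs – wait for idle time, fetch a work unit, compute, report back – can be sketched as below. This is a simplified stand-in, not the actual BOINC client protocol: the idle check, the in-memory queue, and the sum-of-squares “analysis” are all illustrative assumptions.

```python
def is_idle(cpu_load: float, threshold: float = 0.1) -> bool:
    """Stand-in idle check; a real client would query the OS scheduler."""
    return cpu_load < threshold

def fetch_work_unit(server_queue: list):
    """Pull the next chunk of data from the (simulated) project server."""
    return server_queue.pop(0) if server_queue else None

def process_work_unit(numbers) -> int:
    """Toy analysis standing in for gene or signal processing."""
    return sum(n * n for n in numbers)

def volunteer_loop(server_queue: list, cpu_load: float = 0.05) -> list:
    """One pass of the BOINC-style cycle: fetch, compute, report."""
    results = []
    while is_idle(cpu_load):
        unit = fetch_work_unit(server_queue)
        if unit is None:
            break  # no more work available from the server
        results.append(process_work_unit(unit))  # "report" the result back
    return results
```

The key design point mirrors the article: the volunteer machine only donates cycles it was not using anyway, so the project gains compute capacity at essentially zero marginal cost to the owner.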