TECHNOLOGY
Converting software products into cloud services
General purpose business computers first became available in the 1960s, and with them the evolution of business computing began. Because they were so large and so expensive, it was impractical for small and medium businesses to own one. However, they could buy “time” on someone else’s computer, and this led to the first timeshare operating systems – which allowed many users to run their programs on one computer, apparently simultaneously and without interfering with each other. Of course, these timeshare users had to get their data in and out of the computer. This was best accomplished remotely, using telephone circuits and modems that convert digital data into a form that can be transmitted over a network designed for speech.
In the 1970s and 1980s, the cost of computers fell, allowing even small businesses to own them – indeed, not one per company but one per employee. The personal computer (PC) had arrived. This created the question of how to distribute the software that made these PCs useful – spreadsheets, word processors, accounting systems and so on. In an age when telephone networks were still painfully slow and unreliable, CDs and DVDs were the distribution media of choice. Every time the software was updated – to improve functionality or to fix a bug – the user had to get a new CD or DVD and apply the update without destroying their own data. And if the update didn’t work as intended, the user had to roll back to the previous version. As a result, many users preferred to stay on old versions for years rather than risk a failed update. Add to that the fact that CDs and DVDs could be copied, and software manufacturers bolted on so many onerous licensing mechanisms that it was often impossible to use software you legitimately owned.
By 2000, the practical realization of a global network connecting every type of digital device had expanded from academia and the military into the general business world to become what we now call the internet. With the internet, it made practical sense to come full circle and return to the 1960s idea of sharing large computers. True, in 2000 most people had computers on their desks, but big computers had become even more powerful, and there was a whole host of applications that simply would not run on a PC. The most obvious examples are artificial intelligence applications such as speech recognition, but it turned out that even far less demanding applications, such as business accounting, worked much better with many users sharing one large computer than with each user working on their own small machine.
Nevertheless, it certainly made sense to use a PC as a glorified terminal connecting to a large remote computer running powerful software. But how does a non-specialist user run a remote application from a PC? Again, a technology developed in academia and government solved the problem: the World Wide Web. In essence, the web is simply a set of rules that programmers follow so that their programs can talk to each other. The internet covers the rules for moving the data; the web covers the rules for presenting that data to a human.
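To make that division concrete, here is a minimal sketch in Python (standard library only) showing the two sets of rules at work. It fetches the front page of example.com, a site reserved for exactly this kind of demonstration:

    # The internet layer (TCP) moves raw bytes between machines; the web
    # layer (HTTP) defines the plain-text rules for requesting a page and
    # structuring the reply. example.com is a real demonstration site.
    import socket

    HOST = "example.com"

    with socket.create_connection((HOST, 80)) as sock:
        # An HTTP request is nothing more than structured plain text.
        request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode("ascii"))

        # The reply is plain text too: a status line, headers, then the
        # HTML that a browser would render for the human reader.
        reply = b""
        while chunk := sock.recv(4096):
            reply += chunk

    print(reply.decode("utf-8", errors="replace")[:300])

Everything a browser does ultimately rests on exchanges like this one.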
However, the problem remained that the internet was still not fast or reliable everywhere. Fast, reliable access had to become the norm for the average home user before the last box was ticked for the growth of cloud computing – “cloud” being the buzzword for the collection of all computers connected to the internet. Although it didn’t happen overnight, by 2010 we had reached the tipping point at which software developers began to favour cloud applications – applications that run on remote computers accessible over the internet – over native applications, which run on PCs in local networks.
Almost inevitably, we keep cycling back to old technologies. Just as we have returned to timesharing, we are slowly becoming more independent of the World Wide Web and the application that depends entirely on it, the web browser (e.g. Internet Explorer, Chrome, Safari and so on). As good as browsers are at hiding the differences between types of device – Macintosh, Unix, Windows or phones – they impose limitations that do not exist for native applications. So the trend these days is to replace web applications with native applications that still use the internet to communicate with the remote host computer. Examples are YouTube, Office 365 and WhatsApp – all cloud applications, yet accessible either through a web browser or through a native application. But it is only a trend: dropping support for web applications entirely would mean losing the freedom to run a cloud service from any device, anywhere in the world.
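To illustrate the difference, here is a hedged sketch of how a native client might talk to the same cloud service a browser reaches as a web page. The endpoint and response shape below are hypothetical placeholders, not any real product’s API:

    # A minimal sketch of a native client talking to a cloud service.
    # The URL and the JSON layout are hypothetical placeholders, not a
    # real service's API; a real native app would use its vendor's API.
    import json
    import urllib.request

    API_URL = "https://api.example.com/v1/messages"  # hypothetical endpoint

    with urllib.request.urlopen(API_URL) as response:
        # Where a browser would receive HTML styled for a human, the
        # native client receives structured JSON and renders it itself.
        messages = json.load(response)

    for msg in messages:
        print(msg)

The point of the design is that the heavy lifting still happens on the remote host; only the presentation moves into the native application.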
So, with that background, here are some of the key factors behind cloud computing…
Reliable high-speed internet – typically average download speeds above 10 Mbps with 99% uptime.

Generic service hosting – obviously, cloud computing would not be possible without the cloud applications themselves, and all major software manufacturers now have a significant offering of cloud services.

The revolution in distributed system software design – writing software that communicates with someone else’s software is not an easy task, and making it work reliably has required significant advances in software design. Open source software has removed the huge barrier created by proprietary (secret) development models: most developers now look for existing software before developing their own, often building in months applications that would otherwise require tens of man-years of effort.

The rental model – most cloud services make money through a monthly fee rather than a one-off upfront payment. This benefits the user through no large capital outlay, no support contracts and the freedom to stop paying when the software is no longer needed.
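As a toy illustration of that last point, the following sketch compares a one-off licence against a monthly rental. Every figure is hypothetical; only the shape of the comparison matters:

    # Toy comparison of a one-off licence plus support against a monthly
    # rental. All prices are assumptions chosen purely for illustration.
    UPFRONT_LICENCE = 1200.0  # one-off purchase price (assumed)
    ANNUAL_SUPPORT = 240.0    # yearly support contract on the owned copy (assumed)
    MONTHLY_RENTAL = 30.0     # cloud subscription per month (assumed)

    for year in range(1, 6):
        owned = UPFRONT_LICENCE + ANNUAL_SUPPORT * year
        rented = MONTHLY_RENTAL * 12 * year
        print(f"Year {year}: owned {owned:,.0f} vs rented {rented:,.0f}")

On these assumed figures the subscription stays cheaper for years and, unlike the licence, the outlay stops the moment the software is no longer needed.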
The main benefits…
Continuous upgrades – in the cloud model, software is updated and upgraded continuously, so the user gains immediate benefits with minimal effort.

Concentration of computing power – as mentioned, most applications run more efficiently on one large computer than on many small ones. This is partly because the big computer is inherently more efficient, but also because sharing it reduces idle time (that is, the time most computers spend doing nothing); a toy calculation follows this list.

Significantly lower costs – no time spent purchasing, servicing, maintaining, updating and securing physical hardware and native applications.

Work from anywhere – cloud-based applications not only allow users to work on any computer in any place; they also mean that, in the event of a disaster, data is safe and business interruption is minimized.
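Here is the toy idle-time calculation promised above; the utilization figures are assumptions chosen purely for illustration:

    # Toy illustration of why sharing one large machine cuts idle time.
    # All figures are assumptions picked purely for illustration.
    USERS = 100
    PC_UTILIZATION = 0.05  # a desktop doing real work 5% of the day (assumed)

    demand = USERS * PC_UTILIZATION  # compute actually consumed, in "PC units"
    fleet_capacity = USERS * 1.0     # capacity bought when everyone owns a PC
    server_capacity = 2 * demand     # shared server sized with 2x headroom (assumed)

    print(f"PC fleet utilization:      {demand / fleet_capacity:.0%}")   # 5%
    print(f"Shared server utilization: {demand / server_capacity:.0%}")  # 50%

The same amount of work gets done either way; the shared machine simply spends far less of its life doing nothing.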
The main risks…
Dependence on the internet – cloud services imply a high dependence on the availability of the internet. While some allow “offline” working, this isn’t always practical.

Security breaches – with all a user’s data stored in the cloud, it presents an ever-present temptation to hackers and competitors in a way that would not be possible if the data were stored onsite. Data can be destroyed, stolen, sold, altered, ransomed and so on.

Privacy violations – leaking employees’ confidential personal data can be just as damaging as leaking corporate data.

Data latency – no matter how fast internet access gets, it’s worth remembering that nothing moves faster than light, and when you access data on a cloud server on the other side of the world there can be noticeable delays; a back-of-envelope calculation follows this list.

The price of “free” services – many cloud services are available for free. As we all know, nothing is ever truly free, and the price you pay is sharing your data with the service provider. Read the terms and conditions and you may be shocked at what you have agreed to. And if you don’t pay, you don’t have many rights.
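To put a number on the data-latency point: signals in optical fibre travel at roughly 200,000 km/s, about two-thirds of the speed of light in a vacuum, so distance alone sets a floor on response time. A back-of-envelope sketch, assuming a server 20,000 km away:

    # Back-of-envelope floor on round-trip delay imposed by physics.
    # Light in fibre travels at roughly 200,000 km/s (about two-thirds
    # of its vacuum speed); the distance used here is an assumption.
    SPEED_IN_FIBRE_KM_S = 200_000
    DISTANCE_KM = 20_000  # roughly a server on the far side of the world

    round_trip_ms = 2 * DISTANCE_KM / SPEED_IN_FIBRE_KM_S * 1000
    print(f"Best-case round trip: {round_trip_ms:.0f} ms")  # ~200 ms

Real networks add routing, queuing and processing delays on top of this physical minimum.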
In short, cloud applications are a huge advance over the early native application model, although the division of work between the local client and the remote server is constantly shifting. Cloud-based applications win out over local network-based applications in almost every respect, but it is important to remember that there are some risks.
Dr John Yardley, Managing Director, JPY Limited and Threads Software Ltd
John began his career as a researcher in computer science and electronic engineering at the National Physical Laboratory (NPL), where he earned a PhD in speech recognition. In early 2019, John founded Threads Software Ltd as a spin-off from his company JPY Ltd to commercialize and operate the Threads Intelligent Message Hub, originally developed by JPY Ltd. Today, JPY represents the manufacturers of more than 30 software products distributed through a channel of 100 specialized resellers. Threads makes data easy to search and retrieve by uniquely collecting, aggregating and indexing all of a business’s required digital communications – both written and spoken.
John brings an in-depth understanding of a wide variety of technologies underpinning the software industry. He holds a PhD in Electrical Engineering from the University of Essex and a BSc in Computer Science from City University, London.
