This article is about something that could be boring for most of my usual readers.
But I promise that even if you skip the most technical sections (in gray), you will still be able to get to the main issue: using “cloud computing” for business-critical applications without giving away your IPR.
Back to the future
My first experience with a “virtual” disk on a customer site, aside from mainframe projects, was, I think, in the late 1980s.
It was for a decision support customer- I had been told the previous day that I had to deliver a kind of training/pre-sales session… the following morning.
So, the usual frantic hours (in pre-laser times) spent copying, cutting, pasting, and assembling professional-looking material out of thin air (yes, 18 hours weren’t enough :D)
The customer had a Digital Microvax 3100 (probably).
A peculiar characteristic was that, to ensure that users saved their data (yes, it was pre-commercial Internet and pre-USB data keys), no PC had a real hard-disk: they all used a “virtual” space, basically a file, on the Digital machine.
At the time, my “portable” computer was either a luggable “brick” Compaq, or the new fancy Toshiba with plasma screen (about 9kg!) that I had been given…
…so that I could extend my 18-hour days into the week-end; the company offered to pay for my fixed line (no GSM, then), but I turned down the offer- as the company policy was that there was a list of people whose calls you could not turn down or divert 😀
Transmitting data at the time meant using some data transfer systems and floppy disks (the 5.25″ version); modem/fax cards at the time in Italy required paying a tax, as you were formally operating a kind of post office.
Whenever I had to build a model and receive data, there was a physical transfer- and, considering that I prepared models mainly for number-crunching in controlling, sales reporting, and balance sheet-level data, sometimes I had to sign off for the data.
Quite appropriate, and, in my view, something that, with modern computing resources, should be constantly enforced, e.g. by having USB keys that, beside being password-protected, use a public key/private key system to ensure that they work only with authorized peripherals.
On mainframes, the security system I found most often was RACF, and each data extraction, upload, or retrieval was generally managed on the mainframe side (or on the IBM S/3x; almost nobody used UNIX in a business environment, at the time).
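That device/host pairing idea can be sketched as a challenge-response handshake. The article suggests a public/private key pair; to keep this a self-contained standard-library sketch, I use an HMAC shared secret as a stand-in- the names and flow below are hypothetical, not any real USB security product.

```python
import hashlib
import hmac
import secrets

# Hypothetical shared secret provisioned on both the USB key and the
# authorized workstation (a real design would use asymmetric keys, so
# the host never stores the device's signing secret).
DEVICE_SECRET = secrets.token_bytes(32)

def host_challenge() -> bytes:
    """The workstation issues a random nonce to the plugged-in device."""
    return secrets.token_bytes(16)

def device_response(secret: bytes, challenge: bytes) -> bytes:
    """The USB key signs the nonce; only a device holding the secret can."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def host_verify(secret: bytes, challenge: bytes, response: bytes) -> bool:
    """The workstation unlocks the drive only if the response checks out."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = host_challenge()
assert host_verify(DEVICE_SECRET, nonce, device_response(DEVICE_SECRET, nonce))
```

An unauthorized workstation, not knowing the secret, would fail the verification- which is exactly the “only with authorized peripherals” property described above.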
Who owns the data?
If you read my narrative again, you will notice something: like most security systems, it assumed that somebody had ownership of the data, and could grant access.
And, once access was granted, each access might or might not be logged (depending on the data content and/or the level of paranoid inclination of those involved)- but you were within your own perimeter, entitled to use the data.
Fast-forward to the period between late-1990s and late-2000s.
The first startups that I supported on creating online data shared repositories did not have confidential information stored online: just data directories.
But in the late 1990s it became more common in another field (decision support systems, specifically business intelligence) to have a web-based distribution component: and a first potential issue became visible.
Beside pre-defined reports, to enable end-users to do their own analysis without waiting for the IT department, “templates” were prepared.
What happened? You could end up with a final user getting a “data mart” on his PC, a kind of “information mini market”- without any confidence that the user would manage the data with the appropriate care.
Certainly you read about the number of USB keys with tons of data lost every year, so this is not a surprise (I have a 32GB USB key: fifteen years ago, that would have been enough to store a data warehouse).
At the same time, empowering the business users to do analysis was coupled with the emergence of the “niche” Software as a Service (SaaS) suppliers.
Historical digression: from time-sharing to cloud computing
About 50 years ago, “time sharing” in computing meant using facilities from a supplier, without the need to invest in hardware.
The difference between, say, late 1970s and late 1990s?
In the late 1970s, you still needed to be an expert to be able to use those resources: control, management, security, etc. were all managed with the same level of confidentiality and security customary on internal computing resources.
In the late 1990s, the selling point of the “niche” suppliers was that their cost was usually within the budget confines for “end-user computing”, allowing the business users to access directly external services.
Anyway, except in a few cases where these external suppliers were formally billing as “consulting/service suppliers”, the use of any “niche” supplier generally required a formal validation- therefore, security was maintained.
I have already posted a few articles on data security online when using a hosted service- you can search this blog for more details.
Often, when talking with vendors, I was puzzled by the quantity and content of the information that, once the channel was opened, was constantly transferred and stored outside.
Moreover: to avoid having to ask the IT department for help and wait, business users sometimes made “wholesale requests”- including data that wasn’t really needed for their business purposes, and that was shipped outside the company anyway, just in case.
I also kept being told, every few years, of yet another “magic” black-box system that was supposed to be installed within the firewall, and send information outside about services, activities, etc.- to enable a high-performance external alert system.
No, I never even tried to sell any of those systems: it goes against everything I can think about data confidentiality.
Still, selecting a SaaS supplier became something that even business users knew required a preliminary assessment.
A new (business software) world
The issue is shifting from specific “niche” applications to general-purpose, low-cost SaaS and cloud-based services.
From 2005, I often talked with creators of new business-oriented social networks and of what we would now call “cloud-based” office platforms (from word processing to CRM and beyond).
I always asked the same question- before talking to anybody about your service, can you tell me how you are going to protect not only the privacy, but also the confidentiality, of the information?
The usual answer? Passwords, security certificates, encryption- and all the usual paraphernalia.
Boring. And with a fundamental flaw: the security almost never extended to protecting also the information about data use- i.e. access logs showing which information was accessed, when, and by which user.
In my perspective, that is sensitive information.
Few online “private” social networks offer a “matrioska” level of security, say- a level for the user, inside a level for the group, inside a level for the community- so that each subset of information is kept confidential within the appropriate part of the community.
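The “matrioska” idea can be made concrete with a toy access check: each document is shared at one shell (community, group, or user), and a reader must match every shell from the outside in. The names and data layout below are hypothetical, just a sketch of the nesting.

```python
# Hypothetical "matrioska" shells, outermost to innermost.
LEVELS = ["community", "group", "user"]

def can_read(reader_scopes: dict, doc_scopes: dict, doc_level: str) -> bool:
    """The reader must match every shell from the outside in,
    down to the level at which the document was shared."""
    for level in LEVELS[: LEVELS.index(doc_level) + 1]:
        if reader_scopes.get(level) != doc_scopes.get(level):
            return False
    return True

alice = {"community": "acme", "group": "finance", "user": "alice"}
bob   = {"community": "acme", "group": "finance", "user": "bob"}

group_note   = {"community": "acme", "group": "finance"}          # shared at group level
private_note = {"community": "acme", "group": "finance", "user": "alice"}

assert can_read(bob, group_note, "group")        # same group: visible
assert not can_read(bob, private_note, "user")   # inner shell: hidden
```

Each subset of information is thus kept confidential within the appropriate part of the community, as described above.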
IT auditing and IPR tracking in the cloud
While SaaS has an identified supplier, and can be treated as any other software or service supplier in terms of validation etc, “cloud” services have both a lower cost of entry- and, to manage economies of scale, a full supply chain of highly specialized micro-suppliers.
The second issue is the critical one- and in a webinar by Centrify and Deloitte (see my twitter @robertolofaro on 2010-10-12 for the link) I heard something that reminded me of pre-cloud and pre-Internet auditing activities.
In most companies, the corporate credit card is commonly given to cover expenses within a certain limit- including office expenses.
Thanks to the possibility of receiving electronic records, and the increased level of detail supplied by, say, Amazon and others, you can check expenses- to find anything unusual.
A nice suggestion, if you want: as close as you get to the old “audit trail” that supposedly disappeared when electronic records replaced manual ones.
But this inspired another idea: as most commercial credit card suppliers provide itemized and structured expense reports, you can easily build an automated system to identify unusual expense patterns.
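As a minimal sketch of that idea- with made-up records and a deliberately naive rule (flagging amounts far above the mean), not a real fraud-detection method:

```python
from statistics import mean, stdev

# Hypothetical itemized card records: (merchant category, amount in EUR).
records = [
    ("office supplies", 42.0), ("office supplies", 38.5),
    ("office supplies", 45.0), ("office supplies", 40.0),
    ("office supplies", 39.0), ("electronics", 890.0),
]

def unusual(records, threshold=1.5):
    """Flag items more than `threshold` standard deviations above the
    mean- a toy stand-in for a real anomaly-detection rule."""
    amounts = [amount for _, amount in records]
    mu, sigma = mean(amounts), stdev(amounts)
    return [(m, a) for m, a in records if a > mu + threshold * sigma]

print(unusual(records))  # the 890.0 outlier stands out
```

A production system would of course segment by merchant category, employee, and period- but the structured reports make even this simple pass possible.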
It was already difficult enough to negotiate with SaaS suppliers changes to their contracts to cover specific business, confidentiality, and technical issues- good luck with cloud computing suppliers.
By scattering the supply chain involved in cloud computing across multiple countries, suppliers, and knowledge-providers, it becomes even more critical to re-assert data ownership.
And ensure that any information is transmitted, processed, stored so that only those entitled to use it can have access to the information.
But if managed in the wrong way, this would remove the benefits of the potentially increased supply of micro-applications based on cloud computing, generated by “niche” suppliers.
Balancing secure data management in a cloud environment with flexibility of access to a potentially huge but dynamic portfolio of business applications implies using a business model already used for Palm, iPhone, Android, and the like.
In a business environment, this requires defining interoperability approaches that enable more secure data exchanges, to avoid “data mining” by intermediaries.
I am referring, for example, to one-use keys: using an application to process data and produce a result that is then fed to other applications, and so on, should not imply giving the “store key” to the data to each application.
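The one-use key idea can be sketched as follows: each application receives the data wrapped under a fresh session key, never the long-lived “store key”. The XOR “cipher” below is a toy stand-in for real symmetric encryption (e.g. AES)- everything here is illustrative, not a secure implementation.

```python
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' standing in for real symmetric encryption;
    for illustration only, never for actual protection."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hand_to_application(payload: bytes):
    """Wrap the payload under a fresh one-use key; the application gets
    this ciphertext and key only, never the long-lived 'store key'."""
    one_use_key = secrets.token_bytes(32)
    return xor_stream(payload, one_use_key), one_use_key

ciphertext, session_key = hand_to_application(b"quarterly figures")
# The application decrypts, produces its result, and the key is discarded:
assert xor_stream(ciphertext, session_key) == b"quarterly figures"
```

Compromising one application in the chain then exposes only the single exchange it handled, not the whole data store.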
Just security? Or a new business model?
Yes, the cloud-based applications that you use online could store the data while processing- but this is where creating a shared standard on managing these data exchanges will act as a kind of “cloud policeman”.
To give a practical example: most customers nowadays give their consultants a personal computer- but, following my own approach, whenever I am given one, I prefer to leave it in the office at the end of each day.
What would be the point of bringing home a computer belonging to my customer, containing data belonging to my customer?
My basic idea is nothing really new- it is just an extension of the XML approach (a standard way to send data along with the instructions on how to interpret it).
You have the data and need to use an application: you retrieve the application from the store, put it into a “bubble”, the application does its work and gives you back results, and your own unique, secure bubble (containing the data and your temporary copy of the application) disappears, as it has already produced the required results.
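The “bubble” lifecycle can be sketched with a throwaway workspace: data and the temporary application copy live only inside it, and it is destroyed as soon as the results are returned. The function names are mine, and a real system would add sandboxing and the one-use keys discussed above.

```python
import pathlib
import tempfile

def run_in_bubble(data: str, application) -> str:
    """Stage the data inside a throwaway workspace, run the retrieved
    application against it, and return only the results; leaving the
    'with' block destroys the whole bubble (data and all)."""
    with tempfile.TemporaryDirectory() as bubble:
        staged = pathlib.Path(bubble) / "input.txt"
        staged.write_text(data)          # data exists only inside the bubble
        result = application(staged.read_text())
        return result                    # the bubble vanishes on exit

# A hypothetical micro-application retrieved from the "store":
word_count = lambda text: str(len(text.split()))
print(run_in_bubble("preliminary balance sheet", word_count))  # → 3
```

The caller keeps the result; neither the data nor the application copy survives the exchange.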
Well, you can also add to the “cloud policeman” a “cloud billing”- charging “per use” access to data, applications, etc.- maybe a new way to generate value from IPR, without the need to get into technology.
You just create your content, set a price, and choose where it is posted- the common standards will allow each “store” to ask you the appropriate questions, to then let other business users automatically find your application.
Why not? Your own application could launch an “instant auction” to find the cheapest available provider of the specific service that you need.
From an application producing an analysis of your preliminary balance sheet, to a photo to embed in your report that contains certain characteristics, to a sample of music or video to embed in your annual report.
So, instead of having your data travelling to the application, as in a typical SaaS approach, it is the micro-application travelling toward the cloud-based application inside your own “walled” community.
And everything happens in the cloud, still keeping the benefits of enforcing your own data ownership and management rules, without the need to invest into infrastructure or train each user in security management.
Who knows? Maybe some of your own applications or non-sensitive data could save time for others (e.g. your carefully selected list of restaurants or reliable suppliers for off-the-shelf office supplies)- and generate revenue.