
15  Cloud Computing – Data Confidentiality and Interoperability Challenges


apply and the export of the data back to the United States could be restricted or prohibited. In addition, the subjects of the data would acquire rights of notice, access, correction, etc. under French law. Once an EU Member State’s data protection law applies to personal information, there is no clear way to remove the applicability of the law to the data.

The location of a cloud provider’s operations may have a significant bearing on the law that applies to a user’s data. The actual location may or may not appear in the provider’s terms of service. Even if the provider discloses the location of records, the provider may change it, possibly without any notice. The same data may be stored in multiple locations at the same time. A provider who promises to maintain user data in a specific jurisdiction (e.g. the United States) may reduce some of the location risks that a user may face.

15.1.2  Data Concerns Within a European Context

Generally, the question that arises is how national privacy and security standards can be ensured in a global cloud environment. With respect to data privacy and jurisdiction, national standards and regulations have led few providers to host data on regional hardware; most choose instead to rely on European and American infrastructures. Reservations about cloud computing stem from concerns about dependability, vulnerability, and lock-in to providers, as well as security-related issues, once no true internal systems remain.

Many users today choose to combine internal IT with cloud computing because this avoids the risk of losing control of their sensitive data, especially where no uniform service level agreements (SLAs) exist. Indeed, loss of data, hardware breakdowns, and reduced performance have all been reported with today's cloud computing offerings.

The drawbacks of current implementations lie primarily in external audits not yet being permitted, the limited logs available, users having to trust the brand with no alternative guarantees of data security, and the lack of information about the actual location, and hence the jurisdiction, of data.

Organizations must plan carefully when constructing cloud computing environments to ensure that the flexibility and scalability do not overshadow the necessity for risk-tolerant implementation. As the developments in the EU show, the initial cloud computing implementation must not only be secure, but the whole system must be flexible to accommodate emerging laws and regulations.

The Council of the European Union, in the Adoption of the Council Conclusions on the future of Information Communication Technology (ICT) research, innovation and infrastructures [4], stresses that the digital revolution is still in its early stages and that a research and innovation capacity is essential to be able to shape, master, and assimilate technologies and exploit them to economic, societal, and cultural advantage. In this regard, it underlines the necessity of ensuring the availability, appropriate treatment, and conservation of an unprecedented amount of data.


F. Gagliardi and S. Muscella

15.1.3  Government Data

Government data are being put online to increase accountability, to contribute valuable information about the world, and to enable government, the country, and the world to function more efficiently [5]. All of these purposes are served by putting the information on the Web as Linked Data. Linked data principles provide a basis for realizing the Web of Data by ensuring that data are organized, structured, and independent of any application programs, so that they can serve a broad community of people and many applications. The main drivers behind linked data include the value-add of structured content, a mission or mandate to make data linkable, and, most importantly, low development barriers. Key enabling technologies span Web 2.0, mash-ups, open source, cloud computing, and Software-as-a-Service. Effort toward interoperability can be made where most needed, making the evolution over time smoother and more productive.
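The linked data principles above can be sketched in a few lines: facts are expressed as subject-predicate-object triples, and every entity is named by a URI, so any application can consume the data or follow links between datasets without a bespoke schema. The dataset, URIs, and predicate names below are purely hypothetical illustrations, not real government data.

```python
# Hypothetical government dataset expressed as (subject, predicate, object)
# triples; every entity is identified by a URI, so the data are independent
# of any particular application program.
triples = [
    ("http://data.example.gov/agency/17", "rdfs:label", "Transport Agency"),
    ("http://data.example.gov/agency/17", "ex:locatedIn",
     "http://data.example.gov/region/5"),
    ("http://data.example.gov/region/5", "rdfs:label", "Northern Region"),
]

def describe(uri, data):
    """Return all predicate/object pairs recorded for a subject URI."""
    return {p: o for s, p, o in data if s == uri}

# Following a link: the object of ex:locatedIn is itself a URI that can be
# looked up in the same (or any other) dataset.
agency = describe("http://data.example.gov/agency/17", triples)
region = describe(agency["ex:locatedIn"], triples)
```

Because the structure carries its own identifiers, a second application could merge these triples with its own data simply by reusing the same URIs.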

15.1.4  Trust

The technology of cloud computing itself is not insecure. However, companies must carefully plan, from the outset, for the implications of massively scalable design, storage, and computing. This is especially true if those services are outsourced to cloud providers and are not directly under company control. Recently, the Cloud Security Alliance was set up [6] "to promote the use of best practices for providing security assurance within Cloud Computing, and provide education on the uses of Cloud Computing to help secure all other forms of computing." SecureCloud 2010, an educational and networking event hosted by the European Network and Information Security Agency, the Cloud Security Alliance, and ISACA (organizations that help to shape the future of cloud computing security), addressed interoperability between cloud providers among other topics and demonstrated the need to address cloud interoperability immediately and in earnest.

In a recent survey carried out by the European Network and Information Security Agency (ENISA) [7], the principal reason for Small and Medium-sized Enterprises (SMEs) to adopt cloud computing was to avoid capital expenditure on hardware, software, IT support, and information security by outsourcing (70% of SMEs responded in favor of this, and 67% found flexibility, scalability, and IT resources to be key to utilizing the cloud). Among SMEs' main concerns, 44% cited privacy and the availability of services, and 48% worried about loss of control of their own services. ENISA also published a Cloud Computing Report in November 2009 [8] on the benefits, risks, and recommendations for information security, observing that the cloud's economies of scale and flexibility are both a friend and a foe from a security point of view: the massive concentrations of resources and data present a more attractive target to attackers, but cloud-based defenses can be more robust, scalable, and cost-effective. The paper provided security guidance for potential and existing users of cloud computing.


15.1.5  Interoperability and Standardization in Cloud Computing

The development of standards and interoperability between the varying levels of clouds is inevitable. It is also tied directly to the needed adoption by the enterprise. Without clearly defined standards, best practices, and open interoperability, further adoption of the cloud will evolve at a slower pace.

There have been a significant number of publications, including those by the UK Government and the European Commission itself, that make the economic case for standards and their role in increasing innovation. The central premise is that standards free innovative developers and product/service designers from wasting time on lower-level functionality that has already been developed by others. Common solutions can also be shared between application areas through building-block technologies that are not subject- or area-specific. This will increase European competitiveness by minimizing the lag between early adopters and the mainstream, ensuring that organizations of varying sizes can contribute to the economy without their competitiveness being hindered by large-scale "vendor lock-in" or by proprietary services gaining market dominance.

Dynamic capability, the provision of resources as and when needed, is one of the features that differentiates cloud from grid; virtualization is another key difference. These are among the drivers of adoption. However, there are many challenges to be addressed as the grid computing community contributes to cloud needs, above all within the Open Grid Forum (OGF). Interoperability is not the only issue. SLAs are a big challenge, as start-up companies and SMEs, currently the major cloud users, want freedom of choice, although Amazon EC2 is the current market leader and the de facto standard cloud service provider. If these companies want to move to another provider, the problem involves not only VM migration but also other services, such as databases, that lack compatibility. Other challenges concern how to move existing software packages from internal data centers to external clouds, bearing in mind that the architecture of most of this software does not support scale-out, and how to manage network bandwidth utilization.

It suffices to say that cloud portability, made possible by guaranteed standards and interoperability, has to occur in the future, and the major players in this arena have to be involved. A lack of involvement by the major players will lead to a split between standard and nonstandard clouds, or to companies providing some form of filtering mechanism or converters to allow for portability.

15.1.6  Open Grid Forum’s (OGF) Production Grid Interoperability Working Group (PGI-WG) Charter

Open Grid Forum's (OGF's) Grid Interoperation Now Community Group (GIN-CG) and the Production Grid Infrastructure Working Group (PGI-WG) lead efforts toward the interoperability of global grid infrastructures. The PGI-WG, a spin-off from GIN-CG,


brings together members of production grid infrastructures from all over the world to address related challenges, building on the experiences of GIN-CG to create profile documents to be fed into OGF standardization groups. This focus enables work on refined or new OGF specifications. The PGI-WG chiefly focuses on three OGF standards, working closely with the dedicated working groups:

Job Submission Description Language (JSDL)

Open Grid Services Architecture-Basic Execution Service (OGSA-BES)

Grid Laboratory Uniform Environment (GLUE) schema

The efforts of GIN-CG and PGI-WG represent important milestones by enabling other grid infrastructure communities and software providers that intend to implement these specifications to join the standardization activity and contribute their experiences. This work is also a significant step in the grid community's transition to the model proposed by EGI, where e-Infrastructures built from different software will have to operate seamlessly together. Through this work, the ongoing efforts of the Usage Records and Resource Usage Service Working Group will continue, and their outputs will be incorporated into the Production Grid Profile being developed.
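To make the first of the three standards above concrete, the fragment below sketches what a JSDL job description looks like: an XML document that states what to run and what resources it needs, independently of any particular grid middleware. It is a minimal illustration along the lines of the JSDL 1.0 specification; the job name, executable, and resource values are invented for the example.

```xml
<jsdl:JobDefinition
    xmlns:jsdl="http://schemas.ggf.org/jsdl/2005/11/jsdl"
    xmlns:jsdl-posix="http://schemas.ggf.org/jsdl/2005/11/jsdl-posix">
  <jsdl:JobDescription>
    <jsdl:JobIdentification>
      <jsdl:JobName>example-job</jsdl:JobName>
    </jsdl:JobIdentification>
    <!-- What to execute, described via the POSIXApplication extension -->
    <jsdl:Application>
      <jsdl-posix:POSIXApplication>
        <jsdl-posix:Executable>/bin/echo</jsdl-posix:Executable>
        <jsdl-posix:Argument>hello</jsdl-posix:Argument>
        <jsdl-posix:Output>out.txt</jsdl-posix:Output>
      </jsdl-posix:POSIXApplication>
    </jsdl:Application>
    <!-- Resource requirements the target infrastructure must satisfy -->
    <jsdl:Resources>
      <jsdl:TotalCPUCount>
        <jsdl:Exact>1.0</jsdl:Exact>
      </jsdl:TotalCPUCount>
    </jsdl:Resources>
  </jsdl:JobDescription>
</jsdl:JobDefinition>
```

Because the description is middleware-neutral, the same document can in principle be submitted to any infrastructure that implements JSDL together with an execution interface such as OGSA-BES, which is exactly the interoperability the PGI-WG profiles aim to secure.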

15.1.7  Achievements in the OGF Open Cloud Computing Interface (OGF-OCCI)

The OGF Open Cloud Computing Interface Working Group (OCCI-WG) is developing a clean, open application programming interface (API) for "Infrastructure as a Service" (IaaS) based clouds. IaaS is one of the three primary service models of the emerging cloud industry, alongside Software as a Service and Platform as a Service. OCCI-WG is a working group of OGF established in March 2009. The group has an active membership of over 160 individuals and is led by four chairs drawn from industry, academia, service providers, and end users. Several members are from commercial service providers that are committed to implementing the OGF-OCCI specification.

15.1.7.1  What will OCCI Provide?

OCCI is a very slim REST-based API, which can be easily extended, as shown in Fig. 15.1. Free of the overhead of many similar protocols, the REST approach allows users to access their services easily. Every resource is uniquely addressed using a Uniform Resource Identifier (URI).

Resources are managed through a set of operations: create, retrieve, update, and delete. Currently, three types of resources are considered: storage, network, and compute. These resources can be linked together to form a virtual machine with assigned attributes. For example, it is possible to provision a machine that has 2 GB of RAM, one hard disk, and one network interface.
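The model just described can be sketched as follows: every resource gets its own URI, the four operations manage resources, and links combine compute, storage, and network resources into a virtual machine. This is an illustrative in-memory sketch of the idea only; the URI paths and attribute names are hypothetical and do not follow the normative OCCI rendering.

```python
# In-memory sketch of OCCI-style resource management: each resource is
# uniquely addressed by a URI and handled via create/retrieve/update/delete;
# links associate resources to describe a virtual machine.
import uuid

resources = {}  # URI -> resource attributes

def create(kind, **attributes):
    """Create a resource of the given kind and return its unique URI."""
    uri = f"/{kind}/{uuid.uuid4()}"
    resources[uri] = {"kind": kind, "links": [], **attributes}
    return uri

def retrieve(uri):
    return resources[uri]

def update(uri, **attributes):
    resources[uri].update(attributes)

def delete(uri):
    del resources[uri]

def link(source_uri, target_uri):
    """Associate two resources, e.g. attach a disk to a compute node."""
    resources[source_uri]["links"].append(target_uri)

# Provision the machine from the example above: 2 GB of RAM,
# one hard disk, and one network interface.
vm = create("compute", memory_gb=2)
disk = create("storage", size_gb=20)
nic = create("network", address="192.168.0.10")
link(vm, disk)
link(vm, nic)
```

In the real protocol these operations would be carried by HTTP requests against the resource URIs; the sketch only mirrors the create/retrieve/update/delete semantics and the linking of the three resource types.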
