Modelling And Reasoning About Trust Relationships In The Development Of Trustworthy Information Systems

Doctoral thesis (English, Open Access)
Pavlidis, Michail (2014)

Trustworthy information systems are information systems that fulfil all their functional and non-functional requirements. To this end, all the components of an information system, whether human or technical, need to collaborate in order to meet its requirements and achieve its goals. This entails that system components will show the desired or expected behaviour once the system is put into operation. However, modern information systems include a great number of components that can behave in very unpredictable ways. This unpredictability of the behaviour of system components is a major challenge to the development of trustworthy information systems, particularly during the modelling stage. When a system component is modelled as part of a requirements engineering model, it creates uncertainty about its future behaviour, thus undermining the accuracy of the system model and eventually the system's trustworthiness. Therefore, the addition of system components is inevitably based on assumptions about their future behaviour. Such assumptions underlie the development of a system and are usually assumptions of trust held by the system developer about her trust relationships with the system components, relationships that are formed the instant a component is inserted into a requirements engineering model of the system. However, despite the importance of such issues, a requirements engineering methodology that explicitly captures such trust relationships, along with the entailed trust assumptions and trustworthiness requirements, is still missing. To tackle the preceding problems, this thesis proposes a requirements engineering methodology, JTrust (Justifying Trust), for developing trustworthy information systems. The methodology is founded upon the notions of trust and control as the means of achieving confidence.
In order to develop an information system, the developer needs to consider her trust relationships with the system components, which are formed upon their addition to a system model, reason about them, and proceed to a justified decision about the design of the system. If a system component cannot be trusted to behave in a desired or expected way, then the question arises of what the alternatives are for building confidence in the future behaviour of that component. To answer this question we define a new class of requirements: trustworthiness requirements. Trustworthiness requirements prescribe the functionality of the software included in the information system that compels the rest of the information system's components to behave in a desired or expected way. The proposed methodology consists of: (i) a modelling language which contains trust and control abstractions; and (ii) a methodological process for capturing and reasoning about trust relationships, modelling and analysing trustworthiness requirements, and assessing the system's trustworthiness at the requirements stage. The methodology is accompanied by a CASE tool to support it. To evaluate our proposal, we applied the methodology to a case study and carried out a survey to obtain feedback from experts. The topic of the case study was the e-health care system of the National Health Service in England, which was used to reason about trust relationships with system components and to identify trustworthiness requirements.
Researchers from three academic institutions across Europe and from one industrial company, British Telecom, participated in the survey in order to provide feedback about the effectiveness and efficiency of the methodology. The results indicate that JTrust is useful and easy to use for modelling and reasoning about trust relationships, modelling and analysing trustworthiness requirements, and assessing system trustworthiness at the requirements level.
