Remote attestation is an important characteristic of trusted computing: it provides reliable evidence that a trusted environment actually exists. This feature enables remote parties to make certain assumptions about the trusted behavior of a remote platform and the software running on it. Existing realizations of this novel but straightforward concept derive trusted behavior from the binaries, configurations, or properties of a platform. We note that all of these are low-level behavior measurement techniques, and none of them defines what a trusted behavior is or how it should be specified. Other approaches propose to associate the trusted behavior of a platform with its security policies.


Given the enormous number of different types of security policies, e.g., operating system policies, web service policies, and so on, it is also not adequate to associate the trustworthiness of a platform with its security policies. In this project, we take a novel approach in which the trustworthiness of a platform is associated with the behavior of a policy model. In our approach, the behavior of a policy model is attested rather than a software or hardware platform. In this way, the attestation feature is not tied to a specific software or hardware platform, to a specific behavior measurement technique, or to a specific type of security policy.

The term “Trusted Computing” refers to a technology, introduced in recent years by the Trusted Computing Group (TCG), in which PCs, consumer electronic devices, PDAs, and other mobile devices are equipped with a special hardware chip called the Trusted Platform Module (TPM). In combination with other security hardware extensions, the trusted platform module provides cryptographic mechanisms to remotely certify the integrity of the (application/system) software running on the device, called remote attestation, to protect I/O and the storage of data inside the device, and to strictly isolate data residing in memory from other, potentially malicious applications. This design is well suited to fighting malicious code, viruses, privacy violations, and similar threats, because current practices that fight malicious code purely at the software level are, by their very nature, themselves open to compromise. Indeed, past experience has shown that a trusted and tamperproof security basis cannot be achieved by software-based solutions alone.

Remote attestation is an essential characteristic of trusted computing, which provides reliable evidence that a trusted environment actually exists. This feature enables a trusted computing platform to remotely certify to third parties, in cryptographically protected form, the behavior of its running software and the status of its hardware and software components. In a typical remote attestation scenario, a challenger verifies the trustworthiness of a target platform before dispatching a resource, or before and/or during access to an object. The target platform generates a certificate signed by its trusted platform module to prove its trustworthiness. Once the target platform has a provably trusted environment, sensitive data or resources can be released to it. The Trusted Computing Group defines the trustworthiness of a platform as follows: “trust is the expectation that a device will behave in a particular manner for a specific purpose”. In this definition, the term particular manner is the way a task is expected to be performed, while the term specific purpose refers to a particular task or scenario, such as usage of an object, web service access, or some computational activity. In other words, “trust is directly proportional to the expected behavior of a particular task”.
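The challenge-response exchange described above can be sketched in Python. This is a minimal illustration, not a real TPM protocol: a shared MAC key stands in for the TPM's attestation identity key (a real TPM signs quotes with an asymmetric key), and the "PCR" values are plain SHA-1 digests.

```python
import hashlib
import hmac
import os

# Hypothetical stand-in: a shared MAC key plays the role of the TPM's
# attestation identity key (a real TPM would sign with an RSA/ECC key).
TPM_KEY = b"demo-attestation-key"

def challenge():
    """Challenger: generate a fresh nonce to prevent replay."""
    return os.urandom(16)

def quote(nonce, pcr_values):
    """Target platform: 'sign' its measurement registers plus the nonce."""
    payload = nonce + b"".join(pcr_values)
    return payload, hmac.new(TPM_KEY, payload, hashlib.sha256).digest()

def verify(nonce, payload, signature, expected_pcrs):
    """Challenger: check signature, freshness, and expected measurements."""
    expected_sig = hmac.new(TPM_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected_sig, signature):
        return False
    return payload == nonce + b"".join(expected_pcrs)

pcrs = [hashlib.sha1(b"bootloader").digest(), hashlib.sha1(b"kernel").digest()]
n = challenge()
payload, sig = quote(n, pcrs)
print(verify(n, payload, sig, pcrs))  # True: platform state matches
```

The fresh nonce is what binds the quote to this particular challenge: without it, a compromised platform could replay an old quote produced while it was still in a trusted state.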

In this project, we analyze the existing approaches for the realization of remote attestation and propose a scalable and novel approach in which the trustworthiness of a target platform is associated with the expected behavior of its policy models. A target platform can have different policy models in place for different specific purposes, e.g., Mandatory Access Control and Discretionary Access Control for access control, Biba and Clark-Wilson for integrity protection, UCON for usage control, and so on. Keeping in view the fact that these policy models provide an abstract and formal representation of the security properties of a platform, we define trust as follows: “trust is the expectation that a model will behave in a particular manner for a specific purpose”. Thus, in our approach, a target platform is trustworthy for a challenger if the behavior of the policy model that corresponds to the challenger's specific purpose is trustworthy. We believe that our model-based behavioral attestation approach is the closest realization of the stated goal of the Trusted Computing Group, because a model provides an abstract and formal representation of some specific purpose, e.g., UCON for usage control or RBAC for role-based access control.
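To make the idea of attesting a policy model's behavior concrete, the following toy sketch (all names and rules are invented for illustration; a real policy model such as UCON is far richer) shows a target reporting a canonical fingerprint of its model while a challenger probes the model's decisions against the behavior it expects for its own purpose:

```python
import hashlib
import json

# Hypothetical toy policy model: attribute-based rules in the spirit of
# an access/usage control model. Invented for illustration only.
policy_model = {
    "purpose": "usage-control",
    "rules": [
        {"role": "doctor", "object": "record", "right": "read"},
        {"role": "doctor", "object": "record", "right": "update"},
    ],
}

def model_fingerprint(model):
    """Canonical hash of the model itself (what the target could report)."""
    return hashlib.sha256(
        json.dumps(model, sort_keys=True).encode()).hexdigest()

def decide(model, role, obj, right):
    """The model's behavior: does it grant this request?"""
    return any(r == {"role": role, "object": obj, "right": right}
               for r in model["rules"])

# Challenger-side behavioral probes: the decisions it expects for its purpose.
probes = [(("doctor", "record", "read"), True),
          (("nurse", "record", "update"), False)]
trusted = all(decide(policy_model, *req) == want for req, want in probes)
print(trusted)  # True: the model behaves as expected for this purpose
```

The point of the sketch is the shift in what gets attested: the challenger reasons about the model's decisions (its behavior for a specific purpose), not about the binaries or configuration of the platform that enforces it.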

Several approaches have been proposed in the literature to realize the concept of remote attestation. For example, configuration-based attestation requires that a target platform present its trusted configurations to a challenger, who then determines the platform's trustworthiness on that basis. However, this approach has a few deficiencies; for instance, revealing system configurations may give insights into the target platform, making a security attack easier to mount. For example, it is unacceptable for a target platform within a military domain to be asked to reveal its system configurations. The approach of binary attestation is realized in the Integrity Measurement Architecture (IMA), in which a target platform presents the trusted status of all the components loaded after booting. However, IMA is not very practical in open and heterogeneous environments. With respect to the stated goal of the Trusted Computing Group, this approach associates the trustworthiness of a platform with the binary of a piece of software, whereas what is expected is the dynamic, run-time behavior of that software.
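The measurement mechanism underlying binary attestation can be sketched as follows. The component names and contents are made up for illustration, but the fold step mirrors the TPM's extend operation (PCR_new = SHA-1(PCR_old || measurement)), which is why a verifier can replay the measurement log and detect any changed or reordered component:

```python
import hashlib

def extend(pcr, measurement):
    """TPM-style extend: PCR_new = SHA-1(PCR_old || measurement)."""
    return hashlib.sha1(pcr + measurement).digest()

def measure_boot(components):
    """Hash each component in load order and fold it into one aggregate."""
    pcr = b"\x00" * 20          # PCRs start zeroed at reset
    log = []
    for name, data in components:
        m = hashlib.sha1(data).digest()
        log.append((name, m.hex()))
        pcr = extend(pcr, m)
    return pcr, log

# Invented boot chain for illustration.
boot = [("bootloader", b"stage1"), ("kernel", b"vmlinuz"), ("init", b"systemd")]
agg, log = measure_boot(boot)

# Verifier replays the log against the signed aggregate; any tampering
# with a component or with the log itself yields a different value.
replay = b"\x00" * 20
for _, h in log:
    replay = extend(replay, bytes.fromhex(h))
print(replay == agg)  # True
```

The sketch also makes the criticism in the paragraph above visible: the aggregate only proves which binaries were loaded, and says nothing about how those binaries behave once running.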
