Accepted Papers

  • Robust Watermarking Algorithm Based on Multilevel Histogram Modification
    Amal Hamdy, Mohamed Hashem, Amal El-Sharshaby and Sawsan Shouman, Ain Shams University, Egypt
    ABSTRACT
    Digital watermarking technology has lately been adopted as an effective solution for protecting the copyright of digital assets against illicit copying. Reversibility, the ability to restore the original data from the watermarked data without any loss of information, is an important requirement for many watermarking applications such as military and medical data. Many existing algorithms for reversible robust watermarking suffer from low hiding capacity. In this paper we propose a new reversible, robust and blind watermarking algorithm for relational databases that gives high hiding capacity compared to other techniques. The proposed technique is based on multilevel histogram modification, which allows control over the embedding level. Experimental results reveal that the proposed technique provides high embedding capacity while keeping distortion at a low level, and that it is robust against various malicious attacks.
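As context for the histogram-shifting idea this abstract builds on, here is a minimal single-level sketch in Python; the paper's multilevel, database-oriented scheme is not reproduced, and the peak value and bit encoding are illustrative assumptions:

```python
def embed(values, bits, peak):
    """Single-level histogram-shift embedding (illustrative sketch).
    Values greater than the peak are shifted up by one to open a gap;
    each occurrence of the peak carries one bit: peak (0) or peak+1 (1).
    Capacity equals the number of occurrences of the peak value."""
    out, it = [], iter(bits)
    for v in values:
        if v > peak:
            out.append(v + 1)
        elif v == peak:
            out.append(v + next(it, 0))  # embeds 0 once bits run out
        else:
            out.append(v)
    return out

def extract(marked, peak):
    """Recover the embedded bits and restore the original values exactly,
    demonstrating the reversibility property the abstract refers to."""
    bits, restored = [], []
    for v in marked:
        if v == peak:
            bits.append(0); restored.append(peak)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif v > peak + 1:
            restored.append(v - 1)
        else:
            restored.append(v)
    return bits, restored
```

Extraction restores the cover values bit-for-bit, which is the defining property of a reversible scheme.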
  • User Behavior Tracking Framework
    Mohamed Ahmed Abdullah Mohamed, BananIT, Sudan
    ABSTRACT
    Under normal circumstances the development team builds the project following best practices and then ships the software to the production environment, where users start using the application and everybody is happy. Usually the developers never learn what is going on with their app in the live environment or how users actually use it. Are users using the app as we expected and planned? Did they use a given feature, or do they not know about it? Which works better, this feature or that one? What would happen if we removed this part? To answer these questions, this work provides software measurement tools, systematic guidance and a conceptual modelling approach to control and improve software development processes. Due to the high cost of collecting data and the difficulty of analysing it, measurement tools are not widely adopted by software organizations; with this framework and its pre-defined mechanisms, however, teams can collect and analyse the data easily, or at least with effort proportional to the problem's complexity. Conventionally, when developers become skeptical about something they consult a user experience specialist, take measurements in a user experience lab, or experiment themselves, but there is no systematic, standard way to do this easily. The aim is for them to use the proposed framework for guidance and as an approach to get the job done.
  • A 2-tier Risk Mitigation Framework for Reactive Routing Protocols
    Kinjal Roy, Nitin Singh and Madhukar Sikaria, IIT Delhi, India
    ABSTRACT
    Wireless networks, unlike wired networks, have their own set of problems. As the topology of the network is constantly changing, the routing pattern of data packets becomes an important metric for the overall security of the network. Numerous research works have addressed routing protocols and network security issues separately, but since these two concerns are intertwined in the proper and optimized functioning of the entire network, comprehensive work is needed that balances both. To bridge this gap, this paper presents a derived version of the basic reactive routing algorithms, mainly DSR and AODV, which takes care of security through a two-tier approach.
  • A Survey on Future Data Center Network Architectures
    Raheel Rehman, University of Waterloo, Canada
    ABSTRACT
    Nowadays data centers experience a tremendous amount of network traffic, a significant portion of which can be attributed to big-data applications and cloud computing. Large data centers with high-bandwidth switches are required to handle this load. Traditional data center network architecture is based on electronic packet switches, which consume a great deal of power to handle the increased communication bandwidth of these emerging applications. Alternative technologies comprising purely optical or hybrid optical/electrical interconnects have gained attention as promising solutions. These architectures offer low latency, high throughput, reduced energy consumption and lower interconnect complexity compared to traditional data center networks. This paper presents a survey of these next-generation data center network architectures. Furthermore, the paper discusses how centralized network management can deliver application-aware network control. Configuring the network using a global view of both network state and application demands can be a real game changer for high-performance data center architectures.
  • A Framework for Plagiarism Detection in Arabic Documents
    Imtiaz Hussain Khan, Muazzam Siddiqui, Kamal Jambi and Abobakr Bagais, King Abdulaziz University, Saudi Arabia
    ABSTRACT
    We are developing a web-based plagiarism detection system for written Arabic documents. This paper describes the proposed framework of our plagiarism detection system, which comprises two main components, one global and one local. The global component is heuristics-based: from a given, potentially plagiarised document, a set of representative queries is constructed using different best-performing heuristics. These queries are then submitted to Google via Google's search API to retrieve candidate source documents from the Web. The local component carries out detailed similarity computations, combining different similarity computation techniques to check which parts of the given document are plagiarised and from which of the source documents retrieved from the Web. Since this is an ongoing research project, the quality of the overall system has not yet been evaluated.
  • Man in the Middle attack: Implementation in Wireless LAN
    Octavio Salcedo Parra1 and Brayan Reyes2, 1Universidad Distrital "Francisco Jose de Caldas" Facultad de Ingenieria, Colombia and 2Intelligent Internet Research Group, Colombia
    ABSTRACT
    This paper describes, analyzes and implements a man-in-the-middle attack in an 802.11n wireless LAN using the techniques of ARP poisoning and DNS spoofing. The attacker's goal is to redirect all web requests the victim makes to different Internet domains towards a single website. To carry out the attack, a Python program was implemented running on one host of a home network consisting of two laptops and a router connected to the Internet; at the end of the attack, all HTTP requests the victim host makes to any Internet domain are redirected to the web server running on the attacker's host.
  • Congestion Window Comparison between TCP Friendly and TCP Reno
    Octavio Salcedo Parra1 and Brayan Reyes2, 1Universidad Distrital "Francisco Jose de Caldas" Facultad de Ingenieria, Colombia and 2Intelligent Internet Research Group, Colombia
    ABSTRACT
    This paper assesses the congestion window behaviour of TCP Friendly and TCP Reno, each characterised by its respective throughput equation. Based on these equations, we run a simulation of 1,000 iterations in which the parameters of a stochastic heuristic algorithm created by the authors are varied. From this we determine which of the two TCP variants performs better with respect to packet loss rate, the better one being TCP Reno.
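For reference, "TCP Friendly" rate control is usually characterised by the simplified Padhye et al. throughput equation; the abstract does not reproduce its equations, so the formulation and parameter choices below are illustrative assumptions rather than the authors' exact model:

```python
import math

def tcp_friendly_rate(s, rtt, p, t_rto=None):
    """Simplified Padhye et al. TCP throughput equation (bytes/second).

    s     -- packet size in bytes
    rtt   -- round-trip time in seconds
    p     -- steady-state packet loss rate (0 < p < 1)
    t_rto -- retransmission timeout; defaults to 4*RTT, a common approximation
    """
    if t_rto is None:
        t_rto = 4 * rtt
    denom = (rtt * math.sqrt(2 * p / 3)
             + t_rto * min(1.0, 3 * math.sqrt(3 * p / 8)) * p * (1 + 32 * p * p))
    return s / denom
```

As expected of any TCP-friendly formulation, the computed rate falls as the loss rate rises, which is the behaviour a congestion-window comparison such as the one above exercises.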
  • Performance Evaluation of Routing Protocol OSPFv3 on the link PE-CE on MPLS/VPN
    Octavio Salcedo Parra1 and Brayan Reyes2, 1Universidad Distrital "Francisco Jose de Caldas" Facultad de Ingenieria, Colombia and 2Intelligent Internet Research Group, Colombia
    ABSTRACT
    The rapid growth of IP-based networks and the current challenges posed by the technological deployment of IPv6 and related applications confront Internet Service Providers and have stimulated rigorous research on the topic. Internet Service Providers (ISPs) offer infrastructure for the implementation of virtual private networks (VPNs), where the definition of routing schemes between the customer edge router (CE) and the provider edge router (PE) is fundamental. In this context, different schemes have been proposed in which new protocols such as Open Shortest Path First version 3 (OSPFv3) play a key role. In a VPN, the routing protocol BGP is used to distribute the customer's routes, and Multi-Protocol Label Switching (MPLS) is used to send the information packets through the network core in tunnel mode. Originally only IPv4 was supported; support was later extended to OSPFv2 and IPv6 VPNs. The new specifications supporting OSPFv3 as a PE-CE routing protocol, together with current infrastructures beginning the process of IPv6 deployment, drive this research, which evaluates the performance of the routing protocol OSPFv3 in MPLS/VPN/IPv6 border scenarios.
  • A Web Content Analytics Architecture For Malicious Javascript Detection
    Jung Jonghun, KISA, Korea
    ABSTRACT
    Recent web-based cyber attacks are evolving into new forms such as private information theft and DDoS attacks exploiting JavaScript within a web page. These attacks can be mounted simply by having the victim access a web site, without distributing malicious code or infecting the machine. Script-based cyber attacks are hard to detect with traditional security equipment such as firewalls and IPS because they inject malicious scripts into the response message of a normal web request. Furthermore, they are hard to trace because attacks such as DDoS can be launched just by visiting a web page. For these reasons, they can be expected to cause direct damage and have great ripple effects. To cope with these issues, this article proposes techniques to detect malicious scripts through real-time web content analysis and to automatically generate detection signatures for malicious JavaScript.
  • A Highly Parallel Implementation of the H.264 Transformation and Quantization on GPU
    Asif Ali Khan, Laiq Hasan and Sohail Khan, UET Peshawar, Pakistan
    ABSTRACT
    H.264 is the most efficient video coding standard to date. Real-time encoding is essential for this standard to gain a place in the consumer marketplace. This, however, seems difficult due to the constant rise in computational demands as frame resolution grows. Graphics Processing Units (GPUs) have emerged as the most viable platform for accelerating these kinds of applications. This paper discusses the acceleration of the Discrete Cosine Transform (DCT) and quantization blocks of H.264 on GPUs and multi-core CPU machines. The two blocks are relatively less complex; however, their speedup is required for real-time operation. Experimental results show that an OpenMP-based multi-core implementation is around 2 times faster than the sequential implementation, while the GPU-based parallel implementation is up to 9.51 times faster than its sequential counterpart.
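As a reference point for what the transform block computes, the H.264 4x4 forward core (integer) transform can be sketched as below; the quantization/scaling stage that follows it in the standard is omitted, and the pure-Python form is for illustration only, not the authors' GPU kernel:

```python
# The 4x4 core transform matrix defined by the H.264/AVC standard.
C = [[1,  1,  1,  1],
     [2,  1, -1, -2],
     [1, -1, -1,  1],
     [1, -2,  2, -1]]

def matmul(a, b):
    """Plain 4x4 integer matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_core_transform(block):
    """W = C . X . C^T for one 4x4 residual block (integer arithmetic only).
    Each 4x4 block is independent of every other, which is what makes the
    OpenMP and GPU parallelizations discussed in the abstract attractive."""
    ct = [[C[j][i] for j in range(4)] for i in range(4)]
    return matmul(matmul(C, block), ct)
```

A constant residual block maps to a single DC coefficient (16 times the value) with all AC coefficients zero, a quick sanity check for any implementation.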
  • Semantic Extraction of Arabic Multiword Expressions
    Samah Maghwry1, Abeer Elkorany2, Akram Salah2 and Tarek Elghazaly1, 1Institute of Statistical Studies and Research, Cairo University, Egypt and 2Faculty of Computers and Information
    ABSTRACT
    Considerable interest has been given to the identification and treatment of Multiword Expressions (MWEs). The identification of MWEs affects the quality of results in many natural language processing (NLP) tasks such as parsing and generation. Different approaches to MWE identification have been applied, such as statistical methods, which are employed as an inexpensive and language-independent way of finding co-occurrence patterns. Another approach relies on linguistic methods, which employ information such as part-of-speech (POS) filters and lexical alignment between languages, and produces more targeted candidate lists. This paper presents a framework for extracting Arabic MWEs (nominal or verbal) for bi-grams using a hybrid approach. The proposed approach starts by applying a statistical method and then utilizes linguistic rules to enhance the results by keeping only patterns that match a relevant language rule. Experiments on a real Arabic corpus confirm that the proposed hybrid approach outperforms traditional approaches.
  • Improved Security Providing Routing Mechanism in MANET with Channel Adaptivity under Fading
    Anupriya Augustine and Jubin Sebastian E, Vimal Jyothi Engineering College, Chemperi, India
    ABSTRACT
    A Mobile Ad Hoc Network (MANET) is a type of wireless network without a fixed topology, consisting of a set of self-organized nodes that move randomly, frequently and unpredictably. In MANETs packet transmission is affected by radio link fluctuations. Hop count is a simple routing metric that measures the distance between a source and destination by the number of routers in the path, but most routing protocols for ad hoc networks pay little attention to channel fading, and minimum hop count alone is not enough for a routing protocol to achieve good performance. A MANET is also an open environment, susceptible to many security attacks due to its dynamic topology and lack of a centralized monitoring authority. Anonymous routing protocols conceal the identities of the route, source and destination to provide security and privacy against intruders' attacks. In this paper, a channel-adaptive protocol with improved node security is introduced, extending a multipath routing protocol to accommodate channel fading and node security. The resulting protocol is referred to as the Channel Adaptive Routing protocol with Node Security (CARNS). Using channel state information (CSI), a pre-emptive handoff strategy is applied to maintain reliable and stable connections, and paths are reused rather than simply regarded as useless. We provide a performance analysis of CARNS as well as a comparison of CARNS with AODV and AOMDV. The simulation results confirm the improved network performance of CARNS, both in terms of node security and under channel fading.
  • Analysis of Computational Complexity for HT-based Fingerprint Alignment Algorithms on Java Card Environment
    Cynthia Sthembile Mlambo, Meshack Bafana Shabalala and Fulufhelo Nelwamondo, CSIR, University of Johannesburg, South Africa
    ABSTRACT
    The Java Card environment has a limited instruction set, unlike other languages, yet smart card applications are proliferating in the market as one of the most widely used technologies. Currently, many applications in industry are shifting from large computer-based systems to small portable applications that can run on smart cards. A basic recent application is the identification and verification of an individual using a smart card, which involves fingerprint-based recognition. In this paper, implementations of three Hough Transform based fingerprint alignment algorithms are analyzed with respect to time complexity in the Java Card environment: the Local Match Based Approach (LMBA), the Discretized Rotation Based Approach (DRBA), and the All Possible to Match Based Approach (APMBA). The aim of this paper is to present the complexity and implementations of existing work on one of the most widely used methods of fingerprint alignment, so that the complexity can be simplified or the algorithm with the most efficient complexity and implementation can be identified for match-on-card use in the Java Card environment. Efficiency here covers the accuracy of the implementation, the time taken to perform fingerprint alignment, the memory the implementation requires, and the instruction operations used.
  • Multiple User Interfaces and Cross-Platform User Experience: Theoretical Foundations
    Khalid Majrashi and Margaret Hamilton, RMIT University, Australia
    ABSTRACT
    Evaluating the user experience of cross-platform interactive systems has become increasingly essential. The lack of clear concepts and definitions that can be used in the context of testing, evaluating or even teaching cross-platform user experience may confuse or even mislead people in the field. In this paper, we review the actual meanings and interpretations of different concepts relevant to a service that can run across platforms, along with related concepts used in the field of Human-Computer Interaction (HCI). We also investigate the traditional definitions of usability and user experience and the differences between them, and then provide precise definitions for cross-platform usability and cross-platform user experience. Our explanations, definitions and discussions aim to provide the necessary extensions of existing theories to establish the theoretical foundations for cross-platform user experience evaluation.
  • Gender Classification Based on Self-Organizing Neural Network Clustering
    Vahid Rostami and Shirin Yazdanpanah, Qazvin Islamic Azad University, Iran
    ABSTRACT
    The face, being one of the most significant biometrics, carries a great deal of useful information about a person, such as gender, age, race and identity. Gender classification based on face images has recently received substantial attention; gender recognition can be useful in human-computer interaction, for the identification of individuals, and in TV network systems for estimating viewer rates. In this paper, we present a method to derive face features by integrating Gabor filters and local binary patterns. These features are noise invariant and address the bottleneck of selecting proper features from images. To obtain a proper classification, a self-organizing map (SOM) is used, which is capable of finding proper weights for each gender. The average identification rate reaches 92.5%, a result comparable with former methods that work on the AR and Ethnic databases.
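To illustrate one half of the feature pipeline named above, here is a minimal 8-neighbour local binary pattern code in Python; the Gabor filtering stage and the exact LBP variant the paper integrates are not specified in the abstract, so this basic form is an assumption:

```python
def lbp_code(img, y, x):
    """Basic 8-neighbour local binary pattern code for pixel (y, x).

    Each neighbour contributes one bit: 1 if its intensity is at least
    that of the centre pixel, 0 otherwise. The resulting 8-bit code is
    invariant to monotonic illumination changes, which is one reason
    LBP features are considered robust for face analysis.
    """
    c = img[y][x]
    # Neighbours taken clockwise starting from the top-left pixel.
    nbrs = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
            img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    return sum(1 << i for i, n in enumerate(nbrs) if n >= c)
```

In a full pipeline the per-pixel codes are histogrammed over image regions, and those histograms (here, combined with Gabor responses) form the feature vector fed to the classifier.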
  • Quality Assessment for Online Iris Images
    Sisanda Makinana, Tendani Malumedzha and Fulufhelo V. Nelwamondo, CSIR, South Africa
    ABSTRACT
    Iris recognition systems have attracted much attention for their uniqueness, stability and reliability. However, the performance of such systems depends on the quality of the iris image, since good quality images are needed to extract reliable features. There is therefore a need to select good quality images before feature extraction to ensure reliable matching. In this paper, iris quality assessment research is extended by analysing the effect of standard deviation, contrast, area ratio, occlusion, blur, dilation and sharpness of an iris image. Each parameter is first estimated individually and the estimates are then fused to obtain a quality score. A fusion method based on principal component analysis (PCA) is proposed to determine whether an image is good or not. To test the proposed technique, the Chinese Academy of Sciences Institute of Automation (CASIA), Internal Iris Database (IID) and University of Beira Interior (UBIRIS) databases are used. A Support Vector Machine (SVM) was used to evaluate the performance of the proposed quality assessment algorithm, with k-fold cross-validation employed to obtain unbiased results. The experimental results demonstrate that the proposed algorithm is capable of detecting poor quality images, yielding over 84% Correct Rate and over 90% Area Under the Curve. Using character components to assess quality has been found sufficient, though a better technique for the standardization of quality still needs to be developed. The results obtained with the SVM classifier affirm that the proposed algorithm is well suited to quality characterisation.
  • Application of Rhetorical Relations Between Sentences to Cluster-Based Text Summarization
    Nik Adilah Hanin Binti Zahri1, Fumiyo Fukumoto2 and Suguru Matsuyoshi2, 1University of Malaysia Perlis, Malaysia and 2University of Yamanashi, Japan
    ABSTRACT
    Much previous research has shown that the use of rhetorical relations can enhance many applications such as text summarization, question answering and natural language generation. This work proposes an approach that extends the benefit of rhetorical relations to address the redundancy problem in text summarization. We first examine and redefine the types of rhetorical relations that are useful for retrieving sentences with identical content, and identify those relations using SVMs. By exploiting the rhetorical relations existing between sentences, we generate clusters of similar sentences from document sets. Cluster-based text summarization is then performed using a Conditional Markov Random Walk Model to measure the saliency scores of candidate summary sentences. We evaluate our method by measuring the cohesion and separation of the clusters and the ROUGE scores of the generated summaries. The experimental results show that our method performs well, demonstrating the promising potential of applying rhetorical relations to cluster-based text summarization.
  • An Empirical Evaluation of Cryptool in Teaching Computer Security
    Mabroka Maeref and Fatma Algali, Sebha university of Libya, Libya
    ABSTRACT
    In the area of network security, students need to grasp both fundamental security principles and practical security skills, so instructors have to emphasize both the theoretical part and the practice of security. This is a challenging task for instructors' teaching and students' learning. For this reason, researchers are keen to support lectures with interactive visualization tools, and the learning tool CrypTool 2 is one that covers most of the above. However, evaluations of the effectiveness of such tools in teaching and learning are limited. This paper therefore presents an empirical evaluation of the CrypTool 2 tool. The effectiveness of the tool was tested using an empirical evaluation method, and the results show that this visualization tool was effective in meeting its learning objectives.
  • Requirements Engineering for Business Intelligence- State of the Art and Current Trends
    Mabroka Maeref and Fatma Algali, Erasmus Mundus PhD candidate, Belgium
    ABSTRACT
    The development of Business Intelligence (BI) systems differs from the development of transaction-oriented systems because BI systems do not aim to automate operational business transactions but to support the decision-making process in an organisation. A specific Requirements Engineering (RE) process is needed to support the development of such systems. The RE literature for BI systems offers a comparatively low-structured body of knowledge relative to the conventional RE process, which typically includes tasks such as discovering, specifying, validating, verifying and managing requirements. This paper evaluates state-of-the-art RE approaches for BI systems with respect to the techniques and methodologies developed to address specific tasks of the RE process. We have studied current approaches and identified the RE challenges posed by BI needs for each task. Finally, we highlight what we consider to be the important current and future research topics of RE in the BI context.
  • Governance of Information Systems: a State of the art
    Hamdoun Imene, Sayeb Yemna and Ben Ghezala Henda, Laboratoire RIADI, Tunisia
    ABSTRACT
    Given the huge growth in the complexity of IT assets, application requirements and adopted technologies, the evolution of the company becomes a more difficult task. The company must organize and adopt rules and working methods that enable it to manage change, anticipate evolution and assess the risks and impacts of these changes on its information system (IS). Thus, to ensure the overall direction, effectiveness, supervision and accountability of the company, a change to an information system (redesign, migration, renewal, etc.), whether driven by an environmental change or by decision makers, must satisfy the rules and methods of information system governance.
    This communication focuses on the theme of the governance of information systems. We begin by recalling the fundamentals of governance and corporate governance; we then treat the principles of information system governance, its tools, and its role in organizing thinking and decision-making and in monitoring the implementation of decisions within the information system.
  • Formally Modelling the Structure of Single Access Point Security Pattern using Codecharts
    Abdullah A. H. Alzahrani, School of Computer Science and Electronic Engineering, United Kingdom
    ABSTRACT
    Security design patterns are usually described using a variety of UML diagrams alongside textual statements. Often, a UML class diagram is used to describe the structural aspects, and UML sequence and/or activity diagrams are used to describe the behavioural aspects. When implementing, verifying and/or detecting instances of those patterns, the issue of formality arises, as UML diagrams are not formal. Many researchers have tried to formalise UML diagrams; however, loss of information and other problems result from doing so. It is important that a security pattern be implemented correctly, as an incorrect implementation might result in a security flaw. In this paper we introduce the use of LePUS3 to formally model the structural aspects of security patterns. We show a formal model of the Single Access Point (SAP) pattern in LePUS3. Furthermore, we show how the TTP Toolkit is employed to verify the design conformance of SAP in the Java Authentication and Authorization Service (JAAS).
  • Enterprise Data Protection: Meeting Requirements with Efficient and Cost-Effective Methods
    Khaled Aldossari, Saudi Aramco, Saudi Arabia
    ABSTRACT
    This paper addresses the major challenges that large organizations face in protecting their valuable data, including recovery objectives, data explosion, cost and the nature of the data. The paper explores multiple methods of data protection at different storage levels: RAID disk arrays, snapshot technology, storage mirroring, and backup and archive strategies are all methods used by many large organizations to protect their data. The paper surveys several enterprise-level backup and archive solutions on the market today and evaluates each solution against defined criteria. The evaluation criteria cover the business needs and help to tackle the key issues related to data protection. Finally, this paper provides insight into data protection mechanisms and proposes guidelines that help organizations choose the best backup and archive solutions.
  • E-Education With Facebook - A Social Network Service
    Mohammad Derawi, Gjovik University College, Norway
    ABSTRACT
    In this paper, we study the social networking website Facebook as a platform for conducting courses, replacing high-cost classical electronic learning platforms. In the early days of the Internet community, users relied on email as the main means of communication. Although email is still the essential means of communication in a convenient but offline mode, other services were introduced, such as Instant Messaging (IM) applications like ICQ, Skype, Viber, WhatsApp and MSN, which enable people to connect in real time. Communication then moved to the next phase when Facebook emerged as a social networking site supporting many features. People do not only link with others but also establish all kinds of connections between them, and Facebook offers rich functions for forming associations. The Facebook framework in fact delivers, free of charge, software capabilities that traditional electronic learning platforms provided at cost. This paper looks at how people use Facebook for teaching and learning, together with recommendations.
  • Semantic Enrichment of Xml Schema to Transform Association Relationships in Odl Schema
    Doha Malki and Mohamed Bahaj, University Hassan 1er, Morocco
    ABSTRACT
    This paper presents an approach for transforming an XML schema, which we first enrich, into ODL (Object Definition Language) schemas. Since the concepts of ODL can be realized in an XML Schema model, we propose an enrichment that embodies these concepts explicitly in XML Schema models. We chose an object-oriented database as the target because XML and the object-oriented model share many characteristics, which makes the mapping from XML data into object-oriented databases attractive; moreover, object-oriented databases have become widespread and well accepted and offer an evolutionary approach, so we believe it is time to develop a translation between XML and OO databases.
    Our work focuses on the semantics-preserving transformation of association relationships. We describe a set of rules to create ODL classes from an enriched XML schema. The experiments show that the approach is feasible and that the results agree: the source database is transformed into the target one without loss of data.
  • Optimal Path Search with Temporal Constraints
    Ravi Kishan Surapaneni and Adusumilli Sravani, V.R.Siddhartha Engineering College, India
    ABSTRACT
    Most navigational apps nowadays merely find point-to-point route specifics and cannot handle intricate search scenarios. A more elaborate navigation method, whose route search produces effective routes for complex queries in heterogeneous environments while dealing with uncertainty about geographic entities, was developed using the Batch Forward Search (BFS) algorithm. Although BFS formulated a way to integrate arbitrary constraints into a specific route search, the resulting routes may not be useful to the user. In realistic scenarios, the navigational service provider should consider additional complicating factors such as the working hours of the entities to be visited, the type of service those entities cater to, and possible restrictions on the order in which the entities may be visited. We extend the Batch Forward Search algorithm with a Temporal Approximation Algorithm to handle temporal constraints over route queries. We believe the proposed method will be effective and more elaborate than prior approaches.
  • A High Speed CLA Based SAD Computing Hardware Architecture for Motion Estimation
    Manu T M1, Linganagoud Kulkarni2 and Basavaraj S Anami1, 1KLE Institute of Technology, India and 2B.V.B college of Engineering and Technology, India
    ABSTRACT
    In this paper we propose and implement a SAD computation block, a basic building block of any motion estimation (ME) unit. Real-time video processing applications like video conferencing and TV broadcasting demand high-speed data communication within the limited bandwidth available. To realize fast computation, the hardware must have minimal delay, and the data must be compressed by a large ratio due to the limited bandwidth. Compression can be achieved by exploiting the temporal and spatial redundancy present in video frames, and delay can be reduced by designing fast computation hardware. The SAD computation block must therefore be designed to meet the requirements of real-time applications and be optimized for both speed and delay. In this paper we design fast carry-select adders and use them to calculate the absolute differences between frames.
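The arithmetic that the proposed hardware accelerates is the sum of absolute differences between candidate blocks. A software sketch of that computation and of an exhaustive block-matching search follows; the carry-select adder design itself is a hardware contribution and is not reproduced here, and the block/search sizes are illustrative:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized pixel blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def best_match(ref_block, frame, top, left, radius):
    """Exhaustive (full-search) motion estimation around (top, left).

    Returns (sad, dy, dx) for the candidate offset within +/- radius that
    minimises SAD. Every candidate SAD is independent, which is why SAD
    units are natural targets for parallel or pipelined hardware.
    """
    n = len(ref_block)
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if 0 <= y and 0 <= x and y + n <= len(frame) and x + n <= len(frame[0]):
                cand = [row[x:x + n] for row in frame[y:y + n]]
                s = sad(ref_block, cand)
                if best is None or s < best[0]:
                    best = (s, dy, dx)
    return best
```

A block that reappears shifted in the next frame yields a SAD of zero at the true motion vector, the case a motion estimation unit is built to find quickly.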
  • A New Hybrid Metric for Verifying Parallel Corpora of Arabic-English
    Saad Alkahtani, Wei Liu and William Teahan, Bangor University, United Kingdom
    ABSTRACT
    This paper discusses a new metric for verifying the quality of translation between sentence pairs in parallel Arabic-English corpora. The metric combines two techniques, one based on sentence length and the other on compression code length. Experiments on sample parallel Arabic-English test corpora indicate that the combination of these two techniques improves the accuracy of identifying satisfactory and unsatisfactory sentence pairs compared to sentence length or compression code length alone. The new method proposed in this research is effective at filtering noise and reducing mis-translations, resulting in greatly improved quality.
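The compression-code-length side of such a metric can be approximated as follows; the abstract does not state which compressor the authors use, so zlib serves here only as a stand-in for their compression model, and the ratio-based comparison is an illustrative assumption:

```python
import zlib

def code_length(text):
    """Approximate code length of a string, in bytes, via DEFLATE compression.
    A language-model-based compressor would give a tighter estimate, but any
    compressor yields a usable proxy for information content."""
    return len(zlib.compress(text.encode("utf-8")))

def compression_ratio(src, tgt):
    """Ratio of the compressed sizes of a candidate sentence pair.

    True translations tend to have comparable information content, so pairs
    whose ratio deviates far from the corpus-typical value can be flagged
    as suspect, complementing a raw sentence-length check.
    """
    return code_length(src) / code_length(tgt)
```

In practice the ratio would be compared against bounds estimated from known-good sentence pairs, alongside the sentence-length criterion the paper combines it with.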
  • Intra-cluster Routing with Back-up Path in Sensor Networks
    Turki Abdullah1, Chonggun Kim2, Mary Wu2 and Hyeoncheol Zin2, 1Yeungnam University, Saudi Arabia, 2Yeungnam University, Korea
    ABSTRACT
    Novel applications of sensor networks impose novel requirements on wireless sensor network design. Besides energy efficiency and lifetime awareness, throughput and network delay must also be maintained to support emerging applications. In this paper, we propose a throughput- and delay-aware intra-cluster routing protocol that introduces back-up links in the intra-cluster communication path. Link throughput, communication delay, packet loss ratio, interference, residual energy and node distance are the factors considered in finding an efficient data communication path among the sensor nodes within a cluster. The simulation results show higher throughput and a lower average packet delay for the proposed routing protocol than for the existing benchmarks. The routing protocol also shows energy efficiency and lifetime awareness, at the cost of a lower average connectivity rate.