International Journal of Soft Computing and Software Engineering [JSCSE]

ISSN: 2251-7545

DOI prefix: 10.7321/jscse

URL: http://JSCSE.com

A Peer-Reviewed Journal
Table of Contents [Vol. 3, No. 3, Mar]

Cover Page (PDF)


Lotfi A. Zadeh
Doi : 10.7321/jscse.v3.n3.1
Page : 1 - 2
Abstract . The theory which is outlined in this lecture, call it RCC for short, is a system of reasoning and computation which is not in the traditional spirit. In large measure, RCC is oriented toward reasoning and computation in an environment of uncertainty, imprecision and partiality of truth. The centerpiece of RCC is the concept of a restriction—a basic concept which is deceptively simple. Informally, a restriction is an answer to a question of the form: What is the value of a variable, X? More concretely, a restriction, R(X), is a limitation on the values which X can take. A restriction is precisiated if R(X) is mathematically well defined; otherwise it is unprecisiated. Generally, restrictions which are described in a natural language are unprecisiated. A restriction is precisiable if it lends itself to precisiation. A restriction is singular if R(X) is a singleton; otherwise it is nonsingular. Nonsingularity implies uncertainty. Examples. Robert is staying at a hotel in Berkeley. He asks the concierge, “How long will it take me to drive to SF Airport?” Possible answers: one hour; one hour plus minus fifteen minutes; about one hour; usually about one hour, etc. Each of these answers is a restriction on the variable, Driving time. The first two answers are precisiated restrictions. The last two answers are unprecisiated. Another example. The concept of a restriction is considerably more general than the concept of an interval, set, fuzzy set and probability distribution. In one form or another, much of human cognition involves restrictions, particularly in the realms of everyday reasoning and decision-making. Humans have a remarkable capability to reason and, to some degree, compute with restrictions. What is needed is a theory which formalizes this capability. RCC may be viewed as a step in this direction. What should be noted is that existing approaches to reasoning and computation, other than RCC, do not have the capability of reasoning and computation with restrictions which are described in a natural language.
Keyword : Restriction-Centered; Reasoning; Computation
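A minimal mathematical illustration of a precisiated, nonsingular restriction, using the standard possibilistic form from fuzzy set theory (the triangular membership function below is a hypothetical precisiation of "about one hour", not one given in the lecture):

```latex
% Possibilistic restriction on driving time X (in minutes): "X is A"
R(X):\; X \ \text{is}\ A, \qquad
\operatorname{Poss}\{X = u\} = \mu_A(u), \qquad
\mu_A(u) = \max\!\left(0,\ 1 - \frac{|u - 60|}{15}\right).
```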




Mukesh Singhal
Doi : 10.7321/jscse.v3.n3.2
Page : 3 - 4
Abstract . Cloud computing offers several benefits in terms of scalability, cost and performance. These benefits have contributed to the wide-scale acceptance of the cloud computing paradigm and growing adoption by the industry. With this growth, limitations of this paradigm are beginning to surface. One such limitation is that contemporary clouds are not interoperable. This limitation arises due to proprietary technologies, heterogeneous interfaces and the tight tethering of service offerings to the host cloud. Current research solutions for enabling cloud interoperability are predominantly provider-centric, requiring cloud providers to adopt and implement the changes that facilitate interoperation. This approach faces several hurdles and can take a long time to hit the market. In the meantime, a client-centric approach to interoperation is necessary for providing its benefits to consumers in the current cloud ecosystem. To this end, a novel framework for cloud interoperation called collaborative cloud computing is proposed. The proposed framework provides dynamic, on-the-fly collaborations and resource sharing among cloud-based services, without preestablished collaboration agreements or standardized interfaces, through use of client-controlled mediating agents called proxies.
Keyword : Client-centric; Cloud; Cloud Computing




Mehran Sahami
Doi : 10.7321/jscse.v3.n3.3
Page : 5 - 5
Abstract . Gaining insight into how students learn to program is a critical factor in improving software engineering education. Despite the potential wealth of educational indicators expressed in students' approaches to completing programming assignments, how students arrive at their final solution is largely overlooked in courses--only their final program submission is evaluated as an indicator of their understanding of how to solve a particular programming problem. In this talk, we present a methodology which uses machine learning techniques to autonomously create a graphical model of how students in an introductory programming course progress through a programming assignment. We subsequently show that this model is predictive of which students will struggle with material presented later in the class. Our eventual goal is to be able to better understand students' learning and the conceptual difficulties they may encounter as novice programmers, so as to provide better and more personalized guidance to them during their learning process, and ultimately improve education in software engineering.
Keyword : Machine Learning; Modelling; Learn; software engineering education




Ilmi Yoon
Doi : 10.7321/jscse.v3.n3.4
Page : 6 - 7
Abstract . Multiplayer online gaming, or social gaming, is a new multifaceted medium of communication that reaches the masses (players) effectively and fosters healthy interaction and team cohesion. A recent multiplayer online game, FoldIT (Univ. of Washington), enabled ordinary game players to play for leisure while their intuitive and collaborative efforts led to unlocking the structure of an AIDS-related enzyme that the scientific community had been unable to solve for a decade. Similarly, Google's image labeling game (licensed from CMU's ESP game) utilizes humans' instinctual power of intuition to perform a computational task--a task at which computers have yet to succeed--all through playing an interactive multiplayer online game, while Google collects meaningful labels for image searches. These approaches, called Crowd Computing, make use of social interaction and competitive tendencies to engage massive numbers of players to work together towards the intended objectives. 'World of Balance' is an educational multiplayer online game designed to promote the concept of ecosystem nurturing, using a scientific population dynamics simulation engine as its backend. This game opens a mutually beneficial communication channel between biologists and the masses (players). Players benefit from an entertaining nurturing game and learn important aspects of ecosystem development and food-web stability, while producing large amounts of scientific data that are useful for biologists analyzing population dynamics models and infeasible for biologists to produce on their own. The presentation focuses on the motivation, the current stage of the game, and the challenges the game development is facing now.
Keyword : Gamification; Crowd Sourcing; Human Computing; online game




Ibrahiem M. M. El Emary
Doi : 10.7321/jscse.v3.n3.5
Page : 8 - 22
Abstract . In the current time, social media plays an important role in our lives in many ways, and we can also see its use in the development and humanitarian aid fields. Social media encapsulates Internet-based applications filled with publicly available digital content that is created, reviewed, and directed by mass users. With these basic components, social networking enables interaction and communication among Internet users, allowing them to author, edit, and share numerous types of texts, pictures, videos and audio. They are able to classify and label the content as well. The most important element of social networking is that it allows for mass socialization, that is, the enablement of collective action by Internet users. This session is interested in attracting new topics covering the major role of social networks in advancing the three Cs (coordination, cooperation, and collaboration), especially in academic institutions in developing nations.
Keyword : Social Media; Coordination; Cooperation; Collaboration




Mehdi Bahrami
Doi : 10.7321/jscse.v3.n3.6
Page : 23 - 24
Abstract . Software architecture has emerged as a key sub-discipline of software engineering, particularly in the realm of large-scale system development such as cloud computing systems. Because cloud computing systems are constructed from many parts and components, the organization of the overall system and its software architecture present a new set of design problems. This level of design has been addressed in high-level design for developing any application in a cloud computing system. In this tutorial we provide an introduction to the emerging field of cloud computing software architecture. We begin by considering what architecture and cloud computing systems mean, why we need software architecture for a cloud computing system as a complex system, our motivation for study and research on software architecture, facts, goals, and a number of common architectural styles upon which many systems are currently based, and show how different styles can be combined in a single design. We highlight some of the major problems and how people have an opportunity to pursue their innovations in this field. After that we consider some issues in designing cloud computing architectures and selecting appropriate supporting technology. Then we present three case studies to illustrate how architectural representations can improve our understanding of cloud computing software systems. Finally, we survey some of the outstanding problems in the field, and consider a few of the promising research directions.
Keyword : Software Architecture; Cloud Computing Systems; Software Engineering; Network Software




Paulette Acheson, Cihan Dagli, Nil Kilicay-Ergin
Doi : 10.7321/jscse.v3.n3.7
Page : 25 - 29
Abstract . Previous papers have described a computational approach to System of Systems (SoS) development using an Agent-Based Model (ABM). This paper describes the Fuzzy Decision Analysis used in the negotiation between the SoS agent and a System agent in the ABM of an Acknowledged SoS development. An Acknowledged SoS has, by definition, limited influence on the development of the individual Systems. The individual Systems have their own priorities, pressures, and agendas, which may or may not align with the goals of the SoS. The SoS has some funding and deadlines which can be used to negotiate with the individual System in order to elicit the required capability from that System. The Fuzzy Decision Analysis determines how the SoS agent will adjust the funding and deadlines for each of the Systems in order to achieve the desired SoS architecture quality. The Fuzzy Decision Analysis has inputs of performance, funding, and deadlines as well as weights for each capability. The performance, funding, and deadlines are crisp values which are fuzzified. The fuzzified values are then used with a Fuzzy Inference Engine to get the fuzzy outputs of funding adjustment and deadline adjustment, which must then be defuzzified before being passed to the System agent. The first contribution of this paper is the fuzzy decision analysis that represents the negotiation between the SoS agent and the System agent. A second contribution is the method of implementing the fuzzy decision analysis, which provides a generalized fuzzy decision analysis.
Keyword : fuzzy decision analysis ; agent based model ; system of systems
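As a rough, self-contained sketch of the fuzzify/infer/defuzzify pipeline the abstract describes, the following Python fragment fuzzifies a crisp performance value, fires two illustrative Mamdani-style rules, and defuzzifies a funding adjustment by the centroid method. The membership functions, rule base, and scales are assumptions for illustration only, not the paper's actual model.

```python
# Minimal Mamdani-style fuzzy decision sketch: fuzzify a crisp performance value,
# fire two illustrative rules, aggregate, and defuzzify a funding adjustment by
# the centroid method. All membership functions and rules are assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def funding_adjustment(performance):
    # Fuzzification of the crisp performance input (normalized to 0..1).
    low_perf  = tri(performance, -0.5, 0.0, 0.6)
    high_perf = tri(performance,  0.4, 1.0, 1.5)

    # Output universe: funding adjustment in percent, -20% .. +20%.
    u = np.linspace(-20.0, 20.0, 401)
    increase = tri(u,   0.0,  20.0,  40.0)
    decrease = tri(u, -40.0, -20.0,   0.0)

    # Hypothetical rule base: low performance -> increase funding,
    # high performance -> decrease funding (Mamdani min/max inference).
    aggregated = np.maximum(np.minimum(low_perf, increase),
                            np.minimum(high_perf, decrease))

    # Centroid defuzzification yields the crisp value passed to the System agent.
    return float(np.sum(u * aggregated) / (np.sum(aggregated) + 1e-12))

print(funding_adjustment(0.3))   # weak capability -> positive funding adjustment
```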




Atif Ali Khan, Oumair Naseer, Evor Hines, Daciana Iliescu
Doi : 10.7321/jscse.v3.n3.8
Page : 30 - 37
Abstract . One of the defining characteristics of human beings is their ability to walk upright. Loss or restriction of this ability, whether due to accident, spinal problems, stroke or other neurological injuries, can cause tremendous stress on patients and hence contributes negatively to their quality of life. Modern research shows that physical exercise is very important for maintaining physical fitness and adopting a healthy lifestyle. Today, treadmills are widely used for physical exercise and training, enabling users to set up an exercise regime that can be adhered to irrespective of the weather. Among the users of treadmills today are medical facilities such as hospitals, rehabilitation centers, and medical and physiotherapy clinics. The process of assisted training or rehabilitation exercise on a treadmill is referred to as treadmill therapy. A modern treadmill is an automated machine with some built-in functions and predefined features. Most treadmills used today are one-dimensional, and the user can only walk in one direction. This paper presents the idea of using omnidirectional treadmills, which will be more appealing to patients as they can walk in any direction, hence encouraging them to exercise more frequently. This paper proposes a fuzzy control design and a possible implementation strategy to assist patients in treadmill therapy. By intelligently controlling the safety belt attached to the treadmill user, one can help them steer left, right or in any direction. The use of intelligent treadmill therapy can help patients improve their walking ability without being continuously supervised by a specialist. The patients can walk continuously within a limited space, and the support system will provide continuous evaluation of their state and can readjust the control parameters of the treadmill accordingly to provide the best possible assistance.
Keyword : Fuzzy Logic ; Treadmill Therapy ; Intelligent System ; Rehabilitation ; Omnidirectional Treadmill




Xiangyu Zeng, Dewang Chen, Zan Hou
Doi : 10.7321/jscse.v3.n3.9
Page : 38 - 45
Abstract . This paper adopts acceleration rate to evaluate train ride comfort. Three comfort evaluation models are developed using fuzzy system theory, and the parameters of the fuzzy sets are determined according to measured data. Furthermore, the outputs of the three models are integrated by an ensemble learning method to give a comprehensive evaluation index for ride comfort. Measured data from the train operation control system on the Beijing subway Yizhuang line are used to validate the models. Moreover, a field experiment is conducted, and the experimental results are in good agreement with the proposed methods. The results indicate that the three fuzzy models have good consistency and that ensemble learning can enhance the accuracy and robustness of the comfort evaluation.
Keyword : acceleration rate ; comfort ; fuzzy system ; ensemble learning
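A hedged sketch of the ensemble step: three stand-in comfort evaluators (taking the place of the paper's three fuzzy models) are combined by a weighted average to produce a single comfort index. The evaluator definitions, thresholds and weights are assumptions for illustration.

```python
# Combine three comfort evaluators by a simple weighted-average ensemble.
# Each evaluator maps acceleration-rate (jerk) samples to a score in [0, 1].
import numpy as np

def comfort_model_a(jerk):   # penalize mean absolute jerk (illustrative)
    return float(np.clip(1.0 - np.mean(np.abs(jerk)) / 2.0, 0.0, 1.0))

def comfort_model_b(jerk):   # penalize peak jerk (illustrative)
    return float(np.clip(1.0 - np.max(np.abs(jerk)) / 3.0, 0.0, 1.0))

def comfort_model_c(jerk):   # penalize fraction of samples above a threshold
    return float(np.clip(1.0 - np.mean(np.abs(jerk) > 1.0), 0.0, 1.0))

def ensemble_comfort(jerk, weights=(0.4, 0.3, 0.3)):
    scores = np.array([comfort_model_a(jerk), comfort_model_b(jerk), comfort_model_c(jerk)])
    return float(np.dot(weights, scores))    # comprehensive comfort index in [0, 1]

# Example: acceleration-rate samples from a short run (m/s^3).
jerk = np.array([0.2, -0.5, 1.4, -0.8, 0.3])
print(ensemble_comfort(jerk))
```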




Alexandre Melo, Christiano Maciel, Suelene Correa, Antonio Morais
Doi : 10.7321/jscse.v3.n3.10
Page : 46 - 53
Abstract . Wireless Sensor Networks (WSN) have severe energy constraints imposed by the limited capacity of the internal battery of sensor nodes. These restrictions stimulate the development of energy-efficient strategies aimed at increasing the period of stability and the lifetime of these networks. In this paper, we propose a centralized control to elect more appropriate Cluster Heads, assuming three levels of heterogeneity and multi-hop communication between Cluster Heads. The centralized control uses the k-means algorithm, responsible for the division into clusters, and fuzzy logic to elect the Cluster Heads and select the best communication route. The study results indicate that the proposed approach can increase the period of stability and the lifetime of WSN.
Keyword : heterogeneous WSN ; cluster head ; fuzzy logic ; k-means
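A minimal sketch of the clustering-plus-election idea, assuming synthetic node positions and energies: k-means partitions the nodes, and a simple weighted score of residual energy and distance to the cluster centroid stands in for the paper's fuzzy cluster head election (the weights and data are assumptions).

```python
# Divide sensor nodes into clusters with k-means, then pick one cluster head
# per cluster using a weighted score (a stand-in for the fuzzy inference step).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
positions = rng.uniform(0, 100, size=(60, 2))   # node coordinates (m), synthetic
energy = rng.uniform(0.2, 1.0, size=60)         # residual energy, normalized

k = 4
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)
centroids = np.array([positions[labels == c].mean(axis=0) for c in range(k)])

cluster_heads = []
for c in range(k):
    members = np.where(labels == c)[0]
    dist = np.linalg.norm(positions[members] - centroids[c], axis=1)
    # Favor high residual energy and short distance to the cluster centroid.
    score = 0.7 * energy[members] - 0.3 * dist / (dist.max() + 1e-9)
    cluster_heads.append(int(members[np.argmax(score)]))

print("elected cluster heads:", cluster_heads)
```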




Maher Aburrous, Adel Khelifi
Doi : 10.7321/jscse.v3.n3.11
Page : 54 - 61
Abstract . Detecting phishing websites is a complex task which requires significant expert knowledge and experience. So far, various solutions have been proposed and developed to address these problems. Most of these approaches are not able to make a decision dynamically on whether the site is in fact phished, giving rise to a large number of false positives. In this paper we have investigated and developed the application of an open source intelligent fuzzy-based classification system for e-banking phishing website detection. The main goal of the proposed system is to protect users from phishers' deception schemes, giving them the ability to detect the legitimacy of websites. The proposed intelligent phishing detection system employs a Fuzzy Logic (FL) model with classification mining algorithms. The approach combines the capabilities of fuzzy reasoning in measuring imprecise and dynamic phishing features with the capability to classify the phishing fuzzy rules. The proposed intelligent phishing website detection system was developed, tested and validated by incorporating the scheme as a web-based plug-in phishing toolbar. The results obtained are promising and show that our intelligent fuzzy-based classification detection system can provide effective help for real-time phishing website detection. The toolbar successfully recognized and detected approximately 86% of the phishing websites selected from our test data set, avoiding many misclassified websites and false phishing alarms.
Keyword : phishing website detection ; fuzzy logic ; data mining ; classification ; e-banking security ; intelligent plug-in toolbar




Mohammad Atique,Rahul Khokale
Doi : 10.7321/jscse.v3.n3.12
Page : 62 - 68
Abstract . Information retrieval on the internet is a necessity for today's quintessential technocrats. Enormous information is readily available on the internet. Information retrieval is the key application of the internet, as it provides knowledge to knowledge seekers. The volume of data on the internet is very large, and fetching the most appropriate and relevant information is the challenge in WBIR (Web Based Information Retrieval). Many internet applications need to deal with large amounts of data collected from non-technical users, which is imprecise and incomplete. In this paper, web-based information retrieval based on fuzzy logic is presented. A user query, which can be vague or imprecise, is analyzed using fuzzy inference rules, and an optimum query is generated for web crawlers so as to produce the desired web documents effectively and efficiently. The performance can be evaluated on the basis of the precision and recall parameters.
Keyword : Web Based Information Retrieval; Fuzzy Logic; Fuzzy Inference System
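For reference, the precision and recall measures mentioned for evaluation are the standard information retrieval definitions:

```latex
\mathrm{Precision} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{retrieved}|},
\qquad
\mathrm{Recall} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{relevant}|}.
```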




Shervin Ostadzadeh, Fereidoon Shams
Doi : 10.7321/jscse.v3.n3.13
Page : 69 - 74
Abstract . For the last two decades, software architecture has been adopted as one of the main viable solutions to address the ever-increasing demands in the design and development of software systems. Nevertheless, the rapidly growing utilization of communication networks and interconnections among software systems have introduced some critical challenges, which need to be handled in order to fully unleash the potential of these systems. In this respect, Ultra-Large-Scale (ULS) systems, generally considered as a system of systems, have gained considerable attention, since their scale is incomparable to the traditional systems. The scale of ULS systems makes drastic changes in various aspects of system development. As a result, it requires that we broaden our understanding of software architectures and the ways we structure them. In this paper, we investigate the lack of an architectural maturity model framework for ULS system interoperability, and propose an architectural maturity model framework to improve ULS system interoperability.
Keyword : ULS Systems ; Maturity Model ; Interoperability ; Software Architecture




Alexander Bustamante, Ernesto Galvis, Luis Gomez
Doi : 10.7321/jscse.v3.n3.14
Page : 75 - 82
Abstract . This paper presents the first version of an agile and soft method to develop business intelligence solutions. It is based upon three agile methods for software development, namely SCRUM, XP and KANBAN, and also on the Soft Systems Methodology. The model is a partial result of an ongoing research project in small, low-maturity teams with no experience in the business intelligence development process.
Keyword : Business Intelligence ; Agile Development ; Action Research ; Soft Systems Thinking




Nuzulha Ibrahim, Sazilah Salam, Emaliana Kasmuri, Norazira A Jalil, Mohd Adili Norasikin, Mohamad Riduwan Nawawi
Doi : 10.7321/jscse.v3.n3.15
Page : 83 - 93
Abstract . Most vehicle license plate recognition systems use neural network techniques to enhance their computing capability. The image of the vehicle license plate is captured and processed to produce a textual output for further processing. This paper reviews image processing and neural network techniques applied at the different stages of preprocessing, filtering, feature extraction, segmentation and recognition, in such a way as to remove noise from the image, enhance image quality and expedite the computing process by converting the characters in the image into the corresponding text. An exemplar experiment has been done in MATLAB to show the basic image processing steps, using license plates in Malaysia as a case study. The algorithm is adapted into a solution for a parking management system, and the solution is then implemented as a proof of concept for the algorithm.
Keyword : image processing ; preprocessing ; filtering ; feature extraction ; segmentation ; recognition ; experiment
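A rough Python/OpenCV sketch of the preprocessing, filtering and segmentation stages described above (the paper's experiment uses MATLAB; the file name, thresholds and size filters here are illustrative assumptions, and OpenCV 4.x is assumed for the findContours return signature).

```python
import cv2

img = cv2.imread("plate.jpg")                      # hypothetical cropped plate image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)       # preprocessing: grayscale
blur = cv2.GaussianBlur(gray, (5, 5), 0)           # filtering: noise removal
_, binary = cv2.threshold(blur, 0, 255,            # Otsu binarization, characters white
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Segmentation: candidate character regions via contours, filtered by size.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
boxes = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if 10 < h < 100 and 5 < w < 80:                # plausible character dimensions (px)
        boxes.append((x, binary[y:y + h, x:x + w]))

boxes.sort(key=lambda b: b[0])                     # order characters left to right
chars = [roi for _, roi in boxes]                  # cropped glyphs for the recognizer
print(len(chars), "candidate character regions")
```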




Sandeep Kaur
Doi : 10.7321/jscse.v3.n3.16
Page : 94 - 97
Abstract . This paper presents a comparison of two software engineering models for developing a software product and managing quality. It deals with software development through development models, which are known as software development life cycles (SDLC). The main objective of this paper is to present a comparative analysis of two models, the waterfall model and the iterative model of software engineering, by showing their flexibility for developing a good software product.
Keyword : iterative model; Software Engineering; Models; Quality Management




Mojeeb Al-Rhman Al-Khiaty, Moataz Ahmed
Doi : 10.7321/jscse.v3.n3.17
Page : 98 - 106
Abstract . Software reuse allows the software industry to simultaneously reduce development cost and improve product quality. Reuse of early-stage artifacts has been acknowledged to be more beneficial than reuse of later-stage artifacts. In this regard, early-stage reference models have been considered as good tools to allow reuse across applications within the same domain. However, our literature survey reported in this paper reveals that the problem of automatically developing reference models from given instances has not caught enough researchers’ attention yet. Accordingly, in this paper we propose a framework for building a reference model that captures the common and variable analysis/design practices, across the different applications in a domain. The framework considers multi-view models in assessing the commonalities and variabilities among given instances. The proposed framework incorporates learning capabilities to allow improving the quality and reusability of the reference model as it is being used.
Keyword : Reuse ; Reference Model ; Early-stage Artifacts ; Multi-view Similarity ; Merging ; Learning




Abdelhak BOULAALAM, El Habib Nfaoui, Omar El Beqqali
Doi : 10.7321/jscse.v3.n3.18
Page : 107 - 114
Abstract . To meet the increasing demand for customized products, all business activities performed along the product life cycle must be coordinated and efficiently managed across the extended enterprise. For this, enterprises want to retain control over the whole product lifecycle, especially when the product is in its use/recycling (End Of Life) phase. Although there have been many previous research works on product lifecycle management in the beginning-of-life (BOL) and middle-of-life (MOL) phases, few have addressed the end-of-life (EOL) phase, in particular when the product is with the customer. In this paper, based on product embedded device identification (PEID) and mobile agent technologies, and with the advent of the development of 'intelligent products', we try to improve innovation (a) by minimizing the launch phase and (b) by involving the customer in the product lifecycle.
Keyword : Product lifecycle management ; End Of Life ; innovation ; Intelligent Product ; PEID ; Mobile Agent




Hamza Onoruoiza Salami, Moataz A. Ahmed
Doi : 10.7321/jscse.v3.n3.19
Page : 115 - 122
Abstract . The benefits that can be derived from reusing software include accelerated development, reduced cost, reduced risk and effective use of specialists. Reuse of software artifacts during the initial stages of software development increases reuse benefits, because it allows subsequent reuse of later stage artifacts derived from earlier artifacts. UML is the de facto modeling language used by software developers during the initial stages of software development such as requirements engineering, architectural and detailed design. This survey analyzes previous works on UML artifacts reuse. The analysis considers four perspectives: retrieval method, artifact support, tool support and experiments performed. As an outcome of the analysis, some suggestions for future work on UML artifacts reuse are also provided.
Keyword : software reuse ; software retrieval ; UML ; CASE tool




Siti Nurul Hayatie Ishak, Ariza Nordin
Doi : 10.7321/jscse.v3.n3.20
Page : 123 - 130
Abstract . This paper presents a conceptual framework for sequencing the Participatory Action Research (PAR) methodology with the implementation of the i* modeling framework to capture the requirements of multiple roles. Multiple roles are involved in the development of an information system, and they bring different user requirements, preferences, contexts and demands, which becomes a challenge in system development because information about project monitoring is perceived by each role in accordance with its role and domain. In the development of information systems, requirements engineering (RE) is a vital methodology. RE consists of several phases, of which elicitation is crucial, since it requires the researcher to gather requirements from the users; methods of eliciting requirements are now more cooperative. Based on a preliminary study of the construction industry in Malaysia, evidence of dynamic requirements has been observed, depending on the environment, economy, technology and manpower involved in the construction project. An adaptive design for project monitoring is needed that allows the physical system to self-adapt in response to changing environments, and adaptive design requires selecting the right requirements elicitation techniques. The conceptual framework defined here shall be used to elicit requirements from a local construction company.
Keyword : Requirement Engineering ; i* modeling framework ; Participatory Action Research (PAR) ; Action Research ; Role-Oriented Adaptive Design (ROAD)




Jitender Choudhari, Ugrasen Suman
Doi : 10.7321/jscse.v3.n3.21
Page : 131 - 136
Abstract . Software is developed against prior requirements and is maintained continuously as the domain, technology, economy and other fields progress rapidly. The core activity of maintenance is code change, which modifies the code to remove a bug or add new functionality. Maintenance projects contain unstructured code due to software being patched and repatched while addressing successive customer issues. Changing unstructured code without proper test coverage is a risky job, and the software maintenance process slows down due to the lack of a proper test suite. The software maintenance process can also be affected by staff turnover, low team morale, poor visibility, the complexity of maintenance projects and the lack of communication among stakeholders. On the other hand, Extreme Programming (XP) practices such as Test Driven Development (TDD), refactoring, pair programming and collective ownership can overcome some of these maintenance challenges, to some extent even for non-XP projects. In this paper, an integrated code change approach is proposed for software maintenance using XP practices such as TDD, refactoring and pair programming. The proposed approach uses the RC story, the production code and the test code of the existing system during code change. The approach is validated by applying it to several academic software maintenance projects. It is observed that the proposed approach provides higher quality code in terms of structure, correctness, robustness and maintainability, hence improving software design. The XP-practice-based approach enhances both learning and productivity by improving courage, team morale and the confidence to support higher motivation in code change. In order to improve the proposed approach, this experiment can be replicated in the future to collect more data and validate the observations.
Keyword : Software maintenance ; extreme programming ; code change approach




Matloub Hussain, Mian Ajmal
Doi : 10.7321/jscse.v3.n3.22
Page : 137 - 145
Abstract . Demand amplification, also known as the bullwhip effect, is the amplification of demand variability as it progresses up a supply chain. The bullwhip effect has detrimental effects on the performance of supply chains. The objective of this paper is to quantify the impact of information sharing on the bullwhip effect in a model of an inventory- and order-based multi-echelon supply chain. System dynamics simulation, with the help of the iThink software package, has been applied. It has been found through simulation experiments that information sharing can be a very effective strategy to control the bullwhip effect across supply chains. Increasing the percentage of information sharing results in bullwhip reduction. In a model of a four-tier supply chain, information sharing can reduce the bullwhip effect from 20:1 to 8:1. This shows that supply chain managers can effectively reduce cost, improve customer service levels and increase the efficiency of their supply chains by sharing information across the whole supply chain.
Keyword : Supply chain; Bullwhip effect; Information sharing
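A toy order-up-to simulation, not the paper's iThink model, illustrating how sharing end-customer demand dampens the bullwhip ratio (variance of upstream orders over variance of customer demand); the lead time, smoothing constant and ordering policy are assumptions, so the exact 20:1 and 8:1 figures above should not be expected from this sketch.

```python
# Four-tier supply chain: each tier forecasts demand by exponential smoothing
# and orders its observed demand plus the change in its base-stock level.
import numpy as np

rng = np.random.default_rng(1)
T, TIERS, L, ALPHA = 2000, 4, 2, 0.3
customer = 100 + rng.normal(0, 10, T)              # end-customer demand series

def bullwhip(info_sharing):
    demand_seen = customer.copy()                  # demand arriving at the current tier
    for tier in range(TIERS):
        orders = np.empty(T)
        orders[0] = fc = demand_seen[0]
        for t in range(1, T):
            # With information sharing every tier forecasts from customer demand.
            signal = customer[t] if info_sharing else demand_seen[t]
            prev_fc, fc = fc, fc + ALPHA * (signal - fc)
            # Order-up-to: replenish observed demand plus the change in the
            # base-stock level ((L + 1) periods of forecast demand).
            orders[t] = max(demand_seen[t] + (L + 1) * (fc - prev_fc), 0.0)
        demand_seen = orders                       # this tier's orders feed the next tier up
    return np.var(demand_seen) / np.var(customer)  # bullwhip ratio at the top tier

print("bullwhip without information sharing:", round(bullwhip(False), 1))
print("bullwhip with information sharing:   ", round(bullwhip(True), 1))
```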




Uttam Bhattacharya
Doi : 10.7321/jscse.v3.n3.23
Page : 146 - 148
Abstract . It is a continuing challenge for the Quality Assurance (QA) function to prove its relevance to business in the current market scenario. This necessitates moving away from a traditional compliance focus to value addition through proactive risk identification, highlighting and escalation in projects and programs. A Risk Based Quality Assurance (RBQA) framework is proposed to help the program or project proactively identify risks and come up with optimal mitigation strategies. The Quality Assurance function will play a major role in implementing this framework, which ultimately preserves the relevance of the function from a business perspective.
Keyword : Risk assessment ; Quality Assurance ; Risk based Quality Assurance ; Proactive Quality Assurance ; Business aligned QA




Afnan Bashir, Ghulam Rasool, Komal Bashir, Ayesha Haider Ali, Faria Kanwal
Doi : 10.7321/jscse.v3.n3.24
Page : 149 - 155
Abstract . The accurate recovery of design patterns from software applications is still debatable, and it depends on the different types of analysis performed on the source code during pattern recovery. Structural, behavioral and semantic analysis methods are used to extract patterns from source code. Most approaches use a combination of these analysis methods to extract patterns from different applications, but this makes the recovery process heavyweight. We present a novel design pattern recovery technique based on attributes from .Net applications using only semantic analysis. The implemented attributes enhance the comprehension of source code related to design patterns. A prototyping tool is developed to realize the concept of the approach.
Keyword : Design patterns ; Reverse engineering ; Patterns recovery ; Pattern evaluation ; Documentation recovery




Hamza Salami, Moataz Ahmed
Doi : 10.7321/jscse.v3.n3.25
Page : 156 - 162
Abstract . Software is typically modeled from different viewpoints such as the structural view, behavioral view and functional view. Few existing works can be considered as applying multi-view retrieval approaches. A number of important issues regarding the mapping of entities during multi-view retrieval of UML models are identified in this study. In response, we describe a framework for reusing UML artifacts, and discuss how our retrieval approach tackles the identified issues.
Keyword : UML ; software reuse ; software retrieval ; multi-view ; genetic algorithm




Komal Bashir, Faria Kanwal, Afnan Bashir, Ayesha Haider Ali
Doi : 10.7321/jscse.v3.n3.26
Page : 163 - 168
Abstract . Safety critical systems play a significant role in almost every domain of technology. All such systems, ranging from nuclear power stations to cars, in one way or another impact the environment or human life. Failure or malfunction in such systems may severely harm people's lives and the environment, so the highest level of accuracy and perfection is required of them. It is important to ensure quality features and plausible outcomes with respect to the intended role for which the safety critical system was designed. This paper proposes a framework composed of optimal Software Quality Assurance practices for the development of such systems, presenting nine phases that solidify the quality assurance perspective of these systems.
Keyword : Safety Critical Software; Software Quality Assurance




Hamdi Al-Jamimi, Moataz Ahmed
Doi : 10.7321/jscse.v3.n3.27
Page : 169 - 176
Abstract . The analysis and design phases are the most crucial part of the software development life-cycle. Reusing the artifacts of these early phases is very beneficial for improving productivity and software quality. In this paper we analyze the literature on the automatic transformation of artifacts from the problem space (i.e., requirement analysis models) into artifacts in the solution space (i.e., architecture, design and implementation code). The goal is to assess the current state of the art with regard to the ability to automatically reuse previously developed software designs in synthesizing a new design for a given requirement. We surveyed various related areas such as model-driven development and model transformation techniques. Our analysis revealed that this topic has not been satisfactorily covered yet. Accordingly, we propose a framework consisting of three stages to address the limitations left uncovered by current approaches.
Keyword : software analysis ; design ; model transformation ; software reuse




Christian Benjamin Ries, Vic Grout
Doi : 10.7321/jscse.v3.n3.28
Page : 177 - 186
Abstract . This paper describes the verification process for the UML4BOINC stereotypes and their semantics. Verification can be carried out in several ways, and this paper presents three: (i) specifications of Domain-Specific Modeling Languages (DSMLs), (ii) the use of C++ models, and (iii) the use of visual models created with Visu@lGrid [12], [17]. As a consequence, specific code generators for the transformation of these models into applicable parts of a Berkeley Open Infrastructure for Network Computing (BOINC) project [1] are implemented. To aid understanding of how the transformation is realised, a brief introduction to language recognition and to how code generators can be implemented using ANTLR (ANother Tool for Language Recognition) [11] is given. This paper does not cover all transformations because most of them can vary, i.e. they depend on the target language (e.g. C++) and on how tool vendors handle semantic models. In addition, steps three and four are realised within the research iterations of this paper.
Keyword : DSML; BOINC; UML; ANTLR; AST




Kushal Ahmed
Doi : 10.7321/jscse.v3.n3.29
Page : 187 - 198
Abstract . Behavior Engineering (BE) provides a rigorous way to derive a formal specification of a software system from requirements written in natural language. Its graphical specification language, the Behavior Tree (BT), has been used with success in industry to systematically translate large, complex, and often erroneous requirements into an integrated model of the software system. BE's process, the Behavior Modeling Process (BMP), allows requirements to be translated into individual requirement BTs one at a time, which are then integrated to form a holistic view of the system. The integrated BT then goes through a series of modifications to construct a specification BT, which is used for validation and verification. The BMP also addresses different types of defects in the requirements throughout its process. However, BT itself is a graphical modeling notation, and the types of integration relations, how they correspond to particular issues, how they should be integrated and how to obtain a formal specification have not been clearly defined. As a result, the BMP is informal, and provides guidelines to perform all these tasks on an ad-hoc basis. In this paper, we first introduce a mathematical framework which defines the graphical form of BTs, which we use to define the integration relationships of BTs and to formalize the integration strategy of the BMP. We then formulate semi-automated requirements defect detection techniques by utilizing this underlying mathematical framework, which may be extended to formalize the BMP, develop a change management framework for it, build techniques for round-trip engineering, and so on.
Keyword : Behavior Engineering ; Behavior Trees ; Requirements Engineering ; Software Requirements Specification ; Defects Detection




Amine ACHOURI, Leila Jemni Ben Ayed
Doi : 10.7321/jscse.v3.n3.30
Page : 199 - 204
Abstract . Giving a formal semantics to a UML Activity Diagram (UML AD) is a hard task. The reason for this difficulty is the ambiguity and the absence of a precise formal semantics for such a semi-formal formalism. A variety of semantics exist in the literature tackling the aspects covered by this language; examples include denotational, functional and compositional semantics. To cope with the recent tendency to give a heterogeneous semantics to UML diagrams, we aim to define an algebraic presentation of the semantics of UML AD. In this work, we define a formal semantics of UML 2.0 AD based on institution theory. The UML AD formalism is a graphical language with no precise formal semantics, so we use institution theory to define the intended semantics. Thus, the UML AD formalism will be defined in its own natural semantics.
Keyword : Institution theory ; UML 2.0 Activity Diagram ; Formal semantic




Aymen LOUATI, Chadlia JERAD, Kamel BARKAOUI
Doi : 10.7321/jscse.v3.n3.31
Page : 205 - 211
Abstract . Thanks to its graphical notation and simplicity, the Unified Modeling Language (UML) is a de facto standard and a widespread language used in both industry and academia, despite the fact that its semantics is still informal. The Interaction Overview Diagram (IOD), introduced in UML2, allows the specification of behavior in a hierarchical way. In this paper, we make a contribution towards a formal dynamic semantics of UML2. We start by formalizing the hierarchical use of IODs. Afterwards, we complete the mapping of IODs, Sequence Diagrams and Timing Diagrams into Hierarchical Colored Petri Nets (HCPNs) using Timed Colored Petri Nets (timed CP-nets). Our approach helps designers to benefit from abstraction as well as refinement at more than two levels of hierarchy, which reduces verification complexity.
Keyword : IOD ; hierarchical use ; formal semantics ; HCPNs ; timed CP-net ; verification




Siddappa G Makanur, Shivanand M Handigund, M. Sreenivasa Rao
Doi : 10.7321/jscse.v3.n3.32
Page : 212 - 218
Abstract . The Object Oriented (OO) concept is widely accepted for software development by the software development community for its naturalness and mathematical rigor, and Object Oriented Analysis & Design (OOAD) is supported by the Unified Modeling Language (UML). Although OO Technology (OOT) is developed with state-of-the-art technology, it is passive, i.e. it cannot be used for the development of software projects on its own. On the other hand, the Network Database Management System (NDBMS) is active for the implementation of information systems, but suffers from a lack of au courant technology. The two paradigms have complementary characteristics. This paper identifies these complementary virtues and lacunae of both paradigms and superimposes one over the other; the superimposed paradigm nullifies the lacunae of one with the virtues of the other and vice versa. This superimposition is used to develop a robust object network database management system, which necessitates establishing compatibility among the model elements of OOT and NDBMS. In this paper, we have developed an ameliorated methodology that brings compatibility between these two paradigms. It transforms the OOT paradigm from passive to active while providing NDBMS with a state-of-the-art coating without increasing its complexity. We have mapped model elements as follows: classes to record types; inter-relationships such as association, composition and aggregation to set types; and inheritance of superclasses and subclasses to record types. At present this is achieved at the cost of introducing the constraint that subclasses are non-overlapping.
Keyword : Modeling elements ; UML ; OOAD ; OOT ; CODASYL ; NDBMS ; Bachman diagram ; ORDBMS ; OODBMS ; Multiple Inheritance




José Jairo CAMACHO, Jenny Marcela SANCHEZ-TORRES, Ernesto GALVIS-LISTA
Doi : 10.7321/jscse.v3.n3.33
Page : 219 - 229
Abstract . Purpose - There has been increasing interest in knowledge management in software engineering in recent years; most of the attention has focused on knowledge codification and sharing, but less on knowledge transfer. The purpose of this paper is to make a general review of the work done on knowledge transfer in software engineering. Design/methodology/approach - An opportunistic systematic literature review protocol was applied, seeking to answer which topics in knowledge transfer in software engineering as a whole have been studied, which parts of software engineering need deeper study, and how knowledge transfer could be measured. Findings - Studies of knowledge transfer in software engineering can be classified in two ways: first by company size (multinational, big, medium and small) and social capital, and second by the software process, where it was found that software requirements have not been deeply studied from the knowledge transfer perspective. Regarding measurement, it was found to be a topic still in its infancy. Originality/value - This paper uses a systematic review protocol to better understand what work has been done concerning knowledge transfer in software engineering, and argues why more attention is needed on knowledge transfer in software requirements.
Keyword : Knowledge Transfer Process; Knowledge Management; Software Engineering; Systematic Review




Venu Madhav, Rajalakshmi Selvaraj
Doi : 10.7321/jscse.v3.n3.34
Page : 230 - 235
Abstract . In every field, data is the most significant asset, and its value is exposed only when the knowledge concealed in raw data is uncovered. Data mining is the non-trivial extraction of novel, actionable and implicit information from huge volumes of data; it is an investigation of data that aims to find the most helpful patterns and information, which are not directly understandable to the data user. It computes the patterns and relationships in raw data and distributes the outcome, which can be either assessed by a human analyst or used in an automatic decision support system. Data mining involves several simple stages: pre-processing the data, choosing a suitable mining algorithm, and post-processing the mined result. At every stage, many options are possible. Following this process, the proposed system operates by selecting the data mining algorithm suited to the user's needs. The proposed system first pre-processes the data, converting it into the most appropriate form for the selected algorithm; the pre-processed, mined data is then post-processed to acquire a pattern as commanded by the user, and the main aim of the project is achieved by deploying the developed pattern. The proposed system has a general structure that can be used with any type of data set and serves as a tool to facilitate human analysis; such a tool is mostly used to make sense of huge quantities of data, which need processing to reach knowledgeable conclusions. The conclusions are created by the back-propagation method in a neural network. Finally, the performance and evaluation of the proposed system are shown, demonstrating that the proposed system is a good tool for choosing the appropriate algorithm in data mining.
Keyword : Data mining; A-priori algorithm; Pincer search algorithm; Pattern Recognition




Arun Nambiar
Doi : 10.7321/jscse.v3.n3.35
Page : 236 - 239
Abstract . Knowledge management is indispensable in today's highly globalized economy. Effective management of information and people is vital for companies to survive in this age of stiff competition, shrinking margins and short product lifecycles. It is important for a knowledge management system to facilitate effective integration between information and personnel in order to foster creativity and innovation. In this work, we look at the various aspects of knowledge management and the challenges involved in effectively implementing an efficient knowledge management system that encourages cooperation and collaboration in a global organization.
Keyword : knowledge management ; knowledge acquisition ; innovation ; information ; technology




Gang Ren, Zhe Wen, Xuchen Yang, Mark Bocko, Dave Headlam
Doi : 10.7321/jscse.v3.n3.36
Page : 240 - 247
Abstract . Semantic musical features reflect in-depth understanding of music, instead of the uninterpreted music content, and serve as ideal choices for multimedia content annotation. The proposed semantic music features are based on human music interpretations and their computational implementations. When employed in multimedia applications, these features enable us to simulate human-music interactions. This musical relevance provides significant performance improvement over conventional score- or audio-based multimedia annotation systems. Two types of semantic musical features, reductive music analysis features and musical expressive features, are introduced. The details of their feature extraction algorithms and semantic interpretations are also illustrated.
Keyword : knowledge engineering ; multimedia annotation ; feature analysis ; human-computer interaction




Patrick Uhr, André Klahold, Madjid Fathi
Doi : 10.7321/jscse.v3.n3.37
Page : 248 - 254
Abstract . This paper introduces a new concept for imitating the human ability to build word associations, henceforth called 'CIMAWA'. We have carried out comprehensive case studies to evaluate its ability to imitate human word association, using existing studies to compare and examine the performance of our approach. The results reveal that CIMAWA imitates human word association very accurately and is superior to existing approaches.
Keyword : Asymmetrical Word Association ; Knowledge Discovery from Text ; Text-Mining




Assem Shormakova, Aida Sundetova
Doi : 10.7321/jscse.v3.n3.38
Page : 255 - 259
Abstract . The goal of this article is to examine grammatical and lexical problems that we often face while translating English texts, without giving a detailed account of each grammatical or lexical phenomenon. Only some aspects of the given phenomena are reviewed in the article, particularly the ones that are of linguistic-culturological interest with respect to translation from English to the Kazakh language on the Apertium platform.
Keyword : Machine translation; English; Kazakh; Apertium




Mohammed Abu Shquier, Omer Abu Shqeer
Doi : 10.7321/jscse.v3.n3.39
Page : 260 - 270
Abstract . Arabic is a highly inflectional language, with a rich morphology, relatively free word order, and two types of sentences: nominal and verbal. Arabic natural language processing in general is still underdeveloped, and Arabic natural language generation is even less developed [32]. Word ordering plays an important role in the translation process between languages. This research presents work in progress examining the implications of using verb-subject-object (VSO) and subject-verb-object (SVO) word orders when dealing with the agreement requirements of irregular verbs in MT. Several distinguishing cases of Arabic pertinent to MT are explored in detail, with reference to some potential difficulties that they might present. Irregular verbs can be defined as verbs that act differently from the basic patterns in all or some cases [31]; the definition of irregular verbs covers doubled, hamzated and weak verbs. There are four categories of weak verbs depending on the position of the weak letter/vowel in the root (first, middle, last letter, or more than one letter). The paper presents a formalism to best suit word orders, based on rules and examples drawn from part of the morphological knowledge of the Arabic language concerning irregular verbs and their derivatives. We first perform a thorough study of irregular verbs in the Arabic language and propose a model that is based on set theory and ontologies. We then show how this model can be used for applications that include NLP applications. Approach: the main objective of this research is to reinforce a hybrid-based MT system (EA-HBMT) to improve the quality of MT from English to Arabic. The Arabic lexicon is supported by a strong theoretical framework and implemented using robust tools that facilitate its implementation. Rules are used to recognise the derivative and inflexional nature of the Arabic language. Transfer-based MT is used to obtain an intermediate representation that captures the “meaning” of the original sentence in order to generate the correct translation, and an example-based technique is used as well to handle the irregular cases. A semantic process is mainly conducted to detect the statements that require the use of the SVO construction rather than VSO. Results: in this paper we built a module to detect irregular verbs, i.e. doubled, hamzated, mithal, hollow, defective, and enfolding verbs. A set of 30 rules has been constructed based on the tense of the verb; the place of the vowel root letter; first, second or third person representation; number and gender features; and the diacritics preceding the vowel letter, i.e. the nominative, accusative or genitive case. Our proposed module has been effectively evaluated using real test data and achieved satisfactory results.
Keyword : agreement; irregular verb; hamzated verb; doubled verb; hollow verb; defective; EA-HBMT




Aziz-Ud Din, Bali Ranaivo-Malancon, Alvin W. Yeo
Doi : 10.7321/jscse.v3.n3.40
Page : 271 - 274
Abstract . This paper is related to the field of Natural Language Generation (NLG). NLG is a subfield of NLP, which is itself a subfield of AI. This paper describes the development of a Pashto language generation system, which is in the early stages of development. The system is based on combinatory categorial grammar, which is derived from categorial grammar, and is being implemented using the open source OpenCCG toolkit. The special focus of the generation process is on the generation of clitics and endoclitics, which will be incorporated into the final system when it is complete.
Keyword : Clitics; Endoclitic; Natural Language Generation; Prosody; Syntax




Ahmad Zmily, Dirar Abu-Saymeh, Dhiah Abou Tair
Doi : 10.7321/jscse.v3.n3.41
Page : 275 - 283
Abstract . With the advent of mobile devices and social networks, information and identity security concerns have increased. Mobile devices that have multiple sensing capabilities have been interfaced with social networks allowing users to post many of their activities and habits instantly onto social networks. This information can easily be used by social network providers to invade privacy and pose security risks to users. In this paper we propose an identity security framework that encapsulates a generic interface between mobile devices and social networks that utilizes identity hopping to secure and hide users' real identities. The framework also employs anti-correlation measures to prevent social network providers from being able to correlate the identities together. The proposed framework has been implemented on the Google Latitude application as a case study to hide users' real identities and prevent the service provider from tracking complete movement habits. The implementation shows the effectiveness of the proposed interface in enhancing identity security.
Keyword : Identity Security ; Hopping ; Location Privacy ; Security Framework ; Social Media Applications




Mohammed Shalhoub, Abdula Bataweel, Hassan A. Alsereihy
Doi : 10.7321/jscse.v3.n3.42
Page : 284 - 288
Abstract . The main objective of this article is to examine overall information security by addressing readiness against some of the more efficient attacks, namely attacks that target people. This was achieved by studying previous work in the field of information security and other relevant research areas. We also discuss the use of social engineering techniques against enterprise users. Through the application of social engineering methods, we discuss how to bridge the gap between users and the information security group, so as to achieve the best security awareness, improved compliance with information security policy, and the least difficulty in user acceptance. We conclude that information security awareness training should be given to all employees in the organization.
Keyword : Information security; Social engineers




Bassant Abagouri, Ibrahiem M. M. El Emary, Bader A. Alyoubi
Doi : 10.7321/jscse.v3.n3.43
Page : 289 - 294
Abstract . Adequate attention should be paid to social engineering and offensive security because of their ability to take advantage of the weakness of human trust. A social engineering attack can successfully lead to other serious crimes such as identity theft and industrial espionage, not only at the organizational level but also at the individual level. This paper examines some details of a social engineering attack carried out against an organization with its permission. The paper concludes with important recommendations for reducing the threat of social engineering; since human attention is the main target of social attacks, care should be taken to avoid the problems faced by various organizations.
Keyword : Information Systems; Attack; Social Engineering




Bader Alyoubi, Adel A. Alyoubi
Doi : 10.7321/jscse.v3.n3.44
Page : 295 - 300
Abstract . This report examines the role of social engineering with reference to the computer security community. Social engineering is a method used by fraudsters to fool and cheat people into revealing their passwords, usernames and other confidential information such as server details and organization access information. The hacker can then use this information to enter the computer network, steal data and carry out other attacks. Social engineering is regarded as more dangerous because it does not require the expert programming skills that hacking does. The report examines different types of social engineering methods and techniques and the methods used to fight such attacks, and a few incidents of such attacks are also presented. The main issue that has emerged is that the social engineer is now also an expert hacker who combines the skills of a fraudster with those of an expert hacker, which has increased the sophistication of the attacks. The paper recommends that a primary and secondary study be undertaken to evaluate how social engineering attacks are carried out and prevented by the security community of various organizations.
Keyword : social engineering; computer fraud; hacking; phish




Adel Alyoubi, Bader A. Alyoubi
Doi : 10.7321/jscse.v3.n3.45
Page : 301 - 307
Show Summary
Abstract . The role of social networks in the cognitive benefit of modern societies is explored in this paper, drawing on various studies and a literature review. Social networks play a crucial role in the development of an individual's positive cognitive processes. Studies have indicated the various physical and cognitive impacts of social networking among individuals, and even recent tragic events show that humans are inherently social and that interaction with others improves their thinking and reasoning processes. Ways to maximize social networking online and offline should be considered by societies in order to minimize aggression, crime, and even self-destruction.
Keyword : Cognition; Cognitive Benefits; Social Networks; Modern Societies




Vasu Jain
Doi : 10.7321/jscse.v3.n3.46
Page : 308 - 313
Show Summary
Abstract . Social media content contains rich information about people's preferences. For example, people often share their thoughts about movies on Twitter. We analyzed tweets about movies to predict several aspects of movie popularity. The main result we present is a prediction of whether a movie will be successful at the box office.
Keyword : Sentiment Analysis; Twitter; Prediction Model; Opinion Mining; Tweet Analysis; Movie Success Prediction
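The prediction task described above (from tweet sentiment to a box-office outcome) can be made concrete with a deliberately simplified sketch. The word lists, sample tweets, and the positivity-ratio threshold below are illustrative assumptions, not the authors' model or data:

```python
# Minimal illustrative sketch: naive word-list sentiment over hypothetical tweets,
# then a simple positivity-ratio threshold as a stand-in for a success predictor.
POSITIVE = {"great", "awesome", "loved", "amazing", "must-see"}
NEGATIVE = {"boring", "awful", "hated", "terrible", "flop"}

def tweet_score(tweet: str) -> int:
    """Return +1 / -1 / 0 for a single tweet based on word-list hits."""
    words = set(tweet.lower().split())
    return (len(words & POSITIVE) > 0) - (len(words & NEGATIVE) > 0)

def predict_success(tweets, threshold=0.6):
    """Predict 'hit' if the share of positive-leaning tweets exceeds the threshold."""
    scores = [tweet_score(t) for t in tweets]
    positive_ratio = sum(s > 0 for s in scores) / max(len(scores), 1)
    return positive_ratio >= threshold, positive_ratio

if __name__ == "__main__":
    sample = ["loved the movie, awesome cast", "a bit boring in the middle",
              "must-see this weekend", "great soundtrack"]
    hit, ratio = predict_success(sample)
    print(f"positive ratio = {ratio:.2f}, predicted hit = {hit}")
```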




Kotaiah Bonthu, Raees Ahmed Khan, Muralidhar Vejendla
Doi : 10.7321/jscse.v3.n3.47
Page : 314 - 319
Show Summary
Abstract . Software projects have become critical systems nowadays. Measuring software reliability in a continuous and disciplined manner leads to accurate estimation of project costs and schedules and improves product and process quality. Detailed analysis of software metric data also gives important clues about the locations of possible errors in the code. The objective of this paper is to establish a method for identifying software errors using machine learning methods. We construct a two-step model that predicts potentially error-prone modules within a given set of software modules, with respect to their metric data, using Artificial Neural Networks. The data set used in the experiments is organized into two parts for learning and prediction purposes: the training set and the testing set. The experiments show that the two-step model enhances error prediction performance and thus improves software reliability.
Keyword : Machine Learning Techniques; Software Reliability; Software Error Prediction; Artificial Neural Networks
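As a hedged illustration of metric-based error prediction with an artificial neural network, the sketch below trains an MLP on synthetic module metrics; the metric columns, labels and single-step architecture are assumed stand-ins, not the paper's data set or two-step model:

```python
# Hedged sketch of metric-based error prediction with an artificial neural network.
# The metric columns and labels below are synthetic placeholders, not the paper's data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 200
# Columns: lines of code, cyclomatic complexity, number of changes (illustrative metrics).
X = rng.normal(loc=[300, 10, 5], scale=[120, 4, 3], size=(n, 3))
# Synthetic rule: larger, more complex modules are more likely to be error-prone.
y = ((X[:, 0] > 350) & (X[:, 1] > 11)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```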




Abdul Rauf, Eisa A. Aleisa, Imam Bakhsh
Doi : 10.7321/jscse.v3.n3.48
Page : 320 - 325
Show Summary
Abstract . A Graphical User Interface (GUI) is a means of interaction between an end user and a software system. Software systems have gained unprecedented popularity in the last twenty years or so, and the biggest factor behind this success is the graphical user interface. Software companies and teams have always sought fully assured, high-quality software. To fulfill this need, software must go through intensive testing, but it is almost impossible to test a GUI application manually due to the complexity involved in such an effort. The obvious alternative is automated testing. Models or graphs are considered the basis for automated GUI testing, and the event-flow graph is one of several efforts toward automating it. What constitutes testing thorough enough to satisfy a test organization or team is also a question lacking consensus among researchers. Usually a test criterion corresponds to a "coverage function" that measures how much of the automatically generated output satisfies the given criterion. Our past work has demonstrated that, with the help of evolutionary algorithms and the event-flow representation, we can obtain promising test coverage of GUI applications. We now extend our previous work and propose the use of an evolutionary algorithm to pursue multiple objectives: maximizing coverage while keeping the number of test cases to a minimum. The evolutionary algorithm we use for this purpose is the Non-dominated Sorting Genetic Algorithm II (NSGA-II).
Keyword : NSGA; Testing Event Driven Software; Search Based Software Testing; GUI Testing
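The two objectives named above, maximal event coverage with a minimal number of test cases, rest on Pareto dominance, which NSGA-II uses for its non-dominated sorting. The sketch below shows only that dominance check over hypothetical candidate suites; it is not the authors' NSGA-II implementation:

```python
# Hedged sketch of the two objectives and Pareto filtering behind an NSGA-II style search:
# maximize event coverage while minimizing the number of test cases. The candidate
# suites below are hypothetical (coverage fraction, number of test cases) pairs.
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better in one."""
    cov_a, size_a = a
    cov_b, size_b = b
    no_worse = cov_a >= cov_b and size_a <= size_b
    strictly_better = cov_a > cov_b or size_a < size_b
    return no_worse and strictly_better

def non_dominated(candidates):
    """Return the Pareto front of candidate test suites."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

if __name__ == "__main__":
    suites = [(0.95, 120), (0.95, 90), (0.80, 40), (0.60, 45), (0.99, 300)]
    print("Pareto front:", non_dominated(suites))
```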




Suresh Yeresime, Santanu Ku Rath
Doi : 10.7321/jscse.v3.n3.49
Page : 326 - 332
Show Summary
Abstract . Software testing is a process to determine the quality and reliability of software, which can be achieved with the help of proper test data. However, doing this manually is a difficult task due to the number of predicate nodes in a module, which makes the problem NP-complete. Therefore, intelligence-based search algorithms have to be used to generate test data. In this paper, we use a soft-computing approach, a genetic algorithm, to generate test data based on the set of basis paths. The approach combines the characteristics of the genetic algorithm with test data, making use of its global and local optimization capabilities to improve test data generation. This automated, optimized process of generating test data helps reduce a tester's effort and time. Finally, the proposed approach is applied to an ATM withdrawal task. Experimental results show that the genetic algorithm was able to generate suitable test data based on a fitness value and to avoid redundant data through optimization.
Keyword : genetic algorithm; basis path; test data; cyclomatic complexity; fitness function
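As a hedged sketch of the idea, the snippet below runs a tiny genetic algorithm that searches for inputs covering one branch of a hypothetical ATM withdrawal check, using a branch-distance-style fitness; the predicate, fitness and GA operators are illustrative assumptions, not the paper's:

```python
# Hedged sketch: a tiny genetic algorithm that searches for test data exercising one
# branch of a hypothetical ATM withdrawal check. The fitness is a branch-distance-style
# measure (how far the input is from making the predicate true), not the paper's own.
import random

def branch_distance(amount, balance):
    """Target branch: 0 < amount <= balance and amount % 10 == 0. Distance 0 means covered."""
    d = 0.0
    if amount <= 0:
        d += 1 - amount
    if amount > balance:
        d += amount - balance
    if amount % 10 != 0:
        d += amount % 10
    return d

def evolve(pop_size=30, generations=50):
    random.seed(1)
    pop = [(random.randint(-500, 2000), random.randint(0, 1000)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: branch_distance(*ind))
        if branch_distance(*pop[0]) == 0:
            break
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                       # one-point crossover on the two genes
            if random.random() < 0.3:                  # mutation: nudge the amount
                child = (child[0] + random.randint(-20, 20), child[1])
            children.append(child)
        pop = parents + children
    return pop[0]

print("best test data (amount, balance):", evolve())
```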




Pourya Nikfard, Suhaimi Ibrahim, Mohammad Hossein Abolghasemzadeh
Doi : 10.7321/jscse.v3.n3.50
Page : 333 - 341
Show Summary
Abstract . Testing is one of the five main technical activity areas of software engineering. Software testing is important in the quality assurance of web applications, in which test cases are crucial. Compared with other software systems, web applications differ in many ways in the testing process. Web testing is an effective technique to ensure the quality of web applications, and a number of approaches have been presented to tackle this problem. In this paper, we classify these approaches into two classes (functional requirement and non-functional requirement). We then evaluate them against several criteria (such as testing criteria, technique used, performance, reliability, accuracy, and testing method). This categorization will help researchers working on web application testing to deliver more applicable solutions.
Keyword : Software Testing; Web Application testing; comparative evaluation; approaches; web application testing approaches




Jiangcheng Chen, Xiaodong Zhang, Lei Zhu
Doi : 10.7321/jscse.v3.n3.51
Page : 342 - 345
Show Summary
Abstract . The recursive kinematics equations were built using the modified D-H method after the structure of a rehabilitation lower-extremity exoskeleton was analyzed, and a numerical algorithm for the inverse kinematics was given. A three-dimensional simulation model of the exoskeleton robot was then built in MATLAB; based on this model, a 3D reproduction of a complete gait was achieved. Finally, the reliability of the numerical inverse-kinematics algorithm was verified by the simulation results. This work lays a foundation for developing a three-dimensional simulation platform for the exoskeleton robot.
Keyword : Kinematics analysis ; rehabilitation ; extremity exoskeleton robot ; three-dimensional simulation
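The D-H-based forward kinematics mentioned above can be illustrated with a small sketch; it uses the standard D-H convention (the paper uses the modified convention) and assumed link lengths for a two-joint leg, so it is only a structural analogue of the MATLAB model:

```python
# Hedged sketch of forward kinematics from Denavit-Hartenberg parameters (standard
# convention shown here; the paper uses the modified D-H method). Link lengths for the
# two-joint "thigh/shank" chain below are illustrative placeholders.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one link from its standard D-H parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(joint_angles, dh_table):
    """Chain the link transforms and return the end-effector (ankle) position."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]

# Planar 2-link leg: hip and knee joints, 0.45 m thigh and 0.40 m shank (assumed values).
dh_table = [(0.0, 0.45, 0.0), (0.0, 0.40, 0.0)]
print("ankle position:", forward_kinematics([np.deg2rad(30), np.deg2rad(-20)], dh_table))
```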




Carlos Daniel Nocito, Miroslav Kubat
Doi : 10.7321/jscse.v3.n3.52
Page : 346 - 350
Show Summary
Abstract . The use of data mining and adaptive learning is a very controversial issue in the algorithmic trading community of the financial world. The mistrust of these techniques arises from some well-known problems: overfitting to training data and insufficient support for the derived models. In this paper, we add a new element to the use of classic data mining and adaptive learning algorithms: a set of objective distance measurements that track the similarity between the prediction model and the actual system. We use historical market data to develop the algorithms and investigate the correlation between the prediction accuracy of the models and their distance measurements. We find that this tracking could allow investors to discard stale models earlier, thus decreasing losses.
Keyword : algorithmic trading; decision trees; performance tracking; data mining; jensen-shannon; kullback-leibler
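The keywords point to Kullback-Leibler and Jensen-Shannon measures; a minimal sketch of tracking such a distance between a model's predicted outcome distribution and the observed one is shown below, with made-up histograms rather than market data:

```python
# Hedged sketch of the distance measurements named in the keywords: Kullback-Leibler and
# Jensen-Shannon divergence between a model's predicted outcome distribution and the
# recently observed one. The two histograms below are illustrative, not market data.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def js_divergence(p, q):
    p = np.asarray(p, dtype=float); q = np.asarray(q, dtype=float)
    m = 0.5 * (p / p.sum() + q / q.sum())
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

predicted = [0.50, 0.30, 0.20]   # model's distribution over {up, flat, down}
observed  = [0.35, 0.30, 0.35]   # empirical distribution in the latest window
print("KL:", kl_divergence(observed, predicted), "JS:", js_divergence(observed, predicted))
# A rising divergence over successive windows would flag the model as stale.
```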




Jing Huang, Kai Wu, Lok Kei Leong, Seungbeom Ma, and Melody Moh
Doi : 10.7321/jscse.v3.n3.53
Page : 351 - 358
Show Summary
Abstract . Cloud computing uses a great amount of heterogeneous resources to deliver countless services to users with distinct quality-of-service (QoS) requirements. Numerous diverse tasks need to be carried out to meet vastly different QoS and budget requirements, so workflow scheduling is critical for the success of large-scale cloud computing. Particle Swarm Optimization (PSO) has been adopted for workflow scheduling in cloud computing, yet most existing works focus on a single objective. This paper proposes a tunable fitness function for the PSO algorithm, based on which a workflow schedule may be selected for minimal cost, minimal makespan (completion time), or any trade-off in between. A heuristic is further proposed to address bottleneck problems and attain a smaller makespan. Performance evaluation and complexity analysis show that the proposed algorithm surpasses existing ones in both cost and makespan while maintaining a reasonable load balance and the same time complexity. We believe that the tunable fitness function-based PSO has many potential applications in other soft computing and distributed computing models.
Keyword : cloud computing; makespan; particle swarm optimization (PSO); soft computing; workflow scheduling
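A hedged sketch of a tunable fitness function in the spirit described above is given below: a weight alpha trades off normalized cost against normalized makespan for a candidate task-to-VM assignment. Task lengths, VM speeds and prices are invented, and the PSO search loop itself is omitted:

```python
# Hedged sketch of a tunable fitness function: alpha in [0, 1] trades off normalized
# monetary cost against normalized makespan for a candidate task-to-VM assignment.
# Task lengths, VM speeds and prices are made up; the PSO search itself is omitted.
TASK_LENGTH = [8.0, 4.0, 6.0, 2.0]        # abstract work units per task
VM_SPEED    = [1.0, 2.0]                  # work units per hour for each VM
VM_PRICE    = [0.10, 0.35]                # dollars per hour for each VM

def cost_and_makespan(assignment):
    """assignment[i] = index of the VM that runs task i."""
    busy = [0.0] * len(VM_SPEED)
    cost = 0.0
    for task, vm in enumerate(assignment):
        runtime = TASK_LENGTH[task] / VM_SPEED[vm]
        busy[vm] += runtime
        cost += runtime * VM_PRICE[vm]
    return cost, max(busy)

def fitness(assignment, alpha, cost_ref, makespan_ref):
    """Lower is better; alpha=1 optimizes cost only, alpha=0 optimizes makespan only."""
    cost, makespan = cost_and_makespan(assignment)
    return alpha * (cost / cost_ref) + (1 - alpha) * (makespan / makespan_ref)

cheap_all_on_vm0 = [0, 0, 0, 0]
balanced         = [1, 0, 1, 0]
c_ref, m_ref = cost_and_makespan(cheap_all_on_vm0)   # reference point for normalization
for alpha in (0.0, 0.5, 1.0):
    print(alpha, fitness(cheap_all_on_vm0, alpha, c_ref, m_ref),
          fitness(balanced, alpha, c_ref, m_ref))
```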




Mohamed Abdel-Raheem, Ahmed Khalafallah
Doi : 10.7321/jscse.v3.n3.54
Page : 359 - 364
Show Summary
Abstract . Electimize is a new evolutionary algorithm (EA) that was introduced to overcome some limitations of existing evolutionary algorithms. Electimize simulates electrical current conduction by representing solution strings as wires in closed electric circuits. Unlike some EAs, Electimize can assess the quality of each value in the solution string independently; the assessment of values in a potential solution is based on Ohm's law and Kirchhoff's rule. One of the primary objectives of developing Electimize is to devise additional capabilities that enable the algorithm to solve a wide range of discrete optimization problems. Specifically, this paper aims to 1) assess the capabilities of the algorithm in solving a challenging class of discrete optimization problems, namely NP-complete optimization problems, and 2) compare the performance of Electimize to other EAs that have been used to solve this class of problems. For this purpose, an instance (Bayg29) of the traveling salesman problem (TSP) was selected for testing, application and comparison.
Keyword : optimization; Electimize; evolutionary algorithms; NP-complete; traveling salesman problem




Juliana Wahid, Naimah Mohd Hussin
Doi : 10.7321/jscse.v3.n3.55
Page : 365 - 371
Show Summary
Abstract . In this paper, the harmony search algorithm is applied to curriculum-based course timetabling. The implementation, specifically the improvisation process, consists of memory consideration, random consideration and pitch adjustment. In memory consideration, the value of the course number for the new solution is selected from the course numbers located in the same column of the harmony memory; this research uses the highest-occurring course number to be scheduled in the new harmony. The remaining courses that have not been scheduled by memory consideration go through random consideration, i.e., any available feasible location is selected for them in the new harmony solution. Each course scheduled through memory consideration is then examined as to whether it should be pitch adjusted, with a given probability, using one of eight procedures. However, the algorithm produced results that were not better than the previously known best solutions. With proper modification of the approach, the algorithm could perform better on curriculum-based course timetabling.
Keyword : Curriculum-based Course Timetabling; Harmony Search; Simulated Annealing
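For readers unfamiliar with the improvisation step named above, the sketch below shows the generic Harmony Search operators (memory consideration, pitch adjustment, random consideration) on a toy continuous problem; the timetabling-specific encoding and the eight pitch-adjustment procedures of the paper are not reproduced:

```python
# Hedged sketch of the generic Harmony Search improvisation step: memory consideration
# (HMCR), pitch adjustment (PAR) and random consideration, shown on a toy continuous
# minimization problem rather than on course timetabling.
import random

def improvise(harmony_memory, hmcr=0.9, par=0.3, bandwidth=0.1, bounds=(-5.0, 5.0)):
    dims = len(harmony_memory[0])
    new = []
    for j in range(dims):
        if random.random() < hmcr:                       # memory consideration
            value = random.choice(harmony_memory)[j]
            if random.random() < par:                    # pitch adjustment
                value += random.uniform(-bandwidth, bandwidth)
        else:                                            # random consideration
            value = random.uniform(*bounds)
        new.append(min(max(value, bounds[0]), bounds[1]))
    return new

def harmony_search(objective, dims=2, hms=10, iterations=500):
    random.seed(0)
    memory = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(hms)]
    for _ in range(iterations):
        candidate = improvise(memory)
        worst = max(memory, key=objective)
        if objective(candidate) < objective(worst):      # replace the worst harmony
            memory[memory.index(worst)] = candidate
    return min(memory, key=objective)

sphere = lambda x: sum(v * v for v in x)
print("best harmony:", harmony_search(sphere))
```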




Ahmed El-Kishky, Stephen Macke, Roger Wainwright
Doi : 10.7321/jscse.v3.n3.56
Page : 372 - 379
Show Summary
Abstract . We developed minimal perfect hash functions for a variety of data sets using the probabilistic process of simulated annealing (SA). The SA solution structure is a tree representing an annealed program (algorithm), similar to the structure used in genetic programming. When executed, the SA program produces multiple hash functions for the given data set. An initial hash function, called the distribution function, is generated; it attempts to place the keys uniformly into bins in preparation for a minimal perfect hash function determined later. For each trial, and for every data set of the various sizes tested, our algorithm annealed a minimal perfect hash function. The algorithm is applied to data sets of English-language strings and to a list of URLs. Bloat control is used to keep a small fixed depth limit on the solution, to simplify function complexity, and to ensure fast evaluation. Experimental results show that our algorithm generates hash functions which outperform both widely known non-minimal, non-perfect hashing schemes and other recent algorithms from the literature.
Keyword : Genetic Programming ; Simulated Annealing ; Minimal Perfect Hash Function ; Hashing




Harun Raşit Er, Nadia Erdoğan
Doi : 10.7321/jscse.v3.n3.57
Page : 380 - 386
Show Summary
Abstract . The Traveling Salesman Problem (TSP) is one of the most commonly studied problems in combinatorial optimization. Given a list of cities and the distances between them, the problem is to find the shortest possible tour that visits every city in the list exactly once and ends in the city where it starts. Although the Traveling Salesman Problem is NP-hard, many methods and solutions have been proposed for it. One of them is the Genetic Algorithm (GA), a simple but efficient heuristic method that can be used to solve the TSP. In this paper, we present a parallel genetic algorithm implementation on the MapReduce framework for solving the Traveling Salesman Problem. MapReduce is a framework that supports distributed computation on clusters of computers; we used the freely licensed Hadoop implementation as our MapReduce framework.
Keyword : Hadoop ; MapReduce ; TSP ; Genetic Algorithm
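A hedged sketch of the map/reduce decomposition of an island-model GA for TSP is shown below, simulated locally in plain Python rather than on Hadoop; the city coordinates and GA operators are illustrative assumptions:

```python
# Hedged sketch of the map/reduce decomposition of an island-model GA for TSP, simulated
# locally in plain Python rather than on Hadoop: each "mapper" evolves one sub-population
# and the "reducer" keeps the best tour. City coordinates are made up.
import random, math

CITIES = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 7), (9, 9), (4, 4)]

def tour_length(tour):
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def mapper(seed, generations=200, pop_size=30):
    """Evolve one island and emit its best tour (what a Hadoop map task would output)."""
    rng = random.Random(seed)
    pop = [rng.sample(range(len(CITIES)), len(CITIES)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=tour_length)
        child = pop[rng.randrange(5)][:]              # clone one of the best tours
        i, j = sorted(rng.sample(range(len(child)), 2))
        child[i:j] = reversed(child[i:j])             # 2-opt style mutation
        pop[-1] = child                               # replace the worst individual
    return min(pop, key=tour_length)

def reducer(best_tours):
    """Merge the island results and keep the overall shortest tour."""
    return min(best_tours, key=tour_length)

island_results = [mapper(seed) for seed in range(4)]  # stands in for parallel map tasks
best = reducer(island_results)
print("best tour:", best, "length:", round(tour_length(best), 2))
```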




Saad Khaleefah Al-Janabi
Doi : 10.7321/jscse.v3.n3.58
Page : 387 - 387
Show Summary
Abstract . Steganography is the use of a cover image to hide bits of information or images in a way that is imperceptible to an observer. We use the wavelet transform because it allows perfect reconstruction of the original image. We propose an algorithm that embeds the message bit stream in the LSBs of the wavelet coefficients of a color image, with a capacity of up to half the cover image. The algorithm uses a PN sequence as a key for embedding and extraction, so that the embedded message can be recovered without loss of image quality. We used MATLAB to implement the two algorithms, one for the embedding procedure and the other for the extraction procedure. The results show the high invisibility of the proposed model even when large messages are embedded.
Keyword : Implementation; Evaluation; Steganography; Wavelet
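The embedding idea can be sketched as follows: message bits are written into the least significant bits of integer coefficients at positions chosen by a key-seeded pseudo-noise (PN) permutation, and recovered with the same key. The wavelet transform and the MATLAB implementation are omitted, and the coefficient array below is a stand-in:

```python
# Hedged sketch: embed a message bit stream into the least significant bits of
# (already integer-quantized) coefficients at positions chosen by a seeded pseudo-noise
# (PN) sequence, and extract it with the same key. The wavelet transform itself and the
# MATLAB implementation are omitted here.
import numpy as np

def embed(coeffs, bits, key):
    rng = np.random.default_rng(key)
    positions = rng.permutation(coeffs.size)[: len(bits)]   # PN-selected positions
    stego = coeffs.copy().ravel()
    stego[positions] = (stego[positions] & ~1) | np.array(bits)
    return stego.reshape(coeffs.shape)

def extract(stego, n_bits, key):
    rng = np.random.default_rng(key)
    positions = rng.permutation(stego.size)[:n_bits]
    return (stego.ravel()[positions] & 1).tolist()

coeffs = np.random.default_rng(7).integers(-128, 128, size=(8, 8))  # stand-in coefficients
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(coeffs, message, key=42)
assert extract(stego, len(message), key=42) == message
print("message recovered:", extract(stego, len(message), key=42))
```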



In Publish
Ali Kattan, Rosni Abdullah
Show Summary
Abstract . In order to solve global optimization problems over continuous functions, researchers rely on meta-heuristic algorithms to overcome the computational drawbacks of existing numerical methods. One such recent meta-heuristic is the Harmony Search algorithm, which was inspired by the music improvisation process and has been applied successfully to such problems. The proper setting of the algorithm's parameters prior to starting the optimization process plays a crucial role in its overall performance and its ability to converge to a good solution. Several improvements have been suggested to automatically tune some of these optimization parameters and achieve better results than the original. This paper proposes a new dynamic and self-adaptive Harmony Search algorithm that utilizes two new quality measures to drive the optimization process dynamically. The key difference between the proposed algorithm and many recent improvements is that the values of the pitch-adjustment rate and the bandwidth parameters are determined independently of the current improvisation count and hence change dynamically rather than monotonically. Results show the superiority of the proposed method over several recent methods on common benchmark problems.
Keyword : computational intelligence; meta-heuristic




Malcolm McRoberts
Doi : 10.7321/jscse.v3.n3.60
Page : 395 - 402
Show Summary
Abstract . Cloud computing represents a major shift in information systems architecture, combining both new deployment models and new business models. Rapid provisioning, elastic scaling, and metered usage are essential characteristics of cloud services, and they require cloud resources with these same characteristics. When cloud services depend on commercial software, the licenses for that software become another resource to be managed by the cloud. This paper examines common licensing models, including open source, and how well they function in a cloud services model. It discusses creative, new, cloud-centric licensing models and how they allow providers to preserve and expand their revenue streams as their partners and customers transition to the cloud. The paper concludes by identifying the next steps to achieve standardized, “cloud-friendly” licensing models.
Keyword : software licensing ; cloud computing ; open source ; elastic scaling ; intellectual property ; license compliance ; SaaS ; Software as a Service




Sasan Adibi, Nilmini Wickramasinghe, Caroline Chan
Doi : 10.7321/jscse.v3.n3.61
Page : 403 - 410
Show Summary
Abstract . Cloud computing is a complex infrastructure revolving around (mobile and non-mobile) computing, database and storage capacity, and service delivery. This evolving concept aims to serve as the next-generation heterogeneous service-based model, with centralized and decentralized clients, servers, services, and data storage entities across multiple platforms. Mobile cloud computing (MCC), a subset of the cloud computing space, is where a number of the cloud entities are mobile-based. This paper focuses on the idea of MCC deployment in healthcare, defining cloud-computing-based mobile health (mHealth), or CCMH, and the relevant issues and challenges. The main contribution of this paper is a set of recommendations for the future expansion of both cloud computing and emerging mHealth technologies when they are merged together.
Keyword : Cloud computing ; Mobile Health (mHealth) ; security ; Quality of Service (QoS)




Kannan Gobinath, P. Sathish Kumar
Doi : 10.7321/jscse.v3.n3.62
Page : 411 - 415
Show Summary
Abstract . A software process is used to produce products according to plan while simultaneously improving the organization's capability to produce products. This research deals with an efficient process for developing software and is concerned with a process model that builds software using the cloud. The iterative model, which evolved to eliminate the drawbacks of the waterfall model, still suffers from the overhead of repetitive processes. To overcome the disadvantages of the traditional iterative method, we propose a novel iterative software development model that uses cloud computing. The main objective of this research is to revive the iterative model for improving the efficiency of the software development process in a cloud environment.
Keyword : software development; iterative; cloud computing; beta version




Jafar Shayan, Ahmad Azarnik, Suriayati Chuprat, Sasan Karamizadeh, Mojtaba Alizadeh
Doi : 10.7321/jscse.v3.n3.63
Page : 416 - 421
Show Summary
Abstract . Cloud computing is an emerging computing model in which IT and computing operations are delivered as services in a highly scalable and cost-effective manner. Recently, embracing this new model in business has become popular, and companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantage. Like other models, cloud computing brings advantages that attract business, but fostering the cloud also introduces risks that can cause major impacts if a business does not plan for their mitigation. This paper surveys the advantages of cloud computing and, in contrast, the risks associated with using it. We conclude that a well-defined risk management program focused on cloud computing is an essential part of gaining value from its benefits.
Keyword : Cloud computing ; Benefits ; Security ; Risks




Zeynab Moradpour Hafshejani, Seyedeh Leili Mirtaheri, Ehsan Mousavi Khaneghah, Mohsen Sharifi
Doi : 10.7321/jscse.v3.n3.64
Page : 422 - 429
Show Summary
Abstract . One of the most important issues in cluster computing systems is the efficient use of resources to increase system performance and hence decrease response times. These objectives can best be pursued by job schedulers, yet existing schedulers in cluster computing systems do not use resources efficiently. This paper proposes a new method for the efficient allocation of submitted jobs to resources. Jobs consist of threads that are arranged in a two-dimensional matrix; by scanning this matrix horizontally, threads of different jobs are allocated to different processors, preventing resources from becoming idle. In the previous scheduling method, resources are allocated to all the threads of one job at the same time, whereas in the proposed method resources are allocated to threads of various jobs. In the new method, if there are not enough available resources for an entire job, threads of different jobs can still run, so the waste of resources is kept to a minimum. Simulation results show that our proposed scheduling method yields quicker cluster response times than the FCFS and Backfilling scheduling methods.
Keyword : cluster computing system; scheduling; computing resource; thread; response time
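A hedged simulation of the allocation idea is sketched below, assuming the matrix rows are jobs and the columns are their threads; the horizontal scan interleaves threads of different jobs before handing them to processors. Job names and processor count are illustrative:

```python
# Hedged sketch of the allocation idea: threads of submitted jobs are arranged as rows of
# a two-dimensional matrix, and the matrix is scanned horizontally so that threads of
# *different* jobs can share the processors instead of leaving them idle.
from itertools import zip_longest

jobs = {                      # job name -> its threads (illustrative)
    "J1": ["J1-t0", "J1-t1", "J1-t2", "J1-t3"],
    "J2": ["J2-t0", "J2-t1"],
    "J3": ["J3-t0", "J3-t1", "J3-t2"],
}

def horizontal_scan(job_table):
    """Yield threads column by column across jobs (thread 0 of every job, then thread 1, ...)."""
    for column in zip_longest(*job_table.values()):
        for thread in column:
            if thread is not None:
                yield thread

def allocate(threads, n_processors=2):
    """Round-robin the scanned threads onto processors."""
    schedule = {p: [] for p in range(n_processors)}
    for i, thread in enumerate(threads):
        schedule[i % n_processors].append(thread)
    return schedule

print(allocate(horizontal_scan(jobs)))
```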




Yasser El Madani El Alami, El Habib Nfaoui, Omar El Beqqali
Doi : 10.7321/jscse.v3.n3.65
Page : 430 - 440
Show Summary
Abstract . This paper presents an integrated multi-agent architecture for indexing and retrieving video information. The focus of our work is to elaborate an extensible approach that gathers, a priori, most of the tools needed to alleviate the major intertwined problems raised throughout the video lifecycle (classification, indexing and retrieval). Effective and optimal video information retrieval requires a collaborative approach based on multimodal aspects: the distributed nature of the data sources, content adaptation, semantic annotation, personalized requests and active feedback must all be taken into account. These aspects constitute the backbone of a robust system and improve its performance in a smart way.
Keyword : Semantic web ; feedback ; multi-agent system ; ontology ; artificial intelligence ; multimedia retrieval




Farzaneh Kimiaie, Seyed Javad Seyed Mahdavi Chabok, Reza Askari Moghadam
Doi : 10.7321/jscse.v3.n3.66
Page : 441 - 445
Show Summary
Abstract . Routing is one of the important issues in wireless sensor networks. The essential function of a WSN is to monitor a phenomenon in a physical environment and report the sensed data to a sink. A very common assumption in the analysis and development of routing algorithms is the full cooperation of the participating nodes; however, reality may differ considerably. The existence of multiple domains belonging to different authorities, or even the selfishness of the nodes themselves, can result in performance that deviates significantly from what is expected. The proposed algorithm introduces a distributed, energy-aware routing scheme based on game theory. Simulation results show that, compared to GEAR, our proposed routing scheme is about 1.23 times more efficient in terms of network lifetime and 1.5 times more efficient in terms of data delivery, and that it performs better in total energy consumption and network lifetime.
Keyword : game theory; distributed algorithm; sensor network; aware routing; fairness




Hamid Reza Ranjbar, Mehdi Alimi MotlaghFard
Doi : 10.7321/jscse.v3.n3.67
Page : 446 - 449
Show Summary
Abstract . Today we collect data of different sizes and types, from different locations, at large scale at each site. Current server systems cannot collect and process such big data, so distributed computing systems have been proposed in the literature, and supercomputers are giving way to distributed computing such as cloud computing systems. Software engineering is a part of every software system; it describes the architecture and the connections between components, and the software architecture should be responsible for the whole life cycle of a system, including analyzing, designing, implementing and maintaining it. In this paper we review different software engineering methods for distributed systems.
Keyword : Software Engineering; Software Methods; Distributed Systems; Distributed Software Engineering




Ghassem Tofighi, Kaamran Raahemifar, Anastasios N. Venetsanopoulos
Doi : 10.7321/jscse.v3.n3.68
Page : 450 - 454
Show Summary
Abstract . Distributed Software Systems (DSS) are used today by many people in real-time operations and modern enterprise applications. Performance is one of the most important and essential measurable attributes of the quality of service of distributed software. Performance models can be employed at early stages of the software development cycle to characterize the quantitative behavior of software systems. In this research, performance models based on the fuzzy logic approach, the queuing network approach and the Petri net approach are reviewed briefly. One of the most common ways of analyzing the performance of distributed software systems is to translate UML diagrams into mathematical modeling languages for describing distributed systems, such as queuing networks or Petri nets; some of these approaches are also reviewed briefly. The attributes used for performance modeling in the literature are mostly machine-based, whereas end-user and client parameters for performance evaluation are not covered extensively. Future research could therefore develop hybrid models that capture users' decision variables and make system performance evaluation more user-driven.
Keyword : Distributed Software Systems ; Performance Evaluation ; Fuzzy logic ; Queuing Networks ; Petri Nets




Neven Dragojlovic
Doi : 10.7321/jscse.v3.n3.69
Page : 455 - 462
Show Summary
Abstract . At some point or another, everyone has surely had the experience of staring at mottled surroundings without any thought in mind and finding that one's visual system starts detecting patterns (like faces or animals or objects) in the mottled background. Once those patterns have been detected, moreover, it becomes difficult to observe the mottled surroundings and not perceive the patterns without engaging in active repression. This spontaneous organization of input into recognizable molds starts with just a few patches of visual input that activate a schema already present in the memory structure. The visual system then queries the surrounding features to see if they fit into the activated schema, thus constructing a richer and more compelling pattern. Psychology would interpret this process as a Rorschach test that shows what our unconscious mind is primed to perceive at any moment. This type of experience suggests that, in order for intelligence to emerge, controlled chaos is necessary. Self-feedback and multiple calculation points, each of which follows simple rules, permit the emergence of chaotic attractor networks, which bring stability to a system and which, if networked together, can create intelligence. This paper describes a computational system that is capable of such a feat, depending only on a mechanical process that does not require thought or consciousness. The system only requires local processing units, their associated memory, and simple software that interprets its immediate environment (that is, the activity of surrounding processing units). In order to be functional, such a system cannot limit itself to the simplest possible case (such as letters or simple geometric shapes), but must be able to process all types of input and form active networks out of it. The system described in the following article uses a fully parallel pattern-type language that can be used in multiple, easily joined modules, where each module can be used to process a specific type of information. As it uses simple programs in each computing element, the information is easily integrated and debugged. Complex statistical models, which form the foundation of most current search and recognition algorithms, are not necessary in this system, as it automatically uses simple search and recognition strategies at each computing component. (Based on U.S. Patent 7,426,500 and pending patent US13/117,176)
Keyword : distributed computing; Cellular parallel architecture; distributed memory; hexagonal framework; networks; chaotic attractors in complex systems; multi-level up and down information flow; swarm intelligence




Mohammad Ali Torkamani, Hamid Bagheri, Abbas Bahrami, Ali Bayat, Seyyed Hossein Ahmadi , Mohammad Reza Khodabakhshi
Doi : 10.7321/jscse.v3.n3.70
Page : 463 - 466
Show Summary
Abstract . Ultra-large-scale (ULS) systems have characteristics that derive from their scale. These characteristics make it impossible to rely on current approaches to software development, and new approaches for development, deployment, control and management are needed; today's centralized development approach will not suffice to tackle ULS challenges. One of the most important phases in developing information systems is configuration management, which in ULS systems is much more complicated than in current practice. The tools developed for configuration management so far operate centrally, while different developers participate in developing ULS systems independently. Owing to the dependencies between the components of such systems and their characteristics, change management needs new approaches: a change in one component may cause side effects in other related components, and a change in a component used by many developers may influence their systems without their awareness. To tackle such problems, we propose and analyze a new approach to configuration management. This method has been implemented in the R&D department of the Iranian Telecommunication Manufacturing Company (ITMC).
Keyword : Ultra Large Scale system(ULS) ; Configuration Management ; Component




Masaya Yoshikawa, Hikaru Goto
Doi : 10.7321/jscse.v3.n3.71
Page : 467 - 473
Show Summary
Abstract . The advanced encryption standard (AES) is the most popular encryption standard in the world. Although the AES algorithm is theoretically safe, it has been recently reported that confidential information could be illegally revealed when the AES algorithm is used in electronic circuits. In particular, the menace posed by fault analysis attacks has become extremely serious. This study develops a software simulator to evaluate the vulnerability of a cryptographic circuit against fault analysis attacks in which multiple analytical methods are combined. Simulation results proved the validity of the proposed simulator.
Keyword : Security verification ; Software simulator ; Fault analysis attacks ; Cryptogram




Carol Niznik
Doi : 10.7321/jscse.v3.n3.72
Page : 474 - 480
Show Summary
Abstract . The Massive Ordnance Penetrator (MOP) has been developed to destroy deeply buried nuclear components by controlled release from a B2 or B52 airplane. This type of release must be cockpit-software controlled by the Tactical Optimal Strategy Game (TOSG) Protocol to optimally determine the war game aspects of dueling with other countries' MOP releases, and the depth at which the MOP explosions can occur for maximal safety and risk concerns. The TOSG Protocol characteristics of games of strategy, games of optimal strategy and tactical games are defined initially by the game of strategy as a certain series of events, each of which must have a finite number of distinct results. The outcome of a game of strategy, in some cases, depends on chance; all other events depend on the free decisions of the players. A game has a solution if there exist two strategies which become optimal strategies when each mathematically attains the value of the game. The TOSG Protocol war game tactical problem for a class of games can be mathematically modeled as combat between two airplanes, each carrying a MOP, with the specification of the accuracy of the firing machinery and the total amount of ammunition that each plane carries. This silent duel occurs because each MOP bomber is unable to determine the number of times its opponent has missed. The TOSG Protocol realizes a game theory solution of the tactical optimal strategy game utilizing the theory of games of timing, games of pursuit, games of time lag, games of sequence, games of maneuvering, games of search, games of positioning, and games of aiming and evasion. The geometric software structure for the TOSG Protocol is a game tree identifying the possible depths of explosions. This finite game tree with a distinguished vertex is embedded in an oriented plane to facilitate the definition of a strategy as a geometric model of the character of a game for the successive presentation of alternatives. The tactical optimal strategy determination by the TOSG Protocol Cockpit Software is mandatory for the execution of the correct and maximally effective MOP release by the MOP bomber.
Keyword : Tactical Game; Zero Sum Two Person Game; TOSG Protocol Cockpit Software; Aiming and Evasion Game Theory; Game Tree Geometric Structure; risk constraint; optimization theory games of timing; invariant imbedding; optimal strategy; MOP Bomber; Performance Evaluation



In Publish
Shi-in Chang, John Copeland
Show Summary
Abstract . This paper demonstrates a novel type of mobile NFC application based on the Android platform. Owing to the advantages of NFC features such as simplicity, ease of use, and low cost, a large number of applications using NFC technology are spreading widely and providing convenience in everyday life. Among the prevalent applications are payments, ticketing, and use as a proximity card. This emerging technology is being integrated into mobile devices and is propelling a new paradigm, mobiquity, meaning that every device around us becomes part of an interactive world with both mobility and ubiquity. There are three common modes in NFC, namely card emulation, reading and writing, and peer-to-peer, and NFC is designed to change among these modes according to the type of application. Nevertheless, the peer-to-peer mode has been less investigated by both industry and academia. Moreover, the security perspective has not received sufficient research attention as part of application development, due to excessive reliance on the physical characteristics of short-range communication. In this paper, we propose a new type of peer-to-peer NFC application for sharing sensitive data (e.g., access-authority encryption keys) with various levels of restriction with another user in order to access privilege-protected resources. The proposed application allows the resource owner to use a previously authenticated key securely stored in any of the owner's mobile devices. It also allows the owner to share the key via NFC with another person who wants to borrow the resource (e.g., a laptop, tablet, or even a car), temporarily and with limited authority.
Keyword : Near Field Communications; Security; Authentication; Android Application




Umer Asgher
Doi : 10.7321/jscse.v3.n3.74
Page : 487 - 491
Show Summary
Abstract . The economics of Internet crime has recently developed into a field concerned with controlling black money. This economic approach not only provides an estimation technique for analyzing Internet crimes but also gives analysts insight into system dependability and divergence. This paper highlights the subject of online crime, which has since formed its own industry. It all started with amateur hackers who cracked websites and wrote malicious software for fun or to achieve limited objectives, and it has evolved into professional hacking. In the past, electronic fraud was the main objective, but the focus has now shifted to electronic hacking. This study examines the issue through an economic analysis of web forums that deal in malware and private information. The findings of this survey research provide considerable in-depth insight into the workings of the malware economy that revolves around computer intrusion and compromise. The paper may particularly benefit computer security officials and law enforcement agencies, and in general anyone interested in better understanding cybercrime from the offender's standpoint.
Keyword : Malware ; cyber crime ; crime economics ; Information security ; online crime ; IT industry ; electronic fraud ; hackers




Maryam Saeed, Hadi Shahriar Shahhoseini, Ali Mackvandi, Mohammad Reza Rezaeinezhad, Mansour Naddafiun
Doi : 10.7321/jscse.v3.n3.75
Page : 492 - 501
Show Summary
Abstract . Three-party Password Authenticated Key Exchange (3PAKE) protocols play a key role in providing security goals in communications. They enable two entities to share a common session key in an authentic manner based on a low-entropy, human-memorable password. In 2010, Lee and Hwang proposed the S-IA-3PAKE and S-EA-3PAKE protocols based on the SPAKE protocol developed by Abdalla and Pointcheval. In 2011, Chang et al. presented an efficient three-party Password Authenticated Key Exchange protocol and its parallel version based on the LHL-3PAKE protocol proposed by Lee et al. In this paper, it is shown that both supposedly provably secure S-IA-3PAKE and S-EA-3PAKE protocols are vulnerable to serious threats such as Unknown Key Share (UKS) and password compromise impersonation attacks. It is also shown that the provably secure protocol of Chang et al. and its parallel version suffer from password compromise impersonation and ephemeral key compromise impersonation attacks. Indeed, our results highlight the need for more attention and precision in defining provable security models and constructing proofs, because there are still considerable gaps between what can be proven in formal security models and what is actually secure in practice.
Keyword : Password Authenticated Key Exchange; Cryptanalysis; Unknown Key Share attack (UKS); ephemeral key compromise impersonation attack; password compromise impersonation attack.




Iehab Alrassan
Doi : 10.7321/jscse.v3.n3.76
Page : 502 - 506
Show Summary
Abstract . Nowadays, most enterprises use Web Services as a new wave for exchanging information in their e-business integration, and security is a major concern when Web Services are adopted. Web Services are based on SOAP (Simple Object Access Protocol) messages for exchanging information. In e-business, this information may be sensitive, and there is a high possibility that a SOAP message is intercepted and modified by eavesdroppers. In this research, we discuss the significant impact of adopting Web Services in e-business, where Web Services support application-to-application interactions; however, security is still the biggest challenge facing Web Services. We also highlight the Web Services security standards that may be used to ensure their security. Any proposed security model must consider the security goals of integrity, authorization, authentication, confidentiality and non-repudiation. In this research, we focus on the SOAP message and how to ensure its security, since the SOAP message is the transmission unit in Web Services. We propose a security model to enhance the security of e-business, based on XML Signature and XML Encryption to sign and encrypt SOAP messages, with RSA as the encryption algorithm. We expect our proposed model to achieve good security with acceptable performance.
Keyword : Web Services ; SOAP ; e-business ; XML encryption ; XML signature ; SAML ; XKMS




Ta Minh Thanh, Munetoshi Iwakiri
Doi : 10.7321/jscse.v3.n3.77
Page : 507 - 513
Show Summary
Abstract . In general, a DRM (Digital Rights Management) system is responsible for the safe distribution of digital content; however, such a system is built from individual function modules for cryptography, watermarking and so on. In this typical system flow, there is a problem that all original digital content is temporarily disclosed in perfect condition through the decryption process. In this paper, we propose a method combining differential codes and fragile fingerprinting (DCFF), based on incomplete cryptography, that holds promise for a better compromise between practicality and security in emerging digital rights management applications. Experimental results with simulation confirm that DCFF keeps compatibility with the standard JPEG codec and reveal that the proposed method is suitable for DRM in network distribution systems.
Keyword : DRM (Digital Rights Management) ; Incomplete Cryptography ; Differential Codes ; Fragile Fingerprinting




Xiaohua Feng, Jerry Louise
Doi : 10.7321/jscse.v3.n3.78
Page : 514 - 516
Show Summary
Abstract . Session hacking is one of the important issues in computer security. Here, a new framework is proposed to detect this kind of MITM attack in computer networks.
Keyword : Wireless Computing; MITM Attack; Session Hacking; Cloud Breaches; Intrusion Detection and Computer Security Tools




Carol Niznik
Doi : 10.7321/jscse.v3.n3.79
Page : 517 - 523
Show Summary
Abstract . The Universal Interface Software (UIS) Protocol was a Theater Missile Defense Gateway Protocol which linked the Strategic Defense Initiative (SDI) Architecture Killer Satellite Software Protocol to the National Test Bed Simulation Software Protocol to enable neural network shock loop operation when ICBMs were approaching the SDI Shield. Gateway software is required for Homeland Defense and Security systems to communicate the sensor information from hardware and software boxes at airports, government buildings and other locations to the Global Information Grid (GIG). Therefore, a Homeland Defense and Security UIS (HDSUIS) Protocol is achieved by converting UIS to HDSUIS through thresholds stabilization and GIG and terrorist sensor enhancements, alterations to the Homeland Defense and Security Lagrangian equation and GIG simulation facility timing chart, and two Catastrophe Theory Protocol attachments to the UIS geometric software structure inner cube. This UIS Protocol conversion to the HDSUIS Protocol will track and provide a congestion-controlled (i.e., deadlock- and livelock-preventing) communication of (1) shoe bombers and copycat shoe bombers, (2) deeply buried and embedded boxes with explosives, (3) damage to laser equipment, (4) shoulder-fired missile armament, and (5) surface-to-air missiles from their sensor equipment to the Global Information Grid with Theater Missile Defense characteristics. The Homeland Defense and Security GNNO (Geometric Neural Network Overlay) Protocol will be derived as a conversion of the UIS GNNO Protocol.
Keyword : Global Information Grid(GIG); SDI Shield; Catastrophe Theory Protocol; shoe bomber; GIG Simulation Facility; sensor equipment; Strategic Defense Initiative(SDI); Lagrangian equation; UIS Protocol; HDSUIS Protocol; Homeland Defense; Theater Missile Defense; GNNO Protocol



In Publish
Alain Tchana, Suzy Temate, Laurent Broto, Daniel Hagimont
Show Summary
Abstract . Autonomic administration technology has proved its efficiency for the administration of complex computing systems. However, experiments conducted with several Autonomic Administration Systems (AAS) revealed the need to adapt the AAS according to the administered system or the considered administration facet. Consequently, users usually have to adapt or even re-implement the AAS according to their specific needs, but these tasks require a level of expertise in the AAS implementation that users do not necessarily have. In this paper we propose a service-oriented component approach to build a generic, flexible, and useful AAS. We present an implementation of this approach, its design principles, and the prototype called TUNeEngine. We illustrate the flexibility of this prototype through the administration of a complex computing system, namely a virtualized cloud platform.
Keyword : Autonomic Administration ; Adaptable System ; Components Model




Mohammad Zarour, Abdulrahman Alarifi, Alain Abran, Jean-Marc Desharnais
Doi : 10.7321/jscse.v3.n3.81
Page : 536 - 543
Show Summary
Abstract . Software Process Assessment (SPA) is an effective method used to understand an organization's software process quality, and assessment methods are tools used to identify possible software process improvement opportunities. This paper studies the design process of SPA methods from an engineering viewpoint and uses Vincenti's classification of engineering design knowledge as an analytical tool. The analysis yields the pieces of knowledge that SPA method designers bring with them before starting the design process of an SPA method. These pieces of knowledge provide useful guidelines, mainly for less experienced designers, for starting SPA method design. For already developed SPA methods, they can be used as evaluation criteria that disclose the strengths and weaknesses of those methods.
Keyword : Software, Process, Assessment, Evaluation ; Engineering, Design




Raghu Hudli, Shrinidhi Hudli
Doi : 10.7321/jscse.v3.n3.82
Page : 544 - 548
Show Summary
Abstract . Most modern programming languages support multiple programming paradigms. For example, C++ supports procedural and object-oriented programming. Java supports mostly object-oriented programming, though one could stretch its features to write procedural programs. Languages like Ruby, Python, Groovy, and Scala, among others, support functional, procedural, and object-oriented programming. Our interest is in examining the features pertaining to functional programming and object-oriented programming; specifically, in the correspondence between closures in the functional paradigm and objects. In this paper we show that closures are subsumed by objects, and we demonstrate the subsumption using structural analysis.
Keyword : Closure ; object-oriented programming ; functional programming
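The closure-object correspondence under study can be illustrated informally in Python: a closure capturing a variable behaves like an object with one field and one method. This analogue is only illustrative and does not reproduce the paper's structural analysis:

```python
# Tiny illustration of the correspondence: a closure that captures state behaves like an
# object with one field and one method (an informal Python analogue, not the paper's
# structural analysis).
def make_counter_closure(start=0):
    count = start
    def increment():
        nonlocal count
        count += 1
        return count
    return increment            # the returned function closes over `count`

class CounterObject:
    def __init__(self, start=0):
        self._count = start     # the captured variable becomes a field
    def increment(self):        # the closure body becomes a method
        self._count += 1
        return self._count

closure_counter = make_counter_closure()
object_counter = CounterObject()
assert [closure_counter(), closure_counter()] == [object_counter.increment(),
                                                  object_counter.increment()]
print("closure and object behave alike:", closure_counter(), object_counter.increment())
```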



In Publish
Leandro Cupertino, Georges Da Costa, Amal Sayah, Jean-Marc Pierson
Show Summary
Abstract . The popularity of hand-held and portable devices puts energy-aware computing in the spotlight. The need for long battery life goes beyond the hardware manufacturer, affecting operating system policies and software development. Power modeling of applications has been studied in recent years and can be used to estimate their total energy. In order to help programmers implement energy-efficient algorithms, this paper introduces an application energy profiler, namely Valgreen, which exploits the battery's information to generate an architecture-independent power model through a calibration process.
Keyword : application ; energy ; profiler ; power ; model
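A hedged sketch of such a calibration is shown below (not Valgreen's actual model): a linear power model is fitted by least squares from utilization samples and battery-reported power, then used to estimate an application's energy; all numbers are synthetic:

```python
# Hedged sketch of calibrating a linear power model from measurements, in the spirit of
# an application energy profiler (not Valgreen's actual model): fit
# P = p_idle + c_cpu*u_cpu + c_mem*u_mem by least squares from utilization samples and
# battery-reported power readings.
import numpy as np

# Synthetic calibration samples: CPU utilization, memory activity, measured power (W).
samples = np.array([
    [0.05, 0.10, 1.9],
    [0.30, 0.20, 3.1],
    [0.55, 0.25, 4.2],
    [0.80, 0.40, 5.6],
    [0.95, 0.60, 6.5],
])
utilization, measured_power = samples[:, :2], samples[:, 2]
design = np.column_stack([np.ones(len(samples)), utilization])   # [1, u_cpu, u_mem]
coeffs, *_ = np.linalg.lstsq(design, measured_power, rcond=None)
p_idle, c_cpu, c_mem = coeffs
print(f"P ~= {p_idle:.2f} + {c_cpu:.2f}*u_cpu + {c_mem:.2f}*u_mem  (watts)")

# Estimate an application's energy from its utilization trace sampled every second.
trace = np.array([[0.6, 0.3], [0.7, 0.3], [0.2, 0.1]])
per_second_power = trace @ np.array([c_cpu, c_mem]) + p_idle
print("estimated energy over 3 s:", round(float(per_second_power.sum()), 2), "J")
```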




Rupinder Pal Kaur, Vishal Goyal, Gagandeep Kaur
Doi : 10.7321/jscse.v3.n3.84
Page : 557 - 563
Show Summary
Abstract . A systematic and quantitative engineering-based approach is followed in this research to develop a web quality model based on well-known international standards and guidelines. The quality model can be used to measure external quality and to evaluate and compare the quality of websites developed in Punjabi and Hindi. The quality model consists of two parts: the first includes attributes that need visual observation, and the other part can be automated.
Keyword : Web Quality Model, External Quality, Quantitative




Vijey Thayananthan, Ahmed Alzahrani and Iyad Katib
Doi : 10.7321/jscse.v3.n3.85
Page : 564 - 569
Show Summary
Abstract . Applications of multiple-input multiple-output (MIMO) systems with the Stiefel manifold are growing in next-generation communications and their systems engineering developments. These applications need to be optimized efficiently, with low complexity and cost. In this research, optimization problems in MIMO systems using the Stiefel manifold are considered. As far as general manifolds are concerned, optimization methods are applied in a different environment that deals with basic signal processing. Here, a specific optimization technique applicable to current MIMO systems with the Stiefel manifold is given as a methodology. The current MIMO system is a new technology, and it can be applied in next-generation technology because it uses manifolds in the feedback of MIMO systems. Even though the overall performance of the new and current technology is better than conventional techniques, the final performance of MIMO applications depends on efficient optimization.
Keyword : Stiefel manifold; MIMO system; optimization; Feedback




J. Abdul Jaleel, Sibi Salim, Aswin R. B.
Doi : 10.7321/jscse.v3.n3.86
Page : 570 - 575
Show Summary
Abstract . Skin cancer is a deadly condition occurring in the skin. It is a gradually evolving condition which starts in the melanocytes of the skin, so it is also called melanoma. It first occurs in a small region and later spreads to other parts of the body through the lymphatic system. If skin cancer is detected at an early stage, it can be cured, so an early detection system is indispensable in skin cancer diagnosis. Melanoma can be benign or malignant; malignant melanoma is the dangerous condition, while benign is not, and at initial stages both resemble each other in appearance. Classifying benign and malignant melanoma is therefore difficult, and only an expert dermatologist can make the correct classification. Conventional diagnostic procedures include a preliminary diagnosis by direct observation by doctors and the biopsy method for confirmation; the biopsy method is painful and time-consuming. So an efficient classification system using Artificial Intelligence (AI) and Image Processing Techniques (IPT) is proposed. Dermoscopic images are given as input to the system. The images contain noise and hairs; the noise is removed using image processing techniques. After that, the region of interest, or suspicious region of skin, is separated from normal skin using segmentation; the segmentation method used here is color threshold segmentation. Two feature extraction techniques are used: the Gray Level Co-occurrence Matrix (GLCM) method and Red, Green, Blue (RGB) color features. These features are given as input to an Artificial Neural Network classifier, which classifies the given data set into cancerous and non-cancerous.
Keyword : Skin Cancer, Segmentation, Gray Level Co-occurrence Matrix
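A compact, hedged illustration of the feature/classifier pairing is given below: a hand-rolled GLCM contrast feature plus RGB channel means feed a neural-network classifier. The patches and labels are synthetic, and the paper's preprocessing (hair removal, color-threshold segmentation) is omitted:

```python
# Hedged sketch of the feature/classifier pairing: a small hand-rolled GLCM contrast
# feature plus RGB channel means, fed to a neural-network classifier. Patches and labels
# are synthetic; segmentation, hair removal, etc. are omitted.
import numpy as np
from sklearn.neural_network import MLPClassifier

def glcm_contrast(gray, levels=8):
    """Contrast of the co-occurrence matrix for horizontally adjacent pixel pairs."""
    q = (gray.astype(float) / 256 * levels).astype(int)       # quantize to `levels` bins
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    rows, cols = np.indices(glcm.shape)
    return float(np.sum(glcm * (rows - cols) ** 2))

def features(rgb_patch):
    gray = rgb_patch.mean(axis=2)
    return [glcm_contrast(gray)] + list(rgb_patch.reshape(-1, 3).mean(axis=0))

rng = np.random.default_rng(0)
smooth = [rng.integers(100, 140, size=(16, 16, 3)) for _ in range(20)]   # "benign-like"
textured = [rng.integers(0, 256, size=(16, 16, 3)) for _ in range(20)]   # "malignant-like"
X = np.array([features(p) for p in smooth + textured])
y = np.array([0] * 20 + [1] * 20)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```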




Toshiyuki MAEDA, Masanori FUJII
Doi : 10.7321/jscse.v3.n3.87
Page : 576 - 580
Show Summary
Abstract . We present a skill analysis of time-series image data using data mining methods, focused on table tennis. We do not use a body model, but only high-speed movies, from which time-series data are obtained and analyzed using data mining methods such as C4.5. We identify internal models for technical skills by evaluating skillfulness in the table tennis forehand stroke, and we discuss mono-functional and meta-functional skills for improving skills.
Keyword : Time Series Data, Sport Skill, Data Mining, Image




Ren Gang, Xuchen Yang, Zhe Wen, Dave Headlam, Mark F. Bocko
Doi : 10.7321/jscse.v3.n3.88
Page : 581 - 587
Show Summary
Abstract . Music performance conveys profound music understanding and artistic expression in musical sound. These performance-related dimensions can be extracted from audio and encoded as musical expressive features based on a high-dimensional sequential data structure. In this paper we propose a structure-learning method, using probabilistic graphical models, that obtains a hierarchical dependency graph from musical expressive features. The proposed hierarchical dependency graph serves as an intuitive visualization interface for the internal dependency patterns within feature data series and helps music scholars identify in-depth conceptual structures.
Keyword : knowledge engineering ; feature analysis ; probabilistic graphical model ; music performance analysis




Ilnaz Jamali, Sattar Hashemi
Doi : 10.7321/jscse.v3.n3.89
Page : 588 - 592
Show Summary
Abstract . This paper introduces a game-theoretic framework for feature selection in imbalanced data sets. In this method, called FSSH (Feature Selection based on Shapley value), coalitions are first constructed and the marginal importance of each feature in its coalition is computed. The weighted mean of each feature's marginal values is then taken as its Shapley value. Finally, features are ranked according to their Shapley values, and highly ranked features are selected. Experimental results and comparison with several existing feature selection methods show the advantages of the presented approach across the data sets adopted in this study.
Keyword : feature selection, imbalance data sets, game theory
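A hedged sketch of Shapley-style feature ranking is shown below: each feature's value is approximated by sampling random orderings and averaging its marginal contribution to validation accuracy. This generic permutation approximation stands in for, and does not reproduce, the paper's FSSH coalition construction:

```python
# Hedged sketch of Shapley-style feature ranking: approximate each feature's Shapley
# value by sampling random feature orderings and averaging its marginal contribution to
# validation accuracy. This is a generic approximation, not the paper's FSSH coalitions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)          # features 2 and 3 are noise
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

def accuracy_with(feature_subset):
    if not feature_subset:
        return max(np.mean(y_va), 1 - np.mean(y_va))   # majority-class baseline
    cols = sorted(feature_subset)
    clf = LogisticRegression(max_iter=1000).fit(X_tr[:, cols], y_tr)
    return clf.score(X_va[:, cols], y_va)

def shapley_estimates(n_features, n_permutations=30):
    values = np.zeros(n_features)
    for _ in range(n_permutations):
        order = rng.permutation(n_features)
        chosen, prev = set(), accuracy_with(set())
        for f in order:
            chosen.add(f)
            current = accuracy_with(chosen)
            values[f] += current - prev                # marginal contribution of f
            prev = current
    return values / n_permutations

scores = shapley_estimates(X.shape[1])
print("feature ranking (best first):", np.argsort(scores)[::-1], scores.round(3))
```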




Umer Asgher
Doi : 10.7321/jscse.v3.n3.90
Page : 593 - 599
Show Summary
Abstract . The issues of “research required in the field of bio-medical engineering” and “externally-powered prostheses” are attracting the attention of regulatory bodies and the public in various parts of the globe. Today, 90% of prostheses in use are conventional body-powered, cable-controlled ones, which are very uncomfortable for amputees because fairly large forces and excursions have to be generated by the amputee; additionally, the amount of rotation is limited. Alternatively, prosthetic limbs driven by electric motors might deliver added functionality and improved control, accompanied by better cosmesis; however, they can be bulky and costly. Presently existing proposals usually require less bodily effort but need more upkeep than cable-operated prosthetic limbs. For these reasons, a proposal for the mechanization of body-powered prostheses, with ease of maintenance and cost in mind, is presented in this paper. The prosthetic upper limb being automated is for transhumeral amputees, i.e., those amputated above the elbow. The study consists of two main parts: one is the lifting mechanism of the limb and the other is the gripping mechanism of the hand using switch controls, which is the most cost-effective and optimized solution compared with using complex and expensive myoelectric control signals.
Keyword : Prosthetic upper limb ; Body-powered cable-controlled ; Externally-powered prostheses ; Transhumeral ; Myoelectric control signals ; Switch controls




Nadia Otmani-Benmehidi, Meriem Arar, Imene Chine
Doi : 10.7321/jscse.v3.n3.91
Page : 600 - 607
Show Summary
Abstract . Numerical modeling using computers is known to present several advantages compared to experimental testing. The high cost and the amount of time required to prepare and perform a test were among the main problems when the first tools for modeling structures in fire were developed. The discipline of structures-in-fire modeling is still the subject of important research efforts around the world, and those efforts have led to the development of much software. In this paper, our task is oriented to the study of fire behavior and its impact on reinforced concrete walls of different spans and sections belonging to a residential building braced by a system composed of frames and shear walls. For the design and the mechanical loading (compression forces and moments) exerted on the walls in question, we rely on the results of a study conducted at ambient temperature. We use the SAFIR software, which follows the Eurocode rules, to carry out this study. It was found that loading, heating, and sizing play a major role in the failure state of the walls. Our results justify the use of reinforced concrete walls acting as firewalls, whose role is to limit the spread of fire from one structure to a nearby one, since we obtain fire resistance of more than 10 hours depending on the loading considered.
Keyword : Fire, Resistance, Flame, Behavior, wall




Samaneh Jalaliyan, Mahboubeh Moghaddas, Mehdi Yousefi Tabari, Hadi Ebrahimi
Doi : 10.7321/jscse.v3.n3.92
Page : 608 - 612
Show Summary
Abstract . The main objective of this study is to investigate the chaotic behavior of the fractional-order Lü system and its controllability. It is shown that this problem can lead to the synchronization of master and slave systems with different fractional orders. The proposed method, based on active sliding mode control (ASMC), is developed to synchronize two chaotic systems with partially different attractors. The numerical simulation results verify the effectiveness of the proposed controller even for the chaotic synchronization task.
Keyword : fractional calculus; fractional order active sliding mode controller; synchronization ; LU-LU



In Publish
Seyed Gholamreza Eslami, Ali Peiravi, behzad molavi
Show Summary
Abstract . Smart card technology has led to vast developments in many aspects of modern life. User acceptance of fuel rationing smart cards, based on an adoption model, involves many factors, such as satisfaction, security, external variables and attitude toward use. In this study, user acceptance and security factors for fuel rationing smart cards in Iran are evaluated on the basis of an adoption model, by distributing a questionnaire among UTM (University Technology Malaysia) Iranian students and MMU (Multimedia University) Iranian students, or by e-mail for those who were not available in person.
Keyword : smart card; adoption model; fuel smart card; user acceptance; security; satisfaction; external variables; attitude toward using; adoption; technology




Asma Sbeih, Feras Yaghmor, Taha Abed Rabu, Mohammad Issa
Doi : 10.7321/jscse.v3.n3.94
Page : 618 - 624
Show Summary
Abstract . In recent years the number of smartphone users has increased rapidly, and the onset of diabetes in daily life has grown dramatically, making it almost as common as the flu. For these reasons, we want to build a smart mobile application that helps diabetics manage and control their diabetes and make a self-evaluation by entering their blood glucose, blood pressure, height, weight, diseases and other variables. There is no need for patients to spend 3 or 4 months guessing what their A1C results will be [2]. The application predicts HbA1c from daily blood sugar levels checked with a blood glucose monitor (an illustrative estimate is sketched after this entry), removing the anxiety of the unknown and helping users better control the ups and downs in blood sugar. It also reminds users when to take blood sugar tests and medications and when to exercise and follow their diet. It stores all input data, generates graphs of the outputs, and shares them with doctors by email or simply prints them. The application is connected to online feeds and brochures, so the latest diabetes news, articles and advice are delivered right to the patient's mobile device.
Keyword : Mobile healthcare ; Diabetics ; Diabetes Self-Management ; Mobile Phones ; mhealth
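The HbA1c prediction described above can be roughly illustrated by inverting the widely published ADAG linear relation between estimated average glucose and A1C; the paper does not state that it uses this formula, so treat the sketch below as an assumption rather than the application's actual model.

    def estimate_hba1c(glucose_readings_mg_dl):
        """Rough HbA1c (%) estimate from self-monitored glucose readings (mg/dL).

        Inverts the published ADAG relation eAG = 28.7 * A1C - 46.7; this is an
        illustrative assumption, not the app's actual predictive model.
        """
        if not glucose_readings_mg_dl:
            raise ValueError("need at least one reading")
        avg = sum(glucose_readings_mg_dl) / len(glucose_readings_mg_dl)
        return (avg + 46.7) / 28.7

    # Readings averaging about 154 mg/dL correspond to an A1C near 7%.
    print(round(estimate_hba1c([140, 160, 150, 165, 155]), 1))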




Aleksandra Karimaa
Doi : 10.7321/jscse.v3.n3.95
Page : 625 - 631
Show Summary
Abstract . High-resolution cameras with powerful chips, efficient compression algorithms, and heterogeneous access to capable infrastructure favor the creation of new, innovative video solutions for communication, collaboration and video monitoring. The efficiency of such new solutions is obviously important, as it contributes to implementation quality and helps to estimate the cost and direction of product development. Despite this, efficiency evaluation is typically limited to a few metrics, such as end-to-end video latency, bandwidth usage and the efficiency of compression algorithms in utilizing expensive data storage space. This article presents a functional approach to measuring the efficiency of systems with video sources. We propose sets of applicable metrics identifying the efficiency of the system's functional areas, and we use these sets to outline an evaluation tool in the form of a scoring system. Finally, we demonstrate the tool by evaluating efficiency aspects of an example video system designed in a cloud environment.
Keyword : video; scoring; functional design; efficiency




John Ronczka
Doi : 10.7321/jscse.v3.n3.96
Page : 632 - 636
Show Summary
Abstract . The Wisdom Open System Semantic Identification (WOSSI) modelling and simulation tool may be used for Soft Computing and Software Engineering (SCSE) pattern recognition. As a way forward, a ‘Human sustainment system’ (HSS) has been put forward as a conceptualisation in which WOSSI entities might be aware of, and adaptable to, the user's needs and wants within the context of informatics medicine based on SCSE-driven intelligent decision technologies.
Keyword : WOSSI; Coalescence Theory; entanglement; biorheology; logic gates




Sivasankari N, Malleswaran M
Doi : 10.7321/jscse.v3.n3.97
Page : 637 - 643
Show Summary
Abstract . The integration of the Global Positioning System (GPS) and the Inertial Navigation System (INS) has been used extensively in aircraft applications such as autopilots, to provide better navigation even in the absence of GPS. Although Kalman Filter (KF) based GPS/INS integration provides a robust navigation solution, it requires prior knowledge of the INS error model, which increases the complexity of the system. Hence, Neural Network (NN) based GPS/INS integration approaches are available in the literature, but they suffer from convergence problems and inaccuracy. To obtain better convergence, a recurrent neural network, the Jordan Neural Network, is proposed (a minimal forward-pass sketch follows this entry). The Backpropagation Algorithm (BPA) is normally used to train recurrent neural networks, but it has disadvantages such as a slow convergence rate and inaccuracy due to local minima. To overcome these problems, a Jordan Neural Network trained with evolutionary algorithms, namely the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), is proposed to obtain better position accuracy of the target. In this work, GPS/INS integration based on a Back Propagation Neural Network (BPNN) and on Jordan Neural Networks trained with BPA, GA and PSO is analyzed and the performance parameters are compared.
Keyword : Recurrent Neural Network (RNN); Jordan Neural Network; Genetic Algorithm (GA); Particle Swarm Optimization (PSO)
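The distinguishing feature of the Jordan network mentioned above is that the previous output is fed back as a context input. A minimal NumPy forward-pass sketch follows; the layer sizes, weight initialization and the omitted GA/PSO/BPA training loop are all illustrative assumptions.

    import numpy as np

    class JordanNN:
        """Minimal Jordan recurrent network: the previous output is fed back
        as context units alongside the current input (illustrative only)."""

        def __init__(self, n_in, n_hidden, n_out, seed=0):
            rng = np.random.default_rng(seed)
            self.W_h = rng.normal(scale=0.1, size=(n_hidden, n_in + n_out))
            self.b_h = np.zeros(n_hidden)
            self.W_o = rng.normal(scale=0.1, size=(n_out, n_hidden))
            self.b_o = np.zeros(n_out)
            self.context = np.zeros(n_out)         # fed-back previous output

        def step(self, x):
            z = np.concatenate([x, self.context])  # current input + context
            h = np.tanh(self.W_h @ z + self.b_h)
            y = self.W_o @ h + self.b_o            # e.g. predicted INS error
            self.context = y                       # context for the next step
            return y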




Maria Jose Marquez, ,Luis Manuel Sarro
Doi : 10.7321/jscse.v3.n3.98
Page : 644 - 650
Show Summary
Abstract . Calibration is nowadays one of the most important processes involved in extracting valuable data from measurements. The availability of an optimal data cube measured from a heterogeneous set of instruments and surveys relies on a systematic and robust approach to the corresponding measurement analysis. In this sense, the inference of configurable instrument parameters, as part of data modelling, can considerably increase the quality of the data obtained. Any measurement devoted to scientific purposes contains an element of uncertainty; the level of noise, for example, determines the limit of usability of an image. Therefore, a mathematical model representing the reality of the measured data should also include at least the sources of noise which are most relevant in the context of that measurement. This paper proposes a solution based on Bayesian inference for the estimation of the configurable parameters relevant to the signal-to-noise ratio. The information obtained by resolving this problem can be handled in a very useful way if it is considered as part of an adaptive loop for the overall measurement strategy, in such a way that the outcome of this parametric inference increases the knowledge available for a model comparison problem in the context of data modelling and measurement interpretation. The context of this problem is the multi-wavelength measurements coming from diverse cosmological surveys and obtained with various telescope instruments. As a first step, a thorough analysis of the typical noise contributions is performed based on the current state of the art of modern telescope instrumentation. A second step consists of identifying the configurable parameters relevant to the noise model under consideration. As a third step, Bayesian inference is applied to the estimation of these parameters, taking into account a proper identification of the nuisance parameters and an adequate selection of the prior probability. Finally, a corresponding set of conclusions is derived.
Keyword : signal; noise; gain; quantum efficiency; count; read noise; dark current; nuisance parameters




Majid Tajamolian, Majid Taghiloo, Mohammad Ali Agheli
Doi : 10.7321/jscse.v3.n3.99
Page : 651 - 658
Show Summary
Abstract . Traditional software development methods have distinct phases that must be accomplished step by step. The most important stage is the analysis and design phase, whose resulting architecture becomes the basis of the implementation. Since the software framework is created from scratch, maximum flexibility is available in the architecture design and development. In terms of methodology, however, product development based on open source software differs from traditional methods: the software product is produced by integrating separate open source modules. Each of these modules is an independent, standalone product, and to cover the additional functional requirements they must be put together. To provide its own functionality, each independent module uses a set of blocks as the architectural components of the module. In this paper, a new methodology is proposed that addresses the challenges arising in the course of product development based on open source software.
Keyword : Software development methodology; Open source; FOSS; Security




Thomas Chowdhury, Happy Rani Debi
Doi : 10.7321/jscse.v3.n3.100
Page : 659 - 662
Show Summary
Abstract . Advertising over the internet has gained great momentum in recent years. Compared with traditional media such as television and newspapers, the internet and the World Wide Web provide a more rapid way of advertising and largely decrease the cost of publishing and updating advertisements. Currently, websites of various types host many embedded advertisements, but these are often neither relevant nor consistent with users' interests. In this paper, an online intelligent advertising system is introduced that provides a more efficient, effective and smart solution for online advertising. It selects advertisements according to the user's activities, such as visits, likes and dislikes, by calculating priorities from the trust network among users. It also presents advertisements according to user information such as location, environment, gender and income. The proposed intelligent approach attempts to learn a user's profile from the given information and then suggests relevant advertisements accordingly.
Keyword : online advertising system; trust network; embedded system; intelligent approach; subjective logic




Tehmina Ayub, M. Fadhil Nuruddin, Sadaqat Ullah Khan, Fareed Ahmed Memon
Doi : 10.7321/jscse.v3.n3.101
Page : 663 - 667
Show Summary
Abstract . This paper focuses on the development of a predictive model for the compressive strength of concrete confined with ferrocement, using the MATLAB Artificial Neural Network (ANN) approach. Data on fifty-five (55) plain concrete cylinders confined with ferrocement in three (03) ways were gathered from the existing literature; the basic parameters of nineteen (19) randomly selected specimens were used in a multilayer feed-forward neural network to develop a predictive model through training. The eight basic input parameters included the cylinder and core dimensions, the number of wire-mesh layers, the wire diameter and spacing, the yield strength of the wire-mesh wire and the unconfined compressive strength. After training, the predictive model was tested on the full data set of fifty-five (55) specimens, which showed excellent agreement between the results generated by the ANN model and the experimental results. The regression value (R), root mean square error (RMS) and absolute fraction of variance (V) were also calculated to compare the experimental and ANN results (these metrics are sketched after this entry), and likewise showed the good performance of the ANN predictive model.
Keyword : compressive strength; confinement; ferrocement; wire-mesh layers; artificial neural network
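The three comparison statistics named in the abstract can be computed directly; the definitions below are the forms commonly used in ANN prediction studies and are assumed here, since the paper's exact formulas are not reproduced in the abstract.

    import numpy as np

    def evaluate(predicted, experimental):
        """Regression value R, RMS error and absolute fraction of variance V."""
        p = np.asarray(predicted, dtype=float)
        t = np.asarray(experimental, dtype=float)
        r = np.corrcoef(p, t)[0, 1]                       # regression value R
        rms = np.sqrt(np.mean((p - t) ** 2))              # root mean square error
        v = 1.0 - np.sum((t - p) ** 2) / np.sum(t ** 2)   # absolute fraction of variance
        return r, rms, v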




Sonal Rane, Satish Shah, Dharmistha Vishwakarma
Doi : 10.7321/jscse.v3.n3.102
Page : 668 - 673
Show Summary
Abstract . Real-time applications and QoS are important services provided by the IEEE 802.15.4 standard for Wireless Sensor Networks (WSNs). Through its Guaranteed Time Slot (GTS) mechanism, the standard can support time-critical applications. This paper explores the underutilization of bandwidth in WSNs and analyzes the GTS mechanism by evaluating throughput using an Artificial Neural Network (ANN) soft computing technique in OPNET Modeler. Our focus is on improving GTS throughput, which can be achieved by selecting the packet size according to the data rate and inter-arrival rate using the ANN technique.
Keyword : Wireless sensor network (WSN); OPNET; Artificial Neural Network; IEEE 802.15.4; GTS; Packet Medium Access Delay




Narasimha Prasad L V, Prudhvi Kumar Reddy K, M M Naidu
Doi : 10.7321/jscse.v3.n3.103
Page : 674 - 682
Show Summary
Abstract . The population of the world has been increasing substantially, and populous countries like India are seriously lagging behind in providing basic needs to their people. Food is one of the basic needs that every country has to fulfil, and agriculture is one of the major sectors on which one third of the Indian population depends. In irrigation-based countries like India, water is the basic resource that drives plant growth. The main resource for irrigation is rainfall, which is scientifically a liquid form of precipitation produced by atmospheric nimbus clouds. Prediction of precipitation is necessary, as it has to be considered in the financial planning of a country. The meteorological departments of every nation are keen on recording precipitation datasets, which are huge in content. Hence, data mining is an apt tool for extracting the relations between the datasets and their attributes. Supervised Learning in Quest (SLIQ) is one such data mining algorithm, essentially a decision tree, used here to predict precipitation from historical data. The SLIQ decision tree using the gain ratio (sketched after this entry) establishes the relation between the attribute set and precipitation, and furnishes predictions with an accuracy of 77.78%.
Keyword : Data Mining ; Decision Tree ; Meteorology ; Precipitation ; Prediction ; Rainfall ; SLIQ
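The gain-ratio criterion used to grow the decision tree can be written down directly. The sketch below handles a categorical attribute split and is illustrative only; the full SLIQ algorithm additionally relies on pre-sorted attribute lists for scalability.

    import math
    from collections import Counter

    def entropy(labels):
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def gain_ratio(attribute_values, labels):
        """Information gain of a categorical split divided by its split information."""
        n = len(labels)
        groups = {}
        for a, y in zip(attribute_values, labels):
            groups.setdefault(a, []).append(y)
        cond_entropy = sum(len(g) / n * entropy(g) for g in groups.values())
        gain = entropy(labels) - cond_entropy
        split_info = -sum(len(g) / n * math.log2(len(g) / n) for g in groups.values())
        return gain / split_info if split_info > 0 else 0.0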




Basel Magableh, Michela Bertolotto
Doi : 10.7321/jscse.v3.n3.104
Page : 683 - 691
Show Summary
Abstract . With the continuous increase of available geographical information services, requirements for personalising map content according to the user's profile and context information are increasingly important. Map personalisation applications could adapt their functionality and behaviour to provide the user with specific spatial data related to his or her interests at runtime. However, this can be achieved only if map applications are able to filter and prioritise geospatial data using dynamic decision-making processes that consider users' profiles and context when selecting and styling map content according to their needs. To this aim, this article proposes a new approach to map personalisation using a dynamic rule-based engine, which gives the map application the ability to change its styles and rules dynamically according to users' profiles. This approach differs from the majority of existing works, which seek to embed the styling rules in the functional implementation of the map application. In addition, the personalisation engine is integrated with context-driven adaptation, which allows the application to monitor, detect, analyse and react to changes in the computational environment and users' profiles. This enables map applications to use styling rules that provide different levels of personalisation and to adapt to changes in the computational environment, including the level of resources and quality of services.
Keyword : self-adaptive map personalisation service ; context oriented software development ; map personalisation




Sami Alwakeel, Muhammad Ammad-uddin
Doi : 10.7321/jscse.v3.n3.105
Page : 692 - 699
Show Summary
Abstract . A femtocell is a small cellular base station, typically designed for use in a home or small business. With public access to a femtocell system, a large number of public users will migrate to the femtocell network whenever they enter its coverage area. As a result, the femtocell becomes severely congested and its home users suffer performance degradation. In this study, we propose a soft computing system for call admission in femtocell networks. The system allows a public call to be accepted only if it does not degrade the performance of indoor home calls. It consists of a resource partition policy and a call admission algorithm for the dynamic resource management of the femtocell network. The partition-based control policy partitions the femtocell resources and prioritizes calls as: femtocell voice calls, femtocell data calls, handover public voice calls, handover public data calls, new public voice calls, and new public data calls, respectively. The Call Admission Control (CAC) algorithm is implemented to optimize the admission of calls to the femtocell network partitions. We tested this policy in combination with six different CAC algorithms: 1) Open Access (OA), 2) Restricted Access (RA), 3) Open Shared Access (OSA), 4) Probabilistic Shared Access (PSA), 5) Probabilistic Access (PA) and 6) Bayesian-based Dynamic Probability Access (BDP), and compared their performance. The results show that our proposed soft computing based system for the dynamic management of femtocell resources improves network efficiency and throughput.
Keyword : Soft computing based Femtocell resource management



In Publish
Jun Chen, Tao Han, Guangjun Wang, Yang Yu, Zhihua Yu, Linbo Luo
Show Summary
Abstract . In this paper, we propose a new method for remote sensing image registration. The method consists of three steps. First, it extracts SIFT feature correspondences from the given remote sensing image pair; these correspondences in general contain many mismatches. We then estimate the transform between the image pair from the feature correspondences using a robust algorithm based on vector field consensus, which is the main novelty of our approach. The transform is modeled in a function space called a reproducing kernel Hilbert space. Finally, we use the backward approach for image resampling and transformation. We compare our approach to the typical RANSAC-based method, and experimental results show the superiority of our method.
Keyword : Image registration ; Remote sensing ; Transform estimation ; Vector field consensus ; Robust estimator




Farhad Nematy, Naeim Rahmani
Doi : 10.7321/jscse.v3.n3.107
Page : 706 - 713
Show Summary
Abstract . A wireless sensor network (WSN) is a large-scale ad-hoc multi-hop network deployed (usually at random) in a region of interest for surveillance purposes. Coverage is one of the important aspects of wireless sensor networks, and many approaches have been introduced to maximize it. In this paper, a novel approach for maximizing coverage is proposed. A Voronoi diagram divides the field into cells, and coverage holes exist inside each cell. Because Voronoi cells have different sizes, different numbers of additional nodes must be placed inside the cells to cover the holes. A genetic algorithm is used to determine the best positions for the additional nodes so as to maximize coverage (a per-cell sketch follows this entry). The proposed algorithm is distributed, and the optimization for each Voronoi cell can be done in parallel with the others. Optimal node placement can guarantee maximum coverage with fewer nodes, so energy consumption decreases. Simulation results show that our new approach outperforms earlier work.
Keyword : wireless sensor network; voronoi diagram; genetic algorithm
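A per-cell genetic search of the kind described can be sketched as follows: each chromosome is a set of candidate positions for the additional nodes inside one Voronoi cell, and the fitness is the fraction of sampled cell points within sensing range. The sensing radius, population size and sampling of the cell are illustrative assumptions.

    import math
    import random

    def coverage(nodes, sample_points, r):
        """Fraction of sample points covered by at least one node."""
        covered = sum(1 for p in sample_points
                      if any(math.dist(p, n) <= r for n in nodes))
        return covered / len(sample_points)

    def ga_place_nodes(cell_points, k, r, pop=30, gens=100, seed=0):
        """Place k additional nodes inside one Voronoi cell to maximize coverage."""
        rng = random.Random(seed)
        population = [[rng.choice(cell_points) for _ in range(k)] for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=lambda s: coverage(s, cell_points, r), reverse=True)
            parents = population[:pop // 2]
            children = []
            while len(parents) + len(children) < pop:
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, k) if k > 1 else 0
                child = a[:cut] + b[cut:]                 # one-point crossover
                if rng.random() < 0.2:                    # mutation
                    child[rng.randrange(k)] = rng.choice(cell_points)
                children.append(child)
            population = parents + children
        return max(population, key=lambda s: coverage(s, cell_points, r))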




Ayesha Haider Ali, Faria Kanwal, Afnan Bashir, Kiran Zia, Komal Bashir
Doi : 10.7321/jscse.v3.n3.108
Page : 714 - 719
Show Summary
Abstract . Due to the growing dependence on the web for information, entertainment, business, education and real-time applications, the demand for high-speed broadband wireless systems is increasing rapidly. The IEEE 802.16 (WiMAX) standard has emerged as a solution to meet these requirements. IEEE 802.16 has the following advantages: high data rates, wireless access for the last mile, point-to-multipoint communication, a high frequency range and QoS for various types of application flows. However, the details of the packet scheduling mechanisms are left unspecified in the standard. Therefore, we propose a QoS scheduling architecture for efficient bandwidth allocation. Our main goals are to provide delay and bandwidth guarantees for various types of applications and to maintain fairness among flows.
Keyword : WiMAX, QoS, IEEE802.16




Kevin Fuchs
Doi : 10.7321/jscse.v3.n3.109
Page : 720 - 725
Show Summary
Abstract . This paper introduces a new approach to the automated discovery of heterogeneous network topologies. The algorithm uses only information stored in the Address Forwarding Tables (AFTs) of the network devices. There have been various efforts to find an algorithmic solution using only AFTs, but AFTs contain incomplete information, which has made it difficult to develop efficient solutions. This paper describes a new probabilistic method whose basis is the calculation of the degrees to which entries in the AFTs overlap (an illustrative overlap measure is sketched after this entry). These overlap degrees are used to calculate the interconnection probabilities of network devices, and finally the most probable topology is selected. Although the search space may become extremely large, the algorithm remains pleasingly efficient.
Keyword : topology discovery; heterogeneous networks ; link layer; overlap degrees of address sets; connection probabilities; Kruskal’s algorithm
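One plausible way to compute such an overlap degree (the paper's exact definition is not reproduced in the abstract, so this is an assumed form) is to intersect the MAC address sets learned on two ports, normalize by the smaller set, and treat the value as evidence for a shared link.

    def overlap_degree(aft_a, aft_b):
        """Degree to which two AFT port entries overlap.

        aft_a, aft_b: sets of MAC addresses learned on two ports.
        Returns a value in [0, 1]; normalization by the smaller set is an assumption.
        """
        if not aft_a or not aft_b:
            return 0.0
        return len(aft_a & aft_b) / min(len(aft_a), len(aft_b))

    # Two ports that learned mostly the same hosts score close to 1.
    a = {"00:11:22:33:44:01", "00:11:22:33:44:02", "00:11:22:33:44:03"}
    b = {"00:11:22:33:44:02", "00:11:22:33:44:03"}
    print(overlap_degree(a, b))   # 1.0 -> strong evidence of an interconnection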




Ziaeddin Beheshtifard, Mohammad Reza Meybodi
Doi : 10.7321/jscse.v3.n3.110
Page : 726 - 732
Show Summary
Abstract . In this paper, we examine the problem of channel assignment in multi-channel multi-radio wireless mesh networks. We propose a new learning-automata-based channel assignment scheme that adaptively improves the overall network throughput by estimating the channel state (a generic automaton update is sketched after this entry). The ability to send packets via upstream links serves as the evaluation basis for assigning channels to the radio interfaces of each node. We use a link capacity function that reflects the degree of interference imposed by the channels selected by each node. According to the dynamics of the system, the proposed algorithm assigns channels to radio interfaces in a distributed fashion so as to minimize interference in the neighborhood of each node. We analyze the stability of the system via an appropriate Lyapunov-like trajectory and show that the system converges to its optimum point.
Keyword : Learning Automata ; Wireless Mesh Networks ; Channel Assignment
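The per-node channel choice can be pictured with a standard linear reward-inaction learning automaton; this is a generic scheme, not necessarily the exact automaton or reinforcement signal used in the paper. Each node keeps a probability vector over channels, a reward (acceptable link capacity) reinforces the chosen channel, and a penalty leaves the vector unchanged.

    import random

    class LinearRewardInaction:
        """L_R-I automaton over a set of channels (illustrative sketch)."""

        def __init__(self, n_channels, a=0.05, seed=0):
            self.p = [1.0 / n_channels] * n_channels   # channel probabilities
            self.a = a                                 # reward (learning) rate
            self.rng = random.Random(seed)

        def choose(self):
            return self.rng.choices(range(len(self.p)), weights=self.p)[0]

        def update(self, chosen, rewarded):
            """rewarded=True if the chosen channel gave acceptable link capacity."""
            if not rewarded:
                return                                  # inaction on penalty
            for i in range(len(self.p)):
                if i == chosen:
                    self.p[i] += self.a * (1.0 - self.p[i])
                else:
                    self.p[i] -= self.a * self.p[i]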




Pooja Suratia, Pooja Suratia, Nirmalkumar Reshamwala
Doi : 10.7321/jscse.v3.n3.111
Page : 733 - 738
Show Summary
Abstract . This paper proposes a robust channel estimator for the downlink of the Long Term Evolution-Advanced (LTE-A) system using an Artificial Neural Network (ANN) trained by the backpropagation algorithm (BPA) and an ANN trained by a genetic algorithm (GA). The new methods use the information provided by the received reference symbols to estimate the total frequency response of the channel in two phases. In the first phase, the proposed method learns to adapt to the channel variations, and in the second phase it predicts the channel parameters. The performance of the estimation methods is confirmed by simulations in the Vienna LTE-A Link Level Simulator. The performance of the proposed channel estimators, the ANN trained by GA and the ANN trained by BPA, is compared with the traditional Least Squares (LS) algorithm (sketched after this entry) for the Closed Loop Spatial Multiplexing Single-User Multi-Input Multi-Output (2x2) (CLSM-SUMIMO) case.
Keyword : LTE-A, MIMO, Artificial Neural Network, Backpropagation
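The LS baseline referred to above estimates the channel at the reference (pilot) subcarriers by dividing received by transmitted symbols and interpolating across the remaining subcarriers. The single-antenna sketch below is illustrative; the 2x2 MIMO case additionally requires a per-resource-element matrix inversion.

    import numpy as np

    def ls_channel_estimate(rx_pilots, tx_pilots, pilot_idx, n_subcarriers):
        """Least-squares estimate H = Y/X at pilot positions, linearly
        interpolated over all subcarriers (single-antenna illustration)."""
        h_pilots = rx_pilots / tx_pilots               # per-pilot LS estimate
        k = np.arange(n_subcarriers)
        h_real = np.interp(k, pilot_idx, h_pilots.real)
        h_imag = np.interp(k, pilot_idx, h_pilots.imag)
        return h_real + 1j * h_imag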




Sarra Mamechaoui, Didi Fedoua, Guy Pujolle
Doi : 10.7321/jscse.v3.n3.112
Page : 739 - 744
Show Summary
Abstract . In the last few years, telecom operators, Internet service providers and public organizations have reported statistics on network energy requirements and the related carbon footprint, showing an alarming and growing trend. With this increasing demand for energy, together with the rise in carbon dioxide levels produced by wireless devices in idle mode, it has become necessary to develop technology that reduces energy consumption. In this context, Wireless Mesh Networks (WMNs) are commonly considered the most suitable architecture because their versatility allows flexible configurations. This paper focuses mainly on studies that propose protocols for different types of wireless networks. Several approaches are presented, along with discussions of the details of energy management in WMNs, and a classification of the main existing techniques for energy conservation is provided.
Keyword : WMN; Power control; Connected Active Subset; Asynchronous Wake-up; Synchronized Wake-up




Anthony Marcus, Ionut Cardei, Borko Furht, Osman Salem, Ahmed Mehaoua
Doi : 10.7321/jscse.v3.n3.113
Page : 745 - 752
Show Summary
Abstract . Various implementations of wireless sensor networks (i.e. personal area and wireless body area networks) are prone to node and network failures due to characteristics such as limited node energy resources and hardware damage incurred from their surrounding environment (e.g. flooding, forest fires, a patient falling). This may jeopardize their reliability to act as early warning systems, monitoring systems for patients and athletes, and industrial and environmental observation networks. Following the current trend and widespread use of hand-held mobile communication devices, we outline an application architecture designed to detect and predict faulty nodes in wireless sensor networks. Furthermore, we implement our design as a proof-of-concept prototype for Android-based smartphones, which may be extended to develop other applications for monitoring networked wireless personal area and body sensors used in other capacities. We have conducted several preliminary experiments to demonstrate the use of our design, which is capable of monitoring networks of wireless sensor devices and predicting node faults based on several localized metrics. As attributes of such networks may change over time, any models generated when the application is initialized must be updated periodically so that the applied machine learning algorithm maintains high levels of both accuracy and precision. The application is designed to discover node faults and, once identified, alert the user so that appropriate action may be taken.
Keyword : wireless sensor networks ; data mining and machine learning ; smartphone



In Publish
Marc Jansen
Show Summary
Abstract . One of the technical building blocks of cloud computing infrastructures is Web Services. The role of mobile devices as Web Service consumers is widely accepted, and today a large number of mobile applications already consume Web Services in order to fulfil their tasks. Still, not much research has been conducted so far on deploying Web Services on mobile devices and thus using these devices as Web Service providers. This paper presents an analysis of one already implemented approach for provisioning mobile Web Services with respect to energy/battery consumption. After briefly presenting the implementation for the provisioning of mobile Web Services, an evaluation of the battery consumption that results from using the approach is presented, followed by an improvement with respect to battery consumption. The performance test shows that the improved approach provides a reasonable way to introduce Web Service provisioning for mobile devices.
Keyword : mobile devices; Web Services; mobile Web Service provisioning




Ghazal Riahi
Doi : 10.7321/jscse.v3.n3.115
Page : 760 - 768
Show Summary
Abstract . Today, a security plan is a key factor in all computing systems. We use two infrastructures in this paper. The first is the Smart Grid, which exploits the benefits of distributed computing and communications to deliver real-time information and enable the near-instantaneous balancing of supply and demand at the device level. The second is the AGC4ISR architecture, which combines Autonomic Grid Computing with the C4ISR (Command, Control, Communications, Computers and Intelligence, Surveillance, & Reconnaissance) architecture. This paper proposes a security plan for a Smart Grid system based on AGC4ISR, presenting a solution covering encryption, intrusion detection, key management and the details of cyber security in Smart Grids. We apply cryptography to the packets in AGC4ISR and manage the sending and receiving of packets in the smart grid; this is necessary to keep smart grids from losing packets.
Keyword : Security Plan, Encryption, Intrusion Detection, Key management, C4ISR, AGC4ISR, SAGC4ISR



In Publish
Nikita Shilnikov
Show Summary
Abstract . With the integrated use of frequency-efficient modulation and high-speed, energy-effective, noise-immune coding it is possible to approach Shannon’s bound asymptotically. This is the main reason why this article deals not with binary but with ternary concatenated codes (TCC), together with the two-dimensional modulation QAM-9. The processing flowchart proposed by the authors implies the use of two-dimensional QAM modulation and two NC codecs (one in each quadrature branch). The signal processing method based on this flowchart makes it possible to approach Shannon’s bound asymptotically with the lowest power consumption Ebit/N0 =
Keyword : Noise–immune coding ; bit error rate (BER) ; constant weight subcode (CWS) ; Shannon’s bound ; ternary concatenated codes (TCC) ; word error probability ; the continuous AWGN channel




Xiaodong Zhang, Ping Zhang, Qi Zhang, Sanyong Yan, Xiaoni Dong
Doi : 10.7321/jscse.v3.n3.117
Page : 773 - 779
Show Summary
Abstract . In order to eliminate pulse and noise interference in weak signals, a filtering algorithm based on composite cascade mathematical morphology is investigated in this paper. The work mainly includes the principle analysis of the morphological filter, the design of the filter element structure, and the construction of the composite cascade (a simplified cascade is sketched after this entry). Finally, simulation analysis and experimental verification show that the filtering algorithm based on the composite cascade mathematical morphology is simple and fast to implement, and that it can effectively filter out the pulse interference and random noise of weak signals such as the actual oil film thickness signal obtained by a fiber optic displacement sensor. Furthermore, the filtering algorithm will play a very important role in the feature extraction of weak electroencephalography signals.
Keyword : fiber optic displacement sensor ; oil film thickness ; morphological filter
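A common way to build such a composite morphological filter is to cascade averaged opening-closing and closing-opening branches with flat structuring elements of increasing length; the sketch below uses SciPy's grey-scale morphology, and the element lengths are assumptions rather than the paper's designed filter elements.

    import numpy as np
    from scipy.ndimage import grey_closing, grey_opening

    def open_close(x, size):
        return grey_closing(grey_opening(x, size=size), size=size)

    def close_open(x, size):
        return grey_opening(grey_closing(x, size=size), size=size)

    def composite_cascade_filter(signal, sizes=(3, 5, 7)):
        """Average of cascaded OC/CO branches over growing structuring elements;
        suppresses impulses and random noise in a weak 1-D signal."""
        x = np.asarray(signal, dtype=float)
        for size in sizes:                      # cascade stages of growing size
            x = 0.5 * (open_close(x, size) + close_open(x, size))
        return x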




Semseddin Gunduz
Doi : 10.7321/jscse.v3.n3.118
Page : 780 - 785
Show Summary
Abstract . This study investigates the effect of online cooperative learning homework practices on the academic achievement of students. The study group consists of 58 students from the Department of Computer Education and Instructional Technologies of the Faculty of Education at Anadolu University. Students in section A were assigned to the traditional method by random assignment, and those in section B to the online homework practice method. In each class of 29 students, 14 prepared their homework individually and the remaining 15 prepared their homework cooperatively in groups of three; it was the students' own choice whether to work individually or cooperatively. An achievement scale was administered at the end of the teaching period. According to the results, there is no statistically significant difference between students who followed traditional homework practices and those who followed online homework practices, nor between students who followed individual homework practices and those who followed cooperative homework practices. The academic achievement of students who followed online individual homework practices is higher than that of students in the traditional individual and online cooperative learning homework practices.
Keyword : online learning; cooperative learning; homework




Behnam Faghih, Mohammad Reza Azadehfar, Serajddin Katebi
Doi : 10.7321/jscse.v3.n3.119
Page : 786 - 794
Show Summary
Abstract . The user interface (UI) is the point of interaction between the user and computer software, and the success or failure of a software application depends on its User Interface Design (UID). Usability, ease of use and ease of learning are all influenced by UID. The UI is particularly significant in the design of educational (e-learning) software, where principles and concepts of learning should be considered in addition to UID principles. In this regard, to specify the logical relationship between education, learning, UID and multimedia, we first readdress the issues raised in previous studies. This is followed by an examination of the principal concepts of e-learning and UID. We then show how UID contributes to e-learning through educational software built by the authors. We also show how the UI can be used to improve learning, motivate learners and improve the time efficiency of using e-learning software.
Keyword : e-Learning ; User Interface Design ; Self learning




Muhammad Salim Javed, Ahmad Kamil Bin Mahmood, Suziah Bt. Sulaiman, Arshad Javed, Abdulhameed Rakan Alenezi
Doi : 10.7321/jscse.v3.n3.120
Page : 795 - 799
Show Summary
Abstract . Quality enhancement in higher learning institutions is very important at all levels. The standard of quality of higher learning institutions has been significantly affected by the low quality of intake, teaching staff and physical facilities, and by a lack of quality control. The main objective of this paper is to present the status of quality education in the higher learning institutions of Arab state universities, which is quite questionable in the global context in terms of knowledge, facilities and systems. Data were collected from fifteen (15) higher learning institutions of Arab state universities through interviews and a structured questionnaire; respondents were given the choice of being interviewed or self-administering the questionnaire. This paper questions the wisdom of the quality assurance move in the Arab state higher education system when many of the conditions necessary for its success are not present. We propose a Quality Enhancement Cell as a focal point for quality learning in almost all faculties of higher learning institutions in Arab state universities. The study is used to put forward some recommendations for improving the quality of education in Arab state universities.
Keyword : QEC, QAC, HLI




Farrukh Amin, Norizan Yasin
Doi : 10.7321/jscse.v3.n3.121
Page : 800 - 808
Show Summary
Abstract . This paper investigates the performance of an E-Learning System (ELS) among college teachers. The study incorporates the DeLone & McLean (1992, 2002) model for measuring the performance of information systems, and also tests the correlations among various dimensions of information systems. This exploratory research indicates that there is a higher level of user satisfaction when the system is user-friendly and easy to use, which leads to a higher correlation between user-friendliness and fast processing and retrieval of information. This paper does not represent the views of all ELS users, since the system has been tested only in the Yanbu Industrial College environment.
Keyword : E-Learning System; Information System Dimension; Individual Impact; performance measurement




Sazilah Salam, Siti Nurul Mahfuzah Mohamad, Norasiken Bakar, Linda Khoo Mei Sui
Doi : 10.7321/jscse.v3.n3.122
Page : 809 - 815
Show Summary
Abstract . This paper addresses the design of Online Multiple Intelligence (MI) Teaching Tools for Polytechnic lecturers. These teaching tools can assist lecturers in creating their own teaching materials without requiring any knowledge of Information Technology (IT), especially programming. The theory of MI is used in this paper; it postulates that everybody has at least two or more intelligences. Multiple approaches embedded in a series of activities via online teaching tools must be implemented in order to achieve effective teaching and learning in the classroom. The objectives of this paper are to identify the relationship between students' self-perceived MI and their academic achievement at the Polytechnic, and to design online MI tools for teaching at the Polytechnic. The paper also addresses the theoretical framework and MI teaching activities. The instrument used for this study was the Ujian Multiple Intelligence (UMI). The results showed that Polytechnic students have strengths in the Interpersonal, Visual-Spatial and Verbal-Linguistic intelligences.
Keyword : Multiple Intelligences ; Online Teaching Tools ; Interpersonal ; Visual-Spatial ; Verbal-Linguistic




Alka Singhal, Rajni Jindal
Doi : 10.7321/jscse.v3.n3.123
Page : 816 - 820
Show Summary
Abstract . The absence of proper metrics for evaluating cloud services holds customers back from deciding whether or not to shift their e-learning services to the cloud. There is therefore a need for a reliable Quality of Service model that can act as the basis for a well-defined Service Level Agreement between cloud providers and consumers. The Quality of Service model is based on various functional and non-functional requirements. This paper provides a set of cloud metrics, such as availability and elasticity, that will help cloud providers make cloud computing a mature and stable framework for e-learning systems, one that consumers at any level can adopt with confidence. The paper proposes a Quality of Service model for a cloud-based e-learning system in which the construction and maintenance of the e-learning system is done by the cloud service providers, and these services are then used by the e-learning provider on a cost-per-unit basis.
Keyword : Framework ; Quality of Service ; Cloud Computing ; Service layers ; E-learning system




Arshia Khan
Doi : 10.7321/jscse.v3.n3.124
Page : 821 - 825
Show Summary
Abstract . Integrated curricula and experiential learning are the main ingredients in the recipe for improving student learning in higher education. In the academic computer science world it is mostly assumed that this experiential learning takes place at a business as an internship experience [3]. The intent of this paper is to challenge the traditional understanding that equates experiential learning with internships. A model based on the integrated-curricula concept was created and tested in three consecutive years of software engineering classes, and a survey was conducted to measure its usability. The results indicated that the students' hard/technical and soft professional skills improved. The paper first describes the model and then discusses the results of the survey. According to [1], most models in this area are created for freshmen, whereas this model has been created for juniors and seniors, who are at a level of extensive independent learning.
Keyword : Integrated Pedagogical, Experiential learning.




Syarifah Diyanah Yusoh, Sazilah Salam
Doi : 10.7321/jscse.v3.n3.125
Page : 826 - 828
Show Summary
Abstract . This paper presents a preliminary study of tablet acceptance among children with high functioning autism. Children diagnosed with high functioning autism have an IQ score of 80 or above; generally, their autism characteristics are not obvious, and they are often mistakenly overlooked because of their low-profile characteristics. The discussion is based on observations of three high functioning autistic children, feedback from a questionnaire distributed to all 20 caregivers at the National Autism Society of Malaysia (main centre at Titiwangsa, Kuala Lumpur), and an interview with an autism specialist from the same organization. This preliminary study will help to assess the tablet's potential to become a new assistive technology device and a pacing technology for autistic children.
Keyword : autism; assistive technology; special education; tablet; high functioning autism children




Vikram Kumar Kamboj, Mukesh Kumar
Doi : 10.7321/jscse.v3.n3.126
Page : 829 - 831
Show Summary
Abstract . In today's education scenario, the use of computers and computer-based software techniques is increasing day by day. The use of computers and soft computing techniques has driven a revolution in the quality management of higher education systems over the past few decades. This study presents a coherent view of the impact of soft computing techniques on quality management in the Indian higher education system, based on a survey conducted in various universities. The paper aims to provide an overview of the importance of soft computing techniques for quality management in higher education systems.
Keyword : Higher Education, Software, Soft Computing Techniques



In Publish
Bader Alyoubi, Adel A. Alyoubi
Show Summary
Abstract . Academic technologies have been used for technology-enhanced learning since the 1960s. Each new phase in the development of these academic technologies has been driven by technological innovations rather than by the learning needs of the students in academic institutions. E-learning research has attempted, over the last 30 years of its existence, to keep up with these technological innovations, with research studies focusing on various aspects of e-learning in an attempt to improve its pedagogical function. Unfortunately, because most of these innovations were technologically rather than pedagogically based, their implementation in e-learning environments has necessitated further research to solve the problems that are encountered as a result. A number of research studies have focused on student perceptions of their technological learning environments, including perceived quality, the ability to engage the learner, and usability. The purpose of these studies, in most cases, has been to improve a localized implementation (usually within a particular module or course) of learning technologies. Student perceptions of their e-learning environment can be used to improve the implementation and continued use of new academic technologies in higher education institutions. The focus of this research is to develop an e-learning framework that can be used to implement Internet applications in higher education institutions. This article uses two methods to evaluate a range of Internet applications that have been used for e-learning, with the objective of recommending an e-learning framework for the use of Internet applications in e-learning. The two methods are a modified SERVQUAL instrument and a WBLT (web-based evaluation tools) scale, used to evaluate students' perceptions of their experience with these Internet applications in the technology-enhanced learning environment. The e-learning framework will be modified and further developed so that it is reusable and extendible to a wide range of higher education institutions.
Keyword : e-learning, Internet application, Web 2.0




Stanislav Simeonov, Neli Simeonova
Doi : 10.7321/jscse.v3.n3.128
Page : 844 - 847
Show Summary
Abstract . In this paper, a concept for the hardware realization of a graphic tactile display for visually impaired people is presented. Bi-stable solenoids and a PIC-based control board are used to realize the tactile actuators. The selected algorithm for the serial activation of each display row allows a minimal number of active components to be used to set and reset the solenoids. Finally, the program algorithm of the control board is discussed. The project is funded by the Bulgarian National Science Fund under NSF Grant No D-ID-02/14, 2009-2013.
Keyword : tactile display; taxel; bi-stable solenoids; visually impaired




Kamel Echaieb, Maher Azaza, Amine Chouchaine, Abdelkader Mami
Doi : 10.7321/jscse.v3.n3.129
Page : 848 - 852
Show Summary
Abstract . This paper presents the application of photovoltaic energy to supply electrical power for a greenhouse temperature control system. A linearizing algorithm is applied to the physical model of the greenhouse in order to calculate the air flow needed to cool the greenhouse when the temperature exceeds a desired level. Temperature control is achieved with a fan ventilation system driven through a DC/AC converter and an asynchronous machine. The models of the PV system and the greenhouse have been implemented in Matlab/Simulink, and simulations are presented to show the efficiency of the proposed system.
Keyword : Photovoltaic Energy; DC/AC inverter ; Asynchronous Machine ; Greenhouse physical model ; Linearizing algorithm




Poonam S Pardeshi, Hyacinth J Kennady
Doi : 10.7321/jscse.v3.n3.130
Page : 853 - 856
Show Summary
Abstract . Present photovoltaic (PV) solar cells convert solar energy into electricity with an efficiency of less than 20%. A photovoltaic thermal (PV/T) system consists of a PV module with a heat-removing passage underneath it; PV/T systems can therefore provide electricity and heat simultaneously and serve a dual purpose. In this paper, a comparative study of three roof models is discussed: 1) a conventional concrete roof, taken as the base case; 2) a roof with photovoltaic panels; and 3) a roof with photovoltaic thermal (PV/T) collectors. The electricity consumption of each system is computed using the EQUEST software, and the estimated carbon dioxide emissions are also compared for each case.
Keyword : solar, simulation, rooftop, EQUEST




Pablo Cristian Tissera, Alicia Castro, A. Marcela Printista, Emilio Luque
Doi : 10.7321/jscse.v3.n3.131
Page : 857 - 863
Show Summary
Abstract . Computer-based models describing pedestrian behavior in an emergency evacuation play a vital role in the development of active strategies that minimize the evacuation time when a closed area must be evacuated. The reference model has a hybrid structure in which the dynamics of fire and smoke propagation are modeled by means of cellular automata (a toy propagation step is sketched after this entry), while people's behavior is simulated using intelligent agents. The model consists of two sub-models, the environmental one and the pedestrian one. As part of the pedestrian model, this paper concentrates on a methodology able to model some of the behaviors frequently observed in human evacuation exercises. Each agent perceives what is happening around it, selects the options that exist in that context, and then makes a decision that reflects its ability to cope with an emergency evacuation, called in this work its behavior. We also developed simple exercises in which the model is applied to the simulation of an evacuation due to a potential hazard such as fire, smoke or some kind of collapse.
Keyword : Evacuation Simulation ; Behaviours ; Cellular Automata ; Intelligent Agents
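The environmental sub-model's fire dynamics can be pictured with a simple probabilistic cellular-automaton step; the grid coding, the 4-cell neighbourhood and the spread probability are illustrative assumptions, not the reference model's calibrated rules for fire and smoke.

    import random

    EMPTY, FIRE = 0, 1

    def ca_fire_step(grid, p_spread=0.3, rng=None):
        """One synchronous CA update: fire spreads to empty 4-neighbours
        with probability p_spread (toy environmental sub-model)."""
        rng = rng or random.Random(0)
        rows, cols = len(grid), len(grid[0])
        new = [row[:] for row in grid]
        for i in range(rows):
            for j in range(cols):
                if grid[i][j] != FIRE:
                    continue
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols and grid[ni][nj] == EMPTY:
                        if rng.random() < p_spread:
                            new[ni][nj] = FIRE
        return new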




Jihyun Lee, Jun-Hee Park, Kyeong-Deok Moon, Kyungshik Lim
Doi : 10.7321/jscse.v3.n3.132
Page : 864 - 871
Show Summary
Abstract . This paper presents a Simulating Energy Consumption (SEC) system which has real-time energy monitoring and virtual energy simulation capabilities for a specific real home environment. The energy monitoring function provides, in real time, not only the total amount of energy used in a target house but also detailed energy usage profiles for individual home appliances. Based on the real-time measurement of energy usage, the energy simulation function predicts energy consumption when one or more home appliances are virtually removed or newly added. All operations are visualized with 3D spatial information developed from the Building Information Modeling (BIM) database of the target house. Spatial information is integrated with device information in the home resource management middleware, where device information is obtained directly from home appliances interconnected via the bridging function of home networks. Interoperability with various home appliances is also achieved by the home resource management middleware. Our experimental testbed shows that the SEC system can be used as an efficient energy management tool to meet energy saving goals.
Keyword : Simulating Energy Consumption; real-time energy monitoring; smart energy management




Yoram Haddad, Yuval Cohen, Ronen Goldsmith
Doi : 10.7321/jscse.v3.n3.133
Page : 872 - 880
Show Summary
Abstract . The widespread use of smartphones has opened a wide range of opportunities for new applications. To be sure, the ever-increasing use of digital information processing and communications devices has created a commensurate increase in electricity consumption; however, one can also develop applications that encourage and spread ecologically friendly behavior. This article presents the design and implementation of a car ride-sharing application for a mobile environment. The application enables users to share automobile transportation in an efficient and simple way. Use of this system can significantly reduce the number of private automobiles on the roads, yielding substantial ecological, economic and social benefits. Since the application is designed for smartphones, the sharing facility may be used in real time, from anywhere, at any time. The application is based on an algorithm for finding subroutes in a user-defined path according to the number of matched points along the path (a matching sketch follows this entry). This application differs from existing car-sharing applications in several crucial ways; the article describes both the system and the specific differences from other existing software.
Keyword : Mobile communication systems; Software engineering for Internet projects; Location-dependent and sensitive; Real-time and embedded systems
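The matching idea, counting shared points between a passenger's requested path and each driver's route, can be sketched as below; the waypoint representation, the scoring rule and the minimum-match threshold are assumptions made for illustration, not the application's published algorithm.

    def matched_points(driver_route, passenger_path):
        """Number of passenger waypoints that also lie on the driver's route."""
        route = set(driver_route)
        return sum(1 for p in passenger_path if p in route)

    def best_driver(drivers, passenger_path, min_match=2):
        """Pick the driver whose route shares the most waypoints with the
        passenger's path (illustrative matching criterion)."""
        score, name = max((matched_points(route, passenger_path), name)
                          for name, route in drivers.items())
        return name if score >= min_match else None

    # Waypoints as (x, y) grid coordinates:
    drivers = {"car_a": [(0, 0), (1, 0), (2, 0), (3, 0)],
               "car_b": [(0, 0), (0, 1), (0, 2)]}
    print(best_driver(drivers, [(1, 0), (2, 0), (3, 0)]))   # -> car_a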




Payam Porkar Rezaeiye, Mehrnoosh Bazrafkan, Ali Akbar Movassagh, Mojtaba Sedigh Fazli, Gholam Hossein Bazyari
Doi : 10.7321/jscse.v3.n3.134
Page : 881 - 889
Show Summary
Abstract . These days, classification systems with high accuracy that can classify complicated patterns are very useful in medicine and industry. In this article, a process for obtaining the best classifier for LASIK data is suggested. First, the best line and curve are sought for this classifier in order to obtain a good fit; then, using the Markov method, a classifier for topographies is obtained. The aim of this article is to obtain a strong classifier that, under Markov theory, can select eyes appropriate for corneal grafting.
Keyword : HMM, KNN, classification, topography, corneal.




Mourad Abbas
Doi : 10.7321/jscse.v3.n3.135
Page : 888 - 891
Show Summary
Abstract . In this paper, we present Arabic language learning software that we have implemented by incorporating both Automatic Speech Recognition (ASR) and Text-to-Speech (TTS) systems. In a first step, this educational software allows non-Arabic speakers and/or primary-school learners to acquire the Arabic alphabet and some basic rules of Standard Arabic; in a second step, it helps users practice reading. As the software requires the use of the two aforementioned systems, we first present the experiments on the isolated-word recognition system that we have built, and then give a description of the educational software.
Keyword : Educational software; Arabic Language; ASR; TTS




Khalid AlMutib, Ebrahim Mattar, H. Ramdane Muhammad Emad AL-Dean
Doi : 10.7321/jscse.v3.n3.136
Page : 892 - 899
Show Summary
Abstract . Mobile robot visual control systems suffer from a number of issues related to speed and complexity; complicated kinematic relations and the computational time needed to execute a task are further related concerns. This manuscript highlights a mechanism for approximating the interrelated visual kinematic relations that are part of the closed-loop visual servo system through an Artificial Neural Network (ANN) for mobile robot visual servoing. The methodology, applied to the KSU-IMR mobile robot project, is based on integrating neural networks with an image-based visual servoing system. ANNs are employed here to learn and approximate the relations that map target movements to the mobile robot's movements (POWERROB, [1]) through a visual servo.
Keyword : Visual Servo; ANN; Epipolar Geometry; KSU-IMR Robot



In Publish

Show Summary
Abstract . One of the problems that distribution networks face is faults caused by insulator contamination. These faults sometimes lead to sudden and serious damage to the system and reduce power system reliability and power quality. Since insulator washing has a high cost, it should be done according to an organized plan, even though the contamination level is reduced naturally by heavy precipitation and the fault probability then declines. One approach to washing planning is the investigation of the feeder insulators' leakage current (ILC). In this paper, a new method for monitoring the average insulator contamination level using the Packet Wavelet Transform (PWT) and Principal Component Analysis (PCA) is presented (a simplified pipeline is sketched after this entry). The proposed method was examined over a three-month period and an index of the contamination level was extracted. Finally, the algorithm of the proposed method is presented.
Keyword : Insulators Leakage Current ; Packet Wavelet Transform ; Principal Component Analysis
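A rough sketch of the kind of processing described, a hand-rolled Haar wavelet packet decomposition followed by PCA with the first principal-component score taken as the contamination index, is given below; the wavelet, decomposition depth and index definition are assumptions, not the paper's exact algorithm.

    import numpy as np

    def haar_packet_energies(x, levels=3):
        """Energy of every subband of a full Haar wavelet packet tree."""
        bands = [np.asarray(x, dtype=float)]
        for _ in range(levels):
            nxt = []
            for b in bands:
                b = b[: len(b) - len(b) % 2]          # even length for pairing
                approx = (b[0::2] + b[1::2]) / np.sqrt(2.0)
                detail = (b[0::2] - b[1::2]) / np.sqrt(2.0)
                nxt += [approx, detail]
            bands = nxt
        return np.array([np.sum(b ** 2) for b in bands])

    def contamination_index(leakage_windows, levels=3):
        """First principal-component score of per-window subband energies."""
        feats = np.array([haar_packet_energies(w, levels) for w in leakage_windows])
        feats = feats - feats.mean(axis=0)            # center before PCA
        _, _, vt = np.linalg.svd(feats, full_matrices=False)
        return feats @ vt[0]                          # one index value per window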




Manaf Sharifzadeh, Saeid Aragy, Kaveh Bashash, Shahram Bashokian, Mehdi Gheisari
Doi : 10.7321/jscse.v3.n3.138
Page : 901 - 904
Show Summary
Abstract . The creation of small and cheap sensors has promoted the emergence of large-scale sensor networks. Sensor networks allow the monitoring of a variety of physical phenomena, such as weather conditions (temperature, humidity, atmospheric pressure ...), traffic levels on highways or room occupancy in public buildings. Some sensors produce large volumes of data, such as weather temperature readings, and these data should be stored somewhere for user queries. In this paper, two known sensor data storage methods that store data semantically are compared, and it is shown that storing data in ontology form consumes more energy, so the lifetime of the sensor network decreases. These two methods were chosen because they are useful and popular.
Keyword : wireless, semsos, SWE