KaaShiv InfoTech, Number 1 Inplant Training Experts in Chennai.
Today we live in the information era, where information is available at a single click. Web applications play a central role in this: organizations are extending their business from a single room to the whole world with the help of web apps. Web applications generally follow a three-tier architecture in which the database forms the third tier; it is among the most valuable assets of any organization. As the adoption of web applications increases day by day, various attacks against them become possible. SQL injection is an attack in which an attacker directly compromises the database, which makes it one of the most threatening attacks. Various vulnerability scanners have been proposed to deal with it, but none of them detects SQL injection completely: existing tools have low accuracy, produce a high rate of false positives, and take a long time to scan. Here we present a network-based vulnerability-scanner approach that provides better coverage with no false positives within a short span of time.
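As a concrete illustration of the attack the abstract describes, the sketch below (Python with an in-memory SQLite table; the `users` table and payload are invented for illustration) shows how a classic tautology payload dumps every row when input is concatenated into the query, and how a parameterized query defeats it:

```python
import sqlite3

# Hypothetical users table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "a-secret"), ("bob", "b-secret")])

payload = "' OR '1'='1"  # classic tautology injection

# Vulnerable: user input concatenated directly into the SQL string.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + payload + "'").fetchall()
print(len(vulnerable))  # the tautology matches every row

# Safe: the same lookup as a parameterized query.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)).fetchall()
print(len(safe))  # the payload is treated as a literal name: no rows
```

A scanner of the kind proposed above typically works the other way round: it submits payloads like this one and flags endpoints whose responses change.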
A vulnerability scanner is a small computer program designed to assess computers, computer systems, networks, or applications for weaknesses. Many vulnerability scanners are available on the market, distinguished from one another by their focus on particular targets. The goal of running a vulnerability scanner is to identify devices on your network that are open to known vulnerabilities, and to determine what software is present along with additional information about possible issues, such as the type and version of the operating system. This information can be checked against known or recently discovered vulnerabilities that could be exploited to gain access to secure networks and computers. One major issue with vulnerability scanners is their performance impact on the devices they scan: on the one hand, the scan should run in the background without affecting the application; on the other hand, the scan should be thorough. In our project, we crawl the entire application and identify the possible attacks on the system.
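A crawler of the kind mentioned above starts by extracting the links on each page and queueing them for scanning. A minimal sketch using only the standard library (the sample page and base URL are invented):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags; a crawler would visit each one."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's base URL.
                    self.links.append(urljoin(self.base_url, value))

page = '<a href="/login">Login</a> <a href="search?q=1">Search</a>'
parser = LinkExtractor("http://example.com/app/")
parser.feed(page)
print(parser.links)
```

A full scanner would loop: fetch each discovered URL, extract its links and forms, and probe each input it finds.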
Network security situation awareness provides a unique high-level security view based on security alert events, but the complexity and diversity of security alert data on modern networks make such analysis extremely difficult. In this paper, we analyze the existing problems of network security situation awareness systems and propose a framework for network security situation awareness based on knowledge discovery. The framework consists of the modeling of the network security situation and the generation of the network security situation. The purpose of modeling is to construct a formal model of network security situation measurement based on D-S evidence theory, and to support the general process of fusing and analyzing security alert events collected from security situation sensors. The generation step extracts frequent patterns and sequential patterns from the network security situation dataset with knowledge discovery methods, transforms these patterns into correlation rules of the network security situation, and finally generates the network security situation graph automatically. Application of the integrated Network Security Situation Awareness system (Net-SSA) shows that the proposed framework supports accurate modeling and effective generation of the network security situation.
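The D-S (Dempster-Shafer) fusion step referred to above combines the belief masses of two alert sensors with Dempster's rule of combination, discounting conflicting evidence. A minimal sketch, with invented sensor masses over the frame {attack, normal}:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: fuse two mass functions keyed by frozenset focal elements."""
    conflict = 0.0
    combined = {}
    for (b, x), (c, y) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to the empty set
    # Normalize away the conflicting mass.
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two sensors' beliefs about the same event (numbers are illustrative).
A, N = frozenset({"attack"}), frozenset({"normal"})
theta = A | N                          # total ignorance
m1 = {A: 0.6, theta: 0.4}
m2 = {A: 0.7, N: 0.1, theta: 0.2}
fused = combine(m1, m2)
print(round(fused[A], 3))
```

Fusing the two sensors raises the belief in "attack" above either sensor's individual mass, which is exactly the effect alert fusion relies on.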
An enterprise network information system is not only a platform for information sharing and exchange, but also a platform on which the enterprise production automation system and the enterprise management system work together. As a result, the security defense of an enterprise network information system covers not only network security and data security, but also the security of the network business running on the network, that is, the confidentiality, integrity, continuity, and real-time behavior of the network business. Starting from the security defense of enterprise network information systems, this paper proposes the concept of "network business security". The object of information security is defined in three parts, data security, network system security, and network business security, and the network business security model is described. The concept of "network business security" provides a theoretical basis for the security defense of enterprise automatic production systems and enterprise management information systems.
With tremendous attacks on the Internet, there is a high demand for network analysts to understand network security situations effectively. Traditional network security tools lack the capability to analyze and assess network security situations comprehensively. In this paper, we introduce a novel network situation awareness tool, CNSSA (Comprehensive Network Security Situation Awareness), to perceive network security situations comprehensively. Based on the fusion of network information, CNSSA makes a quantitative assessment of the network security situation. It visualizes the situation in multiple and varied views, so that network analysts can understand it easily and comprehensively. The case studies demonstrate how CNSSA can be deployed in a real network and how it can effectively track situation changes in real time.
Based on an analysis of applications of perception control technology to computer network security and of security protection measures, and considering both the network's physical environment and the security of the network software system environment, this paper provides a network security perception control solution using Internet of Things (IoT), telecom, and other perception technologies. The Security Perception Control System operates in the computer network environment, using Radio Frequency Identification (RFID) from the IoT together with telecom integration technology to produce an integrated system design. In the physical security environment, RFID temperature, humidity, and gas perception technologies are used to monitor environmental data; dynamic perception technology is used for the network system security environment; user-defined security parameters and security logs are used for quick data analysis; and control is extended over the I/O interface. Through development of an API and AT commands, computer network security perception control based on the Internet and GSM/GPRS is achieved, which enables users to interactively perceive and control the network security environment via the Web, e-mail, PDA, mobile-phone short messages, and the Internet. In system testing through a middleware server, real-time perception of security information data was achieved with a deviation of 3-5%, which demonstrates the feasibility of the Computer Network Security Perception Control System.
The foundations of network security have not received enough attention, and comprehensive solution models for network security have not been explored thoroughly. In this paper, we make a first attempt to establish several models for the security of network protocols. We divide the security of network protocols into two parts: the implementation security of network protocols and the design security of network protocols. Four models are proposed to clarify the security problems: a software vulnerability model, a scalability model, an authentication model, and a covert model. We also propose several defense principles for all models. Security reduction is proposed to transform the solution of security problems into other available security verification and testing approaches. For example, the implementation security of network protocols is reduced to the security of the software that parses the protocols, so that fuzz testing can be used for verification; pressure testing is used for the scalability model. This exploration can help stimulate further discussion on the foundations of network security, especially the design security of network protocols.
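The reduction from protocol implementation security to parser security suggests fuzz testing, as the abstract notes. A toy sketch of byte-mutation fuzzing against a hypothetical header parser (the wire format, magic bytes, and counts are all invented for illustration):

```python
import random

def parse_header(data: bytes):
    """Toy protocol parser: 2-byte magic, 1-byte length, then payload."""
    if len(data) < 3 or data[:2] != b"\x13\x37":
        raise ValueError("bad magic")
    length = data[2]
    if len(data) - 3 < length:
        raise ValueError("truncated payload")
    return data[3:3 + length]

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip 1-3 random bytes of a known-good input."""
    buf = bytearray(seed)
    for _ in range(rng.randint(1, 3)):
        buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

rng = random.Random(0)
seed = b"\x13\x37\x04ping"            # a valid message to mutate from
rejected = 0
for _ in range(1000):
    try:
        parse_header(mutate(seed, rng))
    except ValueError:
        rejected += 1                 # clean rejections; any other exception
                                      # escaping here would indicate a parser bug
print(rejected > 0)
```

A real fuzzing campaign (e.g. with a coverage-guided fuzzer) applies the same loop to the actual protocol-parsing code and watches for crashes rather than clean rejections.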
Managing complex enterprise networks requires an understanding at a finer granularity than traditional network monitoring provides, and correlating and visualizing the dynamics and inter-relationships among network components such as hosts, users, and applications is non-trivial. Network security visualization has been a highlighted topic of network security research in recent years, and the existing research on it is analyzed here. The paper first proposes a network security situation awareness model and analyzes network security situation awareness methods; it then designs and implements a security situation visualization prototype system based on geographic information systems, the network topology graph, and attack paths. The security situation data are displayed to the user in multiple views, from multiple angles, and at multiple levels through visualization technology, so the presentation of the security situation is more accurate and vivid and the assessment of the network security situation becomes timely and accurate, laying the foundation for rapid decision-making.
Internet security problems remain a major challenge, with many security concerns such as Internet worms, spam, and phishing attacks. Botnets, well-organized distributed network attacks, consist of a large number of bots that generate huge volumes of spam or launch Distributed Denial of Service (DDoS) attacks on victim hosts. New emerging botnet attacks degrade the state of Internet security further. To address these problems, a practical collaborative network security management system is proposed with effective collaborative Unified Threat Management (UTM) and traffic probers. A distributed security overlay network with a centralized security center leverages a peer-to-peer communication protocol in the UTMs' collaborative module and connects them virtually to exchange network events and security rules. Security functions of the UTM are retrofitted to share security rules. In this paper, we propose a design and implementation of a cloud-based security center for network security forensic analysis. We propose using cloud storage to keep collected traffic data and then processing them on cloud computing platforms to find malicious attacks. As a practical example, phishing-attack forensic analysis is presented, and the required computing and storage resources are evaluated based on real trace data. The cloud-based security center can instruct each collaborative UTM and prober to collect events and raw traffic, send them back for deep analysis, and generate new security rules. These new security rules are enforced by the collaborative UTMs, and the feedback events of these rules are returned to the security center. Through this closed-loop control, the collaborative network security management system can identify and address new distributed attacks more quickly and effectively.
With the rapid development of the Internet, the network structure becomes larger and more complicated, and attacking methods become more sophisticated as well. To enhance network security, Network Security Situation Analysis (NSSA) technology is a research hot spot in the network security domain. At present, however, few NSSA frameworks and models analyze not only the results of attacks on network security but also the process by which network security is affected. In this paper, a novel NSSA framework is presented. The framework includes two parts: calculating the Network Security Situation Value (NSSV) and discovering intrusion processes. NSSA quantitatively assesses the impact of attacks on network security using the Analytic Hierarchy Process (AHP) and a hierarchical network structure. Based on attack classification, intrusion-process discovery reveals how network security is affected. The experimental results show that the NSSV changes exactly as attacks take place and that the intrusion processes are discovered accurately, verifying the applicability of the framework and algorithms.
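The AHP step mentioned above derives factor weights from a pairwise comparison matrix; a common approximation of the principal eigenvector is the average of the normalized columns. A sketch with invented comparison values on Saaty's 1-9 scale:

```python
def ahp_weights(matrix):
    """Approximate AHP priority vector via normalized column averages."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical pairwise comparisons of three impact criteria
# (e.g. confidentiality vs. integrity vs. availability).
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])
```

The resulting weights sum to 1 and would scale each layer's attack impact when the NSSV is rolled up through the hierarchical network structure.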
With the development of network countermeasure technology, network security early warning has become a key technology in constructing defense-in-depth architectures. Focusing on real network environments and on upgrading the comprehensive capacity of network security defense, a complete set of control mechanisms for network security early warning is first discussed; then, based on the network defense-in-depth model, the design ideas, goals, design principles, and implementation technology of a network security early-warning system are presented; finally, covering dynamic monitoring, intrusion detection, real-time early warning, and process status tracking, the system's functional design and the procedure design of the main function modules are given. This design model is valuable for guiding the development of network security early-warning systems.
At present, network security attacks are numerous, and traditional single defense and testing equipment cannot meet the requirements of network security under the new circumstances. Research on the network security situation has therefore become a hot topic in the field of network security. To enhance the accuracy and timeliness of network security situation forecasts, a fuzzy prediction method based on Markov models is proposed in this paper. The method uses a Markov state-transition matrix to capture the correlation of network security states and to predict the security status. By introducing vulnerability information to build the fuzzy membership degree of the security situation for each security status, and by integrating an improved Zadeh formula, the predicted value of the network security situation is obtained. Finally, the effectiveness of the method is shown by experimental results on the KDD CUP 99 and DARPA 2000 data sets.
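The core of the Markov step described above is one matrix-vector multiplication per forecast period: tomorrow's state distribution is today's distribution times the transition matrix. A sketch with an invented three-state transition matrix:

```python
# States of the network security situation (illustrative).
states = ["safe", "probed", "compromised"]

# Hypothetical state-transition matrix estimated from historical alerts.
P = [
    [0.8, 0.15, 0.05],   # from safe
    [0.3, 0.5,  0.2],    # from probed
    [0.1, 0.2,  0.7],    # from compromised
]

def step(dist, P):
    """One Markov step: next distribution = current distribution times P."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

dist = [1.0, 0.0, 0.0]   # currently known to be safe
for _ in range(3):       # forecast three periods ahead
    dist = step(dist, P)
print([round(p, 4) for p in dist])
```

The paper's fuzzy layer then maps this predicted distribution, weighted by vulnerability information, into a single situation value.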
Network security situational awareness (NSSA) has been a hot research topic in the network security domain. In this paper, a quantification method for NSSA based on conditional random fields (CRFs) is proposed. The network attack data from an intrusion detection system (IDS), the hosts' vulnerabilities, and the hosts' states are first combined as the network security factors. The network security threat degree is then defined to quantify the risk of the whole network and to classify the attacks. A diverse set of effective features is incorporated into the CRF model. Finally, experiments on the DARPA 2000 data set generate an explicit network security situational graph, showing that the method can represent network risk more accurately and offers a good quantification of the network security situation.
The nation's network infrastructures, such as the Global Information Grid (GIG) of the Department of Defense (DoD) and OneNet of the Department of Homeland Security, are transitioning to Internet Protocol version 6 (IPv6) per the DoD CIO Memorandum of June 2003 and the Office of Management and Budget memorandum OMB-05-22. IPv6-specific security vulnerabilities exist in these network infrastructures and need to be mitigated in order to achieve security parity with existing IPv4 operations. From the perspective of Homeland Security technologies, the existence of additional security vulnerabilities implies the possibility of a two-pronged threat. First, the IPv6-specific vulnerabilities reduce the security posture of the network infrastructure itself; second, other critical infrastructure sectors that depend on IPv6 need additional protection. For example, future supervisory control and data acquisition (SCADA) industrial capabilities will increasingly use the IPv6 infrastructure, as will voice communications, voice and video collaboration, and the sharing of data such as image data and surveillance and reconnaissance data. This paper presents three contiguous results. First, it briefly presents the new IPv6 capabilities; second, it presents a brief analysis of the security vulnerabilities arising from these capabilities; and third, it presents a new security model for IPv6 network infrastructures that has the potential to mitigate these vulnerabilities. The new model is based on the end-to-end connectivity that is restored in IPv6, allowing the use of host-based security (HBS) systems together with perimeter security devices. However, the use of HBS complicates security trust management. Therefore the third component of the model is introduced, namely a policy-based security management (PBSM) approach. The PBSM approach allows the secure deployment of host-based security systems. It provides the capabilities needed to specify trust zones via sets of security policy rules, where each set specifies a trust zone. Hosts belong to one or more trust zones. Accordingly, the host-based security policies are derived from the zone security policies of all the zones to which a host belongs. In addition, the PBSM approach has the potential to support more sophisticated security capabilities, such as risk-adaptive access control and dynamic security response to a changing operational picture. These capabilities are needed to enable net-centric security operations.
Network security requirements are generally considered only once the network topology has been implemented, in particular once firewalls are emplaced to filter network traffic between different Local Area Networks (LANs). This common approach may lead to critical situations. First, machines that should not communicate may belong to the same LAN, where their traffic does not pass through the firewall and is therefore not filtered. Often overwhelmed by the complexity of security requirements and the growth of networks, network administrators struggle to resolve such design faults while ensuring they do not introduce further vulnerabilities. Second, according to the network security policy, the required number of LANs, and therefore the number, range, and cost of the network and security equipment, can be much lower than originally proposed by the network administrator. In this paper, we present an automatic approach that proposes a network topology that is both safe and optimal, taking into account the network security policy given in a high-level language. The safety property ensures that every prohibited traffic flow must cross the firewall to be filtered. The optimality property allows us to deduce the necessary and sufficient resources (subnetworks, network switches, firewall range) to be used. To the best of our knowledge, this problem has not been explored in previous works, despite the importance of these challenges. Our method has been implemented using graph coloring theory, and the first results are very promising. Experiments conducted on large-scale networks demonstrate the efficiency and scalability of our approach.
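The graph-coloring formulation above can be sketched as follows: hosts are vertices, an edge joins each pair whose traffic is prohibited, and colors are LANs, so prohibited pairs land in different LANs and their traffic must cross the firewall. A greedy coloring (the hosts and policy pairs are invented; the paper's method is more sophisticated than this greedy pass):

```python
def color_hosts(separation_pairs, hosts):
    """Greedy coloring: an edge means the two hosts must not share a LAN."""
    adj = {h: set() for h in hosts}
    for a, b in separation_pairs:
        adj[a].add(b)
        adj[b].add(a)
    lan_of = {}
    for h in hosts:  # visit order can be tuned (e.g. by degree) for fewer LANs
        used = {lan_of[n] for n in adj[h] if n in lan_of}
        lan_of[h] = next(c for c in range(len(hosts)) if c not in used)
    return lan_of

# Hypothetical policy: traffic between these host pairs is prohibited.
hosts = ["web", "db", "hr", "dev"]
pairs = [("web", "db"), ("hr", "db"), ("hr", "dev")]
lans = color_hosts(pairs, hosts)
print(lans, "LANs needed:", len(set(lans.values())))
```

The number of colors used is exactly the number of LANs (and hence firewall interfaces) the policy requires, which is where the cost reduction comes from.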
This paper describes XML, currently the primary language of network applications, and introduces the features and development of the XML language to illustrate the application and significance of XML technology in network security. Network security is a systems-engineering problem: it requires careful consideration of the system's security requirements and the combination of a variety of security technologies, such as cryptography, to produce a highly efficient, universal, and secure network system. The paper then analyzes the network security architecture and the technical methods currently used to protect network security systems: antivirus protection, firewall configuration, intrusion detection systems, safety monitoring systems for Web, e-mail, and BBS, vulnerability scanning systems, IP-theft solutions, and network monitoring to maintain subnet security. Finally, regarding XML technology in security-enabled areas, XML has become a valuable mechanism for the safe exchange of data, and the related developments are XML Encryption and XML Signature.
Due to the extensive use of Internet services and emerging security threats, most enterprise networks deploy a variety of security devices for controlling resource access based on organizational security requirements. These requirements are becoming more fine-grained, where access control depends on heterogeneous isolation patterns such as access denial, trusted communication, and payload inspection. Organizations want usable and optimal security configurations that harden network security within enterprise budget constraints. This requires analyzing various alternative security architectures to find a security design that satisfies the organizational security requirements as well as the business constraints. In this paper, we present ConfigSynth, an automated framework for synthesizing network security configurations by exploring various security design alternatives to provide an optimal solution. The main design alternatives are different kinds of isolation patterns for traffic flows in different segments of the network. ConfigSynth takes security requirements and business constraints along with the network topology as inputs, and synthesizes optimal, cost-effective security configurations satisfying the constraints. It also provides optimal placements of the security devices in the network according to the given topology. ConfigSynth uses Satisfiability Modulo Theories (SMT) to model this synthesis problem. We demonstrate the scalability of the tool using simulated experiments.
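ConfigSynth encodes this synthesis as an SMT problem; the same idea can be illustrated at toy scale with an exhaustive search over isolation patterns per flow. The flows, costs, requirements, and budget below are invented, and a real instance would hand the constraints to an SMT solver rather than enumerate:

```python
from itertools import product

# Isolation patterns and hypothetical per-flow deployment costs.
patterns = {"deny": 1, "trusted": 3, "inspect": 5}
flows = ["internet->web", "web->db", "admin->db"]

# Security requirements: the set of patterns each flow may use.
allowed = {
    "internet->web": {"inspect"},                      # must be payload-inspected
    "web->db":       {"trusted", "inspect"},           # at least trusted
    "admin->db":     {"deny", "trusted", "inspect"},   # anything acceptable
}
budget = 10  # business constraint

best = None
for assignment in product(patterns, repeat=len(flows)):
    config = dict(zip(flows, assignment))
    if any(config[f] not in allowed[f] for f in flows):
        continue  # violates a security requirement
    cost = sum(patterns[config[f]] for f in flows)
    if cost <= budget and (best is None or cost < best[0]):
        best = (cost, config)
print(best)
```

The search returns the cheapest assignment that satisfies every requirement within budget; SMT lets the real tool do the equivalent over far larger networks without brute force.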
In the wake of recent events, network security and reliability have become top issues for service providers and enterprises. The worldwide cost of cyber attacks is estimated to have been in the range of $145 billion for 2003. 2003 was also regarded as the "worst year ever" for computer viruses and worms: in 2001 the Code Red worm took several days to create widespread damage, whereas Slammer in 2003 had significant impact in just minutes. Over 90% of network attacks resulting in significant financial loss originate from inside a network's perimeter. Unfortunately, there appears to be no end in sight to these threats; in fact, there is an increasing trend of attacking financial resources in addition to computing resources. The newly ratified ITU-T Recommendation X.805, "Security architecture for systems providing end-to-end communications", was developed as the framework for the architecture and dimensions of end-to-end security for distributed applications. It provides a comprehensive, multilayered, end-to-end network security framework across eight security dimensions in order to combat network security threats. We introduce the X.805 standard and describe how it can be applied to all phases of a network security program. We also provide examples of the business impact of network security vulnerabilities and of the application of X.805 to network security assessments. Enterprises and service providers alike should use X.805 to bring a rigorous approach to network security throughout the entire lifecycle of their security programs.
In recent years, the construction of the agricultural information network has made great progress in China. As the level of network openness improves, the probability of the network being attacked increases, which raises the demands on network stability and security. By analyzing the status quo of agricultural information network security and the defensive-strategy architecture for network security, this paper proposes a construction solution for a comprehensive agricultural information network security management platform. Based on the different functions and regions of the agricultural information network system, this solution optimizes the design and deployment through security management and security technology, achieving the target of systematic and intensive management of the comprehensive defensive architecture for agricultural information network security.
This paper studies the theory behind correlation analysis of network security events and proposes a correlation analysis method based on the similarity of event attributes. A detailed description and analysis of the method is given: the method classifies and merges network security events according to the attribute similarity between them, where the similarity of two security events is determined by the similarity of their characteristic attributes. It can not only remove redundant security events but also compress the number of security events, thereby effectively improving the network administrator's efficiency in analyzing security incidents. The experimental results show that the method is suitable for analyzing and aggregating massive security event information and can effectively reduce the number of security incidents, so it has practical value.
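The attribute-similarity aggregation described above can be sketched as a weighted match over characteristic attributes followed by threshold-based merging. The attribute weights, threshold, and sample alerts below are invented for illustration:

```python
def similarity(e1, e2, weights):
    """Weighted attribute similarity between two security events (0..1)."""
    score = sum(w for attr, w in weights.items() if e1[attr] == e2[attr])
    return score / sum(weights.values())

def aggregate(events, weights, threshold=0.75):
    """Merge each event into the first existing group it resembles enough."""
    groups = []
    for e in events:
        for g in groups:
            if similarity(e, g[0], weights) >= threshold:
                g.append(e)
                break
        else:
            groups.append([e])
    return groups

# Hypothetical IDS alerts with a few characteristic attributes.
weights = {"src": 0.3, "dst": 0.3, "type": 0.4}
events = [
    {"src": "10.0.0.5", "dst": "10.0.0.9", "type": "portscan"},
    {"src": "10.0.0.5", "dst": "10.0.0.9", "type": "portscan"},
    {"src": "10.0.0.7", "dst": "10.0.0.9", "type": "sqli"},
]
groups = aggregate(events, weights)
print(len(events), "events ->", len(groups), "groups")
```

The two redundant port-scan alerts collapse into one group, which is the compression effect the method relies on at scale.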
This paper presents an ontological approach to perceiving the current security status of a network. A computer network is a dynamic entity whose state changes with the introduction of new services, the installation of new network operating systems, the addition of new hardware components, the creation of new user roles, and attacks by various actors instigated by aggressors. The various security mechanisms employed in a network do not give a complete picture of the security of the whole network. In this paper we propose a taxonomy and an ontology that can be used to infer the impact of various events in the network on its security status. Vulnerability, Network, and Attack are the main taxonomy classes in the ontology. The Vulnerability class describes various types of vulnerabilities in the network, which may lie in hardware components such as storage devices, computing devices, or network devices. The Attack class has many subclasses: the Actor class is the entity executing the attack, the Goal class describes the goal of the attack, the Attack Mechanism class defines the attack methodology, the Scope class describes the size and utility of the target, and the Automation Level class describes the automation level of the attack. The Network class has network operating systems, users, roles, hardware components, and services as its subclasses. Evaluation of the security status of the network is required for network security situational awareness, and based on this taxonomy an ontology has been developed to perceive the network security status. Finally, a framework that uses this ontology as a knowledge base is proposed.
SSAP is developed for national backbone networks, large network operators, large enterprises, and other large-scale networks. The system collects, interprets, and displays the security factors that cause changes in the network situation, and predicts the future development trend of these security factors. This paper describes its architecture and key technologies: security data integration for distributed heterogeneous networks, association analysis oriented toward major network security events, real-time analysis based on data streams, multi-dimensional analysis of network security data, network security situation prediction, and so on. Performance tests show that SSAP achieves high real-time performance and accuracy in security situation analysis and trend prediction, and that the system meets the demands of analysis and prediction for large-scale network security situations.
Security evaluation of an information network system is an important management tool for ensuring its normal operation: we must understand the comprehensive network security risks and take effective security measures. A network evaluation model and a corresponding fuzzy algorithm are presented, adopting a hierarchical method to characterize the security risk situation. The model combines the importance of the security measures, the environment, and the key nodes. An evaluation method based on RST is used to evaluate the key nodes, and fuzzy mathematics is used to analyze the security situation of the whole network. Compared with other approaches, this method can automatically create a rule-based security evaluation model to evaluate the security threat from individual security elements and from combinations of security elements, and then evaluate the network situation. Experimental results show that this system provides a valuable model and algorithms that help to find security rules, adjust security measures, improve security performance, and design appropriate security risk evaluation and management tools.
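The fuzzy-mathematics step described above is typically a fuzzy comprehensive evaluation: a weight vector over security factors multiplied by a membership matrix over risk grades, with the final grade taken by maximum membership. A sketch with invented factors, weights, and memberships:

```python
def fuzzy_evaluate(weights, membership):
    """B = W . R : weighted membership of the system in each risk grade."""
    grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(grades)]

# Hypothetical factors: vulnerability level, attack frequency, asset value.
weights = [0.5, 0.3, 0.2]
# Membership of each factor in the risk grades (low, medium, high).
membership = [
    [0.1, 0.3, 0.6],   # vulnerability level
    [0.2, 0.5, 0.3],   # attack frequency
    [0.6, 0.3, 0.1],   # asset value
]
b = fuzzy_evaluate(weights, membership)
grade = ["low", "medium", "high"][b.index(max(b))]
print([round(x, 2) for x in b], "->", grade)
```

In the paper's hierarchical scheme the same multiplication is applied level by level, with the RST evaluation of the key nodes feeding the membership values.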
The term "security network intelligence" is widely used in the field of communication network security. A number of new and potentially useful concepts and products based on it have been introduced, including smart flows, intelligent routing, and intelligent Web switching. Many intelligent systems focus on a specific security service, function, or device, and do not provide true end-to-end service network intelligence. True security network intelligence requires more than a set of disconnected elements; it requires an interconnecting and functionally coupled architecture that enables the various functional levels to interact and communicate with each other. We propose a unified framework for understanding end-to-end communication security network intelligence (CSNI), which is defined as the ability of a network to act appropriately in a changing environment. We consider an appropriate action to be one that increases the optimal and efficient use of network resources in delivering services, and we define success as the achievement of behavioral sub-goals that support the service provider's ultimate goals, which are defined externally to the network system. The proposed framework incorporates the functional elements of intelligence into computational modules and interconnects the modules into networks and hierarchies that have spatial, logical, and temporal properties. Based on this framework, we describe an end-to-end multiservice network application spanning the network security management layer, the optical layer, the switching/routing layer, the security services layer, and other layers.
The proposal of network security situational awareness (NSSA) research represents a breakthrough and an innovation relative to traditional network security technologies, and it has become a hot research topic in the network security field. Combining an evolutionary strategy with a neural network, a quantitative method for network security situational awareness is proposed in this paper. The evolutionary strategy is used to optimize the parameters of the neural network; the resulting evolutionary neural network model is then used to extract the network security situational factors, achieving a quantification of the network security situation. Finally, a simulated experiment validates that the evolutionary neural network model can extract situational factors and has good generalization ability, which greatly supports network security technologies.
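The optimization idea above, using an evolution strategy to tune model parameters, can be sketched with a (1+1)-ES. Here a simple sphere function stands in for the neural network's training error as a function of its weights; the objective, step-size rule, and constants are all illustrative:

```python
import random

def es_minimize(fitness, dim, steps=500, sigma=0.5, seed=1):
    """(1+1) evolution strategy with simple multiplicative step-size adaptation."""
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(dim)]
    best = fitness(parent)
    for _ in range(steps):
        child = [x + rng.gauss(0, sigma) for x in parent]
        f = fitness(child)
        if f <= best:            # keep the mutant only if it improves
            parent, best = child, f
            sigma *= 1.1         # success: search more boldly
        else:
            sigma *= 0.95        # failure: tighten the search
    return parent, best

# Stand-in objective; the paper would use the network's training error here.
sphere = lambda v: sum(x * x for x in v)
_, best = es_minimize(sphere, dim=4)
print(round(best, 6))
```

Unlike gradient descent, the ES needs only fitness evaluations, which is why it can tune network parameters without backpropagation.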
Stochastic game theory is applied to research on network security situational awareness (NSSA), a current research focus in the network security field. A novel dynamic awareness method for the network security situation (NSS), based on analyses of network service states, is proposed in this paper. Realizing situation awareness is a dynamic process, and the diverse states of network services are direct mirrors of the whole network security situation, which reflects what is happening in the network, including both offensive and defensive behaviors. A stochastic game model of the network security system is constructed, and the network security situation is quantified by the game's mathematical formulation: the costs and rewards of attackers and defenders are established, and non-linear programming is used to compute the Nash equilibrium points, at which both sides reach a balance between their benefits. The network security situation can then be dynamically obtained by visualizing the diverse metric information of network services at the Nash equilibrium during the operation of the network system.
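For the smallest case, a two-action attacker against a two-action defender with zero-sum payoffs, the Nash equilibrium has a closed form, so no non-linear programming is needed. A sketch with an invented payoff matrix (rows: defender actions, columns: attacker actions; the formulas assume no saddle point):

```python
from fractions import Fraction

def solve_2x2_zero_sum(A):
    """Mixed Nash equilibrium of a 2x2 zero-sum game with no saddle point."""
    (a, b), (c, d) = A
    denom = a - b - c + d
    p = Fraction(d - c, denom)          # row player's probability on row 0
    q = Fraction(d - b, denom)          # column player's probability on col 0
    value = Fraction(a * d - b * c, denom)
    return p, q, value

# Hypothetical defender/attacker payoffs:
# rows = defender (patch, monitor), columns = attacker (exploit, scan).
A = [[3, -1],
     [-2, 4]]
p, q, v = solve_2x2_zero_sum(A)
print(p, q, v)
```

At the equilibrium, each side's mixed strategy equalizes its payoff against both of the opponent's actions, which is the balance point the abstract computes by non-linear programming for larger games.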
In the information age, network education pays more attention to the application of IT and the training of talent, which makes learning more customizable and open. To enable learners to go beyond the limitations of space and time in acquiring knowledge, and to provide an excellent learning environment with greater freedom and a wider choice of learning activities, building the campus network has become the foundation of all university construction work; it is directly related to the quality and level of teaching and scientific research. The campus network serves a number of tasks such as teaching, research, management, and communication with the outside world. Therefore, network security has become a priority for campus network management. The current Internet is convenient but, at the same time, unsafe. As part of the Internet, and given its unique attributes, the campus network is more easily attacked while enjoying the services the Internet provides. This paper starts from the current security status of the campus network, analyzes threats to campus network security and strategies for maintaining it, so as to establish a suitable campus network security system, and introduces some currently popular campus network information security solutions.
Data security has become one of the core problems of cloud computing. Many security solutions have been proposed; however, most of them focus on only one stage of the data life cycle, such as storage, which is not enough because threats exist throughout the whole data life cycle. In this paper, we argue that the cloud data security problem should be solved from the data life-cycle perspective. After a detailed analysis of the data life-cycle model and data security threats, a suggested design process for data security solutions is given. The proposed idea is simple but important for creating a complete security solution for cloud data.
Public cloud data security concerns arise because data is stored in the cloud: while cloud storage offers low cost and high efficiency, it also brings the possibility of data abuse. Therefore, this paper researches a threat-response model based on an analysis of the threat factors facing public clouds, and puts forward user management, data security, data-center hardware and software security, and control transfer to strengthen the data security of the public cloud, seeking a balance between risk and business mission.
Cloud computing is considered to be the next-generation information technology framework: a computing platform that can provide dynamic resource pools, virtualization, and high availability. These new characteristics bring many new security challenges that have not been fully taken into account in current cloud computing systems. Consequently, building a cloud computing data security system is the basis for building a cloud computing security system. In this article, the cloud computing technology architecture and the features of cloud computing data security are studied first, then a cloud computing data security model is proposed, and finally the realization of the model is researched. The model adopts a multi-dimensional architecture with three layers of defense. First, user authentication is required to ensure that user data cannot be tampered with; users who pass authentication can perform the corresponding operations on their data, such as addition, modification, and deletion. If an unauthorized user deceives the authentication system by illegal means, the file enters the encryption and privacy defense layer, in which user data is encrypted: even if an intruder obtains the key, no valid information can be extracted from the user data thanks to the privacy-protection function. This is very important for commercial users of cloud computing who must protect their business secrets. The last layer is the fast file-regeneration layer, in which user data can be regenerated to the maximum extent through a rapid regeneration algorithm, even if it has been damaged. Each layer accomplishes its own job and combines with the others to ensure data security in cloud computing.
Classifying data according to its permissible use, appropriate handling, and business value is critical for data privacy and security protection. This is essential for compliance with the constantly evolving regulatory landscape concerning protected data. Problems arise when users compromise data privacy and security by overlooking the critical need to manage data according to these requirements. This paper considers the creation and application of data classification systems for security and privacy purposes. It focuses primarily on classifying information in a meaningful way through the use of a partially automated methodology that normalizes and classifies structured data throughout an enterprise. We introduce the three pillars of the data-centric security model, which are based on the data-centric security classification offering by IBM Global Business Services (GBS) and the IBM Research Division. In particular, we describe the data classification pillar of the data-centric security architecture, which provides the framework and method for partially automated classification of data to meet the demands of compliance standards.
Users are increasingly concerned about data security because cloud computing stores their data off-site. We therefore put forward management ideas for user data classification and design a cloud-based data security policy driven by user demand for data protection. It ensures that internal data is not spread to the public cloud, through strong authentication, data evaluation and classification, sensitive-information filtering, a cloud computing security gateway strategy, and management measures such as security administration, rules and regulations, and safety education.
Usage of payment cards such as credit cards, debit cards, and prepaid cards continues to grow. Security breaches related to payment cards have led to billion-dollar losses annually. To offset this trend, the major payment card networks founded the Payment Card Industry (PCI) Security Standards Council (SSC), which has designed and released the PCI Data Security Standard (DSS). This standard guides service providers and merchants in implementing stronger security infrastructures that reduce the risk of security breaches. This article mainly discusses the need for the PCI DSS and the data security requirements defined in the standard to address ongoing security issues, especially those pertaining to payment card data handling. It also surveys various technical solutions, offered by a few security vendors, that help merchant companies and organizations involved in payment card transaction processing comply with the standard. The compliance of merchants or service providers with the PCI DSS is assessed by PCI Qualified Security Assessors (QSAs), so this article also discusses the requirements for becoming a PCI QSA. In addition, it introduces the PCI security scanning procedures that guide the scanning of a merchant's or service provider's security policies and the preparation of the relevant reports. We believe this survey sheds light on potential technical research problems pertinent to the PCI DSS and its compliance.
The use of space-based and terrestrial assets for maritime surveillance has developed rapidly in recent decades. These developments have implications for civil, commercial, and military stakeholders, and the assets are used by both advanced and developing countries. Many sensors and systems apply to the maritime domain, such as Vessel Monitoring Systems (VMS), the Automatic Identification System (AIS), airborne and spaceborne optical and radar imagery, ship and coastal radar, underwater acoustics, and underwater electromagnetics. We are designing a ship detection and identification system (SDIS) that integrates the results of all of these methods to determine the position and identity of ships, for the purpose of monitoring marine traffic in a particular area. The data required by SDIS is readily available from a variety of third-party data providers, and by employing redundant data from different detection methods we aim to improve detection and identification accuracy. Data providers transmit their data to clients in a secure manner to protect it from unauthorized viewing or alteration, so SDIS must function with the varying security methods of the different providers. Although data security risks are highest during transmission, we have chosen a data-at-rest approach and encrypt the data streams within SDIS at all stages, including transmission, storage, and in-use. SDIS also has the challenge of ensuring users' adherence to the contractual data-use restrictions stipulated by the third-party data providers. This paper describes the data security aspects of SDIS as determined in the initial design phases.
In order to improve personal data security in electronic commerce and to avoid exposure of personal privacy, personal data was taken as the encryption principle, applying homomorphism and random perturbation to strengthen personal data security. This paper first analyzes the usage of personal data in electronic commerce, then discusses P3P as a means of realizing personal data security. That mechanism, however, cannot guarantee that Web sites actually act according to their policies once they have obtained a user's personal data. In light of this, a new algorithm was used to achieve improved security. The proposed algorithm uses homomorphism as its principle for preserving privacy: important data appears only as secret content in the Web service, preventing personal data from being misused. The simulation results show that the modified method protects personal privacy effectively, still supports data mining to provide characteristic services for the customer, and has a shorter response time.
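The additive homomorphic property such schemes rely on can be demonstrated with a toy Paillier-style cryptosystem: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a service can aggregate values it cannot read. The key sizes below are deliberately tiny and insecure, purely for illustration, and the paper's exact algorithm is not reproduced here:

```python
import math
import random

def paillier_keygen(p=293, q=433):
    # Toy primes for illustration only -- real Paillier keys are 2048+ bits.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because we fix g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    while True:
        r = random.randrange(1, n)       # random blinding factor
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    l = (pow(c, lam, n2) - 1) // n       # the L function: L(x) = (x - 1) / n
    return (l * mu) % n

pub, priv = paillier_keygen()
# Homomorphic addition: the product of ciphertexts decrypts to 12 + 30.
c = (encrypt(pub, 12) * encrypt(pub, 30)) % (pub[0] ** 2)
print(decrypt(priv, c))  # 42
```

The random factor `r` also makes the scheme probabilistic, so repeated encryptions of the same value look different, which complements the random-perturbation idea in the abstract.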
This article discusses cloud computing data security issues, including the security of data transmission, storage, and management. It focuses on how universal data management affects cloud security, points to this as a breakthrough area in the development of cloud computing, and attempts to enumerate the corresponding strategies and long-term development directions. The final part is a summary and outlook on the future development of cloud computing security issues.
Cloud computing is becoming the next-generation architecture of the IT enterprise. In contrast to traditional solutions, cloud computing moves application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique feature raises many new security challenges that have not been well understood. In cloud computing, neither data nor software is fully contained on the user's computer; data security concerns arise because both user data and programs reside on the provider's premises. Clouds typically have a single security architecture but many customers with different demands, and every cloud provider addresses this by encrypting the data with encryption algorithms. This paper investigates the basic problem of cloud computing data security. We present a data security model for cloud computing based on a study of the cloud architecture, improve the model, and implement software to support it. Finally, we apply this software on an Amazon EC2 Micro instance.
Cloud computing is one of the most exciting paradigm shifts in distributed computing, and it is widely used because of its readily available, low-cost services. As the number of cloud users increases, however, so do data security and privacy threats. Data confidentiality and efficient data retrieval are major issues blocking users from adopting cloud computing, and they are the focus of this paper. Data classification and a new cloud model are proposed to overcome them: the Hybrid Multi-Cloud Data Security (HMCDS) model with data classification, based on multiple clouds, different clusters, and data classification. Two levels of security are considered, i.e., the model level and the data-classification level.
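A minimal sketch of classification-driven routing in the spirit of the HMCDS model might look as follows; the field names and the simple two-level rule are assumptions made for illustration, not the paper's actual criteria:

```python
# Hypothetical list of sensitive field names -- an assumption for this sketch.
SENSITIVE_FIELDS = {"password", "ssn", "card_number"}

def classify(record):
    """Return 'confidential' if any sensitive field is present, else 'public'."""
    return "confidential" if SENSITIVE_FIELDS & record.keys() else "public"

def route(record):
    # Confidential data is kept on the private cloud; everything else can
    # go to the cheaper public cloud, preserving efficient retrieval.
    level = classify(record)
    target = "private-cloud" if level == "confidential" else "public-cloud"
    return level, target

print(route({"name": "alice", "ssn": "123-45-6789"}))
print(route({"name": "bob", "city": "Chennai"}))
```

The point of the split is that only the confidential subset pays the cost of stronger protection, while public data remains cheap to store and fast to retrieve.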
In order to improve personal data security in electronic commerce and to avoid exposure of personal privacy, personal data was taken as the encryption principle, applying homomorphism and random perturbation to strengthen personal data security. This paper first analyzes the usage of personal data in electronic commerce, then reviews the methods currently applied in databases. A new query algorithm was used to obtain improved encrypted data: important data appears only as secret content in the Web service, preventing personal data from being misused. The simulation results show that the modified method protects personal privacy effectively and has a shorter response time.
Securing confidential data in emergency situations is essential for mission-critical applications; the damage that can result if top-secret information falls into unauthorized or enemy hands can be devastating. Although sanitizing mechanical disks and magnetic tapes can irretrievably delete confidential data, the process is long and arduous, requiring special degaussers, stable power conditions throughout its duration, and adequate time, all of which are often lacking in emergency conditions. The incident in which a US Navy surveillance plane (EP-3E ARIES II) was forced to land in China on April 1, 2001 serves as a chilling reminder of the current need for media that can guarantee data security in emergencies. Solid-state flash disks are being used as drop-in replacements for mechanical disks in military and aerospace mission-critical applications; they are rugged and operate under the harshest environmental extremes. Flash disk manufacturers have incorporated several features to enhance data security in emergencies, including fast secure-erase and sanitize functions that wipe the disk in seconds even under unstable power conditions, and a security-erase function that operates with no power at all. This work reviews the levels established by security agencies (the US DoD, NSA, US Army, Navy, and Air Force) for "erasing" sensitive data on various types of storage media, including tapes, magnetic mechanical disks, optical disks, and semiconductor memories. It discusses current options for achieving the required data security levels in emergencies and the pros and cons of the most popular methods, and presents a selected case study from the military and aerospace domain.
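The overwrite-style sanitization discussed above can be sketched in miniature. The three-pass pattern below mirrors the classic DoD 5220.22-M clearing procedure (zeros, ones, random) on an in-memory buffer; it is only an illustration, since real flash media remap worn blocks and therefore need the controller-level secure-erase commands the abstract describes:

```python
import os

def sanitize(buf):
    """Three-pass overwrite in the spirit of DoD 5220.22-M:
    all zeros, then all ones, then random bytes."""
    n = len(buf)
    for pattern in (b"\x00" * n, b"\xff" * n, os.urandom(n)):
        buf[:] = pattern  # overwrite in place, same length each pass
    return buf

secret = bytearray(b"TOP-SECRET FLIGHT PLAN")
sanitize(secret)
print(b"TOP-SECRET" in secret)  # False: the plaintext is gone
```

On a journaling filesystem or an SSD, overwriting a file's visible bytes does not guarantee the old blocks are gone, which is precisely why the surveyed standards distinguish "clearing" from "sanitizing".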
Web applications are increasingly used to provide e-services such as online banking, online shopping, and social networking over the Internet. With this advancement, attacks on web applications have also increased; according to the Cenzic 2013 report, 99% of the web applications tested in 2012 were vulnerable. The root causes of these vulnerabilities are lack of security awareness, design flaws, and implementation bugs. Writing secure code for a web application is a complex task, as developers place more emphasis on implementing the business logic than on implementing it securely. Such vulnerabilities can be exploited by malicious users, harming an organization's database and reputation. In this paper we propose SWART, an Application Intrusion Detection System tool that can detect and prevent web application attacks at the time of occurrence. We implemented the proposed approach with an ASP.NET web application and performed a Chi-Square test to validate our assumptions. Once completed, SWART has the potential to detect and prevent the maximum number of attacks with less complexity.
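A Chi-Square test like the one used for validation above can be computed directly for a 2x2 contingency table relating alerts to actual attacks. The detection counts below are hypothetical, not the paper's data:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]], e.g. rows = attack present / absent,
    columns = alert raised / not raised.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    # Closed form for the 2x2 case: n*(ad - bc)^2 / (row and column totals).
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 45 true alerts, 5 missed attacks,
# 10 false alerts, 40 correctly quiet cases.
stat = chi_square_2x2([[45, 5], [10, 40]])
CRITICAL = 3.841  # chi-square critical value at df = 1, alpha = 0.05
print(stat > CRITICAL)  # True: alerts and attacks are not independent
```

A statistic above the critical value rejects the hypothesis that alerts are independent of attacks, i.e. the detector's output actually carries information about real intrusions.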
The quality of a Web application plays a major role in its success, and high quality is really only achievable by following a high-quality Web engineering process. A strong Web application architecture on a strong development platform not only makes Web applications robust and of high quality, but also gives them the ability to meet changing and demanding customer requirements efficiently. The Model View Controller (MVC) design pattern has remained the fundamental architectural design pattern even after great evolution in the architectures of user-interactive applications. In this paper, we discuss the support the Java EE platform provides for achieving Web application quality attributes, for the Web application development process, and for meeting the demanding features of Web applications. This contribution will help both small-scale and large-scale Web application development, and will help mould a Web application into a high-quality finished product from the inception phase itself.
There is large demand for re-engineering human-oriented Web application systems for use as machine-oriented Web application systems, called Web services. This paper describes a framework named H2W, which can be used for constructing Web service wrappers from existing, multi-page Web applications. H2W's contribution is mainly service extraction, rather than the widely studied problem of data extraction. For the framework, we propose a page-transition-based decomposition model and a page-access abstraction model with context propagation. With the proposed decomposition and abstraction, developers can flexibly compose a Web service wrapper of their intent by describing a simple workflow program, incorporating the advantages of previous work on Web data extraction. We show three successful wrapper application examples built with H2W for real-world Web applications.
Parametric Web application cost estimation refers to the use of a mathematical model to derive the estimated effort and duration of Web application development. Typically, the majority of Web application developers apply expert judgment and estimation by analogy. This paper focuses on a feasibility study of WEBMO (Web model), a parametric Web application cost estimation model, for Web application development within the Klang Valley in Malaysia. WEBCOMO, a parametric Web application estimation tool, was developed based on WEBMO's methodology to fulfill the objective of the study.
The emergence of the Internet era has led to the widespread use of Web-based applications. A diverse demographic of users requires Web applications to have features that enable users at different levels to learn and understand their functionality easily and instantly. A Web application that is learnable and understandable thus allows users to interact better and gain more advantage from using it. Users' interactions with a system often reflect their knowledge and understanding of that system; by studying these interactions, Web developers can provide guidance that promotes the learnability and understandability of a Web application. The aim of this paper is to improve usability, in the aspects of learnability and understandability, through the user interface design of a Web application. The paper proposes an action-based technique that improves learnability and understandability by studying users' interactions while they interact with the user interface. Web developers can apply the proposed technique to design more usable Web applications in the future.
Nowadays, the evaluation of software is an important topic in the software engineering world. Engineers evaluate a software product using the conventional metrics defined in software engineering, and the definition of a metric depends on the application of the software. A Web application is software, so it can be evaluated with conventional metrics; however, competition in Web application development is increasingly about producing intelligent products. We therefore define a new metric, the intelligence of a Web application, in addition to the available ones, and evaluate Web applications against it. How intelligent is a Web application? This is an important question, and we want to answer it. The most important contribution of this report is defining the intelligence of a Web application in the form of intelligence levels and intelligence parameters.
This paper describes various Web application frameworks and related emerging technologies pertinent to the Java EE model from a technical perspective. A definition of "Web application framework" is specified, as this terminology has been widely used and implies drastically different meanings in different contexts. The value proposition of a Web application framework is presented to illustrate how a framework can improve application development productivity and quality. The design philosophy of Web application frameworks is articulated. A comprehensive taxonomic scheme is defined to classify various software frameworks and Web application frameworks into appropriate categories. Among the dozens of Web application frameworks available as commercial and open-source solutions, the predominant products are investigated, followed by selection guidelines and recommendations. A reference card is constructed to summarize the key aspects of Web application frameworks. Relevant technologies and future trends are also discussed.
The open environment of the Web is autonomous, heterogeneous, and dynamic, which causes difficulties for sharing, interoperation, and cooperation between Web applications. To solve this problem, a semantic-based architecture for Web application cooperation, SI4WAC, is presented. First, in order to provide a sharable vocabulary, extension-based and intension-based methods of ontology combination are designed to interpret the interfaces of Web applications. Then, the paper analyzes the state-transition process of a Web application and depicts the dynamic transitions with a state-transition graph. Compared with existing methods of integration and interoperation for Web applications, SI4WAC considers static and dynamic semantics together and gives two algorithms for bridging distributed ontologies. Meanwhile, with the formal semantics, the ontology and process can be used for reasoning about and model validation of Web applications.
As applications based on the World Wide Web are increasing rapidly in terms of both scale and complexity, it has been well recognized that the World Wide Web has evolved from a hypermedia information medium into a new distributed application platform, and it is therefore proper and prospective to view Web applications as software. Developing and maintaining large and complex Web applications demands a systematic process and an effective engineering methodology. A suitable model of Web applications that can capture their features is crucial and fundamental to the establishment of such approaches. In this paper, based on our understanding of Web applications as software, we propose a component-oriented Web application (CoOWA) model in which a Web application is regarded as a collection of components, each having its own functionality and cooperating with others through certain interfaces. It is aimed at establishing a foundation for an engineering methodology for more effective Web application development and maintenance that can benefit from the notion of a component-oriented approach.
A Web application is an application that is invoked with a Web browser over the Internet. Ever since 1994, when the Internet became available to the public, and especially since 1995, when the World Wide Web put a usable face on the Internet, the Internet has become a platform of choice for a large number of ever-more sophisticated and innovative Web applications. In just one decade, the Web has evolved from being a repository of pages used primarily for accessing static, mostly scientific, information to a powerful platform for application development and deployment. New Web technologies, languages, and methodologies make it possible to create dynamic applications that represent a new model of cooperation and collaboration among large numbers of users. Web application development has been quick to adopt software engineering techniques of component orientation and standard components. For example, search, syndication, and tagging have become standard components of a new generation of collaborative applications and processes. Future developments in Web applications will be driven by advances in browser technology, Web Internet infrastructure, protocol standards, software engineering methods, and application trends.
Both server performance and web services are crucial to making a web application run smoothly: when one of the web services fails to release resources from memory, the web application stops responding. This paper evaluates web server performance on parallel architectures. Empirical research shows that, by using parallel technologies, server performance becomes more reliable under high loads of web-based processing. The paper also proposes a new network-architecture deployment for the web application server, portal server, and databases.
The World Wide Web has become a sophisticated platform capable of delivering a broad range of applications. However, its rapid growth has resulted in numerous security problems that current technologies cannot address. Researchers from both academia and the private sector are devoting considerable resources to the development of Web application security scanners (i.e., automated software testing platforms for Web application security auditing), with some success. However, little is known about their potential side effects: an auditing process may induce permanent changes in an application's state. Because of this, we had so far avoided large-scale empirical evaluations of our Web Application Vulnerability and Error Scanner (WAVES). Here we introduce a testing methodology that allows for harmless auditing, define three testing modes (heavy, relaxed, and safe), and report our results from two experiments. In the first, we compared the coverage and side effects of the three scanning modes using 5 real-world Web applications chosen from the 38 found vulnerable in a previous static verification effort. In the second, we used the relaxed mode to conduct a 48-hour test involving 1120 random Web sites, of which 55 were found to be vulnerable.
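The "safe mode" idea above, auditing without mutating application state, can be illustrated with a minimal crawler step that only parses pages for candidate injection points (forms and their input fields) and never submits anything. The sample page is invented, and real scanners like WAVES do far more:

```python
from html.parser import HTMLParser

class FormFinder(HTMLParser):
    """Safe-mode audit step: enumerate forms and their inputs without
    ever submitting them, so the application's state is never changed."""

    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            # Each form action is a candidate injection point.
            self.forms.append({"action": attrs.get("action", ""), "inputs": []})
        elif tag == "input" and self.forms:
            self.forms[-1]["inputs"].append(attrs.get("name", ""))

page = """
<html><body>
  <form action="/login" method="post">
    <input name="user"><input name="pass" type="password">
  </form>
  <a href="/products?id=1">catalog</a>
</body></html>
"""

finder = FormFinder()
finder.feed(page)
print(finder.forms)  # [{'action': '/login', 'inputs': ['user', 'pass']}]
```

The heavier modes would then actually submit crafted values to these inputs, which is exactly the step that risks the permanent state changes the paper measures.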