A "public" cloud infrastructure is available to the general public and is owned by a third-party cloud service provider (CSP). In a public cloud, an agency dynamically provisions computing resources over the Internet from a CSP that shares its resources with other organizations. Much like an electric utility's billing system, the CSP bills the agency for its share of the resources. This can be the most cost-effective deployment model for agencies because it gives them the flexibility to buy only the computing resources they need while delivering all services with consistent availability, resiliency, security, and scalability. To benefit from a public cloud, however, an agency must accept reduced control and visibility over the CSP's governance and security.
A "private" cloud infrastructure is operated solely for one organization or agency: the CSP dedicates specific cloud services to that agency and no other clients. The agency specifies, architects, and controls a pool of computing resources that the CSP delivers as a uniform set of services. A common reason for agencies to procure private clouds is the ability to enforce their own data security standards and controls. An agency will typically host a private cloud on-premises, connect to it through private network links, and share its resources only within the agency. Because resources are not pooled across multiple independent organizations, the agency pays for all of the cloud's capacity. Still, the agency's Chief Information Officer (CIO) can offer these resources as on-demand services to organizations and programs within the agency and charge them accordingly.

A "community" cloud infrastructure is procured jointly by several agencies or programs that share specific requirements such as security, compliance, or jurisdiction concerns. The agencies or the CSP may manage the community cloud, which may be kept on-premises or off-premises. When agencies have a common set of needs and customers, a community cloud lets them combine assets and share computing resources, data, and capabilities. By eliminating the duplication of similar systems, agencies can save money and allocate their scarce resources more efficiently. Procuring a community cloud is also one way an agency can advance the Federal IT Shared Services Strategy.

A "hybrid" cloud combines two or more clouds (private, community, or public) with a mixture of internally and externally hosted services. Agencies will likely not limit themselves to one cloud deployment but will instead incorporate different and overlapping cloud services to meet their unique requirements.
Hybrid deployment models are complex and require careful planning to execute and manage, especially when communication between two different cloud deployments is necessary.
This paper investigates the mapping characteristics of the one-dimensional cloud model controller and points out that the cloud model controller is essentially a mapping relationship, so the key to designing a one-dimensional cloud model controller is the design of its one-dimensional cloud model mapping processor. An active principle for the cloud model inference rules is proposed, a one-dimensional cloud model mapping processor is designed, and the boundedness and convergence of its mapping region are established. Finally, a design method for a one-dimensional cloud model mapping processor with P-controller features is proposed, and a numerical example illustrates the effectiveness of the proposed method.
To improve support for complex decision problems, a grey cloud model and an intelligent decision support system (IDSS) based on it are proposed. First, the definition of the grey cloud model, a new uncertainty model, is given. Then, an uncertain knowledge representation based on the grey cloud model for the knowledge database is proposed, along with a data mining method based on the model. Lastly, a data mining case based on the grey cloud model in an intelligent decision support system is given to verify the model's validity. The innovation of the method is that the grey cloud model, and the intelligent decision support system built on it, objectively express the character of complex problems, which increases the IDSS's ability to support complex decision problems.
The cloud model is an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions. Although the cloud model is extensively studied at present, there has been no research on the interval cloud model. Compared with the cloud model, interval clouds arise widely in daily life and have merits in representing numerical concepts. Some related definitions of the interval cloud are given, and then algorithms for interval cloud generators, including the forward interval cloud generator, the backward interval cloud generator, and interval cloud generators under different conditions, are presented. These provide a basis for bridging the gap between quantitative and qualitative methodology in research on fuzzy phenomena in both the social and natural sciences.
The cloud model is an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions. In this paper, we introduce a new optimization method inspired by cloud model theory. The innovations of the algorithm are the estimation of good solution regions and the production of new solutions according to cloud model theory. First, the algorithm uses information gathered during optimization to build a cloud model of good solution regions and calculates the cloud model's three digital characteristics with the backward cloud generator. Second, these three digital characteristics are used to produce new solutions with the forward cloud generator. Then, the best solutions are selected from the current and new solutions to form the next population. The proposed algorithm is applied to some well-known benchmarks, and the experimental results show that it is effective and achieves better solutions.
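The forward and backward cloud generators this abstract relies on are standard algorithms of normal cloud theory. A minimal Python sketch (function and parameter names are illustrative, not the paper's):

```python
import math
import random

def forward_cloud(Ex, En, He, n):
    """Forward normal cloud generator: produce n cloud drops
    (position, certainty degree) from the digital characteristics
    expectation Ex, entropy En, and hyper-entropy He."""
    drops = []
    for _ in range(n):
        En_p = random.gauss(En, He)          # per-drop entropy sample
        x = random.gauss(Ex, abs(En_p))      # drop position
        mu = math.exp(-(x - Ex) ** 2 / (2 * En_p ** 2)) if En_p else 1.0
        drops.append((x, mu))
    return drops

def backward_cloud(xs):
    """Backward cloud generator (certainty-free form): estimate
    (Ex, En, He) from a sample of drop positions."""
    n = len(xs)
    Ex = sum(xs) / n
    # first absolute central moment estimator for the entropy
    En = math.sqrt(math.pi / 2) * sum(abs(x - Ex) for x in xs) / n
    S2 = sum((x - Ex) ** 2 for x in xs) / (n - 1)   # sample variance
    He = math.sqrt(abs(S2 - En ** 2))
    return Ex, En, He
```

Round-tripping drops generated from known characteristics through the backward generator recovers estimates close to the originals, which is the mechanism the abstract's "build cloud model of good solution regions" step depends on.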
Research and commercial efforts are currently addressing challenges and providing solutions in cloud computing, and business models are emerging to address different cloud computing use cases. In this paper, we evaluate a virtual enterprise (VE)-enabled cloud enterprise architecture for small, medium, and micro enterprises (SMMEs) against the EC2 pricing model to show that our pricing model is better suited to SMMEs. The model is based on the realization that it is not economically viable for SMMEs to acquire their own private cloud infrastructure, or even to subscribe to public cloud services as single entities. In our VE-enabled cloud enterprise architecture for SMMEs, temporary co-operations are created to realize the value of a short-term business opportunity that the partner SMMEs cannot capture on their own (or can, but only to a lesser extent). The pricing model obtained from our proposed business model shows the benefits of the VE cloud model over subscribing to a public cloud as a single business enterprise: the pricing structure of our VE cloud model is up to 17.82 times more economical than the equivalent Amazon EC2 instance-type pricing model.
We present a specification language, Cloud#, for modeling the internal organisation of a cloud. By reasoning about cloud models, clients gain insight into how services are delivered inside the cloud; in this sense, cloud models make cloud services more transparent to clients. This transparency is expected to increase clients' confidence in moving their business-critical applications to the cloud. The expressiveness of Cloud# is evaluated through four cloud models that demonstrate basic features of cloud computing such as resource virtualization and fair scheduling. We also describe an application of Cloud# in an architecture that combines Cloud# models with remote attestation to deliver trusted services.
Human faces differ in many ways yet share common characteristics: a person's facial contour can be approximated by an ellipse, and the relative positions of the eyebrows, eyes, nose, mouth, and other features are stable across the whole face. These similar shapes provide the basis for computer-based face synthesis, which has broad prospects both in technology and in applications. As a mathematical conversion model for uncertain knowledge, the cloud model integrates fuzziness and randomness to form a mapping between the qualitative and the quantitative, while facial expression is a kind of uncertain data. This paper proposes a face synthesis technique based on the cloud model. First, the cloud model algorithm is extended from data points to data sets, and each face image is treated as an M×N grid (M rows and N columns indexing image positions) so that each grid cell has a grayscale value (0-255). Second, the cloud numerical characteristics (Ex, En, He) of the input face image are extracted with the backward cloud generator. Third, the forward cloud generator produces a set of cloud drops with the corresponding characteristics. Finally, face synthesis is achieved with the backward cloud generator. This cloud-model-based technique synthesizes faces from multiple expression sources according to different weighting ratios. The experimental results show that different expression modes can be obtained, and that adjusting the values of the weight vector enriches the expressive range of the synthesized faces.
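The weighted combination of per-source characteristics can be sketched with the standard cloud arithmetic for weighted sums (expectations add linearly; entropy-like terms add in quadrature). Whether the paper uses exactly this fusion rule is an assumption:

```python
import math

def fuse_clouds(features, weights):
    """Combine per-source cloud features (Ex, En, He) with a weight
    vector. Ex adds linearly; En and He combine in quadrature, per
    the usual cloud arithmetic for scaled sums."""
    Ex = sum(w * f[0] for f, w in zip(features, weights))
    En = math.sqrt(sum((w * f[1]) ** 2 for f, w in zip(features, weights)))
    He = math.sqrt(sum((w * f[2]) ** 2 for f, w in zip(features, weights)))
    return Ex, En, He
```

For example, fusing two expression sources with equal weights averages their expectations while damping the combined entropy relative to a plain average.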
In this paper, we extend the concept of model-mediated teleoperation (MMT) to six degrees-of-freedom in complex environments using a time-of-flight (ToF) camera. Compared to the original MMT method, the remote environment is no longer approximated by a simple planar surface, but by a point cloud model. Thus, object surfaces with complex geometry can be used in MMT. In our proposed system, the point cloud model is captured by the ToF camera with high temporal resolution (up to 160fps) and a flexible work range (10cm to 5m). Updating the model of the remote environment while the robot is in operation is thus easier compared to the original MMT approach. The point cloud model is transmitted from the teleoperator to the operator using a lossless H.264 codec. In addition, a simple point-cloud-based haptic rendering algorithm is adopted to generate the force feedback signal directly from the point cloud model without first converting it into polygons. Moreover, to compensate for the estimation error of the point cloud model, adaptive position and force control schemes are applied to enable stable and transparent teleoperation. Our experiments demonstrate the feasibility and benefits of utilizing the proposed method in MMT.
The reasoning error of existing high-dimensional cloud models is large when some domains are correlated, so a corrected high-dimensional cloud model is proposed based on cloud model and multivariate normal distribution theory. In the corrected cloud model, the entropy and hyper-entropy are replaced by a covariance entropy and a covariance hyper-entropy, which describe the correlation among domains; these new parameters reduce the error when correlation exists. Corrected high-dimensional backward and forward cloud generators are also proposed. The corrected high-dimensional cloud is applied to fault diagnosis of a flight control system: with the residual errors of correlated states as input and the degrees of fault as output, the corrected high-dimensional cloud multi-rule system obtains accurate results.
In tactical ad hoc networks (TA), most existing group mobility models have difficulty controlling the dynamic topology of mobile nodes (MNs) through parameter adjustment. We propose a group mobility model based on the normal cloud model to overcome this drawback. In this model, the forward normal cloud generator converts a qualitative concept into sets of numerically represented node positions, and the backward normal cloud generator converts node position sets into a node distribution. Moreover, by adjusting the entropy and hyper-entropy of the normal cloud model, the coverage area and dispersion degree of the MNs can be controlled, respectively. Additionally, we compare routing protocol performance against the Reference Point Group Mobility (RPGM) model and the Gibbs-sampler-based simulated annealing Group Mobility (GGM) model. As expected, the results show that different group mobility models have different impacts on the performance evaluation of ad hoc network protocols.
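The forward-cloud step that turns a qualitative gathering concept into node positions can be sketched coordinate by coordinate; the per-axis independence and the parameter values below are illustrative assumptions, not the paper's settings:

```python
import random

def node_positions(center, En, He, n_nodes):
    """Sample n_nodes 2-D positions around a group reference point
    using an independent forward normal cloud per coordinate.
    En controls the coverage area; He controls how stable the
    dispersion is from drop to drop."""
    positions = []
    for _ in range(n_nodes):
        pos = []
        for c in center:
            En_p = abs(random.gauss(En, He))   # per-drop entropy
            pos.append(random.gauss(c, En_p))  # drop position
        positions.append(tuple(pos))
    return positions
```

Increasing En spreads the group over a wider area, while increasing He makes the spread itself more erratic, matching the abstract's two control knobs.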
Collaborative optimization is a good method for the multidisciplinary design optimization of complex systems. Satellite design is a complex, systematic project, and traditional optimization design methods are not efficient for it. Recently, cloud computing has been widely used in intelligent control, data mining, optimization, and other fields. The normal cloud model, based on the normal distribution, is an important cloud model with a wide range of applicability. In this paper, the normal cloud model is combined with collaborative optimization: within the collaborative optimization structure, the velocity and direction of the search are controlled by cloud drops generated by the normal cloud model. Numerical results indicate that the algorithm achieves high precision, stability, and convergence rate, and improves the efficiency of satellite design optimization.
Based on the knowledge representation of cloud theory and rough sets, a rough-cloud model is put forward that bridges the gap between quantitative and qualitative knowledge. Compared with classical rough sets, the rough-cloud model can handle attribute uncertainty and perform soft discretization of continuous attributes. A novel approach covering discretization, attribute reduction, value reduction, and data complementing is presented. After rough reduction of the original data, a combination cloud generator is put forward that joins the forward and backward cloud generators in a closed-loop structure. The generator is used for load modeling in power systems, addressing the shortage of original load data in distribution systems. Considering the characteristics of distribution system load data, restriction equations and a system data complement unit are added to the combination cloud generator, ensuring that the generated load data cover most system situations without producing impossible data. The cloud drops reflect the fuzziness and randomness of the load data. The loads are identified by a T-S fuzzy model based on the generated cloud drops. The identification results, contrasted with several universal load models, demonstrate the effectiveness and usefulness of the approach.
Satellite design is a complex and systematic project. In the design process, because of the complexity of the relations among disciplines and the coupling of design variables, optimization design is not efficient. Recently, cloud computing has been widely used in intelligent control, data mining, optimization, and other fields. The normal cloud model, based on the normal distribution, is an important cloud model with a wide range of applicability. In this paper, the normal cloud model is combined with an evolutionary algorithm: following the evolutionary approach, the velocity and direction of the search are controlled by cloud drops generated by the normal cloud model. Numerical results indicate that the algorithm achieves high precision, stability, and convergence rate, and improves the efficiency of satellite design optimization.
In this study, we developed a soil moisture retrieval technique for vegetated fields using a modified water-cloud model. The water-cloud model has long been a typical algorithm for analyzing scattering from vegetated fields because it is simple enough to retrieve a variety of quantities such as soil moisture and vegetation water mass. However, its accuracy has been questionable because the model contains many approximations introduced during simplification. To improve the accuracy of the algorithm, we modified the water-cloud model by estimating its parameters with a radiative transfer model. Soil moisture is retrieved from SAR images using the modified water-cloud model for vegetated fields and compared with in-situ ground truth measurements.
In this paper, we introduce cloud model theory into the particle swarm optimization algorithm to improve its global search ability and accelerate its convergence. Several modifications are presented. First, we use the cloud model to initialize the positions and velocities of the entire population within the initialization range. Second, the inertia weight is decreased dynamically and nonlinearly as the search progresses, using a data set obtained from the cloud model. Third, the two random variants in the velocity update rule are assigned with the cloud model. Fourth, the inertia weight and the two random variants are correlated through the cloud model. The modified particle swarm optimization is tested on some benchmark functions and compared with the standard particle swarm optimization. Experimental results indicate that the modified algorithm outperforms the standard one in global search ability, with quicker convergence.
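The abstract leaves the exact update formulas unspecified, so the sketch below is one plausible reading: cloud drops supply both the nonlinear inertia decay and the two random variants. The constants and decay schedule are assumptions, not the authors' settings:

```python
import math
import random

def cloud_drop(Ex, En, He):
    """One forward-cloud sample and its certainty degree."""
    En_p = abs(random.gauss(En, He)) or En
    x = random.gauss(Ex, En_p)
    return x, math.exp(-(x - Ex) ** 2 / (2 * En_p ** 2))

def sphere(p):
    """Benchmark objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in p)

def cloud_pso(dim=5, swarm=20, iters=200, lo=-5.0, hi=5.0):
    random.seed(1)
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)[:]
    for t in range(iters):
        # nonlinear inertia decay driven by a cloud-drop certainty degree
        _, mu = cloud_drop(Ex=1.0 - t / iters, En=0.2, He=0.02)
        w = 0.4 + 0.5 * mu * (1.0 - t / iters)
        for i in range(swarm):
            for d in range(dim):
                r1, _ = cloud_drop(0.5, 0.2, 0.02)  # cloud-generated variants
                r2, _ = cloud_drop(0.5, 0.2, 0.02)
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * abs(r1) * (pbest[i][d] - pos[i][d])
                             + 2.0 * abs(r2) * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest
```

The hyper-entropy He makes the effective acceleration coefficients themselves slightly random, which is one way the cloud model's extra degree of uncertainty can enter the velocity rule.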
A decision support system that uses data mining to find decision knowledge is called an intelligent decision support system (IDSS). Neural networks are commonly used as a data mining method to find classification knowledge in IDSS, but classic neural-network-based data mining falls short when dealing with data that contain blank values or exhibit fuzziness and randomness; such data are called imperfect data. To overcome this shortcoming, a method combining the cloud model and neural networks to extract knowledge from imperfect data in IDSS is proposed. First, a cloud is used to depict the imperfect data through group decision. Next, attribute generation based on the cloud model or the grey cloud model is used to build the upper concept layer; in this step, the cloud model depicting the imperfect data is assigned to the nearest concept layer according to the distance between two cloud models. Then the classic neural network method is used to gain knowledge: the data are fed into the neural network for training to obtain the classification knowledge. Lastly, an experiment is given to verify the validity of the method.
Differential Evolution (DE) is a numerical optimization approach that is simple to implement, requires little parameter tuning, and is known for remarkable performance. It mainly uses distance and direction information from the current population to guide the search, but it has no mechanism for extracting and using global information about the search space. The cloud model, an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions, can be used to extract such global statistical information. In this paper, we introduce a new optimization algorithm combining differential evolution with the cloud model. The innovation of the algorithm is the extraction of global statistical information about the search space and the production of new solutions according to the cloud model: the best individuals in the current population are used to build the cloud digital characteristics of good solution regions with the backward cloud generator, and these characteristics are then used to produce new individuals with the forward cloud generator. Both the local information from DE and the global information from the cloud model guide the search. The proposed algorithm is applied to some well-known benchmarks, and the experimental results show that it has stronger global search ability than the original DE. Finally, an application of the algorithm to RFID sensor deployment is presented.
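The DE half of the hybrid can be sketched as a standard DE/rand/1/bin loop; the cloud-model half (injecting individuals generated from backward-cloud statistics of the elite) is omitted here for brevity:

```python
import random

def de_optimize(f, dim=5, pop_size=30, F=0.5, CR=0.9, iters=300,
                lo=-5.0, hi=5.0):
    """Minimal DE/rand/1/bin minimizer of f over [lo, hi]^dim."""
    random.seed(2)
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # three distinct donors, none equal to the target index
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jr = random.randrange(dim)  # guaranteed crossover dimension
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (random.random() < CR or d == jr) else pop[i][d]
                     for d in range(dim)]
            trial = [min(hi, max(lo, v)) for v in trial]  # bound handling
            ft = f(trial)
            if ft <= fit[i]:           # greedy selection
                pop[i], fit[i] = trial, ft
    k = min(range(pop_size), key=fit.__getitem__)
    return pop[k], fit[k]
```

In the hybrid described above, each generation would additionally estimate (Ex, En, He) from the current best individuals and draw forward-cloud samples as extra trial vectors, blending local difference vectors with global statistics.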
In this paper, we present an enhanced 3D reconstruction algorithm designed to support an autonomously navigated unmanned ground vehicle; an unmanned system can use the technique to construct a point cloud model of its unknown surroundings. The algorithm focuses on 3D reconstruction of a scene from image sequences captured by a single moving camera. The original reconstruction process, which produces a point cloud, used Speeded Up Robust Feature (SURF) points extracted and matched from subsequent video frames; using depth triangulation analysis, we computed the depth of each feature point in the scene. We concluded that although SURF points are accurate and extremely distinctive, the number of points extracted and matched was not sufficient for our applications: a sparse point cloud model hinders further processing for the autonomous system, such as object recognition or self-positioning. We present an enhanced version of the algorithm that increases the number of points in the model while maintaining the near real-time computational speed and accuracy of the original sparse reconstruction. We do so by generating points from both global image characteristics and local SURF feature neighborhood information. Specifically, we compute optical flow disparities with the Horn-Schunck optical flow estimation technique and evaluate the quality of these features for disparity calculations using the SURF keypoint detection method. Image areas lying within SURF feature neighborhoods are tracked with optical flow and used to compute an extremely dense model that contains the high-frequency details of the scene needed for 3D object recognition. The contribution of the newly added preprocessing steps is measured by evaluating the density and accuracy of the reconstructed point cloud model against real-world measurements.
The classic Black-Scholes option pricing model assumes that the prices of assets such as stocks follow an Itô process (a continuous Markov process). However, the value of the assets in a project may not always satisfy this assumption, which introduces subjectivity and arbitrariness. The cloud model can reflect the associated characteristics of fuzziness and randomness, and the backward cloud generator algorithm based on interval data can express the randomness of intervals estimated by experts, objectively providing the numerical characteristics of the cloud in the process of building the cloud model. In this paper, we combine option pricing theory with the normal cloud model and interval analysis, applying the interval-based backward cloud algorithm to offer a reasonable estimate of pricing in strategic mergers and acquisitions. Finally, an example is given to verify the model by simulation.
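The baseline this abstract extends is the standard Black-Scholes European call price, which can be computed directly; the cloud-model extension described above would replace the point inputs with characteristics estimated from expert intervals:

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call.
    S: spot price, K: strike, T: time to expiry (years),
    r: risk-free rate, sigma: volatility."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    # standard normal CDF via the error function
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)
```

In a real-options reading of M&A pricing, S and sigma are exactly the inputs that are hard to pin down, which is where interval-based backward cloud estimation would enter.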
This paper focuses on image segmentation methods based on histogram analysis and proposes a novel image segmentation approach based on the cloud model. First, the paper introduces the basic principles of the cloud model. Like type-2 fuzzy sets, the cloud model considers the uncertainty of membership grades, but it also considers their randomness; it is thus a new kind of uncertainty model distinct from type-2 fuzzy sets. Second, the proposed image segmentation approach is described: the image histogram is transformed into discrete qualitative concepts expressed by cloud models, and segmentation is then performed over these concepts according to the principle of maximum certainty degree. Finally, the paper compares the proposed method with the fuzzy C-means (FCM) and Gaussian mixture model (GMM) methods; experiments demonstrate its effectiveness.
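Segmentation by the principle of maximum certainty degree reduces to assigning each gray level to the concept cloud under which its expected certainty is highest. A minimal sketch, assuming the concepts are given as (Ex, En) pairs already extracted from the histogram:

```python
import math

def certainty(x, Ex, En):
    """Expected certainty degree of gray level x under concept (Ex, En)."""
    return math.exp(-(x - Ex) ** 2 / (2.0 * En ** 2))

def segment(pixels, concepts):
    """Label each gray level with the index of the concept giving
    maximum certainty degree. `concepts` is a list of (Ex, En) pairs;
    in the paper these would come from a histogram-to-cloud step."""
    return [max(range(len(concepts)),
                key=lambda k: certainty(x, *concepts[k]))
            for x in pixels]
```

For example, with concepts centered at gray levels 0, 128, and 255, the levels 10, 120, and 250 fall to the dark, mid, and bright concepts respectively.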
At present, electric load forecasting methods and models all produce point forecasts of the load. This paper proposes a short-term load forecasting method using the cloud model, which represents artificial intelligence with uncertainty. The forecasting results are many discrete, uncertain data sets that vary within a range, so they represent the changing characteristics of the electric load more faithfully. The paper first introduces the concept and characteristics of the cloud model and gives the process of data discretization and concept zooming for the load data and weather factors based on the cloud model. It then mines and infers uncertainty rules using a cloud-model-based association rule algorithm (Cloud-Association-Rules). Finally, using data from a certain area as a forecasting example, it presents two kinds of result expression: the forecast set distribution chart and the expected values chart. The forecasting results meet the practical standard for electric load forecasting.
The cloud model can be used for reliability evaluation when working conditions are changeable and described by qualitative linguistic values. However, when evaluating product reliability with the cloud model, it is usually neglected that the most typical sample itself also contains some uncertainty. For this reason, a cloud model is used to describe the uncertainty of the most typical sample. The experimental results show that the reliability evaluation is closer to reality when the cloud model accounts for the uncertainty of the most typical sample.
The cloud model is an advanced theory for solving uncertainty problems. Taking the uncertainty of images into account, this paper proposes an object detection algorithm based on the cloud model. First, cloud model theory is adopted to transform the image's qualitative model into its quantitative model. Then, a climbing policy is used to obtain concepts at different levels, which represent objects at different levels. Finally, a certainty degree is computed to determine which concept each pixel belongs to. Experimental results show that the algorithm achieves better accuracy in detecting objects and is good at resolving the edges between different objects.
A cloud-model-based multiple attribute evaluation method is put forward. Cloud transformation is used to represent each attribute with a cloud model, and the multidimensional cloud model of the system is then built from the one-dimensional clouds. The evaluation value is obtained by calculating the deviation degree of the system's cloud gravity center from the ideal cloud gravity center. Finally, the method is applied to evaluating the fighting effectiveness of a data link system as an example, providing a new idea for the evaluation of and decision-making about multiple attribute systems.
Because matter-element theory ignores the uncertainty of boundary values when a fault diagnosis model is set up, the diagnosis results deviate from the actual situation. Since cloud model theory can represent the uncertainty of boundaries, we combine it with matter-element theory to propose a new diagnosis model. The proposed model is tested with data such as the concentration of dissolved gases in oil, the gas production rate, and the ratios between gases. Compared with other transformer diagnosis techniques in a case analysis, the new model achieves higher diagnostic precision.
To address the subjectivity and uncertainty of trust in e-commerce systems, a new trust model based on a multidimensional trust cloud is proposed, building on past research on the cloud model. Historical evaluation data are regarded as the quantitative universe of discourse, and the trust cloud of each entity property is built with a weighted backward cloud generation algorithm, so the historical behavior of an entity is well reflected by the three numerical characteristics of the cloud model. Experimental results show that the proposed model reflects a seller's properties more accurately and provides a useful reference for buyer entities in e-commerce.
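A weighted backward cloud generation step of the kind described can be sketched as follows; the weighting scheme and the variance-based hyper-entropy estimator are assumptions, not necessarily the paper's exact algorithm:

```python
import math

def weighted_backward_cloud(ratings, weights):
    """Estimate (Ex, En, He) from rating values with per-rating
    weights (e.g. time-decayed or transaction-value weights)."""
    W = sum(weights)
    Ex = sum(w * x for x, w in zip(ratings, weights)) / W
    En = (math.sqrt(math.pi / 2)
          * sum(w * abs(x - Ex) for x, w in zip(ratings, weights)) / W)
    S2 = sum(w * (x - Ex) ** 2 for x, w in zip(ratings, weights)) / W
    He = math.sqrt(abs(S2 - En ** 2))
    return Ex, En, He
```

With equal weights this reduces to the ordinary backward cloud generator; skewing the weights toward recent transactions makes Ex track a seller's current behavior while En and He capture how erratic the rating history is.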
To evaluate complex systems in aircraft scheme selection, an evaluation method based on the cloud model is utilized. This paper analyzes the evaluation steps based on the cloud model. Because judging the result from the cloud model figure has been imprecise in the past, a measurement of cloud similarity is introduced to decide which rating the evaluation result belongs to. Finally, an application to the design of a certain Vertical/Short Takeoff and Landing (V/STOL) aircraft demonstrates the validity of the method. The results indicate that the cloud-model-based evaluation method accounts for both the randomness and the fuzziness of complex systems and can effectively evaluate aircraft schemes.
The campus cloud is a new form of campus information system. This paper proposes a new scheme for campus cloud implementation based on a telecom public service platform. To address the existing limitations of regional applicability and resource consumption in the cloud model, a resource calculation and positioning strategy based on regional priorities and resource types is proposed. In this cloud model, the middleware services layer and virtual services make application components easy to couple and decouple, allow flexible service configuration, and reduce resource consumption. Application shows that the model reduces the difficulty of developing and deploying the campus cloud platform and achieves more cost-effective campus information applications.
Properly representing knowledge is a key problem in spatial decision support systems, where the knowledge in uncertain decision problems is multidimensional, fuzzy, and random. To improve the handling of uncertain problems in spatial decision support systems, this article proposes knowledge representation and induction based on the cloud model. First, the multidimensional cloud model is given to represent the knowledge stored in the system's knowledge database. Then a knowledge induction method based on the multidimensional cloud model is proposed. Lastly, a case in space environment decision-making is given to verify the method's validity. In contrast to previous methods, the innovation is a complete cloud-model-based knowledge representation and induction scheme able to represent multidimensional, fuzzy, and random knowledge.
The cloud model is an effective tool for transforming between the qualitative and the quantitative: the normal cloud model forms a generator with a specific structure through its expectation, entropy, and hyper-entropy. This particular structure gives the normal cloud model broad applicability and makes the conversion between the qualitative and the quantitative simple and straightforward. Since the frequency of network security incidents (the number of incidents per unit time) is nonlinear, traditional prediction methods such as ARMA and grey systems are difficult to apply. This paper presents the use of the cloud model in network security evaluation systems and in evaluation methods for the network security domain.
Personalized service recommendation is an active research topic in the application of web log mining. After introducing the related theories of web log mining and personalized service, analyzing the state of the art of personalized service recommendation in web log mining, and combining the basic theory of the cloud model with its feasibility and advantages in web log mining, the author proposes a cloud model-based personalized service recommendation model for web log mining. The model uses the cloud's uncertain reasoning algorithm, handles the randomness and fuzziness of web log data, and realizes the transformation between the qualitative and quantitative forms of the user's personalized-service linguistic values. Finally, experimental simulation of the recommendation model shows that applying the cloud model improves the accuracy of personalized service recommendation in web log mining and that the model is dependable and valid.
The two-layer water cloud model (WCM2) is a refined version of the conventional water cloud model (WCM) that accounts for vertical inhomogeneity in the vegetation layer. This inhomogeneity is described by the distribution of water content per unit volume, modeled as a piecewise linear function. We analyze how vegetation scattering varies under different vertical inhomogeneity conditions by means of WCM2 and validate it using a physically based model developed at Tor Vergata University, Rome, Italy. The results show that vertical inhomogeneity affects vegetation scattering significantly. Comparison between model predictions and field measurements of radar backscattering coefficients for soybean shows that WCM2 has the potential to give better predictions.
Due to the “premature” convergence phenomenon and poor local search ability of the genetic algorithm, an improved variant, the adaptive and parallel simulated annealing genetic algorithm based on the cloud model (PCASAGA), is proposed in this paper. The algorithm integrates the cloud model, a multi-population optimization mechanism, parallel techniques, the simulated annealing algorithm, and an adaptive mechanism. It applies the cloud model, a qualitative reasoning technique, to regulate the crossover and mutation probabilities and thereby improve adaptivity. The use of Threading Building Blocks (TBB) multi-threading greatly enhances the algorithm's efficiency. Simulation results show that PCASAGA converges faster and finds better solutions than the original genetic algorithm, and takes full advantage of the multi-core resources of current computers.
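The abstract does not spell out the regulation rule, so the following is only a plausible sketch of the kind of cloud-model-regulated crossover probability such adaptive GAs use: entropy is derived from the fitness spread, and fitter-than-average individuals receive a probability that falls off like a normal cloud drop around the mean fitness, so the best individuals are disturbed least. The function name and all constants are assumptions, not the paper's values:

```python
import math
import random

def cloud_crossover_prob(f, f_avg, f_max, k1=0.9, k3=0.9, c1=3.0, c2=10.0):
    """Cloud-model-regulated crossover probability (illustrative).

    Below-average individuals keep a fixed high rate k3; individuals
    at or above the average get a rate shaped like a cloud drop.
    """
    if f < f_avg:
        return k3
    en = (f_max - f_avg) / c1  # entropy from the fitness spread
    he = en / c2               # hyper-entropy
    # Per-call entropy draw, kept strictly positive.
    en_i = abs(random.gauss(en, he)) or en or 1e-12
    return k1 * math.exp(-((f - f_avg) ** 2) / (2 * en_i ** 2))
```

The hyper-entropy term randomizes the decay slightly from call to call, which is what distinguishes this scheme from a plain fitness-proportional adaptive rule.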
Danger Theory is a novel theory in biological immunology from which Artificial Immune Systems researchers may benefit, especially in anomaly detection. Defining danger signals is one of the most important problems in Danger Theory. Because the distinction between danger and safety is fuzzy and uncertain, precise calculation methods are not suitable for this problem. The cloud model is an effective tool for transforming qualitative concepts into quantitative expressions in uncertain problems. In this paper, a definition of danger signals based on the cloud model is presented: changes in the key features of a computer system are collected and integrated by a cloud model-based rule generator, which then emits the danger signals. The experimental results demonstrate that the proposed definition of danger signals is feasible.
The trust model is fundamental to information security in distributed networks. Based on the cloud model's outstanding ability to transform between a qualitative concept and a set of quantitative numerical values, we present a formalized method for subjective trust evaluation using the cloud model. This paper exploits uncertainty and fuzziness, generally considered two intrinsic properties of subjective trust, and takes a step toward properly understanding and defining human trust. Qualitative reasoning mechanisms are also given to accomplish the trust evaluation.
Randomly deployed sensor networks suffer from uneven node density, which seriously reduces network performance after clustering. The idea of power control in CDMA is applied to control the communication coverage of clusters in wireless sensor networks. A cloud model-based uncertainty reasoning and control mechanism is introduced to adjust the transmit power adaptively, keeping the number of nodes in each cluster within an appropriate range according to the node distribution density. The tendentious rules of the cloud model ensure the convergence of the coverage control, while the randomness concealed in the cloud model keeps the overall number of iterations low. After clustering, each cluster contains an appropriate number of nodes, improving the network topology. The experimental results validate the mechanism's rationality and effectiveness.
Cloud computing is defined as the storage, management, processing, and accessing of information and other data stored on a remote server. With the advent of the internet, intrusion attacks have grown more sophisticated over time, and distributed attacks cannot be detected by presently available intrusion detection systems. We therefore propose a distributed intrusion detection model based on cloud theory. Our model is composed of an Intrusion Detection Agent subsystem and a Data Aggregation subsystem. The Intrusion Detection Agent subsystem has three parts: a data collection module, a cloud decision-making module, and a communication module. An intrusion detection algorithm based on cloud theory is proposed to detect intrusion behavior and improve the detection of complicated intrusions. Building on this model, we introduce a strategy to defend against DDoS attacks using the elastic properties of the cloud platform.
The KNN algorithm is particularly sensitive to outliers and noise in the training data set. In this paper, we use the reverse cloud algorithm to map the training samples into clouds, with each attribute mapped to a cloud vector. The reverse cloud algorithm is insensitive to noise in the data and can effectively eliminate its impact on classification. By comparing the similarity of clouds in the cloud vector, we calculate the attribute weights. Attributes with low weights are merged into a new attribute whose weight is more significant than the originals'. We present a new KNN algorithm based on the cloud model and compare it with classic KNN and other well-known improved KNN algorithms on 10 data sets. Experiments show that our approach achieves better, or at least comparable, classification accuracy.
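The reverse (backward) cloud step can be sketched as follows. This is the textbook certainty-degree-free backward cloud generator, which estimates (Ex, En, He) from raw attribute samples; it is a standard formulation, not code from the paper:

```python
import math

def backward_cloud(samples):
    """Certainty-degree-free backward cloud generator: estimate the
    (Ex, En, He) of a normal cloud from raw numeric samples."""
    n = len(samples)
    ex = sum(samples) / n  # expectation
    # Entropy from the first absolute central moment.
    en = math.sqrt(math.pi / 2) * sum(abs(x - ex) for x in samples) / n
    # Unbiased sample variance.
    s2 = sum((x - ex) ** 2 for x in samples) / (n - 1)
    # Hyper-entropy; clamp because sampling noise can push s2 below en^2.
    he = math.sqrt(max(s2 - en ** 2, 0.0))
    return ex, en, he
```

Because Ex and En are built from means over all samples, a few outliers perturb them far less than they perturb individual nearest-neighbor distances, which is the intuition behind the claimed noise robustness.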
Cloud computing is a new concept for providing dramatically scalable, virtualized resources. It implies a Service Oriented Architecture (SOA), reduced information technology overhead for the end user, greater flexibility, reduced total cost of ownership, and an on-demand service structure. From the user's point of view, one of the main concerns is cloud security against unknown threats, and the lack of physical access to servers constitutes a completely new and disruptive challenge for investigators. Clients can store, transfer, or exchange their data using the public cloud model. This paper presents an encryption method for the public cloud together with a framework in which third-party auditors verify the cloud service provider. Cloud data storage is one of the essential services being adopted in today's rapidly developing business world.
The bacterial foraging optimization (BFO) algorithm, an optimization algorithm inspired by the social foraging behavior of Escherichia coli, is computationally expensive due to the slow collective intelligence of the bacterial swarm. This paper presents a novel bacterial foraging algorithm oriented by an atomized-feature cloud model strategy (BFOAFC), with two main steps to accelerate BFO: an atomized-feature cloud model-based generation jumping to generate a candidate swarm, and a novel update formula for the tumble movements in the chemotaxis steps of the virtual bacteria. A comprehensive set of complex benchmark functions spanning a wide range of dimensions is employed for experimental verification. The results confirm that BFOAFC outperforms both the original BFO and BFO oriented by particle swarm optimization in terms of convergence speed and solution accuracy.
Because power systems contain many random and fuzzy uncertainty factors, the deterministic models traditionally adopted in reliability research are not reasonable. Cloud model theory, however, is a powerful tool for converting numerical quantitative analysis into conceptual qualitative analysis. After an introduction to cloud models, this paper proposes and establishes parameter and load cloud models for the actual operation of a power system, to handle the uncertainty of the FOR value and the variation of load. Based on these models, a risk assessment is carried out on the RBTS reliability test system, producing risk indexes represented by cloud models. The results show that cloud models based on actual operating conditions fit the real scenario well.
Mobility models have drawn increasing attention because of their important role in Mobile Ad hoc Network (MANET) protocol performance assessment. While MANET technology is widely used in tactical networks, research on mobility models for tactical networks is still largely confined to combinations of traditional MANET models. In this paper, a novel Tactical Unit Mobility Model Framework is proposed to generate different mobility models in a tactical environment. We consider tactical intention, the uncertainties of both upper-level intention and lower-level node movement, and obstacles that significantly influence node movement. The cloud model, from artificial intelligence with uncertainty, is adopted as a key algorithm to handle these uncertainties. To demonstrate the framework's effectiveness, different simulation scenarios are proposed. The results illustrate that the framework can generate numerous brand-new mobility models consistent with tactical reality.
Traditional evaluation of an Automatic Test System (ATS) considers only the fuzziness of the evaluation indexes and ignores their randomness and indeterminacy, so a new method based on a neural network and the cloud model is proposed to solve this problem. An improved neural network algorithm is used to obtain the index weights. The cloud model is used to express quantitative data as qualitative indexes, and the fuzziness, randomness, and indeterminacy of the indexes are all considered in the model. The experimental results show that the method is feasible and effective for the comprehensive evaluation of an ATS.
A model of operating speed is important for the safe design of road geometric alignment. In this paper, an operating speed model based on horizontal curve radius, longitudinal grade, and sight distance is built using cloud theory. The concept trees for the input parameters are automatically mined from actual experimental data, and the rule base is created with the aid of the fuzzy neural system in Matlab ANFIS. Simulation results indicate that the cloud model of operating speed is able to describe the uncertainty of operating speed.
In project scheduling management, cloud operations and the cloud model are introduced to better account for both randomness and fuzziness in project scheduling, as well as their relationship, giving a clearer view of uncertainty in the project. Traditional methods such as probability statistics and fuzzy mathematics each focus on their own field: the former emphasizes randomness while the latter considers only fuzziness. Based on the normal cloud model of optimal project scheduling, we calculate the certainty index of completion probability and completion risk and analyze randomness and fuzziness relative to each other, so that more information is provided for project management and decision-making.
A cloud model-based controller that needs no mathematical model of the plant is presented in this paper for the trajectory tracking control of a flexible-link manipulator with poorly known dynamics. Based on the singular perturbation method and time-scale decomposition, the flexible-link manipulator model is decomposed into a slow subsystem, an equivalent rigid-link manipulator, and a fast subsystem of flexible modes. A composite adaptive controller is proposed to implement angle position control of the slow subsystem while simultaneously suppressing tip vibration. In the proposed control strategy, control experience expressed qualitatively in linguistic terms is transformed into control rules using normal cloud models. Experimental studies on a two-link flexible manipulator test bed show the robustness, viability, and effectiveness of the proposed control approach.
Differential Evolution (DE) is one of the best current evolutionary algorithms and has become important in fields such as evolutionary computing and intelligent optimization. DE has been successfully applied to diverse domains of science and engineering, such as signal processing, neural network optimization, pattern recognition, machine intelligence, chemical engineering, and medical science. However, almost all DE-related evolutionary algorithms still suffer from premature convergence, slow convergence rates, and difficult parameter settings. To overcome these drawbacks, we propose a novel Cloud Model-Based Differential Evolution algorithm (CMDE) in which the pheromone and sensitivity model of the free search algorithm replaces the traditional roulette wheel selection model. The model also incorporates Opposition-Based Learning (OBL) to present an improved artificial bee colony algorithm. Experimental results verify that CMDE is superior to several state-of-the-art evolutionary optimizers.
The cloud model is a transformation model between qualitative concepts and quantitative expressions; in this model a qualitative concept is represented by its expected value, entropy, and hyper-entropy. It combines fuzziness with randomness and is strongly robust on uncertain problems. Taking the main influencing factors into account, the 2D cloud model is applied to the prediction of monthly discharge in the non-dry season; this mechanism effectively handles the fuzziness and randomness in monthly discharge prediction and offers a novel approach to it. The results show that the method achieves excellent precision and agrees well with the observed values.
The uncertainty of grid node security is the main hurdle to making job scheduling secure, reliable, and fault-tolerant, and a fixed fault-tolerance strategy in job scheduling may consume excessive resources. In this paper, the job scheduler decides which kind of fault-tolerance strategy to apply to each individual job for more reliable computation and a shorter makespan, and we discuss the fuzziness and uncertainty between the TL and SD attributes arising from subjective human judgment. The cloud model is a model of the uncertain transition between a qualitative concept and its quantitative representation. Based on the cloud model, we propose a security-aware and fault-tolerant job scheduling strategy for the grid (SAFT), which makes the assessment of SD and SL more flexible and reliable. Different fault-tolerance strategies are applied in the grid job scheduling algorithm according to SD and job workload. More importantly, we set up qualitative rules and activate each rule by its input values (SD and job workload) to select a suitable fault-tolerance strategy for a job, realizing uncertainty reasoning. The results demonstrate that our algorithm achieves a shorter makespan and reduces the job failure rate better than fixed fault-tolerance strategy selection.
Qualitative rule mining and reasoning is an important aspect of data mining and knowledge discovery. The backward cloud can transform quantitative values into qualitative concepts naturally, and a rule generator constructed from cloud models can effectively describe qualitative rules expressed in natural language. In this paper, we employ the backward cloud to obtain qualitative rules from a small number of precise cases and construct a rule generator to provide a reasoning mechanism. Finally, the qualitative rule mining and reasoning approach is used to control the running speed of an electrical machine under practical working conditions, demonstrating the validity of the cloud method.
As online trade and interaction on the internet rise, a key issue is how to use simple and effective evaluation methods to support customers' trust decisions. Subjective trust carries uncertainty in the forms of randomness and fuzziness, yet existing approaches, commonly based on probability or fuzzy set theory, do not give this uncertainty enough weight. To remedy this, a new quantifiable subjective trust evaluation approach based on the cloud model is proposed. Subjective trust is modeled with the cloud model, and the expected value and hyper-entropy of the subjective cloud are used to evaluate the reputation of trust objects. Our experimental data show that the method effectively supports subjective trust decisions and provides a helpful approach to subjective trust evaluation.
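As a minimal illustration of evaluating reputation by expected value and hyper-entropy, the sketch below builds a trust cloud from ratings in [0, 1] with the standard backward-cloud estimators and ranks objects by higher expected trust, breaking ties by lower hyper-entropy (a steadier reputation). Function names and the tie-breaking rule are illustrative assumptions, not the paper's method:

```python
import math

def trust_cloud(ratings):
    """Backward-cloud estimate (Ex, En, He) from ratings in [0, 1]."""
    n = len(ratings)
    ex = sum(ratings) / n
    en = math.sqrt(math.pi / 2) * sum(abs(r - ex) for r in ratings) / n
    s2 = sum((r - ex) ** 2 for r in ratings) / (n - 1)
    he = math.sqrt(max(s2 - en ** 2, 0.0))
    return ex, en, he

def rank_by_trust(rating_sets):
    """Rank trust objects: higher expected trust Ex first; ties are
    broken by lower hyper-entropy He."""
    clouds = {name: trust_cloud(rs) for name, rs in rating_sets.items()}
    return sorted(clouds, key=lambda k: (-clouds[k][0], clouds[k][2]))
```

Using hyper-entropy as the tiebreaker captures the intuition that, between two equally well-rated sellers, the one whose ratings scatter less erratically is the safer choice.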
Cloud computing service models vary: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS), each managed through a surrounding management layer. The IaaS layer offers storage and compute resources that developers and IT organizations can use to deliver business solutions. The PaaS layer offers black-box services with which developers can build applications on top of the compute infrastructure; this might include developer tools offered as a service, data access and database services, or billing services. In the SaaS layer, the service provider hosts the software, so you don't need to install it, manage it, or buy hardware for it; all you have to do is connect and use it. SaaS examples include customer relationship management as a service.

The umbrella of cloud computing is a big one. Like any technology in the early stages of adoption, there are competing models, each claiming to be the optimal configuration and each, more than likely, suited to specific kinds of businesses and business needs. Indeed, the number of cloud permutations is nearly as diverse as the number of companies using them. Still, over time, consistent models begin to emerge. Here is a look at some of the top cloud computing models in production today: 1. The Internal Cloud. This is, in many ways, the most common type of cloud computing. The internal cloud operates within a single organization, allowing it to implement virtualization for in-house services. The premise is that internal infrastructure, including servers, networks, storage, and applications, is connected and virtualized, which in turn allows IT to move things around so as to maximize efficiency.
This differs from simple virtualization in that it allows a higher degree of automation and even chargeback for the other business units. 2. External Cloud Hosting. This model uses an external service from a cloud provider, which the organization accesses via the internet. It is probably the most cost-effective way to use the cloud; the big concerns with this model are security and, in many quarters, performance. 3. The Hybrid Cloud. The hybrid cloud model mixes internal cloud computing and external cloud hosting. This is where most businesses shine: it allows a highly customized approach and lets a business use the cloud when it makes sense and avoid it when it doesn't. 4. Traditional SaaS. SaaS is still out there, and it's especially common among SMBs. A small business that uses 37Signals for project management or Google for its company email is adopting the cloud at the most micro of levels. As you can imagine, each model fits some business models better than others: large corporations might benefit from the internal cloud, whereas smaller businesses will most likely use external hosting or traditional SaaS. As cloud computing continues to evolve, businesses will shift back and forth among these four major paradigms.