2019, 28(7):1-8. DOI: 10.15888/j.cnki.csa.006988
Abstract:Spatial interpolation analysis algorithms transform measurement data at discrete points into a continuous data surface, allowing the distribution of continuous data surfaces to be compared with other spatial phenomena. They have a wide range of applications in spatial information, especially in geographic information. The interpolation principles and application scenarios of spatial interpolation algorithms such as the Thiessen polygon method, inverse distance weighted interpolation, spline interpolation, and Kriging interpolation are reviewed, and the progress and future research directions of spatial interpolation analysis algorithms are discussed.
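As an illustration, the inverse distance weighted method mentioned above estimates the value at an unsampled location as a weighted average of nearby samples, with weights decaying as a power of distance. A minimal sketch (the power parameter and sample layout are illustrative, not taken from the paper):

```python
import math

def idw(samples, x, y, power=2):
    """Inverse distance weighted estimate at (x, y) from (xi, yi, zi) samples."""
    num = den = 0.0
    for xi, yi, zi in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0.0:
            return zi  # query coincides with a sample point
        w = 1.0 / d ** power
        num += w * zi
        den += w
    return num / den
```

At a point equidistant from two samples the estimate is their mean; larger `power` values localize the surface more strongly around each sample.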
2019, 28(7):9-16. DOI: 10.15888/j.cnki.csa.007000
Abstract:To cope with the real-time solution requirements of the Vehicle Routing Problem with Time Windows (VRPTW) in the era of big data, this study proposes an improved parallel ant colony algorithm based on the Spark platform. At the algorithm level, an improved state transition rule and a roulette selection mechanism are used to construct the initial solution, k-opt local search is used to optimize path construction, and the improved pheromone update strategy of the max-min ant colony algorithm is applied. At the implementation level, the ant colony is encapsulated in an RDD and operated through the Spark API to realize distributed solution construction. Experimental results on the Solomon and Gehring & Homberger benchmarks show that the proposed algorithm improves both accuracy and speed on large-scale problems.
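The roulette selection mechanism used during solution construction picks the next customer with probability proportional to its transition weight. A minimal sketch of that primitive (the candidate names and weights are illustrative):

```python
import random

def roulette_select(candidates, weights, rng=random):
    """Pick one candidate with probability proportional to its weight."""
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for cand, w in zip(candidates, weights):
        acc += w
        if r < acc:
            return cand
    return candidates[-1]  # guard against floating-point round-off
```

A candidate with zero weight is never chosen, and one holding all the weight is always chosen, which is what the ant's state transition rule relies on.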
2019, 28(7):17-25. DOI: 10.15888/j.cnki.csa.006976
Abstract:To balance cruise speed and obstacle-avoidance performance for mobile robots controlled by a membrane controller, an adaptive-cruise-speed obstacle avoidance method based on Enzymatic Numerical P Systems (ENPS) is proposed. The method uses membranes to fuse the distance values of multiple sensors and adjusts the cruise speed according to the fused value. It helps mobile robots avoid obstacles effectively while keeping a high speed when no obstacle is present. Both simulation and physical experiments on a Pioneer 3-DX robot show that the enzymatic numerical membrane controller designed by the proposed method is feasible and performs better.
2019, 28(7):26-34. DOI: 10.15888/j.cnki.csa.006977
Abstract:A knowledge graph describes the entities that exist in the real world and the relationships between them. Since Google introduced the "Google Knowledge Graph" in 2012, knowledge graphs have received widespread attention in academia and industry. Aiming at the lack of systematic knowledge organization in the field of education, the Educational Assessment Knowledge Graph (EAKG) for high schools is constructed. Its construction includes a schema layer built with ontology technology and a data layer built on the schema layer's structure. Compared with traditional knowledge graphs constructed by web crawling and similar techniques, the graph constructed in this study has a clear logical structure, and the relationships between entities follow the schema-layer definitions. EAKG supports knowledge sharing, knowledge reasoning, knowledge representation learning, and other tasks in the field. Experimental results on real simulated test data show that the EAKG built with a domain ontology as its schema layer outperforms one built from data facts alone on embedded representation learning tasks such as entity link prediction (test paper score prediction and knowledge point score prediction) and triplet classification. The experiments show that introducing a domain ontology provides useful guidance for knowledge graph representation learning.
2019, 28(7):35-43. DOI: 10.15888/j.cnki.csa.006984
Abstract:With the increasing demand for cloud services, the cloud service market has developed rapidly, but trustworthiness problems have emerged alongside it. Various cloud service trust evaluation models have been proposed for this problem at home and abroad, using direct and indirect trust, user evaluation similarity, and preference similarity in the trustworthiness assessment process. However, the models in existing research cannot satisfy basic properties such as slow growth and rapid decline of trust during evaluation. This paper introduces a sliding window into cloud service trustworthiness evaluation to strengthen the whole process: the sliding window is used to satisfy the slow-growth property, and simulation experiments verify that it solves the problem effectively.
2019, 28(7):44-50. DOI: 10.15888/j.cnki.csa.006991
Abstract:Aiming at the service security, load balancing, and scalability problems faced by land archive systems designed with the traditional SOA architecture, a distributed archive system based on a microservice architecture is designed and implemented. Following the microservice idea, the system's functions are divided into fine-grained microservice components; an authentication service module between microservices realizes secure access control; and the soft load problem is solved with a service registry, a service gateway, and the Spring Cloud framework. A Docker microservice cluster provides independent deployment, running, and scaling of the microservice components. An inverted index of the archive data is built, improving the speed and accuracy of archive data queries.
2019, 28(7):51-57. DOI: 10.15888/j.cnki.csa.006982
Abstract:Visual acuity is one of the most important indicators of group health and a vital survey item for building a healthy city. Traditional methods of surveying group vision have limitations. In this study, pedestrians' face attributes in surveillance video are analyzed with deep learning. We identify the number and proportion of visual impairments in the public, break the statistics down by gender, and use them as a sample index of regional population health. For face attribute recognition in video, a convolutional neural network for face detection locates pedestrians' faces; on this basis, an improved convolutional neural network for face analysis recognizes gender and whether glasses are worn. Finally, a regional vision data display system based on Baidu Map is built, displaying the proportion of male and female visual impairment by street and region on the Web, which lays the foundation for practical application. The experimental results and system demonstration show that the proposed method can effectively identify group visual impairment and provides a new approach to group visual health surveys.
2019, 28(7):58-64. DOI: 10.15888/j.cnki.csa.006971
Abstract:In order to meet the needs of science and technology policy research, China Association for Science and Technology designs and implements a policy database system. This study first introduces the overall design scheme and system workflow of the science and technology policy database. Then it introduces the system components in detail. The system consists of three subsystems:data acquisition subsystem, data cleaning subsystem and data analysis subsystem. The data acquisition subsystem is based on the Scrapy framework for designing manageable web crawlers for a large number of heterogeneous sites, as well as ABBYY FineReader-based OCR (Optical Character Recognition) for historical documentation. The data cleaning subsystem implements functions such as data deduplication, non-correlated data identification, and data attribute defect recognition based on machine learning algorithms. The data analysis subsystem further carries out text classification, association analysis and full-text search for the effective policies. Since its launch in October 2018, the system has collected 564 749 pieces of data from 226 data sources. After data cleaning, it stores 404 083 pieces of data, which can strongly support the research of science and technology policy.
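The deduplication step in the data cleaning subsystem can be illustrated with a simple content-hash approach; this is a sketch of the general idea only, since the paper's actual cleaning pipeline is based on machine learning algorithms not shown here:

```python
import hashlib

def dedupe(records):
    """Keep the first copy of each record, comparing by a normalized text hash."""
    seen, kept = set(), []
    for text in records:
        # Collapse whitespace and case so trivially reformatted copies collide.
        key = hashlib.sha256(" ".join(text.split()).lower().encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            kept.append(text)
    return kept
```

Hashing a normalized form catches exact and near-exact duplicates cheaply; fuzzier duplicates are what the learned components of such a subsystem would have to handle.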
2019, 28(7):65-71. DOI: 10.15888/j.cnki.csa.006968
Abstract:In the modern power industry, a large amount of equipment must be purchased every year for grid construction projects. Because grid construction is complicated and requires large quantities of materials and equipment, the material contracts signed with many suppliers are difficult to manage. To improve contract-signing efficiency and ensure the progress and quality of grid construction projects, this article applies blockchain technology to material contract management in the power industry: a consortium chain for contract signing is established between the grid enterprise and equipment suppliers, and a Practical Byzantine Fault Tolerance (PBFT) consensus mechanism based on an improved Delegated Proof of Stake (DPoS) is proposed. This mechanism uses the number of contract awards to dynamically authorize nodes and optimizes the proxy-node selection strategy. An electronic contract realized with blockchain technology not only has the security of traditional electronic contracts based on cryptographic algorithms, but also offers multi-party cooperation, tamper resistance, traceability, transparency, and authenticity, making the signing and follow-up management of material contracts more convenient and secure.
2019, 28(7):72-78. DOI: 10.15888/j.cnki.csa.006964
Abstract:This paper introduces OpenNMS, an open-source enterprise-level network management platform, and its implementation at the Hunan Provincial Center for Disease Control and Prevention (CDC). We use OpenNMS for real-time monitoring of the network equipment, servers, and applications in the data center of the Hunan Provincial CDC. Especially with limited funds and without SMS gateways or private e-mail servers, network bottlenecks and faults are discovered in a timely manner at minimal cost. HP Data Protector (hereinafter referred to as DP) is HP's backup software, providing reliable data protection and high accessibility for fast-growing business data. This paper also describes DP's reporting and alerting capabilities and uses a DP alarm as an example to demonstrate the e-mail alarm function of OpenNMS.
2019, 28(7):79-84. DOI: 10.15888/j.cnki.csa.006998
Abstract:This work studies situation visualization techniques, covering popular 2D and 3D display technologies, the transformation between WGS84 coordinates and Mercator projection coordinates, message middleware, an AAR archive system, and a database API gateway. A linked 2D/3D situation visualization system is designed and implemented. The system uses message middleware to synchronize simulation process information, the database API gateway to manage the 2D and 3D entity models, and the AAR archive system to record and play back the simulation process. Tests demonstrate that the situation visualization system meets the simulation mission's requirements and that the situations displayed by the 2D and 3D parts stay linked.
2019, 28(7):85-90. DOI: 10.15888/j.cnki.csa.006986
Abstract:Garment performance is a characteristic specialty of art colleges, and organizing and managing its independent enrollment examination is arduous and complicated. At present, the level of information management is relatively low, facing the double test of data security and examination fairness. Aiming at the actual demand for management informatization of the independent enrollment examination for the garment performance major in art colleges, a lightweight independent enrollment examination system is proposed. Combined with the functional issues to be solved, the core ideas of the system design and the key technologies of its implementation are given. Practice shows that the proposed solution is feasible and has high practical value.
2019, 28(7):91-95. DOI: 10.15888/j.cnki.csa.006939
Abstract:For real-time databases, the effects of batch reading and writing, concurrent operation, large single reads and writes, memory usage, and network usage on real-time performance are analyzed. On this basis, a test system for real-time database function and performance testing is designed. The software part of the test system is implemented with Qt/C++ on Linux. Practical application shows that the test system is effective, greatly improves the accuracy of test results and test efficiency, and has a friendly, easy-to-operate interface.
2019, 28(7):96-100. DOI: 10.15888/j.cnki.csa.006989
Abstract:To realize the control of a collaborative robot, its control system was studied. On the premise of ensuring robustness and real-time performance, a control system built on Ubuntu combined with the Robot Operating System (ROS) is proposed, with Controller Area Network (CAN) communication connecting the collaborative robot. Finally, simulation experiments and physical robot control experiments verify the application effect of the control system. The experimental results show that the system can drive the collaborative robot through path planning, establishes reliable communication between the upper and lower computers, and is modular, highly portable, clearly structured, and low-latency.
2019, 28(7):101-108. DOI: 10.15888/j.cnki.csa.006975
Abstract:Crowd animation has been researched and applied in many domains in recent years, such as robotics, movies, and games. However, traditional techniques for creating crowd animation all require complex computation for motion planning or collision avoidance, so their computational efficiency is low. This paper presents a new algorithm based on Markov Decision Processes (MDPs) that generates collision-free motion trajectories for all agents without any collision detection. It also presents an improved value iteration algorithm for solving the state values of the MDPs. We test the improved value iteration algorithm on grid maps; the experimental results show that it outperforms both value iteration using Euclidean distance as the heuristic and Dijkstra's algorithm. Crowd animation experiments in three-dimensional (3D) scenes show that the proposed trajectory generation algorithm moves all agents to their goal positions without collision; moreover, the trajectories differ across runs, which makes the crowd animation more lifelike.
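Standard value iteration, which the paper's improved variant builds on, repeatedly backs up each state's value from its successors until the updates fall below a tolerance. A minimal sketch on a deterministic 1-D corridor (the MDP here is illustrative, not the paper's grid-map formulation):

```python
def value_iteration(states, actions, step, reward, gamma=0.9, tol=1e-6):
    """Iterate Bellman backups until the largest value change is below tol."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(reward(s, a) + gamma * V[step(s, a)] for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Illustrative corridor: states 0..3, the goal is state 3, moves clamp at the ends.
states = range(4)
actions = (-1, +1)
step = lambda s, a: min(max(s + a, 0), 3)
reward = lambda s, a: 1.0 if step(s, a) == 3 else 0.0
```

Values increase toward the goal, so a greedy policy over `V` walks each agent to state 3; on a grid map the same backup runs over 2-D cells and four or eight moves.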
2019, 28(7):109-113. DOI: 10.15888/j.cnki.csa.006892
Abstract:To improve the prediction accuracy of software aging, a New Adaptive Genetic Simulated Annealing algorithm (NAGSA) is proposed to optimize a BP neural network prediction model. The selection operator combines roulette selection with an elite retention strategy, and the fitness function is stretched by simulated annealing in late iterations. Compared with the traditional Adaptive Genetic Algorithm (AGA), NAGSA adaptively and nonlinearly adjusts the crossover and mutation probabilities when individual fitness is low, thereby optimizing the weights and thresholds of the BP neural network. A memory-leak code segment was injected into an online book-selling website to age it, and the aging data required for the experiments were collected for simulation training. The experimental results show that the NAGSA-optimized BP model improves prediction accuracy over the traditional Genetic Algorithm (GA), the traditional AGA, and the traditional adaptive genetic simulated annealing algorithm (NGSA), achieving excellent results and verifying the effectiveness of the proposed method in this application field.
2019, 28(7):114-120. DOI: 10.15888/j.cnki.csa.006969
Abstract:This study addresses traffic flow forecasting in the field of traffic data mining. An algorithm for feature selection of traffic flow data and for building a traffic flow prediction model based on data mining is presented. After cleaning the sampled data, classification and regression trees are used as base learners and a gradient boosting decision tree performs regression fitting to compute the feature importance of the traffic data, which serves as the basis for adaptive feature selection. Next, a clustering algorithm groups the selected feature data, reducing the sample size and making similar data more homogeneous. Finally, the cluster matched to the real-time data is used as the training set, and a support vector machine whose parameters are optimized by the Artificial Fish Swarm Algorithm (AFSA) predicts the traffic flow. Experimental data demonstrating the proposed algorithm and model are presented at the end of the paper.
2019, 28(7):121-126. DOI: 10.15888/j.cnki.csa.007001
Abstract:The Relief algorithm is a filtering feature selection algorithm that greedily maximizes instance margins under the nearest-neighbor classifier. Combined with a local weighting method, the authors previously proposed a Class Dependent RELIEF (CDRELIEF) algorithm that trains one feature weight vector per class, better reflecting feature relevance. However, each weight vector only measures the relevance of features to one class, and when used directly for classification the accuracy is not high enough. To apply CDRELIEF to classification, this study changes the weight update process and assigns a weight to each instance in the training set. By incorporating the instance weights into the weight update formula, the influence of data points far from the classification boundary and of outliers is excluded, improving classification accuracy. The proposed Instance Weighted CDRELIEF (IWCDRELIEF) is compared with CDRELIEF on multiple UCI binary classification datasets. The experimental results show that the proposed algorithm significantly improves on CDRELIEF.
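The classic Relief update that CDRELIEF and IWCDRELIEF build on rewards a feature when it separates an instance from its nearest miss (nearest neighbor of the other class) and penalizes it when it differs from the nearest hit (same class). A minimal sketch for binary labels, with an illustrative toy dataset in the test rather than anything from the paper:

```python
def relief(X, y):
    """Classic Relief feature weights for binary-labelled data (one pass over X)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    for i, xi in enumerate(X):
        hits = [X[j] for j in range(len(X)) if j != i and y[j] == y[i]]
        misses = [X[j] for j in range(len(X)) if y[j] != y[i]]
        near_hit = min(hits, key=lambda h: dist(xi, h))
        near_miss = min(misses, key=lambda m: dist(xi, m))
        for f in range(n_feat):
            # Larger weight: feature differs across classes, agrees within a class.
            w[f] += abs(xi[f] - near_miss[f]) - abs(xi[f] - near_hit[f])
    return w
```

On data where feature 0 separates the classes and feature 1 is noise, `w[0]` comes out clearly larger than `w[1]`; the papers' variants change how `w` is updated (per class, and per instance weight), not this overall hit/miss structure.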
2019, 28(7):127-132. DOI: 10.15888/j.cnki.csa.007003
Abstract:A scheduling algorithm for scientific workflows in commercial clouds is proposed. To address the fact that existing scheduling algorithms and slack-time allocation strategies do not consider the OR control structure, the Critical Activity Priority (CAP) is defined, and the Service Benefit Ratio (SBR) and a slack-time allocation strategy for activities are presented. Then, at both definition time and run time, a deadline distribution algorithm for activities is proposed. The results of this study provide a more suitable solution to the time-cost optimization problem in scientific workflow scheduling.
2019, 28(7):133-138. DOI: 10.15888/j.cnki.csa.006958
Abstract:The Support Vector Machine (SVM) has been widely used in credit evaluation as a non-parametric method. However, it cannot actively select attributes when processing high-dimensional data, which may reduce accuracy. To overcome this shortcoming, a credit evaluation model in which a C4.5 decision tree optimizes the SVM is constructed to select attributes and remove redundant ones. The model determines the optimal parameters through grid search and evaluates performance with the F-score and average accuracy on two public datasets. Empirical analysis shows that the proposed model effectively shortens the learning process and achieves higher classification accuracy and practicability than various traditional single models.
2019, 28(7):139-144. DOI: 10.15888/j.cnki.csa.006997
Abstract:The deep residual network, a variant of the convolutional neural network, is used in various fields due to its sound performance. Although it obtains higher accuracy by increasing network depth, there are still other ways to improve accuracy at the same depth. In this study, three optimizations are applied to the deep residual network: (1) dimension filling through a convolutional mapping; (2) a residual module built on the SELU activation function; (3) a learning rate that decays with the number of iterations. Testing the improved network on the Fashion-MNIST dataset shows that the proposed network model is superior to the traditional deep residual network in accuracy.
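Two of the three optimizations can be stated compactly: SELU applies a fixed scale to an exponential linear unit so that activations self-normalize, and the learning rate shrinks as iterations accumulate. A sketch of both pieces (the decay schedule's constants are illustrative, not the paper's):

```python
import math

# Canonical SELU constants (Klambauer et al., "Self-Normalizing Neural Networks")
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit: scale*x for x > 0, scaled ELU otherwise."""
    return SELU_SCALE * x if x > 0 else SELU_SCALE * SELU_ALPHA * (math.exp(x) - 1.0)

def decayed_lr(base_lr, iteration, decay_rate=0.95, decay_steps=1000):
    """Exponential learning-rate decay with the iteration count."""
    return base_lr * decay_rate ** (iteration / decay_steps)
```

For large negative inputs SELU saturates at `-SELU_SCALE * SELU_ALPHA` (about -1.758), which is what keeps activation statistics from drifting layer to layer.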
2019, 28(7):145-150. DOI: 10.15888/j.cnki.csa.006972
Abstract:Text classification is an important task in natural language processing, with wide applications such as question answering, topic classification, and sentiment analysis. Many methods exist for text classification, such as the Support Vector Machine (SVM) and naïve Bayes models. Typical neural network models in wide use are the Recurrent Neural Network (RNN) and the Text Convolutional Neural Network (TextCNN). In this study, sequence models and convolutional models for text classification are analyzed, and a hybrid model combining the two is proposed. Comparing the performance of the different models on an open dataset shows that the combined model outperforms either single model.
2019, 28(7):151-156. DOI: 10.15888/j.cnki.csa.007005
Abstract:Cloud computing resource scheduling is a key and complex scheduling problem in which many factors must be considered. To reduce cloud computing time, an Improved Particle Swarm Optimization (IPSO) algorithm is proposed. On top of a linearly decreasing inertia weight, a chaotic perturbation occasionally increases the inertia weight, helping the search escape local optima and regain global exploration. Meanwhile, to overcome the tendency of both algorithms to fall into local optima, the proposed algorithm combines the optimization strategies of particle swarm optimization and ant colony optimization. Matlab simulations and tests on practical examples show that the improved algorithm obtains a more accurate solution under the same conditions.
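The linearly decreasing inertia weight at the heart of the improvement shifts PSO from global exploration (large `w`) toward local refinement (small `w`) as iterations proceed. A minimal baseline sketch of that mechanism, without the paper's chaotic perturbation or ant colony hybridization (all constants here are common textbook defaults, not the paper's):

```python
import random

def pso(f, dim, n_particles=20, iters=200, w_max=0.9, w_min=0.4,
        c1=2.0, c2=2.0, seed=0):
    """Minimize f over R^dim with PSO using a linearly decreasing inertia weight."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters  # linearly decreasing inertia
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest
```

On a smooth test function such as the sphere, the swarm contracts onto the minimum as the inertia weight falls; the paper's chaotic perturbation would occasionally bump `w` back up to re-open the search.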
2019, 28(7):157-161. DOI: 10.15888/j.cnki.csa.006983
Abstract:The Patch-based Multi-View Stereopsis (PMVS) dense reconstruction method can automatically ignore outliers and obstacle points and, compared with other 3D dense reconstruction algorithms, is more accurate, simple, and efficient. However, holes appear in regions with sparse texture, and the existing candidate-point selection strategy may cause edge defects and local detail distortion. To solve these problems, this study proposes a pixel-interpolation feature point selection method based on the Scale Invariant Feature Transform (SIFT), which increases the feature points in texture-sparse regions and makes the feature points evenly distributed. A more reasonable candidate-point selection strategy is also proposed to reduce false matches. Experiments show that the proposed method not only ensures the reconstruction of sparse-texture regions, but also effectively eliminates mismatched points and improves reconstruction accuracy.
2019, 28(7):162-168. DOI: 10.15888/j.cnki.csa.006981
Abstract:A recommendation system predicts unknown information from a user's historical information. Sparsity of the user-item rating matrix is one of the main bottlenecks recommendation systems face, and cross-domain recommendation is an effective way to alleviate it. In this study, an Efficient Recommendation Algorithm based on effective Feature Subset selection (FSERA) is proposed. FSERA extracts subset information from the auxiliary domain to expand the target-domain data and then performs collaborative filtering recommendation for the target domain. K-means clustering is used to extract data from the auxiliary domain, reducing redundancy and noise and yielding an effective subset; this not only reduces the algorithm's complexity but also expands the target-domain data and improves recommendation accuracy. Experiments show that the method achieves higher recommendation accuracy than traditional methods.
2019, 28(7):169-173. DOI: 10.15888/j.cnki.csa.006993
Abstract:Simultaneous Localization And Mapping (SLAM) is a difficult problem in robotics, and the Rao-Blackwellized particle filter is widely used to solve it. In traditional implementations, a proposal distribution with high error requires a large number of sampled particles to fit the target distribution, and frequent resampling leads to gradual particle depletion, wasting substantial computing resources. In this study, the motion model and observation information are combined to optimize the proposal distribution and reduce the number of sampled particles, and adaptive resampling is introduced to reduce the number of resampling steps. In the implementation, a tree data structure stores the environment map. The experimental results show that the improved algorithm significantly improves computational efficiency, reduces storage consumption, and builds a more accurate map.
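Adaptive resampling of this kind is commonly triggered by the effective sample size of the particle weights: resample only when the weight distribution has degenerated. A minimal sketch (the 0.5 threshold is a conventional choice assumed here, not taken from the paper):

```python
def effective_sample_size(weights):
    """N_eff = 1 / sum(normalized_weight^2): N when uniform, 1 when degenerate."""
    total = sum(weights)
    return 1.0 / sum((w / total) ** 2 for w in weights)

def needs_resample(weights, ratio=0.5):
    """Resample only when N_eff drops below a fraction of the particle count."""
    return effective_sample_size(weights) < ratio * len(weights)
```

Skipping resampling while the weights are still well spread is exactly what limits particle depletion between observations.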
2019, 28(7):174-179. DOI: 10.15888/j.cnki.csa.006965
Abstract:The curse of dimensionality is a common problem in machine learning tasks. Feature selection algorithms can pick the optimal feature subset from the original dataset and reduce the feature dimension. A hybrid feature selection algorithm is proposed: first, the chi-square test is used as a filter to select important feature subsets and normalize their scale; then a wrapper, SBS-SVM, built from Sequential Backward Selection (SBS) and the SVM, selects the optimal feature subset, maximizing classification performance while effectively reducing the number of features. In the experiments, SBS-SVM in the wrapper stage and two other algorithms are tested on three classical datasets. The results show that the SBS-SVM algorithm performs better in classification performance and generalization ability.
2019, 28(7):180-183. DOI: 10.15888/j.cnki.csa.006999
Abstract:The complex environment of the abdominal aorta inevitably causes weak edges and intensity inhomogeneity. To address this, a level-set algorithm for segmenting the aorta in CT scans based on local edge features is proposed. The energy function is minimized by assigning weighting factors according to the relative importance of the inside and outside of the evolving contour. The results show that the new approach is more accurate and stable and obtains satisfactory results.
2019, 28(7):184-190. DOI: 10.15888/j.cnki.csa.006811
Abstract:Existing personalized privacy anonymization techniques cannot prevent numerical sensitive attributes from proximity breaches. To address this, an anonymity model called the (εi, k)-anonymity model, based on clustering, is proposed. First, the model divides the sensitive attribute values, in ascending order, into several sub-intervals using a clustering method; then it proposes an (εi, k)-anonymity principle for numerical sensitive attributes against proximity breaches; finally, a maximum-bucket-first algorithm is proposed to implement the principle. The experimental results show that, compared with existing schemes for resisting proximity breaches, the proposed anonymization scheme reduces information loss, improves execution efficiency, and effectively reduces the risk of user privacy leakage.
2019, 28(7):191-198. DOI: 10.15888/j.cnki.csa.006980
Abstract:Scheduling of construction machinery customer service involves service vehicles, service personnel, and engineering machinery. This study establishes a model that minimizes total completion time under the premises that service resources are sufficient and each engineer is assigned at most one task, taking path length, skill matching, service time, and other factors into account. Treating the combination of service vehicle, service person, and engineering machinery as a special tripartite graph matching problem, a hybrid genetic algorithm based on minimum-weight bipartite matching is proposed, introducing a roulette selection operator with an embedded elite strategy and a dynamic mutation probability. The superiority of the algorithm is demonstrated in a large-scale case study.
2019, 28(7):199-205. DOI: 10.15888/j.cnki.csa.006961
Abstract:Since today's railway engineering staff cannot be monitored in real time during operation and safety incidents occur from time to time, seven main behaviors are analyzed, taking railway inspectors as an example. An embedded device with an integrated accelerometer is worn by every worker to collect behavior data; features are extracted, and four classifiers, the C4.5 decision tree, random forest, KNN, and SVM, are used in the experiments. The results show that the SVM classifier performs best, with a behavior recognition accuracy of 99.2%. This research has engineering application value for eliminating safety hazards to railway field engineering staff.
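Feature extraction from a raw accelerometer stream is typically done over fixed-length windows before classifiers such as those above are applied. A minimal sketch of per-window statistics (the window length and the specific statistics are illustrative; the paper's feature set is not specified here):

```python
import math

def window_features(signal, win=50):
    """Mean, standard deviation, and range per non-overlapping window."""
    feats = []
    for i in range(0, len(signal) - win + 1, win):
        w = signal[i:i + win]
        mean = sum(w) / win
        std = math.sqrt(sum((v - mean) ** 2 for v in w) / win)
        feats.append((mean, std, max(w) - min(w)))
    return feats
```

Each tuple becomes one training sample for the classifier; per-axis versions of the same statistics extend this to three-axis accelerometer data.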
2019, 28(7):206-213. DOI: 10.15888/j.cnki.csa.006974
Abstract:When prosecutors handle theft cases, inaccurate characterization of the case and a lack of sentencing experience lead to insufficiently accurate sentencing recommendations. To enable prosecutors to give more accurate recommendations and provide an auxiliary sentencing reference, the legal-document theory and knowledge system of theft cases are collated and analyzed, implicit and deep relationships are mined and reasoned over, and an ontology model is built. This study proposes an ontology-based method for constructing knowledge graphs of legal documents and designs custom inference rules, realizing knowledge-based retrieval of similar sentencing cases from theft-case legal documents and obtaining ideal test results. The research shows that the ontology-based knowledge graph of theft-case legal documents, combined with intelligent reasoning technology, provides prosecutors with similar-case sentencing references and assists them in giving more reasonable sentencing suggestions.
2019, 28(7):214-220. DOI: 10.15888/j.cnki.csa.006942
Abstract:How to accurately predict the aging trend of software and adopt a corresponding recovery strategy is a key problem in preventing software aging. To solve it, this study designs a resource prediction method based on the Recurrent Neural Network (RNN) and its variant, Long Short-Term Memory (LSTM), and builds an accelerated aging test platform to model and forecast the aging of a Web server caused by memory leaks. The experiments show that the LSTM prediction model is superior to traditional models for time-series modeling of aging parameters, with predictions closer to the actual values and higher accuracy, which can effectively improve the reliability and availability of software systems.
2019, 28(7):221-227. DOI: 10.15888/j.cnki.csa.006973
Abstract:For the traditional Lucas-Kanade (LK) method, it is hard to capture the ball center in the ball-plate system, and the tracking accuracy is limited. An improved LK method based on the Hough transform is proposed. The traditional LK method selects corners where the gray-scale change in the image is obvious; such corners are usually distributed along the edge of the circle and cannot reach the center of the circle. In the improved LK method, corner selection transforms the two-dimensional XY coordinate system into a three-dimensional ABR parameter space; the center is determined by the accumulator count and accumulated weight, and the ball is tracked using the gray scale at this center point. The results show that the improved optical flow method outperforms both the traditional LK optical flow method and the histogram method.
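The circle-center step above rests on standard Hough voting in (a, b, r) space; a minimal sketch with the radius fixed for brevity follows. Edge points vote for candidate centers, and the accumulator peak gives the center. The image size, radius, and synthetic edge points are toy assumptions, not the paper's setup.

```python
import numpy as np

# Circle Hough transform sketch: each edge point (x, y) votes for candidate
# centers (a, b) = (x - r*cos(theta), y - r*sin(theta)) at a fixed radius r;
# the accumulator maximum is taken as the ball center.

def hough_center(points, r, size):
    acc = np.zeros((size, size), dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    for x, y in points:
        a = np.round(x - r * np.cos(thetas)).astype(int)
        b = np.round(y - r * np.sin(thetas)).astype(int)
        ok = (a >= 0) & (a < size) & (b >= 0) & (b < size)
        np.add.at(acc, (a[ok], b[ok]), 1)   # accumulate votes
    return np.unravel_index(acc.argmax(), acc.shape)

# Synthetic edge: circle of radius 10 centered at (30, 40) in a 64x64 image.
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts = [(30 + 10 * np.cos(u), 40 + 10 * np.sin(u)) for u in t]
print(hough_center(pts, r=10, size=64))  # (30, 40)
```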
2019, 28(7):228-233. DOI: 10.15888/j.cnki.csa.006967
Abstract:Aiming at the problem of encoder failure of dual steering-wheel AGVs under rough road conditions, a scheme using a low-cost RGB-D camera as a visual odometer is proposed. This method avoids direct kinematic modeling of the dual steering-wheel AGV and solves the problem of excessive accumulated error in odometry trajectory estimation. In this study, the pinhole camera model is first used to find the correspondence between spatial points and pixels. Then the ORB operator is used to extract and match image features, after which the Iterative Closest Point (ICP) method is used for pose estimation. A visual odometer is built on the Linux + ROS platform and fused with lidar for localization using a particle filter algorithm. The positioning effects of the encoder and the visual odometer are compared in different environments, and the robustness of the whole system is verified.
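The ICP pose-estimation step reduces, for matched 3D point pairs (here, ORB matches back-projected through the pinhole model), to the SVD-based closed form for a rigid transform; a minimal sketch, verified by recovering a known rotation and translation:

```python
import numpy as np

# Closed-form rigid alignment (the core of one ICP iteration with known
# correspondences): find R, t minimizing ||R @ P + t - Q|| over matched
# columns of the 3xN point sets P and Q.

def rigid_transform(P, Q):
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t

# Recover a known rotation about z plus a translation.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([[1.0], [2.0], [0.5]])
P = np.random.default_rng(1).normal(size=(3, 20))
Q = R_true @ P + t_true
R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

In a full visual odometer this estimate would be refined over iterated correspondence updates and chained frame to frame, which is where the accumulated error discussed above arises.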
2019, 28(7):234-239. DOI: 10.15888/j.cnki.csa.007006
Abstract:Given a long, untrimmed video consisting of multiple action instances and complex background content, temporal action detection needs not only to recognize the action categories but also to localize the start time and end time of each instance. To this end, a temporal action detection network based on two-stream convolutional networks is proposed. First, the two-stream convolutional networks are used to extract the feature sequence of the video, and then Temporal Actionness Grouping (TAG) is used to generate proposals. To construct high-quality proposals, each proposal is fed into a boundary regression network to correct its boundaries and bring them closer to the ground truth; the proposal is then extended to a three-segment feature design with context information, and finally a multi-layer perceptron is used to identify the action. The experimental results show that the proposed algorithm achieves high mAP on the THUMOS 2014 and ActivityNet v1.3 datasets.
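The proposal-generation step can be sketched in the spirit of actionness grouping: threshold per-snippet actionness scores and merge consecutive high-score snippets into temporal intervals, tolerating short gaps. The threshold and gap tolerance below are illustrative assumptions, not the paper's TAG settings.

```python
# Group snippets whose actionness score passes a threshold into (start, end)
# proposals, allowing up to max_gap consecutive low-score snippets inside one.

def group_proposals(scores, thresh=0.5, max_gap=1):
    proposals, start, gap = [], None, 0
    for i, s in enumerate(scores):
        if s >= thresh:
            if start is None:
                start = i                # open a new proposal
            gap = 0
        elif start is not None:
            gap += 1
            if gap > max_gap:            # gap too long: close the proposal
                proposals.append((start, i - gap))
                start, gap = None, 0
    if start is not None:                # close a proposal running to the end
        proposals.append((start, len(scores) - 1 - gap))
    return proposals

scores = [0.1, 0.8, 0.9, 0.2, 0.7, 0.1, 0.1, 0.6, 0.6]
print(group_proposals(scores))  # [(1, 4), (7, 8)]
```

Each resulting interval would then go through boundary regression and context extension as described above.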
2019, 28(7):240-245. DOI: 10.15888/j.cnki.csa.006955
Abstract:With the increasing number of cattle, the single-angle feature coding identity authentication method can no longer meet the current demand in terms of data capacity. A cascade structure is used to detect the cattle's face and then estimate its attitude angle, which builds a solid feature base for multi-angle feature coding. The experimental results show that the cascade structure achieves high accuracy in both the face detection and attitude angle estimation tasks.
2019, 28(7):246-251. DOI: 10.15888/j.cnki.csa.006996
Abstract:In the Android system, to avoid being killed by the system, some applications occupy the system's CPU, memory, and other resources in the background in various ways to keep themselves alive. This behavior accelerates power consumption. One way to keep alive in the background is to hold the Audiomix lock and play silent audio data. To address this behavior, we design a corresponding detection scheme. By modifying the Android system source code, we collect the audio data that applications are playing, and then check in real time whether an application keeps itself alive in the background by playing silent audio data. We analyze 50 Android applications in the experiment, and the results show that this method can detect such behavior effectively.
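The detection idea reduces to checking whether the PCM samples an app feeds to the mixer stay effectively silent while playback continues; a minimal sketch on synthetic buffers follows. The RMS threshold, minimum duration, and sample data are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Flag sustained near-zero-amplitude playback as a silent keep-alive candidate:
# compute the RMS of the collected PCM samples and compare to a small threshold.

def is_silent_keepalive(pcm, sample_rate, rms_thresh=1e-3, min_seconds=2.0):
    if len(pcm) < sample_rate * min_seconds:
        return False                     # not enough sustained playback yet
    rms = np.sqrt(np.mean(np.square(pcm.astype(np.float64))))
    return rms < rms_thresh

sr = 8000
silence = np.zeros(sr * 3)                                   # 3 s of silence
tone = 0.5 * np.sin(2 * np.pi * 440 * np.arange(sr * 3) / sr)  # audible tone
print(is_silent_keepalive(silence, sr), is_silent_keepalive(tone, sr))  # True False
```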