The rapid development of big data computing has made large-scale graph processing a fundamental computing model in both academia and industry, and it is applied in many real-world big data workloads, such as social network analysis, Web search, and product promotion. These workloads involve graphs with billions of vertices and trillions of edges, and this scale raises many challenges for large-scale graph processing. This paper introduces the essential features and challenges of large-scale graph processing and how billions of edges can be handled on a multi-core machine, for which we present out-of-core processing systems and semi-external-memory processing systems. The paper also summarizes the key technologies used in graph processing systems and discusses the future development of large-scale graph processing systems.
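As a rough illustration of the out-of-core idea mentioned above (not a description of any particular system), the following Python sketch streams an edge list from disk in batches and keeps only per-vertex degree counters in memory; the file name and the "src dst" line format are hypothetical.

    # Stream an edge list from disk in fixed-size batches; only O(|V|) degree
    # counters stay in memory, so the edge set itself never has to fit in RAM.
    from collections import defaultdict

    def streaming_degrees(path, batch_size=1_000_000):
        degrees = defaultdict(int)
        batch = []
        with open(path) as f:                 # e.g. "edges.txt", one "src dst" per line
            for line in f:
                src, dst = line.split()
                batch.append((src, dst))
                if len(batch) >= batch_size:
                    _count(batch, degrees)    # in a real system: parallelised per core
                    batch.clear()
        if batch:
            _count(batch, degrees)
        return degrees

    def _count(batch, degrees):
        for src, dst in batch:
            degrees[src] += 1
            degrees[dst] += 1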
We use a dynamic programming algorithm to establish a lower bound on the domination number of complete grid graphs of the form Cn □ Pm, that is, the Cartesian product of a cycle Cn and a path Pm, for m and n sufficiently...
ISBN (digital): 9783030148126
ISBN (print): 9783030148119; 9783030148126
Numerous approaches study the vulnerability of networks against social contagion. Graph burning studies how fast a contagion, modeled as a set of fires, spreads in a graph. The burning process takes place in synchronous, discrete rounds. In each round, a fire breaks out at a vertex, and the fire spreads to all vertices that are adjacent to a burning vertex. The selection of vertices where fires start defines a schedule that determines the number of rounds required to burn all vertices. Given a graph, the objective of an algorithm is to find a schedule that minimizes the number of rounds needed to burn the graph. Finding the optimal schedule is known to be NP-hard, and the problem remains NP-hard when the graph is a tree or a set of disjoint paths. The only known algorithm is an approximation algorithm for disjoint paths, which has an approximation ratio of 1.5. We present approximation algorithms for graph burning. For general graphs, we introduce an algorithm with an approximation ratio of 3. When the graph is a tree, we present another algorithm with approximation ratio 2. Moreover, we consider the setting where the graph is a forest of disjoint paths. In this setting, when the number of paths is constant, we provide an optimal algorithm that runs in polynomial time. When the number of paths is more than a constant, we provide two approximation schemes: first, under a regularity condition where the paths have asymptotically equal lengths, we show that the problem admits a fully polynomial-time approximation scheme. Second, for the general setting where the regularity condition does not necessarily hold, we provide another approximation scheme that runs in time polynomial in the size of the graph.
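To make the burning process concrete, here is a minimal Python sketch that simulates a given schedule on an undirected graph stored as an adjacency dict (the representation is our choice, not the paper's). It follows the common convention that in each round the fire first spreads from all burning vertices and then a new fire breaks out at the next scheduled vertex.

    # Simulate graph burning for a given schedule (one new fire per round).
    # Returns the number of rounds used, or None if the schedule is not long
    # enough to burn every vertex of the graph.
    def burn(graph, schedule):
        burning = set()
        for rounds, new_fire in enumerate(schedule, start=1):
            burning |= {v for u in burning for v in graph[u]}   # fire spreads
            burning.add(new_fire)                               # new fire breaks out
            if len(burning) == len(graph):
                return rounds
        return None

    # Example: on the path 0-1-2-3-4,
    # burn({0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}, [1, 4, 0]) == 3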
The use of networks for modelling and analysing relations among data is currently growing. Recently, the use of a single network for capturing all the aspects of some complex scenarios has shown limitations. Consequently, it has been proposed to use Dual Networks (DN), a pair of related networks, to analyse complex systems. The two graphs in a DN have the same set of vertices and different edge sets. Common subgraphs among these networks may convey insights about the modelled scenarios. For instance, the detection of the Top-k Densest Connected subgraphs, i.e. a set of k subgraphs having the largest density in the conceptual network which are also connected in the physical network, may reveal sets of highly related nodes. After proposing a formalisation of the problem, we propose a heuristic to find a solution, since the problem is computationally hard. A set of experiments on synthetic and real networks is also presented to support our approach.
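As a small illustration of the objective (not the heuristic proposed in the paper), the Python sketch below scores a candidate vertex set for a dual network given as two adjacency dicts: the score is the density in the conceptual graph, provided the set induces a connected subgraph of the physical graph.

    from collections import deque

    def connected_in(graph, nodes):
        # BFS restricted to `nodes`; True iff they induce a connected subgraph.
        nodes = set(nodes)
        start = next(iter(nodes))
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    queue.append(v)
        return seen == nodes

    def density(graph, nodes):
        # Edges with both endpoints in `nodes`, divided by the number of nodes.
        nodes = set(nodes)
        edges = sum(1 for u in nodes for v in graph[u] if v in nodes) / 2
        return edges / len(nodes)

    def dual_score(conceptual, physical, nodes):
        # Density in the conceptual graph, valid only if connected in the physical one.
        return density(conceptual, nodes) if connected_in(physical, nodes) else float("-inf")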
In the influence maximization (IM) problem, we are given a social network and a budget k, and we look for a set of k nodes in the network, called seeds, that maximizes the expected number of nodes that are reached by an influence cascade generated by the seeds, according to some stochastic model for influence diffusion. Extensive studies have been done on the IM problem since its definition by Kempe et al. [26]. However, most of the work focuses on the non-adaptive version of the problem, where all the k seed nodes must be selected before the cascade starts. In this paper we study adaptive IM, where the nodes are selected sequentially one by one, and the decision on the i-th seed can be based on the observed cascade produced by the first i − 1 seeds. We focus on the full-adoption feedback, in which we can observe the entire cascade of each previously selected seed, under the independent cascade model, where each edge is associated with an independent probability of diffusing influence. Previous works showed that there are constant upper bounds on the adaptivity gap, which compares the performance of an adaptive algorithm against a non-adaptive one, but the analyses used to prove these bounds only work for specific graph classes such as in-arborescences, out-arborescences, and one-directional bipartite graphs. Our main result is the first sub-linear upper bound that holds for any graph. Specifically, we show that the adaptivity gap is upper-bounded by ∛n + 1, where n is the number of nodes in the graph. Moreover, we improve the known upper bound for in-arborescences from 2e/(e − 1) ≈ 3.16 to 2e²/(e² − 1) ≈ 2.31. Then, we consider (β, γ)-bounded-activation graphs, where all nodes but β influence in expectation at most γ ∈ [0, 1) neighbors each; for this class of influence graphs we show that the adaptivity gap is at most √β + 1/(1 − γ). Finally, we study α-bounded-degree graphs, that is the class of u...
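For reference, the independent cascade model referred to above can be simulated in a few lines. In this hypothetical Python sketch, the network is an adjacency dict of out-neighbours and prob maps each directed edge to its diffusion probability; averaging the size of the returned set over many runs estimates the expected spread of a seed set.

    import random
    from collections import deque

    def independent_cascade(graph, prob, seeds, rng=random.random):
        # One cascade: each edge (u, v) gets a single chance to activate v,
        # succeeding with probability prob[(u, v)].
        active = set(seeds)
        frontier = deque(seeds)
        while frontier:
            u = frontier.popleft()
            for v in graph[u]:
                if v not in active and rng() < prob[(u, v)]:
                    active.add(v)
                    frontier.append(v)
        return active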
In this contribution we consider a variant of the vertex cover problem in temporal graphs that has been recently introduced to summarize timeline activities in social networks. The problem is NP-hard, even when the time domain considered consists of two timestamps. We further analyze the complexity of this problem, focusing on temporal graphs of bounded degree. We prove that the problem is NP-hard when (1) each vertex has degree at most one in each timestamp and (2) each vertex is connected with at most three neighbors, has degree at most two in each timestamp and the time domain consists of three timestamps. On the other hand, we prove that the problem is in P when each vertex is connected with at most two neighbors. Then we present a fixed-parameter algorithm for the restriction where we bound the number of interactions in each timestamp and the length of the interval where a vertex has incident temporal edges. © 2023 Elsevier B.V. All rights reserved.
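The degree restrictions above refer to two different quantities: the degree of a vertex within a single timestamp and the number of distinct neighbours it meets over the whole time domain. The Python sketch below (temporal graph stored, for illustration only, as a dict from timestamp to edge list) computes both.

    from collections import defaultdict

    def degree_bounds(temporal_graph):
        # temporal_graph: {timestamp: [(u, v), ...]}
        per_step = 0                    # max degree inside any single timestamp
        neighbours = defaultdict(set)   # distinct neighbours over the whole time domain
        for t, edges in temporal_graph.items():
            deg = defaultdict(int)
            for u, v in edges:
                deg[u] += 1
                deg[v] += 1
                neighbours[u].add(v)
                neighbours[v].add(u)
            per_step = max(per_step, max(deg.values(), default=0))
        overall = max((len(s) for s in neighbours.values()), default=0)
        return per_step, overall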
In a right-angle crossing (RAC) drawing of a graph, each edge is represented as a polyline and edge crossings must occur at an angle of exactly 90°, where the number of bends on such polylines is typically restricted ...
For the Traveling Salesman Problem (TSP), many algorithms have been developed, including heuristic solvers such as nearest neighbors and ant colony optimization algorithms. In this work, the ATT48 and EIL101 instances are examined to better understand the difference between biased and unbiased tour construction algorithms when combined with the 2-opt local search operator. First, a sample of tours is constructed. Then, we examine the frequencies of global edges of different sizes using n-grams. Using 2-opt as the tour improvement algorithm, we analyze randomly initialized local optima compared to nearest neighbors local optima, as well as ant colony solutions with and without 2-opt. This comparison serves to better understand how these different methods relate to the global optimum. We also suggest some ways the algorithms may be adapted to take advantage of the global frequencies, particularly the ant colony optimization algorithm.
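For reference, a bare-bones first-improvement 2-opt pass in Python might look as follows; the tour is a list of city indices and dist a symmetric distance matrix. This is a generic sketch of the operator, not the exact implementation used in the study.

    def two_opt(tour, dist):
        # First-improvement 2-opt: replace edges (a, b) and (c, d) by (a, c) and
        # (b, d), i.e. reverse the segment tour[i:j], whenever this shortens the tour.
        n = len(tour)
        improved = True
        while improved:
            improved = False
            for i in range(1, n - 1):
                for j in range(i + 1, n + 1):
                    a, b = tour[i - 1], tour[i]
                    c, d = tour[j - 1], tour[j % n]
                    if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                        tour[i:j] = reversed(tour[i:j])
                        improved = True
        return tour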
In many cooperative networks, individuals participate actively as long as they recognize sufficient value in participation, which depends not only on the number but also on the attributes of the other participating members. In this paper, we present a generalized model of individuals' participation in such networks, and a strategy to maximize the number of participating individuals. Unlike most of the existing literature, our model incorporates both the network structure and the heterogeneity of individuals in terms of their attributes and resources. We consider that each individual possesses a subset of the available resources (attributes), which it shares with its neighbors as long as the neighbors reciprocate and provide the missing resources to the individual. However, an individual leaves the network if it cannot find all the resources in its neighborhood. To model this phenomenon, we introduce a graph-theoretic notion of the (r, s)-core, which is the sub-network consisting of only those individuals who can access all the resources by collaborating with their neighbors. Since the disengagement of an individual could initiate a cascading withdrawal of more individuals from the network, one of our main goals is to prevent this unraveling and maximize the number of participating individuals. For this purpose, we utilize the notion of anchors: individuals that continue to participate (due to incentives) even if they cannot find all of the resources in their neighborhood. By introducing only a few anchors, we can significantly increase the number of participating individuals, which in our model corresponds to increasing the size of the (r, s)-core. We formulate and thoroughly analyze the anchors' selection problem by classifying the cases in which the problem is polynomial-time solvable, NP-complete, and inapproximable. Further, we provide greedy and metaheuristic search algorithms to compute a set of anchors and evaluate our results on various networks. Our results are applicable to a...
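The abstract does not spell out the exact roles of the parameters r and s, but the cascading-withdrawal dynamics it describes can be illustrated with a simple fixed-point peeling in Python: non-anchor individuals are repeatedly removed whenever their own resources plus those of their remaining neighbours do not cover the full resource set. This is only a sketch of the idea, not the algorithm studied in the paper.

    def participating_core(graph, resources, all_resources, anchors=frozenset()):
        # graph: adjacency dict; resources: vertex -> iterable of resources it holds;
        # all_resources: set of all required resources; anchors never withdraw.
        alive = set(graph)
        changed = True
        while changed:
            changed = False
            for v in list(alive):
                if v in anchors:
                    continue
                available = set(resources[v])
                for u in graph[v]:
                    if u in alive:
                        available.update(resources[u])
                if not all_resources <= available:
                    alive.discard(v)      # v withdraws; may trigger further withdrawals
                    changed = True
        return alive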