To get the out- and in-centrality measures, you need to first aggregate your network to the subgroup level (i.e., create a new network where the subgroups are the nodes), and then you can specify the type parameter as either "out" (the default) or "in".
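A minimal base-R sketch of that aggregation step (the toy network, the group lookup, and the i/j/w column names are all assumptions for illustration; with tnet loaded, degree_w with type="out" or type="in" would then report the scores directly):

```r
# Toy weighted edgelist: nodes 1-2 form subgroup 1, nodes 3-4 subgroup 2
net <- data.frame(i = c(1, 1, 2, 3, 4),
                  j = c(2, 3, 4, 4, 1),
                  w = c(1, 2, 1, 3, 1))
group <- c(1, 1, 2, 2)  # group[node id] -> subgroup id

# Translate node ids to subgroup ids and sum tie weights between subgroups
gnet <- aggregate(w ~ i + j,
                  data = data.frame(i = group[net$i],
                                    j = group[net$j],
                                    w = net$w),
                  FUN = sum)
gnet <- gnet[gnet$i != gnet$j, ]  # keep only inter-subgroup ties

# Weighted out- and in-strength per subgroup (the "output" column that
# degree_w(gnet, type = "out") / degree_w(gnet, type = "in") would give)
out_strength <- tapply(gnet$w, gnet$i, sum)
in_strength  <- tapply(gnet$w, gnet$j, sum)
```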

Best,

Tore

Thank you very much for your excellent work. It would be extremely useful to apply your measure of weighted centrality, as I have weighted interaction data. In particular, my work focuses on inter-group interaction, and I would like to ask whether it would be possible to get weighted in-degree and out-degree centrality measures for inter-subgroup relationships.

Thank you in advance

Best

Carmine

I do not know of a specific implementation of link / edge betweenness for weighted networks. However, you can alter the tnet code to handle this case. Below is one example of such code.

Let me know if you are able to use it.

Tore

edge_betweenness_w <- function(net, directed = NULL, alpha = 1) {
  if (is.null(attributes(net)$tnet))
    net <- as.tnet(net, type = "weighted one-mode tnet")
  if (attributes(net)$tnet != "weighted one-mode tnet")
    stop("Network not loaded properly")
  if (is.null(directed)) {
    tmp <- symmetrise_w(net, method = "MAX")
    directed <- (nrow(tmp) != nrow(net) | sum(tmp[, "w"]) != sum(net[, "w"]))
  }
  # Invert tie weights so that strong ties become short distances
  netTransformed <- net
  netTransformed[, "w"] <- (1/netTransformed[, "w"])^alpha
  g <- tnet_igraph(netTransformed, type = "weighted one-mode tnet", directed = directed)
  out <- data.frame(as_edgelist(g), betweenness = 0)
  colnames(out) <- c("i", "j", "betweenness")
  out[, "betweenness"] <- igraph::edge_betweenness(graph = g, directed = directed)
  return(out)
}

Thank you for your good work. I want to use link betweenness to analyse the intermediary role of certain links in weighted networks. However, tnet only provides a node betweenness function. Could you give me some suggestions for solving this problem?

many thanks.

Zhitao Zhang

The weight column is numeric, and the nodes are ids. I have also sent you the code and data. Let me know if you have more thoughts.

Thanks!

There seem to be some issues with your dataset. Are all the columns of the data.frame integer or numeric? Send me an email with the code and data if you have more issues.

Best,

Tore

Thank you for this awesome package. It is really useful. However, is there an upper limit to the number of nodes one can analyze using tnet? My data has 416 nodes, and every time I run it, I get an error. Specifically, here is the code I have used:

set2008df = read.table("test2008.txt", header=TRUE) # read the data; it is in the form of i, j, w and is read as a data.frame

set2008 closeness_w(set2008)

Error in unique(y[ind]) :

Value of SET_STRING_ELT() must be a ‘CHARSXP’ not a ‘bytecode’

Similarly, when I use degree_w(set2008), I get the following error:

*** caught segfault ***

address 0x0, cause ‘unknown’

Traceback:

1: cbind(1:max(net[, c("i", "j")]), 0, 0, 0)

2: rbind(k.list, cbind(1:max(net[, c("i", "j")]), 0, 0, 0))

3: degree_w(set2008)

Possible actions:

1: abort (with core dump, if enabled)

2: normal R exit

3: exit R without saving workspace

4: exit R saving workspace

Can you explain what is going on?

There are two key values:

a) Alpha = 0 produces degree

b) Alpha = 1 produces strength

The closer alpha is set to 0, the more relevance is attached to degree over strength. See the discussion here: https://toreopsahl.com/tnet/weighted-networks/node-centrality/

Also, you might want to see comment #40 above. It outlines how to detect the most relevant alpha given an outcome / performance variable.
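For intuition, the generalized degree measure from the node-centrality paper combines degree k and strength s as k^(1-alpha) * s^alpha, which is what the alpha parameter tunes. A quick sketch (the node's values k=4, s=8 are made up):

```r
# Generalized degree: C(alpha) = k^(1 - alpha) * s^alpha
alpha_degree <- function(k, s, alpha) k^(1 - alpha) * s^alpha

alpha_degree(4, 8, 0)    # alpha = 0 recovers degree: 4
alpha_degree(4, 8, 1)    # alpha = 1 recovers strength: 8
alpha_degree(4, 8, 0.5)  # in between: sqrt(4 * 8) ~ 5.66
```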

Hope this helps,

Tore

Thank you for your very good work.

I want to use this metric in the industrial engineering discipline, but I am a bit confused about the choice of the alpha parameter. Is there a procedure I can use to determine the best value of alpha for my network? For example, if I consider degree relatively more important, how can I find which value I should choose?

many thanks.

Farnaz

Actually, the distribution I use is Zipfian, with skewness ranging from 0.1 to 0.9. I get a lot of zero values (I assume this occurs because these nodes are not present on any shortest paths). Do you think there is a problem with the Zipfian distribution?

many thanks again,

Pavlos

Betweenness is not limited to the giant / main component; however, its distribution might be highly skewed and zero-inflated. Be sure to check before using the raw scores in frameworks that assume a Gaussian distribution.
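A quick check along these lines, where b stands in for a column of betweenness scores (the numbers are illustrative, e.g. something like betweenness_w(net)[, "betweenness"]):

```r
b <- c(0, 0, 0, 0, 1, 2, 5, 40)  # illustrative betweenness scores

mean(b == 0)  # share of exactly-zero scores (here 0.5): zero-inflation check
log1p(b)      # a common transform before Gaussian-type models
rank(b)       # or fall back on rank-based (non-parametric) methods
```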

Best,

Tore

As you guided me in one of our previous discussions on how to use closeness on networks with disconnected components: is there a similar approach for betweenness in weighted and directed networks?

Thank you for your time,

Pavlos

For my results, thank you very much for your time and your help.

Best,

Pavlos

There are two ways of dealing with running out of memory in R: buy more memory or make the code more memory-efficient. I had some time, and below is a suggestion for the latter. Note that you might want to use JIT compiling to speed things up.

# To speed things up, you might want to enable JIT compiling
library(compiler)
enableJIT(3)

# Load tnet
library(tnet)

# Load sample network from blog post
net <- cbind(
  i = c(1,1,2,2,2,3,3,3,4,4,4,5,5,6,6,7,9,10,10,11),
  j = c(2,3,1,3,5,1,2,4,3,6,7,2,6,4,5,4,10,9,11,10),
  w = c(1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1))

# New, more memory-efficient function
closeness_w2 <- function(net, directed = NULL, gconly = TRUE, alpha = 1) {
  if (is.null(attributes(net)$tnet))
    net <- as.tnet(net, type = "weighted one-mode tnet")
  if (attributes(net)$tnet != "weighted one-mode tnet")
    stop("Network not loaded properly")
  net[, "w"] <- net[, "w"]^alpha
  if (is.null(directed)) {
    tmp <- symmetrise_w(net, method = "MAX")
    directed <- (nrow(tmp) != nrow(net) | sum(tmp[, "w"]) != sum(net[, "w"]))
  }
  # From the distance_w function
  g <- tnet_igraph(net, type = "weighted one-mode tnet", directed = directed)
  if (gconly) {
    stop("This code is only tested on gconly=FALSE")
  } else {
    gc <- as.integer(V(g))
  }
  # Closeness scores, computed one node at a time to limit memory use
  out <- sapply(gc, function(a) {
    row <- as.numeric(igraph::shortest.paths(g, v = a, mode = "out",
      weights = igraph::get.edge.attribute(g, "tnetw")))
    return(sum(1/row[row != 0]))
  })
  out <- cbind(node = gc, closeness = out, n.closeness = out/(length(out) - 1))
  return(out)
}

# JIT-compiled version
closeness_w2c <- cmpfun(closeness_w2)

# Scores with old function
closeness_w(net, gconly = FALSE)

# Scores with new function (regular and compiled)
closeness_w2(net, gconly = FALSE)
closeness_w2c(net, gconly = FALSE)

# Disable JIT compiling
enableJIT(0)

Hope this helps!

Tore

I am using the version of closeness for the complete network topology with disconnected components, as you suggested; however, it seems there is not enough memory to store the distance array for large networks. The networks I am testing are weighted and directed, with about 83,000 nodes. I tested R on both 32-bit and 64-bit systems, and I also saw in other posts that it is a memory problem. How can we overcome this problem?

Best,

Pavlos

thank you again,

Best,

Pavlos

Indeed. No need to invert tie weights in tnet.

To compute closeness for the full network, see https://toreopsahl.com/2010/03/20/closeness-centrality-in-networks-with-disconnected-components/. In other words, specify gconly=FALSE when running the function.
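The idea behind the gconly=FALSE variant, in two lines of base R (the distances are made up): unreachable pairs get an infinite distance, so they simply contribute zero to a sum of inverse distances.

```r
# Shortest-path distances from one node; Inf marks nodes in other components
d <- c(1, 2, Inf, 3)

sum(1 / d)  # 1/1 + 1/2 + 0 + 1/3: unreachable nodes contribute nothing
```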

All the information on the site is in various papers, so you can just cite those. The relevant paper for the information on a page is normally listed at the bottom of it.

Tore

sink seems to work fine, so I'll leave it as it is for now.

I was thinking of inverting the weights to get the desired outcome, but since you confirmed that tnet prefers higher values, I do not have to invert them; in both betweenness and closeness, 0.9 will be preferred over 0.4. Correct?

One more thing: closeness is computed for the largest connected component. Is there a way to compute it for the entire network topology?

Lastly, in my work I am referencing your website. Should I reference:

Opsahl, T., Agneessens, F., Skvoretz, J. (2010). Node centrality in weighted networks: Generalizing degree and shortest paths. Social Networks 32, 245-251.

instead of the webpage?

many thanks again,

Pavlos

Great that you are applying the weighted metrics.

Your code seems fine. I would suggest that you take the output from the functions (e.g., closeness) and save it to an object, and then write that object to a file. But if the sink-function works for you, go ahead!

tnet assumes that a higher weight is better. This is in contrast to the shortest path functions of igraph, which assume that users will invert the tie weights themselves first. From my own experience, we are not always good at understanding the details of functions before running them, so I have built tnet on the common assumption that higher values = stronger ties = higher transmission. As such, a tie with a weight of 0.9 is preferred over a tie with a weight of 0.4.
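For illustration, the usual way to bridge the two conventions is to invert the weights before handing them to igraph's shortest-path routines (the weight values here are made up):

```r
w <- c(0.9, 0.4)  # tnet-style weights: higher = stronger tie
cost <- 1 / w     # igraph-style costs: lower = preferred on shortest paths

# The 0.9 tie now has the lower cost, so both conventions agree
cost  # ~1.11 and 2.5
```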

Hope this clarifies!

Tore

I am using tnet to compute centrality measures (closeness, betweenness) in weighted, directed networks, using datasets from different sources (e.g., http://snap.stanford.edu/data/links.html) with probabilities in [0, 1] as link weights.

What I do in R is:

install.packages("tnet")

library(tnet)

directed.net <- read.table("C:\\Users\\Pavlos\\Desktop\\net.txt", sep="\t")

directed.net <- as.tnet(directed.net, type="weighted one-mode tnet")

sink("C:\\Users\\Pavlos\\Desktop\\bet.txt", append=FALSE, split=FALSE)

betweenness_w(directed.net, alpha=0.5)

sink("C:\\Users\\Pavlos\\Desktop\\clo.txt", append=FALSE, split=FALSE)

closeness_w(directed.net, alpha=0.5)

Everything seems to work perfectly. First, I would like to ask if I missed anything in the commands or did something wrong in this part. Also, I read in a previous post that weights are treated as costs, so a link of 0.4 will be preferred over a link of 0.9. Is that correct?

Many thanks for your time and work,

Pavlos

Isolates should not be included in the edgelist, as only ties are listed there. A jump in the node ids of an edgelist implies that isolates exist (e.g., see the code in the blog post on closeness in networks with disconnected components, where node K, an isolate, is assigned a score; https://toreopsahl.com/2010/03/20/closeness-centrality-in-networks-with-disconnected-components/).

For most metrics, a change in network size does not matter (e.g., degree). However, if you calculate average degree, I would calculate degree, sum the degree scores, and divide by the network size you have, instead of simply writing mean(degree_w(net, measure="degree")[,"degree"]).
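A tiny numeric illustration of that adjustment (the degree scores and network size are made up):

```r
deg <- c(3, 2, 2, 1)  # degree scores for the nodes that appear in ties
n <- 6                # true network size, including two isolates

sum(deg) / n  # ~1.33: the correct average degree
mean(deg)     # 2: inflated, because the isolates are silently ignored
```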

Hope this helps,

Tore

Thanks for letting me know.

Best,

Great that you are using tnet! I would suggest that you do not load the network into igraph before loading it into tnet. If you transform the node identifiers into integers, you should be able to run as.tnet on the regular data.frame object.
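One way to do that integer conversion in base R (the column names and values here are illustrative):

```r
# Character node ids, as read from a csv edgelist
el <- data.frame(i = c("a", "a", "b"),
                 j = c("b", "c", "c"),
                 w = c(1, 2, 1),
                 stringsAsFactors = FALSE)

# Map every distinct id to a consecutive integer
ids <- sort(unique(c(el$i, el$j)))
el$i <- match(el$i, ids)
el$j <- match(el$j, ids)
# el now holds integer i/j columns, so as.tnet(el, type = "weighted one-mode
# tnet") should accept it; keep `ids` to map the scores back to node names
```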

Best,

Tore

I am interested in using tnet to calculate weighted centrality for a one-mode network. I am having trouble reading the data into tnet. Would you please let me know where I went wrong?

"edge_PP 60.csv" is an edgelist with three columns, i, j, and weight, without a header.

el=read.csv("edge_PP 60.csv", header=FALSE)

el[,1]=as.character(el[,1])

el[,2]=as.character(el[,2])

el=as.matrix(el)

g=graph.edgelist(el[,1:2])

E(g)$weight=as.numeric(el[,3])

as.tnet(g)

Using the above code, I got the following error message:

"Error in if (NC == 2) net <- data.frame(tmp[, 1], tmp[, 2]) : argument is of length zero"

Thank you for your suggestion!

Kindest Regards

Leila