How can I find the minimum number of edges that need to be added to a tree so that the resulting graph is 2-edge-connected, i.e. if we remove any single edge from that graph, the graph remains connected? Please give me some idea how to do that! Thanks in advance :)




Codeforces (c) Copyright 2010-2020 Mike Mirzayanov


Server time: Mar/31/2020 02:58:02 (h1).


See the solutions to the problem called Network here. Basically, one way is to find a node x such that if x and all its incident edges are removed, no more than half of the leaf nodes of the original tree end up in a single connected component. This can be done with an algorithm similar to finding the centroid of a tree. Then the edges to add can be found greedily: repeatedly connect two leaves from the two components with the most remaining leaves.
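The centroid-like search described above could be sketched roughly as follows (a minimal sketch, not the poster's actual code; the function name `leafCentroid` and the adjacency-list input are my own choices):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Find a vertex x minimizing the maximum number of leaves of the
// ORIGINAL tree left in any single component after x and its incident
// edges are removed -- analogous to a centroid, but weighted by leaves.
int leafCentroid(int n, const vector<vector<int>>& adj) {
    vector<int> par(n, -1), order, leafCnt(n, 0);
    order.reserve(n);
    vector<int> st = {0};
    while (!st.empty()) {                       // iterative DFS from vertex 0
        int u = st.back(); st.pop_back();
        order.push_back(u);
        for (int v : adj[u]) if (v != par[u]) { par[v] = u; st.push_back(v); }
    }
    int totalLeaves = 0;
    for (int u = 0; u < n; u++)
        if ((int)adj[u].size() == 1) { leafCnt[u] = 1; totalLeaves++; }
    for (int i = n - 1; i > 0; i--)             // accumulate subtree leaf counts
        leafCnt[par[order[i]]] += leafCnt[order[i]];
    int best = -1, bestMax = INT_MAX;
    for (int x = 0; x < n; x++) {
        int mx = totalLeaves - leafCnt[x];      // leaves outside x's subtree
        for (int v : adj[x]) if (par[v] == x) mx = max(mx, leafCnt[v]);
        if (mx < bestMax) { bestMax = mx; best = x; }
    }
    return best;
}
```

Each component after removing x is either a child subtree of x or everything above x, so taking the maximum over those counts and minimizing it over all x runs in O(n) total.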

Isn't that whole centroid-finding step overkill?

All you need to do is add the new edges in such a way that each old edge will lie on some cycle.

To do that, just root the tree at any non-leaf vertex and run a plain DFS. Each time you encounter a leaf, append its number to `leaves[]`. At the end, let `s = leaves.size()/2`. For each valid i, connect `leaves[i]` and `leaves[i+s]`.

Why does this work? Consider any edge u — parent(u) in the original tree. The leaves in vertex u's subtree form a contiguous interval in the array `leaves`. Then we can easily see that some of the new edges we added must have exactly one endpoint in this interval. Each such new edge forms a cycle that contains the old edge u — parent(u).

Thanks a lot misof :) Your idea helped me to solve the problem.
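The DFS-and-pair idea above could be sketched like this (a minimal sketch under my own assumptions: the function name and the adjacency-list input are mine, and the tree is assumed to have at least 3 vertices):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Collect leaves in DFS order and pair leaves[i] with leaves[i+s],
// s = leaves.size()/2, so that every original tree edge lies on a cycle.
// Returns the list of edges to add; there are ceil(L/2) of them, where
// L is the number of leaves.
vector<pair<int,int>> makeTwoEdgeConnected(int n, const vector<vector<int>>& adj) {
    if (n <= 2) return {};                     // a single edge cannot be made bridge-less
    int root = 0;
    while ((int)adj[root].size() == 1) root++; // root at any non-leaf vertex
    vector<int> leaves, par(n, -1), st = {root};
    while (!st.empty()) {                      // plain iterative DFS
        int u = st.back(); st.pop_back();
        if ((int)adj[u].size() == 1) leaves.push_back(u);
        for (int v : adj[u]) if (v != par[u]) { par[v] = u; st.push_back(v); }
    }
    int s = (int)leaves.size() / 2;
    vector<pair<int,int>> added;
    for (int i = 0; i + s < (int)leaves.size(); i++)
        added.push_back({leaves[i], leaves[i + s]});
    return added;
}
```

Because each subtree's leaves occupy a contiguous interval of `leaves` and the pairing spans distance s, every interval shorter than the whole array gets at least one new edge crossing its boundary.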

Hi, I'm having some trouble with that problem. I used mkirsche's idea of finding the centroid and then matching leaves from different subtrees.

For the matching I used misof's method of connecting leaves[i] and leaves[i+s].

However, I'm getting WA. Now I know that the centroid step is overkill, but it should still be correct. I would really appreciate it if you could help me find my error.

I posted my code and method yesterday in CF POST

Thanks

I've reduced my code to the following CODE

I assign a different color to each neighbor of the centroid (root) and propagate that color to its subtree.

I build a vector of pairs <color of i, i> in the DFS, so all the leaves of the same color are adjacent. Matching leaves[i] and leaves[i+s] with s = leaves.size()/2, I get AC.

However, when I sort the vector of leaves (line 79 of my code), I get WA. As I understand it, sorting the vector of pairs doesn't change the order of the colors, just the order between nodes of the same color, so the matching should still work (it matches two leaves of different colors).

I can't understand why my solution is wrong; I hope you can help me. PS: Maybe you don't remember me, but I am one of the students who attended your training camp in Peru last year.