### nchn27's blog

By nchn27, history, 10 months ago

In use, multiset seems more versatile than priority queue because you can remove things other than the first element. In C++, when would priority_queue be used instead of multiset?

• +1

 » 10 months ago, # |   +13 priority_queue is faster, but for competitive programming multiset is usually better since it is more convenient, and for problems that aren't very tight it makes no difference.
•  » » 2 months ago, # ^ | ← Rev. 2 →   -66 How is priority_queue faster than a multiset? Won't deletion through an iterator take constant time in a multiset, whereas pop takes $O(\log n)$ time in a priority queue?
•  » » » 2 months ago, # ^ |   +20 I don't know the implementations in C++ nor have I benchmarked the structures, but I would definitely expect priority queue to be faster, maybe even a lot faster. Why would you expect otherwise? The functions of a priority queue are a subset of the functions of a multiset; if the latter were faster, the former would be useless. You can implement a priority queue with a heap structure, while for a multiset you need a balanced binary tree structure, which is significantly heavier.
•  » » » » 2 months ago, # ^ |   -56 Agreed, but won't deletion through an iterator take constant time in a multiset, whereas pop takes $O(\log n)$ time in a priority queue?
•  » » » » » 2 months ago, # ^ |   +16 It may be so, but that's not a great way to look at it. If you add $N$ elements and then remove them all, both priority queue and multiset will be $O(N \log N)$ (and in fact no structure is faster than that, because you'd break the sorting complexity lower bound). Even if multiset is quicker in the deletion part, it likely pays a price for that during the addition part. It may be interesting to properly compare them, but I'd be quite surprised if multiset turned out to be faster, as my experience has shown that set/map in C++ are incredibly heavy.
•  » » » » » » 2 months ago, # ^ |   +8 I get your point. Thanks for helping me out!
•  » » » » » 2 months ago, # ^ | ← Rev. 2 →   +8 As far as I know, multiset is implemented as a balanced binary tree, as Enchom mentioned. Since it should stay 'balanced' after insertion and deletion, it has to do some 're-balancing' (algorithms vary by implementation, but generally tree rotations). Unless it does that job, there are possible sequences of insertions and deletions that make the multiset's internal tree unbalanced and make operations after that very slow (up to $O(n)$). To sum up: a priority queue is a heap structure, which takes $O(\log n)$ time to delete the top element. A multiset is a balanced binary search tree, which takes up to $O(\log n)$ time to delete anything and then restore balance. The latter can be a lot slower (bigger constant factor). Currently priority queue is somewhere between 1.5x and 2x faster than multiset with compiler optimizations. (It depends on whether you turned -O2 on, and a bunch of other factors, including your compiler version and OS.) There are some benchmarks on Stack Overflow, which I haven't thoroughly read yet. Edit: my friend pointed out that my numbers are wrong.
•  » » » » » » 2 months ago, # ^ | ← Rev. 2 →   +24 According to this reference, he is actually right that deletion is amortized constant. I ran a few quick tests and indeed multiset deletion seems faster than priority queue deletion. However, as I suspected and mentioned in my other comment, this comes at the cost of very slow insertion in multiset compared to priority queue. Since you can't delete more elements than you add, I can't imagine a scenario where you'd prefer multiset only due to the faster deletion.
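The trade-off described here can be sketched in code. The helper names `drain_pq` and `drain_ms` below are mine, not from the thread; the point is that both structures support the same "repeatedly take the largest element" workload, just with the cost split differently between insertion and deletion:

```cpp
#include <iterator>
#include <queue>
#include <set>
#include <vector>

// Drain a max-heap: each pop() is O(log n), construction is O(n).
std::vector<int> drain_pq(std::vector<int> v) {
    std::priority_queue<int> pq(v.begin(), v.end());
    std::vector<int> out;
    while (!pq.empty()) {
        out.push_back(pq.top());
        pq.pop();
    }
    return out;
}

// Drain a multiset from its largest element: erase(iterator) is
// amortized O(1), but each of the n insertions was O(log n).
std::vector<int> drain_ms(std::vector<int> v) {
    std::multiset<int> ms(v.begin(), v.end());
    std::vector<int> out;
    while (!ms.empty()) {
        auto it = std::prev(ms.end());  // iterator to the largest element
        out.push_back(*it);
        ms.erase(it);
    }
    return out;
}
```

Both functions return the elements in the same (descending) order; timing them against each other would reproduce the comparison discussed above.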
•  » » » » » » » 2 months ago, # ^ |   +1 Oh, I misread the question and considered erasing by value. I was talking about wrong thing then. Thanks for pointing it out.
•  » » » » » » » 2 months ago, # ^ |   +10 Right... Maybe, as gratus907 said, addition in multiset would probably be slower due to the re-balancing algorithms (tree rotations). Thanks once again!
•  » » » » » » » 2 months ago, # ^ |   -29 Amortised isn't worst case; that's still $O(\log n)$.
•  » » » » » » » » 2 months ago, # ^ |   0 Worst case may be $O(\log n)$, but the average time, i.e. the amortized time, is constant. Check out Enchom's comment on this blog. He ran a few tests and found that deletion in a multiset is faster than popping from a priority queue.
•  » » » » » » » » » 2 months ago, # ^ |   -7 Yes, I read the comments before posting, unsurprisingly. It's just that when you don't mention that you're talking about amortised complexity, the assumption is that it's worst case.There are implementation tricks for faster insert/delete operations for a lot of structures, for example deletion flags on nodes and lazy rebalancing. That's why I'm more interested in a test where insert/delete is mixed with searching, but that comparison is tough because heap doesn't have searching.
•  » » » » » » » » 2 months ago, # ^ |   0 What's your point? Why would we care specifically about worst case complexity in this discussion? It is worst case $O(\log N)$ to delete one element, but $O(1)$ amortized. This means if you were to delete all $N$ elements it'd be $O(N)$.
•  » » » » » » » » » 2 months ago, # ^ |   -35 Why would we care specifically about worst case complexity in this discussion? Because that's normal?
•  » » » » » » » » » 2 months ago, # ^ |   0 Still don't get your point. Is your claim that when mixed with other operations, deletion will no longer be amortized $O(1)$? If that's true that's interesting, but I don't think it's easy to tell without knowing the specific implementation.
•  » » » » » » » » » 2 months ago, # ^ |   0 Not a claim, just one hypothesis. I'm wondering about the real implementation and what implementations there could be.
•  » » » » » 2 months ago, # ^ |   -22 NO. Deletion of anything in a balanced binary tree requires rebalancing, which is $O(\log n)$. Whether it's through an iterator is irrelevant.
•  » » » » » » 2 months ago, # ^ |   +15 Deletion through an iterator takes amortized constant time in multiset.
•  » » » » » » » 2 months ago, # ^ |   -28 See my comment above.
•  » » » » » » 2 months ago, # ^ |   0 Apparently, this is not true for red-black trees. Note that sorting lower bound doesn’t apply here as everything is already sorted and the node is already found.
•  » » » » 2 months ago, # ^ |   +4 I think pq is faster since it is cache-friendly: priority_queues are internally stored in contiguous memory (like a vector), while multisets are not.
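This can be verified directly: `priority_queue` is a container adaptor whose underlying container defaults to `std::vector`, so the heap lives in a single contiguous allocation (a multiset, by contrast, allocates a node per element). A minimal check using the standard `container_type` member:

```cpp
#include <queue>
#include <type_traits>
#include <vector>

// priority_queue<T> adapts another container; the default is
// std::vector<T>, i.e. one contiguous block of memory.
static_assert(std::is_same<std::priority_queue<int>::container_type,
                           std::vector<int>>::value,
              "default underlying container is std::vector");
```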
•  » » » 2 months ago, # ^ | ← Rev. 2 →   0 UPD : Deleted
 » 2 months ago, # |   -53 The comments on this blog made me re-realize how much I still have to learn to become a Master. XD
•  » » 2 months ago, # ^ |   +8 This blog post, unfortunately, did not make me a master :(
•  » » 2 months ago, # ^ |   +8 I'm pretty sure there are many masters like me who don't know how multisets or priority queues are implemented...
•  » » » 2 months ago, # ^ |   -11 Just kidding bro, I was just amazed by how much detailed knowledge some people have regarding their languages. :D
•  » » » 2 months ago, # ^ |   +11 grandmasters too
 » 2 months ago, # |   -92 Multisets and priority queues are used only on Div1 E-F problems and at top onsite competitions like ICPC WF, and are not needed to score well in CF rounds. Personally I have never seen a reasonable problem that would require a priority queue during my CP career, and believe me, I consider myself an experienced competitive programmer.
•  » » 2 months ago, # ^ |   +23 Too young too simple...
•  » » 2 months ago, # ^ |   +6 20C - Dijkstra? is a classic problem that needs to be solved with Dijkstra's algorithm with heap optimization.
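For reference, the standard heap-optimized Dijkstra uses `priority_queue` with "lazy deletion": since the heap cannot erase or decrease arbitrary entries, stale ones are simply skipped when popped, which is why `priority_queue` suffices here. A sketch (the graph representation and function name are my own choices, not from the problem):

```cpp
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

using Edge = std::pair<int, long long>;  // (neighbor, weight)

// Shortest distances from src; unreachable vertices stay at INF.
std::vector<long long> dijkstra(const std::vector<std::vector<Edge>>& g, int src) {
    const long long INF = std::numeric_limits<long long>::max();
    std::vector<long long> dist(g.size(), INF);
    using State = std::pair<long long, int>;  // (distance, vertex)
    std::priority_queue<State, std::vector<State>, std::greater<State>> pq;
    dist[src] = 0;
    pq.push({0, src});
    while (!pq.empty()) {
        auto [d, u] = pq.top();
        pq.pop();
        if (d != dist[u]) continue;  // stale heap entry: a shorter path was found
        for (auto [v, w] : g[u]) {
            if (d + w < dist[v]) {
                dist[v] = d + w;
                pq.push({dist[v], v});  // push a new entry instead of decrease-key
            }
        }
    }
    return dist;
}
```

Each edge pushes at most one heap entry, giving the usual $O((n + m) \log m)$ bound.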
•  » » » 2 months ago, # ^ |   -45 Hmm, that is still an alpha contest from when CF was new and Mike just copied tasks. There is no such problem in recent CF rounds (the last 2 or 3 years).
•  » » » » 2 months ago, # ^ |   +9 What about 960B - Minimize the error? It's a Div. 1 + Div. 2 B with difficulty *1500.
•  » » » » » 2 months ago, # ^ |   -12 It can be solved in $O(n \cdot (k_1 + k_2))$: $k_1$ times choose $i$ such that $a_i - b_i$ is largest and decrease $a_i$, then do the same for array $B$. I don't see how a priority queue or multiset can be utilized here.
•  » » » » » » 2 months ago, # ^ |   +6 See the tutorial please. "This can be implemented using a priority queue or by sorting the array C and iterating over it." Although this problem has a solution which doesn't need a priority queue, it seems that the priority queue is well known by most competitive programmers, according to that sentence.
•  » » » » » » » 2 months ago, # ^ |   -16 Well, according to my comment above it's not well known. Or do you believe thousands of grey coders actually know multiset and priority_queue?
•  » » » » » » » » 2 months ago, # ^ |   0 "Well, according to my comment above it's not well known." Well, I say something according to the tutorial because the tutorial is reliable. Is what you said reliable?"Or do you believe thousands of grey coders actually know multiset and priority_queue?" Of course grey coders don't know them, because they're Newbies. Most of them are amateur competitive programmers, maybe not even competitive programmers. They don't know multiset and priority_queue, but do you think they can solve div. 2 C~D? "Newbies don't know multiset and priority_queue" can't prove that multiset and priority_queue aren't well known.
•  » » » » » » » » » 2 months ago, # ^ |   +11 I will bet most grey coders at least know what it is, lol. They may not know when to use it well, but even many grey coders know fundamental things like that.
•  » » 2 months ago, # ^ | ← Rev. 2 →   +14 And 1314A - Recommendations is a Div.1 A using priority_queue. And it's only four weeks old.
•  » » » 2 months ago, # ^ |   -20 Solved it with multisets during the contest; runtime is 218 ms. Maybe you know a problem that can't be solved with a multiset, only with priority_queue?
•  » » » » 2 months ago, # ^ |   0 This problem is solvable with sets, but just barely: https://codeforces.com/problemset/gymProblem/102201/E
•  » » 2 months ago, # ^ |   +24 You just lack the experience to say such words.
•  » » 2 months ago, # ^ | ← Rev. 2 →   +16 And 527C - Glass Carving is a typical multiset problem on Codeforces. It's only a Div.2 C, which has a similar difficulty to a Div.2 D today.
•  » » 2 months ago, # ^ |   -8 Bruh, you are like LanceTheDragonTrainer if he were a grey coder...
•  » » » 2 months ago, # ^ |   0 Lol. Couldn't understand your English. But what do you think my rating really is?
•  » » » » 2 months ago, # ^ | ← Rev. 3 →   0 I assume quite good. You seem to know what you're talking about on a variety of CP topics, and most people (unlike the cyan I responded to) would need to be quite good to have the audacity to make bold statements and say stuff like "and believe me, I consider myself an experienced competitive programmer" or belittle people for having a low color. My guess would be IM or GM.
•  » » » » » 2 months ago, # ^ |   +4 Wow. You just made my day. If I ever return to Competitive Programming, I will train hard and share my knowledge with interested parties, to repay your praise.
 » 2 months ago, # |   +13 I don't use multiset, nor do I use priority queue. I use set<pair<int,int>>, where I store the value in pair.first and the index in pair.second. Automatically it becomes a multiset, plus it supports more functions, like searching for a specific element after a particular index, etc. :)
•  » » 2 months ago, # ^ |   0 Is it a map?
•  » » » 2 months ago, # ^ | ← Rev. 2 →   0 No, set<pii> (a set of pair<int,int>).
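The trick can be sketched like this (the helper names `build` and `find_from` are hypothetical, chosen for illustration): pairing each value with its index keeps duplicates distinct, and `lower_bound` then answers "first occurrence of a value at or after a given index," which a plain multiset cannot do directly:

```cpp
#include <set>
#include <utility>
#include <vector>

// Store (value, index) pairs; duplicate values remain distinct set
// elements because their indices differ.
std::set<std::pair<int, int>> build(const std::vector<int>& a) {
    std::set<std::pair<int, int>> s;
    for (int i = 0; i < (int)a.size(); ++i) s.insert({a[i], i});
    return s;
}

// Index of the first occurrence of `value` at position >= from, or -1.
int find_from(const std::set<std::pair<int, int>>& s, int value, int from) {
    auto it = s.lower_bound({value, from});
    if (it != s.end() && it->first == value) return it->second;
    return -1;
}
```

Erasing one copy of a value is `s.erase({value, index})`, which removes exactly that occurrence, unlike `multiset::erase(value)`, which removes all of them.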
 » 2 months ago, # |   0 Also, a priority queue can be constructed in $O(n)$.
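Indeed: the iterator-range constructor heapifies in $O(n)$ (it uses `std::make_heap` internally), versus $O(n \log n)$ for pushing the $n$ elements one at a time. A minimal sketch (the wrapper name `make_pq` is mine):

```cpp
#include <queue>
#include <vector>

// Building the heap from an existing range is O(n); n individual
// push() calls would cost O(n log n) instead.
std::priority_queue<int> make_pq(const std::vector<int>& v) {
    return std::priority_queue<int>(v.begin(), v.end());
}
```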