radoslav11's blog

By radoslav11, 8 years ago, In English

Hello!

Today at 08:00 UTC, Deadline24's qualifying round will be held.

Let's discuss the problems here, after the contest of course!


»
8 years ago, # |

Is it rated?

»
8 years ago, # |

Can you give the registration link?

»
8 years ago, # |

Is it rated?

»
8 years ago, # |

I got lost on their website. I didn't find the registration link for 10 minutes. Please give the exact registration link :)

  • »
    »
    8 years ago, # ^ |

    The last date for registration was 25th Feb.

»
8 years ago, # |

Any solution for B with better complexity than O(T*SZLIST^3)?

  • »
    »
    8 years ago, # ^ |

    Did you mean O(T * K^3)?

    It seems that orders07.in was the only test with K > 200. I wonder about the purpose of tests 8-10, which had large N and small K.

»
8 years ago, # |

Let's share our results for C and D.

My results for Rancho:

D1	4:59:41	17700	OK
D2	4:59:41	271945	OK
D3	4:59:41	178750	OK
D4	4:59:41	260625	OK
D5	4:59:41	243390	OK
D6	4:59:41	628140	OK
D7	4:59:41	468705	OK
D8	4:59:41	281935	OK
D9	4:59:41	194405	OK
D10	4:59:41	206355	OK

EDIT: I copied the wrong results ;)

  • »
    »
    8 years ago, # ^ |

    Lol, my solution is total trash, but

    D1	4:45:02	23825	OK
    D2	4:45:02	434920	OK
    D3	4:45:02	205465	OK
    D4	4:56:47	121540720	OK
    D5	4:46:12	359250	OK
    D6	4:59:42	2214650	OK
    D7	4:59:42	822010	OK
    D8	4:59:42	405045	OK
    D9	4:59:42	342130	OK
    D10	4:58:13	8226985	OK
    

    (it's the 27th result)

  • »
    »
    8 years ago, # ^ |

    Ours:

    D1	3:09:19	6585		OK
    D2	3:09:19	92315		OK
    D3	3:09:19	84025		OK
    D4	3:09:19	49155		OK
    D5	2:58:29	162500		OK
    D6	3:09:19	167505		OK
    D7	2:58:29	207890		OK
    D8	3:14:22	148550		OK
    D9	2:58:29	128615		OK
    D10	3:07:09	124300		OK
    
»
8 years ago, # |

Problem D (Rancho) was very similar to the TCO15 Marathon Round 1 problem. Did the organizers know about that?

  • »
    »
    8 years ago, # ^ |

    For me this was very fortunate :) I worked on problem C (Hospital) in the first 3 hours of the contest, and with less than 2 hours to go, the approach that my teammate used had trouble finding any solutions on many of the larger test cases. So I decided to stop improving C and also try something on D, since there were many more points to gain there.

    Since there would be no chance to implement anything from scratch in the remaining time, I took my solution from TCO'15 Marathon Round 1 and modified it slightly, in order to find small area polygons. Note that I only considered the case of polygons with exactly N-K vertices (no time to try anything fancy there). The main algorithm I used was to repeatedly select a random subset of N-K vertices, then pick a random empty triangle consisting of 3 selected points. After that, I would repeatedly add the smallest area "good" triangle for which 2 vertices were already on the polygon and the 3rd one was still outside the polygon (this essentially added one more point to the polygon).

    Once the min-area solution worked, I thought about what I could quickly do to get large area triangles. Since the remaining time was very short, I modified the above solution to repeatedly add the largest area "good" triangle at each step :) This worked, too.
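
    Below is a minimal, self-contained sketch of this polygon-growing idea (just an illustration, not my actual TCO'15 code): repeatedly insert an unused point between two adjacent polygon vertices so that the new triangle contains no other candidate point, always taking the smallest-area such triangle (or the largest-area one, for the other direction). Checks such as keeping the polygon simple are omitted here for brevity.

    #include <bits/stdc++.h>
    using namespace std;

    struct Pt { long long x, y; };

    long long cross(const Pt& o, const Pt& a, const Pt& b) {
        return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
    }
    long long area2(const Pt& a, const Pt& b, const Pt& c) { return llabs(cross(a, b, c)); }

    // true if p lies inside or on the border of triangle abc
    bool inTriangle(const Pt& a, const Pt& b, const Pt& c, const Pt& p) {
        long long d1 = cross(a, b, p), d2 = cross(b, c, p), d3 = cross(c, a, p);
        return (d1 >= 0 && d2 >= 0 && d3 >= 0) || (d1 <= 0 && d2 <= 0 && d3 <= 0);
    }

    // pts = the N-K selected points; minimize = true for the small-area variant.
    // Returns the polygon as an ordered list of indices into pts.
    vector<int> growPolygon(const vector<Pt>& pts, bool minimize, mt19937& rng) {
        int n = pts.size();
        // Start from a random triangle (the real approach picks a random *empty* one).
        set<int> start;
        uniform_int_distribution<int> pick(0, n - 1);
        while ((int)start.size() < 3) start.insert(pick(rng));
        vector<int> poly(start.begin(), start.end());
        vector<bool> used(n, false);
        for (int id : poly) used[id] = true;

        while (true) {
            long long best = -1;
            int bestW = -1, bestPos = -1;
            for (int i = 0; i < (int)poly.size(); i++) {
                int u = poly[i], v = poly[(i + 1) % poly.size()];
                for (int w = 0; w < n; w++) {
                    if (used[w]) continue;               // w must not be a polygon vertex yet
                    bool good = true;                    // "good" = triangle (u, v, w) is empty
                    for (int q = 0; q < n && good; q++)
                        if (q != u && q != v && q != w && inTriangle(pts[u], pts[v], pts[w], pts[q]))
                            good = false;
                    if (!good) continue;
                    long long a = area2(pts[u], pts[v], pts[w]);
                    if (best == -1 || (minimize ? a < best : a > best)) {
                        best = a; bestW = w; bestPos = i + 1;
                    }
                }
            }
            if (bestW == -1) break;                      // nothing can be added anymore
            poly.insert(poly.begin() + bestPos, bestW);  // grow the polygon by one vertex
            used[bestW] = true;
        }
        return poly;
    }

    int main() {
        mt19937 rng(123);
        vector<Pt> pts(50);
        for (Pt& p : pts) p = {(long long)(rng() % 1000), (long long)(rng() % 1000)};
        vector<int> poly = growPolygon(pts, /*minimize=*/true, rng);
        cout << "polygon with " << poly.size() << " vertices\n";
    }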

    In the end, this approach produced the best solutions for most of the test cases ("best" compared to the other approaches we tried, not to other teams). Since I made all the submissions for this approach in the last hour of the contest, we didn't get any relative scores, so we didn't know how well this performed. We were quite surprised to see that our score for this problem was 779.68 (7th place overall, and I think 6th place for this problem), given that:

    • by mistake we didn't submit the best solution for test 10 (we only had a solution from an earlier submission, which was about 10 times worse)
    • I didn't get the chance to modify too much of my solution from TCO'15 Marathon Round 1, so I only added empty triangles, i.e. triangles not containing any of the N given points (when, in fact, I should have completely ignored the K points which I didn't care about adding to the polygon)

    I would be very interested in knowing how other teams approached this problem.

    • »
      »
      »
      8 years ago, # ^ |

      You can reduce maximizing the area to taking the convex hull and minimizing the area over the inside points. You then have to join those two polygons, but that's not that hard.
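
      A minimal sketch of the first half of that reduction (just an illustration, not the code we actually used): split the points into convex-hull vertices and interior points; the interior points would then get a small-area polygon, which is stitched into the hull. The stitching step itself is omitted here.

      #include <bits/stdc++.h>
      using namespace std;

      struct Pt { long long x, y; };

      long long cross(const Pt& o, const Pt& a, const Pt& b) {
          return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
      }

      // Andrew's monotone chain; returns hull vertex indices in counter-clockwise order.
      vector<int> convexHull(const vector<Pt>& pts) {
          int n = pts.size();
          vector<int> id(n);
          iota(id.begin(), id.end(), 0);
          sort(id.begin(), id.end(), [&](int a, int b) {
              return tie(pts[a].x, pts[a].y) < tie(pts[b].x, pts[b].y);
          });
          vector<int> hull(2 * n);
          int k = 0;
          for (int i = 0; i < n; i++) {                  // lower hull
              while (k >= 2 && cross(pts[hull[k - 2]], pts[hull[k - 1]], pts[id[i]]) <= 0) k--;
              hull[k++] = id[i];
          }
          for (int i = n - 2, lo = k + 1; i >= 0; i--) { // upper hull
              while (k >= lo && cross(pts[hull[k - 2]], pts[hull[k - 1]], pts[id[i]]) <= 0) k--;
              hull[k++] = id[i];
          }
          hull.resize(k - 1);                            // last point equals the first one
          return hull;
      }

      int main() {
          vector<Pt> pts = {{0, 0}, {4, 0}, {4, 4}, {0, 4}, {1, 1}, {2, 3}, {3, 1}};
          vector<int> hull = convexHull(pts);
          vector<bool> onHull(pts.size(), false);
          for (int i : hull) onHull[i] = true;
          cout << "hull vertices: " << hull.size() << ", interior points: "
               << count(onHull.begin(), onHull.end(), false) << "\n";
          // Next (omitted here): build a small-area polygon on the interior points
          // and stitch it into the hull to get one large-area simple polygon.
      }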

      I worked on C, because marek.cygan forgot to tell me that this task is nearly identical to TCO15R1 ;)

    • »
      »
      »
      8 years ago, # ^ |

      I just used 95% of my code from TCO'15 Marathon Round 1.

»
8 years ago, # |

How to solve Task C (Hospital)?

  • »
    »
    8 years ago, # ^ |

    I used some greedy ideas, such as:
    1) Choose the patient who can be cured earlier than the others.
    2) Choose the patient whose treatment can be started earlier than the others.
    3) Choose the patient who currently has the maximum number of injuries.
    4) If you look at the formula, you can see that the number of tables used is very important, so it is a good idea to reduce the number of tables to only one. You can also use some greedy ideas to reduce the number of table types used (for example, if two treatments can be performed at tables of types 1 and 2, you can use only the first type).
    Different combinations of these ideas gave us 654.54 points; a rough sketch of such a selection rule is shown below.
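
    For illustration, here is a sketch of a selection rule combining ideas 1)-3); the Patient struct and its fields are made up for the example, not the actual input format.

    #include <bits/stdc++.h>
    using namespace std;

    struct Patient {
        long long earliestStart;   // when the next treatment stage could begin
        long long earliestFinish;  // when it could be completed at best
        int injuriesLeft;          // remaining untreated injuries
        int id;
    };

    // Pick the next patient to schedule: earliest possible completion first,
    // then earliest possible start, then the most remaining injuries.
    bool betterChoice(const Patient& a, const Patient& b) {
        if (a.earliestFinish != b.earliestFinish) return a.earliestFinish < b.earliestFinish;
        if (a.earliestStart != b.earliestStart) return a.earliestStart < b.earliestStart;
        return a.injuriesLeft > b.injuriesLeft;
    }

    int main() {
        vector<Patient> ready = {
            {0, 30, 2, 1},
            {5, 25, 3, 2},
            {0, 25, 1, 3},
        };
        sort(ready.begin(), ready.end(), betterChoice);
        cout << "schedule patient " << ready.front().id << " next\n";  // prints 3
    }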

    • »
      »
      »
      8 years ago, # ^ |

      Our idea (worth only 666.22 points) was also greedy-based. We considered the treatment stages in increasing order of the earliest time at which they become available (initially the first stage of each patient is available at time 0, and once a stage is scheduled, we know the earliest start time of the next stage for that same patient).

      So the only decision to be made is which type of table to use for the current stage (of some patient) and whether or not to use a new table of that type. For a given table type T, we either reuse the table of that type which becomes available the earliest, or open a new one. For this decision we used a scoring function consisting of a linear combination of 4 parameters:

      • a (fixed) penalty for using a new table (to try to reduce the number of tables used)
      • a penalty for the time at which the current stage is completed (to try to schedule it earlier, if possible)
      • a penalty for "wasted" time (if a table is available at time T1, but we can only start the treatment at time T2 > T1 due to dependencies on previous stages)
      • a bonus for each table type, which depended on the set of treatments that could not yet be handled by any previously used table type (the weight of a treatment is just the number of patients requiring it)

      Each parameter (except the fixed new table penalty W0) was of the form weight * parameter_value (e.g. W1 * completion_time, W2 * wasted_time, W3 * bonus_score). We considered various combinations of these 4 weights, ran the greedy for each of them and picked the best.
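
      For illustration, here is a small sketch of such a scoring function; the concrete weights and field names below are made up for the example, not the ones we actually used.

      #include <bits/stdc++.h>
      using namespace std;

      struct Weights { double W0, W1, W2, W3; };

      struct Option {              // one candidate (table type, table) choice
          bool newTable;           // would this open a new table?
          double completionTime;   // when the stage would finish on this table
          double wastedTime;       // idle table time before the stage can start
          double bonus;            // table-type bonus (see the list above)
      };

      // Lower score = better choice.
      double score(const Option& o, const Weights& w) {
          return (o.newTable ? w.W0 : 0.0)
               + w.W1 * o.completionTime
               + w.W2 * o.wastedTime
               - w.W3 * o.bonus;
      }

      int main() {
          // Try several weight combinations, run the greedy with each, keep the best run.
          vector<Weights> candidates = {
              {100, 1.0, 0.5, 10}, {50, 2.0, 1.0, 5}, {200, 1.0, 0.1, 20},
          };
          vector<Option> options = {
              {false, 40, 5, 0},   // reuse an existing table, but finish later
              {true, 30, 0, 3},    // open a new table and finish earlier
          };
          for (const Weights& w : candidates) {
              const Option* best = &options[0];
              for (const Option& o : options)
                  if (score(o, w) < score(*best, w)) best = &o;
              cout << (best->newTable ? "open a new table" : "reuse a table")
                   << " (score " << score(*best, w) << ")\n";
          }
      }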

  • »
    »
    8 years ago, # ^ |

    Assume you have a fixed number of tables.

    You can then do the following very simple greedy: take the guy who is currently doing nothing and choose the earliest available table for him. Break ties randomly; randomizing the ties is enough to create completely different solutions.

    Having this greedy, you can run a hill-climbing-like search on the number of tables used. Just make sure to sometimes force a move out of a local minimum, since it easily gets stuck.

    Additionally, there were some cases where the optimal number of tables was somewhere in the middle, and you couldn't start from the maximum, since the evaluation function isn't convex in the number of tables used. The ugly quick fix is to just start with a random number of tables of each type.
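
    A minimal sketch of that outer loop (the greedy evaluator is replaced by a toy cost function here, not the real Hospital simulation):

    #include <bits/stdc++.h>
    using namespace std;

    // Stand-in for "run the randomized greedy with this many tables and return
    // its score" (lower = better); the noise models the random tie-breaking.
    double evaluateGreedy(int tables, mt19937& rng) {
        double noise = uniform_real_distribution<double>(0.0, 5.0)(rng);
        return abs(tables - 7) * 10.0 + tables + noise;   // toy cost with a minimum near 7 tables
    }

    int main() {
        mt19937 rng(2016);
        const int maxTables = 30;
        int cur = uniform_int_distribution<int>(1, maxTables)(rng);        // random starting count
        double curVal = evaluateGreedy(cur, rng);
        double best = curVal;
        int bestTables = cur;

        for (int iter = 0; iter < 1000; iter++) {
            int next = min(maxTables, max(1, cur + (rng() % 2 ? 1 : -1))); // neighbouring count
            double val = evaluateGreedy(next, rng);
            // Accept improvements; occasionally accept a worse move to escape local minima.
            if (val < curVal || rng() % 20 == 0) {
                cur = next;
                curVal = val;
                if (val < best) { best = val; bestTables = next; }
            }
        }
        cout << "best number of tables: " << bestTables << " (score " << best << ")\n";
    }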

    This was enough for a (rather) solid 2nd place after 2-3h of sleep :)