This is the **Job Sequencing Problem** from **GeeksforGeeks**. The problem calls for a greedy approach.

In this problem, we are asked to find the **maximum number of jobs that can be done and the maximum total profit**, given that each job takes 1 unit of time and must be completed by its deadline. The input format is **(job id, deadline, profit)**.

#### My approach

Here is what I did:

1. I created a **map<int, multiset<int>>** with the **deadline as the key**, *storing all profits for that deadline in sorted order*. While inserting, I also tracked the **maximum deadline among all jobs**; call it **max_val**.
2. Then I created a **multiset final_jobs** and, for each deadline D, inserted into it the min(D, number of jobs with deadline D) largest profits. For example, among all jobs with D = 1 I took only the single largest profit; for D = 2 I took the two largest profits if there were at least two such jobs, and so on.
3. Finally, I took the **M largest values from final_jobs and added them up**, where M = min(max_val, size of final_jobs).
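Since I haven't pasted my code, here is a simplified C++ sketch of the logic described above (the function name and the {id, deadline, profit} layout are just for illustration, not my exact submission):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Simplified sketch of the approach described above (not my exact submission).
// Each job is {id, deadline, profit}.
long long maxProfitAsDescribed(const vector<array<int,3>>& jobs) {
    map<int, multiset<int>> byDeadline;   // deadline -> profits with that deadline, kept sorted
    int max_val = 0;                      // maximum deadline among all jobs

    for (const auto& j : jobs) {
        int deadline = j[1], profit = j[2];
        byDeadline[deadline].insert(profit);
        max_val = max(max_val, deadline);
    }

    // For each deadline D, keep the min(D, #jobs with deadline D) largest profits.
    multiset<int, greater<int>> final_jobs;
    for (const auto& [D, profits] : byDeadline) {
        int take = min(D, (int)profits.size());
        auto it = profits.rbegin();                  // walk from the largest profit down
        for (int k = 0; k < take; ++k, ++it)
            final_jobs.insert(*it);
    }

    // Add up the M largest values in final_jobs, M = min(max_val, size of final_jobs).
    int M = min(max_val, (int)final_jobs.size());
    long long total = 0;
    auto it = final_jobs.begin();                    // final_jobs is in descending order
    for (int k = 0; k < M; ++k, ++it)
        total += *it;
    return total;
}
```

For a small input such as {1,4,20}, {2,1,10}, {3,1,40}, {4,1,30} this sketch gives 60 (jobs 3 and 1), so the idea seems to work on the cases I tried by hand.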

However, my solution fails on a few test cases. I think my approach is correct, and I have tried to debug it, but I can't find the error. I have another approach in mind, but I won't switch to it until I find the mistake in this one (switching without understanding the error is against my norms).

Could anyone help me find the error in my approach or implementation?