It has been said that a watch that is stopped keeps better time than one that loses 1 second per day. The stopped watch reads the correct time twice a day, while the one that loses 1 second per day is correct only once every 43,200 days, since it must fall a full 12 hours (43,200 seconds) behind before it agrees with the true time again. This maxim applies to old-fashioned 12-hour analog watches, whose hands move continuously (most digital watches would display nothing at all if stopped).
Given two such analog watches, both synchronized to midnight, that keep time at a constant rate but run slow by k and m seconds per day respectively, what time will the watches show when they next display exactly the same time?
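One way to reason about this: the two readings drift apart by |k - m| seconds per real day, so they coincide again once the gap has grown to one full 12-hour revolution of the dial, i.e. after 43,200 / |k - m| real days. The reading at that moment is the elapsed real time minus the seconds lost, reduced modulo 43,200. The sketch below follows that reasoning; it assumes k ≠ m, and the helper names (next_common_reading, as_clock) are hypothetical, not part of the original statement.

```python
from fractions import Fraction

HALF_DAY = 43200   # seconds in one 12-hour revolution of the dial
FULL_DAY = 86400   # real seconds per day

def next_common_reading(k, m):
    """Seconds past the 12 o'clock position shown on both faces when the
    watches next agree. k and m are the seconds lost per real day by each
    watch; k != m is assumed. Returns an exact Fraction."""
    # The readings drift apart by |k - m| seconds per real day; they coincide
    # again once the gap reaches one full 12-hour revolution.
    days = Fraction(HALF_DAY, abs(k - m))
    # Reading on the watch that loses k sec/day: elapsed real time minus its
    # total loss, reduced to the 12-hour dial. The other watch gives the same
    # value because the two readings differ by exactly HALF_DAY here.
    return (days * (FULL_DAY - k)) % HALF_DAY

def as_clock(seconds):
    """Format a dial reading (seconds past the 12 o'clock position) as
    HH:MM:SS, rounding to the nearest whole second for display."""
    s = int(round(seconds)) % HALF_DAY
    h, rem = divmod(s, 3600)
    mm, ss = divmod(rem, 60)
    return f"{h:02d}:{mm:02d}:{ss:02d}"

if __name__ == "__main__":
    # Example: watches losing 1 and 3 seconds per day drift apart by 2 s/day,
    # so they agree again after 21,600 real days, both reading 06:00:00.
    print(as_clock(next_common_reading(1, 3)))
```

Note that the result need not fall on a whole second in general; the Fraction return value keeps it exact, and rounding is applied only when formatting for display.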