from typing import List

class Solution:
    def averageWaitingTime(self, customers: List[List[int]]) -> float:
        current_time = 0
        total_wait = 0
        for arrival, time in customers:
            # Chef starts this order at max(current_time, arrival)
            start = max(current_time, arrival)
            finish = start + time
            total_wait += finish - arrival
            current_time = finish
        return total_wait / len(customers)
class Solution {
public:
    double averageWaitingTime(vector<vector<int>>& customers) {
        long long currentTime = 0, totalWait = 0;
        for (const auto& c : customers) {
            int arrival = c[0], time = c[1];
            // Chef starts as soon as idle, or when the customer arrives
            currentTime = max(currentTime, (long long)arrival);
            currentTime += time;
            totalWait += currentTime - arrival;
        }
        return (double)totalWait / customers.size();
    }
};
class Solution {
    public double averageWaitingTime(int[][] customers) {
        long currentTime = 0, totalWait = 0;
        for (int[] c : customers) {
            int arrival = c[0], time = c[1];
            // Chef starts as soon as idle, or when the customer arrives
            currentTime = Math.max(currentTime, arrival);
            currentTime += time;
            totalWait += currentTime - arrival;
        }
        return (double) totalWait / customers.length;
    }
}
var averageWaitingTime = function(customers) {
    let currentTime = 0, totalWait = 0;
    for (const [arrival, time] of customers) {
        // Chef starts as soon as idle, or when the customer arrives
        currentTime = Math.max(currentTime, arrival);
        currentTime += time;
        totalWait += currentTime - arrival;
    }
    return totalWait / customers.length;
};
You are given a list of customers, where each customer is represented as a list [arrivalTime, timeNeeded]. Each customer arrives at the restaurant at arrivalTime, and it takes timeNeeded units of time to prepare their order. The chef can only serve one customer at a time and starts working on the next customer as soon as possible (either when the previous order is done or when the customer arrives, whichever is later).

The waiting time for each customer is defined as the time from their arrival to when their order is complete. Your task is to calculate and return the average waiting time for all customers.
Constraints:
At first glance, the problem seems to require simulating a queue at a restaurant. For each customer, you need to keep track of when the chef is available and how long the customer waits. A brute-force approach might involve simulating every single time unit, but this isn't necessary since the chef can only work on one order at a time, and the customers arrive in a given sequence.
The key insight is that for each customer, their actual waiting time depends on whether the chef is already idle or still busy with the previous order when they arrive. The chef's current time is either the finish time of the last order or the arrival time of the customer, whichever is greater.
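This availability rule is the heart of the solution, so it is worth isolating. The sketch below (the helper name is illustrative, not part of the solution class) shows the two cases: the chef is already idle when the customer arrives, or still busy with the previous order.

```python
def start_time(chef_free_at, arrival):
    # The chef begins this order at whichever moment is later:
    # when the previous order finishes, or when the customer arrives.
    return max(chef_free_at, arrival)

print(start_time(0, 5))  # chef idle since 0: starts at arrival -> 5
print(start_time(8, 5))  # chef busy until 8: customer waits -> 8
```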
Instead of tracking every time unit, we only need to track two values: the time at which the chef next becomes free, and the running total of waiting time.
Let's break down the solution step by step:
- current_time: Tracks when the chef is next available (initially 0).
- total_wait: Accumulates the total waiting time for all customers.
- For each customer, read their arrival and time_needed.
- The chef starts the order at max(current_time, arrival).
- The order finishes at start_time + time_needed.
- The customer's waiting time is finish_time - arrival.
- Update current_time to finish_time.
- Add the waiting time to total_wait.
- Finally, return total_wait / number_of_customers as a floating-point number.

This approach is efficient because it handles each customer exactly once, in a single pass with constant extra state.
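The steps above can be sketched as a standalone function (a minimal sketch; the function and variable names are illustrative, not taken from the original solutions):

```python
def average_waiting_time(customers):
    current_time = 0  # when the chef is next available
    total_wait = 0    # accumulated waiting time across all customers
    for arrival, time_needed in customers:
        start = max(current_time, arrival)   # chef starts as soon as possible
        finish = start + time_needed         # order completes here
        total_wait += finish - arrival       # wait = completion - arrival
        current_time = finish                # chef is busy until `finish`
    return total_wait / len(customers)

print(average_waiting_time([[1, 2], [2, 5], [4, 3]]))  # -> 5.0
```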
Let's consider the following input:

customers = [[1, 2], [2, 5], [4, 3]]

- Customer 1 arrives at 1 and needs 2 units: the chef starts at max(0, 1) = 1 and finishes at 3, so the wait is 3 - 1 = 2. Now current_time = 3, total_wait = 2.
- Customer 2 arrives at 2 and needs 5 units: the chef starts at max(3, 2) = 3 and finishes at 8, so the wait is 8 - 2 = 6. Now current_time = 8, total_wait = 8.
- Customer 3 arrives at 4 and needs 3 units: the chef starts at max(8, 4) = 8 and finishes at 11, so the wait is 11 - 4 = 7. Now current_time = 11, total_wait = 15.

Final average waiting time: 15 / 3 = 5.0
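The walkthrough can be reproduced with a short trace (a sketch; the function name is assumed for illustration and prints the same intermediate states as above):

```python
def trace(customers):
    current_time = total_wait = 0
    for arrival, time_needed in customers:
        # Start when the chef is free or the customer arrives, then finish the order
        current_time = max(current_time, arrival) + time_needed
        total_wait += current_time - arrival
        print(f"current_time = {current_time}, total_wait = {total_wait}")
    return total_wait / len(customers)

trace([[1, 2], [2, 5], [4, 3]])  # returns 5.0
```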
Brute-force approach: simulating every individual time unit would take time proportional to the total time span, which can be far larger than the number of customers. The optimized approach instead processes each customer exactly once, in O(n) time with O(1) extra space.
This efficiency comes from only tracking when the chef is available and updating totals as we process each customer.
The Average Waiting Time problem is a simulation of a simple queue, where each customer's wait depends on the chef's availability and their own arrival. By keeping track of the chef's current time and each customer's waiting time, we can efficiently compute the average without simulating each time unit. The solution is both simple and efficient, running in linear time with constant space, and demonstrates how understanding the problem's structure allows us to avoid unnecessary computation.