Given a table Servers that logs server start and stop events, calculate the total time all servers were running.
The Servers table contains:
- `server_id` - the server identifier
- `status_time` - timestamp of the status change
- `session_status` - either `'start'` or `'stop'`
Write a query to find the total uptime across all servers and return the result rounded down to the nearest number of full days.
Table Schema
| Column Name | Type | Description |
|---|---|---|
| server_id | int | Server identifier |
| status_time | datetime | Timestamp when status changed |
| session_status | enum | Either 'start' or 'stop' indicating server state |

(server_id, status_time, session_status) is the primary key for this table.
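For reference, a minimal DDL sketch of this schema, assuming MySQL; the exact types in the original problem may differ.

```sql
CREATE TABLE Servers (
    server_id      INT NOT NULL,                    -- server identifier
    status_time    DATETIME NOT NULL,               -- timestamp of the status change
    session_status ENUM('start', 'stop') NOT NULL,  -- server state
    PRIMARY KEY (server_id, status_time, session_status)
);
```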
Input & Output
Example 1 Input:
| server_id | status_time | session_status |
|---|---|---|
| 3 | 2023-11-04 16:29:47 | start |
| 3 | 2023-11-05 01:49:47 | stop |
| 3 | 2023-11-25 01:37:08 | start |
| 3 | 2023-11-25 03:50:08 | stop |
| 1 | 2023-11-13 03:05:31 | start |
| 1 | 2023-11-13 11:10:31 | stop |
| 4 | 2023-11-29 15:11:17 | start |
| 4 | 2023-11-29 15:42:17 | stop |
| 5 | 2023-11-16 19:42:22 | start |
| 5 | 2023-11-16 21:08:22 | stop |
Output:
| total_uptime_days |
|---|
| 1 |

Explanation: Server 3 ran for ~11.55 hours (9.33 + 2.22), server 1 for ~8.08 hours, server 4 for ~0.52 hours, and server 5 for ~1.43 hours, a total of ~21.6 hours for the rows shown, which alone would round down to 0 days. The input above is a subset of the original problem's full dataset, whose total uptime is ~44.46 hours; rounded down, that is 1 full day, matching the output.
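One way to compute the per-session durations above, assuming MySQL 8+ for window functions: pair each 'start' with the next event on the same server via `LEAD`, then take the difference in seconds. The alias `t` and the column names `start_time`, `stop_time`, and `session_seconds` are introduced here for illustration.

```sql
-- Sketch: per-session duration, pairing each start with the next event.
SELECT server_id,
       TIMESTAMPDIFF(SECOND, start_time, stop_time) AS session_seconds
FROM (
    SELECT
        server_id,
        session_status,
        status_time AS start_time,
        LEAD(status_time) OVER (      -- the event that follows this one
            PARTITION BY server_id
            ORDER BY status_time
        ) AS stop_time
    FROM Servers
) AS t
WHERE session_status = 'start';       -- keep only start -> stop pairs
```

On the rows shown above this yields 33600, 7980, 29100, 1860, and 5160 seconds, or 77,700 seconds ≈ 21.6 hours in total.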
Example 2 Input:
| server_id | status_time | session_status |
|---|---|---|
| 1 | 2023-11-01 09:00:00 | start |
| 1 | 2023-11-02 09:00:00 | stop |
Output:
| total_uptime_days |
|---|
| 1 |

Explanation: Server 1 ran for exactly 24 hours (1 full day), so the result is 1.
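The rounding rule can be checked directly on this session with a throwaway query, assuming MySQL:

```sql
-- 24 hours is exactly 86400 seconds; FLOOR(86400 / 86400) = 1 full day.
SELECT FLOOR(
    TIMESTAMPDIFF(SECOND, '2023-11-01 09:00:00', '2023-11-02 09:00:00') / 86400
) AS total_uptime_days;   -- returns 1
```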
Example 3 Input:
| server_id | status_time | session_status |
|---|---|---|
| 1 | 2023-11-01 09:00:00 | start |
| 2 | 2023-11-01 10:00:00 | start |
Output:
| total_uptime_days |
|---|
| 0 |

Explanation: Both servers started but never stopped, so there are no complete sessions to calculate uptime. The result is 0.
Constraints
- 1 ≤ `server_id` ≤ 100
- `session_status` is either `'start'` or `'stop'`
- `status_time` is a valid datetime
- Each server can have multiple start-stop sessions
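Putting the pieces together, a minimal solution sketch, assuming MySQL 8+; the CTE name `paired` and the column `session_seconds` are names introduced here for illustration, not part of the problem. A 'start' with no following 'stop' (as in Example 3) gets a NULL from `LEAD`, so it is filtered out, and `COALESCE` keeps the result 0 when no complete session exists.

```sql
-- A sketch under the assumptions above, not the only valid approach.
WITH paired AS (
    SELECT
        session_status,
        TIMESTAMPDIFF(
            SECOND,
            status_time,
            LEAD(status_time) OVER (   -- next event for the same server
                PARTITION BY server_id
                ORDER BY status_time
            )
        ) AS session_seconds
    FROM Servers
)
SELECT
    COALESCE(FLOOR(SUM(session_seconds) / 86400), 0) AS total_uptime_days
FROM paired
WHERE session_status = 'start'       -- keep only start rows...
  AND session_seconds IS NOT NULL;   -- ...that have a matching stop
```

Summing seconds before flooring matters: flooring each session to days first would discard partial days that together add up to full ones.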