Date and DateTime values in ClickHouse are immutable. You can't directly modify individual components like year, month, or day within an existing value. Instead, you create new dates using ClickHouse's specialized functions that add, subtract, or reset specific time components.
This guide walks through the complete toolkit of date modification functions, from basic addYears operations to handling timezone edge cases and building real-time APIs that serve dynamically adjusted temporal data.
Key functions for updating date parts
Here's the thing about ClickHouse dates: you can't actually change them in place. Every "modification" produces a new Date or DateTime value, built by adding or subtracting intervals from the original.
The primary approach uses the addX family of functions, where X represents the time component you want to modify. Pass your original date and a numeric value (positive or negative) to calculate a new date with the adjusted component.
addYears
The addYears(date, num) function adds or subtracts years from any Date or DateTime value. When you pass a negative number, it subtracts years instead of adding them.
SELECT addYears('2024-03-15', 2) AS future_date;
-- Result: 2026-03-15
SELECT addYears('2024-03-15 14:30:00', -1) AS past_datetime;
-- Result: 2023-03-15 14:30:00
addMonths
The addMonths(date, num) function handles month arithmetic with automatic overflow correction. If adding months would produce an invalid date (like February 31st), ClickHouse adjusts to the last valid day of that month.
SELECT addMonths('2024-01-31', 1) AS adjusted_date;
-- Result: 2024-02-29 (leap year adjustment)
SELECT addMonths('2023-01-31', 1) AS adjusted_date;
-- Result: 2023-02-28 (non-leap year adjustment)
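The clamping rule is easy to reason about outside ClickHouse. As an illustration, here is the same overflow correction sketched in Python with the standard library (the add_months helper is hypothetical, not a ClickHouse API):

```python
from datetime import date
import calendar

def add_months(d: date, n: int) -> date:
    # Mirror ClickHouse's addMonths overflow rule: if the original
    # day doesn't exist in the target month, clamp to its last day.
    total = d.month - 1 + n
    year, month = d.year + total // 12, total % 12 + 1
    last_day = calendar.monthrange(year, month)[1]
    return date(year, month, min(d.day, last_day))

print(add_months(date(2024, 1, 31), 1))  # 2024-02-29 (leap year)
print(add_months(date(2023, 1, 31), 1))  # 2023-02-28
```

Floor division on the month total also makes negative offsets behave correctly across year boundaries.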
addDays
The addDays(date, num) function provides straightforward day arithmetic without complex edge cases. It works consistently across month and year boundaries.
SELECT addDays('2024-02-28', 2) AS leap_forward;
-- Result: 2024-03-01
SELECT addDays('2024-01-01', -1) AS year_backward;
-- Result: 2023-12-31
addHours
The addHours(datetime, num) function only works with DateTime and DateTime64 types, not plain Date types. Hour additions automatically handle day rollovers and maintain timezone consistency.
SELECT addHours('2024-03-15 22:30:00', 3) AS next_day;
-- Result: 2024-03-16 01:30:00
addMinutes
The addMinutes(datetime, num) function operates similarly to addHours but targets minute-level precision. It cascades changes up to hours and days when necessary.
SELECT addMinutes('2024-03-15 23:45:00', 30) AS rollover;
-- Result: 2024-03-16 00:15:00
addSeconds
The addSeconds(datetime, num) function provides the finest granularity for time adjustments. With DateTime64 types, it preserves subsecond precision during calculations.
SELECT addSeconds('2024-03-15 23:59:30', 45) AS precise_time;
-- Result: 2024-03-16 00:00:15
toStartOfInterval variants
When you want to set a component to a specific value rather than add to it, use functions like toStartOfYear, toStartOfMonth, or toStartOfDay. These functions reset every component smaller than the named unit to its minimum value.
SELECT toStartOfMonth('2024-03-15 14:30:00') AS month_start;
-- Result: 2024-03-01 00:00:00
SELECT toStartOfHour('2024-03-15 14:30:45') AS hour_start;
-- Result: 2024-03-15 14:00:00
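The reset behavior amounts to simple field replacement. As a sketch, the same truncation in Python with the standard library (helper names are mine, not ClickHouse's):

```python
from datetime import datetime

def to_start_of_month(dt: datetime) -> datetime:
    # Reset the day to 1 and all time components to zero.
    return dt.replace(day=1, hour=0, minute=0, second=0, microsecond=0)

def to_start_of_hour(dt: datetime) -> datetime:
    # Keep the hour, zero everything smaller.
    return dt.replace(minute=0, second=0, microsecond=0)

dt = datetime(2024, 3, 15, 14, 30, 45)
print(to_start_of_month(dt))  # 2024-03-01 00:00:00
print(to_start_of_hour(dt))   # 2024-03-15 14:00:00
```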
Date and DateTime type recap
ClickHouse offers several date and time types, each with specific ranges and precision levels that affect how you can modify their components.
Date range limits
The standard Date type covers 1970 to 2149, while the wider Date32 type covers 1900 to 2299. When using addYears or similar functions, results outside these ranges cause errors or unexpected behavior.
-- This works fine
SELECT addYears(toDate('2140-01-01'), 5) AS within_range;
-- This exceeds the Date range and may wrap or error depending on your ClickHouse version
SELECT addYears(toDate('2145-01-01'), 10) AS outside_range;
DateTime precision and storage
DateTime types store seconds since the Unix epoch with optional timezone information, while DateTime64 adds fractional seconds up to nanosecond precision. The precision affects how addSeconds behaves with subsecond values.
Syntax and examples for each component
Let's walk through practical examples showing how to modify each date component using real-world scenarios.
Change year example
Updating years often comes up when analyzing year-over-year trends or creating test data. The addYears function handles leap year boundaries automatically.
-- Update order dates to current year for analysis
SELECT
order_id,
original_date,
addYears(original_date, 2024 - toYear(original_date)) AS current_year_date
FROM orders
WHERE toYear(original_date) < 2024;
Change month example
Month modifications require careful handling of day overflow situations. ClickHouse's addMonths function automatically adjusts invalid dates to the last day of the target month.
-- Shift all January dates to March, handling Feb 29th edge cases
SELECT
event_date,
addMonths(event_date, 2) AS march_date,
-- Show the day adjustment when it occurs
toDayOfMonth(event_date) AS original_day,
toDayOfMonth(addMonths(event_date, 2)) AS adjusted_day
FROM events
WHERE toMonth(event_date) = 1;
Change day example
Day modifications work predictably across month boundaries. Converting all dates to the first day of their respective months is a common pattern.
-- Convert all dates to month start for grouping
SELECT
toStartOfMonth(transaction_date) AS month_start,
COUNT(*) AS transaction_count,
SUM(amount) AS total_amount
FROM transactions
GROUP BY month_start
ORDER BY month_start;
Change hour example
Hour modifications on DateTime columns automatically handle day transitions. The pattern commonly appears in time-based data processing and timezone adjustments.
-- Shift all timestamps forward by 3 hours for timezone correction
SELECT
user_id,
original_timestamp,
addHours(original_timestamp, 3) AS adjusted_timestamp
FROM user_activities
WHERE toDate(original_timestamp) = '2024-03-15';
Change minute example
Minute adjustments often help with data alignment or rounding operations. You can combine addMinutes with extraction functions to achieve precise timing modifications.
-- Round timestamps down to the nearest 15-minute boundary
SELECT
log_timestamp,
addMinutes(
toStartOfHour(log_timestamp),
intDiv(toMinute(log_timestamp), 15) * 15
) AS rounded_timestamp
FROM system_logs;
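The same floor-to-bucket logic can be checked outside the database. A standalone Python sketch (the helper name is mine):

```python
from datetime import datetime

def floor_to_15_minutes(dt: datetime) -> datetime:
    # Floor the minutes to a 15-minute boundary and zero the
    # smaller components -- the same shape as the SQL above.
    return dt.replace(minute=(dt.minute // 15) * 15, second=0, microsecond=0)

print(floor_to_15_minutes(datetime(2024, 3, 15, 14, 37, 22)))  # 2024-03-15 14:30:00
```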
Change second example
Second-level precision becomes important when working with high-frequency data or precise timing requirements. DateTime64 types preserve fractional seconds during modifications.
-- Add random seconds to timestamps for testing
SELECT
event_id,
base_timestamp,
addSeconds(base_timestamp, rand() % 3600) AS randomized_timestamp
FROM test_events;
Handling time zone and overflow edge cases
Date modifications can produce unexpected results when timezone handling or edge cases aren't properly considered.
UTC versus local time
ClickHouse DateTime types can include timezone information, which affects how additions and subtractions behave. When modifying DateTime values with timezone data, the functions maintain timezone consistency.
-- Functions behave differently with timezone-aware dates
SELECT
addHours(toDateTime('2024-03-15 10:00:00', 'America/New_York'), 5) AS ny_time,
addHours(toDateTime('2024-03-15 10:00:00', 'UTC'), 5) AS utc_time;
Always specify timezones explicitly when working with DateTime modifications across different regions or when daylight saving time transitions might affect your results.
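ClickHouse stores DateTime as an absolute Unix timestamp and applies the timezone only when rendering, so adding hours shifts the instant, not the wall clock. A small Python sketch (using the standard zoneinfo module, assuming a system tz database is available) shows why wall-clock results can jump around a DST transition:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# Start just before the 2024 US spring-forward transition:
# 2024-03-10 04:00 UTC is 23:00 on March 9 in New York (EST, UTC-5).
ny = ZoneInfo("America/New_York")
t = datetime(2024, 3, 10, 4, 0, tzinfo=timezone.utc)

# Add 5 absolute hours, the way ClickHouse's addHours does.
shifted = t + timedelta(hours=5)

# The wall clock in New York advances 6 hours, because 02:00-03:00
# local time never existed that night (EST -> EDT).
print(shifted.astimezone(ny))  # 2024-03-10 05:00:00-04:00
```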
Leap year safety
February 29th creates special cases when adding or subtracting years. ClickHouse automatically adjusts leap day dates when the target year isn't a leap year.
-- Demonstrates leap year handling
SELECT
addYears('2024-02-29', 1) AS non_leap_year, -- Results in 2025-02-28
addYears('2024-02-29', 4) AS next_leap_year; -- Results in 2028-02-29
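The same clamping idea applies to year arithmetic. A Python sketch of the leap-day adjustment (the add_years helper is illustrative, not a ClickHouse API):

```python
from datetime import date
import calendar

def add_years(d: date, n: int) -> date:
    # Feb 29 clamps to Feb 28 when the target year isn't a leap year.
    year = d.year + n
    last_day = calendar.monthrange(year, d.month)[1]
    return date(year, d.month, min(d.day, last_day))

print(add_years(date(2024, 2, 29), 1))  # 2025-02-28
print(add_years(date(2024, 2, 29), 4))  # 2028-02-29
```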
Invalid day or month detection
When date arithmetic produces impossible dates, ClickHouse applies correction logic rather than throwing errors. Understanding this behavior helps you anticipate results and validate data integrity.
-- Month addition with day overflow
SELECT
'2024-01-31' AS original,
addMonths('2024-01-31', 1) AS february_adjusted, -- Becomes 2024-02-29
addMonths('2024-01-31', 2) AS march_result; -- Becomes 2024-03-31
Choosing between addX functions and toStartOfInterval
Different modification approaches serve different use cases, and choosing the right function affects both performance and correctness.
Performance comparison
The addX functions perform arithmetic operations, while toStartOfX functions essentially reconstruct dates with specific components zeroed out. For large datasets, toStartOfX functions often execute faster when you're normalizing dates rather than shifting them.
-- Faster for date normalization
SELECT toStartOfMonth(event_date) FROM large_table;
-- Better for relative date shifts
SELECT addMonths(event_date, -1) FROM large_table;
Use cases by granularity
Choose your approach based on whether you're performing relative shifts or absolute positioning:
- Relative modifications: use addYears, addMonths, or addDays when shifting dates by specific amounts
- Absolute positioning: use toStartOfYear, toStartOfMonth, or toStartOfDay when normalizing to specific boundaries
- Precision control: use addHours, addMinutes, or addSeconds for fine-grained time adjustments
- Rounding operations: combine both approaches to achieve complex date manipulations
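For the "combine both approaches" case, here is a small Python sketch of a ceiling-to-day-boundary operation, mirroring what toStartOfDay plus addDays would do in SQL (the helper name is mine):

```python
from datetime import datetime, timedelta

def ceil_to_day(dt: datetime) -> datetime:
    # Absolute positioning (truncate to start of day) combined with
    # a relative shift (add one day), unless already on a boundary.
    start = dt.replace(hour=0, minute=0, second=0, microsecond=0)
    return start if dt == start else start + timedelta(days=1)

print(ceil_to_day(datetime(2024, 3, 15, 14, 30)))  # 2024-03-16 00:00:00
print(ceil_to_day(datetime(2024, 3, 15)))          # 2024-03-15 00:00:00
```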
Updating existing columns at scale
When you need to modify date columns in existing tables with millions or billions of rows, the approach you choose significantly impacts performance and cluster stability.
ALTER TABLE UPDATE pattern
The ALTER TABLE ... UPDATE syntax allows in-place modifications of existing data. ClickHouse processes these operations as mutations, which run asynchronously in the background.
-- Update all timestamps to current year
ALTER TABLE user_events
UPDATE event_timestamp = addYears(event_timestamp, 2024 - toYear(event_timestamp))
WHERE toYear(event_timestamp) < 2024;
Monitor mutation progress through the system.mutations table to track completion status and identify any issues.
Create table as select swap
For large-scale modifications, creating a new table with corrected data often performs better than in-place updates. The approach also provides an easy rollback path.
-- Create corrected table
CREATE TABLE user_events_corrected AS
SELECT
user_id,
addHours(event_timestamp, 3) AS event_timestamp,
event_type,
properties
FROM user_events;
-- Swap table names after validation
RENAME TABLE user_events TO user_events_backup,
user_events_corrected TO user_events;
Monitoring system.mutations
Track mutation progress and health using ClickHouse's system tables. Long-running mutations can impact cluster performance, so monitoring helps you identify and resolve issues early.
-- Check active mutations
SELECT
table,
mutation_id,
command,
create_time,
is_done,
latest_fail_reason
FROM system.mutations
WHERE is_done = 0
ORDER BY create_time DESC;
Common mistakes and debugging tips
Date modifications can produce subtle errors that only surface with specific data combinations or edge cases.
Misordered parameters
ClickHouse date functions expect parameters in specific orders, and mixing them up produces incorrect results rather than obvious errors.
Common parameter order mistakes include:
- Correct: addMonths(date, number_of_months)
- Incorrect: addMonths(number_of_months, date)
- Correct: makeDateTime(year, month, day, hour, minute, second)
- Incorrect: makeDateTime(day, month, year, hour, minute, second)
Time zone mismatch
When working with DateTime values across different timezones, inconsistent timezone handling can shift your dates unexpectedly. Always specify timezones explicitly when timezone accuracy matters.
-- Problematic: mixed timezone handling
SELECT addHours(toDateTime('2024-03-15 10:00:00'), 5); -- Assumes server timezone
-- Better: explicit timezone specification
SELECT addHours(toDateTime('2024-03-15 10:00:00', 'UTC'), 5);
Large mutation rollbacks
Failed mutations on large tables can consume significant cluster resources. When mutations fail or need cancellation, use the KILL MUTATION command with the specific mutation ID from system.mutations.
-- Find problematic mutation
SELECT mutation_id, latest_fail_reason
FROM system.mutations
WHERE table = 'large_table' AND is_done = 0;
-- Kill the mutation
KILL MUTATION WHERE mutation_id = 'mutation_123';
Building time-series analytics APIs with date manipulation
A common analytics pattern involves normalizing and shifting timestamps to create consistent time-based reporting across different time zones and periods. Using ClickHouse's date modification functions, you can build APIs that dynamically adjust temporal data for business intelligence, user-facing dashboards, and real-time analytics.
Here's how to build a complete time-series analytics API using Tinybird's managed ClickHouse platform.
Step 1: Create a data source for time-series events
First, create a data source to store time-series events with various date components. Create a time_series_events.datasource file:
SCHEMA >
`event_id` String `json:$.event_id`,
`user_id` String `json:$.user_id`,
`event_timestamp` DateTime64(3, 'UTC') `json:$.event_timestamp`,
`event_type` LowCardinality(String) `json:$.event_type`,
`value` Float64 `json:$.value`,
`device_timezone` String `json:$.device_timezone`,
`region` LowCardinality(String) `json:$.region`,
`properties` String `json:$.properties`,
-- Materialized columns for common date transformations
`event_date` Date MATERIALIZED toDate(event_timestamp),
`event_hour` UInt8 MATERIALIZED toHour(event_timestamp),
`event_year` UInt16 MATERIALIZED toYear(event_timestamp),
`event_month` UInt8 MATERIALIZED toMonth(event_timestamp)
ENGINE "MergeTree"
ENGINE_PARTITION_KEY "toYYYYMM(event_timestamp)"
ENGINE_SORTING_KEY "event_timestamp, user_id"
Step 2: Test locally with sample data
Start the local Tinybird development server:
tb dev
This starts a local Tinybird instance at http://localhost:7181. Your data sources are automatically deployed to the local environment.
Ingest sample time-series data using the local Events API:
curl -X POST \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TINYBIRD_TOKEN" \
"http://localhost:7181/v0/events?name=time_series_events" \
-d '[
{"event_id": "evt_001", "user_id": "user_123", "event_timestamp": "2024-01-15 10:30:00", "event_type": "login", "value": 1.0, "device_timezone": "America/New_York", "region": "US", "properties": "{\"device\": \"mobile\"}"},
{"event_id": "evt_002", "user_id": "user_456", "event_timestamp": "2024-02-29 14:45:00", "event_type": "purchase", "value": 99.99, "device_timezone": "Europe/London", "region": "UK", "properties": "{\"product\": \"premium\"}"},
{"event_id": "evt_003", "user_id": "user_789", "event_timestamp": "2024-12-31 23:59:00", "event_type": "logout", "value": 0.0, "device_timezone": "Asia/Tokyo", "region": "JP", "properties": "{\"duration\": 3600}"},
{"event_id": "evt_004", "user_id": "user_012", "event_timestamp": "2024-06-15 08:15:00", "event_type": "view", "value": 1.0, "device_timezone": "America/Los_Angeles", "region": "US", "properties": "{\"page\": \"dashboard\"}"}
]'
Step 3: Create date manipulation analytics APIs
Create a pipe for flexible date adjustment and analytics. Create date_analytics.pipe:
NODE date_adjusted_events
SQL >
%
SELECT
event_id,
user_id,
event_timestamp AS original_timestamp,
-- Dynamic date adjustments based on API parameters
{% if defined(year_offset) %}
addYears(event_timestamp, {{Int32(year_offset, 0)}}) AS adjusted_timestamp,
{% elif defined(month_offset) %}
addMonths(event_timestamp, {{Int32(month_offset, 0)}}) AS adjusted_timestamp,
{% elif defined(day_offset) %}
addDays(event_timestamp, {{Int32(day_offset, 0)}}) AS adjusted_timestamp,
{% elif defined(hour_offset) %}
addHours(event_timestamp, {{Int32(hour_offset, 0)}}) AS adjusted_timestamp,
{% else %}
event_timestamp AS adjusted_timestamp,
{% end %}
-- Time normalization options
{% if defined(normalize_to) %}
{% if String(normalize_to) == 'start_of_year' %}
toStartOfYear(event_timestamp) AS normalized_timestamp,
{% elif String(normalize_to) == 'start_of_month' %}
toStartOfMonth(event_timestamp) AS normalized_timestamp,
{% elif String(normalize_to) == 'start_of_day' %}
toStartOfDay(event_timestamp) AS normalized_timestamp,
{% elif String(normalize_to) == 'start_of_hour' %}
toStartOfHour(event_timestamp) AS normalized_timestamp,
{% else %}
event_timestamp AS normalized_timestamp,
{% end %}
{% else %}
event_timestamp AS normalized_timestamp,
{% end %}
event_type,
value,
device_timezone,
region,
-- Timezone conversion for display
{% if defined(display_timezone) %}
toTimeZone(event_timestamp, {{String(display_timezone, 'UTC')}}) AS display_timestamp,
{% else %}
event_timestamp AS display_timestamp,
{% end %}
-- Date component extraction
toYear(event_timestamp) AS year,
toMonth(event_timestamp) AS month,
toDayOfMonth(event_timestamp) AS day,
toHour(event_timestamp) AS hour
FROM time_series_events
WHERE event_timestamp >= {{DateTime(start_date, '2024-01-01 00:00:00')}}
AND event_timestamp <= {{DateTime(end_date, '2024-12-31 23:59:59')}}
{% if defined(event_type_filter) %}
AND event_type = {{String(event_type_filter)}}
{% end %}
{% if defined(region_filter) %}
AND region = {{String(region_filter)}}
{% end %}
ORDER BY adjusted_timestamp DESC
{% if defined(limit) %}
LIMIT {{Int32(limit, 100)}}
{% end %}
TYPE ENDPOINT
Create a time-bucketed aggregation API. Create time_bucket_analytics.pipe:
NODE time_bucket_aggregation
SQL >
%
WITH date_adjusted AS (
SELECT
event_timestamp,
-- Apply date modifications for trend analysis
{% if defined(shift_years) %}
addYears(event_timestamp, {{Int32(shift_years, 0)}}) AS analysis_timestamp,
{% elif defined(shift_months) %}
addMonths(event_timestamp, {{Int32(shift_months, 0)}}) AS analysis_timestamp,
{% else %}
event_timestamp AS analysis_timestamp,
{% end %}
user_id,
event_type,
value,
region
FROM time_series_events
WHERE event_timestamp >= {{DateTime(start_date, '2024-01-01 00:00:00')}}
AND event_timestamp <= {{DateTime(end_date, '2024-12-31 23:59:59')}}
{% if defined(region_filter) %}
AND region = {{String(region_filter)}}
{% end %}
)
SELECT
-- Time bucketing based on granularity parameter
{% if String(bucket_granularity, 'hour') == 'year' %}
toStartOfYear(analysis_timestamp) AS time_bucket,
'year' AS bucket_type,
{% elif String(bucket_granularity, 'hour') == 'month' %}
toStartOfMonth(analysis_timestamp) AS time_bucket,
'month' AS bucket_type,
{% elif String(bucket_granularity, 'hour') == 'day' %}
toStartOfDay(analysis_timestamp) AS time_bucket,
'day' AS bucket_type,
{% elif String(bucket_granularity, 'hour') == 'minute' %}
toStartOfMinute(analysis_timestamp) AS time_bucket,
'minute' AS bucket_type,
{% else %}
toStartOfHour(analysis_timestamp) AS time_bucket,
'hour' AS bucket_type,
{% end %}
event_type,
region,
count() AS event_count,
sum(value) AS total_value,
avg(value) AS avg_value,
min(value) AS min_value,
max(value) AS max_value,
uniq(user_id) AS unique_users,
-- Time period analysis
dateDiff('day', min(analysis_timestamp), max(analysis_timestamp)) AS period_days,
-- Edge case handling for leap years and month boundaries
CASE
WHEN toMonth(analysis_timestamp) = 2 AND toDayOfMonth(analysis_timestamp) = 29
THEN 'leap_day_event'
WHEN toDayOfMonth(analysis_timestamp) = 31
THEN 'month_end_event'
ELSE 'regular_event'
END AS date_edge_case
FROM date_adjusted
GROUP BY time_bucket, bucket_type, event_type, region, date_edge_case
ORDER BY time_bucket DESC, total_value DESC
{% if defined(limit) %}
LIMIT {{Int32(limit, 200)}}
{% end %}
TYPE ENDPOINT
Step 4: Test the APIs locally
Test your date manipulation APIs locally:
# Test basic date adjustment
curl "http://localhost:7181/v0/pipes/date_analytics.json?start_date=2024-01-01%2000:00:00&end_date=2024-12-31%2023:59:59&hour_offset=3"
# Test year normalization for year-over-year analysis
curl "http://localhost:7181/v0/pipes/date_analytics.json?normalize_to=start_of_year&display_timezone=America/New_York"
# Test time bucket aggregation with month shifting
curl "http://localhost:7181/v0/pipes/time_bucket_analytics.json?bucket_granularity=month&shift_months=-6&region_filter=US"
# Test leap year and edge case handling
curl "http://localhost:7181/v0/pipes/time_bucket_analytics.json?start_date=2024-02-28%2000:00:00&end_date=2024-03-02%2023:59:59&bucket_granularity=day"
You should see the date manipulation functions in action: timestamps are dynamically adjusted, normalized, and aggregated based on the API parameters.
Step 5: Deploy to production
Once you've tested your time-series analytics locally and verified the date manipulations work correctly, deploy to your Tinybird cloud workspace:
tb --cloud deploy
Your APIs are now available in production:
# Production API calls
curl "$TINYBIRD_HOST/v0/pipes/date_analytics.json?year_offset=1&normalize_to=start_of_month&display_timezone=Europe/London" \
-H "Authorization: Bearer $TINYBIRD_TOKEN"
curl "$TINYBIRD_HOST/v0/pipes/time_bucket_analytics.json?bucket_granularity=day&shift_months=-3" \
-H "Authorization: Bearer $TINYBIRD_TOKEN"
The APIs return comprehensive time-series analytics with proper date handling:
{
"meta": [
{"name": "event_id", "type": "String"},
{"name": "user_id", "type": "String"},
{"name": "original_timestamp", "type": "DateTime64(3)"},
{"name": "adjusted_timestamp", "type": "DateTime64(3)"},
{"name": "normalized_timestamp", "type": "DateTime64(3)"},
{"name": "event_type", "type": "String"},
{"name": "value", "type": "Float64"},
{"name": "device_timezone", "type": "String"},
{"name": "region", "type": "String"},
{"name": "display_timestamp", "type": "DateTime64(3)"},
{"name": "year", "type": "UInt16"},
{"name": "month", "type": "UInt8"},
{"name": "day", "type": "UInt8"},
{"name": "hour", "type": "UInt8"}
],
"data": [
{
"event_id": "evt_003",
"user_id": "user_789",
"original_timestamp": "2024-12-31T23:59:00.000",
"adjusted_timestamp": "2025-12-31T23:59:00.000",
"normalized_timestamp": "2024-12-01T00:00:00.000",
"event_type": "logout",
"value": 0.0,
"device_timezone": "Asia/Tokyo",
"region": "JP",
"display_timestamp": "2024-12-31T23:59:00.000",
"year": 2024,
"month": 12,
"day": 31,
"hour": 23
}
],
"rows": 1,
"statistics": {
"elapsed": 0.005,
"rows_read": 4,
"bytes_read": 512
}
}
This approach provides:
- Dynamic date adjustments using the addYears, addMonths, addDays, and addHours functions
- Time normalization with the toStartOfYear, toStartOfMonth, toStartOfDay, and toStartOfHour functions
- Timezone conversion for global applications with the toTimeZone function
- Flexible time bucketing for different granularities of analysis
- Edge case handling for leap years, month boundaries, and timezone transitions
- Materialized date components for optimal query performance
Your time-series data becomes available as production-ready API endpoints that automatically handle complex date manipulations while providing real-time analytics insights. Start building with Tinybird's free plan to create your own time-series analytics APIs.
What to do next with ClickHouse and Tinybird
ClickHouse's date modification functions provide powerful tools for transforming time-series data, as long as you understand their immutability, edge cases, and performance implications. The addX family of functions handles relative date arithmetic, while the toStartOfX functions normalize dates to specific boundaries.
Tinybird's managed ClickHouse service can be used for production workloads involving large-scale time series data. Tinybird handles the operational complexity of running ClickHouse at scale while providing tools and infrastructure for building and hosting real-time analytics APIs.
Whether you're normalizing timestamps for analytics, building time-series APIs, or processing streaming data with date modifications, Tinybird's platform enables a workflow from data ingestion to API deployment.
You can sign up for Tinybird free to build and test your first time-series analytics API in a few minutes.
Additional resources
- Date and Time Functions
- Type Conversion Functions
- DateTime Data Types
- MergeTree Engine
- Roll up data with materialized views
- Real-time analytics with billion rows at scale
- Tinybird vs. ClickHouse: What's the difference?
- Database Timestamps and Timezones Best Practices
FAQs about updating date parts in ClickHouse
Can I change only the hour component without affecting the date portion?
Yes. The addHours function adjusts only the time-of-day while preserving the date, as long as the addition doesn't roll past midnight. For example, addHours('2024-01-15 10:30:00', 5) returns '2024-01-15 15:30:00'.
How do I roll back a failed ALTER TABLE UPDATE mutation?
Use KILL MUTATION with the mutation ID from the system.mutations table to stop in-progress mutations. Completed mutations cannot be rolled back and require restoring from backup.
Does the date modification process differ for Date32 columns versus Date columns?
The same functions work for both Date and Date32 types, but Date32 supports a wider range (1900 to 2299) compared to Date (1970 to 2149). Function behavior is otherwise identical.