1. Time Measurement Fundamentals
Basic Time Measurement
Explanation:
- Measures a time interval by counting oscillations of a reference clock: T = N / f
- T is the measured time interval
- N is the number of oscillations counted (ticks)
- f is the reference clock frequency
Example: With a 1 MHz clock, counting 1000 ticks gives T = 1000 / 10^6 s = 1 ms
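A minimal sketch of the tick-counting formula above; the function name and the 1 MHz default are illustrative, not from the source.

```python
def ticks_to_seconds(n_ticks: int, clock_hz: float = 1_000_000) -> float:
    """Convert a tick count N into a time interval T = N / f."""
    return n_ticks / clock_hz

# 1000 ticks of a 1 MHz clock -> 0.001 s (1 ms), matching the example above
print(ticks_to_seconds(1000))  # 0.001
```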
Measurement Error
Sources:
- Quantization error: the tick count is discrete, so each measurement can be off by up to one clock period (1/f)
- Frequency instability: the actual frequency deviates from nominal by Δf, adding a relative error of Δf/f over the measured interval
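A worked sketch of the two error sources, assuming the one-tick quantization bound and a relative frequency error of Δf/f; the function name and the 50 ppm figure are illustrative.

```python
def measurement_error_bound(n_ticks: int, clock_hz: float, rel_freq_error: float) -> float:
    """Worst-case error of a tick-count measurement: one clock period of
    quantization error plus the drift caused by frequency instability."""
    quantization = 1.0 / clock_hz                         # up to one tick
    instability = (n_ticks / clock_hz) * rel_freq_error   # (Δf/f) * measured interval
    return quantization + instability

# 1 ms measured with a 1 MHz clock that is off by 50 ppm
print(measurement_error_bound(1000, 1_000_000, 50e-6))  # ~1.05e-06 s
```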
2. Clock Synchronization
Clock Drift
Key Points:
- Measures how fast the local clock runs relative to a reference: drift = dC/dt
- C(t) is the local clock time
- t is the reference time
Example: drift = 1.001 means the clock runs 0.1% fast (sketch after the drift-rate list below)
Drift Rate
Usage:
- Measures deviation from perfect timing: ρ = dC/dt - 1
- Positive ρ: the clock is running fast
- Negative ρ: the clock is running slow
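A small sketch, under the definitions above, that estimates the drift factor dC/dt and the drift rate ρ from two (local, reference) timestamp pairs; the variable names are mine.

```python
def drift_factor(local_t0: float, ref_t0: float, local_t1: float, ref_t1: float) -> float:
    """Estimate dC/dt: local clock time elapsed per unit of reference time."""
    return (local_t1 - local_t0) / (ref_t1 - ref_t0)

def drift_rate(factor: float) -> float:
    """Drift rate rho = dC/dt - 1; positive means fast, negative means slow."""
    return factor - 1.0

f = drift_factor(local_t0=0.0, ref_t0=0.0, local_t1=10.010, ref_t1=10.0)
print(f, drift_rate(f))  # ~1.001, ~0.001 -> running 0.1% fast, as in the example
```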
Clock Offset
Purpose:
- Measures the time difference between two clocks: offset = C_i(t) - C_j(t)
- Used as the correction term in synchronization algorithms (sketch after the precision list below)
Precision
Application:
- Maximum offset between any two clocks in the system: π = max over i, j of |C_i(t) - C_j(t)|
- Key metric for how tightly the system is synchronized
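A sketch of offset and precision under the definitions above, assuming every clock is read at the same reference instant; only the standard library is used.

```python
from itertools import combinations

def offset(c_i: float, c_j: float) -> float:
    """Offset between two clocks read at the same reference instant."""
    return c_i - c_j

def precision(readings: list[float]) -> float:
    """Maximum absolute offset between any pair of clocks (the precision pi)."""
    return max(abs(a - b) for a, b in combinations(readings, 2))

clocks = [100.002, 99.998, 100.001]
print(precision(clocks))  # ~0.004 -> worst pairwise disagreement
```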
3. Synchronization Algorithms
Cristian’s Algorithm
Process:
- Client sends a time request at local time T0
- Server responds with its current time T_server; the reply arrives at local time T1
- I is the server's processing delay
- Correction: the client sets its clock to T_server + (T1 - T0 - I) / 2, splitting the remaining round trip evenly
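A minimal sketch of the client-side correction in Cristian's algorithm, using the T0 / T1 / T_server / I quantities from the list above; the network exchange itself is not shown.

```python
def cristian_adjust(t0: float, t1: float, t_server: float, processing: float = 0.0) -> float:
    """Client-side clock estimate: server time plus half the round trip
    (minus the server's processing delay), assuming symmetric network delays."""
    one_way = (t1 - t0 - processing) / 2.0
    return t_server + one_way

# request sent at 100.000 s, reply received at 100.020 s,
# server reported 200.008 s and spent 2 ms processing
print(cristian_adjust(100.000, 100.020, 200.008, 0.002))  # ~200.017
```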
Master-Slave
Implementation:
- d_f: forward delay (master to slave)
- d_r: return delay (slave to master)
- Assumes symmetric delays, d_f ≈ d_r ≈ RTT / 2, so the slave advances the master's timestamp by half the measured round trip
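A sketch of the symmetric-delay assumption: the slave estimates its offset from the master's timestamp and the measured round trip; names and numbers are illustrative.

```python
def slave_offset(master_time: float, slave_receive_time: float, round_trip: float) -> float:
    """Offset of the slave clock relative to the master, assuming the forward
    delay equals the return delay (each ~ round_trip / 2)."""
    estimated_master_now = master_time + round_trip / 2.0
    return slave_receive_time - estimated_master_now

# master said 500.000 s, slave clock read 500.013 s on receipt, RTT was 10 ms
print(slave_offset(500.000, 500.013, 0.010))  # ~0.008 -> slave is about 8 ms ahead
```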
4. Error Handling
Interval Consistency
Purpose:
- Ensures the time intervals [l_i, u_i] reported by different clocks overlap
- l_i: lower bounds on the true time
- u_i: upper bounds on the true time
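A sketch of an interval-consistency check under the [l_i, u_i] notation above: the intervals are mutually consistent exactly when they share a common point.

```python
def intervals_consistent(intervals: list[tuple[float, float]]) -> bool:
    """True if all [l_i, u_i] intervals overlap, i.e. max(l_i) <= min(u_i)."""
    lowers = [lo for lo, _ in intervals]
    uppers = [hi for _, hi in intervals]
    return max(lowers) <= min(uppers)

print(intervals_consistent([(10.0, 10.4), (10.2, 10.6), (10.1, 10.5)]))  # True
print(intervals_consistent([(10.0, 10.1), (10.3, 10.5)]))               # False
```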
Byzantine Fault Tolerance
Process:
- Remove the k highest and k lowest of the collected clock values
- Average remaining values
- Tolerates k faulty clocks
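A sketch of the trimming-and-averaging step described above (often called fault-tolerant averaging); it assumes more than 2k readings are collected so something survives the trim.

```python
def fault_tolerant_average(readings: list[float], k: int) -> float:
    """Discard the k lowest and k highest clock readings, then average the rest,
    so up to k arbitrarily faulty clocks cannot drag the result outside the range
    spanned by the correct clocks."""
    if len(readings) <= 2 * k:
        raise ValueError("need more than 2k readings to trim k from each end")
    trimmed = sorted(readings)[k:len(readings) - k]
    return sum(trimmed) / len(trimmed)

# one clock reports a wildly wrong value; with k=1 it is discarded
print(fault_tolerant_average([100.01, 100.02, 99.99, 175.00, 100.00], k=1))  # ~100.01
```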
5. Physical Limitations
Clock Granularity
Impact:
- g is the minimum time measurement unit (one tick of the local clock)
- Limits achievable synchronization accuracy (sketch after the error-bound list below)
Synchronization Error Bounds
Application:
- Bounds the actual delay: a delay counted as n ticks satisfies (n - 1)·g < d < (n + 1)·g
- Accounts for the granularity g at both the start and the end of the measurement
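A sketch of the granularity bound above: when a delay is counted as n ticks of a clock with granularity g, the actual delay can lie anywhere in ((n - 1)·g, (n + 1)·g); the function name is mine.

```python
def delay_bounds(n_ticks: int, granularity: float) -> tuple[float, float]:
    """Bounds on the actual delay when n_ticks were counted with the given
    granularity: the start and end events can each fall anywhere within a tick."""
    return ((n_ticks - 1) * granularity, (n_ticks + 1) * granularity)

# 5 ticks of a 10 ms clock: the real delay is somewhere between 40 ms and 60 ms
print(delay_bounds(5, 0.010))  # ~(0.04, 0.06)
```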
Jitter Analysis
Significance:
- Measures the variation of message delays or clock readings around their mean
- Affects the precision the system can achieve
- n is the number of nodes (or samples) over which the variation is computed
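A sketch of one common jitter measure, assuming jitter here means the standard deviation of per-node delay samples over the n nodes; the source's exact formula was not preserved, so this definition is an assumption.

```python
import statistics

def jitter(delays: list[float]) -> float:
    """Standard deviation of the delay samples across the n nodes:
    small jitter -> stable timing, large jitter -> poor achievable precision."""
    return statistics.pstdev(delays)  # population std dev over the n samples

print(jitter([0.010, 0.012, 0.011, 0.013]))  # ~0.0011 s of variation
```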