1. Time Measurement Fundamentals

Basic Time Measurement

Explanation:

  • Measures time by counting oscillations: T = N / f
  • T is the measured time interval
  • N counts oscillations
  • f is the reference clock frequency

    Example: With a 1 MHz clock, counting 1000 ticks gives T = 1000 / 10^6 s = 1 ms
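The counting relation can be sketched as follows (the function name is illustrative):

```python
def measure_time(ticks: int, frequency_hz: float) -> float:
    """Convert an oscillation count into elapsed time: T = N / f."""
    return ticks / frequency_hz

# 1000 ticks of a 1 MHz reference clock correspond to 1 ms.
elapsed_s = measure_time(1000, 1_000_000)
```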

Measurement Error

Sources:

  1. Quantization error (±1 tick, i.e. ±1/f)
  2. Frequency instability (Δf/f, the oscillator's deviation from its nominal frequency)
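A sketch of both error sources, assuming the ±1-tick model for quantization (function names are illustrative):

```python
def quantization_error_bound(frequency_hz: float) -> float:
    """Worst-case quantization error of a counting clock: one tick, 1/f."""
    return 1.0 / frequency_hz

def frequency_instability_error(ticks: int, nominal_hz: float, actual_hz: float) -> float:
    """Error from reading a tick count against the nominal frequency
    when the oscillator actually ran at actual_hz."""
    return ticks / nominal_hz - ticks / actual_hz
```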

2. Clock Synchronization

Clock Drift

Key Points:

  • Measures clock speed vs reference: drift = dC(t)/dt
  • C(t) is the local clock time
  • t is the reference time

    Example: drift = 1.001 means the clock runs 0.1% fast
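In practice drift is estimated as a ratio of elapsed local time to elapsed reference time over an interval; a minimal sketch:

```python
def drift(local_elapsed: float, reference_elapsed: float) -> float:
    """Ratio of local clock progress to reference clock progress."""
    return local_elapsed / reference_elapsed

# A clock that advances 1.001 s per reference second runs 0.1% fast.
rate = drift(1.001, 1.0)
```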

Drift Rate

Usage:

  • Measures deviation from perfect timing: ρ = dC(t)/dt − 1
  • Positive ρ: clock running fast
  • Negative ρ: clock running slow

Clock Offset

Purpose:

  • Measures the time difference between clocks: δ(t) = C(t) − t
  • Used as the correction term in synchronization algorithms
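A minimal offset helper (illustrative):

```python
def clock_offset(local_time: float, reference_time: float) -> float:
    """Signed offset delta = C(t) - t; positive means the local clock is ahead."""
    return local_time - reference_time
```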

Precision

Application:

  • Precision π is the maximum offset between any two clocks: |C_i(t) − C_j(t)| ≤ π for all i, j
  • Key metric for system-wide synchronization
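Precision computed as the maximum pairwise offset over simultaneous readings (a sketch):

```python
from itertools import combinations

def precision(readings):
    """Maximum offset between any two clocks read at the same instant."""
    return max(abs(a - b) for a, b in combinations(readings, 2))

# Three clocks read simultaneously; the worst pair is 3 units apart.
worst = precision([100, 102, 99])
```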

3. Synchronization Algorithms

Cristian’s Algorithm

Process:

  1. Client records T_0 and requests the time
  2. Server responds with its time T_server, arriving at client time T_1
  3. I is the server's processing delay
  4. Correction: client sets its clock to T_server + (T_1 − T_0 − I)/2
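The client-side computation can be sketched as follows, using T_0/T_1 for the request/reply timestamps and I for the server's processing delay, and assuming the remaining network delay splits evenly each way:

```python
def cristian_estimate(t0: float, t1: float, server_time: float,
                      processing_delay: float = 0.0) -> float:
    """Estimate the true time at reply arrival: the network part of the
    round trip (T_1 - T_0 - I) is assumed symmetric, so half of it has
    elapsed since the server read its clock."""
    return server_time + (t1 - t0 - processing_delay) / 2
```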

Master-Slave

Implementation:

  • t_f: forward (master-to-slave) delay
  • t_r: return (slave-to-master) delay
  • Assumes symmetric delays (t_f = t_r); asymmetry biases the offset estimate
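A sketch of the symmetric-delay offset estimate in NTP-style four-timestamp notation (the notes' t_f and t_r are the forward and return delays; the estimate is exact only when t_f = t_r):

```python
def estimate_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Offset of the slave clock from the master, given:
    t1 = request sent (slave), t2 = request received (master),
    t3 = reply sent (master),  t4 = reply received (slave)."""
    return ((t2 - t1) + (t3 - t4)) / 2

# With true offset 2 and symmetric one-way delay 3:
theta = estimate_offset(0.0, 5.0, 6.0, 7.0)
```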

4. Error Handling

Interval Consistency

Purpose:

  • Ensures time intervals from different clocks overlap; the true time lies in their intersection
  • L_i: lower bound of clock i's interval
  • U_i: upper bound of clock i's interval
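Interval intersection as a consistency check (a sketch; an empty overlap flags inconsistent clocks):

```python
def intersect_intervals(intervals):
    """Intersect [(L_i, U_i), ...]; consistent clocks must overlap."""
    lo = max(l for l, _ in intervals)
    hi = min(u for _, u in intervals)
    if lo > hi:
        raise ValueError("inconsistent intervals: no common overlap")
    return lo, hi
```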

Byzantine Fault Tolerance

Process:

  1. Discard the k highest and k lowest values
  2. Average the remaining values
  3. Tolerates up to k faulty clocks
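The trimming step above, sketched directly:

```python
def fault_tolerant_average(readings, k: int) -> float:
    """Discard the k highest and k lowest readings, average the rest."""
    if len(readings) <= 2 * k:
        raise ValueError("need more than 2k readings")
    trimmed = sorted(readings)[k:len(readings) - k]
    return sum(trimmed) / len(trimmed)

# One wildly faulty clock (100) is discarded along with the lowest reading.
avg = fault_tolerant_average([1, 100, 10, 11, 12], k=1)
```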

5. Physical Limitations

Clock Granularity

Impact:

  • Granularity g is the minimum time unit a clock can resolve (one tick)
  • Affects achievable synchronization accuracy

Synchronization Error Bounds

Application:

  • A delay measured as n ticks bounds the actual delay d: (n − 1)·g < d < (n + 1)·g
  • Accounts for the ±1-tick granularity error at each timestamp

Jitter Analysis

Significance:

  • Measures timing variation
  • Affects system precision
  • n is the number of nodes
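Jitter is often quantified as the standard deviation of observed delays; a minimal sketch using that convention (the notes do not fix a particular definition):

```python
from statistics import pstdev

def jitter(delays) -> float:
    """Timing variation as the population standard deviation of delays."""
    return pstdev(delays)
```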