Performance Impact

Every performance-monitoring and observability tool inevitably introduces some overhead—after all, it must collect metrics in real time. Laritor’s client package has been engineered with care to make that overhead virtually unnoticeable in your apps.

Below is an overview of the measures we’ve taken to keep Laritor as lightweight and efficient as possible.

  • Minimal file count: Fewer than 50 files in the entire package.
  • Small download size: The ZIP archive is just ~1 MB.
  • Post-response dispatch: All events are sent after your app returns a response to the browser.
  • Non-blocking: Your users never wait for Laritor’s HTTP calls; the PHP process sends data in the background.

Note: In extremely high-traffic scenarios, holding onto PHP worker processes for outbound HTTP calls can become a bottleneck. See “Edge Location Ingestion” below for our mitigation.
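
The deferred, non-blocking dispatch described above can be sketched as follows. This is a Python illustration of the pattern only: Laritor's actual client is a PHP package, and the `EventBuffer` class and the injectable `transport` callable below are hypothetical names, not its API.

```python
import json

class EventBuffer:
    """Sketch of post-response dispatch: events are buffered in memory
    during the request and flushed only after the response has been
    sent, so telemetry never delays the user."""

    def __init__(self, transport):
        # `transport` stands in for an HTTP POST to the ingest endpoint.
        self.transport = transport
        self.events = []

    def record(self, name, payload):
        # Recording during the request is a cheap in-memory append:
        # no network I/O on the hot path.
        self.events.append({"event": name, "payload": payload})

    def flush(self):
        # Called only after the response is on the wire.
        # Returns the number of events transmitted.
        if not self.events:
            return 0
        count = len(self.events)
        self.transport(json.dumps(self.events))
        self.events = []
        return count
```

The key property is that `record()` does no I/O; all transmission cost is paid once, after the response, in `flush()`.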

Edge Location Ingestion

  • Global Cloudflare Workers: We ingest data at the edge—over 330 Cloudflare locations worldwide—so payloads travel over the shortest possible network path.
  • Reduced latency: Your server pushes events to the nearest edge, not to a centralized data center.

Streamlined Code Paths

  • Avoiding heavy abstractions: We bypass Laravel collections and other higher-level helpers in our core ingest path.
  • Raw PHP loops & streams: This trades a bit of readability for maximum throughput and lowest CPU usage.

Two-Step Ingestion at the Edge

  • Immediate ACK: Edge Workers acknowledge receipt of each payload instantly—before queueing.
  • Asynchronous queue writes: After sending the HTTP 200 back, the Worker persists data into our ingest queues in the background.
  • Consistent performance: This two-step approach yields single-digit millisecond responses at the edge, regardless of your application’s traffic volume.

Additional Safeguards

  • Adaptive sampling: Adjust what percentage of traces or metrics are sent, keeping event volume under control.
  • Octane compatibility: Automatically flush any remaining event batches when the worker process shuts down or reloads.
  • Per-request state reset: Ensure instrumentation data doesn’t carry over between requests in persistent-process environments.
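
The two-step acknowledge-then-persist pattern described above can be sketched in Python. The real ingestion runs in Cloudflare Workers (the background step is roughly analogous to a Worker's `ctx.waitUntil()`); the queue, thread, and names below are stand-ins for illustration, not Laritor's internals.

```python
import queue
import threading

class EdgeIngest:
    """Sketch of immediate-ACK ingestion: acknowledge each payload
    at once, then persist it to a queue off the request path."""

    def __init__(self, persist):
        # `persist` stands in for the durable write into ingest queues.
        self._pending = queue.Queue()
        self._persist = persist
        threading.Thread(target=self._drain, daemon=True).start()

    def handle(self, payload):
        # Step 1: enqueue is O(1), so the "HTTP 200" goes back
        # immediately, regardless of traffic volume.
        self._pending.put(payload)
        return 200

    def _drain(self):
        # Step 2: durable writes happen in the background.
        while True:
            self._persist(self._pending.get())
            self._pending.task_done()
```

Because `handle()` only enqueues, response time at the edge stays flat even when the durable store is slower than the request rate.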
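
Two of the safeguards above, adaptive sampling and per-request state reset, can be illustrated together. This is a hypothetical Python sketch of the pattern; the `RequestTelemetry` class, its `sample_rate` parameter, and the `rng` hook are invented for the example and are not Laritor's API.

```python
import random

class RequestTelemetry:
    """Sketch of per-request instrumentation state with adaptive
    sampling, for persistent-process runtimes such as Octane."""

    def __init__(self, sample_rate=10, rng=random.random):
        self.sample_rate = sample_rate  # percent of events kept
        self.rng = rng                  # injectable for testing
        self.events = []

    def should_sample(self):
        # Keep only the configured percentage of events.
        return self.rng() * 100 < self.sample_rate

    def record(self, event):
        if self.should_sample():
            self.events.append(event)

    def reset(self):
        # Called at request boundaries so instrumentation data never
        # leaks between requests served by the same worker process.
        self.events = []
```

Calling `reset()` on every request boundary is what keeps long-lived workers from accumulating or cross-contaminating state.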

By combining a lean client library, deferred transmissions, global edge ingestion, and streamlined code paths, Laritor delivers the observability you need—without the performance penalty you don’t. Feel free to reach out if you have any questions or run into unusual traffic patterns.