In a project that I am working on, I provide an interface that runs a process that takes a few moments (about 15 seconds) to complete. The end user wants to see its progress as it happens. This isn’t an upload, so there isn’t a native way to get the progress via JavaScript, which means I need to implement the progress indicator manually.
WebSockets would be overkill for this particular feature. If I already had WebSockets set up, I might use them, but since I don’t, setting them up just for this is unnecessary. There is a simpler way to inform the end user of the progress in real time, and that is streaming.
Streaming is a technology that most people associate with things like watching videos online, listening to podcasts, or downloading large files. But you can also use streaming to send real-time notifications from the server to the browser, or vice versa.
When streaming data from the server to the browser, you can use a technology called “Server-Sent Events,” or “SSE” for short. You can take advantage of this via the EventSource API (EventSource - Web APIs | MDN).
In the simple example below, when we initialize an EventSource object, the URL is immediately called. Then, whenever an update is sent, it logs to the console.
const eventSource = new EventSource('/api-endpoint');
eventSource.onmessage = (event) => {
    console.log(`Data: ${event.data}`);
};
Server-Sent Events must follow a specific structure. Each event is separated by two newlines. Each event consists of one or more key-value pairs, where each pair sits on its own line with the key and value separated by a colon and an optional space (more details can be found here). The four valid keys, or fields, are data, id, event, and retry, with data being the only required field.
data: Here is some data being sent from the server
data: Here is some additional data from the server
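When an event contains multiple data lines like this, the browser joins them with a single newline before dispatching the event, so the onmessage handler from earlier receives both lines in one event.data string. A quick sketch:
eventSource.onmessage = (event) => {
    // For the event above, event.data is:
    // "Here is some data being sent from the server\nHere is some additional data from the server"
    console.log(event.data.split('\n'));
};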
The values of the fields are processed as strings, but you can send JSON and parse it in the browser:
data: {"name": "Patrick Stephan", "role": "Full-stack Developer"}
eventSource.onmessage = (event) => {
    const data = JSON.parse(event.data);
    console.log('Data:', data);
};
The event field allows you to specify the type of event that is being sent. This enables you to set up individual event listeners for each type, if needed:
event: start
data: importing data
event: progress
data: 0.1277263
event: progress
data: 0.1746639
eventSource.addEventListener("start", (e) => {
console.log(e.data);
setProgressBarPercentage(0);
});// This fires once, sending "importing data" to the console
eventSource.addEventListener("progress", (e) => {
setProgressBarPercentage(Number(e.data) * 100);
});// This fires twice
eventSource.addEventListener("message", (e) => {
console.log('Data:', event.data);
});// This does not fire. `message` will fire for unnamed events or events of type `message`
eventSource.onmessage = (event) => {
console.log('Data:', event.data);
};// This does not fire. `message` will fire for unnamed events or events of type `message`
message is a special event type that applies to events without a specified type. The other two fields, id and retry, are outside the scope of this post. But for a quick description: id lets the browser tell the server which event it last received when reconnecting after a lost connection, so the server can pick up where it left off instead of resending everything, and retry tells the browser to attempt to reconnect after the given number of milliseconds when the connection is broken.
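For a rough idea of what those two fields look like on the wire (the values here are made up, reusing this article’s payload shape), a server might send events like the one below; when the browser reconnects, it passes the last id it saw back to the server in a Last-Event-ID request header:
retry: 5000
id: 42
data: {"completed": 42, "pageCount": 120}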
So, as I mentioned at the beginning of this article, I need to inform my end-user of the progress of a task as it completes.
On my front end, I have a Vue component with a button. This button sends an API request, triggering the task. As events are sent, I parse them as JSON, get the status, and display it to my user:
<script setup>
import { Head } from '@inertiajs/vue3';
import { ref } from 'vue';

const stream = ref({
    pageCount: 0,
    completed: 0,
    current: ''
});

const processTasks = async () => {
    // Reset the progress state before starting a new run.
    stream.value = {
        pageCount: 0,
        completed: 0,
        current: ''
    };

    const eventSource = new EventSource('/api/pull-collections');

    // Each event carries the full progress payload as JSON.
    eventSource.onmessage = (event) => {
        stream.value = JSON.parse(event.data);
    };

    // When the server closes the stream, an error event fires;
    // closing our end stops the browser from reconnecting.
    eventSource.onerror = () => {
        eventSource.close();
    };
};
</script>

<template>
    <Head title="Dashboard" />

    <div class="box">
        <button @click="processTasks" class="btn-primary">Process Tasks</button>

        <div class="mt-4" v-if="stream.pageCount">
            <p>Pulling: {{ stream.current }}</p>

            <div class="progress-bar">
                <div
                    class="progress-indicator"
                    :style="{ width: (stream.completed / stream.pageCount * 100) + '%' }"
                ></div>
            </div>
        </div>
    </div>
</template>
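One note on the onerror handler above: when the server finishes and closes the connection, EventSource fires an error event and would otherwise keep reconnecting and re-running the whole task, which is why I close the stream there. If you would rather signal completion explicitly, you could have the server send a final named event and close the connection from that listener instead; the complete event name below is hypothetical, not something the controller below sends:
// Hypothetical final event, e.g. "event: complete" sent as the server's last message.
eventSource.addEventListener('complete', () => {
    eventSource.close();
});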
On the server, when the request is received, I return a StreamedResponse with a closure defining the logic for sending the events. In PHP, we must flush the output buffer for each event so that they are each sent in real time and not held in memory until the response is complete:
namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Symfony\Component\HttpFoundation\StreamedResponse;

class PullCollections extends Controller
{
    private int $pageCount = 0;
    private int $completed = 0;

    private function sendEvent(string $label = ''): void
    {
        echo "data: " . json_encode([
            'pageCount' => $this->pageCount,
            'completed' => $this->completed,
            'current' => $label,
        ]) . "\n\n";

        // Flush the output buffer so the event is sent immediately
        // rather than held until the response completes.
        ob_flush();
        flush();
    }

    public function __invoke(Request $request): StreamedResponse
    {
        ini_set('max_execution_time', 300);

        $response = new StreamedResponse(function () {
            $tasks = getTasks();
            $this->pageCount = $tasks->count();

            foreach ($tasks as $task) {
                $task->run();

                // Report the task that just ran, then an updated completed count.
                $this->sendEvent($task->label);
                $this->completed++;
                $this->sendEvent();
            }
        });

        $response->headers->set('Content-Type', 'text/event-stream');
        $response->headers->set('Cache-Control', 'no-cache');
        $response->headers->set('Connection', 'keep-alive');
        $response->headers->set('X-Accel-Buffering', 'no');

        return $response;
    }
}
The headers here are important. Here is a quick breakdown of their purposes:
Content-Type: text/event-stream
- It is important to tell the browser that we are sending Server-Sent Events, otherwise it won’t handle the response correctly.
Cache-Control: no-cache
- We don’t want the browser or the network to cache the response. If this is a GET request, this is especially important.
Connection: keep-alive
- We are informing the browser to keep the connection open as long as events are being sent. Otherwise, it might shut the connection down prematurely.
X-Accel-Buffering: no
- This header tells Nginx not to buffer the response; we want Nginx to pass each event to the browser immediately.
There are a few things to be mindful of:
- You must use HTTP/2, otherwise you are severely limited in the number of EventSource connections a browser can have open at one time. If you are using Laravel Forge and have SSL set up, then HTTP/2 will be set up by default.
- You want to be mindful of how long or how many EventSource connections will be open on your server at one time. Each EventSource request consumes a connection from your resource pool until it is ended, so if you have a lot of users making long-running EventSource requests, your server may get overloaded.
- If the user navigates away from the page before the response completes, the request will be interrupted. Depending on your use case, this may end your process early (see the cleanup sketch after this list).
- Standard CORS considerations must be made if the API is on another origin.
- You may need to disable output buffering at the PHP level with ini_set('output_buffering', 'off'). You can also disable buffering in Apache (SetEnv no-gzip 1) and Nginx (proxy_buffering off; or the X-Accel-Buffering: no header).
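As mentioned in the list above, a user leaving the page simply drops the connection. If you want to clean up deliberately, for example when the component is removed while the page stays open, you can keep a reference to the EventSource and close it yourself. A minimal sketch using Vue’s onUnmounted hook, assuming the eventSource variable is lifted out of processTasks:
import { onUnmounted } from 'vue';

// Assumes eventSource is declared with `let` in the component scope
// and assigned when the stream starts.
let eventSource = null;

onUnmounted(() => {
    if (eventSource) {
        // Stops the browser from listening or reconnecting.
        eventSource.close();
    }
});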
And there you have it! A clean, modern way of keeping the browser up to date with real-time events without using WebSockets or polling.