Live Video SDK for Web
Introduction
The Eagle Eye Networks Live Video Web SDK allows you to integrate live video streaming into your project using the native HTML video element. The SDK supports playback of both video and audio and provides a number of customization options to suit your needs.
How it Works
Installation
To install the latest version, run the following command in your project's root directory:
npm i @een/live-video-web-sdk
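If your project uses Yarn instead of npm, the equivalent command should be:
yarn add @een/live-video-web-sdk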
More information is available on the npm package page: https://www.npmjs.com/package/@een/live-video-web-sdk
Usage
The Live Video Web SDK is designed to be easy to implement and to work with most modern web technologies. It is compatible with all major browsers.
To get started, create a video element in the DOM and an instance of the LivePlayer class using JavaScript. Then call the live player's start() method with the required configuration. The configuration object should include the following properties:
videoElement: The video element in the DOM where the video stream will be rendered.
cameraId: The ESN or camera ID of the camera you want to stream.
baseUrl: The base URL of the Eagle Eye Networks API, including the protocol (https).
jwt: The JSON Web Token (JWT) or access token for authentication.
onFrame: An optional callback function that is called for each frame of the video stream.
Code Examples
Vanilla JavaScript Example
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Document</title>
  </head>
  <body>
    <div class="container">
      <video autoplay muted id="videoElement"></video>
    </div>
    <script type="module">
      import LivePlayer from '@een/live-video-web-sdk';

      const player = new LivePlayer();
      const config = {
        videoElement: document.getElementById("videoElement"),
        cameraId: "100a850f",
        baseUrl: "https://api.c000.eagleeyenetworks.com",
        jwt: "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhZG1pbiIsImlhdCI6MTUxNjIzOTAyMn0.1Q2...",
        onFrame: (time) => console.log(time),
      };
      player.start(config);
    </script>
  </body>
</html>
React Example
import LivePlayer from '@een/live-video-web-sdk';
import React, { useEffect, useRef } from "react";

function App() {
  const videoElement = useRef(null);

  useEffect(() => {
    const config = {
      videoElement: videoElement.current,
      cameraId: "1003dc62",
      baseUrl: "https://api.c000.eagleeyenetworks.com",
      jwt: "eyJraWQiOiI2O...",
    };
    const player = new LivePlayer();
    player.start(config);
    // Stop playback when the component unmounts.
    return () => player.stop();
  }, []); // empty dependency array: start the stream once on mount, not on every render

  return (
    <div className="App">
      <div className="container">
        <video autoPlay muted ref={videoElement}></video>
      </div>
    </div>
  );
}

export default App;
Vue Example
<template>
  <div>
    <video id="videoElement" autoplay muted />
  </div>
</template>

<script setup>
import LivePlayer from '@een/live-video-web-sdk'
import { onMounted } from 'vue'

onMounted(() => {
  const config = {
    videoElement: document.getElementById("videoElement"),
    cameraId: "100544a0",
    baseUrl: "https://api.c000.eagleeyenetworks.com",
    jwt: "eyJraWQiOiI2O..."
  }
  const player = new LivePlayer();
  player.start(config);
})
</script>
Angular Example
import { AfterViewInit, Component, ElementRef, ViewChild } from '@angular/core';
import LivePlayer from '@een/live-video-web-sdk';

@Component({
  selector: 'app-video-player',
  standalone: true,
  template: `
    <video #videoElement id="video" autoplay muted></video>
  `
})
export class ViewerComponent implements AfterViewInit {
  @ViewChild('videoElement', { static: false }) videoElement!: ElementRef<HTMLVideoElement>;

  // A ViewChild queried with { static: false } is only resolved once the view
  // has been initialized, so start the player in ngAfterViewInit, not ngOnInit.
  ngAfterViewInit() {
    const video: HTMLVideoElement = this.videoElement.nativeElement;
    const config = {
      videoElement: video,
      cameraId: "100544a0",
      baseUrl: "https://api.c000.eagleeyenetworks.com",
      jwt: "eyJraWQiOiI2O..."
    };
    const player = new LivePlayer();
    player.start(config);
  }
}
Additional Documentation
Configuration Object
It’s recommended to pass the configuration when calling start() to make sure the configuration is still valid. A configuration object can also be provided on initialization.
dictionary livePlayerConfig {
  videoElement: HTMLVideoElement (required) // render frames into a video element using the MediaStreamTrack API
  videoTech: "WebCodecs" (default) | "FLV"
  cameraId: string // camera ESN, for example "100a850f"
  baseUrl: string // example: "https://api.c000.eagleeyenetworks.com"
  feedUrl: string // as retrieved via the feeds endpoint
  jwt: string (required) // EEN JWT or access_token
  onFrame: callback(EEN timestamp <number>) // milliseconds since the epoch (midnight, January 1, 1970, UTC)
  onStop: callback() // called when the video stream has been stopped
  onAudio: callback() // called when audio is detected
  maxBuffer: unsigned short // milliseconds, defaults to 3000
  minBuffer: unsigned short // milliseconds, defaults to 1000
};
Please note that WebCodecs only works in Chromium-based browsers such as Google Chrome and Microsoft Edge. To use this SDK in other browsers, set videoTech to FLV.
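One way to handle this is to feature-detect WebCodecs support at runtime and fall back to FLV. The sketch below is illustrative, not part of the SDK: it assumes a video element with id "videoElement" and placeholder credentials, and uses only the livePlayerConfig fields documented above. Checking for window.VideoDecoder is a standard way to test for the WebCodecs API.
import LivePlayer from '@een/live-video-web-sdk';

// Chromium-based browsers expose the WebCodecs API; others do not.
const supportsWebCodecs = typeof window !== "undefined" && "VideoDecoder" in window;

const player = new LivePlayer();
player.start({
  videoElement: document.getElementById("videoElement"),
  videoTech: supportsWebCodecs ? "WebCodecs" : "FLV", // fall back to FLV on non-Chromium browsers
  cameraId: "100a850f",
  baseUrl: "https://api.c000.eagleeyenetworks.com",
  jwt: "eyJraWQiOiI2O...",
  onFrame: (timestamp) => console.log(new Date(timestamp)), // EEN timestamp in milliseconds since the epoch
  onStop: () => console.log("stream stopped"),
});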
Live Player Interface
interface VideoPlayer {
  start(livePlayerConfig) // starts playback
  stop() // stops playback
  maxBuffer: unsigned short // can be adjusted during playback
  minBuffer: unsigned short // can be adjusted during playback
  // audio is supported on multipart only
  getAudioMuted(): boolean // defaults to true
  setAudioMuted(boolean)
  setMaxBuffer(unsigned short milliseconds)
  setMinBuffer(unsigned short milliseconds)
  getBufferLength(): unsigned short // returns the buffer length in milliseconds
  getAudioVolume(): octet // range between 0 and 1
  setAudioVolume(octet) // range between 0 and 1
  getAudioPlaybackError(): boolean // defaults to false; true when audio playback encountered an error
  isPlaying(): boolean // indicates whether the video is still playing
}
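As a short illustration of the interface above, the runtime controls can be adjusted while the stream is playing. The method names come from the listing; everything else (the player variable, the chosen values, the unload handler) is an assumption for the sake of the example.
// Sketch: player is assumed to be a LivePlayer that has already been started,
// as in the examples above.
player.setAudioMuted(false); // unmute (audio is supported on multipart only)
player.setAudioVolume(0.5);  // volume between 0 and 1
player.setMaxBuffer(2000);   // milliseconds
player.setMinBuffer(500);    // milliseconds

console.log(player.getBufferLength()); // current buffer length in milliseconds
console.log(player.isPlaying());       // true while the stream is playing

// Stop playback when leaving the page.
window.addEventListener("beforeunload", () => player.stop());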