
A Detailed Guide to Flutter-WebRTC

You've probably heard of WebRTC - it's a powerful tool that enables real-time communication between web browsers and is perfect for building things like video chat applications. But you might not know that WebRTC can also be used with the Flutter framework to build native iOS and Android apps.

This post will give you a comprehensive guide to using Flutter-WebRTC. We'll cover everything from setting up your development environment to creating a simple video chat app. So if you're ready to get started with Flutter-WebRTC, read on!

What is Flutter?

Flutter is a mobile app SDK for building high-performance, high-fidelity apps for iOS and Android. The Flutter framework makes it easy for you to build user interfaces that react smoothly in response to touches, gestures, and other input events.

Additionally, the Flutter framework provides a rich set of Material Design widgets and capabilities to support animation and graphics.

What is WebRTC?

WebRTC (Web Real-Time Communication) is a free, open-source project that provides web browsers and mobile applications with real-time communication (RTC) capabilities via simple APIs. The WebRTC components have been optimized to serve this purpose best.

What is Flutter-WebRTC?

Flutter-WebRTC is a plugin for the Flutter framework that lets you build cross-platform apps with real-time video and audio calling capabilities.

With Flutter-WebRTC, you can easily build video call applications without dealing with the underlying technologies' complexities.

How Does Flutter-WebRTC Work?

Flutter-WebRTC uses the W3C WebRTC API and components to allow developers to build cross-platform video call applications. The plugin handles all the heavy lifting, so you can focus on building your app.
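
Concretely, that means the standard offer/answer flow: capture local media, attach it to a peer connection, create an offer, and exchange session descriptions through a signaling channel. Here is a rough sketch of that flow; the names follow the widely used flutter_webrtc package, which exposes the W3C-style API directly in Dart:

import 'package:flutter_webrtc/flutter_webrtc.dart';

// A minimal sketch of the W3C offer/answer setup, assuming the
// flutter_webrtc package's Dart API.
Future<RTCPeerConnection> startOffer() async {
  // A STUN server lets each peer discover its publicly reachable address.
  final pc = await createPeerConnection({
    'iceServers': [
      {'urls': 'stun:stun.l.google.com:19302'},
    ],
  });

  // Capture local audio and video and attach every track to the connection.
  final localStream = await navigator.mediaDevices
      .getUserMedia({'audio': true, 'video': true});
  for (final track in localStream.getTracks()) {
    await pc.addTrack(track, localStream);
  }

  // Create an offer and apply it locally; it still has to reach the remote
  // peer through a signaling channel (covered later in this post).
  final offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return pc;
}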

Handling Errors And Debugging Your Application

If you run into any errors while using the Flutter-WebRTC plugin, you can check the plugin's GitHub page for known issues and workaround solutions.

You can also enable debug logging to help with troubleshooting your application. To enable debug logging, you need to create a new instance of the WebrtcLog class:

WebrtcLog _webrtcLog;

You can then use the WebrtcLog methods to start and stop logging:

_webrtcLog = WebrtcLog(onMessage: (String message) {});
await _webrtcLog.start();
// To stop logging:
await _webrtcLog.stop();

The WebrtcLog class provides a callback method for when a log message is generated. You can use this callback to display the log message in your app's UI.

Using a Signaling Server

To establish a WebRTC connection, peers need to exchange signaling data. This data typically includes information such as the session description and ICE candidates.

The Flutter-WebRTC plugin does not provide a signaling server, so you will need one to use the plugin. There are several open-source signaling servers you can use, or you can implement your own in any programming language you like.

Once you have implemented a signaling server, you can use the WebrtcPeerConnection methods to connect and disconnect from it:

_webrtcPeerConnection = WebrtcPeerConnection(
    onSignalingStateChange: (SignalingState state) {},
    onIceGatheringStateChange: (IceGatheringState state) {},
    onIceConnectionStateChange: (IceConnectionState state) {},
);
await _webrtcPeerConnection.connect("ws://localhost:8080");
// To disconnect from the signaling server:
await _webrtcPeerConnection.disconnect();

The WebrtcPeerConnection class provides callback methods for when the signaling state, ICE gathering state, and ICE connection state change. You can use these callbacks to update the UI of your app accordingly.
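
The exact wire format of these signaling messages is up to you. As a rough illustration (the JSON shape below is hypothetical, not something the plugin prescribes), an offer and an ICE candidate relayed through a WebSocket-based server could be serialized like this:

import 'dart:convert';

// Hypothetical message shapes for a custom WebSocket signaling server.
final offerMessage = jsonEncode({
  'type': 'offer',
  'to': 'remote_peer_id',
  'sdp': '<session description created by the local peer>',
});

final candidateMessage = jsonEncode({
  'type': 'candidate',
  'to': 'remote_peer_id',
  'candidate': {
    'candidate': '<ICE candidate string>',
    'sdpMid': '0',
    'sdpMLineIndex': 0,
  },
});

The remote peer replies with a matching 'answer' message, and both sides keep exchanging candidates until the connection is established.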

Advanced Features Of Flutter-WebRTC

The Flutter-WebRTC plugin provides many advanced features that you can use to create more sophisticated applications.

Data Channels

In addition to audio and video data, you can send arbitrary data between peers using data channels. Data channels are useful for file sharing, gaming, and real-time data exchange applications.

To create a data channel, you need to create a new instance of the WebrtcDataChannel class:

WebrtcDataChannel _dataChannel;

You can then use the WebrtcDataChannel methods to connect and disconnect from the data channel:

_dataChannel = WebrtcDataChannel(
    id: "channel_id",
    label: "channel_label",
    onStateChange: (DataChannelState state) {},
);
await _dataChannel.connect();
// To disconnect from the data channel:
await _dataChannel.disconnect();

The WebrtcDataChannel class provides a callback method for when the data channel state changes. You can use this callback to update the UI of your app accordingly.
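
For comparison, here is what a simple text channel looks like with the W3C-style API (a sketch assuming the flutter_webrtc package and an already-created RTCPeerConnection):

import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<void> openChatChannel(RTCPeerConnection pc) async {
  // Create an ordered, reliable channel on an existing peer connection.
  final channel = await pc.createDataChannel(
    'chat',
    RTCDataChannelInit()..ordered = true,
  );

  // Print whatever text the remote peer sends.
  channel.onMessage = (RTCDataChannelMessage message) {
    print('Received: ${message.text}');
  };

  // Send a greeting once the channel has actually opened.
  channel.onDataChannelState = (RTCDataChannelState state) {
    if (state == RTCDataChannelState.RTCDataChannelOpen) {
      channel.send(RTCDataChannelMessage('hello from Flutter'));
    }
  };
}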

Screen Capturing

The Flutter-WebRTC plugin supports screen capturing on Android and iOS. Screen capturing allows you to share your entire screen or a specific window with another peer.

To start screen capturing, you need to create a new instance of the WebrtcScreenCapturer class:

WebrtcScreenCapturer _screenCapturer;

You can then use the WebrtcScreenCapturer methods to start and stop capturing your screen:

_screenCapturer = WebrtcScreenCapturer(onFrame: (Bitmap bitmap) {});
await _screenCapturer.initialize();
await _screenCapturer.start();
// To stop capturing your screen:
await _screenCapturer.stop();

The WebrtcScreenCapturer class provides a callback method for when a frame is captured. You can use this callback to update the UI of your app accordingly.
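
In W3C terms, screen sharing is a getDisplayMedia call whose tracks you add to the peer connection. A minimal sketch, again assuming the flutter_webrtc package (both platforms may also need extra OS-level setup, such as a broadcast extension on iOS):

import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<MediaStream> startScreenShare(RTCPeerConnection pc) async {
  // Ask the OS for a screen-capture stream instead of the camera.
  final screenStream =
      await navigator.mediaDevices.getDisplayMedia({'video': true});

  // Forward the captured screen to the remote peer.
  for (final track in screenStream.getVideoTracks()) {
    await pc.addTrack(track, screenStream);
  }
  return screenStream;
}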

Setting Up WebRTC in Flutter Without Dyte

To get WebRTC working in your Flutter app without Dyte, you'll need to follow a fairly involved process.

First, create a new Flutter project in your IDE or text editor. For this example, we'll call our project "webrtc_example".

Next, open the pubspec.yaml file in your project and add the following dependencies:

dependencies:
  flutter:
    sdk: flutter
  webrtc_plugin: ^0.2.1

The webrtc_plugin package will give us access to the WebRTC API in Flutter. With that installed, we can now start writing some code.

In the main.dart file, import the webrtc_plugin package and create a new class called _WebRTCExampleState that extends StatelessWidget:

import 'package:flutter/material.dart';
import 'package:webrtc_plugin/webrtc_plugin.dart';

class _WebRTCExampleState extends StatelessWidget {
    @override
    Widget build(BuildContext context) {
        return Scaffold(
            appBar: AppBar(
                title: Text('WebRTC Example'),
            ),
            body: Center(
                child: Text('Hello World'),
            ),
        );
    }
}

In the _WebRTCExampleState class, we'll first need to create a Widget that will render our video feed. For this example, we'll use a simple Container widget. Add the following code to the _WebRTCExampleState class:

Widget _videoView() {
    return Container();
}

Now that we have a place to render our video feed, let's set up a way to capture it. Add the following code to the _WebRTCExampleState class:


Future<bool> _initVideoCapturer() async {
    final Map<String, dynamic> response = await WebRTC.initialize(options: {
        "audio": true,
        "video": true
    });
    if (response["success"]) {
        final List<dynamic> devices = await WebRTC.getMediaDevices();
        for (var device in devices) {
            if (device["kind"] == "videoInput") {
                final Map<String, dynamic> constraints = {
                    "facingMode": "user",
                    "mandatory": {
                        "minWidth": '640',
                        "minHeight": '480'
                    }
                };
                await WebRTC.selectDevice(deviceId: device["deviceId"], constraints: constraints);
            }
        }
        return true;
    } else {
        return false;
    }
}

This code initializes the WebRTC plugin and then enumerates the available media devices. It then sets up a video capturer with the constraints we specified. These constraints ensure that the video feed is of a high enough quality for our purposes.
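
To actually paint the captured feed inside _videoView, you need a texture-backed renderer rather than an empty Container. A sketch of what that could look like, assuming the flutter_webrtc package's RTCVideoRenderer and RTCVideoView:

import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

final RTCVideoRenderer _localRenderer = RTCVideoRenderer();

// Bind a captured MediaStream to the renderer once it is available.
Future<void> _attachLocalStream(MediaStream stream) async {
  await _localRenderer.initialize();
  _localRenderer.srcObject = stream;
}

Widget _videoView() {
  // Mirror the image, as users usually expect for a front-camera self view.
  return RTCVideoView(_localRenderer, mirror: true);
}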

Setting Up A Flutter-WebRTC Project

Here is how you can set up a Flutter-WebRTC project:

Installing Flutter And Creating A New Project

The first thing you need to do is install Flutter. You can do this by following the instructions on the official Flutter website.

Once Flutter is installed, you can create a new project by running the following command:

flutter create <project_name>

This will create a new Flutter project with the specified name.
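
For example, to create the project used in this guide and launch it on a connected device or emulator:

flutter create webrtc_example
cd webrtc_example
flutter run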

Adding The Necessary WebRTC Dependencies

Next, you need to add the necessary WebRTC dependencies to your pubspec.yaml file:

dependencies:
  flutter:
    sdk: flutter
  webrtc_plugin: ^0.2.0+1

This ensures that your project has the required WebRTC dependencies.

Making A Basic Call With Flutter-WebRTC

Now that you have a Flutter-WebRTC project, you can start building your app.

To make a basic call, you must create a new instance of the WebrtcCall class:

WebrtcCall _call;

You can then use the WebrtcCall methods to establish a connection with a remote peer and send and receive video and audio data:

_call = WebrtcCall(
    onSignalingStateChange: (SignalingState state) {},
    onPeersUpdate: (List<dynamic> peers) {},
    onLocalStream: (MediaStream stream) {},
    onAddRemoteStream: (MediaStream stream) {},
    onRemoveRemoteStream: (MediaStream stream) {},
);
await _call.makeCall('remote_peer_id');

The WebrtcCall class provides callbacks for when the signaling state changes, the list of peers updates, and local and remote streams are added or removed. You can use these callbacks to update the UI of your app accordingly.

Establishing A Connection With A Remote Peer

To establish a connection with a remote peer, you need to create a new instance of the WebrtcSignaling class:

WebrtcSignaling _signaling;

You can then use the WebrtcSignaling methods to connect to a signaling server and establish a connection with a remote peer:

_signaling = WebrtcSignaling(
    onStateChange: (SignalingState state) {},
    onPeersUpdate: (List<dynamic> peers) {},
);
await _signaling.connect('ws://localhost:1234/ws');
await _signaling.call('remote_peer_id');

The WebrtcSignaling class provides callback methods for when the signaling state changes and when the list of peers updates. You can use these callbacks to update the UI of your app accordingly.

Sending And Receiving Video And Audio Data

Once you have established a connection with a remote peer, you can start sending and receiving video and audio data.

To send video data, you need to create a new instance of the WebrtcVideoCapturer class:

WebrtcVideoCapturer _videoCapturer;

You can then use the WebrtcVideoCapturer methods to start and stop capturing video:

_videoCapturer = WebrtcVideoCapturer();

await _videoCapturer.initialize();
await _videoCapturer.startCapture(<MediaStreamConstraints>[]);

// To stop capturing video:
await _videoCapturer.stopCapture();

The WebrtcVideoCapturer class provides a callback method for when the video capture state changes. You can use this callback to update the UI of your app accordingly.

To send audio data, you need to create a new instance of the WebrtcAudioCapturer class:

WebrtcAudioCapturer _audioCapturer;

You can then use the WebrtcAudioCapturer methods to start and stop capturing audio:

_audioCapturer = WebrtcAudioCapturer();

await _audioCapturer.initialize();
await _audioCapturer.startCapture(<MediaStreamConstraints>[]);

// To stop capturing audio:
await _audioCapturer.stopCapture();

The WebrtcAudioCapturer class provides a callback method for when the audio capture state changes. You can use this callback to update the UI of your app accordingly.

To receive video and audio data, you need to create a new instance of the WebrtcMediaStream class:

WebrtcMediaStream _mediaStream;

You can then use the WebrtcMediaStream methods to start and stop receiving video and audio data:

_mediaStream = WebrtcMediaStream(onAddRemoteStream: (MediaStream stream) {}, onRemoveRemoteStream: (MediaStream stream) {});

await _mediaStream.initialize();
await _mediaStream.start();

// To stop receiving video and audio data:
await _mediaStream.stop();

The WebrtcMediaStream class provides a callback method for when a remote stream is added or removed. You can use this callback to update the UI of your app accordingly.

Access Camera and Microphone on iOS

To access the camera and microphone on iOS, you need to add the following keys to your Info.plist file:

NSMicrophoneUsageDescription - describe why your app needs access to the microphone

NSCameraUsageDescription - describe why your app needs access to the camera

You also need to add the following key to your Info.plist file:

io.flutter.embedded_views_preview - set this key to true
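
Put together, the new entries in ios/Runner/Info.plist look like this (the usage strings are placeholders you should adapt to your app):

<key>NSCameraUsageDescription</key>
<string>This app needs camera access for video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access for audio calls.</string>
<key>io.flutter.embedded_views_preview</key>
<true/>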

Android Manifest File Changes

To use the Flutter-WebRTC plugin on Android, you need to make the following changes to your AndroidManifest.xml file:

Add the following permissions:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
Add the following uses-feature elements:

<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />   

Flutter-WebRTC Demo Example

The following is a complete example of how to use the Flutter-WebRTC plugin:

import 'package:flutter/material.dart';
import 'package:webrtc_plugin/webrtc_plugin.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
    // This widget is the root of your application.
    @override
    Widget build(BuildContext context) {
        return MaterialApp(
            home: MyHomePage(),
        );
    }
}

class MyHomePage extends StatefulWidget {
    @override
    _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
    WebrtcPlugin _webrtcPlugin = WebrtcPlugin();
    Widget _videoView;
    Widget _switchCameraButton;
    bool _isFrontCamera = true;

    void _onVideoViewCreated(FlutterNativeView view) {
        setState(() {
            this._videoView = view;
        });
    }

    void _initWebrtcPlugin() async {
        await this._webrtcPlugin.initialize();
        await this._webrtcPlugin.enableVideo();
        await this._webrtcPlugin.enableAudio();
    }

    void _startCall() async {
        String id = "call1";
        await this._webrtcPlugin.startCall(id);
    }

    void _endCall() async {
        String id = "call1";
        await this._webrtcPlugin.endCall(id);
    }

    void _switchCamera() async {
        this._isFrontCamera = !this._isFrontCamera;
        await this._webrtcPlugin.switchCamera();
    }

    @override
    Widget build(BuildContext context) {
        this._initWebrtcPlugin();

        if (this._videoView == null) {
            return Scaffold(
                appBar: AppBar(
                    title: const Text('Flutter-WebRTC Demo'),
                ),
                body: Center(
                    child: Column(
                        mainAxisAlignment: MainAxisAlignment.center,
                        children: <Widget>[
                            RaisedButton(
                                onPressed: this._startCall,
                                child: Text("Start Call"),
                            ),
                            SizedBox(height: 10.0),
                            RaisedButton(
                                onPressed: this._endCall,
                                child: Text("End Call"),
                            ),
                        ],
                    ),
                ),
            );
        } else {
            return Scaffold(
                appBar: AppBar(
                    title: const Text('Flutter-WebRTC Demo'),
                ),
                body: Center(
                    child: Stack(
                        children: <Widget>[
                            this._videoView,
                            Positioned(
                                bottom: 0.0,
                                right: 0.0,
                                child: Container(
                                    child: Row(
                                        children: <Widget>[
                                            this._switchCameraButton ?? Container(),
                                        ],
                                    ),
                                ),
                            ),
                        ],
                    ),
                ),
            );
        }
    }
}

Setting Up WebRTC in Flutter With Dyte

Dyte is the most developer-friendly live video & voice SDK platform. Dyte offers a much easier and faster way to integrate live video and voice into your Flutter project. With just a few lines of code, you can have a branded, configurable, and programmable call up and running in no time.

Dyte also offers best-in-class customizations, giving you complete control over layout and permissions. You can enhance the experience with plugins and access end-to-end call logs and quality metrics right in your Dyte dashboard or via REST APIs. This level of detail helps developers troubleshoot any problems during a call and optimize their integrations for the best possible user experience.

Additionally, Dyte is designed to be universally compatible with all devices and platforms. This makes it easy to add live video calls to your app with minimal effort.

Let’s see how you can effortlessly integrate live audio and video into your Flutter project with Dyte.

1. Import the package.
import 'package:dyte_client/dyte.dart';
import 'package:dyte_client/dyteMeeting.dart';


2. Input relevant information into the DyteMeeting widget.
SizedBox(
    width: 300,  // example width; use the size you want
    height: 500, // example height
    child: DyteMeeting(
        roomName: "",
        authToken: "",
        onInit: (DyteMeetingHandler meeting) async {
          // your handler
        },
    )
)
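
For these imports to resolve, the Dyte SDK has to be listed in your pubspec.yaml first. Assuming it is published under the same name the imports use, you can add it with:

flutter pub add dyte_client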

Why is Dyte the Perfect Choice For Your Flutter Project?

Dyte is built with a focus on flexibility and control, letting you configure permissions, layouts, and plugins with just a few lines of code. This makes it simpler and faster to get your app up and running with less hassle.

One of the key advantages of Dyte is that it offers a single config for all platforms - web or native mobile. This means you can enable consistency across devices, making it easier to manage and update your video chat.

Dyte also provides a range of branding options to match your app's look and feel quickly and easily. Plus, many plugins are available to add extra functionality - such as screen sharing or file transfer. With Dyte, you have everything you need to make your video chat perfect for your needs.

Dyte's one-click add-ons allow users to easily collaborate and share control of videos, whiteboards, and browsers in real-time.

Additionally, Dyte offers powerful analytics that provide insights into how calls are being made and how they can be improved. Finally, Dyte's participant timeline logs provide a detailed step-by-step overview of each call, allowing for better analysis and improvement.

So, boost your Flutter project today with Dyte’s amazing live video & audio capabilities!

Demo for Full-Screen Meeting Using Dyte

SizedBox is not the only constraining widget you can use; any constraining widget with a maxHeight and maxWidth will work.

For example, the following code can be used for a full-screen meeting.

import 'package:flutter/material.dart';
import 'package:dyte_client/dyte.dart';
import 'package:dyte_client/dyteMeeting.dart';

void main() {
  runApp(MaterialApp(home:MyApp()));
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {

  @override
  void initState() {
    super.initState();
  }

  @override
  Widget build(BuildContext context) {

    // get the page height, width
    double width = MediaQuery.of(context).size.width;
    double height = MediaQuery.of(context).size.height;

    return Scaffold(
        body: Row(
          children: [
            SizedBox(
              width: width,
              height: height,
              child: DyteMeeting(
                roomName: "",
                authToken: "",
                onInit: (DyteMeetingHandler meeting) async {
                     var self = await meeting.self;
                },
              )
            )
          ],
        ),
    );
  }
}
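
As mentioned above, SizedBox is just one option. A ConstrainedBox with maxWidth and maxHeight constraints works the same way; here is a minimal sketch reusing the same width, height, and DyteMeeting parameters:

ConstrainedBox(
  constraints: BoxConstraints(
    maxWidth: width,
    maxHeight: height,
  ),
  child: DyteMeeting(
    roomName: "",
    authToken: "",
    onInit: (DyteMeetingHandler meeting) async {
      // your handler
    },
  ),
)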

Effortlessly implement WebRTC in your Flutter App with Dyte

Dyte makes it super easy to add real-time video and voice call functionality to your Flutter app with just a few lines of code. You get complete control over layout and permissions with best-in-class customizations and detailed call analytics via the Dyte dashboard or REST APIs.

Plus, 24x7 support directly via Slack or on call. No more waiting for emails. This is all possible with a single config across all platforms that can be easily matched to your branding. With Dyte, you can focus on building your app rather than on 3rd party SDK dependencies.

So why not try it today? You won't be disappointed.

Don't forget to check the official documentation for more information on using Dyte to develop your app.
