Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
Docs and guides at https://docs.livekit.io
LiveKit for Android is available as a Maven package.
```groovy
...
dependencies {
    implementation "io.livekit:livekit-android:<version>"
}
```
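The SDK is distributed as a regular Maven artifact, so your build needs a repository it can be resolved from. The snippet below is a minimal sketch of that configuration, written with the Gradle Kotlin DSL to keep the added examples in Kotlin, and it assumes the artifact is available on Maven Central — check the install docs for the repository your version is actually published to.

```kotlin
// build.gradle.kts -- repository configuration (assumption: artifact is on Maven Central)
repositories {
    google()        // Android / AndroidX dependencies
    mavenCentral()  // io.livekit:livekit-android
}
```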
LiveKit uses the WebRTC-provided `org.webrtc.SurfaceViewRenderer` to render video tracks. Subscribed audio tracks are played automatically.
```kotlin
class MainActivity : AppCompatActivity(), RoomListener {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        ...

        val url = "wss://your_host"
        val token = "your_token"

        // Connect from a coroutine scope
        // (lifecycleScope comes from androidx.lifecycle:lifecycle-runtime-ktx).
        lifecycleScope.launch {
            val room = LiveKit.connect(
                applicationContext,
                url,
                token,
                ConnectOptions(),
                this@MainActivity // this Activity receives RoomListener callbacks
            )

            // Turn on the local microphone and camera.
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)

            // Attach the local camera preview here if desired; video from
            // remote participants is attached in onTrackSubscribed() below.
        }
    }

    override fun onTrackSubscribed(
        track: Track,
        publication: RemoteTrackPublication,
        participant: RemoteParticipant,
        room: Room
    ) {
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        // viewBinding.renderer is an `org.webrtc.SurfaceViewRenderer` in your layout.
        videoTrack.addRenderer(viewBinding.renderer)
    }
}
```
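When the call ends (for example in `onDestroy`), the connection should be released. A minimal sketch, assuming the `Room` returned by `LiveKit.connect` has been stored in a property:

```kotlin
    // Assumes the Room from LiveKit.connect() was saved to this property.
    private var room: Room? = null

    override fun onDestroy() {
        // Close the connection and release its resources with the Activity.
        room?.disconnect()
        room = null
        super.onDestroy()
    }
```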
LiveKit relies on the `RECORD_AUDIO` and `CAMERA` permissions to use the microphone and camera. These permissions must be requested at runtime. Reference the sample app for an example.
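As a rough illustration (not taken from the sample app), the runtime request can be made with the AndroidX Activity Result API inside your Activity. `connectToRoom()` below is a hypothetical helper wrapping the `LiveKit.connect` call shown earlier, and the snippet assumes `android.Manifest` and `androidx.activity.result.contract.ActivityResultContracts` are imported.

```kotlin
// Register a launcher for the permission dialog (must be created before the Activity is started).
private val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    if (grants.values.all { it }) {
        // Both permissions granted; the microphone and camera can now be enabled.
        connectToRoom() // hypothetical helper wrapping LiveKit.connect()
    }
}

private fun requestMediaPermissions() {
    permissionLauncher.launch(
        arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
    )
}
```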
To develop the Android SDK itself, you'll need to:

- Ensure the protocol submodule repo is initialized and updated with `git submodule update --init`
- Install Android Studio Arctic Fox 2020.3.1+
- Download the webrtc sources from https://webrtc.googlesource.com/src
- Add the sources to Android Studio by pointing at the `webrtc/sdk/android` folder