How To Build a Telegram Clone with SwiftUI
An iOS app like Telegram combines chat and video calling to support both real-time and asynchronous communication. Its thoughtful animations also make text-based and live conversations uniquely engaging.
Let's build an iOS/SwiftUI app similar to Telegram by integrating Stream’s iOS chat SDK, iOS video calling SDK, and fine-tuned, expressive, and animated reactions.
Follow the step-by-step guide in this article to build FaceBubble, an app that lets people send rich text messages and take part in face-to-face one-on-one and group conversations. If you'd prefer watching a video of this tutorial, we have it on YouTube.
Prerequisites
To follow along with this article and complete the tasks involved, you should install the latest versions of Xcode and the Stream Chat and Video SDKs. The demo SwiftUI project uses Xcode 15. However, you can use Xcode 14 or any later version.
Xcode: The development environment. Download Xcode.
Stream Chat SDK: Provides the chat messaging functionality.
Stream Video SDK: Provides audio/video calling support.
Explore the Final Project
The preview above shows the completed SwiftUI chat and video calling app you will create in this tutorial. The accompanying sample app is available on GitHub. Download and explore the features of chat messaging, video calling, and animated reactions.
Chat Messaging Features of the App
The key features of the app's chat messaging support include the following.
Attach Documents and Media: Add images, files, and documents to messages.
Message Composer: A customizable UI for composing messages.
Message Reactions: Send emoji reactions with a ready-made and easily configurable UI.
Offline Support: Browse chat history without the internet. You can also navigate channels and send messages while offline.
Customizable Components: Build quickly with customizable and swappable building blocks.
Video Call Features of the App
Like your favorite video calling app, our sample project for this tutorial has the following main features.
Global Edge Network: Calls run on Stream's edge network to ensure optimal latency and scaling to support many call participants.
Group video calls: You can make one-on-one and group calls for large teams.
Picture-in-picture support: During an active call, the local participant's video turns into a draggable rectangular shape, allowing you to move it around the corners of the device’s screen.
The Starter SwiftUI Project
Let's create a new SwiftUI app with Xcode 15 or a later version and add all the Swift files shown in the image above to their corresponding folders. You can use your preferred project name. The image above shows that our app is called FaceBubble. We will add content to each Swift file in the various sections below.
Install the Chat and Video SDKs
After adding all the Swift files from the section above, you should configure the Chat and Video SDKs. We will use Swift Package Manager to add the SDKs to the project. From Xcode's menu bar, choose File -> Add Package Dependencies…. Then, copy and paste the URL https://github.com/getstream/stream-chat-swiftui into the search bar at the top right. Follow the next few steps to install it.
You can similarly install the video SDK by clicking File -> Add Package Dependencies…. Copy and paste the URL https://github.com/GetStream/stream-video-swift to fetch the SDK from GitHub and install it.
Configure Permissions: Camera, Microphone, and Photo Library Usage
FaceBubble will need access to the user's camera, microphone, and photo album to perform some of its operations. We, therefore, need to add the privacy configurations, as highlighted in the image above.
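For reference, the keys involved are the standard iOS usage-description entries shown below. This is a minimal Info.plist excerpt with placeholder description strings; replace the wording with your own.

```xml
<!-- Info.plist (excerpt): usage descriptions iOS requires before granting
     camera, microphone, and photo library access. The strings are placeholders. -->
<key>NSCameraUsageDescription</key>
<string>FaceBubble uses the camera for video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>FaceBubble uses the microphone for audio and video calls.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>FaceBubble accesses your photo library so you can attach images to messages.</string>
```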
Configure the Chat SDK
To access the chat SDK, we need a valid user, and we must initialize the Stream chat client with credentials such as an API key and a token. For a production app, the token should be generated on your server side.
You can use your Stream API key and our token generator service to create a token for testing. Sign up for a free Stream dashboard account to get an API key if you do not have one yet.
Like the Stream Video client, the chat client must be initialized in the part of your app where life cycle events occur. For a SwiftUI app, that is the type conforming to the App protocol.
```swift
@main
struct FaceBubbleApp: App { }
```
Open the main project file of the SwiftUI app (FaceBubbleApp.swift) and replace its content with the following sample code.
```swift
// FaceBubbleApp.swift
// FaceBubble

import SwiftUI
import StreamChat
import StreamChatSwiftUI

@main
struct FaceBubbleApp: App {

    // MARK: Step 1: Chat Client Setup
    // Step 1a: Create a new instance of the SDK's chat client
    var chatClient: ChatClient = {
        // For the tutorial, we use a hard-coded API key and application group identifier.
        var config = ChatClientConfig(apiKey: .init("8br4watad788"))
        config.isLocalStorageEnabled = true
        config.applicationGroupIdentifier = "group.io.getstream.iOS.ChatDemoAppSwiftUI"

        // The resulting config is passed into a new `ChatClient` instance.
        let client = ChatClient(config: config)
        return client
    }()

    // Step 1b: Set up the StreamChat instance
    @State var streamChat: StreamChat?

    // MARK: Step 2: Connect the User
    private func connectUser() {
        // This is a hardcoded token valid on Stream's tutorial environment.
        let token = try! Token(rawValue: "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoibHVrZV9za3l3YWxrZXIifQ.kFSLHRB5X62t0Zlc7nwczWUfsQMwfkpylC6jCUZ6Mc0")

        // Call `connectUser` on our SDK to get started.
        chatClient.connectUser(
            userInfo: .init(
                id: "luke_skywalker",
                name: "Luke Skywalker",
                imageURL: URL(string: "https://vignette.wikia.nocookie.net/starwars/images/2/20/LukeTLJ.jpg")!
            ),
            token: token
        ) { error in
            if let error = error {
                // Some very basic error handling, only logging the error.
                log.error("connecting the user failed \(error)")
                return
            }
        }
    }

    // MARK: Step 3: Initialize streamChat with the chat client and connect the user
    init() {
        streamChat = StreamChat(chatClient: chatClient)
        connectUser()
    }

    var body: some Scene {
        WindowGroup {
            CustomChannelView()
        }
    }
}
```
In summary, we import the StreamChat and StreamChatSwiftUI components and initialize the chat client with a hard-coded API key and token so that when the app launches, the chat SDK is ready and accessible.
Note: We are using a hard-coded API key for testing. Do not use it in a production app.
Lastly, we must connect the user with the connectUser method. Check out our SwiftUI Chat tutorial for a step-by-step guide on setting up Stream Chat for your iOS app.
Display Incoming and Outgoing Chat Messages
In a typical messaging app, chat conversations are initiated from a channel list or contact list, as in WhatsApp. Let's bypass the chat SDK's default channel list and show incoming and outgoing chats as soon as the app launches.
In the ChatSetUp folder, replace the content of CustomChannelView.swift with the sample code below.
```swift
import SwiftUI
import StreamChat
import StreamChatSwiftUI

struct CustomChannelView: View {
    @State var channelInfoShown = false
    @State var messageDisplayInfo: MessageDisplayInfo?
    @StateObject var viewModel: ChatChannelViewModel
    @State private var isVideoCalling = false

    init() {
        _viewModel = StateObject(wrappedValue: ChatChannelViewModel(
            channelController: InjectedValues[\.chatClient].channelController(
                for: try! ChannelId(cid: "messaging:5A9427AD-E")
            ))
        )
    }

    var body: some View {
        NavigationView {
            if let channel = viewModel.channel {
                VStack(spacing: 0) {
                    MessageListView(
                        factory: DefaultViewFactory.shared,
                        channel: channel,
                        messages: viewModel.messages,
                        messagesGroupingInfo: viewModel.messagesGroupingInfo,
                        scrolledId: $viewModel.scrolledId,
                        showScrollToLatestButton: $viewModel.showScrollToLatestButton,
                        quotedMessage: $viewModel.quotedMessage,
                        currentDateString: viewModel.currentDateString,
                        listId: viewModel.listId,
                        onMessageAppear: viewModel.handleMessageAppear(index:),
                        onScrollToBottom: viewModel.scrollToLastMessage,
                        onLongPress: { displayInfo in
                            messageDisplayInfo = displayInfo
                            withAnimation {
                                viewModel.showReactionOverlay(for: AnyView(self))
                            }
                        }
                    )

                    MessageComposerView(
                        viewFactory: DefaultViewFactory.shared,
                        channelController: viewModel.channelController,
                        quotedMessage: $viewModel.quotedMessage,
                        editedMessage: $viewModel.editedMessage,
                        onMessageSent: viewModel.scrollToLastMessage
                    )
                }
                .overlay(
                    viewModel.reactionsShown ?
                        ReactionsOverlayView(
                            factory: DefaultViewFactory.shared,
                            channel: channel,
                            currentSnapshot: viewModel.currentSnapshot!,
                            messageDisplayInfo: messageDisplayInfo!,
                            onBackgroundTap: {
                                viewModel.reactionsShown = false
                                messageDisplayInfo = nil
                            }, onActionExecuted: { actionInfo in
                                viewModel.messageActionExecuted(actionInfo)
                                messageDisplayInfo = nil
                            }
                        )
                        .transition(.identity)
                        .edgesIgnoringSafeArea(.all)
                        : nil
                )
                .navigationBarTitleDisplayMode(.inline)
                .toolbar {
                    ToolbarItem(placement: .topBarLeading) {
                        Button {
                            isVideoCalling.toggle()
                        } label: {
                            Image(systemName: "video.fill")
                        }
                        .fullScreenCover(isPresented: $isVideoCalling, content: CallContainerSetup.init)
                    }

                    DefaultChatChannelHeader(
                        channel: channel,
                        headerImage: InjectedValues[\.utils].channelHeaderLoader.image(for: channel),
                        isActive: $channelInfoShown
                    )
                }
            }
        }
    }
}
```
In the code above, we display incoming and outgoing messages in a list, along with a message composer and a reactions UI. When you run the app, the first screen displays the chat conversation history; this composition serves as the home screen. In the screen's top section, we need a button to initiate a video call, which we add as a leading .toolbar item with the code snippet below.
```swift
ToolbarItem(placement: .topBarLeading) {
    Button {
        isVideoCalling.toggle()
    } label: {
        Image(systemName: "video.fill")
    }
    .fullScreenCover(isPresented: $isVideoCalling, content: CallContainerSetup.init)
}
```
Test the Chat Messaging Features
When you run the app, you will see the screen in the video below. You can send messages and reactions and perform other chat operations, such as replying in threads. Tapping the attachment button allows you to add videos, images, documents, and files to messages.
Set up the Video SDK
Like the chat SDK, the video SDK requires a valid user and token. We have already generated user credentials for running and testing the SwiftUI demo in this tutorial.
Let's create a user and initialize the user object with an API key and token. In the CallSetup folder of the Xcode project navigator, replace the content of CallContainerSetup.swift with the following code.
```swift
// CallContainerSetup.swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct CallContainerSetup: View {
    @ObservedObject var viewModel: CallViewModel
    private var client: StreamVideo

    private let apiKey: String = "mmhfdzb5evj2" // The API key can be found in the Credentials section
    private let userId: String = "Biggs_Darklighter" // The User Id can be found in the Credentials section
    private let token: String = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoiQmlnZ3NfRGFya2xpZ2h0ZXIiLCJpc3MiOiJodHRwczovL3Byb250by5nZXRzdHJlYW0uaW8iLCJzdWIiOiJ1c2VyL0JpZ2dzX0RhcmtsaWdodGVyIiwiaWF0IjoxNzA0ODEwMjMwLCJleHAiOjE3MDU0MTUwMzV9.5-9C-PJHu16-kSDz7N1B1_xEcASgf0LD1QSbNQpCpIs" // The Token can be found in the Credentials section
    private let callId: String = "ZAE5CL4nUaPn" // The CallId can be found in the Credentials section

    init() {
        let user = User(
            id: userId,
            name: "Amos G", // name and imageURL are used in the UI
            imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
        )

        // Initialize the Stream Video client
        self.client = StreamVideo(
            apiKey: apiKey,
            user: user,
            token: .init(stringLiteral: token)
        )
        self.viewModel = .init()
    }

    var body: some View {
        NavigationView {
            VStack {
                if viewModel.call != nil {
                    // CallContainer(viewFactory: DefaultViewFactory.shared, viewModel: viewModel)
                    CallContainer(viewFactory: CustomViewFactory(), viewModel: viewModel)
                } else {
                    Text("loading...")
                }
            }
            .ignoresSafeArea()
            .onAppear {
                Task {
                    guard viewModel.call == nil else { return }
                    viewModel.joinCall(callType: .default, callId: callId)
                }
            }
        }
    }
}
```
In summary, after initializing the user object, we create and join a call with a `default` `callType` and `callId`.
```swift
Task {
    guard viewModel.call == nil else { return }
    viewModel.joinCall(callType: .default, callId: callId)
}
```
If creating and joining the call succeeds, we use the SDK's CallContainer with a CustomViewFactory to display the active call screen.
```swift
VStack {
    if viewModel.call != nil {
        // CallContainer(viewFactory: DefaultViewFactory.shared, viewModel: viewModel)
        CallContainer(viewFactory: CustomViewFactory(), viewModel: viewModel)
    } else {
        Text("loading...")
    }
}
```
The CallContainer renders the app's audio and video calling UIs. The CustomViewFactory helps us customize the chat and video calling UIs. You can, for example, use it to swap the call controls bar and provide a custom-made message bubble.
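As an illustration of that idea, a view factory override for the call controls might look like the sketch below. This is not the sample project's actual factory; it assumes the ViewFactory protocol's makeCallControlsView(viewModel:) customization point described in the StreamVideoSwiftUI docs, and CustomCallControlsView is a hypothetical view of your own.

```swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

// A minimal sketch of a custom view factory (assumed API shape; verify the
// ViewFactory protocol in your SDK version). Only the call controls are
// overridden here; every other view falls back to the SDK's defaults.
class CustomViewFactory: ViewFactory {
    func makeCallControlsView(viewModel: CallViewModel) -> some View {
        // Swap the default call controls bar for your own SwiftUI view.
        CustomCallControlsView(viewModel: viewModel) // hypothetical custom view
    }
}
```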
The credentials used in the code above expire after some time. Therefore, to run and test the app, you should obtain newly generated user credentials from the video calling tutorial in our documentation, under the Create & Join a Call section.
Add a Local Call Participant
The local participant's view automatically changes to a floating UI during an active call. To render the local call participant's video, you can use the video SDK's VideoRendererView. In the CallSetup folder, replace the content of FloatingParticipantView.swift with the following.
```swift
// FloatingParticipantView.swift
import Foundation
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct FloatingParticipantView: View {
    var participant: CallParticipant?
    var size: CGSize = .init(width: 140, height: 180)

    var body: some View {
        if let participant = participant {
            VStack {
                HStack {
                    Spacer()
                    VideoRendererView(id: participant.id, size: size) { videoRenderer in
                        videoRenderer.handleViewRendering(for: participant, onTrackSizeUpdate: { _, _ in })
                    }
                    .clipShape(RoundedRectangle(cornerRadius: 24))
                    .frame(width: size.width, height: size.height)
                }
                Spacer()
            }
            .padding()
        }
    }
}
```
Create the Participants Screen
Here, we need a UI to display the local and remote participants' videos during an active call. In the CallSetup folder, replace the content of ParticipantsView.swift with the following.
```swift
// ParticipantsView.swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct ParticipantsView: View {
    var call: Call
    var participants: [CallParticipant]
    var onChangeTrackVisibility: (CallParticipant?, Bool) -> Void

    var body: some View {
        GeometryReader { proxy in
            if !participants.isEmpty {
                ScrollView {
                    LazyVStack {
                        if participants.count == 1, let participant = participants.first {
                            makeCallParticipantView(participant, frame: proxy.frame(in: .global))
                                .frame(width: proxy.size.width, height: proxy.size.height)
                        } else {
                            ForEach(participants) { participant in
                                makeCallParticipantView(participant, frame: proxy.frame(in: .global))
                                    .frame(width: proxy.size.width, height: proxy.size.height / 2)
                            }
                        }
                    }
                }
            } else {
                Color.black
            }
        }
        .edgesIgnoringSafeArea(.all)
    }

    @ViewBuilder
    private func makeCallParticipantView(_ participant: CallParticipant, frame: CGRect) -> some View {
        VideoCallParticipantView(
            participant: participant,
            availableFrame: frame,
            contentMode: .scaleAspectFit,
            customData: [:],
            call: call
        )
        .onAppear { onChangeTrackVisibility(participant, true) }
        .onDisappear { onChangeTrackVisibility(participant, false) }
    }
}
```
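To give an idea of how these two views could come together, here is a minimal sketch (not part of the sample project) that layers ParticipantsView and FloatingParticipantView in a single call screen. It assumes CallViewModel's participants, localParticipant, and changeTrackVisibility(for:isVisible:) members from StreamVideoSwiftUI; CallParticipantsLayer is a hypothetical name.

```swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

// A minimal sketch: remote participants fill the screen while the local
// participant floats in a corner bubble.
struct CallParticipantsLayer: View {
    @ObservedObject var viewModel: CallViewModel

    var body: some View {
        ZStack {
            if let call = viewModel.call {
                ParticipantsView(
                    call: call,
                    participants: viewModel.participants,
                    onChangeTrackVisibility: { participant, isVisible in
                        // Tell the SDK which video tracks are visible so it only
                        // subscribes to the ones it needs.
                        guard let participant else { return }
                        viewModel.changeTrackVisibility(for: participant, isVisible: isVisible)
                    }
                )
            }
            FloatingParticipantView(participant: viewModel.localParticipant)
        }
    }
}
```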
Add the Video Reaction Animations
In the VideoCustomization folder, the animations in ReactionsView.swift are displayed when you tap the emoji button on the call controls bar (CallControlsView.swift). The custom call controls view replaces the SDK's default call controls through a custom view factory implementation (CustomUIFactory.swift). Check out each Swift file in the VideoCustomization folder to explore the views and animations in detail. When you tap the ♥️ and 👍 buttons, the emojis animate along with a splash view (SplashView.swift).
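As a rough, self-contained illustration of the technique (this is not the project's ReactionsView, just a sketch with a hypothetical EmojiReactionSketch view), an emoji reaction can be animated in plain SwiftUI by toggling state and animating scale and opacity:

```swift
import Foundation
import SwiftUI

// A standalone sketch of an animated emoji reaction: tapping the button scales
// the emoji up while fading it out, then resets the state so it can replay.
struct EmojiReactionSketch: View {
    @State private var isAnimating = false

    var body: some View {
        VStack(spacing: 32) {
            Text("❤️")
                .font(.system(size: 80))
                .scaleEffect(isAnimating ? 2.5 : 1)
                .opacity(isAnimating ? 0 : 1)
                .animation(.easeOut(duration: 0.8), value: isAnimating)

            Button("React") {
                isAnimating = true
                // Reset after the animation finishes.
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.8) {
                    isAnimating = false
                }
            }
        }
    }
}

#Preview {
    EmojiReactionSketch()
}
```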
Perform Messaging Operations
Congratulations! 🎉👏 That is all we need to do to build a fully functional chat messaging and video calling app with SwiftUI. Let's test and preview what we have created. When the chat UI appears after running the app, you can compose and send messages and react to messages with emoji animations. You can also attach media and different types of files to messages, and more. Head to our chat documentation to check out all the available features.
Note: To run the app and experience these features, you should replace the user credentials with newly generated ones from the iOS video calling tutorial in our documentation, as mentioned previously.
Perform Audio/Video Calling Operations
After running the app, you can tap the video button on the top-left of the messages screen to start a call. Tap the emoji button on the call controls bar to trigger the video reaction animations. In this project, we added the reaction animations to demonstrate how easy it is to add custom SwiftUI and iOS 17 animations to apps powered by our video SDK. Check the iOS documentation's reactions and custom events section to learn more about integrating reactions in a production app.
Where To Go From Here
You now know how to build an iOS app capable of real-time messaging and voice/video calling, like your favorite communication apps: Telegram, WhatsApp, and Messenger. We have only unlocked one of the many use cases of our chat and video SDKs. Learn more in the related links, the YouTube video version of this article, and our chat docs and video docs.