This is one of icey's cleanest end-to-end paths.
The pipeline is:
`MediaCapture -> VideoPacketEncoder -> WebRtcTrackSender -> browser`

Notably, there is no hidden libwebrtc media stack in the middle. icey owns the capture, encode, and stream graph. libdatachannel owns the transport.
- `smpl::Client` connects to the Symple server
- `SympleSignaller` moves SDP and ICE messages
- `PeerSession` owns the call lifecycle
- `PacketStream` owns the media pipeline
- `VideoPacketEncoder` turns decoded frames into H.264
- `WebRtcTrackSender` pushes encoded packets into the browser track

That is the whole shape.
The current webcam-streamer sample does this:
1. Connect the Symple client and join a room
2. Create a `SympleSignaller` and `PeerSession`
3. Auto-accept incoming calls
4. Build the media pipeline once
5. Start capture and streaming when the session reaches `Active`
6. Stop the pipeline when the call ends
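Steps 1 and 2 might look roughly like the sketch below. The `Options` fields, constructor shape, and room-join mechanism are assumptions based on a typical Symple client setup, not verified icey API; check the sample for the exact names.

```
// Sketch only: field and method names here are assumptions, not verified
// icey/Symple API. The real sample is the source of truth.
smpl::Client::Options options;
options.host = "localhost";   // Symple server address (assumed field)
options.user = "streamer";    // peer identity (assumed field)

smpl::Client client(options); // constructor shape is an assumption
client.connect();             // join the room per the client's options

wrtc::SympleSignaller signaller(client);
wrtc::PeerSession session(signaller, config);
```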
Do not start pushing media just because you saw `call:init`. Wait for `PeerSession::State::Active`. That is when the transport is actually ready.
```cpp
wrtc::SympleSignaller signaller(client);
wrtc::PeerSession session(signaller, config);

session.IncomingCall += [&](const std::string&) {
    session.accept();
};

session.StateChanged += [&](wrtc::PeerSession::State state) {
    if (state == wrtc::PeerSession::State::Active)
        startStreaming();
    else if (state == wrtc::PeerSession::State::Ended)
        stopStreaming();
};
```

Then the media side:
```cpp
PacketStream stream;
stream.attachSource(capture.get(), false, true);          // capture feeds the stream
stream.attach(encoder, 1, true);                          // encode first (lower order runs earlier)
stream.attach(&session.media().videoSender(), 5, false);  // then hand packets to the WebRTC track
```

That is the real send path used by the sample.
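The `startStreaming()` / `stopStreaming()` hooks referenced earlier can then be as small as starting and stopping that stream. A sketch, assuming the usual `PacketStream` `start()`/`stop()` lifecycle (verify against the sample, which may also reset the capture source):

```
// Sketch only: assumes the pipeline above is already attached and that
// PacketStream exposes start()/stop(). Adjust to the sample's helpers.
void startStreaming() {
    stream.start();  // begins pulling packets from the capture source
}

void stopStreaming() {
    stream.stop();   // halts the pipeline when the call ends
}
```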
Do not hand-roll browser codec guesses anymore. Use the codec negotiator:
```cpp
av::VideoCodec videoCodec = wrtc::CodecNegotiator::resolveWebRtcVideoCodec(
    av::VideoCodec("H264", "libx264", width, height, fps));
```

That keeps the WebRTC-facing codec setup aligned with what the rest of the module and the samples actually use.
The sample can use either a real capture device or a synthetic test source. That makes it useful both for real demos and for deterministic local bring-up when you do not want device negotiation in the loop.
The transport and signalling model are the same either way.
The current intended browser counterpart is the Symple player / call manager flow.
In practice that means `PeerSession` handles the SDP and ICE exchange.

This is not meant to be a generic signalling demo. It is the concrete "stream H.264 to a browser" path.
See `webcam-streamer` for the runnable sample.