What Bitrate Actually Is
Bitrate is the amount of data used per second of video, measured in kilobits per second (kbps) or megabits per second (Mbps). A video at 5,000kbps uses 5,000 kilobits of data for every second of playback; a video at 50,000kbps (50Mbps) uses ten times as much. More data per second means more information is available to represent the image - which generally means better quality.
Think of it like painting. If you have 10 seconds to paint a scene and 10,000 brush strokes available, you can capture a lot of detail. If you only have 1,000 brush strokes for the same scene, you have to simplify. The broad shapes are right but the fine detail is gone. Bitrate is the number of brush strokes per second that video gets. Higher bitrate means more detail the encoder can preserve.
Constant Bitrate vs Variable Bitrate
CBR (Constant Bitrate) uses the same amount of data every second regardless of what's happening in the video. A static shot of someone standing still gets the same bitrate as a fast-moving action scene. This is useful for streaming where consistent data rates matter for buffer management.
VBR (Variable Bitrate) allocates more data to complex scenes and less to simple ones. A slow pan across a static background gets a lower bitrate. A fast-moving chase scene gets higher bitrate. VBR produces better quality at the same average file size because it puts data where it's needed and saves it where it isn't. Most video exports from editing software default to VBR for this reason.
For uploading to social platforms, VBR is generally better. For live streaming, CBR is typically required because streaming platforms need predictable data rates.
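The difference is easiest to see with numbers. Here's a toy Python sketch of VBR allocation - the scene complexity scores are invented for illustration, and real encoders use far more sophisticated rate control:

```python
def vbr_allocate(complexities, avg_bitrate_kbps):
    """Toy VBR model: split a fixed average bitrate across scenes
    in proportion to their visual complexity, keeping the same
    overall average a CBR encode would use."""
    total = sum(complexities)
    n = len(complexities)
    return [avg_bitrate_kbps * c * n / total for c in complexities]

# Three equal-length scenes: static shot, slow pan, fast action
rates = vbr_allocate([1, 2, 5], avg_bitrate_kbps=8000)
print([round(r) for r in rates])  # -> [3000, 6000, 15000]
# CBR would give all three scenes 8000kbps; VBR shifts bits to the action
```

The average across the three scenes is still 8000kbps - same file size, but the data went where the detail is.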
What Low Bitrate Looks Like
The artefacts from insufficient bitrate are recognisable once you know what to look for: blocky patches where the encoder couldn't represent smooth gradients, called "macroblocking"; colour banding in skies or gradients, where smooth transitions become stepped bands of colour; unnatural motion blur, where fast movement leaves a smeared, pixelated trail rather than a clean blur; and fine detail like hair, fabric texture, or text going soft or disappearing entirely.
This is why platform-compressed video sometimes looks visibly worse than the original source: the platform applied a bitrate ceiling significantly below what the source was encoded at. Instagram, TikTok, and YouTube all have maximum bitrates they serve, which may be lower than what you uploaded.
Recommended Bitrates by Platform
YouTube recommends 8Mbps for standard-frame-rate 1080p video (12Mbps for 1080p60) and accepts much higher upload bitrates. They transcode on their end, but uploading at a higher bitrate gives their encoder more information to work with, producing better output quality.
Instagram Reels: their encoding pipeline works well with uploads at 3.5-25Mbps for 1080p. Below 3.5Mbps you're giving Instagram's encoder very little to work with and the output quality suffers; above 25Mbps you hit diminishing returns.
TikTok: 5-30Mbps for 1080p is the practical range. Similar reasoning - uploading at higher bitrate gives TikTok's encoder more information, and the output to viewers tends to be noticeably better than from low-bitrate uploads.
For standard quality exports from editing software at 1080p: 10-20Mbps for most content, up to 25-30Mbps for high-motion content like sports or gaming. These are export targets before platform upload, not delivery targets.
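Those ranges can be collected into a simple lookup for sanity-checking an export. A minimal sketch - the numbers are the ones quoted above, not official platform specifications, and they change over time:

```python
# 1080p upload ranges (Mbps) quoted in this guide - check each
# platform's current documentation before relying on them
PLATFORM_RANGES = {
    "youtube": (8, 50),
    "instagram_reels": (3.5, 25),
    "tiktok": (5, 30),
}

def in_recommended_range(platform, bitrate_mbps):
    """True if an export bitrate falls inside the quoted range."""
    low, high = PLATFORM_RANGES[platform]
    return low <= bitrate_mbps <= high

print(in_recommended_range("tiktok", 12))          # -> True
print(in_recommended_range("instagram_reels", 2))  # -> False
```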
How Resolution and Bitrate Interact
Resolution and bitrate are related. A 4K image has four times as many pixels as a 1080p image. Representing all those pixels requires more bitrate to maintain equivalent quality per pixel. A 4K video at the same bitrate as a 1080p video will look worse than the 1080p version because the encoder has to compress much more visual information into the same data budget.
This is why shooting 4K isn't automatically better than 1080p if bitrate is limited. A 1080p video at 20Mbps will often look better than a 4K video at 8Mbps. Higher resolution needs proportionally higher bitrate to actually benefit from the extra pixels.
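You can quantify this as bits per pixel per frame. A rough Python sketch, assuming 30fps - real encoders exploit similarity between frames, so treat this as a budget comparison rather than a quality prediction:

```python
def bits_per_pixel(bitrate_mbps, width, height, fps=30):
    """Raw data budget per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

hd = bits_per_pixel(20, 1920, 1080)   # 1080p at 20Mbps
uhd = bits_per_pixel(8, 3840, 2160)   # 4K at 8Mbps
print(f"1080p/20Mbps: {hd:.2f} bits per pixel")  # -> 0.32
print(f"4K/8Mbps:     {uhd:.2f} bits per pixel")  # -> 0.03
# The 1080p encode has roughly 10x the per-pixel budget of the 4K encode
```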
Audio Bitrate vs Video Bitrate
Bitrate applies to audio too, though the numbers are much smaller. 128kbps is generally considered the minimum for acceptable audio quality. 192-256kbps is standard for music. 320kbps is high quality and overkill for most purposes. Video platforms typically serve audio at 128-192kbps regardless of what you uploaded.
For podcasts and voice-only audio, 128kbps mono is fine. For music, anything under 192kbps starts to show audible artefacts on good speakers or headphones. The codec matters too - AAC at 192kbps sounds noticeably better than MP3 at the same bitrate.
Bitrate and File Size
The relationship is straightforward: bitrate x duration gives the file size in bits, and dividing by eight gives bytes. A 10-minute video at 5Mbps is about 375MB. The same video at 50Mbps would be about 3.75GB. Higher bitrate videos are proportionally larger files. This is why streaming services compress so aggressively - serving 4K content at full production bitrate to millions of simultaneous viewers would require impossible amounts of bandwidth.
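The maths is simple enough to sketch in a few lines of Python - note this counts only the video stream, so the audio track and container overhead add a little on top:

```python
def file_size_mb(bitrate_mbps, duration_s):
    """Approximate video stream size: bits = bitrate x duration,
    divided by 8 to convert to bytes. Ignores audio and container."""
    return bitrate_mbps * duration_s / 8

print(file_size_mb(5, 600))   # 10 minutes at 5Mbps  -> 375.0 MB
print(file_size_mb(50, 600))  # 10 minutes at 50Mbps -> 3750.0 MB (~3.75GB)
```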
For understanding how different codecs handle the same bitrate very differently, the codec comparison guide covers AV1, H.265, and H.264 and why the same bitrate produces different quality across them. And for the format context around bitrate, see the video formats guide.
The Misconception That Higher Bitrate Always Means Better Quality
It's not quite that simple. Bitrate is one variable in a codec's quality equation. The codec itself matters hugely. AV1 at 3Mbps can look better than H.264 at 8Mbps on the same content because AV1 is a more efficient codec - it gets more quality out of fewer bits. The relationship is bitrate x codec efficiency = perceived quality. Newer codecs make every bit work harder.
This is why a 4K YouTube video at 15Mbps in AV1 can look exceptional, while a 1080p video at 12Mbps in H.264 from ten years ago might look worse. Same bitrate, completely different perceptual quality because of codec efficiency gains. For more on this, the codec comparison covers the AV1, H.265, and H.264 differences in detail.
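As a back-of-the-envelope model - the efficiency multipliers below are rough illustrative assumptions, not measured values, and real gains vary heavily with content, encoder, and settings:

```python
# Very rough efficiency multipliers relative to H.264 - illustrative
# assumptions only, not benchmarked figures
EFFICIENCY = {"h264": 1.0, "h265": 1.6, "av1": 2.2}

def h264_equivalent(bitrate_mbps, codec):
    """Approximate H.264 bitrate needed to match this codec's quality
    at the given bitrate, under the assumed multipliers above."""
    return bitrate_mbps * EFFICIENCY[codec]

print(h264_equivalent(3, "av1"))  # 3Mbps AV1 lands near 6-7Mbps of H.264
```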
Bitrate When Streaming vs Downloading
Streaming platforms deliver video at variable bitrate based on your connection speed - this is what adaptive bitrate streaming means. If your internet slows down, the player drops to a lower quality tier. If it speeds up, it steps up to a higher tier. The number of available tiers varies by platform - YouTube has many small steps, Netflix has fewer but larger ones.
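The tier-selection logic can be sketched roughly like this - the quality ladder and safety margin are invented for illustration, not any platform's actual values:

```python
# Illustrative quality ladder (kbps) - real platforms define their own tiers
LADDER = [145, 300, 750, 1500, 3000, 6000, 12000]

def pick_tier(measured_kbps, safety=0.8):
    """Pick the highest tier the connection can sustain, leaving
    headroom so the buffer survives short throughput dips."""
    budget = measured_kbps * safety
    eligible = [t for t in LADDER if t <= budget]
    return eligible[-1] if eligible else LADDER[0]

print(pick_tier(8000))  # 8Mbps connection -> 6000kbps tier
print(pick_tier(400))   # struggling connection -> 300kbps tier
```

Real players also react to buffer level, not just measured throughput, which is why quality sometimes drops before your connection visibly slows.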
When you download a video through a tool like MyVideoCity, you're typically getting one specific quality tier - usually the best available for that content. That file has a fixed bitrate rather than the adaptive range you get during streaming. This is why downloaded content sometimes looks better than the streamed version of the same clip, especially on connections that couldn't sustain the highest streaming tier.